Meta and Google Held Liable for “SNS Addiction” Design; First Damages Verdict Targets Even Like and Notification Features
Recognition of platform liability points to broader wave of similar lawsuits
Repeated exposure and reward-based design structures come under scrutiny
Shift from illness to liability could reshape the broader industry

A U.S. jury has found that Meta and Google caused harm to users through social media designs that foster addiction. The verdict, which imposed $6 million in damages on the two companies, drew industry attention because it was the first case to take issue not merely with content but with the functions and design architecture of the platforms themselves. As academic research points to addictive social media use as a driver of risky behavior, and some countries move to restrict youth access, debate surrounding the issue is also intensifying.
Compensatory damages followed by punitive liability
According to Bloomberg on the 25th (local time), a jury in Los Angeles, California, found the previous day that Meta and Google were liable for damages because their designs encouraged user addiction and caused harm. The jury ordered Meta to pay $4.2 million and Google $1.8 million. Half of each amount was awarded as compensation for actual harm, including treatment costs, while the other half consisted of punitive damages intended to punish the companies’ conduct and deter future wrongdoing.
The verdict came after a trial that lasted more than a month and more than 40 hours of jury deliberation. The plaintiff, a woman in her 20s identified as Kailey G.M., filed suit claiming that she developed depression and physical disabilities as a result of social media addiction after beginning to use YouTube at age six and Instagram at age nine. She argued that the platforms were designed to maximize user engagement through features such as autoplay, “like” notifications, and infinite scroll. Counsel for Kailey said that “adolescents are more vulnerable to addiction because their brains are still developing,” emphasizing the platforms’ responsibility.
The case consequently turned on platform design. Under Section 230 of the U.S. Communications Decency Act, platforms are generally shielded from liability for content posted by users, but in this case the key question was whether the design architecture itself had fostered addiction. Meta argued that factors other than social media had contributed to the plaintiff’s mental health problems, while Google countered that YouTube is a streaming platform rather than social media, but the jury rejected both arguments. In doing so, it recognized that platform design itself can lead to addiction and mental health harm for users.
If the verdict stands, similar outcomes may follow in future lawsuits against social media companies. Cornell University professor Sarah Kreps said, “There are hundreds of similar cases pending in California alone and thousands across the United States,” adding that “even a single ruling like this could open the floodgates for a wave of follow-on lawsuits.” Experts broadly agree that if courts continue to find a direct link between platform design and user harm, the scope of corporate liability could expand to encompass the architecture of digital services as a whole, with major consequences for the entire industry.

The core issue is “addiction,” not “screen time”
The impact of social media on adolescent mental health has become a major subject of academic research. Recent studies have identified “addictive use,” rather than sheer time spent, as the key criterion. A research team led by Cornell University professor Yunyu Xiao tracked 4,285 children across the United States over four years and analyzed the relationship between their screen habits and suicidal behavior. The study found that prolonged screen time itself showed no statistically meaningful correlation with suicidal behavior, but among children who said social media use “felt addictive,” the risk of suicidal ideation and self-harm rose by as much as threefold. This suggests that uncontrolled, repetitive use is the more direct risk factor.
Addictive use is accompanied by distinct behavioral traits. It is marked by difficulty staying away from devices, increasingly intense urges to use them, and repeated experiences of emotional distress when use is restricted. The study found that nearly half of the children surveyed maintained high levels of addiction after age 11. Another 25% began with low levels of addiction but showed a sharp increase over time, and in that group the risk of suicidal behavior more than doubled. Addictive use was especially prevalent among adolescents from households earning $75,000 or less annually, single-parent households, and Black and Hispanic youth, highlighting a link to socioeconomic conditions as well.
These findings suggest that the risks associated with social media cannot be addressed simply by limiting usage time. Earlier research had already shown that excessive screen time can harm mental health through sleep deprivation, reduced exercise, and less face-to-face interaction, but addictive use is increasingly categorized as a more direct causal pathway. Professor Xiao said, “Addictive use can have more severe effects during childhood, when self-control is not yet fully developed,” adding that “simply taking smartphones away may intensify conflict, so professional intervention such as cognitive behavioral therapy is needed.”
Policy responses are also moving in a direction that reflects this shift in understanding. In Australia, for example, a law restricting social media use by children under 16 took effect on December 10 of last year. It applies to 10 platforms, including Facebook, Instagram, YouTube, TikTok, X, and Snapchat, and places responsibility for enforcement on the platforms themselves. Companies that fail to meet obligations such as age verification face fines of up to $32 million. However, after the restrictions took effect, users began migrating to alternative platforms such as Lemon8 and Yope, a so-called balloon effect in which restricting one service simply pushes activity to another, and debate over the regulation’s actual effectiveness remains intense.
Potential expansion of liability into treatment and compensation
Efforts have also continued to define social media addiction as a serious disorder and to treat it as such. As studies increasingly link excessive use to a range of psychiatric conditions, including narcissistic personality disorder, body dysmorphic disorder, erotomania, and anorexia, social media addiction has begun to be recognized as a broader social issue. A research team at Simon Fraser University in Canada, after reviewing more than 2,500 academic papers, found that social media environments can sustain distorted self-perceptions without reality testing, contributing to delusions and other forms of mental disorder.
The researchers noted that such distortions become more severe when they persist in an environment with limited real-world feedback. Dr. Bernard Crespi, who led the study, said, “Social media creates a kind of virtual space that satisfies the conditions for psychiatric disorders and, because there is no effective reality checking, becomes an environment in which delusions can more easily emerge and persist.” He added, “Social media itself is not inherently problematic, but the virtual world, combined with social isolation in real life, creates conditions in which people can maintain delusional senses of self-identity without close scrutiny.”
This approach is grounded in the view that social media addiction should be treated as a therapeutic issue, which is why digital detox programs and cognitive behavioral therapy, both centered on individual users, are often presented as the main solutions. Social media operates through reward systems that reinforce user behavior: repeated feedback such as likes and comments increases dopamine release and sustains continued use. It also fuels social comparison and anxiety, with repeated exposure driven by comparison with others and fear of missing out. These mechanisms readily take the form of behavioral addiction, which in turn often leads to weakened self-control and psychological instability.
More recently, however, discussion has begun to move beyond treating social media addiction solely as a matter of individual therapy. Cases like those involving Google and Meta have pushed design and environmental conditions that foster addiction to the center of the debate, shifting the focus of responsibility from users to platforms. Advocates of platform liability argue that because social media has become a channel shaping social interaction and information consumption more broadly, mental health issues can no longer be separated from service design. That implies the possibility that responsibility could expand across the wider internet services industry, depending on how platform features affect users’ behavior and perception.