Shares of Meta Platforms and Alphabet Inc. declined after a US jury found both companies liable in a social media addiction case, triggering concerns about broader legal and regulatory risks. Meta recorded an 8% fall in overnight trade, while Google slipped by over 3%.
While the damages awarded in the case were relatively modest, the market reaction was driven by the precedent the ruling may establish. According to analysts, investors are factoring in the possibility of a wave of similar lawsuits, along with the risk that companies may be forced to alter platform features that currently drive user engagement and, by extension, advertising revenue. The verdict comes as a blow at a time when countries such as Australia and Spain have already moved to ban social media access for users under 16.
Fortune India presents an explainer about the case and what it means for companies like Meta, Google, TikTok and Snap.
The case, known as K.G.M. v. Meta et al., centres on allegations that social media platforms were deliberately designed to encourage compulsive usage among minors. The 20-year-old plaintiff argued that early and prolonged exposure to platforms such as Instagram and YouTube contributed to deteriorating mental health, including anxiety, depression, and body image issues.
What distinguishes this case from earlier litigation is its focus: rather than targeting harmful content, the lawsuit examined the structural design of the platforms themselves, questioning whether their core features incentivise excessive use.
What did the jury decide?
The jury in the Los Angeles trial found both Meta and Google negligent and awarded damages of approximately $6 million. Meta was held liable for 70% of the damages, or $4.2 million, with Google responsible for the remaining 30%, roughly $1.8 million. More importantly, the verdict recognised that certain platform features, including algorithm-driven feeds and continuous content delivery mechanisms, can contribute to patterns of compulsive use.
This acknowledgement marks a shift in legal reasoning, as it links user harm not merely to content exposure but to the way platforms are engineered.
Google and Meta disagreed with the verdict and are planning to appeal.
This is not the only case looming over Meta. A jury in New Mexico found the company liable for failing to protect children from explicit content, solicitation and human trafficking, and Meta was ordered to pay $375 million in civil penalties.
Why is the case being compared to tobacco litigation?
The comparison arises from the framing of social media platforms as potentially addictive products. Much like litigation against tobacco companies focused on whether firms were aware of and profited from the addictive nature of their products, this case raises questions about whether technology companies knowingly designed systems that maximise user dependence.
If this analogy gains legal traction, it could pave the way for treating social media platforms as products with foreseeable risks, rather than as neutral conduits of information.
How does this affect existing legal protections?
Technology companies in the United States have long relied on Section 230 of the Communications Decency Act, which shields them from liability for user-generated content. However, this case sidesteps that protection by shifting the focus to product design rather than content moderation.
This distinction is significant because it opens a new legal pathway for holding platforms accountable, even within the existing regulatory framework.
What is India’s position on social media addiction?
India has not yet witnessed a comparable judicial verdict, but the issue of social media use among minors is gaining policy attention. The Union government has increasingly framed excessive screen time and online exposure as concerns linked to child safety and mental well-being.
Discussions around age-gating mechanisms, parental controls, and platform accountability are gradually moving from advisory conversations to potential regulatory action.
At the state level, both Karnataka and Andhra Pradesh have proposed restrictions aimed at limiting minors’ access to social media platforms. Karnataka has suggested curbs for users below the age of 16, while Andhra Pradesh has proposed restrictions for children under 13, with the possibility of extending them further.
These proposals are driven by growing concerns around digital addiction, cyberbullying, and the psychological impact of prolonged social media use among children.
What is the broader takeaway?
The ruling against Meta and Google signals a shift in how courts may approach digital platforms. The focus is moving beyond user behaviour to examine whether platforms are intentionally designed to maximise engagement in ways that may lead to harm.
If this line of reasoning gains momentum, it could reshape not only legal accountability but also the fundamental design principles of social media platforms globally.