Two Landmark Verdicts Against Meta and YouTube — and What They Mean for You: Is This Big Tech's Tobacco Moment, or Will the Industry Simply Appeal and Rely on Its Might?
Within a mere 48 hours this week, two American juries rendered verdicts that legal experts are already deeming a pivotal moment for the global technology industry. American juries have found Meta, the parent company of Facebook, Instagram, and WhatsApp, and Google’s YouTube liable for intentionally designing their platforms to addict children. The comparison being drawn by commentators on both sides of the Atlantic is unmistakable: this week is the moment Big Tech faces its Big Tobacco reckoning.
At Lawdit Solicitors, we believe these verdicts are not simply a matter for American lawyers and American courts. They carry direct and serious implications for UK businesses, parents, and consumers. Here is what happened, why it matters, and—critically—what action you should be considering right now.
The New Mexico Department of Justice's press release (https://nmdoj.gov/press-release/new-mexico-department-of-justice-wins-landmark-verdict-against-meta/) provides excellent commentary on this issue.
Case One: KGM v Meta and YouTube — Los Angeles, 25 March 2026
The first case was heard at the Los Angeles County Superior Court before Judge Carolyn B. Kuhl. The plaintiff, a 20-year-old woman referred to throughout proceedings only by her initials KGM – and in the courtroom as ‘Kaley’ – brought a personal injury claim against Meta and Google, alleging that their platforms had caused her severe depression, anxiety and addiction through deliberate design choices to which she was exposed from childhood.
Kaley told the court she began using YouTube at the age of six and Instagram at nine. By the time she left primary school, she had posted 284 videos online. She described spending her entire day on social media, withdrawing from her family, and developing anxiety and depression at the age of ten—conditions formally diagnosed shortly afterwards.
The Jury’s Findings
After more than 44 hours of deliberation over nine days, the jury found that Meta and YouTube had been negligent in the design and operation of their platforms, and that this negligence was a substantial factor in causing harm to the plaintiff. Jurors also found that the companies had failed to adequately warn users about the dangers of using their services. In a further and highly significant step, jurors recommended $3 million in punitive damages — a finding that requires them to have concluded that Meta and Google had acted with malice, oppression, or fraud.
The total damages award stands at $6 million: $3 million in compensatory damages and $3 million in punitive damages. Meta bears 70% of that liability, with YouTube responsible for the remainder. The judge has yet to enter final judgment, and both companies have stated their intention to appeal.
How the Lawyers Got Around Section 230
For decades, US technology companies have sheltered behind Section 230 of the Communications Decency Act 1996, which provides that internet platforms cannot be held liable for content posted by their users. This legal shield has made it extraordinarily difficult to bring successful claims against social media companies for harms caused on their platforms.
The plaintiff’s legal team adopted a fundamentally different strategy. Rather than attacking the content appearing on the platforms, they attacked the architecture of the platforms themselves—the design. Features such as infinite scrolling, autoplay video, algorithmic recommendation engines, beauty filters, and push notification systems were presented as deliberately engineered tools of addiction, designed to maximise engagement regardless of psychological cost to the user. By framing the case as one of defective product design rather than harmful content, the lawyers were able to sidestep Section 230 entirely.
Internal Meta documents proved devastating in court. One memorandum, read aloud to the jury, stated: ‘If we want to win big with teens, we must bring them in as tweens.’ Another showed that 11-year-olds were four times more likely to keep returning to Instagram than to competing apps — despite the platform’s own minimum age requirement of 13. A separate internal study, referred to as ‘Project Myst’, allegedly found that children who had experienced adverse effects were among the most likely to become addicted to Instagram and that parents were powerless to intervene.
Mark Zuckerberg himself provided evidence for Meta’s defence, telling the court that he was not attempting to maximise the time users spent on the platform and acknowledging that the company wished it could have introduced safety tools sooner. The jury plainly did not find this persuasive.
Case Two: New Mexico Attorney General v Meta — Santa Fe, 24 March 2026
Just one day before the Los Angeles verdict, a jury in Santa Fe, New Mexico, concluded a separate trial brought by New Mexico’s Attorney General, Raúl Torrez, against Meta. This case was distinct in nature: it was a state enforcement action rather than a private civil claim, and it focused not on addiction but on the failure to protect children from online predators and sexual exploitation.
The New Mexico Attorney General’s office conducted an undercover operation, creating a fake social media profile of a 13-year-old girl. Adult users seeking to exploit the apparent minor quickly inundated the profile. The state argued that Meta had made false and misleading statements to consumers about the safety of its platforms and had engaged in what the jury found to be ‘unconscionable’ trade practices that exploited children’s vulnerability and inexperience.
The Verdict
The New Mexico jury found Meta liable on all counts under the state’s Unfair Practices Act and identified thousands of individual violations. The total penalty imposed was $375 million. Attorney General Torrez characterised it as ‘a historic victory for every child and family who has suffered due to Meta’s decision to prioritise profits over children’s safety.’ Meta has indicated it will appeal.
This verdict is significant for a different reason to the Los Angeles case. It demonstrates that social media companies face exposure not only in private litigation brought by injured individuals but also from state regulatory and enforcement action framed around consumer protection laws. The $375 million penalty dwarfs the Los Angeles award in financial terms — and the case’s second phase, in which a judge will determine whether Meta created a public nuisance, is scheduled for May 2026. That phase could result in further substantial remedies.
The Wider Legal Landscape
These two verdicts do not stand in isolation. TikTok and Snap, which were named as co-defendants in the Los Angeles proceedings, reached settlements before the matter went to trial — a detail that, as one commentator noted, tells you a great deal about how seriously the technology industry views the litigation risk these cases represent.
There are now more than 3,000 similar lawsuits pending in courts across the United States, involving claims ranging from eating disorders and self-harm to the deaths of teenagers by suicide. The Los Angeles verdict functions as a “bellwether”—a test case whose outcome provides a strong signal to all parties and the courts about how juries respond to this evidence when they see it unfiltered. The implications for settlement negotiations across those 3,000 pending cases are significant.
Many legal commentators have explicitly drawn the comparison to the tobacco litigation of the 1990s, which ultimately forced an entire industry to accept liability, change its practices, and pay billions in damages and public health funding. Whether social media litigation follows the same trajectory remains to be seen — but the trajectory has begun.
What This Means for the United Kingdom
UK clients and businesses reading this report should not assume that American verdicts are a distant concern. Here, we are already experiencing the legal, regulatory, and political ramifications.
The Regulatory Picture
The Online Safety Act 2023 gave Ofcom significant new powers to regulate online platforms operating in the UK, with a particular focus on protecting children from harm. Ofcom now has genuine regulatory teeth: the ability to impose fines of up to 10% of global annual turnover and, ultimately, to block access to non-compliant services in the UK. The American verdicts will inevitably inform Ofcom’s enforcement priorities and its interpretation of what constitutes adequate protection for child users.
UK ministers have already responded to the verdicts. Prime Minister Sir Keir Starmer described the rulings as reflecting a broader shift in public mood, stating clearly: ‘It’s not if things are going to change; things are going to change.’ The government’s consultation on potential restrictions on social media use for under-16s closes in late May 2026, with a response expected before the end of July. Australia moved first, banning under-16s from major social platforms in December 2025; the UK is watching closely.
The Consumer Law Angle
UK legal experts are advancing an analysis in the wake of these verdicts that is perhaps most immediately relevant for UK consumers and claimants. The Consumer Rights Act 2015 contains product liability provisions that — unlike the Online Safety Act — focus on product defects rather than content. Crucially, under these provisions consumers may be able to seek remedies, including damages, without proving intent on the part of the platform. The American plaintiffs’ lawyers’ strategy of framing addictive design as a defective product has clear parallels in UK consumer law.
The Digital Markets, Competition and Consumers Act 2024 provides additional regulatory levers. UK competition authorities, including the Competition and Markets Authority, have shown a growing appetite for taking on large technology companies — as demonstrated by the CMA’s successful action requiring Meta to divest GIPHY. The landscape for challenging Big Tech in the UK has never been more favourable.
Lawdit’s View
For platform operators, the immediate priority is a candid review of your product’s design features. Infinite scroll, autoplay, algorithmic personalisation, push notifications, reward mechanics — any feature specifically designed to maximise engagement or time on the platform should be examined through the lens of what these verdicts have established: that deliberate design for addiction, particularly where minors are concerned, can constitute a defective product carrying liability. Equally critical is reviewing what your internal documents say. The Meta internal memoranda were among the most damaging evidence in both trials. If your business has internal communications that describe user engagement in terms that could be characterised as exploitative, you need to know that now — and take appropriate steps.
These verdicts give parents and individuals the opportunity to pursue claims that previously appeared unlikely to succeed. If you or your child has suffered significant, documented mental health harm that you believe is connected to the deliberate design of a social media platform, the legal landscape has shifted materially in your favour. We would encourage you to seek specialist legal advice.
For all businesses with a digital footprint, the lesson from these cases is one of transparency and documentation. Companies that can demonstrate genuine, proactive efforts to protect their users—particularly younger and more vulnerable users—and that maintain clear internal records of those efforts are in a substantially stronger position than those relying on historical assumptions about immunity.
Conclusion
What happened in two American courtrooms this week is not a local legal development. It is the opening of a new chapter in the relationship between technology, law, and the protection of children and consumers. The dam, as one commentator put it, is beginning to break.
The UK’s regulatory framework is already in place. Consumer law provides additional routes to redress. Political pressure is intensifying. And the evidence — internal documents, expert testimony, and now two jury verdicts — is in the public domain. For any business or individual with a stake in the digital economy, the time to take these developments seriously is now.
At Lawdit Solicitors, our commercial and technology law team has extensive experience advising businesses on regulatory compliance, digital product liability, and consumer law. If you have questions arising from these verdicts — whether you are a platform operator concerned about your legal exposure or an individual seeking to understand your rights — we are here to help.
Lawdit Solicitors
Founded 2001 | Commercial Law & Intellectual Property | lawdit.co.uk | trademarkroom.com

