At Blaszkow Legal, PLLC, we know that in recent years a growing chorus of voices has advanced the proposition that social media is bad for our health in any number of ways. On March 25, 2026, a Los Angeles Superior Court jury lent credence to that point of view, finding Meta (Instagram) and Google (YouTube) negligent for intentionally engineering their platforms to addict children, and awarding $3 million in damages to a 20-year-old California woman who said the platforms destroyed her childhood.
The Plaintiff, identified as Kaley G.M., is a 20-year-old from Chico, California, who began using YouTube at age 6 and Instagram at age 9. The seven-week trial began in late January 2026. Co-Defendants TikTok and Snapchat settled with Kaley for undisclosed sums before the trial began.
The case has focused on several critical legal and factual questions:
Negligent Product Design – Kaley’s attorneys alleged that Instagram and YouTube incorporated features specifically designed to hook young users, including autoplay functionality, infinite scroll, push notifications, and algorithmic content delivery calibrated to maximize engagement. Jurors heard testimony from addiction experts, therapists, and platform engineers. Meta’s CEO Mark Zuckerberg defended Instagram’s safety record while acknowledging the difficulty of keeping children off the app.
Failure to Warn – The plaintiff argued both companies knew their platforms could cause mental health harm to children but failed to warn users or parents of these risks.
Causation vs. Preexisting Conditions – Meta and Google contended Kaley’s depression and suicidal ideation stemmed from her “fractious home life” and the COVID-19 pandemic, not from social media. Defense attorneys emphasized that “not one of her therapists identified social media as the cause” and noted there is no formal DSM diagnosis for “social media addiction.”
Section 230 Defense – A pivotal legal battle centered on whether Section 230 of the Communications Decency Act (1996) shields social media platforms from liability. The trial judge issued an early ruling that Section 230 does not protect companies when the claims focus on their own design choices rather than on third-party content.
YouTube as a “Social Media Platform” – Google argued that YouTube is fundamentally different from Instagram, more like television than a social network, and therefore not subject to the same addiction claims; its counsel described YouTube as “a toy a child liked and then put down.” Kaley’s team called it “a gateway” to her social media addiction, and plaintiff’s attorney Mark Lanier demolished the television analogy with a simple one of his own: “Substitute ‘YouTube’ for ‘methamphetamine.’ Ask yourselves: Could anyone suffering from addiction just ‘lose interest’? Or does the gateway drug lead to the next, and the next, until you can’t stop?”
In a memorable moment in the trial, Lanier wielded an actual hammer, telling jurors: “When you have a hammer, everything looks like a nail. For Meta and Google, children looked like revenue.” The prop underscored his argument that the companies’ engagement-maximizing tools were weapons, not neutral features.
Lanier’s courtroom persona is deliberately anti-corporate. He wears cowboy boots, speaks in a measured Southern cadence, and addresses jurors as “folks” rather than “ladies and gentlemen.” His direct examinations were conversational; his cross-examinations were deceptively gentle before landing devastating blows.
Designed to Addict
Lanier’s central thesis was that Instagram and YouTube weren’t accidentally addictive—they were engineered that way. He walked jurors through internal documents showing:
● Algorithms calibrated to maximize “time on device”
● A/B testing of notification timing to interrupt sleep
● Features like autoplay and infinite scroll deliberately removing “stopping cues”
Lanier told the jury: “They didn’t stumble into addiction. They studied it. They hired behavioral psychologists. They built it in.” He emphasized that both companies conducted internal research showing mental health harms to teens, then buried it. He quoted a 2020 Meta slide, “We make body image issues worse for 1 in 3 teen girls,” and asked: “If you know your product harms children, what is your duty? Warn? Fix? Or keep selling?”
The jury deliberated for more than 40 hours across nine days before returning their verdict:
| Defendant | Liability Finding | Damages Allocation |
|---|---|---|
| Meta (Instagram) | Negligent; substantial factor in harm | 70% ($2.1 million) |
| Google (YouTube) | Negligent; substantial factor in harm | 30% ($900,000) |
| **Total Award** | | **$3 million** |
Kaley remained stoic as the verdict was read in court; observers wept silently despite the judge’s warnings against courtroom reactions. Meta and Google have vowed to appeal, but for now the legal shield that protected Silicon Valley for three decades has cracked. As Lanier asked jurors in closing: “What is a lost childhood worth?” For Kaley G.M., the answer was $3 million. For the tech industry, the cost may be far higher.
This verdict potentially represents a seismic shift in tech accountability litigation:
Bellwether Effect
This was the first-ever jury trial holding social media companies accountable for youth addiction. The case was selected as a bellwether from hundreds of consolidated lawsuits in California state court. There are hundreds more cases proceeding in federal court as well, with the first federal trial set for June 2026 in San Francisco.
Section 230 Erosion
The verdict confirms that Section 230 does not shield platforms from design-defect claims, opening the door for thousands of pending cases to proceed past motions to dismiss. Anticipating the Section 230 defense, Lanier drew a sharp line: “This case isn’t about what someone posted. It’s about what Meta and Google built. Section 230 protects speech. It doesn’t protect defective products.”
Damages Benchmark
While $3 million is modest compared to the $375 million Meta was ordered to pay in a separate New Mexico child-safety verdict announced just one day earlier, experts say this award will set the baseline for individual plaintiff cases.
Internal Documents Exposed
The trial made public tens of thousands of pages of internal Meta and Google documents showing the companies studied and intentionally targeted children’s engagement patterns. Plaintiff’s attorney Mark Lanier told jurors: “It’s given you exposure that the world hasn’t had.”
Insurance Coverage Challenges
A recent Delaware court ruling cleared Meta’s insurers of responsibility for child-harm lawsuits, meaning Meta and other tech companies may face billions in direct liability without insurance coverage.
Regulatory Momentum
The verdict arrives amid heightened scrutiny from state attorneys general. New Mexico’s successful $375 million case and now California’s jury verdict will likely accelerate legislative efforts at both state and federal levels.
Implications in Virginia
Virginia is already out in front with the Virginia Consumer Data Protection Act’s (VCDPA) one-hour-per-day limit for users under 16, enforceable by the Attorney General with penalties of up to $7,500 per violation, contingent on surviving the NetChoice challenge. A federal judge has preliminarily enjoined the separate time-limit law on First Amendment grounds as overbroad and not narrowly tailored, but that injunction is on appeal. A jury finding that design choices, not “speech,” are defective and addictive will be cited heavily by the Attorney General to justify aggressive VCDPA enforcement and to defend these laws in the Fourth Circuit and, eventually, the Supreme Court of the United States.
Virginia Attorney General Jay Jones is already positioning himself as a national player on youth social media issues, publicly committing to “fully enforce” minors’ time limit provisions and framing social media harms as a consumer protection problem for Virginia families. The LA verdict creates political cover for tightening Virginia’s regime: broader definitions of “social media platform,” clearer duties to deploy age gating, and possibly a private right of action tied to specific design features (autoplay, infinite scroll, push notification cadence).
This article is for informational purposes only and does not constitute legal advice. These claims involve severe potential harms, including anxiety, depression, eating disorders (anorexia, bulimia, ARFID), self-harm (cutting, burning), suicide attempts or hospitalizations, substance abuse triggered or worsened by social media, academic collapse, and sexual exploitation or sextortion. If you believe you have been harmed by social media platforms, seek prompt legal advice to protect your rights. Consulting with a Springfield, VA personal injury lawyer can provide additional clarity on how these claims are evaluated and what steps may be available moving forward.