A Los Angeles jury found Wednesday that Meta’s Instagram and Google’s YouTube deliberately designed features to hook children, ordering the tech giants to pay $6 million in combined damages in a landmark verdict that could reshape thousands of pending lawsuits against Big Tech platforms.
Landmark Verdict Bypasses Legal Protections
The case centered on 20-year-old Kaley, who began using YouTube at age 6 and Instagram at age 9. By age 10, she had uploaded over 200 YouTube videos. By 15, she maintained 15 separate Instagram accounts and spent as much as 16 hours daily scrolling through content. The jury awarded $3 million in compensatory damages and $3 million in punitive damages, with Meta liable for $4.2 million and Google for $1.8 million.
The verdict sidestepped Section 230 of the Communications Decency Act of 1996, which traditionally shields tech companies from liability over user content. Attorneys targeted product design features like infinite scroll and autoplay rather than the content itself. Clay Calvert of the American Enterprise Institute explained that the distinction matters because it frames the question as whether platform design constitutes a defective product causing harm, not merely objectionable content users encounter.
Thousands of Cases Could Follow
More than 3,000 similar lawsuits against Meta, YouTube, Snapchat, and TikTok currently await resolution in California courts. Both Snapchat and TikTok settled with Kaley before the trial began. The Tech Oversight Project’s Sacha Haworth called the outcome an earthquake shaking Big Tech’s business model to its core, particularly after testimony from executives like Mark Zuckerberg revealed the industry’s approach to user safety.
What This Means
At least half of American teenagers use YouTube or Instagram daily, according to Pew Research Center data. Legal experts consider this verdict a breakthrough, validating that platform design can constitute a defective product causing measurable harm. One day earlier, a separate New Mexico jury ruled Meta failed to protect children from sexual predators and misled users about platform safety, finding the company violated consumer protection laws. For families concerned about children’s technology use, these verdicts signal growing accountability for companies profiting from features designed to maximize engagement regardless of developmental impact on young users.