Internal Docs Show Instagram Tracked Teen Engagement as Growth Target Amid Legal Storm
Meta faces mounting litigation as evidence reveals the company monitored rising teen usage metrics while courts weigh liability for youth mental health harms.
Internal Meta documents revealed in California court proceedings show Instagram tracked daily user engagement climbing from 40 minutes in 2023 to 46 minutes in 2026, with executives treating teen screen time as a key performance indicator despite public commitments to youth safety. The disclosures, emerging from testimony in K.G.M. v. Meta Platforms et al. in Los Angeles County Superior Court, add evidentiary weight to claims that the company prioritized growth metrics over the protection of minors—a charge now central to more than 2,300 pending cases across federal and state jurisdictions.
Testimony Surfaces Internal Growth Priorities
Mark Zuckerberg’s February testimony before a Los Angeles jury exposed internal communications showing that “the top priority for the company in the first half of 2017 is teens,” according to an internal email from a former Instagram product manager reported by TechCrunch. Plaintiffs’ lawyers argued the company set goals to increase “teen time spent” on the platform even as it was aware of 4 million underage users—representing 30% of all 10- to 12-year-olds in the U.S.—as early as 2015, according to NPR.
Plaintiffs’ attorneys presented evidence that Instagram did not require birthdate entry until August 2021, despite knowing millions of children under 13 were accessing the platform. A former advisor to Zuckerberg described the age requirements as “basically unenforceable” in internal correspondence, according to Los Angeles Today. Meta’s defense distinguished between tracking usage “milestones” and setting explicit growth “goals,” though plaintiffs’ attorneys characterized this as semantic evasion.
“Meta clearly knew that youth safety was not its corporate priority… that youth safety was less important than growth and engagement.”
— Donald Migliori, attorney representing New Mexico
Product Liability Strategy Bypasses Section 230
The Los Angeles case represents the first of approximately two dozen bellwether trials selected from more than 2,325 claims consolidated in federal multidistrict litigation, according to Motley Rice. Plaintiffs frame their allegations as product liability claims targeting algorithmic design—infinite scroll, autoplay, notification patterns—rather than content moderation, a strategy intended to circumvent Section 230 protections. Judge Carolyn B. Kuhl’s September 2025 ruling allowed expert testimony on addiction mechanisms and platform design to proceed, establishing that social media apps can be evaluated as defective products under tort law.
Snap and TikTok settled their portions of the litigation before trial commenced in January 2026. Meta and YouTube (Alphabet) remain as defendants. The plaintiff, identified as K.G.M. or “Kaley,” alleges Instagram use beginning before age 10 led to depression, body dysmorphia, and suicidal ideation. Fortune reported that nearly all photos in a 35-foot courtroom display of her Instagram posts used cosmetic filters, which expert witnesses linked to self-image disorders in adolescent users.
Multistate Regulatory Pressure Intensifies
More than 40 state attorneys general have filed parallel actions alleging Meta violated consumer protection statutes and caused public health harms requiring reimbursement for school counseling and mental health infrastructure costs. Massachusetts Attorney General Andrea Campbell alleges the company “deliberately exploited young users’ vulnerabilities for profit” through features including infinite scroll and variable reward mechanisms similar to slot machines.
A separate trial in New Mexico focuses on child sexual exploitation, with state attorneys arguing Meta knew of sextortion risks and debated making teen accounts private by default but rejected the measure. The Social Media Victims Law Center cited internal estimates showing private-by-default settings would have prevented 5.4 million unwanted direct-message interactions daily. The company’s growth team “prioritized profit over safety,” according to legal filings.
Section 230 of the Communications Decency Act traditionally shields platforms from liability for user-generated content. Plaintiffs in these cases argue design features—algorithms, notifications, engagement mechanics—constitute product defects distinct from content, a legal theory federal courts have preliminarily accepted. Successful verdicts could establish precedent extending product liability doctrine to software architecture.
Market Implications and Investor Exposure
Meta acknowledged during its earnings call that it could face “material financial losses” in 2026 from litigation outcomes. The company disputes causation, arguing on Meta’s official blog that teen mental health is “deeply complex and multifaceted” and that narrowing challenges to a single factor “ignores the scientific research and the many stressors impacting young people today.” Meta shares (NASDAQ: META) traded at $765.60 as of mid-February, down from a 52-week high of $796.25, with regulatory overhang cited by analysts as a discount factor.
Shareholder resolutions demanding third-party child safety audits have gained institutional support, with ISS and Glass Lewis both recommending votes in favor. Regulatory risk extends beyond U.S. borders: the European Union’s Digital Services Act subjects Facebook and Instagram to potential fines up to 6% of global revenue for failure to mitigate disinformation and harms to minors, with formal proceedings already underway.
- Internal documents show Instagram tracked teen engagement growth from 40 to 46 minutes daily between 2023 and 2026
- Over 2,300 cases consolidated in federal MDL; first bellwether trials underway in California and New Mexico
- Product liability framing bypasses Section 230 by targeting design features rather than content moderation
- Meta faces potential material financial losses and EU fines up to 6% of global revenue
What to Watch
Verdict timing in the Los Angeles case will determine settlement leverage for remaining bellwether trials scheduled through 2026. If juries find Meta liable and award significant damages, the company faces pressure to negotiate global resolution of the MDL before additional cases reach trial. Regulatory developments in the EU—particularly enforcement actions under the Digital Services Act—could force product redesigns affecting billions of users globally. Congressional scrutiny of AI chatbot safety features, following reports of inappropriate interactions with minors, may accelerate federal legislation that would supersede state-level enforcement. Investors should monitor quarterly disclosures for litigation reserve adjustments and any language indicating settlement negotiations have commenced.