In a high-stakes trial unfolding in Los Angeles Superior Court, Mark Zuckerberg, CEO of Meta Platforms, has taken the stand in a lawsuit that could reshape how social media giants are held accountable for youth safety and online harm. The case centers on allegations that Meta’s platforms, including Instagram, were designed in ways that deliberately promote addictive use among children and teens, contributing to serious mental health issues such as anxiety, depression and suicidal ideation. The trial, which also includes Google’s YouTube as a co-defendant, is widely regarded as one of the most consequential legal tests confronting Big Tech in recent years. A verdict here could influence thousands of similar lawsuits nationwide.

The civil suit was filed by a plaintiff identified as “KGM,” now 20, who alleges that using platforms like Instagram from a young age fostered compulsive behaviour and exacerbated her mental health struggles. Her lawyers argue that Meta’s engagement-driven features, such as algorithmic recommendation systems and infinite scrolling, are engineered to keep young users on the platform in ways that mirror addictive design practices used in other industries. Meta, for its part, vigorously denies these claims and maintains that the company has no intention to addict children or to profit from youth vulnerability. In court testimony, Zuckerberg emphasised that Meta does not allow children under 13 on Instagram, though he acknowledged it is “very difficult” to verify ages and enforce this rule perfectly.
Inside the courtroom: Zuckerberg’s testimony and defense strategy
During his testimony, Zuckerberg faced intense questioning about internal policies, platform design goals and historical strategy decisions. Plaintiffs’ attorneys challenged him over past internal documents suggesting that Meta once tracked metrics related to user time spent on its apps, a key indicator critics use to argue the company prioritised engagement over safety. While Zuckerberg insisted that Meta has shifted away from those metrics in recent years, he stopped short of admitting that the platforms were engineered to intentionally create addictive behaviours.

Another flashpoint in the courtroom was Instagram’s age-restriction enforcement. Zuckerberg reiterated that under-13 use is prohibited but also conceded that age verification is imperfect and that many young users misrepresent their birth years to gain access. Plaintiffs seized on this admission, arguing that Meta has known about underage engagement for years yet did not do enough to protect minors. Meta’s legal team countered by pointing to new safety features and protections rolled out in recent years, while asserting that external factors beyond the company’s control shape how and why young people interact with social media.

Tech Rivals Unite? Zuckerberg Reached Out to Apple’s Tim Cook Over Kids’ Wellbeing

During testimony in the high-profile social media safety trial, Zuckerberg revealed that he personally reached out to Apple CEO Tim Cook to discuss the “wellbeing of teens and kids” in the digital ecosystem. The Meta chief said the conversation was part of broader efforts to explore how major tech platforms can work together to improve online safety standards, particularly for younger users. Zuckerberg framed the outreach as evidence that youth mental health and responsible product design have become cross-industry concerns, not just competitive talking points between Silicon Valley rivals.
The disclosure is notable given the often tense relationship between Meta and Apple, especially after Apple’s privacy changes disrupted Meta’s advertising business. By invoking his dialogue with Cook, Zuckerberg appeared to signal that safeguarding minors transcends corporate rivalry. The testimony comes amid mounting scrutiny over how platforms like Instagram are designed and whether engagement-driven features disproportionately affect teens. His remarks suggest that behind the scenes, tech leaders may be engaging in conversations about shared accountability, even as their companies face growing legal and regulatory pressure in courtrooms and legislatures alike.
Social media addiction, design and big tech accountability
This trial is not just a test of one company’s practices; it reflects a broader legal and cultural moment in which society is questioning the role of social media in children’s lives. Similar lawsuits have been filed against other platforms, and although companies like TikTok and Snap Inc. reached early settlements, Meta’s case has moved forward, making Zuckerberg’s testimony especially pivotal. Legal experts describe the case as potentially precedent-setting, with far-reaching implications for how digital platforms must consider safety in their design and business decisions.

Significantly, the trial parallels debates currently unfolding in governments around the world about regulating online spaces. Some lawmakers have called for stricter age controls, algorithmic transparency and safety mandates for social media companies. In the US, discussions about reforming Section 230 of the Communications Decency Act, the law that broadly shields online platforms from liability for user-generated content, have gained traction in light of cases like this one. A ruling against Meta could embolden calls for updated regulatory frameworks, while a defence victory might reinforce existing legal protections.
Why Mark Zuckerberg’s trial matters
The outcome of this trial could have multiple ripple effects. A decision against Meta could open the door for similar liability claims against other tech giants. Governments may feel increased urgency to legislate online child safety and enforce stricter age verification. Tech companies might re-examine features such as recommendation algorithms, engagement metrics and design choices linked to youth usage.

The trial has amplified public discourse on the mental health impacts of social media, especially for vulnerable populations like teens and pre-teens. Critics liken this case to the Big Tobacco lawsuits of the past, where corporate design and marketing practices were scrutinised for contributing to widespread harm. If the jury finds Meta accountable or if significant evidence alters public perception, the digital landscape could face new standards of corporate responsibility and safety compliance.
Meta CEO Mark Zuckerberg arrives for a landmark trial over whether social media platforms deliberately addict and harm children, Wednesday, Feb. 18, 2026, in Los Angeles. (AP Photo/Ryan Sun)
A landmark trial in Los Angeles is testing whether Meta’s social media platforms intentionally foster addictive use and harm children’s mental health. Mark Zuckerberg testified that Meta prohibits under-13 users and has moved away from goals tied to maximising screen time, but he acknowledged that age enforcement is challenging, and the company disputes the core allegations. The lawsuit, and thousands like it, could reshape Big Tech liability, regulation and product design across the global digital ecosystem. The case is seen as a bellwether for future legal actions and regulatory reforms related to youth safety online.