Meta CEO Mark Zuckerberg will testify on Wednesday in a California social media trial that questions whether features on his company’s platforms harm children or foster addiction.
Zuckerberg will testify in front of a jury and before bereaved parents whose children took their own lives after what the families say was harmful social media use. At a United States Congress hearing, he previously apologised to families whose lives had been upended by what they believe was social media use.
Zuckerberg could face questions about Instagram’s algorithm and in-app features that the plaintiffs argue are engineered to keep young users hooked.
The lawsuit comes as European countries are considering age-related restrictions on Meta, Google and other Big Tech social media platforms as a way of protecting children from harmful online content.
What is the trial about?
KGM, a 20-year-old woman identified only by her initials, claims in the lawsuit that using Meta and Google social media platforms as a child fostered an addiction to the technology, which made her depressive and suicidal thoughts worse.
The lawsuit argues the companies made deliberate design choices similar to techniques used at casinos to make their platforms more addictive to children in order to boost profits.
The case was initially filed against four companies: Zuckerberg’s Meta, Google’s YouTube, ByteDance’s TikTok and Snap Inc., the parent company of Snapchat. TikTok and Snap Inc. reportedly reached settlements before the trial began.
This case is what’s known in the United States as a “bellwether trial,” meaning the outcome could shape how thousands of similar lawsuits against social media companies play out.
Euronews Next contacted KGM’s lawyer to get a statement about what the legal team is expecting from Zuckerberg’s testimony but did not receive an immediate reply.
A Meta spokesperson told the Associated Press that the company strongly disagrees with the allegations in the lawsuit and said they are “confident the evidence will show our longstanding commitment to supporting young people.” Euronews Next has also reached out to Google.
Meta’s attorney Paul Schmidt questioned in his opening statement last week whether Instagram is really to blame for KGM’s mental health struggles, pointing instead to documents that showed a turbulent home life. Schmidt argued that KGM turned to social media platforms as a coping mechanism for those struggles.
Meta and Google asked the Los Angeles Superior Court to dismiss KGM’s case last November, but the court ruled against them, allowing the trial to proceed.
Meta and Google have also argued they are protected under American law from being held legally responsible for third-party content that a user receives on a social media platform or website, according to a court file from November.
Courts interpret this law, known as Section 230 of the Communications Decency Act, as a way to “protect online service providers like social media companies from lawsuits based on their decisions to transmit or take down user-generated content,” according to the US Congress.
In court filings, Meta argued that design elements such as “infinite scroll” cannot be blamed for KGM’s use of Instagram because she chose to continue viewing content.
However, the judge said there is sufficient evidence for a jury to consider whether Instagram’s engagement-driven features contributed to her mental health struggles.
KGM accused Google of exposing her to harmful content through YouTube’s recommendation system, comment sections and autoplay feature. She also alleges the platform fails to impose adequate age restrictions.
Who else has testified?
Adam Mosseri, head of Meta’s Instagram, testified last week during the trial that he disagrees with the idea that people can be clinically addicted to social media platforms.
Mosseri said Instagram and Meta work to protect young people using the social media site, adding that it’s “not good for the company, over the long run, to make decisions that profit for us but are poor for people’s well-being.”
Mosseri said it’s important that this case differentiate between clinical addiction and what he considers problematic use, which is someone who spends “more time on Instagram than they feel good about.”
Instagram offers several features designed to protect children from harmful content, such as private teen accounts, safety notices, content filtering and parental controls.
However, two-thirds of Meta’s safety tools on its platforms, including Instagram, were found to be ineffective, according to a 2025 study from Meta whistleblower Arturo Béjar and academics from two American universities.
That meant teen accounts received recommendations for sexual content, brief displays of nudity and content on themes of self-harm, self-injury and body image.
Meta called the report “misleading, dangerously speculative” at the time, according to the Associated Press.