Meta Seeks to Bar Mentions of Mental Health—and Zuckerberg’s Harvard Past—From Child Safety Trial

As Meta heads to trial in New Mexico for allegedly failing to protect minors from sexual exploitation, the company is aggressively pushing to have certain information excluded from court proceedings.
The company asked the judge to exclude certain studies and research articles on social media and youth mental health; any mention of a recent high-profile case involving teenage suicide and social media content; and any references to Meta’s financial resources, employees’ personal activities, and Mark Zuckerberg’s time as a student at Harvard University.
Meta’s requests to exclude information, known as motions in limine, are an integral part of pretrial proceedings, in which a party can ask a judge to determine in advance what evidence or arguments are admissible in court. The goal is to ensure that the jury weighs facts rather than irrelevant or prejudicial information, and that the defendant receives a fair trial.
Meta emphasized in pretrial motions that the only question the jury should consider is whether Meta violated New Mexico’s Unfair Practices Act through its alleged handling of child safety and youth mental health, and that other matters, such as Meta’s alleged election interference, misinformation, or privacy violations, should not be weighed.
But some of the requests seem unusually aggressive, two lawyers told WIRED, including demands that the court bar any mention of the company’s AI chatbots, and the broad reputational protection that Meta seeks. WIRED was able to review Meta’s in limine filings through a public records request to the New Mexico courts.
The motions are part of a landmark case filed by New Mexico Attorney General Raúl Torrez in late 2023. The state alleges that Meta failed to protect minors from online solicitation, human trafficking, and sexual abuse on its platforms. It claims the company proactively served pornographic content to minors on its apps and failed to take adequate child safety measures.
The state’s complaint details how its investigators were able to easily create fake Facebook and Instagram accounts posing as underage girls, and how those accounts quickly received explicit messages and were shown algorithmically amplified pornographic content. In another test case cited in the complaint, investigators created a fake account posing as a mother seeking to traffic her young daughter. According to the complaint, Meta failed to report suggestive remarks that other users allegedly left on the account’s posts, and did not close some of the accounts flagged as violating Meta’s policies.
Meta spokesperson Aaron Simpson told WIRED via email that the company has, for more than a decade, listened to parents, experts, and law enforcement, and conducted extensive research to “understand the most important issues” and “use that information to make meaningful changes, like introducing teen accounts with built-in protections and providing parents with tools to manage their teens’ experiences.”
“While New Mexico makes sensationalist, irrelevant and distracting arguments, we strive to demonstrate our long-standing commitment to supporting young people,” Simpson said. “We are proud of the progress we have made and we are always working to do better.”
In its motions ahead of the New Mexico trial, Meta asked the court to exclude any reference to a public advisory issued by Vivek Murthy, the former US surgeon general, on social media and youth mental health. The company also asked the court to exclude an opinion piece by Murthy and Murthy’s calls for social media to carry a warning label. Meta contends that the former surgeon general’s statements treat social media companies as a monolith and are “irrelevant, inadmissible and unduly prejudicial hearsay.”


