Zuckerberg considered changing how Meta studies social issues after research got it in trouble

One day after The Wall Street Journal published a bombshell story about Instagram and Meta’s dismal internal findings on teenage girls’ mental health, CEO Mark Zuckerberg questioned whether Meta should change the way it researched the potential harms of its platforms.
“Recent events have led me to think about whether we should change our approach to research and analysis on social issues,” Zuckerberg wrote in a September 15, 2021 email to senior executives, including Sheryl Sandberg, then COO, and Nick Clegg, head of global affairs. The day before, the Journal had published an article based on documents obtained from a whistleblower, later revealed to be Frances Haugen, which showed that the company’s own research found that “Thirty-two percent of teenage girls said that when they felt bad about their body, Instagram made them feel worse.” The subject line of Zuckerberg’s email read: “Research and analysis on social issues – privileged and confidential.”
The 2021 email was unsealed Thursday, having previously been collected by New Mexico Attorney General Raúl Torrez as part of a case alleging that Meta deceptively positioned its products as safe for teenagers despite being aware of harmful design choices that, the state claims, addicted children and allowed child predators to thrive. In the complaint, the AG’s office alleged that disclosing the harms Meta identified on its platforms “would have corrected the misleading and deceptive nature of its public statements proclaiming its platforms to be ‘safe.’” Meta spokesperson Andy Stone told The Verge in a statement that the company “is proud of our ongoing commitment to conducting transparent, industry-leading research. As we have for years, we continue to use this information to make meaningful improvements, like introducing teen accounts with built-in protections and providing parents with tools to manage their teens’ experiences.”
The email is just one example of the kind of internal conversation expected to surface throughout the trial, as well as in a series of cases making similar allegations in California. Opening statements in the New Mexico case are expected to begin next week.
In the email, Zuckerberg writes that Meta’s peers appear to have avoided public criticism on social issues by conducting far less proactive research into harms on their platforms. “Apple, for example, doesn’t seem to be studying any of this,” he writes. “From what I understand, they don’t review or moderate content and don’t even have a reporting flow in iMessage. They’ve taken the approach that what happens on the platform is the users’ own responsibility, and because Apple hasn’t taken that responsibility upon itself, they haven’t created a team or a plethora of studies examining the tradeoffs in their approach. It’s worked surprisingly well for them.”
“When Apple tried to do something about CSAM, they were heavily criticized for it”
While Apple seemed to escape criticism, Zuckerberg wrote, Meta “faced more criticism” as it reported more child sexual abuse material (CSAM), which “creates the impression that there is more of such behavior on our platforms.” On the other hand, he noted, “when Apple tried to do something about CSAM, they were heavily criticized for it, which might encourage them to double down on their initial approach.” Zuckerberg may have been referring to Apple’s announcement earlier that year of new features aimed at protecting children, including scanning users’ iCloud photos for CSAM. Privacy advocates feared the move would create a giant backdoor for surveillance of user accounts, and Apple later walked the proposal back. Apple did not immediately respond to a request for comment on the email.
Apple and Meta have long sparred, publicly and privately, over their differing approaches to policy issues such as privacy and age verification. But Zuckerberg made similar observations about Meta’s other peers. “YouTube, Twitter and Snap take a similar approach, to lesser degrees,” he wrote. “YouTube seems to be intentionally burying its head in the sand to stay below the radar and not be the center of attention. Twitter and Snap may simply not have the resources to do this kind of research.” Over the years, many platforms have publicly shared research and safety initiatives, including YouTube’s Youth and Family Advisory Board, made up of independent experts charged with guiding teen well-being on the platform, and Snap’s Digital Well-Being Index, launched in 2022.
“I think we should be commended for the work we do to study, understand and improve social issues on our platforms”
Zuckerberg seemed to believe that the public response to the company’s internal research was unfair. “I think we should be commended for the work we do to study, understand and improve social issues on our platforms,” he wrote. “Unfortunately, the media is more likely to use the research or recommendations produced to say that we are not doing everything we can, rather than that we take these issues more seriously than anyone else in our industry by studying them and looking for solutions, not all of which are reasonable to implement because everything has trade-offs.”
In response to the email, at least some senior executives endorsed continuing some level of research on social issues, even despite the perceived risks of leaks. “Leaks suck and will continue to happen unless we find a way to stamp them out,” wrote Javier Olivan, then vice president of central products. “Given that, is it still worth trying to understand these issues? I think it’s the responsible thing to do / I’d like us to continue to try to understand how we can make our products better for everyone, but perhaps we should limit the surface area to areas where we see at least some clear degree of correlation between the use of our products / the specific issue.” David Ginsberg, then vice president of product, choice and competition, said that “after having struggled a lot with this myself in recent days,” he largely agreed with Olivan. “I think internal work is important to deliver a good product and user experience – regardless of any goals related to societal issues.”
A few days later, Guy Rosen, the vice president responsible for integrity work, shared several potential options for restructuring the company’s organization around internal and external research, including the pros and cons of each. Rosen wrote that this was only a “preliminary/discretionary exercise” to understand the “spectrum of options.” These ranged from centralizing the teams that research highly sensitive topics, in an effort to better control access to documents, to the more extreme option of disbanding those teams and outsourcing the work when needed. Ultimately, executives recommended the less extreme option of centralizing the research teams, planning to announce it shortly after Instagram head Adam Mosseri’s upcoming congressional testimony. Mosseri, who had recently been added to the thread, pushed back: “Announcing this after my testimony is worse than before, and we talked [about] this. It will leak and it will look like I was hiding something.” Meta ended up announcing the changes before Mosseri’s testimony and said it would continue to study sensitive topics like teen well-being.
In the initial email, Zuckerberg lamented that leaks of internal documents make this kind of work more difficult. “This may partly explain why the rest of the industry has taken a different approach to these issues,” he wrote.


