Meta Allegedly Concealed ‘Causal’ Evidence of Social Media’s Harm, Court Documents Reveal

Meta has reportedly halted internal research into the mental health impacts of Facebook after uncovering evidence suggesting its products may harm users’ mental well-being. This revelation comes from unredacted documents in a lawsuit filed by U.S. school districts against Meta and other social media platforms.
In a 2020 initiative dubbed “Project Mercury,” Meta collaborated with Nielsen to assess the effects of temporarily deactivating Facebook. The findings were surprising: users who took a week-long break from the platform reported decreased feelings of depression, anxiety, loneliness, and social comparison. Internal documents indicate that Meta was disappointed by these results.
Instead of publishing the findings or pursuing further research, Meta terminated the project, claiming that the negative results were tainted by the prevailing media narrative surrounding the company. However, one staff member maintained internally that the research conclusions were valid, warning that burying the findings was reminiscent of the tobacco industry’s history of concealing evidence of harm.
Despite its own research linking its products to adverse mental health outcomes, Meta allegedly informed Congress that it could not quantify the potential harm its products posed to teenage girls. In a statement, Meta spokesman Andy Stone defended the decision to halt the study, citing methodological flaws and emphasizing the company’s commitment to improving product safety.
PLAINTIFFS ALLEGE PRODUCT RISKS WERE HIDDEN
The claims of Meta suppressing evidence of social media-related harms are part of a broader lawsuit filed by the law firm Motley Rice, which represents school districts across the nation. The plaintiffs argue that Meta, Google, TikTok, and Snapchat have intentionally concealed the recognized risks associated with their products from users, parents, and educators.
Other allegations against these platforms include encouraging children under 13 to use their services, failing to adequately address child sexual abuse content, and attempting to increase social media usage among teenagers during school hours. Furthermore, the plaintiffs assert that these companies sought to financially support child-focused organizations to publicly defend the safety of their products.
For instance, TikTok reportedly sponsored the National PTA and boasted internally about its influence over the organization, suggesting that the PTA would comply with its requests in the future.
While the allegations against Meta are detailed, those against other social media platforms are less so. The internal documents referenced by the plaintiffs allege several concerning practices:
- Meta allegedly designed its youth safety features to be ineffective and infrequently used, while blocking tests of features that could hinder growth.
- Accounts purportedly had to be flagged 17 times for sex trafficking before being removed, a threshold described internally as exceedingly high.
- Meta recognized that optimizing for teen engagement led to exposure to harmful content but proceeded regardless.
- Internal efforts to prevent child predators from contacting minors were reportedly stalled for years due to growth concerns.
- In a 2021 text, Mark Zuckerberg indicated that child safety was not his primary focus, prioritizing other areas like building the metaverse instead.
Stone rejected these allegations, asserting that Meta’s safety measures for teens are effective and that the company promptly removes accounts flagged for sex trafficking. He said the lawsuit misrepresents Meta’s efforts to build safety features for teens and parents, describing that work as “broadly effective.”
The underlying Meta documents cited in the lawsuit remain confidential, and the company has filed a motion to strike them from the record, arguing that the plaintiffs’ request to unseal is overly broad. A hearing regarding this matter is scheduled for January 26 in Northern California District Court.