Impact of Social Media on Children’s Mental Health: A Closer Look at Meta’s Controversy

In recent years, concerns about the impact of social media on children’s mental health have been mounting. One company under particular scrutiny is Meta, the parent company of Facebook and Instagram.

Allegations have surfaced that Meta intentionally collected personal information from children under the age of 13 without parental consent, leading to a federal lawsuit and demands for accountability.

Meta’s Alleged Violation of Privacy Laws

According to a recently revealed court document, Meta is accused of knowingly refusing to shut down accounts belonging to children under the age of 13 while collecting their personal information without parental consent. Attorneys general from 33 US states allege that Meta received more than a million reports of under-13 Instagram users between early 2019 and mid-2023, yet the lawsuit claims the company deactivated only a fraction of those accounts, underscoring what the complaint describes as a disregard for privacy laws.

The federal lawsuit seeks court orders prohibiting Meta from practices the attorneys general say violate the law. Civil penalties could run to hundreds of millions of dollars, given Meta’s purportedly large base of teenage and child users; most of the states involved are seeking fines of $1,000 to $50,000 per violation.
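To give a rough sense of how per-violation fines scale, the sketch below multiplies a violation count by the $1,000–$50,000 range cited above. The violation count and the `penalty_range` helper are assumptions for illustration only, not figures or terms from the complaint.

```python
# Hypothetical illustration only: the violation count used below is an
# assumption, not a finding from the lawsuit.
PER_VIOLATION_FINES = (1_000, 50_000)  # per-violation fine range most states are reportedly seeking, in USD

def penalty_range(violations: int) -> tuple[int, int]:
    """Return the (low, high) total penalty for a given number of violations."""
    low_fine, high_fine = PER_VIOLATION_FINES
    return violations * low_fine, violations * high_fine

# Example: even a modest 200,000 violations already spans hundreds of
# millions to billions of dollars.
low, high = penalty_range(200_000)
print(f"${low:,} to ${high:,}")  # $200,000,000 to $10,000,000,000
```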

The Privacy Dilemma and COPPA Compliance

The allegations against Meta include a violation of the Children’s Online Privacy Protection Act (COPPA), which prohibits companies from collecting personal information from children under the age of 13 without parental consent. Meta is accused of failing to comply with COPPA in relation to both Facebook and Instagram.

The court document states that Meta’s own records show millions of children under 13 on Instagram, and that hundreds of thousands of teenage users spend more than five hours a day on the platform.

In response to these allegations, Meta has said that verifying the age of individuals online is a complex challenge for the industry. The company argues that many people, especially those under 13, do not possess the identification documents needed for age verification.

Meta supports federal legislation that would require app stores to obtain parental approval whenever children under the age of 16 download applications. According to Meta, this approach would spare parents and teenagers from providing sensitive information, such as identification documents, to individual apps.

Harmful Content and Mental Well-being

The lawsuit also alleges that Meta was aware that its algorithm could lead children to harmful content, thereby negatively impacting their well-being. Internal communications within the company, as cited in the court document, reveal employees expressing concern about Instagram content triggering negative emotions among preteens and affecting their mental well-being. The document also references a Meta study conducted in July 2021, which concluded that Instagram’s algorithm may amplify negative social comparison and content that makes users feel worse about their bodies or appearance.

Furthermore, internal emails from February 2021, as mentioned in the lawsuit, allegedly show Meta employees acknowledging that social comparison is associated with increased time spent on Meta’s social media platforms. They discussed how this phenomenon is valuable for Instagram’s business model while simultaneously causing harm to adolescents.

In a March 2021 internal investigation focused on content related to eating disorders, the Meta team followed users whose account names referenced starvation, thinness, and eating disorders. The Instagram algorithm then generated a list of recommended accounts that included ones related to anorexia, according to the court document.

However, Antigone Davis, Meta’s Global Head of Safety, stated in a September 2021 congressional hearing that Meta does not “direct people to content that promotes eating disorders. In fact, that violates our policies, and we remove that content when we become aware of it. In fact, we use AI to find content like that and remove it.”

Meta’s Knowledge of Content Issues

The lawsuit asserts that high-ranking Instagram officials were aware of the problematic content on the platform. Adam Mosseri, the head of Instagram, allegedly wrote in an internal email that “social comparison is to Instagram [what] election interference is to Facebook.” However, the document does not specify when this email was sent.

Despite internal investigations highlighting concerns about social comparison, the lawsuit claims that Meta refused to change its algorithm. According to internal communications cited in the lawsuit, an employee noted that content triggering negative appearance comparison is among the most engaging content on the Explore page, putting it in conflict with other top-line measures the teams are accountable for.

Meanwhile, the lawsuit alleges that Meta’s external communications denied or concealed the fact that its recommendation algorithms promote highly negative appearance comparison content to young users.

Internal documents cited in the lawsuit also suggest that Meta’s recommendation algorithms trigger intermittent dopamine releases in young users, potentially leading to addictive cycles of consumption on its platforms.

The Impact on Children’s Mental Health

Meta’s alleged intentional design of platforms with manipulative features that foster addiction and diminish self-esteem has prompted strong criticism. Letitia James, the Attorney General of New York, stated in a recent press release that “Meta has profited from the pain of children by intentionally designing its platforms with manipulative features that turn children into addicts while eroding their self-esteem.”

New York is one of the states involved in the federal lawsuit. James asserts that social media companies, including Meta, have contributed to a national youth mental health crisis and must be held accountable.

Several other states have filed separate lawsuits against Meta echoing the claims made in the multi-state federal lawsuit. Florida, for instance, has filed its own federal lawsuit alleging that Meta deceived users about the potential health risks associated with its products.

The wave of lawsuits comes as a result of a bipartisan and multi-state investigation that dates back to 2021. The investigation gained momentum after Frances Haugen, a former Facebook employee turned whistleblower, released tens of thousands of internal company documents suggesting that Facebook knew its products could have negative effects on young people’s mental health.

Meta’s handling of the lawsuit and the outcome of the legal proceedings will be closely watched, as they could have far-reaching implications for the regulation of social media platforms and the protection of children’s mental well-being.

Conclusion

The controversy surrounding Meta’s alleged collection of personal information from children without parental consent raises serious concerns about the impact of social media on children’s mental health. The lawsuit highlights the need for stricter regulations and stronger safeguards to protect young users from harmful content and manipulative features.

As the legal battle unfolds, it remains to be seen how Meta and other social media platforms will address these issues and prioritize the well-being of their users, especially the most vulnerable ones.