
Meta collected children’s data from Instagram accounts, unsealed court document claims


BERLIN, GERMANY – OCTOBER 06: In this photo illustration, Instagram’s logo is illuminated on a smartphone near a finger on October 06, 2023 in Berlin, Germany. (Photo illustration by Thomas Trutschel/Photothek via Getty Images)

CNN –

Since at least 2019, Meta has knowingly refused to close most accounts of children under 13 while collecting their personal information without their parents’ consent, a recently unsealed court document from an ongoing federal lawsuit against the social media giant claims.

Attorneys general from 33 states accuse Meta of receiving more than a million reports about Instagram users under the age of 13 from parents, friends and online community members between early 2019 and mid-2023. However, “Meta deactivated only a fraction of these accounts,” the complaint states.

The federal complaint seeks court orders prohibiting Meta from engaging in practices that the state attorneys general say violate the law. Civil penalties could amount to hundreds of millions of dollars because Meta allegedly hosts millions of teen and child users; most of the states' laws provide for fines ranging from $1,000 to $50,000 per violation.

According to the 54-count lawsuit, Meta violated a range of state consumer protection laws as well as the Children’s Online Privacy Protection Rule (COPPA), which prohibits companies from collecting the personal information of children under 13 without parental consent. Meta allegedly failed to comply with COPPA on both Facebook and Instagram, even though “Meta’s own records show that Instagram’s audience composition includes millions of children under 13” and that “hundreds of thousands of teen users spend more than five hours a day on Instagram,” the court document says.

A Meta product designer wrote in an internal email that “the young ones are the best ones,” adding that “you want to bring people to your service young and early,” the lawsuit says.

“Instagram’s Terms of Service prohibit users under the age of 13 (or older in certain countries) and we have taken steps to remove these accounts when we identify them. However, verifying people’s age online is a complex challenge for the industry,” Meta said in a statement to CNN on Sunday. “For example, many people – especially those under the age of 13 – do not have ID. That’s why Meta supports federal legislation requiring app stores to obtain parental consent when their teens under 16 download apps. With this approach, parents and teens don’t have to share sensitive information like government IDs with hundreds of individual apps to verify their age.”

The unsealed complaint also alleges that Meta knew its algorithm could steer children toward harmful content, thereby harming their well-being. According to internal company communications cited in the document, employees worried that “content on IG is triggering negative emotions among tweens and impacting their mental well-being (along with) our ranking algorithms taking [them] into negative spirals and feedback loops that are hard to exit from.”

For example, in July 2021, Meta researchers conducted a study that concluded Instagram’s algorithm promotes negative social comparison and “content with a tendency to make users feel worse about their body or appearance,” according to the complaint. In internal emails from February 2021 cited in the lawsuit, Meta employees allegedly acknowledged that social comparison was “associated with greater time spent” on Meta’s social media platforms and discussed how this phenomenon “is valuable to Instagram’s business model while at the same time causing harm to teen girls.”

As part of an internal investigation in March 2021 into eating disorder content, Meta’s team followed users whose account names referenced hunger, thinness and eating disorders. Instagram’s algorithm then began generating a list of recommended accounts “that included accounts related to anorexia,” the lawsuit says.

However, Antigone Davis, Meta’s global head of safety, testified before Congress in September 2021 that Meta does not “direct people towards content that promotes eating disorders. That actually violates our policies, and we remove that content when we become aware of it. We actually use AI to find content like that and remove it.”

“We want teens to have safe, age-appropriate online experiences, and we have over 30 tools to support them and their parents,” Meta said in a statement to CNN. “We have spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work, using selective quotes and cherry-picked documents.”

Instagram’s senior executives also knew that problematic content was a critical issue for the platform, the lawsuit says. Adam Mosseri, the head of Instagram, reportedly wrote in an internal email that “social comparison is to Instagram [what] election interference is to Facebook.” The lawsuit does not specify when the email was sent.

CNN reached out to Meta about Davis and Mosseri’s comments and did not immediately receive a response.

Although the company’s own research confirmed concerns about social comparison on its platforms, the lawsuit alleges that Meta refused to change its algorithm. One employee noted in internal communications cited in the lawsuit that content prompting negative appearance comparisons is “some of the most engaging content (on the Explore page), so this idea is actively in conflict with many other teams’ top-line measures.” Meanwhile, Meta’s external communications “denied or obscured the fact that its recommendation algorithms promote high-negative appearance comparison content to young users,” the lawsuit says.

Meta was also aware that its recommendation algorithms “trigger intermittent dopamine release in young users,” which can lead to addictive cycles of consumption on its platforms, according to internal documents cited in the lawsuit.

“Meta has profited from the suffering of children by intentionally equipping its platforms with manipulative features that make children addicted to their platforms while reducing their self-esteem,” New York Attorney General Letitia James said in a statement last month. New York is one of the states involved in the federal lawsuit. “Social media companies, including Meta, have contributed to a national youth mental health crisis and must be held accountable,” James said.

Eight other attorneys general sued Meta in various state courts last month, making claims similar to those in the sweeping multistate federal lawsuit. Florida sued Meta in a separate federal lawsuit, claiming the company misled users about the potential health risks of its products.

The wave of lawsuits stems from a bipartisan, multistate investigation launched in 2021 after Facebook whistleblower Frances Haugen released tens of thousands of internal company documents that, she said, showed the company knew its products could have negative effects on young people’s mental health.

CNN’s Brian Fung contributed to this report