
41 States Sue Meta, Claiming Instagram and Facebook Are Addictive and Harmful to Children – The Washington Post


Forty-one states and the District of Columbia are suing Meta, claiming the tech giant is harming children by building addictive features into Instagram and Facebook – legal action that represents the most significant effort yet by state law enforcement to address the impact of social media on children’s mental health.

The flood of lawsuits is the culmination of an extensive 2021 investigation into claims that Meta contributes to mental health problems among young people. While the scope of the legal claims varies, they paint a picture of a company that lures children to its platforms using harmful and manipulative tactics.

A 233-page federal complaint alleges the company engaged in a “scheme to exploit young users for profit” by misleading them about safety features and the prevalence of harmful content, harvesting their data and violating federal laws protecting children’s privacy. State officials allege that the company knowingly made changes to keep children on the site to the detriment of their well-being, violating consumer protection laws.

The allegations represent a rare bipartisan agreement and underscore growing concern among government leaders that social networks are harming younger users by emphasizing engagement over safety.

“At a time when our nation is not seeing the level of bipartisan collaboration on problem-solving that we need, you can see it here in this group of attorneys general,” Colorado Attorney General Phil Weiser (D), who is co-leading the federal lawsuit, said during a joint press conference on Tuesday.

Thirty-three states, including Colorado and California, are filing a joint lawsuit in federal court in the Northern District of California, while the attorneys general of D.C. and eight states are filing separate lawsuits in federal, state or local courts.

“Our bipartisan investigation has reached a solemn conclusion: Meta harmed our children and youth and cultivated addiction to boost corporate profits,” California Attorney General Rob Bonta (D), one of the officials who led the effort, said in a statement.

Meta spokeswoman Liza Crenshaw said in a statement that the company was “disappointed that state attorneys general have chosen this path rather than productively working with companies across the industry to create clear, age-appropriate standards for the many apps that teens use.”

Weiser said state officials have not discussed whether the cases will be consolidated in court, as happened with recent lawsuits by school districts and parents, but said the lawsuits would likely be “managed in tandem.” The attorneys general expressed optimism that the multifaceted action, whether through settlement or regulatory pressure, could force the company to change its behavior toward children.

Possible consequences could include civil penalties, changes in business practices and refunds, they said.

The impact of Meta’s products on young people came into the national spotlight after a 2021 Wall Street Journal report detailed internal research, leaked by Facebook whistleblower Frances Haugen, showing that Instagram exacerbated body image issues for some teenage girls.

The revelations sparked a political battle in Washington and state capitals across the country, with lawmakers launching new efforts to restrict children’s use of social media and regulators taking renewed scrutiny of Meta’s security practices.

But efforts to pass new privacy and safety regulations for children online have stalled at the federal level, largely leaving states to push forward with aggressive new measures.

States like Arkansas and Utah have passed laws banning children under 13 from accessing social media and requiring those under 18 to get parental consent to access the sites. California, meanwhile, has passed rules requiring tech companies to screen their products for risks and build safety and privacy protections into their tools. In lieu of federal legislation, parents and school districts have also taken up the issue, filing lawsuits accusing Meta, TikTok and other platforms of worsening the nation’s youth mental health crisis and exacerbating anxiety, depression and body image issues among students.

The mounting legal cases come at a time when research on the connection between social media use and mental health problems remains unclear. Earlier this year, U.S. Surgeon General Vivek H. Murthy released an advisory arguing that excessive social media use as a child could lead to a higher risk of poor mental health, including sleep problems and body dissatisfaction. However, a report from the American Psychological Association concluded that social media use “is neither inherently beneficial nor harmful to young people” and that more research should be done on the topic.

In launching their investigation in 2021, state law enforcement agencies said the company had “failed to protect young people on its platforms” and accused it of “exploiting children for profit.”

The tech giant disputed the allegations at the time. Meta spokesman Andy Stone said they were “false and demonstrate a deep misunderstanding of the facts.”

Since then, Meta has announced numerous policy and product changes designed to make its apps safer for children, including providing parents with tools to track their children’s activities, incorporating alerts encouraging teens to take a break from social media and introducing stricter default privacy settings for young users.

The changes have done little to appease critics at the state and federal levels who say the company has shirked its responsibility to protect its most vulnerable young users.

For years, Meta has worried that young people are spending less time on Facebook as teenagers flock to competitors like TikTok and Snapchat. To attract younger users, the company has tried to emulate TikTok with its short-video service Reels.

But the push to lure young people has drawn the attention of regulators, who fear apps like Facebook and Instagram are harming young people’s mental health, luring them into addictive products at a young age and endangering their privacy. Meta argues that research on the effects of social media on young people is mixed and that the company is taking precautions to protect users.

After Haugen’s revelations became public, Meta announced that the company would pause its plans to develop an Instagram app specifically designed for children under 13. Advocacy groups, state attorneys general and lawmakers had called on the company to halt the project out of concern for young people’s mental health.

The company said at the time that it still believed in the concept of a kid-focused Instagram app because kids simply lied about their age to join Instagram.

The Biden administration is separately reviewing Meta’s record on child safety. The Federal Trade Commission is proposing a plan to ban the company from monetizing data collected from young users. Meta’s Stone called it a “political ploy” and said the company would “vigorously fight” the move.

As efforts to curb the impact of social media on children gain momentum among lawmakers and law enforcement agencies, they are increasingly running into major hurdles in the courts.

Federal judges recently blocked newly passed child safety laws in California and Arkansas, saying they could violate First Amendment protections, and at times cast doubt on whether the laws would actually keep children safer.

State and federal law enforcement agencies have been scrutinizing technology companies’ handling of children’s private personal data for years, at times imposing hefty fines on social media companies. The FTC and New York State agreed to a $170 million settlement with Google-owned YouTube in 2019 over allegations that the company illegally collected data from users under the age of 13.

In recent years, authorities have focused on how tech companies could be worsening anxiety, depression and other mental illnesses in children and teenagers.

Indiana, Arkansas and Utah have filed separate lawsuits accusing TikTok of harming children through addictive features, exposing them to inappropriate content or misleading consumers about its safety precautions. Arkansas filed a similar lawsuit accusing Meta of violating state rules against deceptive trade practices.

Tennessee Attorney General Jonathan Skrmetti (R), who co-led the multistate investigation and filed one of the lawsuits against Meta in state court, said at Tuesday’s news conference that the federal litigation in California could serve as a “vehicle for settlement negotiations” across the industry. Colorado’s Weiser said that while he was “always open” to striking settlements, “that wasn’t possible here.”

Attorneys general described their investigations into other technology companies as ongoing.

“This is not just about Meta, but as one of the largest players, and as a company where there is clear evidence of misleading the public and making intentional decisions that harm children, I think it is appropriate that we start with this particular lawsuit,” Skrmetti said.