Facebook and Instagram used “aggressive tactics” targeting children, lawsuit claims

Meta has knowingly employed “aggressive tactics” aimed at getting children addicted to social media “in the name of growth,” according to a lawsuit against Meta alleging that children have suffered at the hands of Facebook and Instagram.

A Meta software engineer claimed that “it’s no secret” how Facebook and Instagram used meticulous algorithms to encourage repeated and compulsive use among minors, regardless of whether the content was harmful – and that the company was “pretty uncompromising” about it.

The redacted revelations were disclosed in a lawsuit against Meta and have since been unsealed.

Despite CEO Mark Zuckerberg publicly stating that claims his company prioritizes profit over safety and wellbeing are simply “not true,” the filings allege that Meta was aware of child sexual exploitation on both platforms and claim that “Meta’s engagement-based algorithm exploited extreme content to drive more engagement,” the document reads.

The document says that 20 percent of nine- to 13-year-old users on Facebook and Instagram have had a sexual experience with an adult on the sites.

That is despite Meta’s “zero-tolerance policy prohibiting abuse like child exploitation.”

An unredacted version of a lawsuit against Meta, filed by parents alleging children have suffered at the hands of its platforms, has been obtained

Meta has been contacted for comment but has not addressed the specific allegations.

A spokesman for the court-appointed lead plaintiffs’ counsel said: “These never-before-seen documents show that social media companies treated the youth mental health crisis as a public relations matter rather than a pressing societal issue caused by their products.

“That includes burying internal research documenting these harms, blocking security measures because they reduce ‘engagement,’ and defunding teams focused on protecting adolescent mental health.”

The lawsuit, filed Feb. 14 in California, cites figures showing that more than a third of 13- to 17-year-olds use one of the defendants’ apps “almost constantly” and admit it is “too much,” according to the parents who filed the suit.

The complaints, which were later consolidated into several class-action lawsuits, alleged that Meta’s social media platforms were designed to be dangerously addictive and to drive children and teens to consume content that increases the risk of sleep disorders, eating disorders, depression and suicide.

The case also states that teenagers and children are more vulnerable to the negative effects of social media.

The unredacted version was published on March 10th.

It said that Thorn, an international anti-trafficking organization, released a report in 2021 detailing issues of sexual exploitation on Facebook and Instagram and “provided these findings to Meta.”

Thorn’s report found that “neither blocking nor reporting [offenders] protects minors from continued harassment,” and 55 percent of participants in the report who had blocked or reported someone said they were recontacted online.

And younger users are particularly vulnerable to predators.

The unsealed complaint also alleges that 80 percent of “violating adult/minor connections” on Facebook stemmed from the platform’s “People You May Know” feature.

The filings claim the company is aware of child sexual exploitation on Facebook and Instagram, alleging that “Meta’s engagement-based algorithm exploited extreme content to encourage more engagement”

“An internal study conducted in or around June 2020 concluded that 500,000 child Instagram accounts receive ‘IIC’ – which stands for ‘inappropriate interactions with children’ – every day,” states the unredacted complaint on pages 135 and 136 of the document.

“Yet at the time, ‘Child Safety [was] expressly designated as a non-goal . . . So if we do something here, cool. But if we can’t do anything, that’s okay too.’”

Meta has since improved its ability to reduce inappropriate adult-teen interactions.

The company has developed technology that allows it to find accounts that have exhibited potentially suspicious behavior and block those accounts from interacting with young people’s accounts.

Meta also claims these suspicious accounts are not shown young people’s accounts when scrolling through the list of people who have liked a post, or when viewing an account’s followers or following list.

However, these changes were made after 2020.

The complaint also states that Meta had considered making teen users’ profiles “private by default” as early as July 2020, but decided against the move after weighing “security, privacy, and policy gains” against “growth impact.”

On page 135 of the lawsuit, a redacted portion alleges that Meta knew Apple was so upset about adults being able to contact children on Instagram “that it threatened to remove us from the App Store,” and that the company had no timeline for “when we will stop adults from messaging minors on IG Direct.”

“This was after Meta received reports that a 12-year-old minor solicited on its platform was ‘[the] daughter of [an] Apple Security Exec,’” the statement continued.

Meta did, however, move to make teen users’ accounts private by default in November 2022.

A Meta spokesman said: “The claim that we have defunded work to support people’s wellbeing is false.”

The unredacted version of the complaint reads that instead of “taking [this] seriously” and “launching new tools” to protect children, Meta did the opposite.

“In late 2019, Meta’s ‘mental health team stopped doing things,’ ‘it was defunded,’ and ‘completely stopped.’ And, as previously mentioned, Meta let safety tools it knew were broken go unrepaired.”

A Meta spokesman said that because this is one of the company’s top priorities, “we have actually increased funding, as evidenced by the more than 30 tools we offer to support teens and families. Today, hundreds of people across the company are working to build these features.”

Other “shocking” information in the unsealed complaint reports the existence of Meta’s “rabbit hole project.”

A Meta spokesperson said the rabbit hole project does not exist.

“Someone feeling bad sees content that makes them feel bad, they engage with it, and then their IG gets flooded w[ith] it,” the unredacted version says.

“Meta recognizes that Instagram users who are at risk of suicide or self-harm are more likely to ‘encounter more harmful suicide and self-harm content (through exploration, related, follower suggestions).’”

The document references Molly Russell, a London teenager who died by suicide in 2017.

“Meta had conducted internal research that warned there was a risk of ‘Molly Russell-like incidents’ because algorithmic product features were ‘[l]eading users to disturbing content,’” the document states on page 84.

“Our recommendation algorithms will push you down a rabbit hole of outrageous content.”

“They were clear about possible solutions: targeted changes to the algorithm lead to ‘significant reductions in exposure’ to problematic content.

“But they resisted making changes, for the explicit, profit-driven reason that such tweaks ‘clearly came at a cost of engagement.’”

The lawsuit alleges that Meta’s public stance on the importance of child safety was never serious and was just “all drama.”

“Our currently displayed data is incorrect. . . . We share bad metrics externally. . . we vouch for those numbers,” said one employee, according to the document.