The Federal Trade Commission proposed sweeping changes Wednesday to the key federal rule protecting children's online privacy, in one of the U.S. government's most significant attempts to strengthen consumer privacy in more than a decade.
The changes are intended to fortify the rules underlying the Children's Online Privacy Protection Act of 1998, a law that restricts the online tracking of children by services such as social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the measures would “shift the burden” of online safety from parents to apps and other digital services, while restricting how platforms may use and monetize children’s data.
The proposed changes would require certain online services to turn off targeted advertising to children under 13 by default. They would also ban online services from using personal information, like a child's cell phone number, to induce children to stay on their platforms longer. That means online services would no longer be able to use personal data to bombard young children with push notifications.
The proposed updates would also tighten security requirements for online services that collect data from children and limit how long those services could retain that information. And they would curb the collection of student data by learning apps and other educational technology providers: schools could consent to the collection of children's personal data only for educational purposes, not commercial ones.
“Children need to be able to play and learn online without being endlessly tracked by companies seeking to hoard and monetize their personal data,” FTC Chair Lina M. Khan said in a statement Wednesday. She added: “By requiring companies to better protect children's data, our proposal imposes positive obligations on service providers and prohibits them from outsourcing their responsibilities to parents.”
COPPA is the core federal law protecting children online in the United States, although members of Congress have since tried to pass more comprehensive online safety laws for children and teenagers.
Under COPPA, online services that are aimed at children, or that know children are on their platforms, must obtain a parent's permission before collecting, using or disclosing personal information, such as first and last names, addresses and phone numbers, from a child under 13.
To comply with the law, popular apps like Instagram and TikTok have terms of service that prohibit children under 13 from creating accounts. Social media and video game apps typically ask new users to provide their date of birth.
Still, regulators have filed numerous complaints against major technology companies, accusing them of failing to establish effective age-restriction systems; showing children targeted advertising based on their online behavior without parental permission; allowing strangers to contact children online; or retaining children's data even after parents requested its deletion. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the maker of Fortnite; and Musical.ly, the social app now known as TikTok, have all paid multimillion-dollar fines to settle allegations that they violated the law.
Separately, a coalition of 33 state attorneys general filed a joint federal lawsuit in October against Meta, the parent company of Facebook and Instagram, alleging that the company violated the children's privacy law. The states particularly criticized Meta's age-verification system, saying the company had allowed millions of underage users to create accounts without parental consent. Meta said it had spent a decade making online experiences safe and age-appropriate for teenagers and that the states' complaint “misrepresents” its work.
The FTC proposed stronger protections for children's privacy amid rising public concern about the mental health and physical safety risks that popular online services may pose to young people. Parents, pediatricians and children's groups warn that social media recommendation systems routinely show young girls inappropriate content promoting self-harm, eating disorders and plastic surgery. And some school officials worry that social media platforms distract students from their work in class.
States have passed more than a dozen laws this year restricting minors' access to social media or pornography sites. Industry groups have successfully sued to temporarily block several of these laws.
The FTC began reviewing children's privacy rules in 2019 and received more than 175,000 comments from technology and advertising trade groups, video content developers, consumer advocacy groups and members of Congress. The resulting proposal runs to more than 150 pages.
The proposed changes include narrowing an exemption that allows online services to collect persistent identifiers from children without parental consent for certain internal operations, such as product improvement, personalization or fraud prevention.
The proposed changes would also ban online operators from using such tracking identifiers to maximize the time children spend on their platforms. That means online services could not use techniques such as sending cell phone notifications “to entice the child to engage with the website or service without verifiable parental consent,” the proposal says.
It is not yet clear how online services would comply with the changes. The public has 60 days to comment on the proposals, after which the commission will vote.
The initial reactions from industry associations were mixed.
The Software and Information Industry Association, whose members include Amazon, Apple, Google and Meta, said it was “grateful” for the FTC's efforts to consider outside input and noted that the agency's proposal cited the group's recommendations.
“We are interested in participating in the next phase of the effort and hope the FTC will take a similarly thoughtful approach,” Paul Lekas, the group’s head of global public policy, said in an email.
In contrast, NetChoice, whose members include TikTok, Snap, Amazon, Google and Meta, said the agency's proposed changes went too far by setting defaults that parents may not want. The group has sued several states to block new laws that would restrict minors' access to online services.
“With this new rule, the FTC is overriding parents’ wishes,” Carl Szabo, the group’s general counsel, said in a statement. The rule, he said, “will make it even more difficult for websites to provide children with the necessary services approved by their parents.”