As Preteens Ignore Social-Media Age Limits, Governments Push for Better Checks

Lawmakers in the U.S. and Europe propose age-verification tools, sparking debate over digital rights

Governments across the U.S. and Europe are moving to tighten online age restrictions following evidence that companies aren’t effectively enforcing limits on what children see and do on social media.
State and national lawmakers from Salt Lake City to Paris are looking to make services such as Meta Platforms’ Instagram and ByteDance’s TikTok take steps to verify ages when users sign up. Some legislatures also are requiring companies to collect parental consent for users under a certain age.
The momentum portends a sea change in how children and adults access the internet, and could end the days when users need only click a button or fill in a birth date to get past online age restrictions.
The new rules also are sparking debate between child-protection advocates concerned about social media’s impact on mental health and digital-rights groups arguing that verifying ages creates privacy risks and could discourage access to useful information.
Many of the new rules target pornography websites. But another major driver is evidence of rising preteen use of social media, even though most such platforms say they bar users younger than 13, in part to comply with U.S. federal law.
Up to a third of children ages 8 to 12, and in some surveys more, say they use social-media and video sites including YouTube, TikTok and Instagram, according to polls conducted in recent years in the U.S., U.K., France and Ireland for child-protection advocates and regulators.
Those advocates say social-media use opens children up to content and social pressures that are difficult for adults to handle, with surveys showing significant rates of children being bullied and communicating with strangers online.
Instagram says it requires everyone to be at least 13 years old before they can create an account. YouTube and TikTok also require users to be at least 13 before creating profiles on their main services, though both offer alternatives for children.
On Instagram, The Wall Street Journal found dozens of accounts that self-identified as being younger than 13 and remained active until the Journal flagged them to Meta’s communications team.
“Hey guyz I’m kaira welcome to my world I’m 12,” stated one Instagram account.
Instagram makes it hard to report underage accounts: a user flagging one must fill out a webpage asking for the child’s birth date and full name.
“This is a very high burden for reporting,” said Jennifer King, a researcher at the Stanford Institute for Human-Centered Artificial Intelligence who studies how design choices alter user behaviors.
Meta says it invests in AI to detect underage users and trains its moderators to remove them manually in response to user reports.
“We don’t allow people under 13 to use Instagram, and we have numerous methods to remove underage accounts,” spokesman Andy Stone said, adding that the company is “evaluating new ways to improve reporting to remove underage accounts faster while improving accuracy.”
Social-media companies say they take measures to bar children under 13 in the U.S. and have protections including limited ad tracking and content restrictions for users under 18 years old. But they also express concerns about laws mandating age verification, saying more work is needed to address privacy dangers from existing techniques.
Some companies, such as Snapchat owner Snap, also argue that the makers of operating systems and app stores, primarily Apple and Google, should be responsible for verifying ages. A Snap spokeswoman said the company is working with other companies and regulators on possible solutions.
Evan Greer, deputy director of tech policy organization Fight for the Future, said efforts to age-gate social media would make far more sense if done at the device level. “At the platform level it almost certainly involves invasive data collection,” Greer said.
Apple’s iOS and Google’s Android let parents set devices to not allow apps and content rated above a certain age, but they don’t verify users’ ages.
On TikTok’s main service for teens and adults, accounts flagged by the Journal as apparently run by users under 13 were reviewed within half an hour, and TikTok often restricted them.
A TikTok spokesman said it deactivates accounts from its main service when it detects or is informed of underage users. It lifts the restriction if a user can demonstrate their age.
Laura Alvarez, a 48-year-old mother of four in Texas, said her oldest son found his way onto social media when he was 10 years old. He lied about his age to bypass parental controls and her periodic monitoring and created accounts on services including Instagram and Discord. She said tighter age controls and content restrictions are necessary.
“They have to figure out how to lock this down,” said Ms. Alvarez, who only lets her son, now 13, play offline games and use a school-provided laptop. “Anything that uses an algorithm to keep you looking, it isn’t healthy for adults, let alone children whose minds are still forming.”
Discord requires users to be at least 13 years old, unless local laws require a higher age. The company is looking into technologies to “provide greater confidence around a user’s age” that are reliable and protect privacy, but doesn’t “see any existing solution that meets these requirements,” said Clint Smith, the company’s chief legal officer.
Tech companies have a business incentive to court users who still are forming online habits. Meta in recent years researched how to attract preteens, and conceived products especially for them, hoping they would age into the company’s other platforms over time, the Journal reported in 2021. The company later ditched plans for Instagram Kids.
“The reality is that kids are already online,” Instagram chief Adam Mosseri wrote in September 2021, arguing platforms such as Instagram should make age-appropriate products rather than trying to “verify the age of kids who are too young to have an ID.”
Some parents oppose the new rules, saying age verification creates privacy concerns and that parents, not governments, should decide when and how children go on social media.
“It gives parents a false sense of security where, ‘Hey, it’s illegal here and I don’t have to think about it,’” said Sarah Werle Kimmel, a 44-year-old IT manager and mother of two in Utah.
The minimum age for many social-media services stems from the Children’s Online Privacy Protection Act, or Coppa, a 1998 U.S. law requiring internet platforms to obtain parental permission before collecting personal information on children under 13. That has led many companies to exclude such children—or offer separate, walled-off services.
Many new bills and laws require companies to take active measures to verify ages. That could entail using an official ID or analyzing a user’s face with a webcam to estimate their age.
Utah and Arkansas in recent weeks adopted laws requiring at least some social-media companies to verify user ages and seek parental consent. Four U.S. senators late last month proposed legislation requiring social networks to verify users’ ages, block users under 13 and seek consent for teens under 18. Many other states, including Connecticut and Texas, are considering related bills.
In France, the lower house of parliament in March approved a bill that would require age verification for social-media users and parental consent for anyone under 15. The Senate is now considering the bill.
Yoti, a U.K.-based company that provides facial-age estimation technology, says its tool correctly classifies 98% of children between the ages of 6 and 11 as under 13. The company says it doesn’t store the images it collects and aims by the end of 2023 to analyze them on a user’s device to avoid images being intercepted.
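To illustrate what on-device analysis could look like in practice, here is a minimal sketch in Python. It is not Yoti’s code; the function names, the 13-year threshold and the stubbed-out model are assumptions made for illustration. The idea is that the photo and the raw age estimate stay on the device, and only a pass/fail answer is sent to the service, which is what limits the interception risk Yoti is trying to address.

```python
# Minimal sketch of on-device age gating. Hypothetical names and a stubbed
# model; not Yoti's actual implementation.

def estimate_age_locally(image_bytes: bytes) -> float:
    """Stand-in for an on-device age-estimation model.

    A real system would run a trained neural network on the photo; a fixed
    value is returned here only to keep the sketch runnable.
    """
    return 11.0  # hypothetical model output


def meets_minimum_age(image_bytes: bytes, minimum_age: int = 13) -> bool:
    """Analyze the photo locally and return only a pass/fail result.

    The image and the raw estimate never leave the device; the platform
    receives just this boolean.
    """
    return estimate_age_locally(image_bytes) >= minimum_age


if __name__ == "__main__":
    print(meets_minimum_age(b"fake-image-bytes"))  # False for the stubbed estimate
```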
Still, the risk of data leaks could discourage teenagers and adults from seeking out information, said Robyn Caplan, a researcher at Data & Society Research Institute, a New York-based nonprofit that studies the impact of technology on society.
“That will have a chilling effect on information access by everyone,” she said.
Write to Sam Schechner at Sam.Schechner@wsj.com and Jeff Horwitz at jeff.horwitz@wsj.com