Meta, the parent company of Facebook and Instagram, called on federal lawmakers Wednesday to pass legislation requiring parental approval for app store downloads by teenagers, as the company faces pressure to strengthen protections for children on its social media platforms.
Antigone Davis, Meta’s global head of safety, said in a blog post that the company supports requiring app stores to obtain parental approval for downloads by teens younger than 16, rather than an app-by-app approach.
“Technology companies are developing distinct, age-appropriate experiences for teens, while lawmakers consider new legislation designed to protect their safety and privacy online,” Davis said.
“Legislation is needed so all apps teens use can be held to the same standard,” she continued. “But what’s happening is much more complicated than that.”
Davis argued that proposed legislation would require teens to repeatedly go through processes to verify their age and their parents’ approval for different apps, including providing “potentially sensitive identification information to apps with inconsistent security and privacy practices.”
“Teens move interchangeably between many websites and apps, and social media laws that hold different platforms to different standards in different states will mean teens are inconsistently protected,” Davis added.
Under Meta’s suggested approach, parents would receive a notification when their teen attempts to download an app and could decide whether to approve the download. They would also be able to verify their teen’s age when initially setting up the phone.
“This way parents can oversee and approve their teen’s online activity in one place,” Davis said. “They can ensure their teens are not accessing adult content or apps, or apps they just don’t want their teens to use. And where apps like ours offer age-appropriate features and settings, parents can help ensure their teens use them.”
Davis also argued this “industry-wide” approach would preserve individuals’ privacy because apps would not need to collect potentially sensitive identifying information about teens and their parents.
Meta has repeatedly faced pushback in recent years over its handling of children’s safety on Facebook and Instagram.
A former Meta employee came forward earlier this month, alleging top executives dismissed warnings that teens were facing unwanted sexual advances and widespread bullying on Instagram.
The revelations came a little more than two years after former Facebook employee Frances Haugen shared internal company documents, including reports about Instagram’s impact on teens.
A bipartisan coalition of 33 states also sued Meta in October, alleging the company knowingly designed and deployed features that harmed young users’ mental health. Another eight states, as well as the District of Columbia, also filed lawsuits against Meta in state court.
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.