Americans everywhere want to protect children from harmful social media and online content. Of course, the overwhelming push to make the internet a safer place for children should come as no surprise. The harmful effects of social media on children’s mental health are well-documented, and dozens of state attorneys general have sued top social media platforms like Facebook, Instagram, and TikTok for intentionally designing dangerous, addictive algorithms. Additionally, reports have revealed that popular dating apps are failing to keep underage users off of their platforms, and even video game companies have come under fire for engaging in manipulative and privacy-abusive behavior targeting children. All of this has led parents, lawmakers, and tech industry experts to agree that something has to be done to protect children online.
However, that raises the question: how do you keep kids safe on the internet? This question has sparked debate that has given a group of bad actors, including top social media platforms, dating apps, video game makers known for targeting minors, and even pornography companies, an opening to try to convince lawmakers to make others responsible for their dangerous actions. Many of these companies – unwilling to tweak their dangerous algorithms or change their business practices to protect children – have placed their support behind age verification legislation. Their plan would make other companies, like internet providers, device manufacturers, and app store providers responsible for verifying users’ ages to prevent minors from accessing their products. At first glance, this idea seems like an easy solution to protect children online. In truth, age verification mandates would allow bad actors to evade accountability for endangering their young users while they continue to profit off of children and exploit sensitive user information.
Take the idea of forcing app stores to verify users' ages, for example – a proposal backed by Meta (which owns Facebook and Instagram), TikTok, dating app conglomerate Match Group, and video game maker Roblox. On its face, this idea makes little sense. It would be akin to forcing a shopping mall to ID every shopper just to stop minors from entering a single store that sells alcohol. Nor would the proposal actually prevent children from accessing harmful content: instead of downloading Instagram from an app store, underage users could simply type www.instagram.com into any web browser.
Pornography companies have tried a similar trick. They’ve lobbied in favor of legislation that would require manufacturers of devices such as smartphones and tablets to block minors from reaching pornographic websites. A coalition representing the pornography industry recently argued for this solution in front of the Supreme Court – an approach that, as Justice Samuel Alito pointed out, would simply “put the costs” of protecting children onto Google and Apple. And even then, this porn industry-backed proposal wouldn’t prevent children from accessing pornographic websites on any computer, laptop, gaming console, or other smart device.
There are, of course, other ways to protect children online. In response to increasing consumer demand for safety measures, the industry has developed several tools designed to limit or restrict children’s screen time, content viewing, and engagement features. Officials have also taken steps to educate children and parents about the dangers of social media and harmful online content, and to raise awareness about parental controls and other technologies designed to protect children on the internet. Additionally, some states have passed laws putting the onus on content providers – including pornography websites – to verify users’ ages to keep children safe. And while the laws compelling content providers to use age verification technology are currently being debated at the Supreme Court, some lawmakers have had success in pressuring tech companies, including Facebook, to voluntarily begin using the technology.
Lawmakers are right to work toward solutions that will protect children online. However, letting bad actors off the hook won’t keep children safe on the internet. Instead, policymakers can and should focus on the companies responsible for harming children and on raising awareness about the tools that already exist to protect children from those companies’ dangerous products.