Don’t Let App Stores Off the Hook

App icons on an iPhone in Bath, England, March 13, 2024. (Anna Barclay/Getty Images)

Without any accountability, these virtual stores are marketing a myriad of goods that are unsuitable for children.


On July 9, for the first time, federal regulators banned a digital platform from serving users under 18. As part of this landmark settlement with the Federal Trade Commission, NGL, an anonymous-messaging app that pitched itself as a “safe space for teens,” will now be required to prevent users from accessing the app if they indicate they are under 18. NGL also agreed to stop marketing its apps to kids and teens. As the Washington Post reported, “the settlement marks a major milestone in the federal government’s efforts to tackle concerns that tech platforms are exposing children to noxious material and profiting from it.”

The impetus for the FTC’s investigation was complaints that NGL had become a hotbed for cyberbullying, despite its claim to have a sophisticated AI system to prevent such behavior on its platforms. The settlement is an important step forward in preventing apps from harming America’s kids. But NGL is only one of many apps that are harmful and dangerous for children yet remain easily available for download and are actively promoted to children in the app stores.

Unlike practically every other company in America that markets products to children and adolescents, tech companies in Silicon Valley have until now been entirely exempt from child-safety regulations.

The tide is starting to turn. In recent years, lawmakers have turned their attention to keeping children safe online. Some federal bills have focused on changing social-media companies’ business model, like the bipartisan Kids Online Safety Act sponsored by Senators Blumenthal (D., Conn.) and Blackburn (R., Tenn.), which should be passed by Congress and signed by the president without delay. Meanwhile, some states have sought to ensure that parents have meaningful control over the decision to allow their children to get on social media, by requiring platforms to age-verify their users and receive parental consent before a minor opens an account. These are all important solutions.

However, one significant player that facilitates children’s access to harmful digital content has largely flown under the radar: app stores.

Without any accountability, these virtual stores are marketing a myriad of goods that are unsuitable for children, even though many parents have enabled app age-rating restrictions on their children’s devices.

One mom of five shared that whenever she opens the Apple App Store for her ten-year-old son, it directs him to “must-have” apps that include Tinder, Hinge, Bumble, and TikTok. One of these facilitates hookups between adults, two are dating apps, and the fourth is a social-media app for ages 13 and up. Yet she has set up her child’s device to allow him to access only apps rated appropriate for nine- to twelve-year-olds. So how is this happening? “The App Store knows his exact age, so I’m not sure why it is allowed to advertise 17+ apps to him with words like ‘must-have,’” she said. She also noted that her 13-year-old’s App Store recently featured a prominent Bumble advertisement that was “teaching about how to maximize a dating profile and use ‘SuperSwipe’ if he’s interested in a potential dating partner.”

Frighteningly, the marketing is working. According to one report, a quarter of preteen boys — ages nine to twelve — say they’ve been on online dating apps.

These are the sad results of a system that has cut parents completely out of the picture. While Apple and Google devices offer a parental-control setting that gives parents the ability to approve or deny any app download on the device, the companies are not required to enable it by default. Moreover, the setting is buried and difficult to find. Even more troubling, a recent report in the Wall Street Journal found that the setting is often ineffective. Parents are frustrated with a loophole in Apple’s “Ask to Buy” setting that allows children to redownload an app without permission after a parent has deleted it. The “Ask to Buy” approval happens only once, so there is no way to take an app away from a child once he has it.

Attention up until now has been focused on social-media platforms. And while laws requiring age verification and parental consent for social-media platforms, such as Instagram and Snapchat, are an important first step in protecting parental authority over children’s social-media use, the app stores themselves — where kids are predominantly accessing these platforms — should also be on the hook.

Why should Apple and Google be allowed to offer social-media platforms that are supposed to be age-restricted to those under 13, and now in certain states will be age-restricted for minors under 18 with parental consent, without checking users’ ID to confirm that they are of age? It does not make sense to exempt app stores from the customary care that we require of brick-and-mortar stores, which are entrusted with the duty of age-verifying their customers for the purchase of age-restricted items such as tobacco, alcohol, pornographic magazines, spray paint, pharmaceuticals, or even things like glue and cough syrup that are too risky, or downright inappropriate, for kids to buy.

We have long deemed it immoral for businesses to profit from the sale of commodities that may cause harm to, or addict, children — even if the stores themselves have no hand in their production. With the mental-health crisis among adolescents, and its connection to social-media platforms, now fully established, app-store exceptionalism is no longer tenable.

In app stores, children routinely enter into service contracts with some of history’s most powerful entities when they download apps — contracts they are legally incapable of consenting to. We don’t allow minors to enter into complex agreements to obtain loans or credit cards. So why do app stores continue to overlook the age of their users and let kids accept complex terms of service without parental consent?

For all these reasons, we (the authors of this op-ed) released a comprehensive policy paper on needed device- and app-store-level solutions in fall of 2023, which among other measures called for state and federal lawmakers to require app stores to age-verify users seeking to download social-media platforms onto their devices and require minors to receive parental consent before doing so.

App stores are already well suited technologically to conduct age verification. In setting up a new smartphone, the user is required to establish an Apple or Google ID and enter their birth date. Age verification could easily be tacked on to this setup process for any smartphone or tablet.

Children identified as under 18 should be required to be linked to a verified supervisory account. This is currently the practice for device users twelve and under. In the case of Apple, its Family Sharing unit provides the means to confirm that the individual setting up a child’s account is actually a parent or guardian. Linking the device to a supervisory account would allow parents to vouch for their child’s age without requiring additional ID for the underage user. The App Store could then communicate the minor user’s age to apps upon download, providing a simple, anonymous, encrypted signal indicating whether the user is old enough to use their product. If a child tried to download OnlyFans, an adult app, for example, the device would communicate that the user is not eligible to download it, and access would be blocked.

Of course, app stores should not be solely responsible for conducting age verification. If a user goes straight to a social-media platform from a browser and doesn’t log in with an Apple or Google ID, then that platform should still ensure age verification and parental consent. Both app stores and social-media platforms should be required to age-verify as relevant.

For an additional layer of protection beyond age verification, parental consent should also be mandatory for all app downloads and in-app purchases by users under 18. This way, even if a child is old enough to get on social media, parents can still decide whether they want their child on specific apps. Requiring parental consent ensures that parents are always aware of, and can guide, their children’s online activities.

These solutions are simple and straightforward and address a key gap in the problem of harms to kids online. Lawmakers must not let app stores off the hook when it comes to protecting our kids.

Clare Morell is the director of the Technology and Human Flourishing Project at the Ethics and Public Policy Center. Her forthcoming book, The Tech Exit: A Manifesto for Freeing Our Kids, will be published by Penguin Random House. Michael Toscano is executive director of the Institute for Family Studies.
