Congress’s Well-Intentioned Efforts to Micromanage Children’s Online Safety Will End Poorly

Congress must seek balance in prioritizing children’s safety, civil liberties, and parental rights.

Smart government requires lawmakers to balance competing goods: liberty and order, privacy and security, fiscal responsibility and necessary spending, among many others. Two such goods are child welfare and parental prerogatives. Government, at all levels in America, has traditionally allowed parents wide latitude, interfering only in cases of glaring and severe abuse.

When regulating the digital world, many policy-makers and pundits have determined that America’s legal and societal norms ought not apply. This is true for issues of free speech, economics, privacy, and, now, parental rights. This new coalition seeks to impose on families its own conception of responsible online behavior. As the late Supreme Court justice Antonin Scalia wrote regarding a similar effort to regulate video games in Brown v. Entertainment Merchants Association (2011), while this agenda “may indeed be in support of what some parents . . . actually want, its entire effect is only in support of what the State thinks parents ought to want.”

Hyperregulating children’s online lives, as with most government intrusions into matters best left to individual families and the market broadly, will inevitably produce a host of unintended consequences. Nonetheless, the Senate Commerce Committee recently advanced the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). Although proponents of this legislation wish to protect children from very real online harms, their proposals are ill suited to that end. Instead, they would generate significant constitutional violations, privacy risks, and needless regulatory burdens.

KOSA defines “covered platform” as an online service “that is used, or is reasonably likely to be used, by a minor.” This vague definition encompasses nearly every website that exists. The bill would saddle covered platforms with the responsibility to “prevent and mitigate” a host of ills, including mental-health disorders, “addiction-like behaviors,” “predatory, unfair, or deceptive marketing practices,” and more. This duty of care would extend to any user whom a platform “knows or reasonably should know is a minor.”

Society surely has an interest in protecting children from these harms. Mitigating such wide and nebulous categories, however, would require platforms to suppress legally protected speech. The effects that users perceive from speech vary greatly, according to each individual’s personality, social circle, and medical history. For example, common fitness-related content, which most teens would find benign or inspirational, could exacerbate another child’s preexisting eating disorder. Likewise, the range of speech that theoretically could contribute to depression or anxiety has no bounds. KOSA’s breadth and vagueness would incentivize platforms to over-moderate online speech to preempt litigation and retribution from overzealous regulators.

This duty of care would arm anti-free-speech crusaders at the Federal Trade Commission (FTC) and those in the offices of state attorneys general. Some observers have even lately suggested that online platforms ought to bear liability for gun violence, terrorist activity, and children’s mental-health issues.

Following the May 2022 mass shooting in Buffalo, N.Y., the Empire State’s attorney general, Letitia James, proposed foisting new liability on platforms. Under KOSA, James could target any platform that, in her view, had insufficiently suppressed online speech that had — again, in her view — contributed to the proliferation of white supremacy. Legislation such as KOSA is entirely subjective and at the whim of bureaucrats. The First Amendment, however, forbids officials from suppressing protected speech, a category with extraordinarily wide parameters.

Both KOSA and COPPA 2.0 would also promote mandatory age verification. Online platforms would likely require users to verify their ages, whether to exclude underage users outright, to establish definitively which users qualify for KOSA’s special treatment, or simply to avoid liability altogether. Tech entrepreneurs would have little incentive to attempt navigating the bill’s vague reasonable-knowledge standard, since any wrong guess would invite legal scrutiny.

Although KOSA’s text says that it would not require age verification, the bill resembles a robber who, with a gun to his victim’s head, says, “You’re free to choose your next move.” Meanwhile, COPPA 2.0 would provide to many websites an explicit carve-out if they institute age verification.

Age verification requires extensive data disclosures — of either government-issued documentation or biometric data — from the user. Proliferating such sensitive information across the internet invites hacks and other security breaches. Even governments and the largest corporations often fall victim to cybercriminals.

Moreover, state-enforced age verification largely eliminates the individual’s right to speak anonymously, which the Supreme Court recognized in McIntyre v. Ohio Elections Commission (1995). Private companies may require prospective users to provide any information as a condition of access to their online platforms. Courts, however, will look skeptically on statutes that functionally, if not explicitly, mandate age verification.

Both bills would encumber vast swaths of the internet with excessive regulatory burdens. For example, COPPA 2.0, which would revise a 1998 statute of the same name, seeks to extend its predecessor’s protections to teens (as well as younger children) and would increase compliance costs. Perhaps worse, COPPA 2.0 nixes the existing “actual knowledge” compliance standard — under which websites not targeted to children must actually know of the presence of child users to be covered by COPPA’s provisions. Like KOSA, it instead would regulate any website that “is used or reasonably likely to be used by children or teens.” One early estimate pegged 1998 COPPA’s annual compliance cost for many websites at $115,000 to $290,000, while a later figure stood at $60,000 to $100,000.

It is doubtful that bureaucrats would implement the provisions of KOSA and COPPA 2.0 with prudence, temperance, and sensitivity to the digital world’s intricacies and societal benefits. The bills would vest extensive discretion and implementation authority in the FTC. They would empower increasingly bellicose state attorneys general to launch civil suits. Today’s FTC, led by the vigorously tech-hostile Lina Khan, has defined itself through invasive regulation and enforcement.

Too many policy-makers view the internet as a mere accessory to “real life,” one whose users may be stripped of basic economic and civil liberties. They seemingly argue that in the digital world, neither constitutional constraints nor knowledge problems apply. These misapprehensions become increasingly untenable as “real life” increasingly migrates online.

When regulating the digital world, Congress must seek balance in prioritizing children’s safety, civil liberties, and parental rights. Parents must do more to protect their children’s safety, and the law should support, not preempt, their efforts. That approach demands more time, effort, and consideration, but it will yield far better results than the ill-conceived KOSA and COPPA 2.0.