How Congress can protect kids online


In September 2021, the Facebook whistleblower Frances Haugen released a trove of internal Facebook documents. These documents showed, among other things, that Facebook knew that Instagram is toxic for teen girls. One slide summarizing internal company research said, “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”

These revelations turbocharged policymakers at the state level to enact laws aimed at protecting kids online. A year after her revelations, California adopted the California Age-Appropriate Design Code Act, a kids' online safety law modeled on the Age-Appropriate Design Code adopted in the United Kingdom (UK) in 2020. The new California law requires online services that are “likely to be accessed” by children under 18 to prioritize their safety and to take a variety of measures to identify and mitigate systemic risks to their mental health and wellbeing.

The Kids Online Safety Act (KOSA), a federal bill sponsored by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), takes a similar risk-based, system design approach to protecting kids from online harms. KOSA narrowly missed inclusion in the comprehensive budget bill that passed at the end of the last Congress, and has been reintroduced this year.

As KOSA neared passage last year, a coalition of free speech and civil rights groups argued against it. They contended that the bill established a burdensome and vague “duty of care” to prevent harms to minors. The group also charged that the bill would require overly broad content filtering to limit minors’ access to certain online content. Moreover, online services would face substantial pressure to over-moderate, including from state Attorneys General seeking to make political points about what kind of information is appropriate for young people. Finally, the bill would cut off a vital avenue of access to information for vulnerable youth.

These state and federal measures seem focused on allowing kids to enjoy social media and other online experiences, but with design constraints to make sure they do so in a safe manner. For instance, the UK code on which the California law is modeled explicitly says it aims at kids’ safety online “not by seeking to protect children from the digital world, but by protecting them within it.”

The state of Utah, however, has taken a different direction in reacting to online dangers. It recently adopted a parental consent law that requires social media companies to obtain parental consent before allowing children under 18 to access their services. A companion Utah law would ban addictive social media features and designs for minors. These laws seem aimed at restricting kids’ access to online material, as if the legislature had made an implicit cost-benefit assessment that the risks of online harm justified measures to make it harder for children to avail themselves of online tools.

How the U.S. has generally addressed children’s privacy

In the U.S., online protections for minors are largely embodied in privacy law such as the Children’s Online Privacy Protection Act (COPPA), passed in 1998 in the wake of the first national scare about online harms to children. It requires websites that are directed toward children under 13 years of age and websites that have actual knowledge that they are collecting personal information online from a child under 13 years of age to obtain verifiable parental consent before collecting personal data from this age group.

However, this law left a gap in privacy protection for children 13 to 18. Article 1 of the 1989 U.N. Convention on the Rights of the Child applies children’s rights to “every human being below the age of eighteen years” or the age at which a person attains majority. And, for years, Senator Ed Markey has been trying to amend COPPA to expand the age group covered. In late 2022, he nearly succeeded: his Children and Teens’ Online Privacy Protection Act (also called COPPA 2.0) was reported out of the Senate Commerce Committee but, like KOSA, failed at the last minute to make the cut for inclusion in a must-pass budget bill.

COPPA was not the only legislation passed in the early internet era aimed at protecting kids. Policymakers’ initial concern in the early days of the internet was pornography. In the context of a major reform of the nation’s telecommunications laws in 1996, Congress adopted the Communications Decency Act. This Act is famous, or infamous, for its Section 230 grant of immunity to online actors for the material posted by their users.

But other provisions sought to protect minors from harmful online material. In Reno v. ACLU, a landmark First Amendment decision, the Supreme Court struck down the indecency portions of the statute, holding that the measures were not narrowly tailored since other effective means were available to block indecent material from children and that the age verification defenses proposed were not workable in practice.

Unsurprisingly then, industry and civil liberties groups have raised free speech concerns in connection with today’s measures to protect kids online, including KOSA. After the California law passed the legislature without a single negative vote, the industry trade association NetChoice filed a First Amendment challenge. It argued that the law was overly broad in applying to virtually all websites. It also said the requirement that online companies assess the risks of various online harms to children and create a plan to mitigate these risks before launching a new product or service “will pressure businesses to identify distant or unlikely harms—and to self-censor accordingly.” Further, NetChoice said the law’s age verification requirement is “unrealistic” and will result in “self-censorship,” and the ban on using children’s information in ways that are materially detrimental is plagued by “undefined” terms, “amorphous” concepts and “generalities,” which would lead companies to “self-censor.”

NetChoice has not yet brought a case against the Utah bill. But, in its letter to Utah governor Spencer J. Cox urging him to veto the bill, it argued that the bill was unconstitutional. The trade group said the bill violates the First Amendment by banning anonymous speech and by infringing on adults’ lawful access to constitutional speech. Moreover, it endangers children by requiring them to share their sensitive personally identifiable information, thereby creating new risks of abuse.

Despite these First Amendment concerns, which will be resolved in court in due course, states appear to be rushing to pass laws to protect children, with red states moving toward the parental consent model and blue states looking to design restrictions to make the online world safe for kids. Perhaps these efforts will put pressure on Congress to act, either with a design approach or a parental consent model. In addition to the revised KOSA bill, Congress also has before it the Protecting Kids on Social Media Act, a bipartisan bill that would ban children under 13 from having an account at a social media firm and would require parental consent for kids 13 to 17, and a proposal from Senator Josh Hawley that would ban children under 16 from social media.

Where is the compromise?

A logical compromise might make Congressional action easier. To bring in conservatives, such a compromise could require parental consent; to attract liberals, it could impose design duties. Everyone would get something, and children would be protected even after parents had allowed their kids to go online.

However, this both/and approach might just alienate both sides and produce gridlock. The free speech and civil rights groups that had concerns about KOSA, for instance, would not feel better about a bill that compounded what they viewed as KOSA’s failures with an even more draconian restriction on kids’ access to online services.

My own preference is for a version of the design restrictions approach. It would create a workable and effective framework for managing online risks to children. Given the urgency of protecting kids online and the narrow scope of the design approach, it should withstand First Amendment scrutiny. The danger of stifling kids’ exploration of the online world is real, but it can be managed through proper implementation. The design approach also avoids the overly restrictive steps of banning children’s access or requiring parental control.

Regardless of which approach is taken, however, nothing can be expected to change unless the Congressional legislation empowers a strong regulatory agency to implement and enforce the new requirements. Much of the vagueness in KOSA, for instance, could be remedied by detailed guidelines imposed by regulation. KOSA put the FTC in charge of enforcement, but the Act would be stronger if it authorized the agency to promulgate regulations under the Administrative Procedure Act to carry out and clarify the provisions.

The California law is enforced by the California Attorney General, not the California Privacy Protection Agency, and this limits the state’s capacity to develop implementing regulations. The Utah law is enforced by the state’s Division of Consumer Protection but appears to provide little new regulatory authority, except for a rulemaking to establish means by which companies can satisfy the law’s age verification and parental consent requirements.

A new national law to protect kids no matter what state they live in should be a priority for this Congress and appears to be within reach politically. Crucially, such a law should designate a fully empowered regulator to implement and enforce the new requirements. Congress should seize this opportunity to move forward.

Meta is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and are not influenced by any donation.
