What to know about the Kids Online Safety Act that just passed the Senate


FILE - Students work on a laptop computer at Stonewall Elementary in Lexington, Ky., Feb. 6, 2023. A bill aiming to protect kids from the harms of social media, gaming sites and other online platforms appears to have enough bipartisan support to pass, though whether it actually will remains uncertain. (AP Photo/Timothy D. Easley, File)

The last time Congress passed a law to protect children on the internet was in 1998 — before Facebook, before the iPhone and long before today’s oldest teenagers were born. Now, a bill aiming to protect kids from the harms of social media, gaming sites and other online platforms has passed in the Senate with overwhelming support. Its fate is less certain in the House, but President Biden has indicated he would sign it if it passes.

Proponents of the Kids Online Safety Act include parents’ groups and children’s advocacy organizations as well as companies like Microsoft, X and Snap. They say it is a necessary first step in regulating tech companies, requiring them to protect children from dangerous online content and take responsibility for the harm their platforms can cause.


Opponents, however, fear KOSA would violate the First Amendment and harm vulnerable kids who wouldn’t be able to access information on LGBTQ issues or reproductive rights — although the bill has been revised to address many of those concerns, and major LGBTQ groups have dropped their opposition to the legislation.

Here is what to know about KOSA and the likelihood of it going into effect.

What does KOSA do?

KOSA would create a “duty of care” — a legal term that requires companies to take reasonable steps to prevent harm — for online platforms that minors are likely to use.

They would have to “prevent and mitigate” harms to children, including bullying and violence, the promotion of suicide, eating disorders, substance abuse, sexual exploitation and advertisements for illegal products such as narcotics, tobacco or alcohol.

Social media platforms would also have to provide minors with options to protect their information, disable addictive product features, and opt out of personalized algorithmic recommendations. They would also be required to limit other users from communicating with children and to limit features that “increase, sustain, or extend the use” of the platform, such as autoplay for videos or platform rewards. In general, online platforms would have to default to the safest settings possible for accounts they believe belong to minors.

“So many of the harms that young people experience online and on social media are the result of deliberate design choices that these companies make,” said Josh Golin, executive director of Fairplay, a nonprofit working to insulate children from commercialization, marketing and harms from Big Tech.

How would it be enforced?

An earlier version of the bill empowered state attorneys general to enforce KOSA’s “duty of care” provision, but that authority was removed after concerns from LGBTQ groups and others who worried attorneys general could use it to censor information about LGBTQ or reproductive issues. In the updated version, state attorneys general can still enforce other provisions but not the “duty of care” standard.

Broader enforcement would fall to the Federal Trade Commission, which would have oversight over what types of content are “harmful” to children.

Who supports it?

KOSA is supported by a broad range of nonprofits, tech accountability groups, parent groups and medical organizations, including the American Academy of Pediatrics, the American Federation of Teachers, Common Sense Media, Fairplay, The Real Facebook Oversight Board and the NAACP. Some prominent tech companies, including Microsoft, X and Snap, have also signed on.

ParentsSOS, a group of some 20 parents who have lost children to harm caused by social media, has also been campaigning for the bill’s passage. One of those parents is Julianna Arnold, whose 17-year-old daughter died in 2022 after purchasing tainted drugs through Instagram.

“We should not bear the entire responsibility of keeping our children safe online,” she said. “Every other industry has been regulated. And I’m sure you’ve heard this all the time. From toys to movies to music to cars to everything. We have regulations in place to keep our children safe. And this, this is a product that they have created and distributed and yet over all these years, since the ’90s, there hasn’t been any legislation regulating the industry.”

KOSA was introduced in 2022 by Senators Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn.

Who opposes it?

The ACLU, the Electronic Frontier Foundation and other groups supporting free speech are concerned it would violate the First Amendment. Even with the revisions that stripped state attorneys general from enforcing its duty of care provision, EFF calls it a “dangerous and unconstitutional censorship bill that would empower state officials to target services and online content they do not like.”

Kate Ruane, director of the Free Expression Project at the nonprofit Center for Democracy and Technology, said she remains concerned that the bill’s duty of care provision can be “misused by politically motivated actors to target marginalized communities like the LGBTQ population and just politically divisive information generally,” to try to suppress information because someone believes it is harmful to kids’ mental health.

She added that while those concerns remain, revisions to the bill have made progress in reducing them.

The bigger issue, she added, is that platforms don’t want to get sued for showing minors content that could be “politically divisive.” To avoid that risk, they could suppress such topics entirely, whether abortion, transgender healthcare or even the wars in Gaza and Ukraine.

NetChoice, a tech industry group whose members include Meta, Google and X, also opposes KOSA and has won four injunctions against similar state laws.

Last year, for instance, after a NetChoice challenge, a federal judge halted implementation of a California law that would require businesses to report to the state on any product or service they offer on the internet that is likely to be accessed by minors and to provide plans to reduce any harms children might suffer. The law would also ban businesses from collecting certain types of data from young users.

“The State has no right to enforce obligations that would essentially press private companies into service as government censors,” U.S. District Judge Beth Labson Freeman wrote in her ruling last September.

While Congress has “good intentions” wanting to protect kids from online harms, KOSA “fails to meet basic constitutional principles and fails parents because it won’t make a single child safer online or address their concerns,” said Carl Szabo, NetChoice’s vice president and general counsel. “Taking away parents’ and guardians’ authority and choice, while forcing them to give up their and their children’s personal information to access and engage in free speech, is both dangerous and a violation of their rights.”

What happens next?

House Speaker Mike Johnson has been noncommittal about whether he will bring the bill up in the House, but he has said that he is committed to trying to find consensus. If the House does pass the bill before the congressional session ends in January, President Biden has strongly indicated he will sign it.

In a statement Tuesday encouraging the House to pass the legislation, Biden said that “we need action by Congress to protect our kids online and hold Big Tech accountable for the national experiment they are running on our children for profit.”

Associated Press writer Mary Clare Jalonick contributed to this story from Washington, D.C.