U.S. Senate passes the Kids Online Safety Act
In the US, the Kids Online Safety Act (KOSA) is a significant piece of legislation that was passed by the Senate on July 30, 2024. This blog explores the key requirements of KOSA and its implications for companies.
What is the purpose of KOSA?
KOSA, the Kids Online Safety Act, was introduced in February 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) to safeguard children from online dangers. The act requires platforms to minimize addictive features, give young users the option to opt out of algorithm-driven content, and protect children’s personal data.
For the past three years, President Joe Biden has advocated for enhanced online safety for young people, first addressing the issue in his 2022 State of the Union speech and calling for new privacy protections for children online. He reiterated the need for such legislation in his 2024 State of the Union address, emphasizing the importance of bipartisan efforts to protect children online.
In July 2024, the Senate passed KOSA along with the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). This marks the first significant legislative action to protect children online since the original Children’s Online Privacy Protection Act was enacted in 1998.
Who does KOSA apply to?
The bill establishes a duty of care to protect minors (users under the age of 17) on certain online platforms, such as social media, online video games, virtual reality worlds, online messaging, and video streaming sites. However, it exempts entities like email and internet service providers and educational institutions. Covered platforms must take "reasonable" measures to limit harm, such as reducing addictive or harmful features, setting the highest privacy settings by default, and enhancing parental controls.
The Federal Trade Commission (FTC) will enforce the bill's requirements. Companies that do not comply, or that fail to prevent or mitigate online harms, can face enforcement action and litigation. State attorneys general can enforce parts of the law related to safeguards for minors, disclosure, and transparency.
Has KOSA been signed into law?
KOSA was passed by the Senate on July 30, 2024, with a 91-3 vote and now moves to the House.
Civil liberties advocates and other opponents of the bill have expressed concerns about its impact on privacy, free speech, and access to essential information, including LGBTQ+ topics. However, supporters argue that the bill targets the design of algorithms recommending content, not specific content types.
In response to free speech and information access concerns, a February 2024 update to the bill defined "design feature" more clearly. It now focuses on elements that encourage minors to spend more time on a platform, such as infinite scrolling, notifications, and rewards for staying online. This shift aims to regulate platform design rather than content.
What are the current requirements of the bill?
The central aim of KOSA is to establish a duty of care to protect users under 17 from specific online harms. Platforms must mitigate harmful content related to self-harm, anxiety, depression, and eating disorders, activate the highest privacy and safety settings by default for users under 17, and limit design features like infinite scrolling and online rewards.
Platforms are required to implement the most restrictive privacy and safety settings by default. KOSA mandates safeguards such as preventing unknown adults from contacting children or viewing their personal information, restricting the sharing of minors' geolocation data, and allowing children to opt out of personalized recommendations. Parents or guardians will have access to children's privacy and account settings. These measures are similar to the UK's Age Appropriate Design Code and aim to restrict the collection of personal data.
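To make the idea of "most restrictive by default" more concrete, here is a minimal, hypothetical sketch of how a platform might apply account defaults for users identified as under 17. The AccountSettings structure, field names, and values are illustrative assumptions, not language from the bill.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-account settings a platform might expose."""
    profile_visibility: str          # "public", "friends", or "private"
    allow_messages_from: str         # "everyone", "friends", or "nobody"
    share_geolocation: bool
    personalized_recommendations: bool
    parental_controls_enabled: bool

def default_settings(is_minor: bool) -> AccountSettings:
    """Apply the most restrictive defaults when a user is identified as under 17."""
    if is_minor:
        return AccountSettings(
            profile_visibility="private",          # not visible to unknown adults
            allow_messages_from="friends",         # block contact from unknown adults
            share_geolocation=False,               # no sharing of geolocation data
            personalized_recommendations=False,    # opted out by default; user may opt in
            parental_controls_enabled=True,        # parents/guardians can manage settings
        )
    return AccountSettings("public", "everyone", True, True, False)
```

In a sketch like this, the restrictive branch is the default path, and any loosening of settings would require a deliberate action by the minor or their parent or guardian.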
The bill's current wording also requires platforms to:
- Disclose specified information, including details on personalized recommendation systems and advertising to minors.
- Allow guardians, parents, schools, and minors to report certain harms.
- Refrain from advertising age-restricted products or services, like tobacco and gambling, to minors.
- Provide dedicated pages for reporting harmful content.
Platforms with over 10 million monthly active users in the US must also annually report on foreseeable risks of harm to minors and the mitigation steps taken.
Are there any age verification requirements of KOSA?
The age requirements in KOSA have been updated since its introduction. The bill does not mandate age verification by platforms but allows them to rely on 'objective circumstances.' This means platforms can use existing empirical data to reasonably estimate a user's age. If a platform can reasonably determine that a user is a minor, it must proactively protect that user.
Different platforms may interpret this requirement differently, but the FTC is expected to provide guidance. To avoid misjudging a user's age, some companies might still implement new measures and technologies to verify users' ages, especially for platforms with 18+ content.
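Since the bill leaves the method open, a platform might combine signals it already holds to decide whether to treat an account as a minor's. The sketch below is a hypothetical illustration of that kind of check; the function name, inputs, and fallback logic are assumptions, not requirements of KOSA or a prediction of FTC guidance.

```python
from datetime import date
from typing import Optional

def should_treat_as_minor(
    declared_birth_date: Optional[date],
    estimated_age: Optional[int],   # e.g. an age estimate derived from existing usage data
    today: Optional[date] = None,
) -> bool:
    """Decide whether to apply under-17 protections, erring on the side of protection."""
    today = today or date.today()
    if declared_birth_date is not None:
        # Age in whole years from the self-declared birth date.
        age = today.year - declared_birth_date.year - (
            (today.month, today.day) < (declared_birth_date.month, declared_birth_date.day)
        )
        if age < 17:
            return True
    # Fall back to other empirical signals the platform already holds.
    if estimated_age is not None and estimated_age < 17:
        return True
    return False
```

A platform taking this approach would then feed the result into minor-specific defaults like those sketched earlier, rather than asking every user for identity documents.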
The bill also requires platforms to obtain 'verifiable parental consent' for minors to create social media accounts, as well as to implement safeguards and parental control settings.
Even with these updated requirements, age checks can be performed online using minimal data and without compromising privacy. Users do not need to upload identity documents or share extensive personal information, and parents can grant consent using FTC-approved methods.
What’s next for KOSA?
With KOSA now passed by the Senate, it must go through a few more steps before it becomes law. The bill will next move to the House, where it will be introduced and reviewed by relevant committees.
As other countries implement their own online safety regulations, like the UK’s Online Safety Act and Europe’s Digital Services Act, the progress of KOSA will be closely monitored.