KOSA Law Project
It seeks to impose a federal framework requiring online platforms to act proactively to protect children under 17 from online risks, such as harmful content, digital addiction, violence, and grooming.
It establishes a "duty of care" that requires preventing and mitigating risks to children's physical and mental health. It also requires transparency about algorithms, effective parental controls, and processes for disabling personalization, advertising, and tracking features.
Kids Online Safety Act (KOSA) – U.S. Senate Bill S.1409 (118th Congress). Federal bill in progress (passed in Senate committee, pending full vote in Senate and House).
Key elements: age-appropriate digital design; a duty of care for the protection of minors; parental control tools; restrictions on harmful or addictive content; child risk impact assessments; algorithmic transparency; and oversight by the federal regulator (FTC).
It is a bill focused on the protection of rights in digital environments.
The Federal Trade Commission (FTC) as the enforcement authority, empowered to impose civil penalties.
It introduces several firsts for the regulatory ecosystem:
It imposes a duty of care on platforms to prevent harmful content (self-harm, eating disorders, sexual exploitation, etc.).
It requires child risk impact assessments for all services accessible to minors; accessible, configurable parental controls, including the ability to disable algorithmic features; clear information about how algorithms, personalized content, and data collection operate; and transparency standards for access by independent researchers.
Other States
Nebraska's Age-Appropriate Online Design Code Act (LB504)
Signed in 2025, it will take effect on January 1, 2026, with fines beginning in July 2026; it requires a chronological feed, privacy by default, reduced notifications, and a ban on dark patterns.
Maryland Kids Code
It will go into effect in October 2025 and essentially establishes that companies or entities offering an online product "that is reasonably likely to be accessed by children" must conduct a "Data Protection Impact Assessment" on or before April 1, 2026.
The Minnesota Age-Appropriate Design Code bill, introduced on February 27, 2023, is being considered by the State Legislature.
The “Hawaii Age-Appropriate Design Code” bill, introduced in January of this year, is still being drafted. Its goals are to “promote privacy protection for minors” and to ensure that online products, features, or services likely to be accessed by minors are designed to protect them. It would also grant powers to the State Attorney General and create a data protection task force.
A bill called the "Age-Appropriate Design Code" has been before the State General Assembly since February 2024. It would require all companies operating in the state and processing minors' data to do so in a manner consistent with the child's best interests.
In 2025, the State of Vermont passed a law similar to California's.
In South Carolina, the "South Carolina Age-Appropriate Design Code Act (HB 4842)" was introduced in January of this year and has seen no further action since. Similar in structure to the bills described above, it would impose obligations on entities whose online services minors may access, aimed at protecting both minors' personal data and minors themselves from content that could cause them harm, including material, physical, or financial harm.
Additionally, California's Assembly Bill 56 (2025), the Social Media Warning Law, takes a preventative public-health approach modeled on the warning labels used for tobacco and alcohol.
California
Following the publication of the United Kingdom's “Age-Appropriate Design Code,” California passed a similar law, the “California Age-Appropriate Design Code Act” (CAADCA), in 2022.
Signed on September 15, 2022 (scheduled to enter into force on July 1, 2024; currently subject to judicial intervention).
Key elements: age-appropriate digital design; minimum age and digital consent; protection of minors' personal data; regulation of algorithms and profiling practices; Data Protection Impact Assessments (DPIAs); transparency and parental control; and a prohibition of "dark patterns" in services for children.
It is a law designed to protect the rights of children and adolescents in digital environments.
The law establishes the California Children's Data Protection Working Group, composed of experts in health, privacy, technology, and children's rights. Penalties: up to $2,500 per negligent violation and up to $7,500 per intentional violation, per affected child.
The law establishes new obligations for companies that provide online services, products, or features "likely to be accessed by minors" (those under 18 years of age). It requires default privacy settings, clear policy wording, data protection impact assessments, restrictions on profiling and geolocation, and prohibits digital manipulation ("dark patterns"). Its objective is to protect children's rights against invasive digital practices.
Furthermore, it recognizes developmental stages by age and establishes that the interests of the child prevail over commercial interests. It also promotes ethical standards in digital design and could serve as a regulatory benchmark for other jurisdictions.
United States
The lack of progress in the U.S. Congress on digital protection for children and adolescents has led various states to advance their own laws. California led the way in 2022 with the Age-Appropriate Design Code Act, followed by Maryland, Vermont, and, most recently, Nebraska, whose rules take effect in 2026. These laws, inspired by the British model, aim to guarantee privacy by default, limit notifications, require chronological feeds, and prohibit digital manipulation practices. Although they face litigation and challenges from the technology industry, they represent decentralized, experimental progress that turns the states into laboratories for digital regulation.