
Protecting Children Online: Current Laws, Risks, and What's Next

Fox Hill Consulting

Children live online—at school, on phones, in games, and across social platforms. Laws are evolving quickly to keep up. This article offers a grounded overview of the risks, where policy stands today, and where momentum is heading—so leaders can take practical steps now.


1. The Reality of Growing Up Online

Children encounter the internet earlier than any previous generation: chat apps, classroom platforms, games, streaming, and social feeds blur into a single environment. That environment can be empowering and educational. It can also be extractive, manipulative, and unsafe if we fail to design and govern it well.

Schools, parents, platforms, and policymakers all share responsibility for creating safer digital experiences. Online safety spans data, content, privacy, and digital well-being. It is not just about blocking harmful content; it is about minimizing data capture, discouraging manipulative design, and ensuring children have age-appropriate controls and support.

2. The Core Risks Facing Children Online

Privacy risk is foundational. Data collection, behavioral tracking, and profiling can build a lasting record that children never knowingly consented to. When combined with advertising and recommendation systems, the incentives can tilt toward maximizing engagement rather than minimizing harm or data exposure.

Safety risks include grooming, bullying, coercion, and exposure to age-inappropriate content. These issues are amplified by real-time chat, anonymous interactions, and algorithms that push content based on engagement signals instead of suitability. School communities feel this most acutely where social dynamics spill from the feed into hallways and homes.

Integrity risks are rising fast: deepfakes, synthetic media, and AI-generated content complicate trust and attribution. For students, that means a confusing information environment and new angles for deception. For schools and edtech vendors, it raises the bar on content moderation, provenance, and abuse detection.

These risks compound in educational contexts where multiple platforms, vendors, and data flows converge. Policies must balance access and learning benefits with rigorous privacy, safety, and accountability requirements for the tools children use every day.

3. Key U.S. Legal and Regulatory Frameworks

In the U.S., the Children’s Online Privacy Protection Act (COPPA) governs data collection from children under 13. It places obligations on services to obtain verifiable parental consent, provide clear notices, and limit use and sharing of children’s data. Although narrow in scope and age coverage, COPPA remains a baseline many vendors must meet.
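The gist of that consent gate can be sketched in a few lines of Python. This is an illustrative sketch only — the names (`User`, `may_collect_personal_data`) are hypothetical, and real COPPA compliance also involves notice, data minimization, and limits on sharing, not just a boolean check:

```python
from dataclasses import dataclass

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13


@dataclass
class User:
    age: int
    parental_consent_verified: bool = False  # verifiable parental consent on file


def may_collect_personal_data(user: User) -> bool:
    """Gate personal-data collection on age and verified parental consent.

    Hypothetical helper: a child-directed service should default to
    *no* collection until consent is actually verified.
    """
    if user.age >= COPPA_AGE_THRESHOLD:
        return True
    return user.parental_consent_verified
```

The design point is the default: for a user under 13 with no verified consent on file, collection stays off rather than on.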

Schools and educational technology vendors face additional responsibilities around student data—contractual and statutory—covering data minimization, transparency, and safeguards. Many states are also proposing or enacting child online safety laws with varying approaches to age-gating, privacy controls, design requirements, and enforcement. The regulatory pattern is expanding, but uneven.

Current & Pending Child Online Safety Laws

Current Child Safety Laws – United States (Federal)

| Law | Year | What It Regulates | Notes / Impact |
| --- | --- | --- | --- |
| COPPA – Children's Online Privacy Protection Act | 1998 (updates ongoing) | Data collection, storage, and use for children under 13 | Requires verifiable parental consent; limits tracking & targeted ads; applies to child-directed services |
| CIPA – Children's Internet Protection Act | 2000 | Internet filtering & safe access in schools and libraries | Requires blocking harmful content and enforcing acceptable-use policies on institution-managed networks |
| FERPA – Family Educational Rights and Privacy Act | 1974 | Student educational data privacy | Limits sharing of student information; applies to edtech vendors receiving student data from schools |

Current Child Safety Laws – United States (State Level, Selected)

| State / Law | Year | Focus Area | Key Requirements |
| --- | --- | --- | --- |
| Utah – Social Media Regulation Act | 2023 (amended 2024) | Social media access, parental consent, account controls | Parental permission for minors; limits on algorithmic feeds; tools for parents to oversee accounts |
| State "Kids Codes" (e.g., CA/CT/CO/VA) | 2023–2025 | Child data privacy, profiling, behavioral design | Restrict sale of children's data, tighten targeted advertising, and push safer default settings |
| Age-Assurance / Social Media Access Laws (10+ states) | Ongoing | Age verification and parental consent requirements | Require age checks, restrict teen access, or mandate parental approval for new accounts |
| Comprehensive State Privacy Laws (with minor protections) | 2023–2025 | Youth data protection | Include explicit safeguards for minors even when not branded as "child safety" statutes |

Current Child Safety Laws – Worldwide

| Country / Region | Law | Year | Focus Area | Key Provisions |
| --- | --- | --- | --- | --- |
| United Kingdom | Online Safety Act | 2023 | Platform safety duties; child protection | Establishes a duty of care; requires age-assurance and risk mitigation; enforced by Ofcom |
| European Union | Digital Services Act (DSA) | 2022–2023 rollout | Child protection on large online platforms | Requires risk assessments, age-appropriate design, and stronger safeguards for minors |
| Australia | Online Safety Act | 2021 | Harmful content removal; child protection | Expands powers for the eSafety Commissioner; rapid takedowns; under-16 social media restrictions emerging |

Pending & Emerging Child Safety Laws – United States (Federal)

| Bill | Status | What It Would Change |
| --- | --- | --- |
| Kids Online Safety Act (KOSA) | Reintroduced 2025 | Would create a platform "duty of care" to minors, require safer defaults, constrain addictive features, and expand parental tools and risk audits |
| COPPA 2.0 – Children & Teens' Online Privacy Protection Act | Active proposal | Extends COPPA protections up to age 16, tightens data minimization, and restricts targeted advertising to minors |

Pending & Emerging Child Safety Laws – United States (State Trends)

| Category | Examples / States | What's Pending |
| --- | --- | --- |
| Strict Age Verification Bills | Multiple states | Require platforms to verify user age before account creation, often using third-party age-assurance providers |
| Social Media Access Restrictions for Minors | Numerous states | Parental consent requirements, nighttime usage limits, and constraints on algorithmic feeds for teens |
| Kids Code–Style Design Standards | CA-inspired proposals | Safer defaults, limits on dark patterns, profiling restrictions, and mandatory risk assessments |
| Youth Data Privacy Enhancements | Emerging across privacy bills | New limits on collection, sharing, and retention of minors' data in state comprehensive privacy laws |

Pending & Expanding Rules – Worldwide

| Country / Region | Status | What's Coming Next |
| --- | --- | --- |
| United Kingdom – Online Safety Act | Phased implementation | Ofcom issuing detailed codes on age verification, content moderation, and risk management for services likely used by children |
| European Union – Digital Services Act | Active enforcement | Ongoing investigations, additional guidance on protecting minors, and age-verification pilots for access to adult content |
| Global Trend | Emerging | Growing number of countries exploring age-verification mandates, youth data protection rules, and stronger platform accountability |

Figures 1 and 2: State-level momentum for child online safety legislation, 2023–2025. The left chart shows the number of states with active bills versus enacted laws; the right chart shows total bills introduced nationwide. Values are based on analyses from TechPolicy Press (2023), UNC/TechPolicy (2023), Huang et al. (2024), and NCSL (2025). 2025 values are estimated from active legislation and should be read as trend indicators rather than complete totals.

4. A Global Perspective on Child Online Safety

Globally, online safety rules vary dramatically—from comprehensive child-centered frameworks to minimal guidance. One way to understand the landscape is to compare national capabilities and policy maturity. The Child Online Safety Index (COSI) provides a composite view of policy, infrastructure, and digital safety measures for children.

The top performers tend to pair strong privacy law and clear platform duties with investment in digital literacy and reporting mechanisms. Lower scoring countries often lack enforceable standards or resourcing, making implementation inconsistent and leaving families to navigate safety on their own.

Figure 3: Top 10 countries on the Child Online Safety Index (COSI), published by the DQ Institute. Higher scores indicate stronger policy, infrastructure, and digital safety measures for children.

5. How Platforms and Institutions Are Responding

Schools apply device management, content filters, and awareness programs while negotiating the realities of BYOD and cloud-first tools. Many edtech vendors now include clearer privacy statements and more granular administrative controls, but the landscape is still uneven and requires careful review and configuration.

Platforms are expanding age-gating, parental controls, and safety settings, often prompted by regulation or public scrutiny. Progress is real—yet persistent gaps remain in verification, data minimization, and meaningful defaults. The best results come from pairing policy with product design and operational guardrails.

6. Building “Safety by Design” Into Digital Experiences

Safety by design starts with the defaults: private profiles, limited data collection, clear controls, and no dark patterns. Age-appropriate UX matters—interfaces should support younger users and avoid nudging them toward oversharing or endless feeds.

Engineering, product, and policy teams share accountability. Abuse reporting, logging, and monitoring must be reliable and actionable, with audited response workflows. Data retention should be minimized by default, and third-party integrations tightly scoped and reviewed.
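One way to read "safe defaults" in code is a settings object for minor accounts in which the safest value is the default, so no one has to opt in to protection. A minimal sketch, with all field names hypothetical:

```python
from dataclasses import dataclass


@dataclass
class MinorAccountSettings:
    """Safety-by-design: the safest values ARE the defaults.

    Hypothetical settings object — field names are illustrative,
    not any specific platform's API.
    """
    profile_visibility: str = "private"          # private until deliberately changed
    targeted_ads_enabled: bool = False           # no ad targeting for minors
    messages_from_strangers: bool = False        # contact limited to known connections
    personalized_feed_enabled: bool = False      # chronological feed by default
    data_retention_days: int = 30                # minimal retention unless extended


def default_settings_for_minor() -> MinorAccountSettings:
    # New minor accounts get the protective configuration with zero setup.
    return MinorAccountSettings()
```

Under this pattern, longer retention or feed personalization becomes a deliberate, reviewable opt-in rather than a silent default.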

When these principles are implemented together, organizations can protect children’s privacy and well-being while still delivering useful services that meet educational or community goals.

7. A Practical Checklist for Organizations

A short checklist helps teams align on priorities before diving into tooling or long policy rewrites:

- Inventory the platforms and vendors that touch children's data, and map the data flows between them.
- Minimize collection and retention by default; scope and review every third-party integration.
- Configure safe defaults for minors: private profiles, limited messaging, no targeted advertising.
- Verify vendor obligations under COPPA, FERPA, and applicable state laws in contracts, not just marketing claims.
- Test abuse-reporting and escalation workflows end to end; train staff and keep families informed.

8. What’s Next: AI, Deepfakes, and the Next Wave

AI-generated content will continue to shape the online environment—from benign personalization to convincing deepfakes and automated persuasion. The volume and quality of synthetic media raise the stakes for verification, provenance, and digital literacy, especially for younger users.

Lawmaking is accelerating worldwide, but alignment will take time. Organizations that treat safety as a continuous discipline rather than a compliance checkbox will adapt faster and protect users better.

9. Recent U.S. Congressional Scrutiny on Child Safety

On January 31, 2024, the Senate Judiciary Committee held a hearing titled “Big Tech and the Online Child Sexual Exploitation Crisis,” taking testimony from leaders at Meta, TikTok, Snap, Discord, and X. Senators pressed on grooming, algorithms, CSAM failures, and the amplification of risk by recommendation systems.

Earlier, on March 23, 2023, the House Energy & Commerce Committee explored “TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms,” focusing on data usage, algorithms, and age controls. Together, these hearings underscore political and public momentum around creating safer online spaces for minors.

10. Navigating the Future of Child Online Safety

A balanced approach—clear law, responsible platform design, and strong education—offers the best path forward. Safety for young users is not a single feature; it is the outcome of defaults, incentives, and accountability that align with children’s needs.

Fox Hill helps organizations build safe, compliant, and practical systems for minors—from data audits and architecture reviews to safety-by-design implementations.

Need help designing safer digital experiences for young users?

We can assist with data audits, technical architecture, and safety‑by‑design reviews tailored to your goals and constraints.



Sources