The latest “emergency” for America’s attorneys general isn’t fentanyl, organized fraud, or corporate crime.
It’s “online harms.”

To solve it, a new alliance has appeared: state attorneys general, Roblox and other platforms, and a web of NGOs and foundations under the banner of youth online safety. On paper, this sounds like protection. In practice, it looks more like the construction of a permission-based internet, where access to speech is increasingly tied to identity, and private platforms act as auxiliaries of the state.

A soft name for a hard architecture

At the center of this shift is the Attorney General Alliance (AGA), a nonprofit forum representing attorneys general from 49 US states and territories. In October 2025 it announced the Partnership for Youth Online Safety — an initiative that promises to “reduce online harms” through “practical, collaborative, and measurable design-based safeguards.”(prnewswire.com)

Beneath the soft PR language, the core goals are explicit:

  • develop design standards for “safe” products
  • build rapid, lawful data-sharing frameworks between companies and law enforcement
  • strengthen long-term partnerships between platforms, regulators, and NGOs

“Rapid, lawful data sharing” sounds unobjectionable until you realize it describes a permanent pipeline between private user data and law enforcement, standardized and normalized across the industry. Once such a pipeline exists, its scope is defined not by code or physics, but by whatever lawmakers or regulators later decide counts as “harm.”

Today, the justification is child safety. Tomorrow, it can be disinformation, extremism, “hate,” or simply “threats to democracy.” The mechanism is the same.

Roblox as test case: lawsuits on one side, praise on the other

The first big tech partner in the AGA initiative is Roblox, the children’s gaming platform. AGA has publicly praised Roblox for its “readiness to collaborate directly with policymakers” and highlighted the partnership as a model for youth safety.(prnewswire.com)

At the same time, Roblox is being sued by multiple states (including Texas, Louisiana, and Kentucky) for allegedly failing to protect kids from predators and harmful content.(PC Gamer)

That contradiction is instructive:

  • On one track, attorneys general attack platforms in court for not doing enough.
  • On another, the same ecosystem rewards cooperation through alliances like the Partnership for Youth Online Safety.

The message to platforms is simple:
Work with us, or we’ll work on you.

Roblox already runs heavy, AI-driven moderation — scanning chat, flagging uploaded assets, and filtering personal information before human staff ever see it.(Roblox) This is presented as cutting-edge safety. In reality it helps normalize a surveillance-by-default model for minors that, once embedded, can easily be extended to adults through “age verification” mandates.
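The pipeline described above (machine-scan every message, redact personal information, flag content before any human sees it) can be sketched in a few lines. This is a hypothetical illustration of the surveillance-by-default pattern, not Roblox's actual system; the patterns, blocklist, and function names are invented for the example.

```python
import re

# Illustrative PII patterns; a real system would use trained classifiers.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),        # phone-number-like strings
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email-like strings
]

# Toy stand-in for an ML content classifier.
BLOCKLIST = {"address", "meet me"}

def moderate(message: str) -> dict:
    """Return a moderation decision for one chat message.

    Every message passes through the filter before human review:
    PII is redacted, and anything matching the blocklist (or that
    needed redaction) is flagged.
    """
    redacted = message
    for pat in PII_PATTERNS:
        redacted = pat.sub("[filtered]", redacted)
    flagged = any(term in message.lower() for term in BLOCKLIST)
    return {
        "human_sees_original": False,  # staff only ever see the filtered output
        "redacted": redacted,
        "flagged_for_review": flagged or redacted != message,
    }
```

The structural point is in the return value: the human reviewer is downstream of the machine by design, which is exactly what makes the same pipe easy to repoint at a law-enforcement data-sharing framework later.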

The more these automated filters and classifiers become tied into formal data-sharing frameworks with law enforcement, the more platforms start to resemble privatized regulatory agencies rather than neutral infrastructure.



KOSA: from safety bill to enforcement backbone

The legal backbone for this model is the Kids Online Safety Act (KOSA).

KOSA passed the US Senate in July 2024 in a 91–3 vote, bundled with new children’s privacy rules (COPPA 2.0).(commerce.senate.gov) The bill failed to clear the House before the session ended, but has since been reintroduced in the new Congress with broadly similar language.(Wikipedia)

KOSA does a few key things:

  • Imposes a “duty of care” on major platforms to prevent and mitigate content deemed harmful to minors.
  • Requires design changes to limit “addictive” features and strengthen default privacy for under-17s.(alstonprivacy.com)
  • Gives the Federal Trade Commission and state attorneys general enforcement powers over what counts as “harmful” content to children.(Wikipedia)

On its face, this sounds like consumer protection. But civil liberties groups, including the Electronic Frontier Foundation, have warned that KOSA’s approach effectively hands regulators — and by extension, large platforms — broad discretion to decide which topics minors should be shielded from, incentivizing over-censorship by default.(Squarespace)

The enforcement logic pushes platforms toward two outcomes:

  1. Aggressive content filtering, to avoid the risk that any controversial topic (from gender identity to politics) might later be labeled “harmful.”
  2. Age and identity checks, to distinguish minors from adults, which in practice means dismantling anonymous access.

KOSA’s politics are powerful because the bill is framed around grief and protection. After emotional testimonies from parents who lost children or saw them harmed via social media, few senators wanted to be seen voting “against” child safety. But high-emotion law is often blunt law — and blunt law in the digital sphere tends to land on speech and anonymity first.

The NGO and philanthropy layer: moral cover for infrastructure

The AGA partnership and KOSA aren’t appearing in a vacuum. They sit on top of a dense ecosystem of NGOs and philanthropic networks that have spent years shaping “responsible tech” and “kids’ safety” narratives.

Key players include:

  • Center for Humane Technology (CHT), co-founded by Tristan Harris, which champions “humane design,” has promoted ideas for “human-authenticated real-ID type systems,” and supports KOSA and related youth-safety bills.(The Washington Post)
  • Youth-branded coalitions like Log Off Movement and Design It For Us, which lobby for age-based design codes and stricter platform regulation while being structurally tied to large funding vehicles such as the North Fund and major foundations.(InfluenceWatch)
  • Philanthropic hubs like the Omidyar Network, which has channeled significant grants into these organizations and into broader “Responsible Technology Youth Power Fund” initiatives that explicitly push for stronger online safety regulation.(Omidyar Network)

This doesn’t make their concerns about youth mental health fake. It does mean that policy direction is heavily pre-steered:

  • The answer is never “teach kids better threat models and reduce data extraction.”
  • The answer is almost always: more regulation, more platform obligations, more identity and age checks, and more data sharing in the name of “duty of care.”

The NGOs supply the moral story. Philanthropy supplies the cash. Lawmakers and attorneys general supply the coercive power. Platforms supply the technical stack.

From safety narrative to ID-gated speech

Taken together, the Partnership for Youth Online Safety and KOSA’s enforcement model reveal a shared trajectory:

  1. Standardize “safety” expectations across platforms via design codes and best practices.
  2. Build data pipes so that law enforcement and platforms can exchange information quickly and quietly.(prnewswire.com)
  3. Tie speech to identity through age and ID verification systems framed as necessary for compliance.
  4. Outsource enforcement to automated tools and trust-and-safety teams acting as a semi-regulatory class.

Under this model, anonymity isn’t formally outlawed; it just becomes incompatible with compliance. If every “covered” service must prove it has kept minors away from harmful content, the simplest path is to demand hard identity checks from everyone.

That is the shift hiding inside phrases like “rapid, lawful data sharing” and “human-authenticated real ID-type systems.”(prnewswire.com)

The result isn’t a safer internet in the broad sense. It’s a more tightly instrumented one:

  • Platforms have more visibility into who you are.
  • Governments have more visibility into what you do.
  • NGOs and foundations gain more leverage in defining what counts as acceptable digital behavior.

The Partnership for Youth Online Safety, then, is less a single project than a governance template. It fuses the legal authority of attorneys general, the infrastructure of Big Tech, and the legitimacy language of civil-society organizations into a system where “protecting kids” becomes the all-purpose justification for monitoring everyone.

Once that kind of architecture is in place, the hardest part isn’t building it.
The hardest part is ever saying no to using it for something new.


📚 Sources

  • Attorney General Alliance – press material on the Partnership for Youth Online Safety.(prnewswire.com)
  • Roblox corporate “Safety Snapshot” and related coverage on its role as founding tech partner.(Roblox)
  • US Senate and legal analyses of the Kids Online Safety Act (KOSA) and COPPA 2.0.(commerce.senate.gov)
  • Civil liberties critiques of KOSA and youth-safety regulation from EFF and related groups.(Squarespace)
  • Documentation of Omidyar-funded youth and “responsible tech” coalitions (Log Off, Design It For Us, etc.).(Omidyar Network)


By Pressi Editor

If you quote this text, please include a link back to the source!