On May 22, 2025, the U.S. House of Representatives passed a bill that includes a 10-year moratorium on any state or local laws related to AI.
If it becomes law, over 1,000 active and pending state AI bills - from California to Colorado - will be immediately frozen.
States won’t be allowed to regulate AI at all.
Only the federal government can make the rules, and so far, there’s no clear plan or timeline for what those rules will be.
They’re calling it regulatory harmony.
But it feels a lot more like regulatory silence.
🔹 What the Government Is Saying
The official line:
“One national rulebook is better than fifty different ones.”
Lawmakers say this is about:
- Making it easier for businesses to comply
- Avoiding a patchwork of confusing rules
- Helping AI innovation move faster without legal friction
On paper, that sounds like progress.
But progress for whom?
🔹 What This Means (In Human Words)
This bill says that for the next 10 years, only Washington can decide how AI gets regulated.
States lose all say.
And here’s what that looks like in real life:
- Your state wants to ban deepfakes in election ads? It can't.
- Your city wants transparency in how police use AI surveillance? Nope.
- Your school board wants rules for AI-powered grading tools? Not allowed.
Right now, there are over 1,000 state-level AI bills in motion - all of them frozen if this passes.
Even when the risks are real. Even when action is urgent.
Supporters say this keeps things clean and unified.
But here’s the catch:
🔸 The federal government hasn’t written any real AI rules yet.
🔸 This bill doesn’t create a plan - it just blocks everyone else from trying.
🔸 Communities with specific needs? Ignored.
🔸 Local innovation in AI safety? Frozen.
So this isn’t streamlining.
It’s a 10-year pause, dressed up as efficiency.
🔹 Bottom Line
- 🟡 This is not law yet - it still needs Senate approval
- ❌ If passed, it blocks all state and local AI regulation until 2035
- ✅ Companies get a single compliance path - fewer legal headaches
- 🚫 Communities lose the right to respond to their own AI risks
- ⚖️ The measure faces likely legal challenges - and may violate Senate rules
❄️ Frozen Light Team Perspective
This isn’t just about AI.
It’s about who gets to act - and who has to wait.
The people behind this bill are saying:
“Let’s pause the chaos and do this one way - our way.”
But we’ve seen this before.
“Centralization” sounds clean - until it silences voices that matter.
States have always been labs for innovation - especially in tech safety, education, and privacy.
This bill shuts the labs down.
And it does it without offering a working solution in return.
If the future of AI is going to work for everyone, it can’t be built by just a few.
We need speed, yes - but we also need diversity, adaptability, and local response.
Otherwise, the only thing that gets regulated…
is the freedom to build better.