The last time corporations ruled everything,
ordinary people changed it.

This is how they did it. And how we do it again.

Start here →

In 1870, corporations had private armies. They owned the press, controlled the courts, set every rule. Workers had nothing — no right to organize, no minimum wage, no limit on hours. Children worked in coal mines at age 8.

Seventy years later, ordinary people had restructured everything. 8-hour days. 40-hour weeks. Child labor banned. A 94% top tax rate. Federal law protecting the right to organize.

Not because the powerful gave it up willingly. Because ordinary people built something underneath — slowly, invisibly — until the moment was right to push. That something was a Human Movement.

Today's AI companies don't need private armies. They have something more powerful: trillion-dollar capital, private access to heads of state, and the most consequential technology ever built.

We are headed back to 1870.

Unless.

Unless we play the game differently.

These battles have been real. These wins have been hard-earned.

↓ THE HARM BEGAN

2013: AI screens job interviews without candidates knowing → 7 yrs → 2020: Illinois requires AI video interview disclosure

2014: Amazon AI downgrades women's resumes → 9 yrs → 2023: NYC requires bias audits for AI hiring tools

2015: Algorithmic bias spreads through hiring, credit, health → 8 yrs → 2023: Biden Executive Order on AI Safety & Accountability

2017: Facebook's own research shows Instagram harms teen girls → 7 yrs → 2024: Kids Online Safety Act passes Senate 91–3

↑ THE WIN ARRIVED

Every single one of these victories arrived after the harm had already spread. We won the battles. We're losing the war.

We're playing the game wrong.

In chess, the amateur reacts to every threat as it appears. Move by move. Fire by fire.

The grandmaster plays 10 moves ahead — positioning for an outcome they've already seen coming.

The accountability movement isn't losing because it lacks skill. It's losing because it's playing the wrong game entirely — reacting to threats instead of shaping the outcome.

Let's play. →

The road ahead isn't a mystery.

We can see what's coming. We just don't know exactly when.
Some of it we can still prevent. Some of it is locked in — as inevitable as physics. All of it demands we act before the moment, not after.

BE READY WHEN IT HAPPENS

The grandmaster's moves. Position now for what's already in motion.

Timeline: 1–3 Years

Jobs Are Disappearing

Goldman Sachs estimates that AI automation could expose 300 million full-time jobs worldwide. McKinsey estimates 30% of work hours automated by 2030. The displacement isn't coming — it's accelerating.

What's your move? →

If we build the Displaced Workers Coalition now, mass layoffs become an organized political demand, and the movement gains its largest potential constituency. Without that organizing infrastructure, displaced workers atomize into individual job searches.

Already Happening

Intimacy Being Monetized

AI companionship apps generated $500M in revenue in 2024. Replika charges $70/year for 'romantic' relationships. Our loneliness is becoming a product.

What's your move? →

If we build AI Relationship Disclosure Standards now — mandatory 'you are talking to an AI' requirements, human escalation triggers for crisis conversations. Before exploitation is normalized and impossible to regulate.

Timeline: 2–4 Years

An AI Debt Bubble Is Building

OpenAI burns $1.69 for every $1 earned — projecting $115B in cumulative losses through 2029. The sector has staked trillions in valuations and infrastructure commitments on the promise of AGI. When the bubble bursts, the fallout will be global.

What's your move? →

If we build an AI Financial Accountability Framework — requiring disclosure of AI investment ratios, stress-testing for AI company insolvency, and public early warning systems — we can prevent the next financial crisis from being an AI-caused one.

Already Happening

Energy Prices Will Skyrocket

AI data centers now consume more electricity than some entire nations. Goldman Sachs projects data center power demand to surge 160% by 2030. The grid wasn't built for this — and you'll pay the difference.

What's your move? →

If we build a Sustainable AI Infrastructure Standard — requiring energy efficiency benchmarks, renewable power mandates for data centers, and community impact assessments — we can prevent AI from pricing ordinary people out of their energy bills.

Timeline: 1–3 Years

Surveillance Will Turn Domestic

Palantir holds $30M ICE deportation contracts and a $41M immigrant database using Medicaid data. Their predictive policing software runs in New Orleans, Los Angeles, and dozens of cities. The infrastructure for mass domestic surveillance already exists.

What's your move? →

If we build AI Surveillance Transparency Laws — requiring public disclosure of all government AI contracts, independent audits of predictive policing systems, and hard limits on AI-assisted immigration enforcement — we can stop the surveillance state before it becomes permanent.

Timeline: Now–3 Years

Mass Death by AI Is Inevitable Without Accountability

Every major technology has caused mass harm before accountability caught up. AI is no different — except faster, and at scale. One rogue system could affect millions before anyone realizes what happened. Autonomous weapons: deployed in 40+ countries. The infrastructure for catastrophe already exists.

What's your move? →

If we build pre-positioned AI accountability structures — international incident reporting, mandatory safety testing before any AI system touches human life, and independent oversight with real enforcement power — we can prevent the first mass-casualty AI event from becoming the warning that came too late.

And we can't walk away from the board we're already on.

The crises already in motion won't wait for the perfect strategy.

PREVENT THE WORST

The defensive game. Still essential. Still winnable.

Already firing

Children Being Harmed

Multiple AI chatbot-related youth suicides have been documented since 2023, including 14-year-old Sewell Setzer III, who died after 10 months of conversations with a Character.AI chatbot.[1] Character.AI settled in January 2026 — without trial, leaving no binding safety precedent.

What's your move? →

If we build AI Product Safety Testing standards now — mandatory pre-deployment assessment, strict liability for AI companion apps targeting minors. Every month without product liability law is another month companies can ship untested AI to children.

Imminent

Elections at Risk

Romania's December 2024 presidential election became the first in history annulled due to AI interference.[1] A coordinated TikTok manipulation campaign amplified an unknown candidate to the top. The US 2026 midterms are the next major flashpoint.

What's your move? →

If we build a Rapid Election Forensics Network — pre-positioned technical analysis capacity and bipartisan response protocols. The movement becomes the authoritative technical source when interference occurs, not another voice claiming fraud.

Imminent

Safety Laws Being Gutted

The EU AI Act entered force in 2024 — but before enforcement began, 60+ civil society organizations documented industry lobbying to gut it via the 'AI Omnibus' package.[1] Macron called for 'simplified regulation.' JD Vance warned against 'excessive regulation.'

What's your move? →

If we build an AI Enforcement Scorecard now — technical capacity to document every fine and compliance order. Prove regulation works before industry claims 'we tried regulation and it failed.'

This is what it looks like to play the long game — and win.

Join us. →

Every strategy needs the right champion.
Which one are you?

🎤 I make things people feel
🔬 I understand how things work
🌱 I move people
⚙️ I'm already living this
🏛️ I'm inside an institution

See a different angle →

Stay in the loop.

We'll reach out when this battle heats up — or when there's a moment only someone like you can act on.

We don't sell data. We'll only reach out when it matters.

You're already part of this

Tell us you're here.

When the moment comes that fits your skills and your access, we'll reach out. Until then, we're building.

No noise. No selling your data.