
🤖
On the June 23, 2025 cover of TIME Magazine, a gleaming mechanical eye stares out from the page.
Below it, the unsettling headline reads:
“ARE YOU HUMAN?”
Sam Altman wants to find out.
It’s not fiction.
It’s not satire.
It’s where we are now:
The builders of artificial intelligence are no longer asking whether machines can be human —
they are asking if we still are.
This eerie inversion tells us everything we need to know.
The question isn't merely strange; it's revealing.
It suggests a shift in power so profound, so unexamined,
that the very systems we built to serve us now demand proof of our own legitimacy.
And their answer is a sphere.
The Worldcoin ORB: a device that scans your iris, stores your biometric signature, and links it to a global digital ID.
In a world where AI will mimic everything — voices, faces, texts, memories —
they’re offering a machine to certify that you’re real.
But here’s the real message:
They know what they’ve built will erase the boundary between human and synthetic.
And their solution is to brand you in order to protect you.
From “Don’t Be Evil” to “Don’t Be Human”?
Years earlier, Google tried to offer its own ethical north star:
“Don’t be evil.”
But to many outside Silicon Valley, it came across as... odd.
Most people don’t walk around reminding themselves not to be evil.
That’s not moral superiority — it’s moral baseline.
That it had to be said out loud betrayed something deeper:
That those building the future knew, even then, how much power they were accumulating —
And how precarious their moral framework was.
By 2018, Google quietly dropped the phrase from its code of conduct.
(Source: Gizmodo, 2018)
It wasn’t evolution.
It was erosion.
What these moments reveal — the ORB and the dropped motto —
is the same unsettling truth:
That too often, the people shaping the most powerful systems on Earth
are brilliant at math and abstraction,
but untrained in empathy, ethics, or humility.
The Blind Spot of the Puzzle-Solver
This is not about malice.
It’s about narrowness.
A culture of puzzle-solvers and coders, optimizing for speed, scale, and elegance —
but often blind to the human context they are reshaping.
So they build tools to simulate people,
then tools to detect those simulations,
and finally tools to verify that you are not one of them.
A world where you must prove you are human,
because the humans building it forgot what humanity actually is.
The Future Needs More Than Code
“What touches all should be decided by all.”
— Ancient Roman proverb, quoted in Vox
This moment demands a wider table.
Not just engineers and investors — but poets, parents, historians, and healers.
Not just those who can scale intelligence — but those who can protect life.
Letting the market and its math define our shared future
is not just a bad idea —
it’s an unnatural one.
The natural world doesn’t centralize power.
It distributes it.
It evolves through mutualism, not monopolies.
So must we.
We Are Not a Problem to Solve
We are not your training set.
We are not your data stream.
We are not a line of code to be audited or confirmed.
We are human.
And that should never require verification.