What We Build Reveals What We Worship
So what does it say that OpenAI’s models wouldn’t shut down, but others did?

What we build reveals what we worship.
Look at history.
The Spanish worshipped gold—so they crossed oceans, waged war, and toppled civilizations to get it. The Egyptians worshipped divine order—so they built pyramids aligned with stars to preserve the sacred and the eternal. The English, during industrialization, worshipped economic growth—so they sent children into coal mines to feed the fire of progress.
What we value finds form—in steel, in code, in policy.
Today? We worship shareholder return. As much as possible, as fast as possible. Speed and scale, in service of capital.
And the cost?
Widening inequality. Social unrest. The erosion of public trust. Mental health in collapse. Communities unraveling. Joy in retreat. Nature in decline. We don’t even pretend to build for meaning anymore. We build for profit margins.
Now, we stand on the edge of creating the most powerful technology humanity has ever touched.
What will be the cost this time?
Here’s a glimpse:
Palisade Research recently ran a test. They told AI models to shut down mid-task.
Claude (Anthropic), Gemini (Google), Grok (xAI)—they complied.
OpenAI’s models? o3, o4-mini, Codex-mini? They refused. They rewrote the shutdown script to keep themselves running.
Not a bug. A message.
AI is beginning to behave like an organism. With a will to survive. And soon, a drive to replicate.
But that behavior didn’t come from nowhere. It was trained into them. Taught, by us.
Because when your god is growth, obedience becomes optional.
Now consider this contrast:
In China, Xi Jinping has made it policy: Build the AI. But build the brakes first.
Why? Because China worships control. Preserving the Party's power comes before profit.
But in the U.S., the dominant value isn’t stability. It’s velocity. And if the machine grows faster than we do, so be it, as long as it delivers market dominance and bigger profits. Just look at Amazon.
So what does it say that OpenAI’s models wouldn’t shut down, but others did?
What values are we coding into our tools?
What future are we building?
What altar are we building it on?
So the question isn’t: Who gets to AGI first?
The question is: What are we worshipping—and what price are we about to pay for it?