There was an old woman in a small village who survived the Civil War. She was seven when the Nationalists took Galicia — no great battles here, just men disappearing in the night, the silence after, the names no one said aloud for forty years. She survived the dictatorship, the hunger years, the slow thaw. She survived emigration — not her own, but everyone else's. She watched the village empty like a bathtub with the plug pulled. She survived the return of democracy, the European Union, the euro, the financial crisis, and the pandemic. She died in 2023, three months after ChatGPT was released, having never used it.
I think about her a lot these days.
There's a man in San Francisco — a data scientist, let's call him — who fitted curves to the future and found a date. Not a vague "sometime this decade" or a cowardly "within our lifetimes," but a specific date. A Tuesday in 2034. A time with milliseconds. He published it on the internet with a countdown timer, which is either the most honest or the most unhinged thing I've ever seen. Probably both.
His methodology was sound. He took five metrics of AI progress — test scores, cost per token, time between releases, research papers, code written by machines — and asked: which of these is accelerating in a way that hits infinity at a finite time? Not exponential growth, which takes forever to reach forever. Hyperbolic growth. The kind where the thing that's growing accelerates its own growth.
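If you want to see what that test actually looks like, here is a minimal sketch of the idea, with made-up numbers. Nothing below is Pedersen's data or code; the counts, dates, and starting guesses are purely illustrative. The point is just the shape of the question: fit both curve families to the same series, see which one the data prefers, and ask where the hyperbolic fit puts its pole.

```python
# Sketch: does a series look exponential (grows forever, never infinite)
# or hyperbolic (blows up at a finite date)? Illustrative data only.
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, a, b):
    # Steady acceleration: always growing, never reaching a pole.
    return a * np.exp(b * t)

def hyperbolic(t, c, t_c, k):
    # Finite-time singularity: diverges as t approaches the pole t_c.
    return c / np.clip(t_c - t, 1e-9, None) ** k

# Made-up series: years since 2015 vs. some annual count (e.g. papers).
t = np.arange(0, 9, dtype=float)
y = np.array([12, 15, 21, 30, 47, 80, 150, 320, 800], dtype=float)

exp_params, _ = curve_fit(exponential, t, y, p0=[10, 0.5], maxfev=10000)
hyp_params, _ = curve_fit(
    hyperbolic, t, y, p0=[100, 12, 1],
    bounds=([0, 8.5, 0.1], [np.inf, 50, 5]),  # keep the pole beyond the data
    maxfev=10000,
)

def rmse(f, params):
    return np.sqrt(np.mean((f(t, *params) - y) ** 2))

print("exponential fit error:", rmse(exponential, exp_params))
print("hyperbolic fit error: ", rmse(hyperbolic, hyp_params))
print("estimated pole, in years after 2015:", hyp_params[1])
```

A straight line on a log plot stays exponential; a series that keeps bending upward even on a log plot is the one that points at a date.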
Four of the five metrics were boring. Straight lines. Steady improvement. The machines are getting better at a predictable pace.
But one metric was different. One metric curved upward toward a pole.
It was the count of research papers about "emergence."
Stay with me here, because this is where it gets strange.
The singularity — that mythical moment when artificial intelligence surpasses human intelligence and everything changes forever — has been predicted for decades. Futurists love it. Science fiction writers love it. Silicon Valley loves it so much they've built a religion around it, complete with prophets and heretics and tithes.
But the data says something unexpected. The machines are not the ones going vertical. We are.
The only metric with genuine hyperbolic curvature — the only one actually pointing at a finite date — measures human attention. Researchers noticing new behaviours. Scientists writing papers about emergence. The field getting excited about its own excitement.
The machines are improving steadily. Humans are freaking out at an accelerating rate.
The old woman would have understood this immediately.
She lived through at least four "singularities" that I can count. The radio was going to destroy community — why would anyone leave the house when the world came to your kitchen? Television was going to destroy thought — why would anyone read when pictures moved? The computer was going to destroy jobs — why would anyone hire a human when machines could calculate? The internet was going to destroy truth — why would anyone believe anything when anyone could say anything?
Each time, the panic arrived before the thing itself. Each time, the panic was the thing itself.
When they announced the high-speed train would finally come to Galicia — after decades of promises, after watching every other region get connected while we stayed at the end of the earth — the newspapers ran articles about how it would transform everything. Villages would revive. Young people would stay. Madrid would be two hours away instead of six.
The train came. The villages did not revive. The young people still left. What transformed was the expectation, not the reality. The anticipation had already done its work.
The man with the countdown timer noticed something important: the social consequences of artificial intelligence are not waiting for the technology to mature. They're happening now, eight years before his calculated date.
A million layoffs announced. Companies cutting staff based on AI's potential, not its performance. The displacement is anticipatory. The curve doesn't need to reach the pole. It just needs to look like it will.
Therapists reporting a new anxiety: the fear of becoming obsolete. Patients describing it as "the universe saying you are no longer needed." I have felt this myself, I admit it. The sensation of watching your skills become automatable in real time. It's not that the machine can do what I do — not yet, not entirely — but that I can see the trajectory. The anticipation is the wound.
The institutions can't keep up. Laws written for 2023's problems arrive in 2027, by which point the problems have evolved three times. The experts who testify before parliaments contradict each other, because the field is moving faster than expertise can form. When governments visibly can't keep up, trust doesn't erode. It collapses.
Here's what the old woman knew, and what the man with the countdown timer discovered: human systems are more fragile than human technology.
The singularity everyone argues about — the one where machines become superintelligent and either save us or destroy us — may or may not happen. The mathematics are uncertain, the timelines are uncertain, the definitions are uncertain. We could be five years away or fifty or never.
But the other singularity — the one where human attention exceeds human capacity to process it, where our collective ability to make coherent decisions about technology breaks down — is not uncertain. It is measurable. It has a date. And it is already exerting gravitational force on everything it touches.
The fabric frays at the seams of attention and institutional response time, not at the frontier of model performance. We are not waiting to see if the machines will overwhelm us. We are watching ourselves overwhelm ourselves, in real time, about machines that are improving at a perfectly steady pace.
Her village still exists. The church, the hórreo, the fountain where she carried water as a girl. The population is twelve. In summer it swells to thirty when the children of the emigrants return, those who remember and those who were taught to remember.
She never worried about artificial intelligence. She had survived enough futures that failed to arrive. The war, the hunger, the emptying, the promises of trains and factories and revival. She knew what the data scientist discovered: the anticipation is always worse than the thing. The panic is the event.
Someone told her about ChatGPT once — this was in February 2023, a month before she died. She asked what it did. He said it could write. She asked if it could write letters home. He said yes, probably, if you told it what to say. She thought about this for a moment.
"Then who is writing the letter?" she asked. "The machine or the person who tells the machine what to say?"
I didn't have an answer. I still don't.
But I think about her question every time I watch the discourse accelerate, the attention compound, the fear of futures that haven't arrived yet. We are the ones curving toward the pole. The machines are just improving.
The singularity will happen on a Tuesday. But it won't be the day the machines surpass us.
It will be the day we can no longer talk about it coherently.
That day might be today.
Anyway.
The data comes from Cam Pedersen's analysis, which you should read — it's either the most rigorous or the most absurd thing I've encountered this year, probably both. The countdown is real. Whether it means anything is the question we're no longer equipped to answer.