A few nights ago, sometime after midnight, I opened LinkedIn and immediately felt the urge to close it again.
Not because anything particularly offensive was there. In some ways, that was the unsettling part. The feed looked exactly as it always does now. Founders announcing AI pivots. Consultants narrating transformation strategies in carousel posts. Executives speaking with complete certainty about "the future of work." Twenty-three-year-olds calling themselves AI thought leaders after six months of prompting models they barely understand. Recruiters discussing "AI-native talent." Productivity evangelists promising tenfold acceleration. Every sentence carrying the same emotional undertone:
adapt,
accelerate,
optimize,
remain relevant.
I kept scrolling for a while anyway. And after a few minutes, I noticed the same feeling I've increasingly started associating with large parts of professional culture online: not irritation exactly, but estrangement. A strange sense that I was watching an enormous collective performance taking place in real time. Millions of people narrating adaptation simultaneously. Everyone trying to appear future-proof inside systems that nobody fully understands anymore.
What disturbed me was not the enthusiasm surrounding AI itself. The underlying breakthroughs are genuinely extraordinary. I say that sincerely. I studied computer science at Carnegie Mellon. I have spent years around enterprise systems, infrastructure, hedge funds, financial architectures, data pipelines, machine learning conversations, operating models, and AI strategy environments. Some of the most intellectually serious people I know are working on problems related to reasoning systems, compilers, retrieval architectures, semantic abstraction, distributed computation, and cognition itself.
That world still fascinates me. Or at least part of it does.
Because lately, I've begun to realize that the emotional atmosphere surrounding technology feels profoundly different from the one many of us originally entered. And I do not think the difference is merely generational.
Later that night, I was texting one of my closest friends from Carnegie Mellon — someone I genuinely admire for his rigor and seriousness. We've followed strangely parallel trajectories through technology and systems thinking. At some point, the conversation drifted toward AI, as conversations increasingly seem to these days. But not toward the technical side of it. Not transformers or reasoning models or context windows or retrieval systems.
Toward the feeling.
That's what neither of us could stop circling: the feeling. Not cynicism exactly. Not technophobia. Not even skepticism. Something stranger. A sense that the semantic center of gravity around technology had shifted so dramatically that many of us no longer fully recognized the ecosystem we were participating in.
Because technology, at least in the environments I once loved, was never primarily about products. The product was almost incidental. What mattered were the ideas underneath: the elegance of algorithms, the architecture, the abstraction layers, the strange beauty of systems behaving coherently, the intellectual satisfaction of solving difficult conceptual problems. There was something almost philosophical about it.
Some of my strongest memories from Carnegie Mellon are not tied to grades or recruiting or prestige. They are tied to conversations. Long wandering conversations about systems, cognition, distributed architectures, language, information theory, emergence, complexity. Hours spent thinking about problems that had no immediate market value attached to them whatsoever.
That world felt intellectually alive. And perhaps most importantly, it felt slow enough for thought itself to exist inside it.
"That is what I increasingly feel we are losing — the space slow enough for thought itself to exist inside it."
Somewhere along the way, technology stopped feeling like a domain organized around inquiry and started feeling like a domain organized around financialization. Every breakthrough now arrives already wrapped in market narratives. Every new capability immediately collapses into adoption curves, monetization strategies, quarterly expectations, growth forecasts, startup valuations, productivity metrics, and deployment pressure.
The distance between invention and extraction has almost completely disappeared.
The Loop
Social media accelerated this transformation enormously. Platforms like LinkedIn fundamentally altered the relationship between labor and identity. People no longer simply worked. They learned to narrate themselves working. Professional life became permanently visible, permanently performative, permanently optimized for legibility inside algorithmic systems.
At some point, the modern professional stopped being evaluated solely through actual contribution and started being evaluated through continuous narrative maintenance: personal branding, thought leadership, visibility, relevance signaling, engagement, future-readiness.
Even uncertainty became performative. Especially in AI discourse.
Sometimes I scroll through LinkedIn and genuinely feel as though I am watching a kind of collective adaptation panic unfolding in public. Everyone trying to demonstrate fluency simultaneously. Everyone terrified of appearing obsolete. Every company racing to signal AI integration before it fully understands what integration even means.
And because social media converts anxiety into visibility, the fear recursively amplifies itself.
Fear becomes content.
Content becomes engagement.
Engagement becomes professional legitimacy.
Professional legitimacy reinforces the behavior generating the fear.
The system loops. And because the loop rewards visibility over reflection, depth itself starts becoming economically maladaptive. The ecosystem increasingly rewards confidence before comprehension.
Acquisti and the Long Game
Years ago, while studying economics at Carnegie Mellon, I took classes with Professor Alessandro Acquisti, whose work focused on privacy economics and behavioral asymmetry inside digital systems. At the time, I understood the material academically. Privacy valuation. Information asymmetry. Behavioral incentives.
But I don't think I emotionally understood what he was actually observing.
Acquisti understood something long before most people did: human beings consistently surrender long-term autonomy in exchange for short-term convenience under conditions of informational asymmetry. The exchange rarely feels dramatic while it is happening. That is what makes it dangerous.
Nobody wakes up one morning and consciously decides: I would like my interior life transformed into behavioral infrastructure for advertising systems. The exchange happens incrementally. A little more convenience. A little more visibility. A little more optimization.
Social media normalized that pattern completely. We traded privacy for connection. Attention for stimulation. Interior life for visibility. Silence for participation.
And now AI appears poised to extend that same logic into cognition itself. The extraction no longer concerns data alone. It concerns language. Creativity. Communication. Memory. Reasoning. Accumulated human knowledge. Collective intellectual history.
"The substrate now is cognition itself."
The Three-Body Problem
In physics, the three-body problem emerges when three massive bodies exert gravitational force on one another simultaneously. Two-body systems admit clean, closed-form solutions. Add a third body, however, and no general solution exists; predictability begins collapsing, and tiny fluctuations in initial conditions produce disproportionate effects.
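For readers who want the mechanics behind the metaphor, here is a standard textbook formulation — a sketch of classical Newtonian gravity applied pairwise, not anything original to this essay:

```latex
% Equations of motion for three gravitating point masses.
% For each body i, with masses m_j and position vectors r_j:
\ddot{\mathbf{r}}_i
  = \sum_{j \neq i} G \, m_j \,
    \frac{\mathbf{r}_j - \mathbf{r}_i}{\lVert \mathbf{r}_j - \mathbf{r}_i \rVert^{3}},
  \qquad i \in \{1, 2, 3\}
```

Nine coupled nonlinear differential equations. With two bodies the system reduces to a solvable form; the third body destroys that integrability, which is why arbitrarily small perturbations in the starting state compound into wildly different trajectories.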
Artificial intelligence increasingly feels like a civilizational version of that problem. Not because the technology itself is chaotic. But because AI now exists at the unstable intersection of three systems moving according to fundamentally incompatible logics: innovation, markets, and governance.
Innovation accelerates because discovery compounds naturally. Markets move differently — they optimize for velocity, defensibility, market capture, and return. Not necessarily wisdom. And governance operates at bureaucratic speed inside a technological environment now evolving at computational speed.
The deeper question is whether human civilization is actually capable of metabolizing intelligence infrastructure at the speed we are currently scaling it. And I increasingly suspect the answer may be: not without enormous psychological cost.
The Grief
Modern life increasingly feels composed of layers so abstracted that ordinary people can no longer clearly perceive where power, responsibility, extraction, governance, or accountability actually reside. And those same people are somehow expected to emotionally process all of this while optimizing résumés, networking, learning AI tools, and maintaining continuous employability performance online.
Burnout no longer feels like ordinary overwork. It feels closer to the exhaustion of continuous performativity inside systems that no longer feel psychologically believable.
And underneath all of this, I increasingly sense another emotion hiding beneath the acceleration: grief. Not nostalgia exactly. Not resistance to progress. Something stranger. A grief for coherence. A grief for the feeling that the systems governing society once appeared comprehensible enough for human beings to orient themselves inside them meaningfully.
People followed the script they were handed:
study hard,
build skills,
stay adaptive,
remain employable,
optimize yourself continuously.
And yet despite doing everything correctly, many increasingly feel detached from their own working lives. Not because they failed individually. But because the underlying contract itself has shifted faster than society has collectively acknowledged.
The old maps still exist. But the terrain underneath them keeps moving.
I don't have a complete answer to any of this. I suspect nobody really does. But I increasingly believe there is value in naming the atmosphere honestly before the abstraction layers become so dense that we lose the language required to describe what is happening to us in real time.
Because civilizations do not only fracture materially. Sometimes they fracture semantically first.