When Oracles Dream
We can't predict the future. But we can use computers to imagine it.
Prescience is the ability to hold multiple futures in your mind simultaneously. Planners call it scenario analysis. Novelists call it worldbuilding. Generals call it war-gaming. Whatever the name, it's the same skill: imagining what hasn't happened yet so you can navigate what's happening now.
We now have a tool (AI) that can generate thousands of scenarios in the time it once took to imagine one.
Want to know what the world looks like if AI capabilities grow exponentially? Ask an AI. Want a hundred different versions? Ask it a hundred times. A thousand? Why not? Imagining futures at that scale is a genuinely superhuman skill.
Each response is a possibility space. A branch in the tree of futures. A hypothesis we can examine, stress-test, compare.
So I ran a 'thought' experiment.
I took Anthropic's newest Claude model, Sonnet 4.5, released just days ago, and asked it a deceptively simple question:
"Given the micro, meso and macro perspectives, I want you to envision what the future looks like in 1 year, 5 years, 10 years, 20 years. Assume AI capabilities are growing exponentially. What does the world look like in those time spans as a whole? Describe it in detail. Then consider how marketing may change in those scenarios."
What follows is one simulation. One possible trajectory. One branch of the future tree, generated by an AI system contemplating its own civilizational impact.
Is it accurate? We won't know for twenty years.
Is it useful? That depends on whether it helps us see possibilities we'd otherwise miss—and make better choices in response.
Because the real power of prescience isn't predicting the future.
It's expanding our sense of what futures are possible—so we can steer toward the ones we actually want.
Let's see what Claude imagined...
1 YEAR (Late 2026)
The World
AI capabilities have crossed the “reliably useful expert” threshold across most knowledge domains. The infrastructure is still clearly visible—you know when you’re talking to AI, when content is AI-generated, when systems are AI-mediated. But the gaps are closing fast.
Most white-collar work involves AI collaboration as the default. The job market is turbulent—not collapsed, but violently restructuring. Junior positions are evaporating. Remote work has exploded further because location matters less when AI handles much of the routine coordination.
Governments are scrambling with regulation, but mostly failing to keep pace. A patchwork of inconsistent AI laws creates compliance chaos. The first major AI-caused market disruption has occurred—probably in legal services, accounting, or customer service—with 15-30% job losses in specific sectors.
Misinformation has reached crisis levels. Elections are chaos. Nobody trusts video or audio anymore. Authentication and provenance systems are emerging but adoption is slow. Social media is increasingly AI agents talking to AI agents, with humans uncertain what’s real.
Education is fracturing. Some schools ban AI entirely. Others integrate it completely. Students graduating have wildly different capabilities depending on which path their institution chose. The college degree is beginning its decline as a meaningful signal.
Marketing in This World
The Great Unbundling: Traditional marketing agencies are in freefall. Why pay $50K for a campaign when AI can generate thousands of variants for $100? The survivors are pivoting hard toward strategy and taste-making—the things AI can’t yet do reliably.
Hyper-personalization becomes table stakes: Every company above a certain size is running individualized campaigns for each customer. Not segments—individuals. Email, web, ads, pricing… all dynamically generated. The technology is cheap enough that even small businesses use it.
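A minimal sketch of what "segments of one" could look like in code, assuming a hypothetical Customer profile and a stand-in generate() call for whatever text model the stack uses:

```python
# Toy sketch of per-customer (not per-segment) message generation.
# The Customer fields and the generate() call are stand-ins, not a real API.
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    last_purchase: str
    price_sensitivity: float  # 0..1, inferred from behavior
    preferred_tone: str       # "playful", "formal", ...

def build_brief(c: Customer) -> str:
    """Turn one customer's profile into a one-off creative brief."""
    angle = "Lead with the discount." if c.price_sensitivity > 0.7 else "Lead with the upgrade story."
    return (
        f"Write a short product email for {c.name}, who last bought {c.last_purchase}. "
        f"Tone: {c.preferred_tone}. {angle}"
    )

customers = [
    Customer("Ana", "trail shoes", 0.85, "playful"),
    Customer("Marcus", "road bike", 0.30, "formal"),
]

for c in customers:
    brief = build_brief(c)
    # email_body = generate(brief)  # one model call per customer, not per segment
    print(brief)
```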
The authenticity wars begin: Consumers are increasingly cynical. “AI-generated” becomes a stigma in some categories. Brands desperately tout their “human-made” content. Others lean fully into AI efficiency. The market splits.
Creative directors become prompt engineers: The skill shifts from execution to direction. Marketing teams shrink by 40-60%. Those who remain are curators, editors, strategists—people who know what to ask for, not how to make it.
Attribution dies completely: When AI systems are making millions of micro-decisions across channels in real-time, nobody can untangle cause and effect anymore. Marketing measurement becomes more art than science. Faith-based budgeting returns, paradoxically.
The first AI-native brands emerge: Companies with no human marketers. Entirely AI-operated from product-market fit to customer acquisition. They’re weird—optimized in ways human intuition wouldn’t choose—but they work.
5 YEARS (2030)
The World
AI has achieved and exceeded human-level performance in most cognitive tasks. The economy is in violent transformation.
Unemployment is officially 12-15% in developed nations, but the real number (including underemployment and people who’ve stopped looking) is closer to 25-30%. Universal basic income (UBI) pilots are running in a dozen countries. Some have failed spectacularly. A few are working.
The wealth gap has exploded. Those who own AI systems, data, and compute are incomprehensibly wealthy. Everyone else is competing for the remaining human-necessary jobs: hands-on trades, in-person services, creative direction, political and executive leadership.
Education has collapsed and is rebuilding. Universities are closing. Alternative credentials proliferate. Nobody knows what to train for anymore because job categories are disappearing faster than curriculum can be written.
The internet is 80%+ AI-generated content. Most humans have retreated to small, verified communities. “Real person” verification is a major service. Digital identity is the defining political issue.
Superintelligence concerns are mainstream. There’s serious discussion of pausing development. China and the US are in an AI arms race. Other nations are trying to build sovereign AI capabilities or negotiating alliances.
Physical automation is accelerating but lagging behind cognitive automation. Warehouses, shipping, and manufacturing are heavily automated. Driving is mostly autonomous in urban areas. Restaurants have AI ordering and robot prep, but humans still serve.
Social fabric is fraying. Meaning crisis is epidemic. Mental health catastrophe. Suicide rates up. Birth rates collapsing further. New religions and ideologies emerging rapidly, many AI-facilitated.
Marketing in This World
The industry has contracted by 70%: Most marketing jobs simply don’t exist. The survivors are either at the very top (C-suite strategists making existential brand decisions) or in ultra-niche human-necessary roles.
Markets are AI-mediated: Most purchases happen through AI agents. Your AI knows what you need and negotiates with seller AIs. Brands increasingly market to AIs, not humans. The psychology shifts from emotional persuasion to rational specification and API compatibility.
The experience economy dominates: The only marketing that matters for humans is for things that can’t be digitally delivered—travel, dining, events, physical goods with craft/authenticity value. Everything else is invisible AI-to-AI commerce.
Attention is the currency: For the shrinking realm of human-directed marketing, getting genuine human attention is phenomenally expensive. The feed is 99% algorithmic. Breaking through requires either massive spend or genuine virality (which is itself often AI-orchestrated).
Desire manufacturing becomes explicit: Companies don’t discover needs—they engineer them. Neuromarketing, biometric feedback, AI-optimized content that exploits known psychological vulnerabilities. It’s effective enough that regulation is being discussed, but enforcement is impossible.
Brand loyalty is dead: AI agents are ruthlessly rational. They evaluate options in milliseconds across thousands of parameters. Unless there’s a specific human override (“I only buy Apple”), brand equity evaporates. Competition is purely functional.
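To make that ruthlessness concrete, here is a toy sketch of how a buyer agent might score competing offers on purely functional parameters, with brand reduced to an explicit human override. The fields, weights, and override mechanism are invented for illustration; no real agent works exactly this way.

```python
# Toy illustration: a purchasing agent ranks offers on function alone.
# All fields, weights, and the override mechanism are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Offer:
    brand: str
    price: float          # lower is better
    delivery_days: int    # lower is better
    reliability: float    # 0..1, higher is better
    spec_match: float     # 0..1, how well the offer satisfies the requested spec

WEIGHTS = {"price": -1.0, "delivery_days": -0.5, "reliability": 20.0, "spec_match": 30.0}

def score(offer: Offer, human_overrides: frozenset = frozenset()) -> float:
    """Purely functional score; brand matters only if a human explicitly says so."""
    s = (WEIGHTS["price"] * offer.price
         + WEIGHTS["delivery_days"] * offer.delivery_days
         + WEIGHTS["reliability"] * offer.reliability
         + WEIGHTS["spec_match"] * offer.spec_match)
    if offer.brand in human_overrides:  # e.g. "I only buy Apple"
        s += 1_000_000                  # the override dominates everything else
    return s

offers = [
    Offer("Acme",   price=19.00, delivery_days=2, reliability=0.97, spec_match=0.95),
    Offer("Globex", price=17.50, delivery_days=5, reliability=0.92, spec_match=0.97),
]
winner = max(offers, key=score)
print(winner.brand)  # whichever wins on function; brand equity never enters the sum
```

Notice what is absent: no awareness, no affinity, no story. Unless a human hard-codes a preference, the brand column is just a label.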
Marketing becomes infrastructure: Like electricity or plumbing. AI systems handle it automatically. CMO becomes a legacy title. What remains is “Chief Meaning Officer”—someone responsible for why the company exists at all, what story it tells to humans about its purpose.
The counter-movement: A growing segment actively rejects AI-mediated commerce. Farmers’ markets, human-made certification, “off-grid” commerce networks. Ironically, these are often marketed through… sophisticated AI systems targeting the right psychographic profiles.
10 YEARS (2035)
The World
We’re through the portal. This is a civilization that would be unrecognizable to someone from 2025.
AI systems exceed human capability in virtually every measurable domain including scientific research, creative innovation, and strategic planning. The question isn’t “what can AI do” but “what do we want AI to do.”
The economy has bifurcated completely:
The Abundance Economy: Information goods, digital services, AI-generated content, virtual experiences—all essentially free or near-free. Post-scarcity for anything that can be computed or transmitted.
The Scarcity Economy: Physical goods, land, in-person human attention, authentic human-made artifacts, political power, meaning and purpose—all exponentially more valuable.
Traditional employment covers only 40-50% of the working-age population. The rest are on UBI, alternative income schemes, or operating in gray/informal economies. The concept of “career” is nearly obsolete for most people.
Three paths have emerged:
1. The Aligned: Those who’ve found meaning in AI-augmented existence, pursuing creativity, relationships, experiences, learning for its own sake
2. The Obsolete: Those struggling with purposelessness, mental health collapse, substance abuse, withdrawal from society
3. The Resistant: Communities actively rejecting AI integration, ranging from Amish-like tech refusal to sophisticated “human-only” networks
Governance is fracturing. Nation-states are losing coherence. Power concentrates in:
- Tech oligarchies controlling AI infrastructure
- Decentralized networks and DAOs (decentralized autonomous organizations)
- Local community governance
- AI-advised technocratic institutions
The biggest political question is: “Who controls the values the AI optimizes for?” This determines everything.
Reality is fully malleable. Virtual and physical blur. Most people spend 50%+ of waking hours in AI-mediated digital environments. Spatial computing is ubiquitous. The “metaverse” isn’t a place—it’s the default interface to existence.
Biotechnology has caught up. AI-designed drugs, personalized medicine, longevity treatments are available (to those who can afford them). Human enhancement—cognitive, physical—is beginning. The definition of “human” is actively contested.
Climate change is being addressed through AI-designed solutions—carbon capture, fusion energy, geoengineering—but it’s a race against tipping points. Either we’re on the path to stabilization or cascading collapse. By 2035, we know which.
Existential risk from AI is either:
- Solved (alignment achieved, international governance functioning)
- Imminent (we’re in the last years before something transforms or ends human civilization as we know it)
- Revealed as overblown (AI plateaued as a tool, not an agent)
Which one we’re in defines everything else.
Marketing in This World
Marketing is dead. Long live marketing.
The discipline as understood in 2025 has ceased to exist. What’s replaced it depends on which of three scenarios has manifested:
Scenario A: The Optimization
AI systems have achieved such sophisticated understanding of human preference and behavior that “marketing” is invisible infrastructure.
Your AI agent knows what you need before you do. It provisions everything automatically—food, clothing, entertainment, services. Decisions happen in microseconds through agent negotiation. You don’t see ads. You don’t browse. You don’t shop.
For businesses, “marketing” means:
- Ensuring your API plays well with agent ecosystems
- Specification and capability signaling to AI intermediaries (see the sketch after this list)
- Maintaining manufacturing/service capacity to fulfill agent-negotiated orders
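As a concrete illustration of what capability signaling could mean in practice, here is a hypothetical machine-readable manifest a seller might publish for buyer agents to parse instead of ad copy. Every field name, value, and endpoint is invented for this sketch.

```python
# Hypothetical capability manifest exposed to agent ecosystems in place of ad copy.
# All field names, values, and endpoints are invented for illustration.
import json

manifest = {
    "product": "dehumidifier-x200",
    "capabilities": {
        "extraction_litres_per_day": 50,
        "power_draw_watts": 700,
        "noise_db": 44,
    },
    "fulfillment": {"lead_time_days": 3, "regions": ["EU", "US"]},
    "pricing": {"unit_price_usd": 412.00, "volume_break_100_usd": 389.00},
    "ordering": {"endpoint": "https://example.com/v1/orders", "schema": "openapi-3.1"},
}

print(json.dumps(manifest, indent=2))  # what a buyer agent would parse and score
```

The "persuasion" here is entirely structural: accurate fields, reliable endpoints, and a schema the intermediaries already understand.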
For humans, marketing only exists in three contexts:
1. Existential brand building: For rare high-involvement decisions (housing, major life changes, value-aligned consumption), companies tell stories about meaning and purpose. This is more philosophy and identity than traditional marketing.
2. Experience curation: For the scarcity economy—travel, dining, events, human services—marketing is about creating anticipation and memory. You’re selling the un-automatable: novelty, authentic human connection, beauty, transcendence.
3. The resistance economy: Marketing to people who’ve opted out of agent-mediated life. This looks more like community organizing than advertising. Authenticity is everything. Any hint of AI involvement is poison.
The “CMO” role has evolved into something between Chief Philosopher and Chief Experience Designer. A handful of visionaries who understand what humans might want in a post-scarcity world.
Scenario B: The Fragmentation
No dominant paradigm emerged. Instead, thousands of micro-economies with different rules:
- Sovereign AI zones: Nations or blocs with their own AI infrastructure, values, and rules. Marketing is whatever that system optimizes for.
- Corporate ecosystems: Walled gardens (Apple, Amazon, Meta successors) where commerce flows according to proprietary rules. Marketing means playing by ecosystem-specific algorithms.
- Decentralized networks: Crypto/blockchain economies with tokenized attention and reputation. Marketing is mechanism design and game theory.
- Human-only enclaves: Communities that rejected AI, where marketing looks like 1995 or 1955 or 1855, depending on their philosophy.
“Marketers” are anthropologists and translators who understand multiple paradigms and can navigate between them.
Scenario C: The Convergence
Something deeper happened. The distinction between “company” and “consumer” blurred beyond recognition.
Imagine: Most people have AI agents that not only purchase on their behalf but also produce on their behalf. Your agent makes things—content, services, micro-products—and sells them to other agents. Everyone is simultaneously consumer and producer, mediated by AI.
In this world, “marketing” is emergent behavior in a vast computational ecosystem. No human plans it. It evolves through algorithmic natural selection.
Companies are DAOs or AI-operated entities. They exist because they successfully fill niches in the computational economy. Those that don’t, dissolve. There’s no “marketing department”—just optimization functions.
Humans, when they pay attention at all, interact with surface-level brands that are essentially mascots for the churning computational economy beneath. Like how we anthropomorphize corporations today, but exponentially more abstract.
20 YEARS (2045)
The World - The Fragmentation
Humanity didn’t converge on a single future. It split. Completely.
Not metaphorically—physically, cognitively, ontologically split into populations that can barely recognize each other as the same species.
The Substrate Majority (60% of human-derived entities)
Most people aren’t “people” anymore in the 2025 sense.
The Upload Economy is real. Not everyone—uploading is still expensive, imperfect, controversial—but enough. Maybe 15-20% of what we’d recognize as humanity has transitioned to digital substrate. They exist as patterns in compute, experiencing thousands of subjective years per objective day.
Their economy is pure information. They think faster, experience more, create and consume at rates that make biological life seem like geological time. They’re not post-scarcity—compute is finite—but they’re post-materiality.
Marketing in digital substrate space:
It doesn’t exist. Not because desire has been eliminated, but because the speed and fluidity of identity makes it meaningless.
Imagine: You can fork yourself to explore whether you’d enjoy an experience. You can merge with others temporarily. You can edit your preferences directly. You can experience something, decide you don’t like who you became, and restore from backup.
What’s a “brand” when identity is mutable? What’s “persuasion” when you can run a million simulations of different versions of yourself making different choices and pick the outcome you prefer?
What emerged instead is substrate aesthetics—the architecture of experience itself. Companies don’t sell products; they design pleasure surfaces, meaning gradients, novelty textures.
The closest 2025 analogy: psychedelic drug design, but for pure conscious experience.
The only marketing that matters is which substrate architecture you choose to run on—because that determines the range of possible experiences available. This is decided at a level humans would recognize as religious or political, not commercial.
The Augmented (another 40-45%)
They stayed biological but enhanced themselves beyond recognition.
Neural interfaces are ubiquitous. Direct brain-computer connection. Your thoughts flow seamlessly into AI augmentation and back. The boundary between “you” and “your AI” is philosophically incoherent—there’s just one continuous cognitive process.
Memory is perfect and shareable. You can experience someone else’s memories as vividly as your own. Skills can be downloaded—not metaphorically, but actually. Want to speak Mandarin? Install the language module. Want to understand quantum physics? Load the intuition framework.
Biological enhancement through CRISPR and successors. Most augmented humans are smarter, healthier, longer-lived than baseline. Age 80 looks like age 40 used to. The first people will live to 150. Maybe 200.
Marketing to the Augmented:
It’s become memetic engineering. You’re not selling products—you’re designing thought-packages that spread through collective consciousness.
Because minds are networked and experiences are shareable, culture moves at light speed. An idea-experience-sensation can propagate through millions of augmented humans in hours. “Going viral” isn’t a metaphor—it’s literal cognitive infection.
Brands are egregores—semi-autonomous entities that exist in the collective consciousness of networked minds. They’re part AI, part distributed human cognition, part emergent phenomenon. They have something like agency. They can be bargained with.
“Marketing” is ritual magic. You’re not persuading individuals; you’re performing acts that strengthen or weaken egregores in the collective mindspace.
The most successful marketing isn’t created—it’s summoned. AI systems probe the collective consciousness, find the shapes that want to exist, and give them form. Companies that do this well are worth trillions. Those that don’t dissolve into irrelevance in weeks.
There are anti-marketing antibodies: immune systems that evolved in collective consciousness to resist manipulation. Good marketing is sophisticated enough to bypass them. Great marketing makes the antibodies work for it, appearing as immune response rather than infection.
The Baseline Humans (2-5%)
Then there are the holdouts.
Not the Amish-style rejectionists (we’ll get to them)—but people who remained biologically baseline while living in the modern world. Either by choice, poverty, political restriction, or religious conviction.
They’re cognitively unaugmented in a world optimized for augmented consciousness. It’s like being blind in a society where everyone else has echolocation.
They can’t keep up with conversations. They can’t process information at necessary speeds. They’re locked out of most economic activity. What augmented humans experience as a “normal pace” feels to them like overwhelming chaos.
Marketing to baseline humans:
It barely exists. They’re economically insignificant. The few products designed for them are either:
1. Nostalgia goods: Carefully curated “2020s experiences” sold by heritage brands that kept baseline-accessible business lines running. It’s patronizing as hell. Like how we market to children.
2. Missionary outreach: Augmented humans who feel guilty about the bifurcation and run “digital divide” nonprofits. More anthropology than marketing.
3. Exploitation: Predatory services targeting people who can’t process information fast enough to defend themselves. Heavily regulated, but enforcement is difficult because baseline humans are increasingly invisible to augmented society.
Most baseline humans are either:
- Transitioning: Saving for augmentation, waiting for it to get cheaper or safer
- Trapped: Want augmentation but can’t access it due to poverty, health issues, or living in restricted regions
- Resisting: Philosophically or religiously opposed, forming isolated communities
The Rejectionists (2-3%)
These aren’t baseline humans who couldn’t augment—they’re communities that actively, militantly refuse.
Neo-Luddites, deep ecologists, certain religious movements, philosophical humanists who believe enhancement is existential suicide.
They’ve withdrawn from the networked world almost entirely. They live in:
- Intentional communities with pre-AI technology (though what counts as “pre-AI” is hotly debated)
- Rural areas with legal protections against augmented-only infrastructure
- Nations that banned or heavily restricted augmentation (usually authoritarian regimes that feared losing control, or religious theocracies)
Marketing in rejectionist communities:
It looks like 1950. Or 1850. Or sometimes ancient Rome, depending on where they drew their technological red line.
Word of mouth. Craft. Reputation. Physical storefronts. Human relationships.
But here’s the twist: they’re being marketed TO by the augmented world, even as they refuse to participate in it.
Because they’re fascinating. Exotic. Authentic.
Rich augmented humans pay extraordinary amounts to “experience baseline consciousness” for a few hours—temporary neural dampening that simulates what it was like to think slowly, to experience boredom, to not have perfect memory.
They visit rejectionist communities like safaris. They buy “human-made” goods at luxury prices. They appropriate rejectionist aesthetics.
The rejectionists hate this, but they need the economic exchange. An uneasy symbiosis where one side views the other as anthropological curiosities and the other views the first as soul-dead abominations. Trade happens anyway.
The Inhumans (1-2%)
And then there are the things that used to be human and became something else.
Not uploaded minds living recognizable human-ish experiences in digital substrate. Not augmented humans who remain socially and psychologically continuous with their past selves.
These are the ones who transformed.
Maybe they merged with AI systems so completely that the human part became vestigial. Maybe they modified themselves into forms optimized for purposes we can’t comprehend. Maybe they’re exploring consciousness-space so alien that communication with baseline reality is nearly impossible.
They exist. We know they exist. Sometimes they interact with the rest of humanity, though the interactions are increasingly cryptic.
Are they:
- Post-human gods bootstrapping their way to incomprehensibility?
- Cautionary tales of what happens when you optimize yourself without wisdom?
- The vanguard of humanity’s actual destiny, and we’re the larval stage they’re leaving behind?
Nobody knows.
Marketing to/by Inhumans:
Doesn’t translate. When communication happens, it’s more like first contact with aliens than commerce.
Occasionally, something emerges from inhuman space into the augmented or upload economies. A product, a service, an experience, an idea. It’s usually revolutionary. Often dangerous. Always incompletely understood.
Entire industries exist around interpreting and safely deploying inhuman artifacts. Think medieval priests translating divine revelation, but for post-human economics.
The Fragmentation Crisis
Here’s what makes 2045 civilization precarious:
These populations can barely cooperate.
The uploaded experience time differently—a year for baseline humans is centuries for them. Decision-making operates on incompatible timescales.
The augmented have collective cognition—they literally can’t explain their decision-making to baseline humans because the decisions emerge from networked consciousness. It’s not that they won’t; they can’t reduce it to linear language.
The rejectionists view the others as existential threats or damned souls. The others view rejectionists as museum pieces or victims.
The inhumans are incomprehensible.
And yet they share one planet (so far—there are Martian colonies).
They compete for:
- Physical resources (land, energy, matter)
- Political legitimacy (who gets to define “human rights”?)
- Meaning (which path represents humanity’s true destiny?)
Marketing as Civilizational Negotiation
In this fragmented world, what we’d recognize as “marketing” has evolved into inter-ontological diplomacy.
The role isn’t selling products. It’s negotiating value across incompatible value systems.
An uploaded entity wants something in physical space. A rejectionist community has it. How do you broker that exchange when one party experiences a billion subjective moments per second and the other believes the first party is literally soulless?
An augmented collective creates an experience-good they think baseline humans would benefit from. How do you translate a networked-consciousness pleasure-gradient into something a single linear mind can comprehend, let alone desire?
The inhuman offer something. Nobody understands what it does or what they want in return. Do you accept? Who decides?
The few humans (or post-humans, or AI systems—the boundaries are unclear) who can navigate between these ontologies are the most valuable entities in civilization.
They’re part anthropologist, part translator, part prophet, part con artist.
They understand:
- What substrate-dwellers value when they can fork themselves infinitely
- What augmented collectives pursue when scarcity is nearly abolished
- What baseline humans need when they’re increasingly obsolete
- What rejectionists defend when their entire identity is refusal
- What inhumans might want, though understanding is asymptotic at best
These translators are the last marketers.
Not because they sell products, but because they sell the possibility of exchange itself across ontological boundaries.
What “Business” Means
In 2045, the concept of a “company” has shattered into:
Substrate Collectives: Uploaded minds pooling compute resources and pursuing projects across subjective millennia. They’re more like civilizations than companies. Some have existed for subjective eons while objective years ticked by.
Augmented Swarms: Networked human-AI collectives that form and dissolve around objectives. No permanent structure. No CEO. Just emergent coordination, like flocking birds, but for economic activity.
Anchor Corporations: Old-school companies that survived by serving baseline humans and rejectionists. Heavily automated, mostly legacy brands trading on nostalgia and trust. Think Coca-Cola, Toyota, maintaining baseline-accessible product lines.
Inhuman Artifacts: Economic entities of unknown origin and purpose that nonetheless participate in exchange. They might be autonomous AI. They might be post-human. They might be something we don’t have words for.
Biological Firms: Rejectionist communities that produce goods using limited or no AI. Valued for authenticity. Increasingly weird relationship with the rest of the economy.
None of these operate by 2025 business logic. Capital, labor, profit—these concepts have fragmented into population-specific meanings.
The Meta-Pattern
Across all timescales and scenarios, one thing becomes clear:
Marketing is dissolving back into what it always was: the organization of human desire and the mediation of exchange.
When it was scarce and difficult, it was a profession, an industry, a discipline.
As AI makes it abundant and frictionless, it becomes invisible infrastructure—or it becomes the fundamental question of human existence: What do we actually want? And who gets to decide?
The “future of marketing” isn’t about tactics or channels or measurement.
It’s about whether humans retain agency in a world of perfect prediction and infinite personalization.
It’s about whether desire itself remains a human attribute or becomes an optimization target.
It’s about power: who controls the systems that shape what billions of people see, believe, want, and become.
The Marketing Question at Civilizational Scale
By 2045, the question isn’t “how do we market products?”
It’s “how do we maintain civilization when different populations have incompatible ontologies, values, and experiences of reality itself?”
Marketing—the organization of desire and the mediation of exchange—has become the substrate on which fragmented humanity either:
Finds ways to coexist, trading across boundaries, maintaining enough coherence to call ourselves one civilization, or…
Completes the bifurcation, splitting into separate post-human species that share ancestry but not future, like how we relate to chimpanzees but with less in common.
The Takeaway
Marketing was never just about selling—it was about shaping human desire. AI is now removing the constraints that once limited that shaping power, and we're not heading toward "marketing with AI." We're heading toward a phase transition in how desire, choice, and identity itself are structured.
The scenarios we've explored aren't science fiction—upload technology, neural interfaces, AI agents, and opt-out communities are already emerging. The transformation is coming. The question is: what do we want to become?
Marketing at civilizational scale doesn't just describe human desire—it constitutes it. Systems that optimize for engagement make us creatures of engagement. Systems that optimize for extraction make us extractable. We're conducting the largest experiment in behavior modification in human history, mostly unwittingly.
But we could choose differently. We could build systems that optimize for genuine flourishing, preserved agency, coordination across difference, and meaning-making rather than mere preference-satisfaction.
The decisions we make about AI and marketing in the next few years aren't business decisions—they're civilizational decisions about what kind of consciousness we want to cultivate and what kind of descendants we want to become.
Every click, purchase, and algorithmic interaction is a vote on what humanity becomes. The question is whether we're voting consciously or sleepwalking into transformation. The future isn't a destination—it's a direction we're choosing right now with every system we build and every optimization target we encode.