Stop calling it 'The AI bubble': It's actually multiple bubbles, each with a different expiration date

It’s the question on everyone’s minds and lips: Are we in an AI bubble?

It's the wrong question. The real question is: Which AI bubble are we in, and when will each one burst?

The debate over whether AI represents a transformative technology or an economic time bomb has reached a fever pitch. Even tech leaders like Meta CEO Mark Zuckerberg have acknowledged evidence of an unstable financial bubble forming around AI. OpenAI CEO Sam Altman and Microsoft co-founder Bill Gates see clear bubble dynamics: overexcited investors, frothy valuations and plenty of doomed projects — but they still believe AI will ultimately transform the economy.

But treating "AI" as a single monolithic entity destined for a uniform collapse is fundamentally misguided. The AI ecosystem is actually three distinct layers, each with different economics, defensibility and risk profiles. Understanding these layers is critical, because they won't all pop at once.

Layer 3: The wrapper companies (first to fall)

The most vulnerable segment isn't building AI — it's repackaging it.

These are the companies that take OpenAI's API, add a slick interface and some prompt engineering, then charge $49/month for what amounts to a glorified ChatGPT wrapper. Some have achieved rapid initial success, like Jasper.ai, which reached approximately $42 million in annual recurring revenue (ARR) in its first year by wrapping GPT models in a user-friendly interface for marketers.
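The pattern is thin enough to sketch in a few lines. The code below is purely illustrative (all names are hypothetical, and the model call is a stand-in rather than a real API request), but it shows how little sits between the customer and the underlying model in a typical wrapper product:

```python
# A minimal sketch of the "wrapper" pattern. All names are hypothetical;
# the product's entire value-add is a prompt template around someone
# else's model.

MARKETING_TEMPLATE = (
    "You are an expert marketing copywriter. "
    "Write a {tone} product description for: {product}"
)

def call_model(prompt: str) -> str:
    # Stand-in for a hosted LLM API call (e.g., a chat-completions
    # endpoint). A real wrapper would send `prompt` over HTTPS and
    # return the completion.
    return f"[model output for: {prompt}]"

def generate_copy(product: str, tone: str = "friendly") -> str:
    """The whole 'product': a template, an API call and a price tag."""
    prompt = MARKETING_TEMPLATE.format(tone=tone, product=product)
    return call_model(prompt)

print(generate_copy("ergonomic desk chair"))
```

Everything defensible here (the model, the infrastructure, even the prompt-handling improvements) belongs to the upstream provider, which is exactly why the layer is so exposed.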

But the cracks are already showing. These businesses face threats from every direction:

Feature absorption: Microsoft can bundle your $50/month AI writing tool into Office 365 tomorrow. Google can make your AI email assistant a free Gmail feature. Salesforce can build your AI sales tool natively into their CRM. When large platforms decide your product is a feature, not a product, your business model evaporates overnight.

The commoditization trap: Wrapper companies essentially just pass inputs and outputs; if OpenAI improves its models or native prompting, these tools lose their value overnight. As foundation models become more similar in capability and pricing continues to fall, margins compress to nothing.

Zero switching costs: Most wrapper companies don't own proprietary data, embedded workflows or deep integrations. A customer can switch to a competitor, or directly to ChatGPT, in minutes. There's no moat, no lock-in, no defensibility.

The white-label AI market exemplifies this fragility. Companies using white-label platforms face vendor lock-in risks from proprietary systems and API limitations that can hinder integration. These businesses are building on rented land, and the landlord can change the terms, or bulldoze the property, at any moment.

The exception that proves the rule: Cursor stands as a rare wrapper-layer company that has built genuine defensibility. By deeply integrating into developer workflows, creating proprietary features beyond simple API calls and establishing strong network effects through user habits and custom configurations, Cursor has demonstrated how a wrapper can evolve into something more substantial. But companies like Cursor are outliers, not the norm — most wrapper companies lack this level of workflow integration and user lock-in.

Timeline: Expect significant failures in this segment from late 2025 through 2026, as large platforms absorb functionality and users realize they're paying premium prices for commoditized capabilities.

Layer 2: Foundation models (the middle ground)

The companies building LLMs — OpenAI, Anthropic, Mistral — occupy a more defensible but still precarious position.

Economic researcher Richard Bernstein points to OpenAI as an example of the bubble dynamic, noting that the company has made around $1 trillion in AI deals, including a $500 billion data center buildout project, despite being set to generate only $13 billion in revenue. The divergence between investment and plausible earnings "certainly looks bubbly," Bernstein notes.

Yet, these companies possess genuine technological moats: Model training expertise, compute access and performance advantages. The question is whether these advantages are sustainable or whether models will commoditize to the point where they're indistinguishable — turning foundation model providers into low-margin infrastructure utilities.

Engineering will separate winners from losers: As foundation models converge in baseline capabilities, the competitive edge will increasingly come from inference optimization and systems engineering. Companies that can scale the memory wall through innovations like extended KV cache architectures, achieve superior token throughput and deliver faster time-to-first-token will command premium pricing and market share. The winners won’t just be those with the largest training runs, but those who can make AI inference economically viable at scale. Technical breakthroughs in memory management, caching strategies and infrastructure efficiency will determine which frontier labs survive consolidation.
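The economics behind techniques like KV caching can be made concrete with a toy model. The sketch below is not any real inference engine's implementation (real systems cache per-layer key/value tensors on the accelerator; these names and the hash-based "encode" are illustrative stand-ins), but it shows why reusing a shared prompt prefix cuts the expensive compute per request:

```python
# Toy illustration of prefix (KV) caching: reuse computation for a
# shared prompt prefix instead of re-processing it on every request.
# All names are illustrative; real systems cache per-layer key/value
# tensors in GPU memory.

class PrefixCache:
    def __init__(self):
        self.cache = {}          # prefix -> precomputed "state"
        self.compute_calls = 0   # how often we did the expensive work

    def _expensive_encode(self, text: str) -> int:
        # Stand-in for running the model over `text` (the costly step).
        self.compute_calls += 1
        return hash(text)

    def encode(self, prefix: str, suffix: str):
        if prefix not in self.cache:
            self.cache[prefix] = self._expensive_encode(prefix)
        # Only the new suffix is processed from scratch.
        return self.cache[prefix], self._expensive_encode(suffix)

system_prompt = "You are a helpful assistant. " * 100  # long shared prefix
cache = PrefixCache()
for question in ["What is RAG?", "Explain KV cache.", "Define TTFT."]:
    cache.encode(system_prompt, question)

# 1 prefix encode + 3 suffix encodes, instead of 3 full-prompt encodes.
print(cache.compute_calls)  # 4
```

Multiply that saved prefix work by millions of requests against a long system prompt, and the gap between good and lazy inference engineering becomes a gross-margin line item.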

Another concern is the circular nature of investments. For instance, Nvidia is pumping $100 billion into OpenAI to bankroll data centers, and OpenAI is then filling those facilities with Nvidia's chips. Nvidia is essentially subsidizing one of its biggest customers, potentially inflating apparent AI demand.

Still, these companies have massive capital backing, genuine technical capabilities and strategic partnerships with major cloud providers and enterprises. Some will consolidate, some will be acquired, but the category will survive.

Timeline: Consolidation in 2026 to 2028, with 2 to 3 dominant players emerging while smaller model providers are acquired or shuttered.

Layer 1: Infrastructure (built to last)

Here’s the contrarian take: The infrastructure layer — including Nvidia, data centers, cloud providers, memory systems and AI-optimized storage — is the least bubbly part of the AI boom.

Yes, the latest estimates suggest global AI capital expenditures and venture capital investments already exceed $600 billion in 2025, with Gartner estimating that all AI-related spending worldwide might top $1.5 trillion. That sounds like bubble territory.

But infrastructure has a critical characteristic: It retains value regardless of which specific applications succeed. The fiber optic cables laid during the dot-com bubble weren’t wasted — they enabled YouTube, Netflix and cloud computing. Twenty-five years ago, the original dot-com bubble burst after debt financing built out fiber-optic cables for a future that had not yet arrived, but that future eventually did arrive, and the infrastructure was there waiting.

Despite stock pressure, Nvidia’s Q3 fiscal 2026 revenue hit about $57 billion, up 22% quarter-over-quarter and 62% year-over-year, with the data center division alone generating roughly $51.2 billion. These aren’t vanity metrics; they represent real demand from companies making genuine infrastructure investments.

The chips, data centers, memory systems and storage infrastructure being built today will power whatever AI applications ultimately succeed, whether that’s today’s chatbots, tomorrow’s autonomous agents or applications we haven’t even imagined yet. Unlike commoditized storage alone, modern AI infrastructure encompasses the entire memory hierarchy — from GPU HBM to DRAM to high-performance storage systems that serve as token warehouses for inference workloads. This integrated approach to memory and storage represents a fundamental architectural innovation, not a commodity play.
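That hierarchy can be sketched in miniature. The code below is not any vendor's design (the tier names, capacities and promotion policy are hypothetical), but it illustrates the basic idea: cached inference state is looked up from the fastest tier outward, and hot entries get promoted toward GPU memory on each hit:

```python
# Illustrative sketch (not any vendor's design): look up cached
# inference state through a memory hierarchy, promoting hot entries
# toward faster tiers on each hit. Tier names and capacities are toy
# stand-ins for GPU HBM, DRAM and high-performance storage.

class TieredStore:
    def __init__(self):
        # Tiers ordered fastest to slowest.
        self.tiers = [
            {"name": "hbm", "capacity": 2, "data": {}},
            {"name": "dram", "capacity": 4, "data": {}},
            {"name": "storage", "capacity": 100, "data": {}},
        ]

    def put(self, key, value, tier_index=2):
        tier = self.tiers[tier_index]
        if len(tier["data"]) >= tier["capacity"]:
            tier["data"].pop(next(iter(tier["data"])))  # naive eviction
        tier["data"][key] = value

    def get(self, key):
        for i, tier in enumerate(self.tiers):
            if key in tier["data"]:
                value = tier["data"][key]
                if i > 0:  # promote toward HBM on a hit
                    del tier["data"][key]
                    self.put(key, value, i - 1)
                return value, tier["name"]
        return None, None

store = TieredStore()
store.put("session-42", "kv-state")   # new state lands in slow storage
print(store.get("session-42")[1])     # served from "storage"
print(store.get("session-42")[1])     # promoted, served from "dram"
print(store.get("session-42")[1])     # promoted, served from "hbm"
```

The value in the real systems is exactly this orchestration across tiers, which is why the author argues the layer is an architectural innovation rather than a commodity play.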

Timeline: Short-term overbuilding and lazy engineering are possible (2026), but long-term value retention is expected as AI workloads expand over the next decade.

The cascade effect: Why this matters

The current AI boom won't end with one dramatic crash. Instead, we'll see a cascade of failures beginning with the most vulnerable companies, and the warning signs are already here.

Phase 1: Wrapper and white-label companies face margin compression and feature absorption. Hundreds of AI startups with thin differentiation will shut down or sell for pennies on the dollar. More than 1,300 AI startups now have valuations of over $100 million, with 498 AI "unicorns" valued at $1 billion or more, many of which won't justify those valuations.

Phase 2: Foundation model consolidation as performance converges and only the best-capitalized players survive. Expect 3 to 5 major acquisitions as tech giants absorb promising model companies.

Phase 3: Infrastructure spending normalizes but remains elevated. Some data centers will sit partially empty for a few years (like fiber optic cables in 2002), but they'll eventually fill as AI workloads genuinely expand.

What this means for builders

The most significant risk isn't being a wrapper — it’s staying one. If you own the experience the user operates in, you own the user.

If you're building in the application layer, you need to move upstack immediately:

From wrapper → application layer: Stop just generating outputs. Own the workflow before and after the AI interaction.

From application → vertical SaaS: Build execution layers that force users to stay inside your product. Create proprietary data, deep integrations and workflow ownership that makes switching painful.

The distribution moat: Your real advantage isn't the LLM; it's how you get users, keep them and expand what they do inside your platform. Winning AI businesses aren't just software companies — they're distribution companies.

The bottom line

It’s time to stop asking whether we're in "the" AI bubble. We're in multiple bubbles with different characteristics and timelines.

The wrapper companies will pop first, probably within 18 months. Foundation models will consolidate over the next 2 to 4 years. I predict that current infrastructure investments will ultimately prove justified over the long term, although not without some short-term overbuilding pains.

This isn't a reason for pessimism; it's a roadmap. Understanding which layer you're operating in and which bubble you might be caught in is the difference between becoming the next casualty and building something that survives the shakeout.

The AI revolution is real. But not every company riding the wave will make it to shore.

Val Bercovici is CAIO at WEKA.


