
AI Is Scaling Faster Than Anyone Expected

US tech giants dominate global markets, but growth dynamics are shifting. Companies stay private longer while AI drives unprecedented expansion. With nearly $400 billion in annual infrastructure CapEx, the landscape of growth investing is undergoing a massive transformation.

The premise of the current technology market is straightforward yet profound: tech markets are larger than they have ever been, and the most dominant companies are US-based technology firms. A look at the most valuable companies globally reveals that technology has effectively swallowed the market, with the majority of the top ten market cap leaders being tech giants.

However, the dynamics of how these companies grow and how investors approach them have shifted fundamentally. Companies are staying private longer—often up to 14 years—while the advent of Artificial Intelligence (AI) has introduced a tailwind that is expanding the market faster than any previous cycle. From infrastructure build-outs to shifting bottlenecks in energy, the landscape of growth investing is undergoing a massive transformation.

Key Takeaways

  • Unprecedented Infrastructure Investment: Big tech companies are deploying nearly $400 billion in annual CapEx, primarily into AI infrastructure, creating a massive supply-side foundation for new applications.
  • Rapid Cost Deflation: The cost of accessing AI models has declined by roughly 99% over the last two years, outpacing Moore’s Law, while performance capabilities double approximately every seven months.
  • Global Distribution Speed: AI has achieved adoption speeds previously unimaginable—ChatGPT reached 365 billion annual searches within two years of launch, a milestone that took Google 11 years to achieve.
  • Shifting Bottlenecks: While compute supply is the current constraint, the industry anticipates energy production and data center cooling will become the primary bottlenecks within five years.
  • The Private Market Shift: High-growth opportunities are increasingly concentrated in private markets, as public software companies show slower growth rates compared to the "hypergrowth" era of the early 2000s.

The Magnitude of the AI Infrastructure Build-Out

The groundwork currently being laid for the AI revolution is distinct from previous technological cycles in both scale and stability. While the dot-com era was characterized by telecom debt and speculative build-outs, the current AI infrastructure boom is largely funded by the balance sheets of the most profitable companies in history: Google, Meta, Amazon, and Microsoft.

Current estimates suggest an annual capital expenditure run rate of roughly $400 billion from these major players, the vast majority of which is directed toward data centers and AI infrastructure. This massive overbuild is a feature, not a bug; these companies can afford to bear the burden of potential capacity oversupply. For the broader ecosystem, this creates a robust foundation upon which startups can build without needing to capitalize the infrastructure themselves.

The groundwork that's being laid is bigger than anything we've ever seen before... the infrastructure is going to get built for all of the training and inference needs that the market is going to need.

Cost Deflation vs. Quality Improvement

Alongside this massive spend is a dramatic divergence between cost and quality. Over the last two years, the input cost (the price of accessing these frontier models) has declined by more than 99%, a rate of decrease significantly faster than Moore's Law.

Conversely, frontier model capabilities are improving by a factor of two roughly every seven months. This combination of plummeting costs and skyrocketing intelligence suggests that AI will eventually function like a utility—similar to electricity or WiFi—where the underlying resource is ubiquitous and affordable, shifting the value to what is built on top of it.
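The scale of that divergence is easy to check with back-of-the-envelope math. The sketch below takes the article's two figures at face value (a 99% cost decline over 24 months, and Moore's Law read as a halving of cost per transistor roughly every 24 months) and converts them into comparable halving times; the 24-month Moore's Law baseline is an assumption for the comparison, not a figure from the article.

```python
import math

# Figures assumed from the article: frontier-model access cost fell ~99%
# over two years; Moore's Law is taken here as a cost halving roughly
# every 24 months (a stated assumption, not an article figure).
decline_months = 24
cost_ratio = 0.01  # a 99% decline leaves 1% of the original cost
moore_halving = 24.0

# Solve cost_ratio = 0.5 ** (decline_months / halving) for the implied
# halving time of AI access costs.
ai_halving = decline_months * math.log(2) / math.log(1 / cost_ratio)

print(f"AI access cost halves every ~{ai_halving:.1f} months")
print(f"Moore's Law baseline: halving every ~{moore_halving:.0f} months")
print(f"Implied speedup: roughly {moore_halving / ai_halving:.1f}x faster")
```

Under these assumptions the cost of AI access halves roughly every 3.6 months, several times faster than the Moore's Law baseline, which is what the article means by "outpacing Moore's Law."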

Demand Signals and Consumer Surplus

Unlike previous cycles where infrastructure had to precede adoption (e.g., laying fiber cables before broadband usage or manufacturing smartphones before mobile app adoption), AI benefits from immediate global distribution. Because it is built on the back of the existing internet and cloud computing, it reached a global audience instantly.

The demand signals are staggering. OpenAI reached scale five and a half times faster than Google did in its early years. Today, estimates suggest there are between 1.5 and 2 billion active users of AI products globally, with roughly half of the global internet population having tried these tools.

The 90/10 Rule of Economic Value

A critical consideration for investors and founders is where the economic value of AI will accrue. Historical patterns suggest that technology creates massive consumer surplus. A general rule of thumb is that 90% of the value goes to the end customer (in the form of efficiency, time saved, or "magic"), while 10% is captured by the companies providing the service.

However, given the size of the AI market—which is poised to be significantly larger than the software market due to its impact on the broader economy—capturing just 10% represents a monumental market capitalization opportunity. The challenge for companies lies in monetization strategies that effectively capture this slice without eroding user trust.
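The arithmetic behind that claim can be made concrete with a hypothetical figure. In the sketch below, the $10 trillion of total economic value is an invented number purely for illustration; only the 90/10 split comes from the article.

```python
# Hypothetical illustration of the article's 90/10 rule of thumb.
# The total-value figure is an assumption for the sketch, not an
# estimate from the article.
total_value_b = 10_000  # $10T of hypothetical economic value, in $B
provider_share = 0.10   # ~10% captured by the companies providing the service
consumer_share = 1 - provider_share

consumer_surplus_b = total_value_b * consumer_share
provider_capture_b = total_value_b * provider_share

print(f"Consumer surplus: ~${consumer_surplus_b:,.0f}B")
print(f"Provider capture: ~${provider_capture_b:,.0f}B")
```

Even at a 10% capture rate, a market measured in trillions of dollars of created value implies a provider opportunity measured in hundreds of billions, which is the article's point about market capitalization.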

Future Bottlenecks: Energy and Cooling

As the constraint on chips and compute production begins to ease through manufacturing scaling, the bottlenecks for AI expansion are expected to shift toward physics and infrastructure.

The Energy Constraint

Current energy production methods are becoming a limiting factor. The industry is already seeing a pivot toward nuclear energy to power the next generation of data clusters. Major tech companies are exploring co-location with nuclear plants, and there is renewed optimism for restarting dormant facilities like Three Mile Island. While natural gas in regions like West Texas offers a stopgap, the long-term solution requires new energy sources.

Thermodynamics and Cooling

Beyond energy generation, the next frontier of engineering challenges will likely be cooling. Managing the heat output of massive GPU clusters without environmental damage or hardware failure will drive a new wave of innovation in data center design and thermal management.

Once we figure out how to generate all this energy, how to actually cool all this stuff down without boiling our oceans... you'll see a whole wave of innovation around that part as well.

Evolving Business Models and Metrics

The introduction of AI is forcing a re-evaluation of traditional SaaS metrics. While the "seat-based" subscription model remains dominant, there is significant experimentation regarding usage-based pricing and outcome-based monetization.

Analyzing Gross Margins

There is currently intense scrutiny regarding the gross margins of AI application companies, particularly regarding their reliance on third-party models. However, smart investing requires a forward-looking view on input costs. Because the cost of model inference is dropping so precipitously (the 99% decline mentioned earlier), companies with lower margins today may see significant expansion as competition among model providers (OpenAI, Anthropic, Google) drives prices down further.

Consequently, investors are placing higher premiums on gross retention and customer love rather than penalizing early-stage AI companies for temporary margin compression. If the product delivers undeniable value and retains users, the unit economics are expected to correct over time.

Durability of Revenue

Stickiness in AI applications comes from integration into workflows. While simple "wrapper" applications or prototyping tools may suffer from high churn, platforms that integrate deep into enterprise rules engines, medical workflows, or customer support protocols are showing high durability. The more "context" an application has about a specific business's operations, the harder it is to rip out.

The Shift to Private Markets

A defining characteristic of this era is the migration of growth from public to private markets. In the past, companies went public within 5 to 10 years. Today, the timeline is often 14 years or longer. This has resulted in a private market capitalization of roughly $3.5 trillion—a figure that has grown 7x over the last decade.
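The 7x-in-a-decade figure translates into an implied annual growth rate that is worth making explicit. The sketch below assumes smooth compounding over exactly ten years; the article only gives the endpoint multiple.

```python
# Implied compound annual growth rate (CAGR) for a market that grew 7x
# over ten years, assuming smooth compounding (an assumption for this
# sketch; the article states only the 7x endpoint).
growth_multiple = 7.0
years = 10

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied CAGR: ~{cagr:.1%}")  # roughly 21.5% per year
```

A roughly 21% annual growth rate for an entire asset class underlines why growth-focused investors are moving earlier into the private markets.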

This shift has implications for public market investors. Currently, roughly 95% of public software and internet companies are forecasting growth of less than 25%. The "hypergrowth" assets are almost exclusively private. This dynamic forces a change in investment strategy, necessitating access to companies much earlier in their lifecycle.

Investment Strategy: Momentum vs. Research

In this environment, successful capital deployment often falls into two distinct buckets:

  1. Undeniable Momentum: Companies that are "flying off the page" with product-market fit, adoption, and revenue growth (e.g., coding assistants like Cursor).
  2. Elite Research Teams: Early-stage bets on the top five or so research teams in the world. These are high-variance bets where business risk is high, but the scarcity of talent (the "Ilya Sutskever" level researchers) provides a form of downside protection due to the sheer value of the human capital involved.

Conclusion

The AI revolution is scaling faster than any previous technology cycle, driven by an unprecedented convergence of capital, infrastructure build-out, and consumer demand. While risks remain—particularly around energy constraints and the eventual monetization of consumer surplus—the structural shift is undeniable. For investors and founders alike, the variance in outcomes has increased, but so has the potential upside. As the infrastructure settles, the next decade will likely be defined by the applications that successfully leverage this new utility to reimagine workflows and economic value.
