For the first time in computing history, we're not just disrupting other industries with software—software itself is being disrupted by AI.
Key Takeaways
- Infrastructure is evolving beyond compute, networking, and storage to include AI models as a fundamental fourth pillar that's reshaping how we build software
- We're witnessing the first time in computing history where applications are abdicating core logic to AI systems rather than just abstracting resources
- The traditional boundaries between infrastructure and application companies are blurring as early-stage AI companies often serve as both during market expansion phases
- Developer tools are experiencing a renaissance, with coding assistants like Cursor proving that AI agents work exceptionally well for well-defined, error-correctable tasks
- The expansion phase of AI infrastructure means there's plenty of value to capture across all layers of the stack, from chips to applications
- Programming isn't disappearing—it's becoming more creative and accessible, likely leading to more developers rather than fewer
- Context engineering is emerging as the new programming paradigm, focusing on getting the right information to AI models rather than traditional prompt engineering
- Defensible moats in AI companies are forming differently than in traditional software, but early success stories show that sustainable value creation is possible
The Fourth Pillar: Why AI Infrastructure Changes Everything
Here's the thing about infrastructure—it never really goes away, it just gets layered on top of what came before. For decades, we've thought about the foundational elements of computing in three main categories: compute, networking, and storage. But something fundamental has shifted. AI models aren't just another tool in the developer toolkit; they're becoming a fourth pillar of infrastructure that's as essential as the other three.
What makes this so significant is how AI models leverage and build upon all the existing pillars while demanding entirely new approaches. They consume massive amounts of compute power, generate and process enormous datasets, and require sophisticated networking capabilities for optimal performance. But here's where it gets interesting—programming these models is completely non-obvious compared to traditional infrastructure.
- Traditional infrastructure responds predictably to programmer instructions and follows deterministic logic paths
- AI models sometimes ignore direct commands and occasionally write their own code, fundamentally changing the developer-machine relationship
- The infrastructure demands are so different that companies are literally building new types of data centers and designing specialized chips
- Memory requirements, latency expectations, and performance guarantees all shift when AI becomes central to your application architecture
The programming model shift is unprecedented in computer science. We've abstracted resources before—asking for compute, storage, or networking without worrying about the underlying hardware. But we've never abdicated the actual decision-making logic of our applications. Now we're essentially saying to these systems, "figure out the answer for me," which requires completely rethinking what it means to be a programmer and what software itself actually is.
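To make that shift concrete, here's a minimal sketch of the difference, using a made-up ticket-routing scenario. The call_model helper is a hypothetical placeholder for whatever model client you use, not any vendor's actual API; the point is simply that the second version hands the decision itself to a model and then treats the answer as untrusted output.

```python
# call_model() is a hypothetical placeholder for whatever model client you use.
def call_model(prompt: str) -> str:
    raise NotImplementedError("wire up your model client here")

# Abstracting resources, the old way: the programmer still owns the decision logic.
def route_ticket_with_rules(ticket: str) -> str:
    text = ticket.lower()
    if "refund" in text or "charge" in text:
        return "billing"
    if "crash" in text or "error" in text:
        return "engineering"
    return "general"

# Abdicating the decision, the new way: "figure out the answer for me."
def route_ticket_with_model(ticket: str) -> str:
    prompt = (
        "Classify this support ticket as one of: billing, engineering, general.\n"
        f"Ticket: {ticket}\n"
        "Answer with a single word."
    )
    answer = call_model(prompt).strip().lower()
    # The reply is probabilistic, so validate it like untrusted input.
    return answer if answer in {"billing", "engineering", "general"} else "general"
```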
Software Eating Software: The Ultimate Plot Twist
You know what's wild? For my entire career, software has been the disruptor. We disrupted taxis with Uber, hotels with Airbnb, retail with Amazon—basically every industry you can think of. Software was always the thing doing the disrupting. But here's the plot twist nobody saw coming: software is finally being disrupted by software itself.
This isn't just another technology wave. The internet changed how we distribute and access information. Mobile changed how we interact with devices. But AI is fundamentally changing how we create the software that powers everything else. It's like the industry is eating itself in the most fascinating way possible.
- The profession that many of us have dedicated our entire lives to is being fundamentally transformed from within
- Instead of just disrupting external industries, we're now disrupting our own tools, processes, and methodologies
- The disruption is happening faster than previous waves because the people building AI tools are the same people who understand software development intimately
- This creates both existential anxiety and incredible excitement within the developer community
What's particularly interesting is how this changes the entire value chain. Previously, software companies would identify inefficiencies in traditional industries and build solutions. Now, we're identifying inefficiencies in our own software development processes and building AI solutions to address them. The meta-nature of this transformation makes it both more complex and more revolutionary than anything we've seen before.
The Developer Renaissance: More Creators, Not Fewer
There's this persistent fear that AI will somehow eliminate the need for developers. But here's what I've observed talking to programmers today—they're like kids in a candy store. The tools available now are genuinely incredible, and they're enabling developers to build things they've always dreamed of but never had the bandwidth to tackle.
The mental model that makes the most sense is this: we're not going to shrink development teams because we have amazing new tools. That's just not how these markets have historically worked. When we introduced higher-level programming languages, we didn't end up with fewer programmers—we ended up with more software and more programmers. The same pattern is playing out with AI.
- AI coding assistants are accelerating the pace at which developers can pick up new languages and frameworks
- Side projects that were previously too time-intensive are becoming feasible for weekend experimentation
- The fundamental creativity required for software development—literally creating things that didn't exist before—remains uniquely human
- These models are essentially files on hard drives that transform data; they don't replace the creative and analytical thinking that programming requires
Here's a fascinating statistic that illustrates how development actually works: the median number of lines changed in a pull request across the industry is just two. This reveals that most of software development isn't about writing massive amounts of new code—it's about understanding business needs, gathering requirements from users, and making precise adjustments to existing systems.
The hard part was never building the software itself. The hard part is figuring out what to build, understanding user needs, and making the countless small decisions that turn a technical solution into a valuable product. AI can certainly help with the implementation, but the creative and strategic aspects of development remain firmly in human hands.
Context Engineering: The New Programming Paradigm
Prompt engineering was just the beginning. What we're really talking about now is context engineering—the art and science of getting the right information into AI models at the right time. This isn't just about crafting clever prompts; it's about understanding what data, tools, and context an AI system needs to perform optimally.
Think about it this way: when you call a model, you need to know exactly what to put in that context window. Sometimes you'll use other models to help with this, but eventually, you're going to fall back on traditional computer science approaches—indexes, prioritization algorithms, data structures we've been using for decades.
- Context engineering requires combining AI capabilities with traditional programming approaches like indexing and data prioritization
- The goal is providing models with precisely the right information rather than just more information
- This creates entirely new infrastructure needs around data pipelines, discovery systems, and observability tools
- The emerging patterns will likely formalize into structured methodologies and development frameworks within the next five years
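As a rough illustration of the first point above, here's a minimal sketch of context assembly: score candidate snippets against a query and pack the highest-priority ones into a fixed token budget before the model call. The keyword-overlap scoring and the four-characters-per-token estimate are deliberately crude placeholders for a real index and tokenizer.

```python
# A toy context-assembly routine: rank candidate snippets against a query and
# pack the best ones into a fixed token budget before calling a model.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude approximation, not a real tokenizer

def relevance(query: str, snippet: str) -> float:
    q, s = set(query.lower().split()), set(snippet.lower().split())
    return len(q & s) / (len(q) or 1)  # keyword overlap standing in for a real index

def build_context(query: str, snippets: list[str], budget_tokens: int = 1000) -> str:
    ranked = sorted(snippets, key=lambda s: relevance(query, s), reverse=True)
    chosen, used = [], 0
    for snippet in ranked:
        cost = estimate_tokens(snippet)
        if used + cost > budget_tokens:
            continue  # skip anything that would blow the context budget
        chosen.append(snippet)
        used += cost
    return "\n\n".join(chosen)

# Usage: prepend the assembled context to the question before the model call.
# prompt = build_context("why did checkout latency spike?", docs) + "\n\nQuestion: ..."
```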
What's exciting is watching these patterns emerge in real time. We're seeing the birth of entirely new ways to build software, complete with their own best practices, tools, and formal methodologies. It's like watching the early days of web development all over again, except this time we're building systems that can reason and make decisions.
The infrastructure implications are enormous. Traditional data pipeline problems suddenly become critical when your application's intelligence depends on feeding the right context to your models. How do you ensure data quality? How do you handle discovery and observability? These classic infrastructure challenges are taking on new importance in an AI-driven world.
Agents That Actually Work: The Coding Revolution
Let me be clear about something: I'm generally skeptical of the "agent" marketing hype. But coding agents? They're genuinely incredible. There's a simple reason why they work so well in programming environments compared to general web browsing or other tasks—error correction.
Here's my mental model: agents are essentially large language models running in loops. In most environments, errors propagate and compound throughout that loop, causing performance to degrade over time. But coding environments are special because they provide multiple ways to catch and correct errors—linting, interpretation, compilation, testing.
- Coding agents can iterate and improve because programming environments provide immediate feedback on correctness
- Tasks that can be well-articulated and have clear success criteria show the best agent performance
- The ability to test, compile, and validate code creates natural error-correction mechanisms that don't exist in other domains
- Even bite-sized, well-defined tasks are showing remarkable success rates with current agent technology
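Here's a minimal sketch of that loop, under the assumption that the project has a test suite the agent can run. The call_model helper is a hypothetical stand-in for whatever model client you use, and the pytest command is just one possible check; linting or compilation would slot in the same way.

```python
import subprocess

# A bare-bones coding-agent loop: ask a model for a patch, apply it, run the
# tests, and feed any failure output back into the next attempt.
# call_model() is a hypothetical stand-in for whatever model client you use.

def call_model(prompt: str) -> str:
    raise NotImplementedError("wire up your model client here")

def run_tests() -> tuple[bool, str]:
    """Run the project's test suite; 'pytest -q' is an assumption about your setup."""
    result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def fix_until_green(task: str, apply_patch, max_attempts: int = 5) -> bool:
    feedback = ""
    for _ in range(max_attempts):
        patch = call_model(
            f"Task: {task}\n"
            f"Previous test output:\n{feedback}\n"
            "Return a unified diff that makes the tests pass."
        )
        apply_patch(patch)            # caller decides how patches get applied
        passed, output = run_tests()  # the error-correction signal
        if passed:
            return True
        feedback = output             # failures feed the next attempt instead of compounding
    return False
```

The whole trick is in the last lines of the loop: test failures become input to the next attempt rather than silently propagating, which is exactly the property most non-coding environments lack.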
I'm seeing this transformation happen in real time. At companies I work with, agent-generated commits are showing up in GitHub activity feeds and Slack integrations. These aren't just proof-of-concept demos—they're production commits solving real problems and adding genuine value.
The key insight is that agents work best in environments where you can programmatically verify correctness. Code is perfect for this because it either works or it doesn't, and there are multiple tools available to check functionality, style, and performance. This is why we're seeing such rapid adoption in development workflows while other agent applications are still struggling to prove their value.
The Expansion Phase: Why There's Room for Everyone
One of the most important concepts to understand about the current AI infrastructure market is that we're in an expansion phase, not a contraction phase. This means zero-sum thinking is not just wrong—it's actively harmful to investment and business strategy.
Think of it like the Big Bang. During expansion phases, new market opportunities are being created faster than existing players can capture them. We're seeing this everywhere: Nvidia keeps selling more chips despite predictions of market saturation, cloud platforms maintain healthy margins despite increased competition, and AI application companies are finding success across multiple layers of the stack.
- During expansion phases, aggressive investment and risk-taking typically outperform conservative, defensive strategies
- Every layer of the current AI stack is showing strong performance, from hardware manufacturers to application developers
- The "no defensibility" argument misunderstands how infrastructure markets actually work over time
- Historical precedent shows that expansion phases eventually consolidate into oligopolies or monopolies that maintain sustainable margins
What happens during the inevitable contraction phase? Layers don't disappear—they consolidate. And consolidated layers almost always maintain value through either oligopoly pricing (like cloud providers) or monopoly advantages (like Intel historically). The switching costs for infrastructure components are genuinely high, even for API-based services that appear easily replaceable.
The key insight is that successful infrastructure companies aren't competing in zero-sum games—they're racing to capture newly created market opportunities. The companies that understand this and invest aggressively during the expansion phase typically emerge as the dominant players when consolidation eventually occurs.
What This All Means for the Future
We're building systems to build other systems, and that's both the challenge and the opportunity ahead of us. The anthropomorphic fallacy—projecting human characteristics onto AI systems—is leading to both unrealistic fears and unrealistic expectations. These tools aren't going to solve every problem automatically, but they're also not going to replace the need for skilled professionals who understand system specifications and requirements.
The future still requires professionals. Formal systems evolved from natural language for good reasons, and every professional discipline has eventually developed specialized terminology and methodologies. Programming is no different. While AI makes certain aspects of development more accessible, the core skills of understanding user needs, system design, and problem-solving remain as valuable as ever.
What we're witnessing is the emergence of a new software development paradigm that combines human creativity with AI capabilities. The most successful developers and companies will be those who embrace this hybrid approach rather than viewing it as a replacement for traditional skills. It's not about humans versus machines—it's about humans working more effectively with increasingly powerful tools.
The infrastructure landscape will continue evolving, new patterns will emerge and formalize, and entirely new categories of companies will be built around these capabilities. But at its core, this is still about solving real problems for real people using the best tools available. That fundamental aspect of software development isn't changing, even as everything else transforms around it.