The cloud era: how centralised capex set the stage for the speed of AI
Welcome to the second article in our series on the AI productivity wave. In our first piece, we established that the PC era was a glacial, two-decade-long journey of physical hardware deployment. But the story of modern technology diffusion has a critical bridge: the cloud era (2006–2022).
The cloud wasn’t just a faster internet; it was an entirely new business model that fundamentally solved the PC era's central bottleneck. For investors trying to comprehend the breakneck speed of the current AI build-out, understanding the cloud era is essential. It is the template for how AI is being deployed today, proving that a major technological shift can compress adoption from decades into single-digit years.
Centralising capital, decentralising adoption
The cloud revolution was defined by one critical shift in corporate budgeting: customer capex became hyperscaler capex. In the PC era, every single company had to purchase, install, maintain and upgrade its own servers, storage and networking, creating a massive and continuous drain on both capital expenditure (capex) and operating expense (opex). The cloud flipped this script.
Starting with the launch of Amazon Web Services (AWS) and its foundational services, S3 (Simple Storage Service) and EC2 (Elastic Compute Cloud), in 2006, a new infrastructure model was born. Companies no longer needed to buy servers; they could simply rent compute and storage capacity as a service.
The hyperscalers (AWS, Microsoft Azure, Google Cloud, etc.) took on the monumental, multi-billion-dollar annual cost of building massive, efficient data centres. This heavy, physical capital lift was centralised, freeing enterprises to focus on their core businesses.
A revenue ramp unseen since the genesis of the internet
The resulting acceleration was unprecedented: the cloud era created one of the fastest revenue ramps in modern business history. In under 15 years, collective global cloud revenue went from essentially zero to more than US$500 billion annually.
By the early 2020s, conservative estimates suggested the cloud was hosting around 70% of new IT workloads, making it near-ubiquitous in corporate digital transformation. This speed was the result of three structural advantages the cloud offered, all of which now apply directly to AI:
1. Software-as-a-Service (SaaS) proliferation
The cloud enabled SaaS to become the dominant software delivery model. Companies like Salesforce, Workday and Atlassian delivered a single codebase to millions of users globally, requiring no local installation, no manual patching and no customised deployments. This standardised model drastically lowered the cost and complexity of integrating new technology.
2. Instant global distribution
Since the software was housed in the cloud and accessed via a web browser or API, new features and updates could be rolled out instantly. The time between a new technology’s release and its adoption by an enterprise dropped from months (in the PC era) to days or even hours. There was zero physical friction.
3. Solving the endpoint bottleneck
The cloud era benefitted from the PC era's success: every knowledge worker already had an internet-connected device (a PC, laptop or smartphone). The cloud simply connected these existing endpoints to powerful, centralised compute, meaning the endpoint build-out is a hurdle AI no longer has to overcome.
The cloud is the template for AI diffusion
The cloud era didn't just digitise businesses; it built the distribution rails for the next industrial revolution. This is the single most important lesson for investors.
AI is fundamentally a new layer of intelligence software running on the same massive data centres built out during the cloud era. The shift is not AI replacing the cloud; it is AI running on top of the cloud.
This means AI skips the initial 20-year infrastructure build-out (the challenge of the PC era) and jumps straight to the deployment phase at the speed of the cloud era.
The heavy, high-cost and complex parts of the AI equation (building AI-optimised data centres, installing Nvidia GPUs and sourcing advanced chips manufactured by TSMC) are once again being handled centrally by the very same hyperscalers.
The enterprise adoption of AI today, via tools like Azure OpenAI, AWS Bedrock and Google Vertex AI, is taking place through the same APIs and cloud marketplaces that drove the rapid growth of SaaS (illustrated in the short sketch after this list). It is propelled by:
- Centralised capital (hyperscaler capex)
- Decentralised, API-driven software distribution
- Instant propagation speed
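To make that mechanism concrete, here is a minimal sketch of what API-driven adoption looks like in practice. It assumes the Azure OpenAI Service accessed through the official openai Python SDK; the endpoint, key, API version and deployment name are placeholders, and exact parameters vary by account and model.

```python
# Minimal sketch: consuming a hosted AI model through a cloud API.
# Assumes the openai Python SDK (v1.x) and an existing Azure OpenAI resource;
# the environment variables and deployment name below are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed version string; check your resource
)

# One HTTPS call replaces what the PC era handled on-premises:
# no servers to buy, no GPUs to rack, no software to roll out desk by desk.
response = client.chat.completions.create(
    model="my-gpt-deployment",  # hypothetical deployment name
    messages=[
        {"role": "system", "content": "You are a concise business analyst."},
        {"role": "user", "content": "Summarise this quarter's sales notes in three bullet points."},
    ],
)

print(response.choices[0].message.content)
```

The same pattern (authenticate, call a managed endpoint, receive a result) applies to AWS Bedrock and Google Vertex AI through their own SDKs, which is exactly the SaaS-style distribution described above.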
This is why analysts believe AI's adoption curve won't mirror the 20-year PC cycle, but rather the rapid 5–10 year curve of the cloud era, only magnified by the sheer scale of the potential productivity gain. The cloud solved friction. AI is poised to leverage that solution to unlock trillions of dollars in knowledge-worker value faster than any technology that came before it.
Tomorrow, we will examine the early data confirming this rapid-fire adoption and detail the four structural advantages that allow AI to compress decades into single-digit years.