The glacial start of the PC era: a lesson for the AI revolution
Welcome to the first in our series: Why AI is the fastest revolution.
We often think of technological disruption as an overnight event, but for investors, it pays to remember the long game. The analogy of the PC era (1980–2005) is perhaps the most instructive for today’s AI boom. It reshaped the global economy and delivered staggering returns, yet its start was anything but explosive. It was, in fact, glacial. This historical analogue teaches us that the greatest revolutions require patience, monumental upfront capital and years of costly, granular, physical deployment.
The price of progress: $15,000 for a desktop
To truly appreciate the transformation AI represents, let's step back to the mid-1980s. When you consider the price of the foundational technology of the PC era, the modern cost of an AI subscription seems like an absolute bargain. In 1985, a powerhouse office machine such as an IBM PC XT with two disk drives and a monster 10-megabyte hard drive (less storage than a handful of photos on your smartphone today) would have set a business back around US$4,995. Adjusted for inflation to 2025 dollars, that same entry-level machine would cost an astonishing ~US$15,000.
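As a rough back-of-the-envelope check of that inflation adjustment (a minimal sketch; the ~3.0x multiplier is an approximation of cumulative US CPI inflation from 1985 to 2025, not an official figure):

```python
# Back-of-the-envelope inflation adjustment for the 1985 IBM PC XT price.
# The CPI multiplier is an approximation, not an official statistic.
PRICE_1985 = 4_995                  # IBM PC XT list price, 1985 US dollars
CPI_MULTIPLIER_1985_TO_2025 = 3.0   # approx. cumulative US CPI inflation

price_2025 = PRICE_1985 * CPI_MULTIPLIER_1985_TO_2025
print(f"~US${price_2025:,.0f} in 2025 dollars")  # ~US$14,985, i.e. roughly US$15,000
```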
Today, any business can unlock generative AI capabilities across its entire workforce for roughly US$20–30 per seat per month (around US$1 per day) for services like ChatGPT Plus or Microsoft Copilot. The friction and upfront capital required to deploy the PC were orders of magnitude higher. This dramatic cost disparity underscores one of the key differences in the current AI revolution: the barrier to entry for adoption is incredibly low.
PC era: a 20-year diffusion timeline
The sheer cost of the hardware, coupled with the complexity of deployment, meant the PC era was a slow burn, even within the corporate structure.
Consider the timeline in the US:
- 1981: the IBM PC launch. The start of the hardware cycle, confined initially to specific departments (e.g., accounting, engineering) due to its high cost.
- Early 1990s: core business adoption phase. While only ~15% of US households owned a PC, most white-collar offices had begun the protracted, costly process of replacing typewriters with networked PCs, initiating the lengthy process of workflow redesign.
- 1995: Windows 95 and the GUI revolution. The introduction of a user-friendly graphical user interface finally lowered the skill barrier for the majority of workers, accelerating mass deployment across all office roles.
- 2005: ubiquity achieved. Two decades after the PC launched, US household penetration was above 65% and office productivity software was essentially universal, leading to a massive economic payoff.
The payoff was enormous, but the plumbing (the hardware, the servers, the networking, the training of millions of workers and the redesign of entire workflows) took two full decades to build out.
The massive economic payoff
Despite the slow start, the eventual economic contribution was significant and provides a blueprint for the AI opportunity. The PC era laid the foundation for the internet and added roughly one percentage point to US annual productivity growth during its peak years (mid-1990s to early 2000s).
We can track its value through global IT investment as a share of global economic output. In 1980, investment in "Information Processing Equipment and Software" was less than 0.6% of GDP. By 2000, it had surged to ~3% of GDP. Assuming an approximate global GDP of US$111 trillion today, that 3% share translates to over US$3.3 trillion in economic activity. The PC era ramped from a fraction of a percent to a multi-trillion-dollar economic engine over 20 years.
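The arithmetic behind that estimate, made explicit (a sketch using the article's own assumptions of ~US$111 trillion global GDP and a ~3% IT-investment share):

```python
# Sizing IT investment as a share of global output, per the figures above.
GLOBAL_GDP_TRILLIONS_USD = 111   # approx. global GDP today (article's assumption)
IT_SHARE_1980 = 0.006            # <0.6% of GDP in 1980
IT_SHARE_2000 = 0.03             # ~3% of GDP by 2000

it_spend = GLOBAL_GDP_TRILLIONS_USD * IT_SHARE_2000
print(f"~US${it_spend:.2f} trillion in economic activity")            # ~US$3.33 trillion
print(f"Share grew ~{IT_SHARE_2000 / IT_SHARE_1980:.0f}x in 20 years")  # ~5x
```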
Addressing the sceptics: good capital vs bad capital
The current AI bubble narrative likens the surge in AI capex to the debt-fuelled telecoms bust, but this analysis fails to hold up when applied to the core infrastructure. As we established previously in 'Is there an AI bubble? And is there still value? Both can be true', we view the massive capex as a mix of both. The good capital is overwhelmingly deployed by the hyperscalers (Microsoft, Amazon and Google), which are highly profitable companies with immense cash flows and strong balance sheets. This core build-out is financially sound. However, the bad capital (the speculative debt binges and overvaluations seen in the broader AI ecosystem) is a real risk. The key lesson for investors is to distinguish between the two.
The key lesson for AI
The PC era diffused slowly because it was a physical capital cycle. Every organisation had to:
- Buy new hardware (PCs, servers) and constantly upgrade it, a cycle of obsolescence driven by Moore's Law.
- Deploy software on-premises, which was expensive and heavily customised.
- Train workers to use complex, new systems.
It was an inherently high-friction, capital-intensive process.
Today, for the adopting business, AI is a pure software-based productivity wave. The ~US$250 billion per year in capital expenditure currently being deployed is centralised with the hyperscalers (Microsoft, Amazon and Google), and the software is distributed instantly via APIs (application programming interfaces) and existing devices. The lesson for investors is clear: the AI wave combines the economic potential of the PC era with a speed of diffusion far closer to the cloud era, making it potentially the biggest, fastest productivity boom in modern history.
In summary, the PC era's glacial start teaches us a simple lesson: monumental returns require monumental, sustained upfront capital. However, the AI revolution is fundamentally different: it's not building a physical capital base from scratch but rather layering intelligence on top of existing infrastructure.
Tomorrow, we will consider the critical bridge between the PC and AI eras: the cloud era. We will show how centralised capital investment paved the way for the instant, zero-friction deployment that defines AI today.