The lazy version of this analogy says, “AI is the new oil.” That is not quite right. Oil is a physical commodity. AI is a cognitive substrate. Oil is extracted, refined, transported, and burned. AI is trained, tuned, deployed, and reused. But there is a deeper comparison that is worth making, and it is far more interesting than the slogan: the oil industry did not initially understand which fraction of its own output would matter most, and I suspect we are making the same mistake with AI. (U.S. Energy Information Administration)
In the early petroleum era, kerosene was the star. After Drake’s 1859 well in Pennsylvania, crude oil was refined largely for lighting, and kerosene became the major refinery product for decades. Gasoline existed, but it was not the point. It was simply one of the lighter fractions that appeared in the refining process before the market knew what to do with it. The important historical lesson is not merely that gasoline was once unwanted. The lesson is that an industry can be standing on its future and still fail to recognize it. (U.S. Energy Information Administration)
That changed when two separate systems collided. First, electric lighting began to weaken kerosene’s central role. The Department of Energy notes that Edison patented and commercialized his incandescent bulb in 1879 and 1880, while Britannica notes that the advent of the electric lamp reduced kerosene’s value for lighting. Second, the automobile created a new need for a volatile, energy-dense fuel. The EIA notes that gasoline was not recognized as valuable until 1892, when the automobile was invented, and that by 1920 there were already 9 million gasoline-powered vehicles on U.S. roads. What had looked like a peripheral fraction suddenly became the center of an entire civilization. (U.S. Department of Energy)
This is also why a common confusion about diesel is worth clearing up. Gasoline was not initially “waste” because diesel displaced it. Diesel was its own separate path. Rudolf Diesel obtained a development patent in 1892 and published his engine design in 1893, with successful demonstrations following later in the decade. In other words, gasoline’s early low status belongs to the kerosene era and to the absence of a fitting machine, not to a victory by diesel. The machine came later, and when it came, the fraction found its destiny. (Encyclopedia Britannica)
Once the market discovered gasoline, the real winners were not merely the people who “had oil.” The winners were the people who could refine, transport, standardize, and distribute it at scale. Rockefeller’s Standard Oil did not become powerful just because crude existed in the ground. It became powerful because it acquired pipelines and terminal facilities, purchased refineries, expanded markets, and turned a messy raw material into a dependable industrial system. Later, refining innovations such as Burton’s thermal cracking increased the proportion of gasoline obtainable from petroleum, meaning that once demand appeared, the industry learned how to produce more of the fraction people actually wanted. (Encyclopedia Britannica)
That is the frame I find useful for AI. Most people are still talking about AI as though the visible chatbot is the main event. That may prove to be as historically narrow as thinking kerosene lamps were the permanent destiny of oil. The visible, conversational layer may be only the first commercially legible use case, not the final dominant one. The larger economic role of AI may emerge not from the glamorous demo but from the countless lower-level cognitive tasks that organizations currently treat as too small, too repetitive, too annoying, or too fragmented to dignify. Those fragments may turn out to be AI’s gasoline. (OECD)
This is why I do not think the deepest AI story is “content generation.” I think it is cognitive absorption. Generative AI’s largest measurable value today is already concentrated in functions like customer operations, marketing and sales, software engineering, and research and development. McKinsey estimates $2.6 trillion to $4.4 trillion in annual value across the use cases it analyzed, with an even larger upside when generative AI is embedded more broadly into work. The pattern is revealing: the economic value is not mainly in producing clever outputs for their own sake. It is in taking over work that absorbs human time, attention, and coordination. (McKinsey & Company)
That makes AI feel less like a tool and more like infrastructure. The OECD now argues that generative AI appears to exhibit the defining characteristics of a general-purpose technology: pervasiveness, continuous improvement, and innovation-spawning effects. That is exactly the category oil eventually entered as well. Oil was not just a product. It became a platform for transportation, chemicals, logistics, geopolitics, suburbanization, warfare, and modern growth. In the same way, AI may not settle into a neat “software category.” It may become an enabling layer that reorganizes the surrounding economy. (OECD)
And the infrastructure race is already visible. According to the IEA, data centres consumed about 415 terawatt-hours of electricity in 2024, roughly 1.5% of global electricity consumption, and in its base case that demand nearly doubles to about 945 terawatt-hours by 2030. At the same time, Stanford’s 2025 AI Index reports that private investment in generative AI reached $33.9 billion in 2024, while U.S. private AI investment reached $109.1 billion. Those are not the numbers of a novelty. Those are the numbers of a system beginning to harden into industrial form. (IEA)
The adoption data point in the same direction. Stanford reports that in 2024, 78% of survey respondents said their organizations were using AI, up from 55% the year before, and 71% reported using generative AI in at least one business function, more than double the prior year. That is an early sign that AI is moving out of the lab-demo phase and into the workflow phase. Oil became history-changing not when people admired crude, but when it disappeared into engines, heating systems, factories, and supply chains. AI may follow the same arc: its real importance will begin when it becomes less visible, not more. (Stanford HAI)
This is where the oil analogy becomes especially sharp. In the kerosene era, the industry understood illumination first. Mobility came later. In the AI era, we understand answers first. Delegation may come later. Today, people still approach AI as something they visit. They open a tab, type a prompt, get a response, and leave. That is the lantern model. The more consequential model is the engine model: AI as a continuously available background layer that routes calls, drafts follow-ups, classifies documents, monitors exceptions, retrieves institutional memory, updates records, handles first-pass service, prepares decisions, and quietly absorbs the attendant work that never truly deserved a full unit of human consciousness in the first place. That is not just automation. That is a new allocation of attention.
Seen that way, the economic question changes. The question is no longer, “Can AI write a paragraph?” That is like asking whether refined petroleum can light a lamp. The real question is, “Which fraction of this emerging capability becomes the indispensable fuel for entirely new systems of work?” My guess is that the answer will not be the most theatrical use case. It will be the fraction that is currently undervalued because it feels too ordinary: background cognition, workflow delegation, decision preparation, synthetic memory, and the quiet absorption of low-surprise mental labor.
There is also a strategic warning here. Early industries routinely misprice their future because they overvalue the first successful interface. Kerosene was not unimportant. It was simply not ultimate. In the same way, chat may not be unimportant. It may simply be the first legible market that helps society finance the buildout of something larger. The executives who win the next decade may be the ones who understand that the model is not the whole business, just as crude was never the whole oil business. Refining mattered. Distribution mattered. Standards mattered. Reliability mattered. In AI, the equivalents are orchestration, evaluation, integration, governance, power, chips, data centres, and domain-specific deployment. (Encyclopedia Britannica)
Still, the analogy has limits, and those limits matter. Oil is finite, rivalrous, and destroyed in use. AI is replicable, non-rivalrous in many contexts, and often becomes more useful through iteration and integration. Oil’s primary externalities are environmental and geopolitical. AI’s externalities are epistemic, organizational, social, labor-related, and increasingly electrical. Oil pools where geology placed it. AI advantage pools where compute, talent, capital, power access, distribution, and trust converge. So no, AI is not literally oil. But like oil, it may become one of those base-layer capacities that reorganizes everything around it. (IEA)
The deepest lesson from petroleum history is not that new technologies are powerful. Everyone knows that. The deeper lesson is that societies usually misunderstand what a new substrate is for while they are standing inside its first commercial chapter. They mistake the first profitable use for the permanent essence. They think they are in the final form when they are really in a transitional form. Kerosene was real revenue, but it was not the final story. Chat is real revenue, but it may not be the final story either.
So when I compare AI to oil, that is the comparison I care about most: not resource to resource, but misrecognition to misrecognition. Gasoline looked secondary until the surrounding world changed and suddenly revealed what it had been all along. AI may be in that exact kind of moment now. The industry may still be selling lamps while the engines are quietly arriving.
