Oil and semiconductors: a story that repeats itself but is not the same.


By Jean-Claude Muller and John-Guy Park





The article "Oil and semiconductors: a story that repeats itself but is not the same" explores the parallels between oil and semiconductors, two crucial resources that have shaped not only technology and industry, but also global geopolitics. Oil, long considered the lifeblood of the global economy, now faces a new competitor in semiconductors, the cornerstones of the digital age. This article highlights the implications, geopolitical context and historical parallels of these two vital resources.

Oil has been synonymous with economic and military power since the dawn of the 20th century. The nations that controlled the vast reserves of oil, or possessed the capacity to exploit them, often dictated the rules of world order. The transition to new forms of energy and associated power is inevitable, as summed up by the quote from former Saudi Oil Minister Zaki Yamani: "The Stone Age didn't end because the world ran out of stones; the Oil Age won't end because we run out of oil."

The article also discusses the evolution of semiconductors as the "new energy" of the future, powering almost every advanced technology, from smartphones to electric cars and artificial intelligence. Semiconductors are becoming the vital industrial resource that will power the vast majority of the economy of the future, marking the transition to an economy where information processing and storage are just as critical as physical energy was in the 20th century.

The article also discusses the importance of GPUs (Graphics Processing Units) in the semiconductor field, their role in artificial intelligence and the geopolitical competition for access to the most advanced GPUs. Nations that dominate semiconductor production and innovation, and GPUs in particular, gain a strategic advantage in the global AI race.


In the complex fabric of the recent global economy, two resources have played and continue to play crucial roles, shaping not only technology and industry but also global geopolitics: oil and semiconductors. Oil, long considered the lifeblood of the global economy, now finds a new competitor in semiconductors, the cornerstones of the digital age. This analysis explores the parallels between these two vital resources, highlighting implications, geopolitical context and historical parallels.


Oil: A History of Power and Conflict



Since the dawn of the 20th century, oil has been synonymous with economic and military power. Nations that controlled vast oil reserves, or possessed the capacity to exploit them, often dictated the rules of world order. As former Saudi Oil Minister Zaki Yamani put it, "The Stone Age didn't end because the world ran out of stones; the Oil Age won't end because we run out of oil." This quote sums up the inevitable transition to new forms of energy and associated power.

Petroleum, an age-old natural resource, has long been used in relatively simple applications such as lighting via kerosene lamps. It was only at the beginning of the 20th century, with the advent of the automobile and aeronautics industries, that oil became a strategic resource. These booming sectors had no viable alternative to oil as an energy source, catapulting its demand and value to unprecedented levels. This transformation not only reshaped the global economy, but also played a part in contemporary geopolitical dynamics revolving around energy resources.

The recent replacement of combustion engines by powerful, reliable electric motors illustrates a major turning point in the history of technology and innovation. It has taken almost a century to develop and refine electric motor technology to the point where electric motors can now compete with, and often surpass, their internal combustion counterparts in terms of performance and energy efficiency. This transition to electrification reflects the ongoing move towards more sustainable solutions that are less dependent on fossil fuels.


Semiconductors: The "New Energy" of the Future


Today, semiconductors are at the heart of almost every advanced technology, from smartphones to electric cars and artificial intelligence. Semiconductors are becoming the vital industrial resource that will power the vast majority of the economy of the future. Access to information processing and storage is becoming just as critical as access to oil energy was in the 20th century.

A computer is made up of several key components: the processor (or CPU), which is the brain of the computer executing program instructions; memory (RAM), which temporarily stores data for rapid access by the CPU; and semiconductors, which are the materials used in the manufacture of electronic components such as transistors, essential to the operation of processors and memory. The GPU (Graphics Processing Unit) is the electronic component specialized in graphic data processing. It is designed to rapidly perform the complex calculations required to render images, videos and animations with great precision and detail.

Unlike oil, none of these components is a natural resource; they have to be designed, developed and constantly improved. This innovation takes place mainly in developed countries, while manufacturing is often relocated to countries in the South, reflecting an international division of labor influenced by production costs and the availability of specialized labor.

Semiconductors are materials whose ability to conduct electricity can be precisely controlled, making them indispensable in the manufacture of electronic circuits. Moore's Law, which predicted a doubling in the number of transistors on a microprocessor every two years or so, has long guided the industry, driving constant increases in power and efficiency. Today, it's not just a question of quantity, but of power and energy efficiency, making semiconductors a strategic issue comparable to oil in terms of importance to the global economy.
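The compounding effect of Moore's Law can be made concrete with a back-of-the-envelope calculation. The sketch below assumes one doubling every two years and uses the commonly cited figure of roughly 2,300 transistors for Intel's 4004 (1971) as an illustrative starting point; it is not a precise industry forecast.

```python
# Back-of-the-envelope Moore's Law projection: transistor count
# doubles roughly every two years.

def moore_projection(start_year, start_transistors, target_year, period=2):
    """Project a transistor count assuming one doubling per `period` years."""
    doublings = (target_year - start_year) / period
    return start_transistors * 2 ** doublings

# Intel 4004 (1971) is commonly cited at ~2,300 transistors.
for year in (1971, 1991, 2011, 2021):
    print(year, f"{moore_projection(1971, 2300, year):,.0f}")
```

Twenty-five doublings over fifty years multiply the starting count by more than 33 million, which is why the law dominated industry roadmaps for so long.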

Access to semiconductors has become a major issue due to their crucial importance in a wide range of industries, from automotive to information technology, and the concentration of their production in a limited number of geographical regions. The recent global shortage of semiconductors has highlighted their critical role and the vulnerabilities of global supply chains, exacerbating geopolitical and economic tensions.

Explosive demand for artificial intelligence (AI) capabilities has turned GPUs into strategic commodities. Technology companies and research institutions are scrambling for the most powerful GPUs to fuel their AI projects, causing shortages and driving up costs. This situation is reminiscent of the race for oil, where access to a resource determines a nation's or company's ability to innovate and maintain competitiveness.

Initially, central processing units (CPUs) were seen as essential components of computing, while graphics processing units (GPUs) were seen as additional accessories, with a higher tolerance for error in calculations, due to their use mainly in video games. However, as graphics rendering technologies advanced, notably towards photorealism and high-resolution 3D, computing requirements began to diverge. Whereas CPUs were adapted to handle high volumes of information rapidly, GPUs distinguished themselves by their ability to perform parallel calculations, thus reducing latency, i.e. the delay between an event and its graphical representation.
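The divergence described above, sequential execution versus data parallelism, can be sketched in plain Python. NumPy's vectorized operations run on the CPU, but they express the "one operation applied to many elements at once" style that GPUs execute in hardware; the array size and any timings are purely illustrative.

```python
import time
import numpy as np

n = 100_000
a = np.random.rand(n)
b = np.random.rand(n)

# Sequential, element-by-element style typical of scalar CPU code.
t0 = time.perf_counter()
out_loop = [a[i] * b[i] for i in range(n)]
t_loop = time.perf_counter() - t0

# Data-parallel style: one operation over all elements at once,
# the programming model GPUs (and SIMD CPU units) are built around.
t0 = time.perf_counter()
out_vec = a * b
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.4f}s")
```

Both paths compute identical results; the vectorized form is typically orders of magnitude faster, and on a GPU the same pattern scales across thousands of cores.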

This development has prompted engineers to explore the use of GPUs for parallel mathematical calculations, including finite element calculations, where they have demonstrated remarkable efficiency. The CUDA architecture, for example, has been adopted in professional applications such as Dassault Systèmes' CATIA, paving the way for new uses of GPUs beyond video games.

Furthermore, the rise of blockchain and, by extension, cryptocurrencies, has revealed another facet of GPU use. Cryptocurrency mining, requiring intensive computing capacity, has led to the emergence of GPU-equipped "mining farms", often at the expense of environmental considerations. This increased demand for GPUs has also influenced the gaming computer market, where a mid-range model can today fetch a price of 2,000 euros, with half of this sum attributed to the GPU alone.

The rapid evolution of artificial intelligence technologies has propelled GPUs to the status of indispensable resources. Initially designed to improve graphics performance in video games, GPUs have proved extremely efficient for executing complex AI algorithms, particularly large language models (LLMs).

A Large Language Model is a type of machine learning model designed to understand and generate text in natural language. It is trained on a large volume of text data, enabling it to learn the structures, patterns and nuances of human language. It is a computer tool capable of reading and understanding text in a way similar to that of humans. It can predict which word or phrase is likely to follow a given text, answer questions, summarize texts, translate languages and even generate original text.
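The "predict the next word" idea can be illustrated with a toy bigram model, a deliberately tiny stand-in for the billions-of-parameter LLMs discussed here. The three training sentences are invented for the example; real models learn far richer statistics over vastly larger corpora.

```python
from collections import Counter, defaultdict

# Toy corpus (invented for illustration).
corpus = [
    "oil powers the economy",
    "semiconductors power the economy",
    "semiconductors power the future",
]

# Count word bigrams: how often does `nxt` follow `word`?
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for word, nxt in zip(words, words[1:]):
        follows[word][nxt] += 1

def predict_next(word):
    """Return the most frequent successor of `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "economy" follows "the" twice, "future" once
```

An LLM does the same thing in spirit, but predicts over an entire vocabulary using learned parameters rather than raw counts, which is what makes training so computationally expensive.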

These models require colossal computing power to process the vast datasets on which they are trained, making GPUs essential for the development and deployment of advanced AI. According to Jensen Huang, one of Nvidia's three co-founders, "the revolution is just beginning". The company's sales jumped 265% to $22.1 billion in the last quarter of 2023. "The first wave concerns data centers, veritable 'factories' for manufacturing AI. Once AI models have been trained, their use will require more and more computing power. The part of Nvidia's business related to inference has grown enormously because of the use of AI models to generate text, images or videos," explains Jensen Huang.

Projects like OpenAI's GPT-4 require thousands of highly specialized GPUs to process and learn huge amounts of information. Training such models without an adequate GPU infrastructure would be unthinkable. Efforts to create general artificial intelligence, capable of performing a wide range of cognitive tasks at a human level, rely heavily on the capabilities of GPUs. Innovations in this field could redefine many aspects of our society, from medicine and education to industrial production.


GPUs, Data Centers and Geopolitics: A New Arena of Competition



The competition for access to the most advanced GPUs also has a geopolitical dimension. Nations that dominate semiconductor innovation and production, and GPUs in particular, gain a strategic advantage in the global AI race. The main semiconductor-producing countries include Taiwan, South Korea, the USA and China. Taiwan, thanks to TSMC (Taiwan Semiconductor Manufacturing Company), is the world leader in semiconductor manufacturing. Samsung Electronics in South Korea and Intel in the USA are also major players in semiconductor production. When it comes to innovation, particularly in GPU development, the USA stands out with companies like Nvidia and AMD. Nvidia, in particular, is recognized for its significant contributions to the advancement of GPUs, both for video games and artificial intelligence applications. This has led some countries to invest heavily in their own semiconductor production capacity, in the hope of reducing their dependence on foreign suppliers:

  • United States: Faced with the concentration of semiconductor production in Asia, the USA has launched initiatives to revitalize its domestic semiconductor industry. The CHIPS Act is an example of these efforts, aimed at stimulating domestic semiconductor production through subsidies and financial incentives.
  • European Union: The EU has also announced plans to increase its share of global semiconductor production, with ambitious targets for achieving self-sufficiency in this critical sector. The European plan aims to double the EU's share of global semiconductor production by 2030.
  • China: China is investing massively in its semiconductor production capacity as part of its "Made in China 2025" strategy, with the aim of reducing its dependence on foreign suppliers and becoming a world leader in this field.
  • South Korea and Taiwan: Both continue to invest in expanding their production capacity, with massive investment plans announced by companies such as Samsung, SK Hynix and TSMC to maintain and strengthen their dominant market positions.

These efforts reflect worldwide recognition of the strategic importance of semiconductors and GPUs in the global digital economy, underlining the race for technological sovereignty in a context of geopolitical tensions and economic competition.


The growing importance of GPUs raises a number of questions, particularly in terms of accessibility and sustainability. The growing demand for these components puts pressure on the natural resources required for their production, and their energy consumption poses significant environmental challenges:

  • Silicon: Silicon remains the basic material for the manufacture of semiconductors, including GPUs. Although silicon is indeed abundant, the type of silicon used in semiconductors must be of extremely high purity. Transforming natural silicon into a semiconductor-grade material is a costly and energy-intensive process. What's more, the production of high-purity silicon can be concentrated in certain regions, which can create supply bottlenecks.
  • Other materials: In addition to silicon, other critical materials are used in GPU manufacturing, such as gallium, arsenic (for compound semiconductors like GaAs), germanium, and various rare metals (such as indium and tantalum). These materials are less abundant than silicon and can pose challenges in terms of availability and sustainable extraction.
  • Materials such as gallium and indium are often by-products of the extraction of other ores (such as zinc and copper), and their supply therefore depends on global mining operations. Major mining areas include China, the Democratic Republic of Congo (for cobalt, another important element in certain types of semiconductors and batteries), Australia, and other mineral-rich countries.
  • Energy consumption: GPU manufacturing is energy-intensive, not only because of the silicon purification process, but also because of the complex steps involved in manufacturing the chips themselves. What's more, the intensive use of GPUs in data centers for tasks such as cryptocurrency mining, AI model training, and cloud gaming increases energy demand, raising environmental concerns.
  • Environmental challenges: The energy consumption of data centers and infrastructures needed to support the intensive use of GPUs poses challenges in terms of CO2 emissions and impact on climate change. This highlights the need for renewable energy sources and more efficient technologies to reduce the semiconductor industry's carbon footprint.

To overcome these obstacles, the industry is exploring more efficient, energy-saving alternatives, such as specialized chips and optimization of AI algorithms. These innovations could enable a more sustainable use of resources and pave the way for democratizing access to cutting-edge AI.

The American "Magnificent Seven" (Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia and Tesla), along with China's Alibaba and Tencent and Korea's Samsung, are the biggest users of processors and semiconductors. It is worth noting that there are no major European players, and none from Japan.

Their dependence on these technologies to power their data centers, products and services underlines the strategic importance of semiconductors in the digital economy.

Data Centers

Data centers, the brains of the digital age, are at the heart of the global Internet infrastructure, artificial intelligence and massive data storage. Their role in processing and managing data makes them essential pillars of the digital economy, underpinning the operations of thousands of businesses worldwide, from startups to technology giants.

The location of data centers has become a major geopolitical issue for several reasons:

  1. Digital Sovereignty: Nations aspire to maintain or increase their digital sovereignty by hosting data centers on their soil, thereby controlling the digital data and services essential to their economy and national security.
  2. Access to resources: Data centers require significant access to energy resources for their operation and cooling. Countries with cheap or renewable energy sources have thus become prime locations for these infrastructures.
  3. Security and stability: The political and economic stability of a region influences data center location decisions. Companies and governments are looking for environments where investment in digital infrastructure is secure over the long term.
  4. Latency and market proximity: Geographical location affects latency, i.e. the response time between the user and the data center. Proximity to major economic centers and user populations is therefore sought to improve the efficiency of online services.
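Point 4 can be quantified: light in optical fiber travels at roughly 200,000 km/s (about two-thirds of c), which puts a hard physical floor under round-trip latency regardless of hardware. The routes and distances below are approximate great-circle figures chosen for illustration.

```python
# Lower bound on round-trip network latency from distance alone.
# Light in fiber travels at roughly 200,000 km/s (~2/3 of c);
# real latency is higher due to routing, queuing and processing.

FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def min_round_trip_ms(distance_km):
    """Physical floor on round-trip time over a fiber path."""
    return 2 * distance_km / FIBER_KM_PER_MS

# Approximate great-circle distances (illustrative).
for route, km in [("Paris-Frankfurt", 480),
                  ("Paris-New York", 5_840),
                  ("Paris-Singapore", 10_740)]:
    print(f"{route}: >= {min_round_trip_ms(km):.1f} ms")
```

A transcontinental round trip costs tens of milliseconds before a single packet is processed, which is why providers place data centers close to their largest user populations.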

Just as the control of oil resources shaped regional planning, international relations and geopolitics in the 20th century, the mastery of key technologies, including semiconductors and digital infrastructures such as data centers, is now shaping a new form of strategic power. Countries that dominate these technologies enjoy a major competitive advantage, influencing global economic, technological and military dynamics.

Growing dependence on semiconductors and advanced computing capabilities, particularly for artificial intelligence, makes the location of data centers and semiconductor production central issues in global competition. This situation is reminiscent of the race to access and control oil, but with implications adapted to the realities of the digital age.


Conflict and Geopolitics: From Black Gold to Electronic Chips



The parallels between oil and semiconductors also extend to conflict and geopolitics. Just as oil was at the center of many 20th century conflicts, semiconductors are beginning to play a similar role in international tensions. Global dependence on a limited number of semiconductor suppliers, notably Taiwan, is creating vulnerabilities and tensions similar to those seen in the oil industry.

For almost a century, access to oil has been more or less multipolar, with territories such as Texas, Iraq, Iran, Kuwait, Saudi Arabia and the Caspian Sea, then offshore fields in the Gulf of Mexico, the Gulf of Guinea and the North Sea off Scotland and Norway, and more recently the United Arab Emirates, Qatar and Kazakhstan. The same cannot be said for semiconductor designers and manufacturers, where, as during the Cold War, we find ourselves in a bipolar world: the United States, with its major production sites in Taiwan, and the People's Republic of China. A clash of titans in which Europe, Japan, Korea and the rest of the world are absent and non-aligned. While access to the latest-generation GPUs is merely a competitive issue when it comes to launching a new video game, the same cannot be said when it comes to equipping military or space hardware. At a time when global tensions are higher than they were a few decades ago, access to high-performance GPUs and their military and space applications would become a considerable advantage if armed conflict were to break out in the years to come. It is precisely to maintain or gain such an advantage that the USA and China, among others, are engaged in a frantic race for performance.

But the military and space aren't the only fields demanding innovation. Demand is growing in many sectors all over the world, and especially in generative AI, which explains Sam Altman's recent initiative to raise $7 trillion to reinvent the semiconductor industry. The plan should be seen not only as a futuristic ambition, but also as a harbinger of the challenges to come. This sum, which represents half the European Union market, or half the combined capitalization of the Magnificent Seven, underlines the potential scale of investment required to meet future needs in semiconductor and AI capabilities.

Just as the automotive and aeronautical sectors depended exclusively on oil at the beginning of the 20th century, today's technology industry relies on semiconductors. However, just as innovation has led to alternatives to oil, such as electric motors, the semiconductor industry is also looking for new technologies to overcome current limitations, such as the exploration of non-silicon-based materials.


Intel, Nvidia and the New Era of Technological Standard Oil



Comparing semiconductor giants such as Intel and Nvidia with Standard Oil in the 1930s reveals both similarities and differences. Whereas Standard Oil dominated the oil industry through its control of Texas oil production, refining and distribution, Intel and Nvidia today represent a form of global technological and intellectual domination, thanks to their innovations in semiconductors and GPUs. However, unlike Standard Oil's monopoly, the semiconductor sector is marked by intense competition, particularly between the United States and China, against a backdrop of geopolitical tensions where questions of patents, talent and training are predominant.






The transition from oil to semiconductors and GPUs as strategic resources marks a fundamental shift in global economic and geopolitical dynamics, reflecting the gradual passage from the computer age to the age of artificial intelligence. This shift underlines the importance of innovation, sustainable resource management and adaptation to new technological realities for nations, companies and individuals aiming to prosper in a rapidly changing landscape. To date, it is clear that the USA and China have taken the lead in this field, and that Europe, Japan and Korea are not (yet) competitive.

History never repeats itself identically, but political and economic leaders should remember the tensions and conflicts generated by difficulties in accessing or delivering oil. The same could be said of access to high-performance semiconductors, the new strategic resource indispensable to artificial intelligence, on which all human activities will increasingly depend.


Written in Paris and Singapore, March 6, 2024

The authors

Jean-Claude Muller is Executive Editor of btobioinnovation, a blog that publishes reports and articles on innovative topics in the life sciences. He was Senior Vice President in charge of Prospective and Strategic Initiatives at Sanofi.

John-Guy Park is an engineer and serial entrepreneur. After 15 years in listed companies in aeronautics, telecoms and pharmaceuticals, he founded three startups specializing in data and AI under the mentorship of Jean-Claude Muller. He works as an independent advisor to venture capital firms.
