The Strait of Hormuz is a narrow stretch of water between Iran and Oman — roughly 33 kilometres wide at its narrowest point. Through it flows nearly 20% of the world’s traded oil and a significant share of its liquefied natural gas. When tensions in the region escalate, energy markets around the world react — and the effects reach further than most people expect.
With the Iran conflict intensifying and regional instability growing, oil prices have climbed sharply in recent months. The conversation tends to focus on fuel costs and shipping rates. Fewer people are asking what this means for the cost of running AI infrastructure. The answer is: quite a lot.
Why Hormuz Matters So Much to Energy Markets
The strait is effectively the world’s most critical single point of failure for energy supply. Saudi Arabia, the UAE, Iraq, Kuwait, and Iran itself all depend on it to export oil. If transit through Hormuz is disrupted — through military action, blockades, or insurance markets refusing to cover vessels — global supply tightens immediately.
Markets price in this risk before any disruption actually happens. Futures contracts move on perceived threat alone. When Iran signals escalation, benchmark oil prices respond within hours. Brent crude, the international reference price, has shown exactly this pattern throughout the current conflict cycle.
Higher oil prices push up the price of natural gas, which remains tightly correlated with oil in most markets. And natural gas is still a primary input for electricity generation across Europe and much of Asia.
Datacenters Are Among the World’s Largest Energy Consumers
Every AI query, automated workflow, and cloud operation runs on servers inside datacenters — buildings that never switch off, consume electricity continuously, and require additional energy just to stay cool.
The scale is significant. The International Energy Agency estimated global datacenter electricity consumption at around 460 TWh in 2022 — comparable to the entire electricity consumption of France. That figure is projected to exceed 1,000 TWh by 2030, with AI workloads driving the majority of the growth.
That electricity comes from a grid that, in most of the world, still depends heavily on gas-fired generation. When gas prices rise because oil prices rise because Hormuz is under pressure, datacenter operating costs move with them.
AI Makes the Energy Problem Structurally Worse
Running traditional software is relatively energy-efficient. A web server handling thousands of simultaneous requests uses a modest, predictable amount of power. AI inference — the process of running a query through a large language model — is fundamentally different.
Modern AI models require massive parallel computation. Running a single request through a model like GPT-4 or Claude consumes orders of magnitude more energy than serving a web page or executing a database lookup. As AI adoption scales across industries — customer service, document processing, fraud detection, code generation, medical analysis — the aggregate energy demand grows continuously.
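A back-of-envelope calculation makes the aggregate effect concrete. Both per-request energy figures below are illustrative assumptions chosen only to show the shape of the arithmetic, not measured values for any real model or site.

```python
# Back-of-envelope aggregate demand. Both per-request energy values are
# assumed illustrative placeholders, not measurements.
WEB_REQUEST_WH = 0.001   # assumed energy for an ordinary web request
LLM_REQUEST_WH = 1.0     # assumed energy for one LLM inference (~1000x more)

daily_requests = 10_000_000  # assumed volume of requests migrated to AI

# Additional grid demand created by replacing web requests with inference
extra_kwh = daily_requests * (LLM_REQUEST_WH - WEB_REQUEST_WH) / 1000
print(f"{extra_kwh:,.0f} kWh/day of additional demand")  # 9,990 kWh/day
```

The exact numbers matter less than the structure: whatever the true per-request gap is, multiplying it by adoption at industry scale is what turns a per-query rounding error into a grid-level load.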
The more deeply AI is embedded in daily business operations, the more directly those businesses are exposed to energy price volatility — even if they never buy a litre of oil.
The Cost Chain From Hormuz to Your API Invoice
The transmission mechanism works like this: Iranian escalation raises the geopolitical risk premium on oil. Oil prices rise. Gas prices follow. Electricity generation costs increase. Cloud providers — AWS, Azure, Google Cloud — absorb higher energy costs across their global datacenter fleets. Those costs eventually move into compute pricing, either through direct price increases or through degraded economics on lower-margin services and free tiers.
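The pass-through described above can be sketched as simple arithmetic. The figures here (price per 1k tokens, electricity's share of that price, the size of the shock) are assumed placeholders, not real cloud pricing data.

```python
# Illustrative pass-through arithmetic; every number is an assumed
# placeholder, not measured pricing data.
def compute_cost_impact(base_cost_per_1k_tokens: float,
                        electricity_share: float,
                        electricity_price_rise: float) -> float:
    """Estimate compute cost after an electricity price rise.

    electricity_share: fraction of the price attributable to electricity
    electricity_price_rise: fractional rise (0.30 = +30%)
    """
    increase = electricity_share * electricity_price_rise
    return base_cost_per_1k_tokens * (1 + increase)

# Example: $0.01 per 1k tokens, 20% of cost is electricity, gas-driven +30%
new_cost = compute_cost_impact(0.01, 0.20, 0.30)
print(f"${new_cost:.4f} per 1k tokens")  # $0.0106 — a 6% increase
```

Note the dampening: a 30% electricity shock becomes a 6% compute-cost shock when electricity is 20% of the price. The exposure is real but proportional to the energy share, which is why the lag and the magnitude both depend on the provider's cost structure.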
The lag is real — cloud contracts and reserved capacity mean changes take time to filter through. But the direction of travel is clear. Companies that have built products or internal tools on AI APIs are now more exposed to geopolitical risk than they were three years ago, even if nothing in their stack has changed.
What Businesses Should Be Thinking About Now
The response is not to avoid AI — the productivity benefits are too significant for most businesses to ignore. But a few strategic considerations are worth taking seriously.
Model efficiency matters. Not every task requires the most capable — and most energy-intensive — model. Routing simpler queries to smaller, faster models reduces cost without sacrificing meaningful quality on those tasks.
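A minimal routing sketch, assuming two hypothetical model tiers and a deliberately crude heuristic; real routers use classifiers or model-based grading, and the names here are made up.

```python
# Hypothetical model router: send short, simple queries to a cheaper model.
# Model names and the heuristic are illustrative assumptions, not a real API.
SMALL_MODEL = "small-fast-model"
LARGE_MODEL = "large-capable-model"

def pick_model(query: str, max_simple_words: int = 30) -> str:
    """Crude heuristic: short queries without reasoning keywords go small."""
    reasoning_markers = ("explain", "analyse", "compare", "why", "prove")
    words = query.lower().split()
    if len(words) <= max_simple_words and not any(m in words for m in reasoning_markers):
        return SMALL_MODEL
    return LARGE_MODEL

print(pick_model("What time does the store open?"))             # small-fast-model
print(pick_model("Explain why the cache invalidation failed"))  # large-capable-model
```

Even a heuristic this blunt can shift a large share of traffic to the cheaper tier; the savings compound because the simple queries are usually the high-volume ones.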
Caching and batching reduce redundant inference. Many AI applications repeatedly answer similar queries. Caching common responses and batching non-time-sensitive operations can cut compute consumption substantially.
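A caching sketch under stated assumptions: `call_model` is a hypothetical stand-in for a real inference call, and normalisation here is just lowercasing and whitespace collapsing.

```python
import functools

CALLS = 0  # counts simulated inference calls

def call_model(query: str) -> str:
    """Hypothetical stand-in for a real inference API call."""
    global CALLS
    CALLS += 1
    return f"answer to: {query}"

@functools.lru_cache(maxsize=4096)
def cached_answer(normalised_query: str) -> str:
    return call_model(normalised_query)  # runs only on a cache miss

def answer(query: str) -> str:
    # Normalising case and whitespace raises the hit rate on near-duplicates.
    return cached_answer(" ".join(query.lower().split()))

answer("What is our refund policy?")
answer("what is our  refund policy?")  # near-duplicate, served from cache
print(CALLS)  # 1
```

In production the cache would typically live in a shared store with a TTL rather than in-process memory, but the principle is the same: every cache hit is an inference that never consumes electricity.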
Geography affects cost stability. Datacenters in regions with abundant renewable energy — Scandinavia, Iceland, parts of Canada — are increasingly insulated from fossil fuel price swings. Choosing cloud regions with low-carbon grids is not just a sustainability decision; it is a cost risk management decision.
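One way to reason about region choice is first-order exposure: the gas share of a region's grid times the size of a gas price move. The region names and gas shares below are made-up placeholders for illustration, not real grid data.

```python
# Illustrative region comparison; gas-share figures are made-up
# placeholders, not real grid data.
regions = {
    "region-north (hydro-heavy)": {"grid_gas_share": 0.05},
    "region-central":             {"grid_gas_share": 0.40},
    "region-south":               {"grid_gas_share": 0.55},
}

def exposure(gas_price_rise: float, gas_share: float) -> float:
    """First-order electricity-cost exposure to a gas price move."""
    return gas_share * gas_price_rise

for name, data in regions.items():
    impact = exposure(0.30, data["grid_gas_share"])
    print(f"{name}: about +{impact:.1%} electricity cost under a +30% gas shock")
```

The point of the sketch is the spread: under the same gas shock, a low-gas-share region sees a fraction of the cost movement a gas-heavy one does.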
Treat AI compute as a variable cost. Unlike the flat, predictable pricing of traditional cloud compute, AI inference costs at scale are exposed to energy market volatility. Budget models that treat AI costs as fixed are likely to underestimate future spend.
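Treating the cost as variable means stress-testing it. A minimal sketch: hold usage constant and run the budget through a few price scenarios. The baseline spend, token price, and shock multipliers are all assumed illustrative values.

```python
# Stress-test an AI budget under energy-driven price scenarios.
# Usage, price, and shock multipliers are assumed illustrative values.
monthly_tokens = 500_000_000      # assumed usage, held constant
baseline_price_per_1k = 0.002     # assumed USD per 1k tokens

scenarios = {"baseline": 1.00, "moderate shock": 1.10, "severe shock": 1.25}

for name, multiplier in scenarios.items():
    spend = monthly_tokens / 1000 * baseline_price_per_1k * multiplier
    print(f"{name}: ${spend:,.0f}/month")
# baseline: $1,000/month · moderate shock: $1,100/month · severe shock: $1,250/month
```

Even a rough table like this turns "AI costs might rise" into a concrete range a finance team can plan around.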
The Broader Point
The Strait of Hormuz has shaped global energy geopolitics for decades. What is new is the degree to which digital infrastructure — and specifically AI infrastructure — has become part of that energy system.
Businesses that understand this connection are better positioned to architect their AI usage intelligently, manage costs proactively, and avoid the surprise of watching their AI spend climb without any change in their own usage patterns.
At anfedev, energy cost efficiency is part of how we think about AI architecture. If you are scaling AI usage within your product or operations, the time to think about this is before the costs arrive, not after.
Want to talk through how AI fits into your operations without unpredictable cost exposure? Get in touch.