Artificial intelligence increasingly sits behind the digital tools many people rely on every day. AI systems now help draft emails, summarise documents, recommend products, translate languages, detect fraud, assist doctors with diagnostics, and support everything from logistics planning to financial analysis. Yet growing concerns have emerged about the physical infrastructure that makes those capabilities possible. AI systems do not exist purely in software. They run on powerful servers housed in vast data centres, connected by high-speed networks and supported by complex cooling systems. As AI adoption accelerates, those facilities are expanding rapidly across the world.
That expansion has triggered a growing debate about the environmental cost of AI. Two concerns appear most frequently. The first is water. Data centres often use water to cool the equipment inside them, and headlines about the water footprint of AI training have sparked concern that the technology could strain already limited water resources. The second concern is energy. Running large machine-learning models requires significant amounts of electricity, and the rapid growth of AI workloads is expected to increase power demand from data centres substantially in the coming years. Both issues are legitimate, but they operate on different scales and carry different implications. Understanding the real impact of AI requires examining them carefully and placing them in the context of the wider economy.
The reality is that AI does consume resources, and those demands will rise as the technology becomes more widely used. However, the scale and significance of those demands are often misunderstood when numbers are presented without context. Water consumption tends to receive the most attention because it is easy to visualise and easy to communicate through striking statistics. Electricity demand, by contrast, is the more important constraint because it ultimately determines the physical limits of how large AI infrastructure can grow. Looking at each in turn reveals a more balanced picture of where the genuine challenges lie.
Water Demands and Consumption
Water enters the story of artificial intelligence through a practical engineering challenge. Modern AI systems rely on specialised processors, particularly graphics processing units (GPUs), that perform enormous numbers of calculations simultaneously. When thousands of these chips operate at full capacity inside a data centre, they generate large amounts of heat. If that heat is not removed continuously, the equipment can quickly overheat and fail. Cooling therefore becomes a critical component of any data centre operation. While several cooling technologies exist, many facilities rely on systems that use water because water is extremely effective at absorbing and carrying away heat.
Within a data centre, this water use takes two main forms. Some facilities employ evaporative cooling systems, where water absorbs heat and evaporates, removing energy from the system in the process. Others circulate water through heat exchangers that transfer heat away from servers. Water can also be consumed indirectly through electricity generation, since many power plants use water to cool turbines during electricity production. When researchers estimate the “water footprint” of AI, they often include both the water used directly within data centres and the water associated with producing the electricity those centres consume.
Estimates of AI water consumption therefore vary widely depending on the assumptions used. One widely cited study examining the environmental footprint of large language models estimated that training a model similar in scale to early frontier systems could evaporate around 700,000 litres of freshwater during the training process alone. Research analysing everyday AI usage has also attempted to estimate the water associated with routine interactions. One such estimate suggests that generating a multi-page AI-written report could require roughly 0.7 litres of water, depending on the model and the location of the data centre involved. These figures can appear alarming when presented on their own, particularly when they are repeated without explaining how they compare with other forms of resource use.
The broader context changes the interpretation significantly. Globally, water consumption associated with AI infrastructure remains relatively small compared with other sectors of the economy. Agriculture alone accounts for roughly 70 percent of global freshwater withdrawals, according to the Food and Agriculture Organization. Industrial sectors such as textiles, manufacturing, and conventional power generation also consume vastly larger quantities of water than digital infrastructure. When estimates of global AI-related water use are distributed across the world’s population, the total footprint amounts to roughly 40 to 100 litres per person per year. That is a useful way to understand scale because it translates abstract global numbers into something easier to compare with everyday activities. On average, that level of consumption is roughly equivalent to the water used in a single additional load of laundry each year.
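The per-person comparison above can be checked with simple arithmetic. A quick sketch, using the article's own rough estimates (the population figure and the 50-100 litre range for a washing-machine cycle are assumptions, not measurements):

```python
# Back-of-envelope check of the per-person water figures cited above.
# All inputs are rough estimates for illustration, not measured data.

world_population = 8.1e9    # approximate global population (assumed)
ai_water_low_l = 40         # litres per person per year (low estimate)
ai_water_high_l = 100       # litres per person per year (high estimate)

# Implied global totals, in billions of litres per year
global_low_bn_l = ai_water_low_l * world_population / 1e9
global_high_bn_l = ai_water_high_l * world_population / 1e9

# A typical washing-machine cycle uses roughly 50-100 litres, so the
# per-person AI footprint is on the order of one extra load per year.
laundry_load_l = 75
extra_loads_low = ai_water_low_l / laundry_load_l
extra_loads_high = ai_water_high_l / laundry_load_l

print(f"Implied global AI water use: {global_low_bn_l:.0f}-{global_high_bn_l:.0f} billion litres/year")
print(f"Equivalent laundry loads per person: {extra_loads_low:.1f}-{extra_loads_high:.1f}")
```

Even at the top of the range, the arithmetic lands at roughly one extra laundry load per person per year, which is why the global average alone looks unthreatening.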
However, interpreting the global average alone would miss the aspect of water use that matters most. Water resources are highly unevenly distributed geographically. A data centre located in a region with abundant water supply will have very different implications from one located in a drought-prone area where water is already scarce. Because data centres tend to cluster in specific locations near fibre networks, electricity infrastructure, and major technology hubs, their local impact can sometimes be more significant than global statistics suggest. In regions experiencing water stress, the arrival of a large data centre development can raise concerns among residents and policymakers about competition for limited water supplies.
This is why many technology companies and governments have begun paying closer attention to where new facilities are built and how they are cooled. Advances in cooling technology are gradually reducing the amount of water required. Some operators are shifting toward air-cooled systems, while others are experimenting with closed-loop liquid cooling systems that recycle water instead of allowing it to evaporate. Improvements in cooling efficiency have the potential to reduce both water and energy consumption significantly over time. Research into next-generation cooling approaches suggests that improvements in system design could reduce cooling energy requirements by as much as half in certain configurations, which would also lower the associated water footprint.
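The effect of halving cooling energy can be sketched using Power Usage Effectiveness (PUE), the standard ratio of total facility energy to IT-equipment energy. The baseline PUE and the share of overhead attributable to cooling below are illustrative assumptions, not figures from the article:

```python
# Rough illustration using Power Usage Effectiveness (PUE): total facility
# energy divided by IT-equipment energy. Baseline PUE and the cooling share
# of overhead are assumptions chosen for illustration.

it_energy = 1.0                 # normalised IT load
baseline_pue = 1.5              # assumed facility PUE
overhead = baseline_pue - 1.0   # 0.5 units of non-IT energy
cooling_share = 0.8             # assume cooling dominates the overhead

cooling_energy = overhead * cooling_share   # 0.4 units
other_overhead = overhead - cooling_energy  # 0.1 units

# Halving cooling energy, as the research cited above suggests is possible
new_pue = it_energy + other_overhead + cooling_energy / 2
total_saving = (baseline_pue - new_pue) / baseline_pue

print(f"Baseline PUE: {baseline_pue:.2f} -> improved PUE: {new_pue:.2f}")
print(f"Total facility energy saving: {total_saving:.0%}")
```

Under these assumptions, halving cooling energy trims total facility energy by roughly an eighth, and for evaporative systems it would cut the direct water footprint far more.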
For these reasons, water use is better understood as a local infrastructure question rather than a global environmental constraint. The overall scale of water consumption from AI remains modest when compared with other major economic activities. The more relevant issue is ensuring that data centres are built in locations where water resources can support them and that operators adopt cooling technologies that minimise unnecessary consumption. When these conditions are met, the water footprint of AI is unlikely to become a defining limitation on the technology’s growth.
Energy Demands and Consumption
Electricity demand represents a much larger and more consequential issue. Every AI system ultimately depends on electrical power to function. Electricity runs the processors performing calculations, the memory storing information, the networking equipment transferring data, and the cooling systems preventing hardware from overheating. As AI applications expand across industries and consumer services, the computing infrastructure required to support them must also grow, and with it the amount of electricity required.
At present, global data centres consume roughly 415 terawatt-hours (TWh) of electricity per year, which corresponds to about 1.5 percent of total global electricity demand, according to the International Energy Agency (IEA). That share has remained relatively stable for several years despite the steady growth of digital services. Efficiency improvements in hardware and software have historically offset much of the increase in computing demand. However, the rise of artificial intelligence is expected to accelerate electricity consumption more rapidly than previous waves of digital expansion.
Projections from the IEA suggest that electricity demand from data centres could reach roughly 945 TWh annually by 2030, more than doubling current consumption levels. To understand the scale of that number, it is useful to compare it with national electricity consumption. Annual electricity demand in Japan, for example, is slightly lower than that level today. In other words, global data centre infrastructure could soon require as much electricity as a large industrialised country.
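The scale of that projection can be unpacked with simple arithmetic, using the IEA figures quoted above (the six-year horizon to 2030 is an assumption for the growth-rate calculation):

```python
# Scale of the projected data-centre electricity growth cited above.
# Inputs are the IEA figures quoted in the article; the 2030 horizon
# is taken as roughly six years out.

current_twh = 415         # global data-centre consumption today (TWh/year)
projected_twh = 945       # IEA projection for 2030 (TWh/year)
global_share_now = 0.015  # ~1.5% of world electricity demand today

growth_factor = projected_twh / current_twh        # ~2.3x
years = 6
implied_cagr = growth_factor ** (1 / years) - 1    # compound annual growth rate

# Rough current world electricity demand implied by the 1.5% share
world_demand_twh = current_twh / global_share_now  # ~27,700 TWh

print(f"Growth factor to 2030: {growth_factor:.2f}x")
print(f"Implied annual growth rate: {implied_cagr:.1%}")
print(f"Implied world electricity demand today: {world_demand_twh:,.0f} TWh")
```

The implied growth rate of roughly 15 percent a year is what makes the comparison with Japan's national consumption plausible by the end of the decade.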
Artificial intelligence is a central driver behind this growth. Traditional data centre workloads such as cloud storage, web hosting, and video streaming require substantial computing power but tend to operate within relatively predictable performance ranges. AI workloads differ because they involve large-scale machine-learning computations that can be extremely intensive. Training advanced models involves processing massive datasets across thousands of specialised processors for extended periods of time. This process can take weeks or months and requires continuous high-performance computing.
Once a model has been trained, a second phase begins: serving real users. Each time someone interacts with an AI system, the model performs additional computations to generate a response. When millions or even billions of users interact with AI-powered tools daily, the cumulative electricity demand of these interactions becomes substantial. Over time, this “inference” stage can require more total computing power than the initial training process.
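Why inference can overtake training becomes clear from a toy calculation: a one-off training cost is set against a tiny per-query cost multiplied across billions of daily requests. All of the inputs below are placeholder assumptions chosen to illustrate the arithmetic, not figures from the article:

```python
# Hypothetical figures illustrating why the inference stage can exceed
# training over time: a fixed one-off training cost versus a small
# per-query cost at very high volume. All inputs are placeholder
# assumptions, not data from the article.

training_energy_mwh = 1_300   # one-off training run (assumed)
energy_per_query_wh = 0.3     # electricity per response (assumed)
queries_per_day = 1e9         # daily interactions (assumed)

# Convert total daily inference energy from Wh to MWh
daily_inference_mwh = energy_per_query_wh * queries_per_day / 1e6

# Days of serving before cumulative inference energy passes the training run
breakeven_days = training_energy_mwh / daily_inference_mwh

print(f"Inference energy per day: {daily_inference_mwh:.0f} MWh")
print(f"Break-even vs training: {breakeven_days:.1f} days")
```

With these assumed numbers, serving users overtakes the entire training run within a week; the exact break-even point shifts with the inputs, but the structure of the comparison does not.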
The growth of AI therefore translates directly into higher electricity demand. While the global share remains relatively modest, the impact becomes more noticeable at the national and regional level. In the United States, for example, projections suggest that data centres could account for nearly half of the growth in electricity demand between now and 2030. Similar trends are emerging in other countries experiencing rapid expansion of digital infrastructure.
This rising demand presents both challenges and opportunities. The challenge lies in infrastructure. Electricity systems require long planning horizons. Building new power plants, upgrading transmission lines, and expanding grid capacity often takes many years. Data centres, by contrast, can be built relatively quickly once land and permits are secured. If electricity infrastructure fails to expand at the same pace, power grids could face increased pressure during periods of high demand.
At the same time, the energy demand associated with AI could accelerate investment in new power generation. Many technology companies have already committed to purchasing large quantities of renewable energy through long-term power agreements. These agreements help finance the construction of new wind and solar projects while allowing data centre operators to secure reliable electricity supplies. Some companies are also exploring partnerships with nuclear developers and advanced geothermal projects as potential long-term sources of stable, low-carbon power.
Another factor shaping the energy footprint of AI is efficiency. Hardware manufacturers continue to develop processors that perform more computations using less electricity. Software improvements can also reduce the number of calculations required to complete a given task. These advances have historically played a significant role in limiting the growth of energy consumption in the computing sector. However, whether efficiency improvements will fully offset the rapid growth of AI workloads remains uncertain.
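The tension between rising workloads and improving efficiency can be modelled as two compounding rates pulling in opposite directions. The growth and efficiency rates below are illustrative assumptions, not forecasts:

```python
# Simple compounding model of the tension described above: demand for
# computation grows while energy per computation falls. Both rates are
# illustrative assumptions, not forecasts.

compute_growth = 0.40    # assumed annual growth in AI computation (40%)
efficiency_gain = 0.20   # assumed annual drop in energy per computation (20%)

energy = 1.0             # normalised electricity use in year 0
for year in range(1, 6):
    energy *= (1 + compute_growth) * (1 - efficiency_gain)
    print(f"Year {year}: relative electricity use = {energy:.2f}")

# With these assumptions, net energy use still rises ~12% per year:
# efficiency slows the growth but does not offset it.
```

The point of the sketch is structural: unless efficiency gains match or exceed workload growth in percentage terms, total electricity consumption keeps climbing.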
Ultimately, electricity demand represents the primary environmental and economic constraint associated with artificial intelligence. The global share of electricity used by data centres remains relatively small today, but the growth trajectory is steep enough that it will require careful planning from both technology companies and energy policymakers. The key factor determining the environmental impact will be the energy sources used to power this infrastructure. If the electricity comes from carbon-intensive generation, the climate implications could be significant. If it increasingly comes from renewable or other low-carbon sources, the environmental impact will be much smaller.
How Worried Should We Be?
Artificial intelligence does require substantial physical infrastructure, and that infrastructure inevitably consumes both water and electricity. Concerns about these resource demands are therefore understandable. At the same time, examining the numbers more closely reveals that the scale and significance of these impacts differ considerably.
Water consumption from AI remains relatively small when viewed in global terms. The more relevant issue is how and where data centres are built, since local water availability can determine whether new facilities create meaningful pressure on regional resources. Electricity demand represents a more significant challenge. As AI becomes embedded across industries and consumer services, data centre electricity consumption is expected to rise sharply. Ensuring that power systems can support that growth while continuing to reduce emissions will be one of the central infrastructure challenges of the digital economy.
The long-term environmental footprint of AI will therefore depend less on the technology itself and more on the systems that support it. Advances in cooling technology, improvements in computing efficiency, and the continued expansion of low-carbon electricity generation will all play a role in determining whether AI’s resource demands remain manageable. If those developments keep pace with the growth of digital infrastructure, the environmental constraints associated with AI are likely to remain limited. If they do not, the rapid expansion of computing demand could place increasing pressure on both energy systems and local resources.
💼 Unpacked
Data Centres
Large facilities that house the servers and networking equipment needed to store data and run digital services. Companies like Amazon, Microsoft, and Google operate thousands of data centres worldwide to power cloud computing, websites, and increasingly artificial intelligence systems. These facilities require large amounts of electricity and cooling to keep equipment running reliably.
AI Training
The process of teaching an artificial intelligence model to recognise patterns and generate outputs by analysing large datasets. During training, powerful processors perform vast numbers of calculations to adjust the model’s internal parameters. Training advanced models can take weeks or months and requires significant computing power and electricity.
Externalities
An externality occurs when an economic activity imposes costs or benefits on others that are not reflected in the market price. Environmental impacts are a common example. If AI data centres increase water stress in a region or raise electricity demand that leads to higher emissions, those effects may represent negative externalities unless they are accounted for through regulation or pricing.
📣 Support The Fiscal Compass
If you found this insightful, consider sharing with friends or colleagues. For weekly economics-led takes on markets, policy, and macro trends, subscribe to The Fiscal Compass.
Follow along on social media for concise updates throughout the week:
Instagram: @thefiscalcompassofficial
X: @FiscalCompass
LinkedIn: Vinay Meisuria
Sources
A Water Efficiency Dataset for African Data Centers
Food and Agriculture Organization – AQUASTAT Global Water Use Statistics
International Energy Agency – Energy and AI: Electricity demand from data centres
International Energy Agency – AI is set to drive surging electricity demand from data centres
Featured Image: Hanwha Data Centres, License