The True Cost of AI | MR Online

The true cost of AI: Water, energy, and a warming planet

Originally published: Bioneers on July 3, 2025 by Paris Marx (more by Bioneers)  | (Posted Aug 09, 2025)

AI doesn’t run on magic—it runs on energy, water, and massive physical infrastructure. As tech companies scale up generative AI, they’re building out hyper-scale data centers that consume millions of gallons of water per day and as much electricity as entire nations. These facilities are quietly reshaping local ecosystems and rapidly increasing global carbon emissions, all while companies promise a more “intelligent” future.

In this essay, tech critic Paris Marx unpacks the environmental footprint of AI’s infrastructure and asks: Is this the future we really want? Adapted from the Bioneers 2025 panel AI and the Ecocidal Hubris of Silicon Valley, this piece is the second in our four-part series exploring the unchecked impacts of artificial intelligence. Read to the end to access the other three essays.


PARIS MARX: Let’s go back to November 2022. You probably heard about an app called ChatGPT, released on November 30th. Almost overnight, generative AI was everywhere. It became the dominant topic of conversation, central to headlines, social media, and everyday discussions. The media couldn’t stop speculating about what ChatGPT might mean or how it could reshape society. OpenAI’s CEO, Sam Altman, was tweeting about how fast it was growing, as if rapid adoption alone proved that a massive transformation was underway. And with everyone from tech outlets to your social feed buzzing about it, it felt almost obligatory to try it out just to see what the fuss was about.

That launch was accompanied by a sweeping narrative: this was going to change the world. Something bigger was emerging—something with the potential to be incredibly powerful, maybe even beneficial, but also deeply unsettling.

Proponents of generative AI framed it as a leap in collective human intelligence. They promised a wave of AI assistants, each specialized for different industries—an architecture bot, a science bot, and so on. These tools, they claimed, would revolutionize entire sectors and possibly replace human workers along the way. At the same time, they made sure to pitch a silver lining: AI would vastly expand access to education and healthcare. But let’s be honest: When they talked about people going to AI doctors, they didn’t mean themselves. That was clearly meant for everyone else.


There may well be some positive outcomes from this technology, but there’s also the looming possibility of serious harm. The narrative goes something like this: We must develop AI, even though it might destroy the world. It could lead to the end of humanity. This mix of hype sprinkled with warnings of existential risk doesn’t just shape public perception; it influences how the media talks about AI and how organizations begin to position themselves in response to it.

The tech industry benefits from these grand, speculative conversations. They want us focused on how powerful AI might become someday, rather than examining how it’s already being used right now. It’s more convenient to keep eyes on the future than on the real impacts unfolding in the present.

That’s why it’s so important to understand the foundations of this technology—where it comes from, what it actually is, and why it feels like it’s suddenly everywhere.

So why, in November 2022, did a chatbot like ChatGPT emerge and suddenly dominate the tech conversation? I think there are three key reasons. The first is centralized computing power. Back in 2006, Amazon began building massive centralized cloud computing warehouses—what we now call data centers. Imagine an e-commerce warehouse, but instead of packages, it’s packed wall-to-wall with servers. These enormous facilities require a huge amount of energy. Over the past two decades, they’ve expanded rapidly and become essential to the infrastructure behind the internet and the digital platforms we use every day.

So why are we seeing this explosion of AI tools right now? Yes, they require centralized computing power, but they also need something else: massive amounts of data. Companies collect enormous quantities of information from the open web and beyond, feeding it into these models. The result? Tools that seem far more capable than previous versions, not because of magic, but because they’re powered by vastly more data and computing resources.

That’s why data collection is so central. It fuels not just generative AI but also targeted advertising and many other systems. To gather all that data, companies have built a vast surveillance infrastructure, quietly capturing information across nearly every corner of our digital lives.

But there’s a third ingredient here: money. Immense amounts of capital are required to build and scale this kind of infrastructure. Companies such as OpenAI are reportedly losing billions each year in the short term, betting that these tools will become profitable in the long run.

They can afford to take that risk because they’re backed by some of the largest, most valuable corporations in the world. These tech giants are channeling their capital into realizing their particular vision of the future—one that depends on expanding AI, increasing computational power, and rolling it all out at a global scale.

So what do these infrastructures actually look like?

We often talk about “the Cloud” as if it were something intangible—data floating in the ether. But in reality, all that data lives in massive physical facilities that require enormous amounts of power and water to operate.

Hyper-scale data centers are a step beyond the standard data centers that have existed for decades. These facilities are far larger in both their size and their impact. And they’re growing fast.

In 2018, there were about 430 hyper-scale data centers worldwide. By 2020, that number had jumped to 597. By the end of 2024, it had nearly doubled to 1,136. According to Synergy Research Group, another 504 are currently under construction or in the planning stages, driven largely by the surge in demand for generative AI infrastructure.
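A quick check of the growth claims in those figures (the counts are the Synergy Research numbers as quoted above; adding the pipeline to the current total is my own illustrative arithmetic, not a projection from the article):

```python
# Hyper-scale data center counts quoted in the text (Synergy Research Group)
counts = {2018: 430, 2020: 597, 2024: 1136}
pipeline = 504  # under construction or in planning

growth_2020_2024 = counts[2024] / counts[2020]
print(f"2020 to 2024: {growth_2020_2024:.2f}x")               # ~1.90x, i.e. "nearly doubled"
print(f"If the pipeline is built out: {counts[2024] + pipeline} facilities")  # 1640
```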

Roughly 40 to 50 percent of these centers are located in the U.S., though international growth is accelerating, especially in China. The three biggest players—Amazon, Microsoft, and Google—own about half of them.

As these facilities multiply, so do concerns from the communities where they’re built. One data center requires significant resources, but build five or ten in the same area, and the strain on local power and water systems becomes hard to ignore.

Around the world, more and more communities are beginning to push back, and for good reason. Hyper-scale data centers such as Google’s use an average of 550,000 gallons of water per day, or about 200 million gallons per year, primarily for cooling. Just as a laptop heats up under heavy use, these massive facilities, housing tens of thousands of constantly running servers, generate an enormous amount of heat. That heat has to go somewhere, so water and air conditioning systems are used to keep things cool.
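To put that water figure in perspective, a back-of-the-envelope conversion (the 550,000 gallons/day average is from the text; the liters-per-gallon factor is the standard US conversion, and the annual total simply assumes constant daily use):

```python
GALLONS_PER_DAY = 550_000      # average for a hyper-scale facility such as Google's (per the text)
LITERS_PER_US_GALLON = 3.785   # standard US gallon-to-liter conversion

gallons_per_year = GALLONS_PER_DAY * 365
liters_per_day = GALLONS_PER_DAY * LITERS_PER_US_GALLON

print(f"{gallons_per_year:,} gallons/year")  # 200,750,000 -- the article's "about 200 million"
print(f"{liters_per_day:,.0f} liters/day")   # roughly 2.1 million liters every day
```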

Between 2022 and 2023 alone, Google’s water use across its data centers rose by 20 percent. At Microsoft, it jumped 34 percent. And that was before the generative AI boom really gained momentum, so it’s safe to say those numbers have only gone up since.

In pursuit of lower costs, many companies are building hyper-scale data centers in more remote or arid regions such as Arizona or parts of Spain—where water is already scarce. These areas often offer more access to renewable energy, which allows companies to market the facilities as “green,” but in reality, this shift puts even greater stress on already fragile water supplies.

Next, of course, is energy use. Globally, data centers currently account for about 2–3% of total electricity consumption. In the U.S., that number is closer to 5%, since, as mentioned earlier, a disproportionate number of data centers are located here, and that demand is only set to grow. In 2022, data centers, along with crypto and AI infrastructure, consumed about 460 terawatt hours of electricity worldwide—roughly equivalent to the total electricity use of France. By 2026, the International Energy Agency projects that number will more than double to 1,050 terawatt hours—about the same as Japan’s total annual electricity use. That’s a massive escalation in just a few years.
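As a sanity check on those IEA figures, the implied growth rate works out as follows (the compound-annual framing is my own arithmetic, not the article's):

```python
twh_2022 = 460    # TWh: data centers, crypto, and AI worldwide in 2022 (IEA, per the text)
twh_2026 = 1050   # TWh: IEA projection for 2026

overall = twh_2026 / twh_2022      # total growth over the period
annual = overall ** (1 / 4) - 1    # implied compound annual growth over 4 years

print(f"{overall:.2f}x overall")          # ~2.28x -- "more than double"
print(f"~{annual * 100:.0f}% per year")   # roughly 23% growth every year
```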

Ireland is on the frontlines of this issue. Right now, 21% of all metered electricity used in Ireland goes to data centers. In winter, this creates serious strain on the grid, sometimes triggering public alerts that warn residents to reduce energy use or risk outages. As a result, there’s growing pressure to expand what has been a temporary moratorium on new data centers in Dublin. But Ireland’s struggle is just the tip of the iceberg; similar tensions are emerging in communities around the world.


So, where are we headed? Generative AI really began taking off at the end of 2022, and the momentum hasn’t slowed. In early 2024, OpenAI CEO Sam Altman told Bloomberg at the World Economic Forum: “We need way more energy in the world than I think we thought we needed before. We still don’t appreciate the energy needs of this technology.” He went on to say that the world may soon have to embrace geoengineering as a stopgap for climate impacts, unless, of course, we have a breakthrough in nuclear energy. In other words, we’re pushing forward with AI, no matter the energy cost, and if it overwhelms the planet, we’ll just have to engineer our way out of it.

More recently, we’ve seen a major shake-up coming out of China. You might have heard about DeepSeek, a company that’s doing what American AI companies are doing, but far more efficiently. Its emergence rattled the industry, causing U.S. tech stock prices to dip as investors began to question whether this AI boom is really all it’s cracked up to be, and whether the massive buildout by U.S. companies was truly justified. But of course, they’re not backing down.

Not long after DeepSeek’s debut, Sam Altman, Oracle CEO Larry Ellison, and SoftBank’s Masayoshi Son went to the White House to announce a $500 billion investment—code-named Stargate—aimed at building even more massive, nuclear-powered data centers. Meanwhile, Nvidia CEO Jensen Huang responded to DeepSeek’s efficiency by saying that greater efficiency will only drive greater demand, ultimately requiring 100 times more computing capacity. In his view, more efficient models don’t reduce resource use, they multiply it.

But is that actually what’s happening?

We’re starting to see some serious cracks in the foundation. Microsoft has recently canceled a number of data center leases, raising red flags for investors. Even leaders such as Alibaba’s Chairman Joe Tsai have warned that we may be in the middle of an AI data center build-out bubble.

So I’ll leave you with two final questions.

First: Who gets to decide what kinds of technology we build? Should those decisions be left to people such as Sam Altman or Microsoft’s Satya Nadella? Or should we be making these choices democratically, asking whether it really makes sense to invest staggering amounts of water, energy, and materials into technologies whose benefits are still unclear?

And second: How much computation do we actually need? Do we really need to build out endless data centers to support a flood of AI tools with questionable uses—tools that often serve tech companies’ bottom lines more than the public good? These companies rely on constantly growing demand for Cloud services to keep profits up, but that doesn’t mean we have to go along with it. It’s worth asking: how much computing capacity do we truly need? I’d argue it’s probably a lot less than what they want us to believe.

This series—adapted from the Bioneers 2025 session AI and the Ecocidal Hubris of Silicon Valley—offers critical perspectives on the systems driving the AI boom and the broader impacts of techno-solutionism.

In the first essay, journalist and activist Koohan Paik-Mander delivered a sweeping critique of AI’s role in accelerating climate collapse, inequality, and authoritarian control—framing the technology as a force multiplier for late-stage capitalism.

Monthly Review does not necessarily adhere to all of the views conveyed in articles republished at MR Online. Our goal is to share a variety of left perspectives that we think our readers will find interesting or useful. —Eds.