Making AI Environmentally Friendly

Will our improvements to AI's voracious appetite for electricity keep pace with the exploding demand?

Skeptoid Podcast #994



by Brian Dunning
June 24, 2025

In 2017, there were various financial upheavals all around the world. Japan became the first country to recognize Bitcoin as a legal method of payment, and Australia followed soon after. The natural result was that Bitcoin's value jumped roughly 20× that year, and more entrepreneurs than ever began Bitcoin mining operations in earnest — often with millions of dollars of investor backing. Why did they need it? Two reasons: First, you have to buy a whole bunch of tiny computers that are optimized for these particular types of calculations; and second, you'll quickly learn that your electric bills are going through the roof. These tiny computers — often referred to as mining rigs — suck up an incredible amount of electricity to power their processors. This, in turn, causes those processors to emit a surprising amount of heat, which sends your air conditioning bill into the stratosphere to keep the computers from melting down. And while that was the problem a decade or so ago, its analog has reared the same ugly head again — not in cryptocurrency, but this time in AI.
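Those "particular types of calculations" are brute-force hashing: guessing nonce after nonce until one produces a hash below a target value. A toy Python sketch (not Bitcoin's real block format or difficulty) shows why the work is pure, open-ended number crunching with no shortcut:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce until the block's SHA-256 hash starts with
    `difficulty` leading hex zeros. Every guess requires a full hash
    computation, which is why mining rigs run their processors flat-out."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# A toy difficulty of 4 hex zeros takes ~65,000 guesses on average;
# real Bitcoin difficulty demands on the order of 10^19 hashes per block.
nonce = mine("example block", 4)
```

Raising `difficulty` from 4 to 8 multiplies the expected work by 65,536 — the electricity scales with the guessing race itself, not with any useful output.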

Just listen to these recent nightmarish reports:

  • A 2024 study by KnownHost found that ChatGPT alone produces 261 tons of CO₂ per month, with every page view producing 1.59 g. But some tools, such as Rytr, produce as much as 10.1 g of CO₂ per page view.

  • Using data from Lawrence Berkeley National Laboratory, a 2025 article in the MIT Technology Review found that at current trends, AI in the United States alone will consume as much power as 22% of all US homes by 2028.

  • In its 2024 sustainability report, Google revealed that in the past five years, AI has caused their greenhouse gas emissions to increase by 48%. Moreover, they expect their energy use to triple by 2030.

  • At the 2024 World Economic Forum, OpenAI CEO Sam Altman told Bloomberg that the future power needs of AI are currently unreachable, and that "There's no way to get there without a breakthrough."

All of this negativity surrounding AI's environmental impact is — to put it bluntly — basically true. But as with all burgeoning technologies, it's necessary to go through the steps of doing things badly on the way to learning to do them well. The first airplanes were terrible, but we pushed through in order to learn to make better ones. 150 years ago, medical care was likely to do more harm than good, but that's how we learned to do it so well today.

In 2022, Ethereum — the second largest cryptocurrency after Bitcoin — made a fundamental change to the way it functions, in order to address these environmental concerns and massive costs. This was switching from Proof of Work validation to Proof of Stake validation, and while you don't need to understand what that means, it meant a reduction in power consumption of over 99%. Might there be a fundamental shift like this in the future of AI?

In January 2025, we thought there might be. You may have heard about DeepSeek, a Chinese AI that claimed to be 90% more efficient than other models. A 90% reduction in power consumption would indeed be a game changer, but it turned out this claim was not what everyone hoped. The efficiency gains were realized only during DeepSeek's initial training period. The hardware used for this was indeed highly optimized, but the model was trained for only 10% as many GPU hours as those it was being compared against, including Meta's Llama AI. So, naturally, it consumed only about 10% as much power. But now that it's trained and up and running, analysts have found that DeepSeek is actually less efficient at running each query given to it, consuming up to twice as much power as the same query on Llama. So, once again, be skeptical of amazing science news coming from China.

Let's take a quick look at how AI is sucking up all this power. There are various kinds of AIs for different applications — neural networks, natural language processing, generative AI, machine learning — but they all run on basically the same hardware. A conventional data center that runs websites and cloud computing consists of thousands of computers, each with a CPU, memory, and storage, but little else. But like cryptocurrency mining, AI computers are constantly doing complex calculations, and so they are much more reliant on GPUs.

Most of us know GPUs (graphics processing units) as that thing our computer needs to drive an extra monitor, or to play video games with amazing real-time graphics. While a CPU may have a few powerful cores, a GPU has thousands of tiny cores allowing it to perform thousands of simpler, repetitive tasks simultaneously, in parallel. This is ideal for the matrix operations that are central to AI processing, so today, these simple yet highly specialized machines loaded up with powerful GPUs are what carry the lion's share of AI, and what eat up all that power.
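Those matrix operations are simple to picture: a neural network layer is little more than a big matrix multiply, in which every output element is an independent dot product that one GPU core can compute in parallel with all the others. A minimal NumPy sketch, with toy sizes made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A single dense layer: y = activation(x @ W + b).
# Real models chain thousands of these, with matrices of tens of
# thousands of rows and columns -- which is why GPUs' thousands of
# small parallel cores dominate the workload.
batch, d_in, d_out = 32, 512, 256          # toy sizes, for illustration
x = rng.standard_normal((batch, d_in))     # a batch of inputs
W = rng.standard_normal((d_in, d_out))     # the layer's weights
b = np.zeros(d_out)                        # the layer's biases

y = np.maximum(x @ W + b, 0.0)             # ReLU activation

# Each of the 32 x 256 outputs is a dot product of 512 multiply-adds,
# all independent of one another -- hence trivially parallelizable.
print(y.shape)  # prints (32, 256)
```

The CPU version of this runs one dot product at a time; a GPU dispatches them all at once, which is the entire reason AI data centers are built around GPUs rather than CPUs.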

However, growing needs drive innovation, and innovation in the land of AI hardware means ever-greater efficiency. Higher efficiency serves two needs at the same time: it improves the speed and power of the AIs, and doing more with less also means reduced power consumption. In 2015, Google developed a new kind of chip called a TPU (tensor processing unit). These are optimized for processing multidimensional arrays, a central computing function of AI. They're designed explicitly for this and can't really do anything else, thus they work faster and consume less energy than GPUs.

A TPU is one kind of ASIC (application-specific integrated circuit). These are chips that are hardwired to perform a single specific function or algorithm. Compared to running that same task in software, an ASIC does it far faster and requires much lower computing and power resources to do so. TPUs are not the only ASICs that have been developed for AI, but you get the idea. The more the AI algorithms mature, the more the most resource-intensive part of the AI infrastructure can be made vastly more efficient.

Another interesting way AI can be made more resource-efficient is through methodological improvements like model pruning and quantization. This is something like using heuristics, or shortcuts in the way we think about things. If you ask an AI whether you should bring an umbrella this afternoon, there are a million answers it could give you that you don't care about, all of which would require computing time. How many raindrops per square meter per second are falling? What's the barometric pressure likely to do in the next three hours? No. You only care whether it's raining a lot or hardly at all. Simplifying the math where it makes sense, using fewer significant digits, and ignoring all but the most important inputs can cut the size of the job tremendously. Faster results, less power consumed. More useful in every way.
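The "fewer significant digits" half of that idea is exactly what quantization does. Here is a minimal sketch (not any particular framework's implementation) of symmetric 8-bit quantization: store each weight as an int8 plus one shared scale factor, cutting memory 4× at the cost of a small rounding error:

```python
import numpy as np

rng = np.random.default_rng(1)
weights = rng.standard_normal(10_000).astype(np.float32)  # toy model weights

# Symmetric 8-bit quantization: one float scale maps all weights to int8.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# To use the weights, multiply back by the scale (dequantization).
dequant = q.astype(np.float32) * scale

memory_saving = weights.nbytes / q.nbytes       # float32 -> int8 is 4x smaller
max_error = np.abs(weights - dequant).max()     # bounded by scale / 2

print(f"{memory_saving:.0f}x smaller, worst-case rounding error {max_error:.4f}")
```

Smaller weights mean less memory traffic and cheaper integer arithmetic per query — the power savings come directly from doing less precise math where the extra precision wasn't buying anything.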

A question we might ask at this point is which is likely to accelerate faster: improvements in efficiency, or demand for more AI? We do have a solid answer for this, and it's not the one we'd like to hear. Remember, those numbers at the top of the show were basically correct. As of now, 2025, the best projections show that global data center power consumption will probably double by 2030; and that's in spite of all the gains in efficiency. But it's also due, in part, to those same gains in efficiency. As we reduce their energy consumption, we're able to do more with them. They become even more useful, and that drives their demand even faster. It's a type of feedback loop we call the snowball effect: the faster it rolls, the more snow it picks up and the heavier it gets, making it roll even faster, and so on. This is the Jevons Paradox, named for the 19th century British economist William Stanley Jevons: increased efficiency leads to increased consumption.
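The arithmetic of that feedback loop is easy to sketch. Both growth rates below are invented for illustration, not measured data, but they show how modest demand growth can swamp steady efficiency gains:

```python
# Hypothetical illustration of the Jevons Paradox -- the 20% and 45%
# rates are made up for illustration, not real-world measurements.
energy_per_query = 1.0   # energy cost of one AI query, arbitrary units
queries = 1.0            # relative query demand

total_by_year = []
for year in range(2025, 2031):
    total_by_year.append((year, energy_per_query * queries))
    energy_per_query *= 0.80   # 20% annual efficiency improvement
    queries *= 1.45            # 45% annual demand growth, outpacing it

# Net annual growth factor: 0.80 * 1.45 = 1.16, so total consumption
# roughly doubles over five years despite steady efficiency gains.
for year, total in total_by_year:
    print(year, round(total, 2))
```

Whenever the demand multiplier exceeds the reciprocal of the efficiency multiplier, total consumption rises — which is precisely the pattern the projections above describe.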

But we live in a capitalistic world. Supply and demand are forever intertwined. When demand becomes too great, we have to do one of two things: increase the supply, or reduce the demand. In this case we're not physically able to increase the supply, so we do the other thing: reduce the demand, and we do that by raising the price. Expect AI to get more expensive. Potentially a lot more expensive: as much as it takes to avoid melting the grid.

But wait, you say, a lot of AI is open source, meaning the algorithms and software (or at least analogs comparable to the commercial versions) are freely available to all. That's nice. The hardware and the electricity are not. This is a system where water is going to find its own level.

One thing nearly all the industry experts are projecting is that data centers are going to turn increasingly to renewable energy. It's the one variable in this equation that's a one-time expenditure. Invest once now, and the energy needs will be covered for the industry's next phase of growth.

And that brings us to a whole other side of this issue that many people don't take into consideration. Powering the AI engines may indeed have a high environmental impact, but some of what the AIs are doing is protecting the environment. An AI can be trained to do just about anything, and whether your application is reducing carbon emissions or protecting old growth forests, somebody probably already has an AI at work on that problem. Is the benefit each program produces worth the cost of generating the power to run it? Maybe in some cases it is, and in some it isn't. Let's take a look at a few examples:

  • Here's one that more than directly pays for itself. One way we capture carbon out of smokestacks is to react the flue gas with a limestone slurry which can absorb the carbon. Pumping that slurry takes a lot of power. The University of Surrey developed an AI which samples the CO₂ in the flue in real time, looks at current renewable energy availability and grid energy prices, and dynamically adjusts both the slurry pump rate and the slurry pH. In field trials in India, the system saved 22% in power costs while capturing 17% more CO₂.

  • Another way this is done is with the use of MOFs (metal organic frameworks) which are materials that selectively adsorb CO₂ directly out of the gases. A team from Argonne National Laboratory, the University of Illinois, and the University of Chicago set up an AI to invent new MOF compounds with a high predicted carbon selectivity. In only 30 minutes it came up with 120,000 of them, which were then fed to a supercomputer to run molecular dynamics simulations on them. Six were as good as the top industrial adsorbents on the market. That was just in the first 30 minutes — although the supercomputer simulations took considerably longer.

  • A California nonprofit called Rainforest Connection has developed a novel system consisting of a sensitive microphone, transmitter, minicomputer, and solar panels that is mounted high in the treetops in places like the Brazilian rainforest. The computer uses onboard AI to analyze the sounds being recorded, listening for the telltale signs of illegal logging operations and poaching. This allows law enforcement to catch the operators red-handed, whereas before they had to rely on random patrols of hopelessly vast areas.

Of course these are only three of the many, many such initiatives that turn to AI to increase the efficiency and cleanliness of processes throughout the world. But so far, they are not nearly enough to offset the costs of running the AIs. Renewable energy helps, but it too is fighting a losing battle. And so far, nobody sees anything on the horizon comparable to what Ethereum did in 2022 to cut its consumption by 99%.

The bottom line is that gains against the environmental impact of AI are likely to be evolutionary, not revolutionary. In the meantime, we can probably expect economic levers to be about the only effective tool we have, and that means jacking up the cost more and more to reduce the demand.

Hopefully, in a few years, I'll have the pleasure of updating this episode with a major new development.



Cite this article:
Dunning, B. (2025, June 24) Making AI Environmentally Friendly. Skeptoid Media. https://skeptoid.com/episodes/4994

 

References & Further Reading

Boschee, P. "Comments: Grabbing the Brass Ring To Power the Demand for Data Centers and Generative AI." Journal of Petroleum Technology. 1 May 2024, Volume 76, Number 5: 8-9.

Google. 2024 Environmental Report. Mountain View, CA: Google, 2024.

Howson, P. "DeepSeek claims to have cured AI’s environmental headache. The Jevons paradox suggests it might make things worse." The Conversation. The Conversation US, Inc., 31 Jan. 2025. Web. 8 Jun. 2025. <https://theconversation.com/deepseek-claims-to-have-cured-ais-environmental-headache-the-jevons-paradox-suggests-it-might-make-things-worse-248720>

O'Donnell, J. "DeepSeek might not be such good news for energy after all." MIT Technology Review. Massachusetts Institute of Technology, 31 Jan. 2025. Web. 8 Jun. 2025. <https://www.technologyreview.com/2025/01/31/1110776/deepseek-might-not-be-such-good-news-for-energy-after-all/>

O'Donnell, J., Crownhart, C. "We did the math on AI’s energy footprint. Here’s the story you haven’t heard." MIT Technology Review. Massachusetts Institute of Technology, 20 May 2025. Web. 4 Jun. 2025. <https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/>

Shehabi, A., et al. 2024 United States Data Center Energy Usage Report. Berkeley, CA: Lawrence Berkeley National Laboratory, 2024.

Staff. "Carbon Footprint of AI Tools." KnownHost Blog. KnownHost LLC, 9 Dec. 2024. Web. 4 Jun. 2025. <https://www.knownhost.com/blog/carbon-footprint-of-ai-tools/>

 

©2025 Skeptoid Media, Inc. All Rights Reserved. Rights and reuse information

 

 

 
