How ChatGPT’s Energy Use Is Crushing Microsoft’s Carbon Goals

Microsoft's push into AI, particularly through ChatGPT, is driving a sharp rise in energy use and carbon emissions. Training an advanced model like GPT-4 consumed an enormous amount of electricity, equivalent to what over 1,000 U.S. households use in five to six years. As Microsoft's data centers scale up to support these AI workloads, the company's electricity use has soared from 11 to 24 terawatt-hours in just four years, accompanied by a 42% rise in carbon emissions. Despite its ambitious pledge to become carbon-negative by 2030, the AI boom is putting Microsoft's green goals at serious risk, reflecting a broader struggle in the tech industry to balance innovation with sustainability.

Big Tech's AI arms race comes at a high energy cost. According to one research estimate, for instance, training OpenAI's GPT-4 required up to 62,000 megawatt-hours of electricity, roughly the amount 1,000 American households consume over the course of five to six years.
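To put that figure in perspective, here is a rough back-of-the-envelope check. It is only a sketch: the 62,000 MWh estimate comes from the article, while the average household consumption of about 10,800 kWh per year is an assumed value in line with published U.S. averages.

```python
# Back-of-the-envelope check of the GPT-4 training-energy comparison.
# Assumption: an average U.S. household uses roughly 10,800 kWh of electricity
# per year; the 62,000 MWh training estimate is taken from the article above.

GPT4_TRAINING_MWH = 62_000            # reported upper-bound training energy estimate
HOUSEHOLDS = 1_000                    # number of households in the comparison
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_800   # assumed average annual household consumption

training_kwh = GPT4_TRAINING_MWH * 1_000          # convert MWh to kWh
kwh_per_household = training_kwh / HOUSEHOLDS     # each household's share
years_equivalent = kwh_per_household / AVG_HOUSEHOLD_KWH_PER_YEAR

print(f"Energy per household: {kwh_per_household:,.0f} kWh")
print(f"Equivalent years of household use: {years_equivalent:.1f}")
# Prints roughly 5.7 years, consistent with the 'five to six years' comparison.
```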

This is also evident when monitoring Microsoft's power use...
