AI's energy demands: where does the energy come from?


Allen McCallie ‘26, Business and Ad Manager

Thanks to new capabilities of artificial intelligence (AI), including tools like content generation, virtual assistants and chatbots, usage has skyrocketed since the release of chatbots like ChatGPT. Now that major companies like Google and Microsoft have launched their own chatbots, Gemini and Copilot, the question arises: where does all the energy come from?

According to an article in the MIT Technology Review, the average query to Google Gemini uses about 0.24 watt-hours of energy, similar to running a TV for under nine seconds. Larger chatbots like ChatGPT use more, at about 0.34 watt-hours per query. Energy use varies widely: a simple task like answering a math question takes less energy than writing a long essay. Tasks like image and video generation take more energy and processing power, up to around three watt-hours, which is comparable to running a small LED bulb for an hour.
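As a rough way to check these comparisons, here is a minimal sketch in Python that converts watt-hours per query into equivalent device runtime. The device wattages are assumptions (a television drawing roughly 100 watts and a small LED bulb drawing about 3 watts), not figures from the MIT Technology Review article.

```python
# Back-of-the-envelope check of the energy comparisons above.
# Assumed device wattages (not from the cited article): ~100 W TV, ~3 W LED bulb.

def equivalent_runtime_seconds(energy_wh: float, device_watts: float) -> float:
    """Return how many seconds a device drawing `device_watts` watts
    could run on `energy_wh` watt-hours of energy."""
    return energy_wh / device_watts * 3600  # convert hours to seconds

# One average Gemini query (~0.24 Wh) vs. a ~100 W television:
print(equivalent_runtime_seconds(0.24, 100))        # ~8.6 seconds of TV time

# One image/video generation task (~3 Wh) vs. a ~3 W LED bulb:
print(equivalent_runtime_seconds(3, 3) / 3600)      # ~1 hour of LED light
```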

Most of the energy AI consumes is used at data centers, where it powers giant server farms, cloud computing, and cooling systems. The massive demand for AI has data centers consuming immense amounts of energy, putting serious strain on power grids.

Water use at data centers is also a concern for people who live nearby. In common practice, most of the water is evaporated into the atmosphere, and what remains is discharged into municipal wastewater systems, industrial wastewater facilities, or surface waters. Data centers' water consumption often rivals that of cities, with average daily rates ranging from 300,000 gallons to millions of gallons. A big fear is that the water is not treated well enough, or is too hot, to release into rivers, lakes, or oceans. Water that still contains minerals can harm wildlife, and hot water can cause thermal pollution. Most of the time, because of laws, permits, and quality controls, pollution is avoided, but there have been reported cases of thermal pollution in places like the Great Lakes region.

Where data centers get their energy differs from place to place, but the most common practice is to draw power from the local grid. Another method is to burn natural gas in on-site turbines and generators for quick power. Renewable energy from local solar and wind farms is very common in the Southwest, in places like Arizona, Nevada, California, and Texas. All of the big AI companies are striving for a zero carbon footprint, and one of the best ways to get there is nuclear energy. According to the US Department of Energy, nuclear power produces little waste, has a small land footprint, and is a zero-emission energy source. Companies are spending billions to run data centers directly from nuclear power plants: Amazon has invested 650 million dollars in a completely nuclear-powered data center, Microsoft has entered a 1.6 billion dollar deal to restart a nuclear reactor at Three Mile Island, and the Stargate project, involving OpenAI, SoftBank, Microsoft, Nvidia, and others, is investing 500 billion dollars in advanced data centers that rely heavily on nuclear reactors.

For now, most data centers still run mostly on fossil fuels, but the prospects for AI reaching a zero carbon footprint are promising. This innovation will hopefully lead to a healthier Earth; the technology is still relatively new, and there are technological and ethical advances that need to be made. Still, the future of this technology looks bright, and hopefully one day AI will run with a zero carbon footprint.