ChatGPT uses 17,000 times more electricity every day than the average household

Artificial intelligence is revolutionizing our world, but at what cost? The disproportionate energy consumption of these technologies is becoming a major problem. ChatGPT, a popular conversational chatbot, uses 17,000 times more electricity every day than the average American household. Is a sustainable future for AI possible?

Image: ChatGPT uses a lot of energy © Tom’s Guide

While artificial intelligence (AI) is revolutionizing many areas, a shadow hangs over its development: excessive energy consumption. Recent research shows that ChatGPT, the popular conversational chatbot, consumes 17,000 times more electricity every day than an average American household.

This means that this energy-hungry technology consumes the equivalent of 493,000 kilowatt-hours per day, a staggering figure from an ecological standpoint. By comparison, the average daily consumption of an American household is only 29 kilowatt-hours.
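A quick back-of-the-envelope check shows how the two figures quoted above relate; the values are the article's estimates, not independent measurements:

```python
# Figures quoted in the article (estimates, not measurements)
CHATGPT_KWH_PER_DAY = 493_000   # estimated daily consumption of ChatGPT
HOUSEHOLD_KWH_PER_DAY = 29      # average US household daily consumption

# Ratio between the two: 493,000 / 29 = 17,000
ratio = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
print(f"ChatGPT uses ~{ratio:,.0f}x a household's daily electricity")
```

The division comes out to exactly 17,000, which is where the headline figure originates.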

ChatGPT: a chatbot with an insatiable appetite for electricity

This energy appetite is all the more worrying as AI takes off. Given the sector’s exponential growth and its large-scale deployment, the situation threatens to become critical.

Alex de Vries, data scientist at De Nederlandsche Bank, predicts that annual consumption in the AI sector could reach 85 to 134 terawatt-hours by 2027. This colossal figure represents almost 0.5% of global electricity consumption. So much so that the industry is considering nuclear reactors to meet the energy demands of data centers.
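The 0.5% claim can be sanity-checked against global electricity use; the global figure below (~25,500 TWh/year) is an approximate assumption for illustration, not a number from the article:

```python
# De Vries's 2027 projection for the AI sector (from the article)
AI_TWH_LOW, AI_TWH_HIGH = 85, 134

# Approximate global annual electricity consumption (assumption)
GLOBAL_TWH = 25_500

low = AI_TWH_LOW / GLOBAL_TWH * 100
high = AI_TWH_HIGH / GLOBAL_TWH * 100
print(f"AI share of global electricity by 2027: {low:.2f}%-{high:.2f}%")
```

With these assumptions the upper end of the range lands just above 0.5%, consistent with the article's "almost 0.5%".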

To illustrate, integrating AI into every Google search would result in an annual consumption of 29 billion kilowatt-hours, which exceeds the electricity consumption of entire countries. These alarming figures raise crucial questions about the ability of our electricity infrastructure to support the development of AI.
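To put the Google-search projection in household terms, the 29 billion kWh figure can be converted using the household consumption quoted earlier in the article:

```python
# Figures from the article
GOOGLE_AI_KWH_PER_YEAR = 29_000_000_000   # 29 billion kWh per year
HOUSEHOLD_KWH_PER_DAY = 29                # average US household

# Convert to terawatt-hours and to an equivalent number of households
twh_per_year = GOOGLE_AI_KWH_PER_YEAR / 1e9
households = GOOGLE_AI_KWH_PER_YEAR / (HOUSEHOLD_KWH_PER_DAY * 365)
print(f"{twh_per_year:.0f} TWh/year, roughly {households / 1e6:.1f} "
      f"million US households' annual electricity")
```

That is 29 TWh per year, on the order of the annual electricity consumption of a smaller European country.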

Figure 1: ChatGPT uses 17,000 times more electricity every day than the average household

In addition to energy consumption, the operating costs of these technologies are exorbitant. OpenAI, the company behind ChatGPT, reportedly spends $700,000 every day to keep its chatbot running. Cooling adds to the bill: each ChatGPT conversation reportedly requires the equivalent of a bottle of water to dissipate the heat generated, while powering the underlying data centers demands output on the scale of a nuclear reactor.

ChatGPT is not an isolated case. Other AI models such as Microsoft Copilot could consume enough energy to power a small country by 2027, according to some studies.

Faced with this observation, the question arises about the viability of AI development. How can we reconcile technological progress with the need to protect the environment?

  • AI is hungry for energy: ChatGPT uses 17,000 times more electricity than a US household.
  • By 2027, AI could consume 0.5% of global electricity.
  • Sustainable solutions are needed to reconcile technological progress and environmental protection.

Sources: The New Yorker, Digiconomist
