The rapid expansion of artificial intelligence, particularly generative AI such as ChatGPT, has raised concerns about the technology's growing energy consumption.
A study estimated that training the model behind ChatGPT on 10,000 NVIDIA GPUs consumed electricity equivalent to powering 121 homes in the United States for a year. This massive energy demand is driven largely by the extensive training phase of large AI models. As demand for AI chips grows, the industry's energy footprint is expected to surge.
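The homes-equivalent comparison is straightforward arithmetic. As a hedged illustration (the study's exact inputs are not given here): widely reported estimates put GPT-3's training at roughly 1,287 MWh, and the U.S. EIA puts average household consumption at roughly 10,600 kWh per year. Dividing the two reproduces a figure close to the 121 homes cited above.

```python
# Illustrative back-of-the-envelope check; both inputs are assumptions,
# not figures taken from the study itself.
TRAINING_ENERGY_MWH = 1_287        # widely reported GPT-3 training estimate
AVG_US_HOME_KWH_PER_YEAR = 10_600  # approximate EIA average household usage

homes_equivalent = (TRAINING_ENERGY_MWH * 1_000) / AVG_US_HOME_KWH_PER_YEAR
print(round(homes_equivalent))  # ~121 homes for one year
```

Under these assumed inputs the result lands at about 121, consistent with the comparison in the article; different estimates of training energy or household usage would shift the number accordingly.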
The study emphasizes the need to address AI's energy consumption not only during training but also during the inference phase, when AI tools generate outputs in response to user queries. While some efforts are underway to reduce AI's power consumption, the industry faces the challenge of balancing rapid growth with sustainability.