AI processing could eventually consume as much electricity as the whole of Ireland, warns The Register.

AI and Energy Consumption: The Impact of AI on Datacenter Electricity Usage

Introduction

The recent surge of interest in AI, particularly large language models (LLMs) and generative AI, has raised concerns about a potential increase in datacenter electricity consumption. Those concerns are examined in a paper by Alex de Vries, a researcher at Vrije Universiteit Amsterdam.

The Sustainability of AI

When considering the sustainability of AI, most research has focused on the resource-intensive training phase of AI models. De Vries argues, however, that the inference phase, in which the trained model is run to serve user requests, may contribute significantly to an AI model’s life-cycle costs.

The Energy Demand of AI Models

De Vries supports this argument with the estimated energy demand of OpenAI’s ChatGPT. Serving the model is estimated to require 3,617 Nvidia HGX A100 servers with a total of 28,936 GPUs, implying an energy demand of 564 MWh per day. By comparison, training GPT-3 consumed an estimated 1,287 MWh in total.
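
To see how quickly operating costs catch up, a back-of-the-envelope check in Python, using only the two estimates above, is enough:

```python
# Figures cited in de Vries's paper (estimates, not measurements).
TRAINING_MWH = 1_287         # one-off energy cost of training GPT-3
INFERENCE_MWH_PER_DAY = 564  # daily energy demand of serving ChatGPT

# Days of inference needed to match the entire training run.
days_to_match = TRAINING_MWH / INFERENCE_MWH_PER_DAY
print(f"Inference matches the full training cost after ~{days_to_match:.1f} days")
# -> ~2.3 days; every day of operation after that exceeds the total
#    energy that was spent on training.
```

At these rates, training stops dominating the life-cycle energy budget within the model’s first week of service, which is the crux of de Vries’s point about inference.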

Google’s AI-Powered Search Engine

Google, following Microsoft’s lead, is integrating AI-powered capabilities into its search engine. According to the paper, this could push electricity consumption to approximately 3 Wh per search, roughly ten times that of a standard keyword search.
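
Scaled across Google’s query volume, that per-search figure adds up quickly. The sketch below assumes roughly 9 billion searches per day, a commonly cited ballpark that is an assumption here rather than a number from the article; the keyword-search baseline is simply the 3 Wh figure divided by the stated factor of ten:

```python
SEARCHES_PER_DAY = 9e9   # ASSUMPTION: ballpark daily Google search volume
WH_PER_AI_SEARCH = 3.0   # per-search estimate cited in the paper
WH_PER_KEYWORD_SEARCH = WH_PER_AI_SEARCH / 10  # implied ~0.3 Wh baseline

daily_gwh = SEARCHES_PER_DAY * WH_PER_AI_SEARCH / 1e9  # Wh -> GWh
annual_twh = daily_gwh * 365 / 1_000                   # GWh -> TWh
print(f"AI search: ~{daily_gwh:.0f} GWh/day, ~{annual_twh:.1f} TWh/year")
# -> ~27 GWh/day and ~9.9 TWh/year from per-query scaling alone.
```

The paper’s worst-case scenario in the next section, which is based on the hardware that would be needed rather than on per-query scaling, comes out higher still.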

The Potential Electricity Consumption of AI

According to de Vries, if every Google search became an LLM interaction, the electricity needed could rival the annual consumption of a country like Ireland (roughly 29 TWh per year). For context, the paper puts Google’s total electricity consumption for 2021 at 18.3 TWh, with AI accounting for an estimated 10 to 15 percent of that. This worst-case scenario assumes full-scale AI adoption on current hardware and software, which de Vries considers unlikely to happen rapidly.
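
The 10 to 15 percent share translates into absolute figures directly; a minimal sketch using only the article’s own numbers:

```python
GOOGLE_TOTAL_TWH_2021 = 18.3  # Google's 2021 electricity use, per the paper

for share in (0.10, 0.15):
    print(f"AI at {share:.0%}: ~{GOOGLE_TOTAL_TWH_2021 * share:.1f} TWh/year")
# -> roughly 1.8 to 2.7 TWh/year already attributable to AI today,
#    versus a worst-case full-adoption scenario of roughly 29 TWh/year.
```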

A Realistic Projection

For a more realistic projection, the paper turns to expected shipments of Nvidia-based AI servers, as Nvidia holds an estimated 95 percent share of the AI server market. Analysts estimate that Nvidia will ship 100,000 AI server platforms in 2023, which would have a combined power demand of 650 to 1,020 MW and an annual electricity consumption of up to 8.9 TWh.
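
The annual figure follows from the quoted power demand under the simplifying assumption that the servers run continuously at the stated load, which appears to be how the paper’s upper bound is derived:

```python
SERVERS_SHIPPED = 100_000  # analyst estimate for Nvidia AI platforms in 2023
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

for power_mw in (650, 1_020):
    per_server_kw = power_mw * 1_000 / SERVERS_SHIPPED  # implied draw per server
    annual_twh = power_mw * HOURS_PER_YEAR / 1e6        # MW x h = MWh; /1e6 -> TWh
    print(f"{power_mw} MW ~ {per_server_kw:.1f} kW/server ~ {annual_twh:.1f} TWh/yr")
# -> 650 MW  ~  6.5 kW/server ~ 5.7 TWh/yr
# -> 1020 MW ~ 10.2 kW/server ~ 8.9 TWh/yr
```

The implied 6.5 to 10.2 kW per server is consistent with the rated draw of current multi-GPU AI platforms, which is a useful sanity check on the range.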

Don’t forget the Jevons paradox

The paper also considers the Jevons paradox, in which improvements in efficiency lead to greater overall demand. For AI, this means that gains in model and hardware efficiency could make AI cheaper to deploy, driving wider use and offsetting much of the savings.

The Future of AI-related Electricity Consumption

While the future electricity consumption of AI processing is difficult to predict, de Vries suggests that various resource factors, such as constraints in the AI server supply chain, are likely to limit the growth of global AI-related electricity consumption in the near term. However, the research warns against expecting efficiency improvements to fully offset long-term growth in AI-related electricity consumption, and it questions the wisdom of deploying AI in every application without weighing the costs.

Editor Notes

This article highlights the potential impact of AI on datacenter electricity usage. The research by Alex de Vries raises important questions about the sustainability of AI and the need to weigh the energy consumption of AI models against their benefits. As AI becomes more prevalent, striking a balance between technological advancement and sustainable practice will be crucial.



