Environmental impact of artificial intelligence
The environmental impact of artificial intelligence includes substantial electricity consumption for training and running deep learning models, along with the associated carbon footprint and water usage. Moreover, artificial intelligence data centers are materially intensive, requiring large quantities of electronics built from specialized mined metals that are eventually disposed of as e-waste.
Some scientists argue that AI may also provide solutions to environmental problems, such as material innovations, improved grid management, and other forms of optimization across various fields of technology.
As the environmental impact of AI becomes more apparent, governments have begun instituting policies to improve oversight and review of the environmental issues associated with the use of AI and with related infrastructure development.
Carbon footprint and energy use
Individual level
Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search, while other researchers have put the figure at 10 times the electricity of a Google search. Microsoft and Meta have both reported increases in their carbon footprints attributed to AI. In June 2025, OpenAI executive Sam Altman stated that the average ChatGPT query used about 0.34 watt-hours of electricity and about 0.000085 gallons of water.

According to a study by Luccioni, Jernite and Strubell, simple classification tasks performed by AI models consume on average 0.002 to 0.007 Wh per prompt. Text generation and text summarization each require around 0.05 Wh per prompt on average, while image generation is the most energy-intensive, averaging 2.91 Wh per prompt. The least efficient image generation model used 11.49 Wh per image, roughly equivalent to half a smartphone charge.
Researchers at the University of Michigan measured the energy consumption of various Meta Llama 3.1 models released in 2024 and found that smaller language models use about 114 joules per response, while larger models require up to 6,700 joules per response. This corresponds to the energy needed to run a microwave oven for roughly one-tenth of a second and eight seconds, respectively.
According to researchers, the median Google Gemini text prompt in 2025 consumes about 0.24 Wh of electricity. Google's improvements in software efficiency and its procurement of clean energy reduced the energy use of a typical prompt by a factor of 33, and its carbon emissions by a factor of 44, over the course of a year. In practical terms, the median Gemini text prompt uses roughly as much energy as watching nine seconds of television.
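These everyday equivalences can be checked with straightforward unit conversions. The sketch below is illustrative only: the microwave wattage, smartphone-charge energy, and television wattage are assumed round figures chosen to match the comparisons above, not values taken from the cited studies.

```python
# Rough conversions of the per-prompt energy figures above into everyday
# equivalents. Appliance figures are assumptions for illustration, not
# values taken from the cited studies.
MICROWAVE_W = 800        # assumed microwave power draw, watts
PHONE_CHARGE_WH = 22     # assumed energy for one full smartphone charge, Wh
TV_W = 100               # assumed television power draw, watts

def microwave_seconds(joules: float) -> float:
    """Seconds the assumed microwave runs on this much energy (1 W = 1 J/s)."""
    return joules / MICROWAVE_W

def phone_charges(wh: float) -> float:
    """Fraction of a full smartphone charge this much energy represents."""
    return wh / PHONE_CHARGE_WH

def tv_seconds(wh: float) -> float:
    """Seconds of television viewing (convert Wh to joules, then divide)."""
    return wh * 3600 / TV_W

print(f"114 J small-model response:   {microwave_seconds(114):.2f} s of microwave use")
print(f"6,700 J large-model response: {microwave_seconds(6700):.1f} s of microwave use")
print(f"11.49 Wh image generation:    {phone_charges(11.49):.0%} of a smartphone charge")
print(f"0.24 Wh Gemini text prompt:   {tv_seconds(0.24):.1f} s of television")
```

Under these assumptions the script reproduces the quoted comparisons: roughly 0.14 and 8.4 seconds of microwave use, about half a smartphone charge, and close to nine seconds of television.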
A 2024 Scientific Reports study compared the estimated carbon impacts of human writers and artists with those of selected AI systems, calculating that the humans had a 130 to 2,900 times higher carbon impact. A 2025 study criticized that comparison on the grounds of differences in output quality, and found a counterexample in programming tasks completed with GPT-4, which had a carbon impact 5 to 19 times that of human programmers.
System level
Carbon footprint
AI has a significant carbon footprint due to its growing electricity consumption, especially during training and usage. Researchers have argued that the carbon footprint of training should be considered when attempting to understand the impact of AI models. One study suggested that by 2027, energy costs for AI could grow to 85–134 TWh, nearly 0.5% of all current electricity usage. Training large language models and other generative AI generally requires much more electricity than running a single prediction on the trained model; repeatedly using a trained model, however, can easily multiply the cumulative electricity cost of predictions. The computation required to train the most advanced AI models doubles every 3.4 months on average, leading to exponential growth in power usage and the resulting carbon footprint.

Additionally, artificial intelligence algorithms running in regions that rely predominantly on fossil fuels for energy exert a much higher carbon footprint than those in regions with cleaner energy sources. Models can be modified to lessen their environmental impact at the cost of accuracy, emphasizing the importance of finding a balance between accuracy and environmental impact.

Recent research has shown that developing and training large language models can produce significant environmental impacts, including approximately 493 metric tons of carbon dioxide emissions and 2.77 million liters of water use when the full lifecycle, from hardware manufacturing to model training, is considered.
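To make the growth claim concrete, the 3.4-month doubling time and the 2027 projection can be worked through numerically. The sketch below is illustrative; the ~27,000 TWh figure for current global electricity usage is an assumed round number, not taken from the cited study.

```python
# Implication of the 3.4-month doubling time: training compute grows by
# a factor of 2^(months / 3.4), roughly an order of magnitude per year.
def compute_growth(months: float, doubling_months: float = 3.4) -> float:
    return 2 ** (months / doubling_months)

print(f"Compute growth after 1 year:  ~{compute_growth(12):.1f}x")   # ~11.6x
print(f"Compute growth after 2 years: ~{compute_growth(24):.0f}x")   # ~133x

# The projected 85-134 TWh for AI by 2027 as a share of current global
# electricity usage; ~27,000 TWh/year is an assumed round figure.
GLOBAL_ELECTRICITY_TWH = 27_000
for projection_twh in (85, 134):
    share = projection_twh / GLOBAL_ELECTRICITY_TWH
    print(f"{projection_twh} TWh is ~{share:.2%} of global electricity usage")
```

The upper projection works out to about 0.50% of the assumed global total, matching the "nearly 0.5%" figure quoted above.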
BERT, a language model trained in 2019, required "the energy of a round-trip transcontinental flight" to train. GPT-3 released 552 metric tons of carbon dioxide into the atmosphere during training, "the equivalent of 123 gasoline-powered passenger vehicles driven for one year". Much of the energy cost is due to inefficient model architectures and processors. BLOOM, a model from Hugging Face, was trained on more efficient chips and therefore released only 25 metric tons of CO2. Incorporating the energy cost of manufacturing the system's chips doubled that carbon footprint, to "the equivalent of around 60 flights between London and New York." Operating BLOOM each day was estimated to release a carbon footprint equivalent to driving 54 miles.
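The quoted equivalences can be sanity-checked from the article's own numbers. In the sketch below, the per-car and per-flight reference values used for comparison are assumptions close to commonly cited estimates, not figures from the cited studies.

```python
# Sanity checks on the quoted equivalences, derived from the article's
# own numbers. The reference values in the comments are assumptions.
GPT3_TRAINING_T = 552        # GPT-3 training emissions, metric tons CO2
CAR_YEARS = 123              # quoted passenger-vehicle-year equivalence
print(f"Implied per car-year: {GPT3_TRAINING_T / CAR_YEARS:.2f} t CO2")
# ~4.5 t, close to the ~4.6 t/year commonly cited for a passenger vehicle.

BLOOM_LIFECYCLE_T = 50       # 25 t from training, doubled by chip manufacturing
FLIGHTS = 60                 # quoted London-New York flight equivalence
print(f"Implied per flight:   {BLOOM_LIFECYCLE_T / FLIGHTS:.2f} t CO2")
# ~0.83 t, in line with typical per-passenger estimates for that route.
```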
Algorithms with low individual energy costs can still have significant carbon footprints when they run millions of times a day. Research in 2024 suggested that integrating AI into search engines could multiply their energy costs significantly. One estimate found that integrating ChatGPT into every Google search query would use 10 TWh each year, equivalent to the yearly electricity usage of 1.5 million European Union residents.
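The resident equivalence follows directly from the quoted total, as the short check below shows; the resulting per-capita figure is an implication of the article's numbers, not an independently sourced statistic.

```python
# Scale effect of cheap queries: check the quoted 10 TWh/year figure
# against the 1.5 million EU residents it is said to equal.
TOTAL_TWH = 10
EU_RESIDENTS = 1_500_000

kwh_per_resident = TOTAL_TWH * 1e9 / EU_RESIDENTS   # 1 TWh = 1e9 kWh
print(f"~{kwh_per_resident:,.0f} kWh per resident per year")   # ~6,667 kWh
```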
Increased computational demands from AI have raised both water and energy usage, placing significantly more demand on the grid. Due to increased energy demands from AI-related projects, coal-fired plants in Kansas City and West Virginia have pushed back their closures, and utilities in the Salt Lake City region have delayed the retirement of their coal-fired plants by up to a decade. Environmental debates have raged in both Virginia and France over whether a moratorium should be called on additional data centers. At the World Economic Forum in 2024, Sam Altman gave a speech in which he said that the AI industry can only grow if there is a major technological breakthrough to increase energy development.
The carbon footprint of an AI model depends on the energy source used, with data centers running on renewable energy lowering their footprint. Many tech companies claim to offset energy usage by buying power from renewable sources, though some experts argue that utilities simply replace the claimed renewable energy with increased non-renewable generation for their other customers. The carbon footprint of AI models also remains difficult to determine, because their emissions are aggregated into overall data center footprints, companies differ in their reporting, and some models may reduce the carbon footprints of other industries.
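The dependence on energy source can be illustrated with a simple intensity calculation: emissions are energy consumed multiplied by the grid's carbon intensity. In the sketch below, the intensity values and the 1 GWh workload are rough assumptions for illustration, not figures from any cited source.

```python
# The same workload has very different footprints depending on the grid.
# Carbon intensities below are rough assumed values (kg CO2 per kWh).
GRID_INTENSITY_KG_PER_KWH = {
    "coal-heavy grid": 0.95,
    "natural gas":     0.45,
    "wind/solar":      0.02,
}

def emissions_tonnes(energy_mwh: float, kg_per_kwh: float) -> float:
    """Metric tons of CO2 for a workload consuming energy_mwh."""
    return energy_mwh * 1_000 * kg_per_kwh / 1_000   # kWh x kg/kWh -> t

HYPOTHETICAL_RUN_MWH = 1_000   # a hypothetical 1 GWh training run
for grid, intensity in GRID_INTENSITY_KG_PER_KWH.items():
    t = emissions_tonnes(HYPOTHETICAL_RUN_MWH, intensity)
    print(f"{grid:>16}: {t:,.0f} t CO2")
```

Under these assumptions the identical workload ranges from roughly 20 to 950 metric tons of CO2, a nearly fifty-fold spread driven entirely by the grid mix.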
Some applications of machine learning, such as fossil fuel discovery and exploration, may worsen climate change. The use of AI for personalized online marketing may also drive increased consumption of goods, which could raise global emissions. Critics have highlighted that AI companies are also signing contracts with polluters, helping those companies expand production and thereby increasing their environmental impact.
A 2023 bibliographic report from Inria and CEA-Leti emphasizes that AI's carbon footprint must be assessed using full life cycle analysis, including hardware manufacturing, training energy, and deployment phases, rather than only operational emissions.
Energy use and efficiency
AI chips use more energy and emit more heat than traditional CPU chips, and AI models with inefficiently implemented architectures or trained on less efficient chips may use more energy still. Since the 1940s, the energy efficiency of computation has doubled every 1.6 years.

The International Energy Agency released its 2025 electricity analysis and forecast in February 2025, projecting 4% growth in global electricity demand over the following three years due to data center growth, increased industrial production, increased electrification, and increased use of air conditioning. By 2027, energy consumption in the US is expected to grow by an amount equivalent to California's entire annual power usage, largely driven by energy-hungry data centers and manufacturing operations. In 2024, U.S. electricity generation rose by 3%, with data centers emerging as a dominant force behind the increase. Demand is expected to keep growing as semiconductor and battery manufacturing facilities increase production.
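Setting the two doubling times side by side shows why efficiency gains alone have not contained AI's energy growth. The sketch below combines the 3.4-month compute doubling cited earlier with the 1.6-year efficiency doubling; the net-growth formula is an illustrative simplification, not a figure from the cited sources.

```python
# Compute for frontier training doubles every 3.4 months, while energy
# efficiency doubles only every 1.6 years (19.2 months), so net energy
# use still grows: energy scales as 2^(m/3.4) / 2^(m/19.2).
def net_energy_growth(months: float) -> float:
    return 2 ** (months / 3.4) / 2 ** (months / 19.2)

print(f"Net energy growth after 1 year: ~{net_energy_growth(12):.1f}x")
# ~7.5x: efficiency gains offset only a fraction of compute growth.
```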
In 2024, a U.S. public policy group reported that artificial intelligence and other emerging industries with potential global economic influence are marked by high electricity consumption. As a result, U.S. energy policy is expected to focus on ensuring a reliable and sufficient electricity supply to support key sectors considered important for maintaining economic and technological competitiveness.
The rapid growth of artificial intelligence has led to a sharp increase in electricity demand, posing challenges for the sector’s continued expansion. In Northern Virginia, a major hub for AI data centers, the wait time to connect large facilities to the electrical grid has increased to seven years, reflecting pressure on existing energy infrastructure. Across the United States, utilities are experiencing the most substantial surge in electrical demand in decades. This strain is directly contributing to longer wait times for grid connections, complicating efforts to maintain the country's technological leadership in AI. The significance of these energy challenges extends beyond logistics. A New York Times editorial emphasized the critical role of energy infrastructure, stating that "Electricity is more than just a utility; it's the bedrock of the digital era. If the United States truly wants to secure its leadership in A.I., it must equally invest in the energy systems that power it."
Globally, the electricity consumption of data centers rose to 460 terawatt-hours (TWh) in 2022. This would have made data centers the 11th-largest electricity consumer in the world, between the nations of Saudi Arabia and France, according to the Organisation for Economic Co-operation and Development.