AI data center
An AI data center is a specialized data center facility designed for the computationally intensive tasks of training and running inference for artificial intelligence and machine learning models. Unlike general-purpose data centers, they are optimized for the parallel processing demands of AI workloads, typically utilizing hardware such as AI accelerators and high-speed interconnects.
The global push to construct these specialized facilities accelerated dramatically during the AI boom of the 2020s. The demand has reshaped supply chains: memory manufacturers have prioritized production of the High Bandwidth Memory essential for AI servers, contributing to a global memory supply shortage, and a broader competition for advanced chips, power, and infrastructure has emerged.
Architecture
Data centers for building and running large machine learning models contain specialized computer chips, typically GPUs, that use two to four times as much energy as their CPU counterparts. Companies such as Nvidia and Google have designed chips specifically for machine learning, which can perform thousands of calculations in parallel. Thousands of these chips are packed closely together in data centers, alongside specialized hardware and cabling that move data quickly between them. Operators have developed new techniques to cool these systems. Google pumps large amounts of water through its data centers, using pipes that run next to the computer chips, a practice that can strain nearby water supplies. Cirrascale uses large chillers to cool its water, which is largely recycled, though this approach consumes more electricity.

According to PCMag, AI data centers use 60 or more kilowatts of power per server rack, whereas more conventional data centers typically use 5-10 kilowatts per rack.
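A back-of-the-envelope comparison illustrates the difference in power density. The sketch below uses the per-rack figures from the PCMag estimate above; the rack count and the power usage effectiveness (PUE) value are hypothetical assumptions chosen only for illustration.

```python
# Illustrative comparison of facility power draw using the per-rack figures
# cited above (60+ kW for an AI rack vs. 5-10 kW for a standard rack).
# The rack count and PUE are hypothetical assumptions, not reported values.

AI_RACK_KW = 60          # lower bound for an AI server rack (PCMag figure)
STANDARD_RACK_KW = 8     # midpoint of the 5-10 kW range for a standard rack
RACKS = 1_000            # hypothetical facility size
PUE = 1.2                # assumed power usage effectiveness (cooling and overhead)

def facility_mw(rack_kw: float, racks: int, pue: float) -> float:
    """Total facility draw in megawatts, including cooling and other overhead."""
    return rack_kw * racks * pue / 1_000

print(f"AI facility:       {facility_mw(AI_RACK_KW, RACKS, PUE):.0f} MW")
print(f"Standard facility: {facility_mw(STANDARD_RACK_KW, RACKS, PUE):.1f} MW")
# With these assumptions the AI facility draws roughly 7-8 times as much
# power as a conventional facility with the same number of racks.
```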
American Big Tech companies have described these facilities as essential to building artificial general intelligence. The term AI factory, coined by Nvidia's CEO Jensen Huang, has been used as an alternative phrase for AI data center by some technology companies.
Operators
As of August 2025, The Information tracked 18 planned or existing AI data centers in the United States, operated by Amazon Web Services, CoreWeave, Crusoe, Meta, Microsoft/OpenAI, Oracle, Tesla, and xAI. Other AI data center operators include Digital Realty and Alibaba. Data centers are also being built in China, India, Europe, Saudi Arabia, and Canada. The New Yorker described CoreWeave as the most prominent AI data center operator in the United States.

Two types of data center providers for machine learning have been noted: hyperscalers and neoclouds. The Verge listed large technology companies such as Google, Meta, Microsoft, Oracle and Amazon as hyperscalers. The New York Times described neoclouds as "a new generation of data center providers". CoreWeave, Nebius, Nscale, and Lambda have been described as examples of neoclouds.
In January 2025, OpenAI, in partnership with Oracle and SoftBank, announced the Stargate project, which as of September 2025 comprises six built or proposed AI data centers in the United States.
In response to the Stargate project, Amazon launched an AI data center in October 2025 on 1,200 acres of farmland in Indiana. This data center, known as Project Rainier, is one of the largest AI data centers in the world, with Amazon spending $11 billion on the project. Rainier is specifically intended for training and running machine learning models from Anthropic. As of October 2025, the site contains seven data centers and is expected to use 2.2 gigawatts of electricity and millions of gallons of water per year. Trainium 2 computer chips, developed by Annapurna Labs and Anthropic, were designed for use in such facilities. Amazon pumped millions of gallons of water out of the ground to construct the data center, and as of June 2025, Indiana state officials were investigating whether this dewatering led to dry wells for local residents.
In November 2025, Anthropic announced a plan in partnership with Fluidstack to develop artificial intelligence infrastructure in the United States, including data centers in New York and Texas, worth $50 billion.
Other AI data center projects include xAI's Colossus supercomputer; Hyperion, a Meta project in Louisiana expected to use 5 GW of power; and Prometheus, a second Meta project, in Ohio, with a capacity of 1 GW. A 3,200-acre AI data center capable of 4.4-4.5 GW, located on the site of the decommissioned Homer City Generating Station, is under construction as of 2025 and will draw on seven 30-acre gas generating stations supplied with natural gas by EQT.
As of December 2025, CRH is working on over 100 data centers in the United States.
In 2025, ExxonMobil and NextEra announced plans to build a data center powered by natural gas and using carbon capture technology, with 1.2 GW of power capacity. They previously purchased 2,500 acres of land in the Southeastern United States and plan to market the data center to an artificial intelligence company.
The increased interest in AI data centers has made billionaires of several executives at companies in that space, including CoreWeave, QTS, Nebius, Astera Labs, Groq, Fermi, Snowflake, and Cipher Mining.
Several companies involved in cryptocurrency mining, such as Bitdeer, CoreWeave, Cipher Mining, TeraWulf, IREN, Core Scientific, and CleanSpark, have also been involved with AI data centers.
Finances
Between January and August 2024, Microsoft, Meta, Google and Amazon collectively spent $125 billion on AI data centers. Citigroup forecasted that $2.8 trillion would be spent on AI data centers by 2030, while McKinsey & Company estimated that almost $7 trillion would be spent globally by that time. According to S&P Global, $61 billion was spent on the data center market as a whole in 2025, while debt issuance for data centers reached $182 billion during the same year.

Large technology companies have offloaded the financial risks of building AI data centers by setting up special purpose vehicles or by contracting with neoclouds. For example, Meta's Hyperion was mostly funded by Blue Owl Capital, which did so using a bond offering from PIMCO. Those bonds were sold to a number of clients, including BlackRock. Meta did not borrow money itself and instead established a special purpose vehicle from which it would rent the data center. The $30 billion deal, structured by Morgan Stanley, was the largest known private capital transaction as of 2025.
Neoclouds such as CoreWeave have gone into debt to buy computer chips from Nvidia for their data centers, using the chips themselves as loan collateral. As of December 2025, CoreWeave had taken out three GPU-backed loans, collectively worth $12.4 billion, from private credit firms and from banks. These companies thus provide an indirect connection between private credit and established banks. Data center operators have also issued asset-backed securities, and data center debt has its own derivative financial products.
The real estate industry, including asset managers, public companies and private investors, has also invested in data centers.
Energy sourcing
As of 2024, data centers in the United States are primarily powered by natural gas, which supplies 40% of their electricity. The Associated Press reported that electricity for AI data centers in the United States would likely come from natural gas or oil, as companies prefer using currently available power plants, which primarily burn fossil fuels. Non-renewable energy is also often cheaper in the locations where data centers are developed, and experts believe the energy demands of generative AI and data centers will be difficult to meet with renewable energy alone. Some companies, such as Google, Amazon and Meta, have expressed interest in nuclear power for their data centers. Other data centers, such as Wonder Valley in Canada, plan to use their own off-grid natural gas and geothermal plants. Similarly, xAI is using onsite gas turbines for Colossus, while OpenAI and Meta plan to use natural gas generators at their Stargate and Prometheus projects, respectively. VoltaGrid, a Texas-based energy company, proposed installing 33 reciprocating internal combustion engines at a data center in Covington, Georgia. Electric vehicle batteries have also been used to power data centers, including Colossus, and many data centers use lithium-ion batteries for backup power.

Power utility companies upgrade their infrastructure to handle the demands of new data centers, and the cost of these changes typically falls on consumers: smaller businesses and individual households.
In December 2025, the Federal Energy Regulatory Commission published a unanimous order allowing data centers in the United States to have a direct connection with power plants. United States Secretary of Energy Chris Wright expressed support for bringing retired coal plants back online to power AI data centers. President Donald Trump paused leasing for offshore wind projects, a decision Gizmodo criticized because of the projects' potential to supply power to AI data centers. Electricity demands from AI data centers have led the United States federal government, power companies and power grid operators to slow or reverse the retirement of peaking power plants.
Environmental footprint
A typical AI data center has an electricity footprint equivalent to that of 100,000 households and uses billions of gallons of water to cool its hardware. In 2025, the International Energy Agency estimated that the larger AI data centers currently under construction could consume as much electricity as 2 million households. A 2024 report from the United States Department of Energy stated that data centers overall used 17 billion gallons of water per year in the United States, primarily due to the "rapid proliferation of AI servers", and forecasted that this usage would grow to nearly 80 billion gallons by 2028. Researchers estimated that AI data centers in the United States would emit 24–44 million metric tons of carbon dioxide and use 731–1,125 million cubic meters of water per year between 2024 and 2030.

Peaking power plants, which have been proposed as a power source for AI data centers, emit sulfur dioxide and have historically been located disproportionately near communities of color in the United States.
Reciprocating internal combustion engines, proposed as another power source for data centers, emit fine particulate matter (PM2.5), nitrogen oxides, and volatile organic compounds.
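The household-equivalent figures cited above can be reproduced with a rough conversion. The sketch below assumes an average US household consumption of about 10,500 kWh per year (a commonly cited EIA-based estimate) and treats each facility as running continuously at its stated power; the specific load values are illustrative.

```python
# Rough conversion from continuous electrical load to household equivalents.
# The household consumption figure is an assumed average (~10,500 kWh/year);
# the loads below are illustrative, with 2.2 GW matching the Project Rainier
# figure mentioned earlier in the article.

HOURS_PER_YEAR = 8_760
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual US household use

def household_equivalents(load_gw: float) -> float:
    """Number of households whose combined annual consumption matches a
    facility running continuously at load_gw gigawatts."""
    kwh_per_year = load_gw * 1_000_000 * HOURS_PER_YEAR   # GW -> kW, times hours
    return kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

for load_gw in (0.12, 1.0, 2.2):
    print(f"{load_gw:.2f} GW ~ {household_equivalents(load_gw):,.0f} households")
# A continuous load of roughly 0.12 GW corresponds to about 100,000 households,
# consistent with the footprint of a typical AI data center described above.
```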