Elon Musk has confirmed that xAI is buying a power plant overseas and shipping it to the US to power the company’s new data center, which will house one million AI GPUs and draw up to 2 gigawatts of power under one roof, according to Tom’s Hardware, citing Dylan Patel of SemiAnalysis. Two gigawatts is roughly enough to power 1.9 million homes.
“They [xAI] are buying a power plant and shipping it from overseas because you can’t get a new one in time in the US,” Patel said in a recent post on X. He added that 200,000 GPUs are already running at Colossus, xAI’s main AI data center in Memphis, Tennessee. Colossus, built in 122 days, is one of the world’s most powerful and power-hungry machines, housing some 200,000 Nvidia Hopper GPUs and consuming an astounding 300 MW of power. xAI has installed 35 gas turbines that can produce 420 MW of power and has deployed Tesla Megapack systems to help power Colossus.
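As a rough sanity check on those figures, here is a back-of-envelope sketch in Python using only the numbers quoted above; the per-GPU average it derives is an estimate, not an official specification:

```python
# Quick sanity check on the Colossus figures reported above.
# All inputs come from the article; the per-GPU average is derived.

colossus_gpus = 200_000        # Nvidia Hopper GPUs reported at Colossus
colossus_draw_mw = 300         # reported facility draw, in MW
turbine_capacity_mw = 420      # 35 gas turbines, per the article (~12 MW each)

# Average facility power per installed GPU, including CPUs, networking,
# cooling, and other overhead.
watts_per_gpu = colossus_draw_mw * 1_000_000 / colossus_gpus

print(f"Average all-in draw per GPU: {watts_per_gpu:.0f} W")   # ~1,500 W
print(f"Turbine headroom over load:  {turbine_capacity_mw - colossus_draw_mw} MW")  # 120 MW
```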
xAI’s next AI data center is predicted to house one million AI GPUs, thus potentially consuming the same amount of power as 1.9 million households. Musk has assembled vast computing resources and a team of talented researchers to advance the company’s Grok AI models, Patel said.
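The household comparison is simple division of facility power by an assumed average household draw. Published averages for US homes work out to roughly 1.0 to 1.2 kW of continuous draw, so the sketch below treats that per-home figure as an assumption rather than a fixed constant:

```python
# Household equivalence: facility power divided by average household draw.
# The per-home draw is an assumption; US averages span roughly 1.0-1.2 kW.

facility_power_mw = 2_000                  # 2 GW, as stated in the article

for avg_home_kw in (1.0, 1.05, 1.2):       # assumed average continuous draw per home
    homes = facility_power_mw * 1_000 / avg_home_kw
    print(f"At {avg_home_kw} kW per home: ~{homes / 1e6:.2f} million homes")

# The article's 1.9 million figure corresponds to roughly 1.05 kW per home.
```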
Beyond the Colossus buildout, xAI is rapidly acquiring and developing new facilities. The company has purchased a factory in Memphis that is being converted into additional data center space, big enough to power around 125,000 eight-way GPU servers (one million GPUs in total), along with all supporting hardware, including networking, storage, and cooling.
A million Nvidia Blackwell GPUs will consume between 1,000 MW (1 GW) and 1,400 MW (1.4 GW), depending on the accelerator models (B200, GB200, B300, GB300) used and their configuration, according to Tom’s Hardware.
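That range follows from straightforward multiplication. The per-accelerator wattages in the sketch below are rough planning assumptions chosen to match the low and high ends of the article's estimate, not official specifications:

```python
# GPU-only draw for a one-million-accelerator cluster.
# Per-unit wattages are assumed planning figures for Blackwell-class parts.

gpu_count = 1_000_000

assumed_watts_per_gpu = {
    "low end (B200-class)":   1_000,   # ~1.0 kW per accelerator (assumed)
    "high end (GB300-class)": 1_400,   # ~1.4 kW per accelerator (assumed)
}

for label, watts in assumed_watts_per_gpu.items():
    total_mw = gpu_count * watts / 1_000_000
    print(f"{label}: {total_mw:,.0f} MW ({total_mw / 1_000:.1f} GW)")
```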
However, the GPUs are not the only load on the power system; you must also account for the power consumption of CPUs, DDR5 memory, storage, networking gear, cooling, air conditioning, power supply inefficiency, and other factors such as lighting. In large AI clusters, a useful approximation is that overhead adds another 30% to 50% on top of the AI GPU power draw, a figure typically expressed as PUE (power usage effectiveness).
Taking that overhead into account, and depending on which Blackwell accelerators xAI plans to use, a million-GPU data center would consume between 1,400 MW and 1,960 MW (assuming a PUE of 1.4).
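A short sketch of how that facility-level range follows from the GPU-only estimate; the PUE values bracket the 30% to 50% overhead mentioned above, with 1.4 being the figure used for the final range:

```python
# Scale GPU-only draw to facility draw with a PUE-style overhead factor.
# 1.3 and 1.5 bracket the 30%-50% overhead range; 1.4 matches the article.

gpu_only_mw = (1_000, 1_400)   # GPU-only range from the previous estimate

for pue in (1.3, 1.4, 1.5):
    low, high = (round(mw * pue) for mw in gpu_only_mw)
    print(f"PUE {pue}: {low:,} MW to {high:,} MW at the facility level")

# At PUE 1.4 this reproduces the 1,400 MW to 1,960 MW range quoted above.
```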
A large-scale solar power plant alone is not viable for a 24/7 compute load of this magnitude, as one would need several gigawatts of panels, plus massive battery storage, which is prohibitively expensive and land-intensive.
The most practical and commonly used option is building multiple natural gas combined-cycle gas turbine (CCGT) plants, each capable of producing 500 MW – 1,500 MW. This approach is relatively fast to deploy (several years), scalable in phases, and easier to integrate with existing electrical grids. Perhaps this is what xAI plans to import to the U.S., said Tom’s Hardware.
Alternatives like nuclear reactors could technically meet the load with fewer units (each producing around 1,000 MW) and no direct carbon emissions, but nuclear plants take much longer to design, permit, and build (up to 10 years). It is unlikely that Musk has managed to buy a nuclear power plant overseas with plans to ship it to the U.S.
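To put the unit counts for both options in perspective, here is a small sketch dividing the high-end 1,960 MW estimate by the plant sizes quoted above; the counts ignore reserve margins, outages, and transmission losses:

```python
# How many plants of each size would cover the high-end 1,960 MW estimate.
# Plant sizes are the figures quoted in the article; counts are minimums.

import math

target_mw = 1_960

options = {
    "small CCGT (~500 MW)":     500,
    "large CCGT (~1,500 MW)":   1_500,
    "nuclear unit (~1,000 MW)": 1_000,
}

for label, unit_mw in options.items():
    units = math.ceil(target_mw / unit_mw)
    print(f"{label}: {units} unit(s) for {target_mw:,} MW")
```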
In practice, any organization attempting a 1.4 – 1.96 GW deployment — like xAI — will effectively become a major industrial energy buyer. For now, xAI’s Colossus produces power onsite and purchases power from the grid; therefore, it is likely that the company’s next data center will follow suit and combine a dedicated onsite plant with grid interconnections, according to Tom’s Hardware.
A data center housing a million AI accelerators with a dedicated power plant appears to be an extreme measure. But Patel pointed out that most leading AI companies are ultimately converging on similar strategies: concentrating enormous compute clusters, hiring top-tier researchers, and training ever-larger AI models. If xAI plans to stay ahead of the competition, it needs to build even more advanced and power-hungry data centers. But that also depends on whether Musk can secure massive energy supplies quickly.