Epoch AI
@epochai.bsky.social
We are a research institute investigating the trajectory of AI for the benefit of society.

epoch.ai
Much more in the post! epoch.ai/blog/what-y...

The post was written by @ansonwhho, @benmcottier, and @YafahEdelman.
What you need to know about AI data centers
AI companies are planning a buildout of data centers that will rank among the largest infrastructure projects in history. We examine their power demands, what makes AI data centers special, and what all this means for AI policy and the future of AI.
epoch.ai
November 10, 2025 at 6:03 PM
3) Building a >1 GW data center involves thousands of people across multiple parties, and its cooling infrastructure can be spotted in satellite imagery.

So unlike the Manhattan Project, AI data center buildouts are hard to keep secret from the rest of the world.

epoch.ai/data/data-c...
OpenAI Stargate Abilene - Frontier Data Centers Satellite Explorer
See how satellite imagery, permits, and public disclosures are used to track the power capacity and performance of frontier data centers.
epoch.ai
November 10, 2025 at 6:03 PM
2) AI data centers are growing fast enough to support 5×/year growth in frontier training compute.

So companies probably won’t need to decentralize AI training over the next 2 years.

However, in practice they might choose to anyway, e.g. to soak up excess power in the grid.
November 10, 2025 at 6:03 PM
3 implications for AI policy/impacts:

1) AI’s climate impact has been small so far.

For example, AI data centers use about 1% of US power, versus 8% for lighting and 12% for air conditioning.

But if trends continue, this could change in the next decade.
November 10, 2025 at 6:03 PM
This power runs IT equipment housed in “server racks”, each with a footprint of only about 0.5 m^2. Yet each rack draws enough power for 100 homes!

This means a huge amount of heat in a small space. So you can’t cool these chips with fans — you need liquid coolants to efficiently soak up the heat.
November 10, 2025 at 6:03 PM
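A quick back-of-envelope check of that density, using the post’s figures plus an assumed average household draw (the ~1.2 kW number is our assumption, not from the post):

```python
# Back-of-envelope power density for an AI server rack.
rack_area_m2 = 0.5        # rack footprint, from the post
homes_per_rack = 100      # "enough power for 100 homes", from the post
avg_home_kw = 1.2         # assumption: ~10,500 kWh/yr average US home

rack_kw = homes_per_rack * avg_home_kw       # ~120 kW per rack
density_kw_per_m2 = rack_kw / rack_area_m2   # ~240 kW/m^2

print(f"Rack power: {rack_kw:.0f} kW")
print(f"Power density: {density_kw_per_m2:.0f} kW/m^2")
# Air cooling tops out around a few tens of kW per rack,
# which is why these densities push operators to liquid cooling.
```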
Where does this power come from?

Usually a mix of on-site fossil fuel generation and interconnection to the grid.

E.g. Stargate Abilene will start off with on-site natural gas, then connect to the grid to access Texas’ abundant renewable power.
November 10, 2025 at 6:03 PM
So power is the core determinant of where AI data centers are built.

Other factors like latency matter surprisingly little — it takes >100× longer to generate a model response than to transmit data from Texas to Tokyo.

Even serving LLMs from the Moon may not be a big latency issue!
November 10, 2025 at 6:03 PM
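A rough sanity check of that claim. The route length, fiber speed, and generation speed below are illustrative assumptions, not figures from the post:

```python
# Compare network round-trip time to LLM generation time.
# Assumptions: ~11,000 km Texas-Tokyo fiber route, light in fiber
# at ~200,000 km/s, and a 500-token reply generated at 50 tokens/s.
fiber_km_per_s = 200_000
texas_tokyo_km = 11_000
moon_km = 384_400          # Earth-Moon distance; radio travels near c
c_km_per_s = 300_000

round_trip_s = 2 * texas_tokyo_km / fiber_km_per_s  # ~0.11 s
moon_round_trip_s = 2 * moon_km / c_km_per_s        # ~2.6 s
generation_s = 500 / 50                             # ~10 s

print(f"Texas-Tokyo round trip: {round_trip_s:.2f} s")
print(f"Earth-Moon round trip:  {moon_round_trip_s:.1f} s")
print(f"Generating the reply:   {generation_s:.0f} s")
# Generation dominates: roughly 100x the terrestrial round trip,
# and still several times the lunar one.
```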
Only a few countries have enough power to build many >1 GW data centers like Stargate

E.g. 30 GW is ~5% of the US’ power, ~2.5% of China’s, but ~90% of the UK’s

Other countries can build some frontier data centers and grow their power capacity — but they need more time/money
November 10, 2025 at 6:03 PM
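You can roughly reproduce those shares from annual electricity generation. The TWh figures below are rounded ballpark assumptions, not Epoch data:

```python
# Share of each country's average power that 30 GW of AI would take.
HOURS_PER_YEAR = 8760
generation_twh = {"US": 4300, "China": 9400, "UK": 290}  # rounded assumptions

for country, twh in generation_twh.items():
    avg_gw = twh * 1000 / HOURS_PER_YEAR   # TWh/yr -> average GW
    print(f"{country}: avg {avg_gw:>5.0f} GW, 30 GW is {30 / avg_gw:.0%}")
# Prints roughly 6% (US), 3% (China), 91% (UK) -- close to the
# post's ~5% / ~2.5% / ~90%, which use slightly different baselines.
```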
By the end of the year, AI data centers could collectively see >$300 billion in investment, around 1% of US GDP.

That’s bigger than the Apollo Program (0.8%) and Manhattan Project (0.4%) at their peaks.
November 10, 2025 at 6:03 PM
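For scale, the 1% figure is simple division against current US GDP (the ~$30T denominator is our assumption):

```python
# Investment as a share of GDP, for scale.
ai_capex_usd = 300e9    # ">$300 billion", from the post
us_gdp_usd = 30e12      # assumption: ~$30T US GDP in 2025
print(f"AI data center capex: {ai_capex_usd / us_gdp_usd:.1%} of GDP")
# ~1.0%, vs the Apollo Program's ~0.8% and the
# Manhattan Project's ~0.4% at their respective peaks.
```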
AI data centers will be some of the biggest infrastructure projects in history

e.g. OpenAI’s Stargate Abilene will need:

- As much power as Seattle (1 GW)

- >250× the compute of the GPT-4 cluster

- 450 soccer fields of land

- $32B

- Thousands of workers

- 2 years to build
November 10, 2025 at 6:03 PM
This insight was written by Venkat Somala and Ben Cottier

You can read more details on their methodology at the following link:
epoch.ai/data-insigh...
November 10, 2025 at 5:41 PM
This work uses our new Frontier Data Centers dataset! See more about our analysis and the data behind it on our website: epoch.ai/data/data-c...
Frontier AI Data Centers
Open database of AI data centers using satellite and permit data to show compute, power use, and construction timelines.
epoch.ai
November 10, 2025 at 5:41 PM
These rapid timelines are remarkable given the scale involved: 1 GW is enough to power roughly 1 million homes (ignoring swings in demand). The ability to operationalize such massive compute infrastructure in 1-2 years exemplifies the current speed and intensity of the AI industry.
November 10, 2025 at 5:40 PM
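That homes figure follows from average household consumption; here is the division, with the kWh/yr figure as our assumption:

```python
# How many homes 1 GW can supply, on average demand.
avg_home_kwh_per_year = 10_500               # assumption: average US home
avg_home_kw = avg_home_kwh_per_year / 8760   # ~1.2 kW continuous draw
homes = 1_000_000 / avg_home_kw              # 1 GW = 1,000,000 kW
print(f"1 GW covers ~{homes:,.0f} homes on average")
# ~834,000 -- "roughly 1 million", ignoring demand peaks.
```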
xAI’s Colossus 2 in Memphis has the fastest planned build-out, targeting just 12 months to reach gigawatt scale.

How? By reusing existing industrial shells and generating its own power early using gas turbines and batteries, before full grid connection.
November 10, 2025 at 5:40 PM
Based on planned timelines, we expect five AI data centers at a scale of 1 GW or more to come online in 2026. Each is operated by a different hyperscaler.
November 10, 2025 at 5:40 PM
AI data centers require massive amounts of power and permitting, yet timelines are short. Across the data center builds we’ve tracked, the time from starting construction to reaching 1 GW of facility power capacity typically ranges from 1 to 4 years.
November 10, 2025 at 5:40 PM
We’re excited to do more work tying ECI to concrete metrics. For now, check out details about this insight on our website: epoch.ai/data-insigh...
Epoch’s Capabilities Index stitches together benchmarks across a wide range of difficulties
We show how scores can be interpreted by mapping them back to expected benchmark performance.
epoch.ai
November 7, 2025 at 7:13 PM
3. We’re quite uncertain about the difficulty of some benchmarks.

No model has scored more than 29% on FrontierMath, so saturating it definitely requires more than the current SOTA ECI of 150… but will it take an ECI of 175? 200? For now, it’s hard to be sure.
November 7, 2025 at 7:13 PM
2. While a model with a score of 140 is expected to get 45% on SWE-Bench Verified, this is just an expectation. Individual models perform better or worse on specific tasks.

For instance, GPT-5 underperforms on GPQA Diamond but overperforms on VPCT.
November 7, 2025 at 7:13 PM
Three important takeaways:

1. Benchmarks vary in overall difficulty, and in slope. Steeper slopes imply a narrower range of difficulties at the question level and mean the benchmark saturates quickly once some progress is made.
November 7, 2025 at 7:13 PM
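One way to picture difficulty and slope: treat each benchmark’s expected score as a logistic function of ECI. This is an illustrative functional form with invented parameters, not Epoch’s actual fit:

```python
import math

# Illustrative only: expected benchmark score as a logistic in ECI.
# 'difficulty' is the ECI at which the expected score is 50%;
# 'slope' controls how fast the benchmark saturates past that point.
def expected_score(eci, difficulty, slope):
    return 1 / (1 + math.exp(-slope * (eci - difficulty)))

# Hypothetical parameters for two benchmarks:
easy_broad  = {"difficulty": 120, "slope": 0.05}  # wide question spread
hard_narrow = {"difficulty": 160, "slope": 0.20}  # narrow, saturates fast

for eci in (120, 140, 160, 180):
    a = expected_score(eci, **easy_broad)
    b = expected_score(eci, **hard_narrow)
    print(f"ECI {eci}: broad {a:.0%}, narrow {b:.0%}")
# The steep-slope benchmark jumps from near 0% to near 100% over a
# small ECI range -- it "saturates quickly once some progress is made".
```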
Original reporting of the Anthropic projection by The Information.

You can find the article here: www.theinformation.com/articles/ant...?
Anthropic Projects $70 Billion in Revenue, $17 Billion in Cash Flow in 2028
Anthropic this summer hiked its most optimistic growth forecasts by roughly 13% to 28% over the next three years and projected generating as much as $70 billion in revenue in 2028, up from close to $5...
www.theinformation.com
November 5, 2025 at 10:22 PM