Power Struggle: A Look at Data Centers and AI Growth in 2025

by Nick Shah
January 30, 2025
AI data centers: Powering the future.

“DeepSeek” is truly a magic word in tech this week.  

The Guardian’s Tuesday newsletter led with: “How an unknown Chinese startup wiped $593bn from the value of an AI giant.”   

This “flipping of the AI script” (The New York Times), “reframing [of] the AI policy debate,” or disrupting “American plans for AI dominance” (The Washington Post) has been everywhere since DeepSeek’s products took center stage, reportedly performing on par with OpenAI’s offerings at a stunningly reduced cost.  

The model was reportedly trained on some 2,000 less-advanced Nvidia chips for roughly $6 million (compare OpenAI’s $100 million-plus training costs for GPT-4), all in under two months, a truly stunning accomplishment.  

Oh yes, and it’s open source.  

Per PitchBook, venture capital firms invested $155 billion in AI startups between 2023 and 2024.  

OpenAI reached a valuation of nearly $160 billion and Anthropic $20 billion. Nvidia’s own staggering transformation has been well reported; at points last year it was the world’s single most valuable company.  

And while US AI tech stocks had begun recovering from their Monday beating at the time of this writing (with earnings due for several after the bell Wednesday), Nvidia seemed unconcerned, calling DeepSeek’s R1 “an excellent AI advancement.”  

Just last week it was the Stargate project dominating tech headlines: a joint venture between OpenAI, Oracle, and Japan’s SoftBank that will eventually total $500 billion, with added partnership from Microsoft, Nvidia, and Arm Holdings. President Trump said the investment in US AI infrastructure would create more than 100,000 US jobs.  

Much of that money is aimed at data centers, and as the MIT Technology Review pointed out in its own great article on the topic (see below), that amount would exceed the inflation-adjusted cost of the Apollo space program and come in just barely under the cost of building the entire US highway system (over 30 years, also adjusted for inflation). 

There’s a lot to unpack here, so in today’s newsletter I want to do just that, considering the need for high-powered data centers for AI (à la the space race and American highways), the reported energy efficiency of these data centers, and how this may affect AI use by businesses during the coming year.  

The Data Center Dilemma: Power, Location, and Capacity Constraints 

What projects like Stargate and Meta’s own planned spending increase (an estimated $60–65 billion in 2025, much of it earmarked for data centers) are seeking to avoid are limits on AI development and capacity imposed by data center infrastructure.  

Even early last year, per CBRE and the Uptime Institute, the hyperscalers (Amazon, Microsoft, Google) were preleasing data center space 24–36 months out, construction in primary markets was up some 25%, and data center power usage was expected to double by 2030.  

I see this demand only intensifying.  

Nvidia is the industry standard in AI chips (and infrastructure) for good reason, but per Morgan Stanley, its entire 2025 slate of the highly touted Blackwell chips was sold out by November of last year.

Data Center Power Consumption Outpaces Supply 

AI, as we’ve come to understand it, is still very power-hungry.  

Estimates vary, but researchers calculate that training GPT-3 required as much power as some 130 US homes consume in a year.  

Image generation is particularly intensive: producing a single picture reportedly draws as much power as fully charging a smartphone. 

Estimates of AI power demands going forward vary wildly, but the conventional wisdom has been that we haven’t hit the limits yet, so more power means more capacity. And the Department of Energy, per Axios, estimated in late 2024 that AI data center needs could account for 6.7 to 12% of all US electricity by 2028.  

From all of this, it’s clear why DeepSeek’s open-source offerings are being hurriedly reverse-engineered, if only to assess their reportedly vast improvements in operating efficiency. 

And as it stands, next-gen AI workloads are increasing data center power needs exponentially. Nvidia’s new rack systems supposedly demand some 130 kW per rack, growing to an estimated 300 kW per rack by 2026.  
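
To put those rack figures in rough perspective, here is a minimal back-of-the-envelope calculation. It assumes continuous (24/7) draw at the cited rack power and an average US household consumption of roughly 10,800 kWh per year, a ballpark I am supplying for comparison rather than a figure from the sources above.

```python
# Rough scale check for the rack power figures cited above.
# Assumptions: continuous (24/7) draw, and ~10,800 kWh/year for an
# average US home, a ballpark supplied here only for comparison.
HOURS_PER_YEAR = 24 * 365            # 8,760 hours
US_HOME_KWH_PER_YEAR = 10_800        # assumed average household consumption

for rack_kw in (130, 300):
    annual_kwh = rack_kw * HOURS_PER_YEAR
    homes = annual_kwh / US_HOME_KWH_PER_YEAR
    print(f"{rack_kw} kW rack ~ {annual_kwh:,.0f} kWh/year ~ {homes:.0f} US homes")

# 130 kW rack ~ 1,138,800 kWh/year ~ 105 US homes
# 300 kW rack ~ 2,628,000 kWh/year ~ 243 US homes
```

In other words, a single next-generation rack running around the clock would draw on the order of what a hundred or more households use in a year, which is why power availability has become a siting question in its own right.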

This is why the big tech players have also been making energy plays, missing their emissions targets, and targeting choice land for new data centers. (Land rushes like this saw a Texas land company, TPL, gain 230% in value as of late November.) 

But why does all this matter to businesses that aren’t directly involved? 

I expect some of these expansions to crash into the near-term limitations that come with fast-tracking so many physical facilities: legitimate public concern over utility burdens and pollution, grid failures, and water supply issues for cooling, even as regional governments clamor for the investment. 

And this short-term growth is unsustainable in many areas even where the land, water, and raw power resources may be readily available, as many power grids are already under strain. 

AI systems themselves are already helping with data center energy efficiency, for example by adjusting draw by season and building smarter awareness of overall need, but these benefits may pale in comparison to the near-term challenges. 

This collision will likely increase costs, making alternatives like the solutions found by DeepSeek and smaller-scale models increasingly appealing to businesses desperate for on-prem answers or more sustainable data centers.  

AI Compute Supply Chains and Logistical Issues 

Even if you have the data center space you need, there are other physical restrictions on growth expected this year.  

Chip shortages were the bottleneck story even before data centers, and they remain a significant issue for cloud providers.  

Nvidia’s AI market share is generally reported between 70–95%, despite a surge of investment in rivals, and even the supply of AI-specialized silicon is constrained.  

Delays are also reported for other infrastructure needs, such as power components and cooling systems, and as you’d expect, prices are inflated as a result.  

This can also translate into a very long line for companies wholly dependent on the latest public cloud resources.  

Cloud Computing and AI Scalability 

Speaking of the cloud, the pre-DeepSeek training costs we’ve seen for large models have made it infeasible for most organizations to train their own, leading them to rely on cloud providers.  

It’s only reasonable to expect this to continue. As the Uptime Institute noted in its 2025 cloud predictions, all of these issues will see most companies working from pre-trained large models instead of training their own.  

These demands also make it likely that on-premises AI systems will remain rare for many use cases, forcing reliance on hyperscaler cloud services.  

As noted by Uptime, 71% of enterprises already expect budgets to increase in 2025 to deal with the rapidly changing situation, but many organizations will still lack the ability to invest in their own, dedicated AI infrastructure.  

Overreliance on Hyperscalers? 

I’ve perhaps already stated this, but I believe the massive power demands, hard-to-get physical technology, incredible complexity, and dominated market shares all point to rising prices and a concentration of use within a limited number of providers.  

Some two-thirds of all rentable cloud space belongs to just three entities (Amazon AWS, Microsoft Azure, and Alphabet/Google), with two of them controlling more than half the market in 2024 (AWS 32%, Azure 23%).  

It’s no coincidence that these are also among the leading investors in AI, but this concentration, facing the kinds of limitations discussed above, means an incredibly high amount of pressure acting on the system.  

Regional data center constraints and prioritization by scale may mean mid-size and smaller companies get left out, unable to run AI workloads when they need to.  

I expect AI and cloud resource management costs to become prohibitive for more vendors before many of these issues can be resolved. 

Last time out I wrote about an AI implementation in healthcare services meant to alleviate scale and quality problems with claims handling. As discussed there, the best AI solutions were already not affordable at scale for the company, forcing them to use varying approaches and get creative to get the work done efficiently.   

AI ROI Strategies 2025: Flexibility and Alternative Approaches  

Generative AI is not all AI offers, just as the large models are not the whole game. In fact, for highly specific uses, far smaller models (1B to 3B parameters, the kind that can run on the machine I’m typing this on), once fine-tuned, can show up to 60% better accuracy on many tasks while also allowing greater control of your own data.  
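
To make that concrete, here is a minimal sketch of running a small open model locally for one narrow task. The model name, prompt, and task are illustrative assumptions rather than recommendations; any similarly sized (1B–3B parameter) open model, ideally fine-tuned on your own data, would fill the same role.

```python
# Minimal sketch: local inference with a small (~1B-parameter) open model.
# The model name and prompt are illustrative assumptions; in practice you
# would fine-tune a model of this size (e.g., with LoRA) on your own data.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # any small open model you have access to
    device_map="auto",                          # runs on CPU or a single consumer GPU
)

prompt = (
    "Classify this support ticket as billing, technical, or other:\n"
    "'I was charged twice this month.'"
)
result = generator(prompt, max_new_tokens=20)
print(result[0]["generated_text"])
```

The point is less the specific library than the footprint: a task like this never has to leave your own hardware, and the data never has to leave your control.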

What DeepSeek Points To 

Again, we need to look more closely at DeepSeek and the numbers we’re seeing, but I believe there can be no doubt it represents truly impressive innovation. And by being open source, it extends that capability far and wide.  

The US government, across administrations, has made leading the world in AI (and staying ahead of China in particular) a key goal, expressed in the 2022 CHIPS Act, partnerships between big tech firms and the government, and export controls limiting China’s access to the newest Nvidia AI infrastructure offerings. 

And while DeepSeek seems to point to the failure of this effort, as the MIT Technology Review discusses, such export controls take time (that is, it’s still too early to tell).  

[For more on these restrictions, take a look at our PTP Report on chip diplomacy.] 

But DeepSeek’s efficiency sends a shot across the bow of the big AI companies that have kept their own reasoning models close to the vest. Here are some of the claims: 

  • Estimates put the cost of building it at as little as 1/1000th that of OpenAI’s largest models 
  • Cheaper to run as well as to create: I’ve seen R1, the DeepSeek reasoning model, priced at $0.55 per million input tokens and $2.19 per million output tokens, versus o1’s $15 per million in and $60 per million out (see the quick cost sketch after this list) 
  • Uses fewer, and lower-end, Nvidia GPUs 
  • Discloses its reasoning steps, and its search feature cites sources 
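
Those per-token prices lend themselves to a quick back-of-the-envelope comparison. The sketch below uses the rates listed above with a hypothetical monthly workload; the 50M input / 10M output token volumes are assumptions chosen only for illustration.

```python
# Back-of-the-envelope API cost comparison using the per-million-token
# prices cited above. The monthly token volumes are hypothetical.
PRICES = {                       # (input, output) in $ per 1M tokens
    "DeepSeek R1": (0.55, 2.19),
    "OpenAI o1":   (15.00, 60.00),
}

def monthly_cost(model: str, input_tokens: float, output_tokens: float) -> float:
    price_in, price_out = PRICES[model]
    return (input_tokens / 1e6) * price_in + (output_tokens / 1e6) * price_out

for model in PRICES:
    cost = monthly_cost(model, input_tokens=50e6, output_tokens=10e6)
    print(f"{model}: ${cost:,.2f} per month")

# DeepSeek R1: $49.40 per month
# OpenAI o1:   $1,350.00 per month
```

At those reported rates, the same workload differs in cost by more than an order of magnitude, which is exactly why the claims are drawing such scrutiny.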

Depending on how seriously you take the stakes, such a release could be considered a Sputnik moment (invoking the space race, as Marc Andreessen termed it on X) or the enlisting of Werner Heisenberg (invoking the Manhattan Project).  

But despite all this, AI researchers remain adamant about the need for new, heavily powered data center infrastructure. OpenAI’s Noam Brown posted on X that adding more compute will only make DeepSeek, too, more powerful.  

That is, we’re not at the limits yet.  

Still, for questions of affordability and control of the market, there’s no doubt that DeepSeek could be a tremendous game-changer. 

Sustainable Data Center Solutions as a Longer-Term Fix 

I posted about a Wired article on LinkedIn that profiled DNA storage: a solution at least 1000 times more compact than solid-state, made from our own building blocks.  

(They’ve already encoded all of Shakespeare’s 154 sonnets, 52 pages of Mozart’s music, and an episode of the Netflix show “Biohackers” into tiny DNA capsules).  

What’s truly remarkable here is that these capsules don’t even need power when they’re not in use. Unlike servers that are constantly chewing up energy, DNA storage can sit without power for years at a time. 

And while it’s still in the experimental stage and may not ever be the method of choice for high-access data, it’s just one example of a multitude of solutions in development now that can relieve this power issue.  

Nuclear power for data centers is itself an entire article, with Amazon and Google looking to small modular reactors (SMRs) and Microsoft invested in getting Three Mile Island back up and running. Meta, too, is in the nuclear game.  

And while these solutions could be a significant move toward providing the needed power, their deployment is complex, and all require several years to become operational due to regulatory approvals and construction timelines. 

The Critical Role of Expertise 

We are all witnessing a truly radical technological transformation, one that is of course reshaping the entire world with it.  

Navigating this takes more than just waiting for the early adopters to clear the path and work out the kinks, then searching out the best-of-breed solution, buying in, training, and putting it online. This shift requires fully flexible AI deployment strategies, starting ASAP, and that means expertise.  

No matter what your use cases are, no matter what the scale, expect to be pairing AI experts with subject matter experts, and expect to need prior implementation experience.  

This is where PTP can help. We’ve been investing in ML and AI talent since the beginning, and I’m proud of the pipeline we’ve built, including machine learning and AI specialists, onshore, nearshore, and offshore, as well as technology specialists from all across the spectrum.  

Conclusion: Adaptability Wins in 2025 

Is this another Manhattan Project we’re witnessing? Adjusted for inflation, that monumental undertaking only cost $30 billion. And it built an entire town in the desert. 

Against the backdrop of today’s enormous AI investments, that looks almost tiny. But the two have government support in common, along with a sense of urgency driven by national security.  

I’d planned this article even before DeepSeek exploded onto the scene with its revelation of training costs and operating efficiency. But DeepSeek, for all its staggering achievements, is also sending all of its data back to China, where the government has a right of access (the very concern that has kept TikTok at the center of our news for years).  

Nevertheless, I fully expect the ongoing data center surge to draw public opposition in places, as we see resources strained to the breaking point and environmental commitments dropped.  

I believe we’ll see an even greater move to the cloud, and a squeeze in the transitional period that will be felt most by smaller and mid-size companies, and especially those who are the slowest-moving.   

As we’re already seeing with DeepSeek, expect this pinch to also draw companies away from the extreme concentration of the hyperscalers (by necessity), as more alternatives to Nvidia also come online.  

Agentic solutions will help us make use of a wider variety of options, including smaller, more cost-effective models, which may allow for a true blending of AI solutions. 

As DeepSeek has shown us, the stage is set for creativity and innovation. 

References 

AI’s energy obsession just got a reality check, MIT Technology Review 

Five data center predictions for 2025, Uptime Intelligence 

Meta to Increase Spending to $65 Billion This Year in A.I. Push, The New York Times 

Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model, arXiv:2211.02001v1 [cs.LG] 

DeepSeek shakes up the energy-AI equation, Axios 

An Entire Book Was Written in DNA—and You Can Buy It for $60, Wired
