Breaking the bottleneck: Why AI demands an SSD-first future

by Manfred Groitl
October 13, 2025

Presented by Solidigm


As AI adoption surges, data centers face a critical bottleneck in storage — and traditional HDDs are at the center of it. Data that once sat idle as cold archives is now being pulled into frequent use to build more accurate models and deliver better inference results. This shift from cold data to warm data demands low-latency, high-throughput storage that can handle parallel computations. HDDs will remain the workhorse for low-cost cold storage, but without rethinking their role, the high-capacity storage layer risks becoming the weakest link in the AI factory.

“Modern AI workloads, combined with data center constraints, have created new challenges for HDDs,” says Jeff Janukowicz, research vice president at IDC. “While HDD suppliers are addressing data storage growth by offering larger drives, this often comes at the expense of slower performance. As a result, the concept of ‘nearline SSDs’ is becoming an increasingly relevant topic of discussion within the industry.”

Today, AI operators need to maximize GPU utilization, manage network-attached storage efficiently, and scale compute — all while cutting costs on increasingly scarce power and space. In an environment where every watt and every square inch counts, says Roger Corell, senior director of AI and leadership marketing at Solidigm, success requires more than a technical refresh. It calls for a deeper realignment.

“It speaks to the tectonic shift in the value of data for AI,” Corell says. “That’s where high-capacity SSDs come into play. Along with capacity, they bring performance and efficiency — enabling exabyte-scale storage pipelines to keep pace with the relentless pace of data set size. All of that consumes power and space, so we need to do it as efficiently as possible to enable more GPU scale in this constrained environment.”

High-capacity SSDs aren’t just displacing HDDs — they’re removing one of the biggest bottlenecks on the AI factory floor. By delivering massive gains in performance, efficiency, and density, SSDs free up the power and space needed to push GPU scale further. It’s less a storage upgrade than a structural shift in how data infrastructure is designed for the AI era.

HDDs vs. SSDs: More than just a hardware refresh

HDDs have impressive mechanical designs, but they’re made up of many moving parts that at scale use more energy, take up more space, and fail at a higher rate than solid state drives. The reliance on spinning platters and mechanical read/write heads inherently limits Input/Output Operations Per Second (IOPS), creating bottlenecks for AI workloads that demand low latency, high concurrency, and sustained throughput.
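
To make that limit concrete, here is a rough, back-of-the-envelope comparison of random IOPS per terabyte; the per-drive figures are illustrative assumptions (a mechanical HDD tops out at a few hundred random IOPS regardless of capacity, while a data center NVMe SSD can sustain hundreds of thousands), not published specifications for any particular product.

```python
# Back-of-the-envelope IOPS-per-terabyte comparison.
# All per-drive figures are illustrative assumptions, not vendor specifications.

hdd_capacity_tb = 30        # assumed high-capacity nearline HDD
hdd_random_iops = 200       # mechanical seeks cap random IOPS at a few hundred

ssd_capacity_tb = 122       # assumed high-capacity QLC SSD
ssd_random_iops = 500_000   # NVMe SSDs sustain hundreds of thousands of random reads

hdd_iops_per_tb = hdd_random_iops / hdd_capacity_tb   # ~6.7 IOPS per TB
ssd_iops_per_tb = ssd_random_iops / ssd_capacity_tb   # ~4,100 IOPS per TB

print(f"HDD: {hdd_iops_per_tb:.1f} IOPS/TB, SSD: {ssd_iops_per_tb:.0f} IOPS/TB")
print(f"SSD advantage per terabyte: ~{ssd_iops_per_tb / hdd_iops_per_tb:.0f}x")
```

The exact ratio depends on the drives compared, but the orders of magnitude explain why capacity-optimized HDDs become the chokepoint once data turns warm.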

HDDs also struggle with latency-sensitive tasks, as the physical act of seeking data introduces mechanical delays unsuited for real-time AI inference and training. Moreover, their power and cooling requirements increase significantly under frequent and intensive data access, reducing efficiency as data scales and warms.

In contrast, the SSD-based VAST storage solution reduces energy costs by roughly $1M a year, and in an AI environment where every watt matters, that is a huge advantage for SSDs. To demonstrate, Solidigm and VAST Data completed a study examining the economics of data storage at exabyte scale (a quintillion bytes, or a billion gigabytes), analyzing storage power consumption versus HDDs over a 10-year period.

As a starting reference point, you’d need four 30TB HDDs to equal the capacity of a single 122TB Solidigm SSD. After factoring in VAST’s data reduction techniques made possible by the superior performance of SSDs, the exabyte solution comprises 3,738 Solidigm SSDs vs over 40,000 high-capacity HDDs. The study found that the SSD-based VAST solution consumes 77% less storage energy.
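
The drive-count arithmetic behind those figures can be sketched as follows; the 2.2:1 data-reduction ratio is an assumption inferred from the quoted 3,738-drive result, not a number taken from the study itself.

```python
# Rough reconstruction of the exabyte-scale drive-count comparison.
# The data-reduction ratio below is an assumption inferred from the quoted
# 3,738-drive figure; the study's actual methodology may differ.

EXABYTE_TB = 1_000_000      # 1 EB = one million terabytes = a billion gigabytes

ssd_capacity_tb = 122
hdd_capacity_tb = 30
assumed_data_reduction = 2.2   # assumed ratio enabled by SSD performance

raw_ssd_drives = EXABYTE_TB / ssd_capacity_tb            # ~8,197 drives raw
ssd_drives = raw_ssd_drives / assumed_data_reduction     # ~3,726, close to 3,738
hdd_drives = EXABYTE_TB / hdd_capacity_tb                # ~33,333 before redundancy

print(f"Per-drive capacity ratio: {ssd_capacity_tb / hdd_capacity_tb:.1f}x")  # ~4x
print(f"SSDs needed: ~{ssd_drives:,.0f}")
print(f"HDDs needed: ~{hdd_drives:,.0f} (over 40,000 once redundancy overhead is added)")
```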

Minimizing data center footprints

“We’re shipping 122-terabyte drives to some of the top OEMs and leading AI cloud service providers in the world,” Corell says. “When you compare an all-122TB SSD configuration to a hybrid HDD + TLC SSD configuration, they’re getting a nine-to-one savings in data center footprint. And yes, it’s important in these massive data centers that are building their own nuclear reactors and signing hefty power purchase agreements with renewable energy providers, but it’s increasingly important as you get to the regional data centers, the local data centers, and all the way out to your edge deployments where space can come at a premium.”

That nine-to-one savings goes beyond space and power — it lets organizations fit infrastructure into previously unavailable spaces, expand GPU scale, or build smaller footprints.
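
As a purely hypothetical illustration of what nine-to-one means in rack terms, the sketch below combines the drive counts quoted above with assumed packing densities; the drives-per-rack numbers are invented for illustration and do not come from Solidigm or any specific deployment.

```python
# Hypothetical rack-footprint comparison for roughly 1 EB of warm storage.
# Drive counts follow the study figures quoted above; the drives-per-rack
# densities are illustrative assumptions, not measured values.

ssd_drives = 3_738          # all-122TB SSD solution
hdd_drives = 40_000         # hybrid HDD-based solution

ssds_per_rack = 500         # assumed dense EDSFF-style packing
hdds_per_rack = 600         # assumed dense 3.5-inch JBOD packing

ssd_racks = ssd_drives / ssds_per_rack   # ~7.5 racks
hdd_racks = hdd_drives / hdds_per_rack   # ~67 racks

print(f"SSD racks: ~{ssd_racks:.0f}, HDD racks: ~{hdd_racks:.0f}")
print(f"Footprint ratio: ~{hdd_racks / ssd_racks:.0f}:1")   # roughly the quoted 9:1
```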

“If you’re given X amount of land and Y amount of power, you’re going to use it. This is AI,” Corell explains, “where every watt and square inch counts, so why not use it in the most efficient way? Get the most efficient storage possible on the planet and enable greater GPU scale within that envelope that you have to fit in. On an ongoing basis, it’s going to save you operational cost as well. You have 90 percent fewer storage bays to maintain, and the cost associated with that is gone.”

Another often-overlooked element: the (much) larger physical footprint of data stored on mechanical HDDs results in a greater construction-materials footprint. Collectively, concrete and steel production accounts for over 15% of global greenhouse gas emissions. By reducing the physical footprint of storage, high-capacity SSDs can help cut embodied concrete- and steel-related emissions by more than 80% compared to HDDs. And in the last phase of the sustainability life cycle, drive end-of-life, there will be 90% fewer drives to disposition.

Reshaping cold and archival storage strategies

The move to SSD isn’t just a storage upgrade; it’s a fundamental realignment of data infrastructure strategy in the AI era, and it’s picking up speed.

“Big hyperscalers are looking to wring the most out of their existing infrastructure, doing unnatural acts, if you will, with HDDs like overprovisioning them to near 90% to try to wring out as many IOPS per terabyte as possible, but they’re beginning to come around,” Corell says. “Once they turn to a modern all high-capacity storage infrastructure, the industry at large will be on that trajectory. Plus, we’re starting to see these lessons learned on the value of modern storage in AI applied to other segments as well, such as big data analytics, HPC, and many more.”

While all-flash solutions are being embraced almost universally, there will always be a place for HDDs, he adds. HDDs will persist in usages like archival, cold storage, and scenarios where pure cost per gigabyte concerns outweigh the need for real-time access. But as the token economy heats up and enterprises realize value in monetizing data, the warm and warming data segments will continue to grow.

Solving power challenges of the future

Now in its 4th generation, with more than 122 cumulative exabytes shipped to date, Solidigm’s QLC (Quad-Level Cell) technology has led the industry in balancing higher drive capacities with cost efficiency.

“We don’t think of storage as just storing bits and bytes. We think about how we can develop these amazing drives that are able to deliver benefits at a solution level,” Corell says. “The shining star of that is our recently launched E1.S, designed specifically for dense and efficient storage in direct-attach storage configurations for the next-generation fanless GPU server.”

The Solidigm D7-PS1010 E1.S is a breakthrough, the industry’s first eSSD with single-sided direct-to-chip liquid cooling technology. Solidigm worked with NVIDIA to address the dual challenges of heat management and cost efficiency, while delivering the high performance required for demanding AI workloads.

“We’re rapidly moving to an environment where all critical IT components will be direct-to-chip liquid-cooled on the direct attach side,” he says. “I think the market needs to be looking at their approach to cooling, because power limitations, power challenges are not going to abate in my lifetime, at least. They need to be applying a neocloud mindset to how they’re architecting the most efficient infrastructure.”

Increasingly complex inference is pushing against a memory wall, which makes storage architecture a front-line design challenge, not an afterthought. High-capacity SSDs, paired with liquid cooling and efficient design, are emerging as the only path to meet AI’s escalating demands. The mandate now is to build infrastructure not just for efficiency, but for storage that can efficiently scale as data grows. The organizations that realign storage now will be the ones able to scale AI tomorrow.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.
