Solidigm confirms 245+ TB SSDs set to launch before end of 2026

(Image credit: Solidigm)

The storage industry is at a tipping point. For years, hard disks ruled high-capacity workloads, but shifting economics around power, space, and performance are tilting the scales heavily toward SSDs.

Few companies illustrate that shift as clearly as Solidigm. I recently spoke with Roger Corell, the firm's senior director of AI and leadership marketing, who explained why he believes QLC-based SSDs are poised to replace HDDs in AI and data-intensive environments.

Corell highlighted Solidigm's progress to date and revealed that its roadmap extends beyond 245TB, with next-generation drives targeted for the end of 2026.

Solidigm's message goes beyond sheer scale, however. Corell pointed to the company's innovations in rack-level design, thermal efficiency, and edge-ready architectures as key to meeting AI's growing demand for low-latency, high-concurrency storage.

Corell sees HDDs relegated to archival roles. The future, he argues, belongs to NAND - stacked denser, deployed at scale, and designed for performance, efficiency, and sustainability.

  • Four SSD vendors have confirmed that they will launch SSDs bigger than 245TB in the near future: SanDisk, Samsung, Kioxia, and Micron. Is Solidigm going to launch such an SSD in 2025?

Solidigm is already leading the industry in SSD capacity with the D5-P5336, which delivers up to 122.88TB, more than double the previous generation, and is shipping to customers today.

In fact, we recently passed the milestone of having shipped more than 120 exabytes (EB) of QLC SSD capacity across our four generations of QLC technology, while others in the industry are just getting started.

Additionally, while we don’t pre-announce unlaunched products, our roadmap is focused on pushing the boundaries of QLC density, thermal efficiency, and rack-scale integration to meet the demands of AI and data-centric workloads.

We’re actively exploring next-gen capacities, form factors, and system-level innovations that will redefine how storage is deployed at scale. This includes our plans to ship 245+TB drives by the end of 2026.

We like to say: we’re not just scaling capacity; we’re redefining what’s possible in enterprise storage.

We also work closely with our customers to ensure our solutions are tailored to their unique workloads, especially in AI and hyperscale environments.

  • I am surprised that QLC is rapidly becoming the dominant NAND flavor when quite a few experts predicted that PLC would replace it sooner rather than later. What has changed since these predictions were made?

QLC has matured into a proven, high-density solution that delivers exceptional value in real-world deployments. Solidigm’s leadership in QLC is built on delivering SSDs that meet the performance, endurance, and efficiency needs of AI, big data, and cloud environments.

PLC remains an interesting concept, but it faces major technical hurdles in endurance and reliability that QLC has already overcome.

The industry has shifted from theoretical potential to practical impact, and QLC is delivering. We are focused on maintaining a leadership portfolio based on QLC NAND.

  • Solidigm and other SSD vendors are adamant that high-capacity SSDs should (and will in time) displace hard disk drives for AI workloads. What’s the argument for this?

AI workloads, and other data-intensive workloads needing fast access to large data sets, demand high throughput, low latency, and energy efficiency, all areas where SSDs, especially Solidigm’s QLC-based drives, have a clear advantage.

Our D5-P5336, for example, enables storing 10PB of data using just 84 SSDs, compared to more than 400 HDDs. That math speaks for itself.

That’s a dramatic reduction in power, cooling, and rack space, which translates directly into lower TCO and higher performance.
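To put rough numbers behind that comparison, here is a minimal back-of-the-envelope sketch in Python. It uses the 122.88TB D5-P5336 figure quoted above and, purely for illustration, assumes 24TB nearline HDDs and decimal accounting (1PB = 1,000TB); neither assumption is a figure Solidigm provided.

```python
import math

# Back-of-the-envelope drive-count comparison for a 10PB pool.
# Assumptions: decimal units (1PB = 1,000TB), 24TB nearline HDDs.
TARGET_TB = 10 * 1000          # 10PB expressed in TB
SSD_TB = 122.88                # Solidigm D5-P5336 top capacity
HDD_TB = 24                    # assumed HDD capacity, for illustration only

ssd_count = math.ceil(TARGET_TB / SSD_TB)
hdd_count = math.ceil(TARGET_TB / HDD_TB)

print(f"10PB on {SSD_TB}TB SSDs: ~{ssd_count} drives")  # ~82 (Solidigm quotes 84)
print(f"10PB on {HDD_TB}TB HDDs: ~{hdd_count} drives")  # ~417, i.e. 'more than 400'
```

The raw count lands close to Solidigm's 84-drive figure; the small gap presumably reflects formatting or redundancy overhead in their configuration.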

SSDs are no longer just faster; they’re smarter, denser, and more sustainable. While HDDs still have a place in the data stack, SSDs are more effective at meeting the latency, throughput, and concurrency demands of AI.

  • So, does that mean that you no longer see HDDs playing a role in AI inference?

For AI inference, which requires rapid data access and consistent performance, SSDs are quickly becoming the standard. HDDs may still serve in archival or cold storage roles for pools of data that are less frequently tapped.

Solidigm’s portfolio is optimized for every stage of the AI data pipeline, from ingest to inference, with products like the D7-PS1010 delivering up to 70% better IOPS per watt than leading competitors.

AI doesn’t wait, and inference workloads don’t tolerate lag - and neither should the infrastructure supporting them. Furthermore, the real growth in AI inference will be at the edge, where constraints such as power, space, cooling, and even weight will pose even more challenges for HDDs.

One more point on HDDs in general: modern workloads combined with data center constraints have rapidly exposed HDD limitations.

HDD suppliers are addressing data storage growth with larger drives, but this comes at the expense of slowing performance. Therefore, what some in the industry are calling nearline SSDs is a very relevant conversation to be having.

  • Will we ever see large-capacity SSDs for prosumers or end users (i.e., M.2 format), or will such products remain exclusive to enterprise and hyperscalers?

Solidigm has strategically exited the consumer SSD market to focus on enterprise and hyperscaler needs, but we continue to support M.2 form factors for enterprise deployments.

While ultra-high-capacity SSDs are currently tailored for data centers, the innovations driving them - such as QLC density, thermal optimization, and rack-scale efficiency - will eventually influence broader markets.

We’re building for the data center, but the innovation benefits everyone.

  • Last question. SSD capacity doubles roughly every 18 months. So, the next step will be 512TB and possibly 1PB by 2028. Would the obvious move be rack-level SSDs? 1U or 4U populated with NAND chips and controllers only?

Absolutely. Solidigm is already enabling rack-scale innovation with high-density SSDs and new designs incorporating liquid cooling and advanced thermal management.

As AI workloads grow in complexity and scale, the need for integrated, rack-level NAND solutions becomes more urgent.

Our approach is not just about filling racks with drives; it’s about rethinking how to deliver the performance, efficiency, and sustainability demanded at scale.
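A quick sketch of the doubling math behind that last question: assuming an 18-month doubling cadence starting from today's 122.88TB flagship. Both the cadence and the start point are illustrative assumptions drawn from the question, not a Solidigm roadmap commitment.

```python
# Project SSD capacity under an assumed 18-month doubling cadence,
# starting from a 122.88TB drive in 2025 (illustrative only).
START_YEAR, START_TB, DOUBLING_YEARS = 2025, 122.88, 1.5

for year in (2026, 2028, 2029):
    capacity_tb = START_TB * 2 ** ((year - START_YEAR) / DOUBLING_YEARS)
    print(f"{year}: ~{capacity_tb:,.0f} TB")

# Output: 2026: ~195 TB, 2028: ~492 TB, 2029: ~780 TB - broadly consistent
# with 245+TB drives around 2026 and the 512TB-to-1PB range toward the end
# of the decade.
```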

Desire Athow
Managing Editor, TechRadar Pro

Désiré has been musing and writing about technology during a career spanning four decades. He dabbled in website builders and web hosting when DHTML and frames were in vogue, and started covering the impact of technology on society just before the start of the Y2K hysteria at the turn of the last millennium.

