PC Gamer
Nick Evanson

Dell's CEO reckons that the total memory demand from the entire AI market in 2028 will be 625x bigger than it was in 2022

A processed photo of a data center server room, showing racks of computers lit by overhead lights, reflecting off the ground.

Today is a Wednesday. I mention that because when it comes to news about the global memory market, you might just think that every day is now a Woesday instead. Enter Dell's CEO into the fray with an insight as to how things are going to fare over the next few years, and it's going to be worse: 625 times worse.

That's according to IT Home and Jukan on X, who claim that at a Bank of America event, Michael Dell said that "As memory per accelerator and system scale expand simultaneously in AI infrastructure, a structure is forming where total memory demand increases by approximately 625 times" (machine translation).

Dell arrives at this figure by noting that the most popular AI accelerator in 2022, Nvidia's H100, sported 80 GB of HBM3 (High Bandwidth Memory), but that this figure is estimated to rise to 2 TB by 2028: a fraction over 25 times more DRAM.

He then apparently said that the rate at which AI accelerators are implemented in data centers will increase by a factor of 25 over the same period. Multiply the two increases together, and you arrive at the claimed 625.

However, Dell is almost certainly using the maximum memory that Nvidia's Vera Rubin superduper chip can support, though most single accelerators will 'only' be rocking 576 GB of HBM4. That's 7.2 times more memory than a single H100, and if we assume Dell is correct about the other growth rate, then total memory demand will climb by a factor of 180.
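The arithmetic behind both figures is easy enough to check. Here's a quick back-of-the-envelope sketch in Python, using the numbers quoted above (the 2 TB figure is treated as a round 2,000 GB, which is how you land on Dell's tidy 25x per-chip claim):

```python
# Back-of-the-envelope check of the memory-growth figures quoted above.
h100_hbm_gb = 80           # HBM3 on a 2022 Nvidia H100
projected_hbm_gb = 2000    # ~2 TB per accelerator projected for 2028
deployment_growth = 25     # claimed 25x rise in accelerator deployments

# Dell's headline figure: per-chip growth times deployment growth
per_chip_growth = projected_hbm_gb / h100_hbm_gb                # 25x
dell_total = per_chip_growth * deployment_growth                # 625x

# The more conservative take: 576 GB of HBM4 per accelerator
conservative_per_chip = 576 / h100_hbm_gb                       # 7.2x
conservative_total = conservative_per_chip * deployment_growth  # 180x

print(round(dell_total), round(conservative_total))  # 625 180
```

Either way, the multiplication only works if both growth rates hold simultaneously, which is exactly the assumption doing all the heavy lifting in Dell's claim.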

Is that good news? Hardly. There are only three companies in the whole world that manufacture HBM4—SK hynix, Samsung, and Micron—and while other companies are trying to catch up with HBM3 offerings, none of them can keep up with current memory demands, let alone how things are going to be in a couple of years.

By 2028, the big three memory makers are expected to have more production facilities in operation, but whatever they're able to get up and running, it surely won't be enough to cope with a DRAM demand that's going to be many times larger than it already is. And it's not just HBM that's going to be in crushingly short supply; LPDDR5X (memory used in laptops and handhelds) and NAND flash storage will too.

A single compute tray in an Nvidia GB200 NVL72 AI server, for example, requires 480 GB of LPDDR5X, and a full rack tower has up to 144 E1.S slots (the server equivalent of M.2), each home to multiple terabytes of fast SSD storage.

A fully kitted-out NVL72 tower could have as much as 17 TB of DRAM and 547 TB of flash storage. That's just one tower, and big AI data centers use hundreds, if not thousands, of them.
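Working backwards from those quoted totals gives a sense of the per-component capacities involved. A rough sketch (the per-drive size and module count are inferred from the article's figures, not quoted anywhere, so treat them as illustrative):

```python
# Derive rough per-component capacities from the quoted NVL72 totals.
e1s_slots = 144            # E1.S SSD slots per full rack tower
flash_total_tb = 547       # quoted flash total per tower
flash_per_drive_tb = flash_total_tb / e1s_slots        # ~3.8 TB per drive

dram_total_gb = 17 * 1000  # quoted ~17 TB of LPDDR5X per tower
lpddr5x_unit_gb = 480      # quoted figure per compute tray
units_per_tower = dram_total_gb / lpddr5x_unit_gb      # ~35 units

print(round(flash_per_drive_tb, 1), round(units_per_tower))  # 3.8 35
```

Multiply those per-tower figures by the hundreds or thousands of towers in a big AI data center and the scale of the demand problem becomes obvious.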

If we're lucky, the growth in DRAM and flash manufacturing will be able to maintain the memory status quo (i.e. it's all outrageously expensive but still 'affordable' compared to how much a top-end graphics card costs), and should Dell's prediction come to pass, we'll still be in with a chance of enjoying one of the best hobbies around.

Perhaps it's best not to consider what will happen if the demand-supply ratio for memory gets considerably worse.
