While I can barely find two sticks of 16 GB to rub together, Micron unveils a 256 GB memory module destined for AI servers


In the midst of a memory supply crisis, I'm definitely thankful for the 32 GB of DDR5 that came inside my prebuilt rig a few moons ago. Sure, 16 GB of RAM is fine for most things I'd want to dive into, but the AI industry is playing a whole other ball game. Case in point: I'm sweating just thinking about Micron's 256 GB DDR5 server module.

The Boise, Idaho-based memory manufacturer unveiled the tech on Tuesday. It's built on Micron's 1-gamma technology which, per the press release, "is capable of speeds up to 9,200 megatransfers per second (MT/s), greater than 40% faster than modules in volume production today."

Samples of the registered dual in-line memory modules (RDIMM) are being offered "to key server ecosystem enablers for platform validation" in order to ensure wide-ranging compatibility. To simp...
