That something as superficially straightforward as plain old energy supply is turning out to be a major limiting factor in the AI revolution is one of those delightful, unexpected quirks that keeps us all on our toes. Who'd have thought that in just a few short years the world would be spooling up more GPUs than the power grid can cope with?
One possible partial solution, apparently, is moving those GPUs nearer to the actual power supply. This is something Nvidia and a gang of collaborators plan to pilot later this year in the form of roughly 25 small data centers located next to power substations in the USA.
Now, at first glance, you might think this is a bit of a zero-sum game. How would putting GPUs nearer the actual power supply reduce the amount of power required?
Nope, it's not reduced losses from shorter cables or anything like that. Indeed, it's not about reducing power consumption at all. Instead, the idea is load balancing.
In other wor...