We don’t know where Nvidia came up with that GPU compute price, but at a minimum of around $400,000 a pop for a configured GPU server that is a clone of Nvidia’s eight-way DGX H100, those 2,000 servers work out to a cost of $800 million – not $400 million. We ...
The company’s latest A.I.-training module, known as the DGX H100, is a three-hundred-and-seventy-pound metal box that can cost up to five hundred thousand dollars. It is currently on back order for months. The DGX H100 runs five times as fast as the hardware that trained ChatGPT, ...