We don’t know where Nvidia came up with that GPU compute price, but at a minimum of roughly $400,000 apiece for a configured GPU server that clones Nvidia’s eight-way DGX H100, 2,000 servers works out to $800 million – not $400 million. We ...
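The arithmetic behind that correction is simple enough to check directly; a quick sketch using the article's own figures:

```python
# Back-of-the-envelope check of the fleet cost cited above.
servers = 2_000
cost_per_server = 400_000  # minimum price for an eight-way DGX H100 clone, in USD
total = servers * cost_per_server
print(f"${total:,}")  # $800,000,000
```

At $400,000 per box, 2,000 servers lands at $800 million, double the $400 million figure being questioned.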
The company’s latest A.I.-training module, known as the DGX H100, is a three-hundred-and-seventy-pound metal box that can cost up to five hundred thousand dollars, and it is currently on back order for months. The DGX H100 runs five times as fast as the hardware that trained ChatGPT, ...
It should be noted that this is not how brains in the natural world operate. The network architecture of brains in the animal kingdom does not map well to existing hardware and would therefore operate at much lower FLOPS utilization rates. Historically, dense matrix models have been able to ...