Nvidia (NVDA.O) said it is adding new features to its top-of-the-line chip for artificial intelligence, and that Amazon.com (AMZN.O), Google (GOOGL.O), and Oracle (ORCL.N) will be among the first to offer the new product next year.
The new chip, called the H200, will overtake Nvidia's current top-end H100. The main upgrade is more high-bandwidth memory, one of the costliest parts of the chip and the component that determines how much data it can process quickly.
Nvidia is the industry leader in AI chips, which power OpenAI's ChatGPT service and many similar generative AI services that answer questions with human-like writing. The additional high-bandwidth memory and a faster connection to the chip's processing elements mean such services can deliver answers more quickly.
The H200 has 141 gigabytes of high-bandwidth memory, up from 80 gigabytes in the H100. Nvidia did not disclose its memory suppliers for the new chip, but Micron Technology (MU.O) said in September that it was working to become an Nvidia supplier.
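As a rough illustration of why the memory bump matters, the sketch below estimates whether a large model's weights fit in a single chip's memory. The model sizes and the 2-bytes-per-parameter (16-bit) assumption are illustrative, not from the announcement; only the 80 GB and 141 GB capacities come from the article.

```python
# Back-of-envelope check: memory needed for a model's weights alone,
# versus single-chip capacity. Model sizes below are hypothetical examples.

GB = 10**9

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory for weights only, assuming 16-bit (2-byte) parameters."""
    return num_params * bytes_per_param / GB

for params in (70e9, 40e9):  # hypothetical 70B- and 40B-parameter models
    need = weight_memory_gb(params)
    print(f"{params / 1e9:.0f}B params -> ~{need:.0f} GB; "
          f"fits H100 (80 GB): {need <= 80}; fits H200 (141 GB): {need <= 141}")
```

By this estimate a 70-billion-parameter model's weights (~140 GB at 16-bit) would fit on a single H200 but not a single H100, which is one way the extra memory pays off; real deployments also need memory for activations and caching, so actual headroom is smaller.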
Nvidia also buys memory from Korea's SK Hynix (000660.KS), which said last month that AI chips were boosting its sales.
Nvidia said on Wednesday that Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure will be among the first cloud service providers to offer access to H200 chips, along with specialized AI cloud providers CoreWeave, Lambda, and Vultr.