Nvidia's New H200 GPU Promises to Outpace Competition in AI Hardware Race

Nvidia Corp NVDA showcased its H200 graphics processing unit, an upgrade from the H100, the chip OpenAI used to train GPT-4.

Nvidia expects around $16 billion in revenue for its fiscal third quarter, up roughly 170% year-over-year.

The H100 chips cost between $25,000 and $40,000 each, according to Raymond James estimates cited by CNBC.

The H200 includes 141GB of next-generation "HBM3e" memory.

Nvidia said the H200 will generate output nearly twice as fast as the H100 based on a test using Meta Platforms Inc's META Llama 2 LLM.

The H200, which will likely ship in the second quarter of 2024, will compete with Advanced Micro Devices, Inc's AMD MI300X GPU. Amazon.com Inc's AMZN AWS, Alphabet Inc's GOOG GOOGL Google Cloud and Oracle Corp's ORCL Cloud Infrastructure have all committed to using the new chip starting in 2024.

AMD plans to bring its rival MI300 chip to market in the fourth quarter. Intel Corp INTC claims its Gaudi 2 model is faster than the H100.

Nvidia said the H200 will be compatible with systems that already use the H100, meaning customers can adopt the new chip without changing their server setups or software.

The H200 will be available in four-GPU and eight-GPU server configurations on the company's HGX complete systems, as well as in the GH200, a chip that pairs the H200 GPU with an Arm Holdings Plc ARM-based processor.

Both the H100 and H200 are based on Nvidia's Hopper architecture.

Nvidia stock has gained 241% year-to-date, reaching a $1 trillion valuation amid the AI frenzy.

Price Action: NVDA shares traded higher by 0.79% at $487.30 at last check Monday.
