Micron Technology has started mass production of its high-bandwidth memory (HBM) semiconductors for use in Nvidia's latest chip for artificial intelligence, sending its shares up more than 5% on Monday.
The HBM3E (High Bandwidth Memory 3E) will consume 30% less power than rival offerings, Micron said, and could help it tap into soaring demand for chips that power generative AI applications.
Nvidia will use the chip in its next-generation H200 graphics processing units, which are expected to start shipping in the second quarter and to overtake the current H100 chip that has powered a massive surge in revenue at the chip designer.
"I think this is a huge opportunity for Micron, especially since the popularity of HBM chips seems to only be increasing for AI applications," said Anshel Sag, an analyst at Moor Insights & Strategy.
Demand for HBM chips in AI applications, a market led by Nvidia supplier SK Hynix, has also raised investor hopes that Micron will be able to weather a slow recovery in its other markets.
Since "SK Hynix has already sold out its 2024 inventory, having another source supplying the market can help GPU makers like AMD, Intel or NVIDIA scale up their GPU production as well," Sag added.
HBM is one of Micron's most profitable products, in part because of the technical complexity involved in its construction.
The company had previously said it expects "several hundred million" dollars of HBM revenue in fiscal 2024 and continued growth in 2025.