Nvidia has asked SK hynix to move up its delivery timeline for next-generation HBM4 memory chips by six months, according to ...
In an industry first, SK hynix has announced its 16-Hi HBM3E memory, offering capacities of 48GB per stack alongside other bleeding-edge NAND/DRAM products.
SK Hynix Inc. is accelerating the launch of its next-generation AI memory chips after Nvidia Corp. Chief Executive Officer ...
Nvidia CEO Jensen Huang had asked memory chip maker SK Hynix to bring forward by six months the supply of its next-generation ...
Nvidia CEO Jensen Huang emphasized during a video call at the SK AI Summit at COEX in Seoul that he aims to strengthen ...
SK Hynix plans to provide samples of its 48GB, 16-layer HBM product—the industry's largest capacity and highest layer ...
SK Hynix, the world’s second-largest memory chip maker, is racing to meet explosive demand for the high-bandwidth memory (HBM) ...
Nvidia (NVDA) has requested that SK Hynix expedite delivery of its HBM4 high-bandwidth memory chips by six months, SK ...
SK hynix unveils the industry's first 16-Hi HBM3E memory, offering up to 48GB per stack for AI GPUs, with even higher-capacity AI memory to come.
Nvidia is urging SK Hynix to fast-track the production of its high-bandwidth memory (HBM4) chips as demand for AI hardware ...
The chief executive of artificial intelligence (AI) chip giant Nvidia on Monday stressed the importance of its partnership ...
NVIDIA currently uses SK hynix's HBM3E memory for its AI chips and plans to use HBM4 in its upcoming Rubin R100 AI GPU.