Next-Gen HBM4 Memory Reportedly Features Significantly Bumped Up Bandwidth

Written By Editor


DigiTimes reports that HBM4 could reach a memory bus width of up to 2048-bit, opening up tremendous headroom for GPUs and the AI sector.

Samsung & SK Hynix Are Reportedly Working on “2000 I/O Ports” for HBM4; Expect Serious HPC Output in Future AI GPUs

Citing Seoul Economy, DigiTimes reports that next-gen HBM memory could see a major leap in memory bandwidth, with an anticipated 2x increase. To gauge the impact on the market, this is a considerable jump, and to put it in perspective, it is worth remembering that HBM memory hasn’t seen an improvement in its memory interface width since 2015.
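To see why a wider bus roughly doubles bandwidth, the back-of-the-envelope math is simple: peak bandwidth per stack is the interface width times the per-pin data rate. The sketch below is illustrative only; the 9.2 Gb/s per-pin rate is an assumption for an HBM3e-class stack, and the HBM4 figure simply assumes the rumored 2048-bit interface at the same per-pin rate.

```python
def peak_bandwidth_tbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in TB/s:
    (bus width in bits * per-pin rate in Gb/s) / 8 bits-per-byte / 1000."""
    return bus_width_bits * pin_rate_gbps / 8 / 1000

# HBM3e-class stack: 1024-bit interface, assumed 9.2 Gb/s per pin
hbm3e = peak_bandwidth_tbps(1024, 9.2)  # ≈ 1.18 TB/s per stack

# Rumored HBM4 stack: 2048-bit interface at the same assumed per-pin rate
hbm4 = peak_bandwidth_tbps(2048, 9.2)   # ≈ 2.36 TB/s per stack

print(f"HBM3e-class: {hbm3e:.2f} TB/s, 2048-bit HBM4: {hbm4:.2f} TB/s")
```

Doubling the interface width at a constant per-pin rate doubles the peak figure, which is where the reported 2x bump comes from; multi-stack GPU packages multiply this further.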


While on paper this advancement looks fantastic, it certainly comes with a lot of “ifs”, mainly concerning how manufacturers will handle the data transfer rate and the necessary changes to individual memory stacks. Currently, the market has seen the integration of HBM3e with the latest AI GPUs, which can reach up to 5 TB/s of bandwidth per chip, bringing solid performance gains to NVIDIA’s highly popular H100 AI GPUs.


DigiTimes has reported on the development, claiming that Samsung and SK Hynix are moving toward incorporating “2000 I/O” ports in their next-gen HBM4 memory standard. In layperson’s terms, this means the technology will bring much larger computational capabilities along with support for much larger LLMs, which is a crucial factor in the next-gen approach to genAI development. While the report hasn’t disclosed an official timeline, we will eventually reach that point; for now, though, the milestone is still far off, and we will discuss why below.

The AI market is currently in a paradigm shift, with genAI capabilities being infused into consumer applications, which has drawn the tech giants into an apparent “race.” This has ultimately created tremendous demand for AI GPUs, which require HBM as a primary component, and for now, memory manufacturers are focused on delivering sufficient supply. Don’t get me wrong, advancements within the HBM market are indeed on the horizon; however, they won’t happen soon, at least not within the coming years, unless there is something “cooking” that we aren’t aware of yet.
