NVIDIA's Blackwell B100 GPUs have reportedly entered the supply chain certification phase, another step in the development of the next-gen AI powerhouse.
NVIDIA Reportedly Selects Wistron & Foxconn As Supply Chain Partners For Next-Gen Blackwell AI GPUs
According to Chinese and Taiwanese outlets UDN & CTEE, NVIDIA has now entered the supply chain certification phase for its upcoming Blackwell B100 AI GPUs. The GPU will succeed the Hopper H100 chip and deliver a significant leap in compute performance & AI capabilities.
To ensure speedy production and supply at launch, NVIDIA is now choosing its supply chain partners. Reports suggest that Wistron has been picked as the main partner to supply substrates for the production of the Blackwell GPUs. Both Wistron and Foxconn were in the race to secure these orders, but concerns over yields and other factors resulted in Wistron winning all of the early-stage orders.
NVIDIA's GB200 (B100) AI server is scheduled to be introduced in 2024. The supply chain has entered the certification phase. There are reports in the market that Hon Hai originally planned to win the B100 substrate order, but the certification has recently been "blocked". Wistron keeps the original order share.
It is reported that NVIDIA originally planned to add Hon Hai as the second AI-GPU server substrate supplier for the next-generation B100 series. Due to yield rates and other considerations, Wistron will still obtain 100% of the order share, and Wistron is also taking advantage of the opportunity; its early-stage orders for AI-GPU modules are reportedly strong.
UDN also mentions that NVIDIA will be moving its orders of H100 and B100 GPUs to Foxconn in North America. These are specifically for the H100 & B100 modules, so while the substrates are supplied by Wistron, it is likely that Foxconn will be the end supplier offering the fully packaged AI modules.
All orders for NVIDIA's highest-end H100 AI server modules in North America were handed over to Hon Hai and will be produced in factories in Mexico, the United States, and Hsinchu, Taiwan. The NVIDIA B100 module order that the market is most concerned about will also be contracted to Hon Hai in the future.
As per the latest reports, NVIDIA has fast-tracked the launch of its Blackwell B100 AI GPUs, which are now said to arrive in Q2 2024. The GPUs are said to use SK hynix's HBM3e memory, offering the highest bandwidth and transfer speeds in the market. As such, the new GPUs should maintain NVIDIA's dominance in the AI sector, where it currently holds a market share of around 90%. Next year is shaping up to be a heated battle, with other chipmakers such as AMD, Intel, Tenstorrent, and more also picking up the pace in the AI segment. News Source: Dan Nystedt