NVIDIA Aims At Shipping Millions of AI GPUs By 2024, Working to Diversify Supply Chain


NVIDIA has bet its future on AI, and the company is now reported to be preparing to ship 1.5 million to 2 million H100 units in 2024.

NVIDIA Could Generate a Record $80 Billion in AI Revenue in 2024

Just a few hours earlier, NVIDIA published record-breaking earnings, with an incredible 171% revenue increase for the Data Center segment, driven by sales of AI GPUs and AI platforms, chiefly the Hopper H100, Ampere A100, and HGX systems. The green giant has already guided for another 170% revenue growth in the current quarter (Q3), but this is just a taste of things to come.

Our demand is remarkable. We are significantly expanding our production capacity. Supply will substantially increase for the rest of this year and next year. NVIDIA has been preparing for this for over 20 years and has created a new computing platform that the world's industries can build upon.

Jensen Huang, NVIDIA CEO

The Financial Times reports that NVIDIA is working to considerably expand its production capacity to produce high volumes of AI GPUs next year. To give you an idea of the increase: NVIDIA is set to ship 550,000 H100 GPUs this year, and the company plans to nearly triple that amount next year. This will undoubtedly be a hard goal to achieve, as Team Green faces lengthy order backlogs, with delays stretching as far out as December.

Our supply over the next several quarters will continue to ramp as we lower cycle times and work with our supply partners to add capacity. Additionally, the new L40S GPU will help address the growing demand for many types of workloads, from cloud to enterprise.

We do expect to continue increasing our supply over the next quarters as well as into next fiscal year. In terms of percentages, it's not something that we have here. It is a work across many different suppliers, many different parts of building an HGX, and many of our other new products that are coming to market. But we are very pleased with both the support that we have from our suppliers and the long time that we have spent with them improving their supply.

Colette Kress, NVIDIA CFO

The AI boom has pulled every company into a GenAI race, with NVIDIA capitalizing the most. AI chip orders are said to be booked through 2024, and NVIDIA has already pledged to deliver a high volume of H100s. The big question is how NVIDIA will fulfill such large orders, given the numerous challenges the company faces. There are two possible solutions; the first is for Team Green to expand its existing facilities.

TSMC, NVIDIA’s leading partner, is responsible for producing its AI GPUs. TSMC not only works with NVIDIA but also has orders from Apple and AMD. AI GPU output has been a bottleneck for the Taiwanese giant, particularly in advanced packaging. While TSMC plans a rapid expansion to meet industry demand, that won’t take effect until 2024. NVIDIA therefore has to pursue another plan, and the most practical one is a “dual-sourcing” approach.

We reported the other day that Samsung is in talks with AMD to secure orders for its MI300X AI accelerators. Samsung has drawn the industry’s spotlight because it offers customers a “hybrid” approach, taking responsibility for all production stages, unlike TSMC, which outsources parts such as HBM. Going the Samsung route would be smart for NVIDIA, since it would distribute the order workload and ensure a more streamlined supply, leading to greater revenue.

News Source: Financial Times
