Unpacking AMD's $500 Billion AI Accelerator Forecast
Alright, these things are never easy. But let's go ahead and try to unpack the $500 Billion datacenter (DC) AI accelerator forecast that Lisa Su announced at this week's Advancing AI event.
This number should be incredibly exciting to the AI bulls. It is the largest forecast out there, and the way I'm reading it, the number constitutes a combination of dedicated datacenter GPU, CPU, DPU, and XPU revenue.
Also, as I interpret it, this is chip level and not system level. So, this is merely the chips and not the full systems they ship in.
First, as a comparison, our Futurum Intelligence Forecast had these categories growing as follows between 2023 and 2028:
We didn't forecast DPUs and SmartNICs in our most recent report, but they will clearly need to be accounted for: back-end networking silicon will be a rapid growth number, with companies like Broadcom, Marvell, and now AMD all competing in a multi-billion-dollar space that could reach $10-20 billion by 2028. (I'll have to come back to this part of the buildup in a future post.)
So, let's do a little back-of-the-napkin math on how we can get to a buildup based upon our market breakdowns, and where this would leave some of the biggest contributors in the space.
First, our breakdown of the composition of AI Accelerators in 2023.
Now, we make some assumptions about what the composition of AI accelerators will look like in 2028. Our assessment in 2024 had XPU and cloud instance revenue growing 10-20% faster than GPU, but from a much smaller overall revenue base.
So, based upon our data, and smoothing based on the conversations we have had with customers, ODMs, and OEMs, I envision that by 2028 the breakdown (revenue) will look something like:
Finally, we look at the current market share, which has NVIDIA at 92% of the GPU market and AMD holding most of the remainder. Intel will enter in the 2026/27 timeframe, but right now it is limited to the accelerator category.
For XPUs and accelerators, Google has 58% of the market, AWS has 21%, and the other vendors are still largely negligible. These chips are largely being built by Broadcom and Marvell for the cloud providers, alongside AI accelerator companies like Groq, SambaNova, and others.
NVIDIA has a massive advantage and will likely give up only a negligible amount of the GPU market itself, as it has the characteristics of Intel during its most dominant market phase. What we do expect is the overall market to shift toward more accelerators. I would pin NVIDIA at 86% of the GPU market in 2028 if nothing sizable occurs, like a major DOJ action. That leaves me estimating AMD at 10% of the GPU market, Intel winning 3-4%, and others potentially taking the rest.
So, what is left, and how does it split, presuming around $485 Billion of AI chips sans the networking chips?
At 60% of that TAM, the GPU slice comes to roughly $291 Billion, and I would estimate:
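To make the arithmetic concrete, here is a minimal back-of-the-napkin sketch in Python using the numbers above (the ~$485 Billion chip TAM, a 60% GPU slice, and the 86/10/3-4% share estimates). The share values are my working assumptions from the discussion above, not figures from AMD's forecast.

```python
# Back-of-the-napkin GPU revenue split, using the estimates discussed above.
# All figures are rough assumptions, in billions of USD.

ai_chip_tam_2028 = 485          # ~$485B of AI chips, excluding networking silicon
gpu_share_of_tam = 0.60         # GPUs assumed to be ~60% of the chip TAM

gpu_tam = ai_chip_tam_2028 * gpu_share_of_tam   # ≈ $291B

# Estimated 2028 GPU vendor shares from the discussion above.
vendor_share = {"NVIDIA": 0.86, "AMD": 0.10, "Intel": 0.035, "Others": 0.005}

for vendor, share in vendor_share.items():
    print(f"{vendor}: ~${gpu_tam * share:,.0f}B")

# Roughly: NVIDIA ~$250B, AMD ~$29B, Intel ~$10B, Others ~$1-2B
```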
On the XPU and cloud accelerator side, there is a ton of TAM expansion. Current estimates have the all-in XPU and cloud instance number at about $10-12 Billion. This number is harder to pin down because sell-in to Meta, Google, and others like Alibaba and Tencent that are using internal, home-grown silicon is harder to track.
But the interesting stat here is that the overall XPU and Cloud Instance number runs to $135.8 Billion by 2028. This is split between the new entrants, the cloud providers, and the specialty chip makers. This is probably the most interesting part of the mix.
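For a sense of how aggressive that ramp is, here is a small sketch of the implied growth rate, assuming the ~$10-12 Billion figure is a 2024 baseline and the $135.8 Billion lands in 2028 (the four-year window and the midpoint base are my assumptions):

```python
# Implied CAGR for XPU / cloud-instance accelerators, assuming a 2024 base
# of ~$11B (midpoint of the $10-12B estimate) growing to $135.8B by 2028.

base_2024 = 11.0        # $B, assumed midpoint of the $10-12B estimate
target_2028 = 135.8     # $B, the 2028 figure discussed above
years = 4               # 2024 -> 2028, an assumed window

implied_cagr = (target_2028 / base_2024) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.0%}")   # roughly ~87% per year
```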
CPUs are primarily used to make the AI chips more efficient and performant, and there will be some growth in this category, but it will be much more conservative for dedicated AI use. This is pinned at about a 12% CAGR for dedicated AI CPUs. Intel, AMD, and Arm will continue to consume most of this, with AMD and Arm best positioned because of their respective market share with the cloud providers.
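As a quick sanity check on what that growth rate means, assuming the same 2023-2028 forecast window used above, a 12% CAGR works out to roughly a 1.76x increase over five years:

```python
# What a 12% CAGR means over the 2023-2028 forecast window (5 years, assumed).
cagr = 0.12
years = 5
multiplier = (1 + cagr) ** years
print(f"Growth multiplier over {years} years: {multiplier:.2f}x")   # ~1.76x
```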
And there you have it. The AI market over the next 5 years and a best guess at the construct of the $500 Billion TAM.