Nvidia AI Growth Substantiated By Taiwan Semiconductor And Meta Reports
Summary:
- If you still think AI is a fad or investor “hopium,” you haven’t read the recent earnings reports from Taiwan Semiconductor Manufacturing Company Limited or Meta Platforms, Inc. regarding AI hardware demand.
- These two reports provide a read-through for Nvidia Corporation’s quarter and the rest of its fiscal year.
- The bottom line is Taiwan Semiconductor is nearly supply-constrained in getting Nvidia chips out the door, while Meta is prioritizing AI servers and deprioritizing non-AI servers.
- Stop denying the reality of AI and the sheer amount of hardware demand it’ll see this decade; there’s very likely more outperformance left for Nvidia.
I’m still amazed to see people doubting artificial intelligence (“AI”), thinking it’s nothing more than a fad, hype, investor “hopium,” or vaporware. I know fads can see billions of dollars spent on them, and I’ve witnessed vaporware pushed from major vendors of mine amount to nothing, so I get the thinking that AI will be nothing more than the next “cool thing” to hit the trash bin. But there’s something more pervasive about AI that the average person doesn’t see. And while the average person may not see it, two major companies do, along with the need for it. Both just reported in the last week or so.
As I’ve said before, most people don’t even realize the social media apps they use have been invigorated by more AI in the last eight months than they ever have been over those apps’ existence. But I’m seeing AI push into other realms now. I had a coworker tell me in the last week she used AI to generate a headshot for her work avatar and LinkedIn profile pic because she didn’t want to pay for a professional headshot. Before I was even privy to her explanation, the picture looked good enough that I didn’t think beyond the initial, “Hmmm, I mean, it looks like her…”
Are photographers going to go out of business? Probably not today or tomorrow. But in the not-so-distant future, I’m not so sure.
I’ll ask you this: do fads replace the cost or use of something else, providing utility to the average person? Honestly, think about it.
More to this article’s point, everyone interested in stocks or tech knows Nvidia Corporation (NASDAQ:NVDA) has big revenue numbers planned for this year in its Data Center segment. Many are skeptical the company will be able to hit its $11B guidance when it reports in a month. I’m not one of those skeptics.
The question is whether AI is a fad.
My position was reinforced this past week when Taiwan Semiconductor Manufacturing Company Limited (TSM) and Meta Platforms, Inc. (META) reported earnings.
Now, you might be saying one is a logic chip manufacturer, and one is a social media conglomerate; what do all three have to do with one another? Well, the two have to do with the third’s revenue. Taiwan Semiconductor produces the chips in Nvidia’s A100 and H100 AI accelerators, while Meta is buying those AI accelerators. Therefore, the read-through between the two earnings reports allows us to see both sides of Nvidia’s business – supply and demand.
Taiwan Semiconductor’s AI Revenue
HPC (high performance computing) is a segment of TSM’s business. It houses the high-performance chips TSM manufactures for Advanced Micro Devices, Inc. (AMD), Apple (AAPL), Intel (INTC), and Nvidia, among others. And with AMD having a sluggish quarter with another one on tap, plus weakness in the PC market for Apple and Intel, this segment accounts for a good portion of TSM’s weakness.
The HPC segment also contains what the company defines as AI processor demand from CPU, GPU, and AI accelerators. AI processors make up 6% of TSM’s total revenue, or $926M. And if AI processors are within HPC, this means AI processors make up 13.6% of HPC revenue.
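The arithmetic behind these shares can be sanity-checked with a quick sketch. Note the implied totals below are derived purely from the article’s own 6%, 13.6%, and $926M figures, not from reported numbers:

```python
# Sanity-check TSM's Q2 '23 AI-processor revenue shares.
# $926M at 6% of total revenue implies total revenue of ~$15.4B, and
# 6% of total vs. 13.6% of HPC implies HPC is ~44% of total revenue.
ai_revenue = 926e6          # AI processor revenue (from the article)
ai_share_of_total = 0.06    # AI processors as a share of total revenue
ai_share_of_hpc = 0.136     # AI processors as a share of HPC revenue

implied_total = ai_revenue / ai_share_of_total
implied_hpc_share = ai_share_of_total / ai_share_of_hpc

print(f"Implied total revenue: ${implied_total / 1e9:.1f}B")        # ~$15.4B
print(f"Implied HPC share of revenue: {implied_hpc_share:.0%}")     # ~44%
```

In other words, the 6%-of-total and 13.6%-of-HPC figures are internally consistent with HPC being a bit under half of TSM’s revenue.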
…server AI processor demand…accounts for approximately 6% of TSMC’s total revenue. We forecasted this to grow at close to 50% CAGR in the next 5 years and increase to low teens percent of our revenue.
– C. C. Wei, CEO, Taiwan Semiconductor’s Q2 ’23 Earnings Call.
Since TSM likely places various degrees of AI acceleration into this bucket, my take is it includes processors from AMD, Qualcomm (QCOM), Intel, Amazon (AMZN), Google (GOOG, GOOGL), and Nvidia. I’ve used the customer breakdown found at this new Substack by Sravan to gauge who and what level of revenue is contributed.
We know AI processors have been a growing bucket for all of these customers, just based on their press releases and the continued push for “their own chips.” But most notable is Nvidia and its need to “procure…substantially higher supply for the second half of the year.”
For the AI, right now, we see a very strong demand, yes. For the tightness part, we don’t have any problem to support. But for the back end, the advanced packaging side, especially for the CoWoS, we do have some very tight capacity to – very hard to fulfill 100% of what customer needed. So we are working with customers for the short term to help them to fulfill the demand, but we are increasing our capacity as quickly as possible. And we expect these tightening will be released in next year, probably towards the end of next year. But in between, we’re still working closely with our customers to support their growth.
– C. C. Wei, CEO, Q2 ’23 Earnings Call Q&A (emphasis added).
Here, I’m making an educated assumption that Nvidia contributed 80% of TSM’s AI processor revenue in the quarter, a figure below Nvidia’s estimated AI data center market share of over 90%. This is supported by Nvidia’s 6.3% contribution to TSM’s CY22 revenue, with a majority of Nvidia’s own revenue, around 56%, coming from its Data Center segment (read: AI). On that basis, Nvidia would have been spending roughly $670M per quarter on average in its FY23 (which is offset by one month from CY22). Set that against the $926M in TSM AI processor revenue this quarter and Nvidia’s need for more supply, and it’s clear Nvidia is the leading capacity utilizer for AI chips with at least a 73% contribution, and that’s based on CY22. This gives us a range, but considering Nvidia’s outperformance, it likely comes in closer to the 80% side – roughly $750M of TSM’s AI processor revenue in the quarter. I expect Nvidia to be well over $800M per quarter in AI processor spend going forward.
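The estimate above can be reconstructed in a few lines. One input is not stated in the article and is an assumption on my part: TSM’s CY22 revenue of roughly $75.9B.

```python
# Rough reconstruction of the Nvidia-contribution estimate.
# Assumption: TSM CY22 revenue of ~$75.9B (not stated in the article).
tsm_cy22_revenue = 75.9e9
nvda_share_of_tsm = 0.063   # Nvidia's CY22 contribution to TSM revenue
dc_share_of_nvda = 0.56     # Data Center (read: AI) share of Nvidia revenue

# Average quarterly AI-related spend by Nvidia at TSM:
nvda_quarterly_spend = tsm_cy22_revenue / 4 * nvda_share_of_tsm * dc_share_of_nvda

tsm_ai_revenue_q2 = 926e6   # TSM's AI processor revenue this quarter
floor_share = nvda_quarterly_spend / tsm_ai_revenue_q2

print(f"Nvidia avg quarterly AI spend at TSM: ${nvda_quarterly_spend / 1e6:.0f}M")  # ~$670M
print(f"Implied floor on Nvidia's share of TSM AI revenue: {floor_share:.0%}")      # ~72-73%
```

That ~72-73% figure is the CY22-based floor; Nvidia’s outperformance since then is what pushes the likely contribution toward the 80% end of the range.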
So with AI processors a bright spot for TSM, carrying 50% CAGR expectations, and the majority of that growth coming from Nvidia per its direct guidance, the read-through from TSM substantiates Nvidia’s year for data center sales. If anything, Nvidia is supply constrained as TSM works feverishly to resolve its tight capacity on the back-end packaging side.
Meta Platforms’ Need For More Nvidia Hardware Into 2024
Now, knowing the supply side is one thing, and, especially in this supply chain environment, it’s important to know a company can source the components for its products. But seeing the demand side, especially further out, is as important, if not more important, than supply.
Enter Nvidia’s customers.
On the demand side, its customers are prioritizing AI accelerator spend over any other infrastructure spend. In fact, with Meta Platforms, it’s deprioritizing non-AI server spend for the sake of shifting more to AI servers to remain disciplined in CapEx growth. Meta’s earnings report and earnings call explicitly called this out.
We expect our full-year 2023 capital expenditures to be in the range of $27-30 billion, lowered from our prior estimate of $30-33 billion. The reduced forecast is due to both cost savings, particularly on non-AI servers, as well as shifts in capital expenditures into 2024 from delays in projects and equipment deliveries rather than a reduction in overall investment plans.
Looking ahead, while we will continue to refine our plans as we progress throughout this year, we currently expect total capital expenditures to grow in 2024, driven by our investments across both data centers and servers, particularly in support of our AI work.
– Susan Li, CFO, Meta Platforms Q2 ’23 Earnings Press Release (emphasis added).
There’s an important piece within this commentary on AI. The key is that not only is AI being prioritized, but non-AI servers are being deprioritized. There’s an active shift in spending from CPU-based servers to GPU-based, AI-accelerated servers.
This isn’t something to gloss over; server hardware is the number one capital expenditure. It’s not office space, campus expansion, or a refresh in employee laptops; it’s servers and the data centers that house those servers. Then on top of that, it’s AI-based servers, not non-AI infrastructure getting the nod.
While Meta is working toward its own AI-forward processor (its Meta Training and Inference Accelerator, or MTIA), it continues to buy Nvidia’s H100 DGX systems. Thus far, Meta has admitted MTIA has only been able to handle “‘low-complexity’ and ‘medium-complexity’ AI models more efficiently than a GPU.”
Furthermore, “MTIA’s focus is strictly on inference — not training — for ‘recommendation workloads’ across Meta’s app family.” But inference requires training to have happened first; AMD explains the difference between training and inference as “…the former is a building block for the latter.”
Nvidia’s DGX is designed for exactly this deep learning training, which is why Meta has focused on the “easier” workloads for its in-house silicon. Meta must therefore “outsource” its heavy AI workloads, the ones needed to get to the “easier” inference part, to Nvidia’s capable hardware.
Will Meta get to a place where it can use its own silicon for all AI workloads? Potentially, but even at best, TechCrunch claims Meta is only on track for a 2025 chip release with those capabilities.
While some of Meta’s CapEx will go toward MTIA, the vast majority is being directed to Nvidia’s hardware package, so its AI capabilities are run at as close to 100% (read: competitive) as possible in these “early” AI days.
Meta’s findings with its own silicon and AI hardware solution point to the same answer as to why Qualcomm is the 5G leader while Apple is years and billions of dollars behind in producing an in-house modem: decades of experience. Something Nvidia has.
Nearly every major hyperscaler and cloud provider started its business by buying hardware to support a cloud experience. Getting into the silicon race this late in the game for AI workloads will require the same depth of experience Nvidia has to beat it at its own game – even after procuring the same talent from the best houses. And fighting the battle on two fronts will only allow incumbents like Nvidia to continually pull away.
AI Isn’t Going Away, And Nvidia Is In More Demand
Between the outsized supply Nvidia is garnering from Taiwan Semiconductor and the continual struggles Nvidia’s customers have with their own silicon and AI packages, it’s clear Nvidia’s quarter and subsequent quarters into 2024 will remain in outperformance mode. Many are skeptical of the guidance raise, but more are tepid about what lies ahead for Nvidia’s next two quarters. Trying to pin down Nvidia’s upcoming quarter is difficult after such an explosion in data center AI hardware demand.
My two cents is Nvidia will make a mess out of analyst estimates once again.
Of course, the question is, will the stock continue higher regardless? Not necessarily; it’s quite possible Nvidia will consolidate after its report, which is less than four weeks away. But if the numbers for the July quarter are solid (i.e., closer to $12B than $11B) and the upcoming quarter shows material sequential growth, the stock may only see a shallow pullback. In the meantime, there’s not much preventing the stock from making a new all-time high.
I’ll be patient and see what the chart looks like going into the print, but for now, things look quite good for the business to keep accelerating and the stock to keep moving higher, even with higher expectations.
Analyst’s Disclosure: I/we have a beneficial long position in the shares of NVDA, AMD, INTC, META, QCOM, TSM either through stock ownership, options, or other derivatives. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Seeking Alpha’s Disclosure: Past performance is no guarantee of future results. No recommendation or advice is being given as to whether any investment is suitable for a particular investor. Any views or opinions expressed above may not reflect those of Seeking Alpha as a whole. Seeking Alpha is not a licensed securities dealer, broker or US investment adviser or investment bank. Our analysts are third party authors that include both professional investors and individual investors who may not be licensed or certified by any institute or regulatory body.