Nvidia: Why Skeptics Keep Missing The Point
Summary:
- Nvidia Corporation’s stock has surged, driven by robust AI demand, despite skepticism about AI Capex ROI and competition from AMD’s MI300X chip.
- Cloud providers like AWS, Azure, and Google Cloud show increasing profitability, debunking claims of low AI ROI impacting Nvidia’s chip demand.
- AI’s transformative impact extends beyond consumer-facing applications, with accelerated computing becoming essential for traditional data center tasks.
- Nvidia’s valuation reflects its leading position in the AI revolution, with sales projected to grow significantly, justifying its premium price.
Investment Thesis
I remember my first article on Nvidia Corporation (NASDAQ:NVDA) back in August 2023, which carried a “buy” rating. NVDA was up 240% year-over-year, and doubts about whether the ticker would maintain its meteoric rise began to surface. For many, the main concern was the rising competition as AMD prepared to launch the MI300X chip, its answer to NVDA’s H100 at the time.
Things have moved fast since then. By May 2024, my second article was practically an obituary for the voices overstating AMD’s MI300X market position, citing its botched debut in Q1 2024. By then, NVDA was up another 300%, and a new breed of skeptics had arisen, this time questioning whether AI demand would continue. I argued that NVDA is in the early innings of a multi-year, if not multi-decade, secular trend, citing Meta’s immense computational needs as an example of the magnitude of the market.
Since that May article, NVDA is up 55%. Today, investors face yet another breed of skeptics: those who argue that, at a certain point, tech companies will revisit their Capex spending in light of the low return on investment (“ROI”) on AI infrastructure. This article addresses these concerns.
False Premise
As a trained analyst, dissecting arguments becomes second nature. Let’s look at this argument: demand for NVDA’s chips will wane as tech companies realize a disparity between AI Capex spending and ROI.
- Premise: The ROI of AI projects is low.
- Assumption: Companies base their AI Capex decisions on ROI.
- Conclusion: Demand for NVDA’s chips will decline given the premise and underlying assumption.
Both the premise and the underlying assumption of this argument are false. Now, I understand that NVDA has a diverse client base, and perhaps some of its customers have low ROI, but let’s start from the top.
Half of NVDA’s data center sales — or 42% of total sales — come from public cloud companies: the likes of Microsoft’s (MSFT) Azure, Amazon’s (AMZN) AWS, and Alphabet’s (GOOG) Google Cloud. If returns on AI investment were as weak as the argument at hand suggests, we should have seen a decline in margins, because under GAAP accounting, depreciation is expensed in the Cost of Goods Sold (“COGS”) line. Once a server is installed and running, part of its value is deducted as COGS each period, regardless of the revenue it generates.
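To make the accounting point concrete, here is a toy sketch with entirely hypothetical numbers (the server cost, useful life, revenue, and other COGS are illustrative, not drawn from any company’s filings). It shows how straight-line depreciation of AI servers flows into COGS whether or not revenue follows, so weak AI ROI would mechanically show up as margin compression:

```python
# Toy illustration (hypothetical numbers): straight-line depreciation of
# AI servers is expensed in COGS each year, regardless of revenue.
server_cost = 1_000_000            # hypothetical AI server fleet cost ($)
useful_life_years = 5
annual_depreciation = server_cost / useful_life_years  # 200,000 hits COGS/year

revenue = 2_000_000                # hypothetical segment revenue ($)
other_cogs = 1_000_000             # hypothetical non-depreciation COGS ($)

gross_profit = revenue - other_cogs - annual_depreciation
gross_margin = gross_profit / revenue
print(f"Gross margin with AI depreciation: {gross_margin:.0%}")  # prints "40%"
```

If the extra AI capacity generated no incremental revenue, the 200,000 of annual depreciation would drag the margin down on its own — which is why stable or rising cloud margins cut against the low-ROI thesis.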
What we’re actually seeing is that profitability is increasing. AWS operating margins reached 38% last quarter, up from 30% in the same period of the previous year. In Q2 2022, before the generative AI bonanza, AWS’s margin was 26% — lower than it is now that AMZN has accelerated its AI Capex spending. Azure margins have remained steady over the past three years, standing at 43.6%, 44.5%, and 44.1% in Q3 2024, Q3 2023, and Q3 2022, respectively. Google doesn’t segregate earnings by segment, but its gross margins are moving in the right direction, showing no signs of contraction.
I think some people are skeptical about the ROI of AI because consumer-facing AI applications haven’t grown as fast as expected. The public is still waiting for Apple’s (AAPL) Siri in its AI-empowered form. Nonetheless, behind the scenes, AI has delivered seismic cost benefits. JPMorgan (JPM) recently rolled out a chatbot that helps employees summarize documents and generate ideas. Programmers on GitHub (part of Microsoft) can now generate computer code with a few natural-language prompts, increasing productivity. Seeking Alpha recently launched its Virtual Report feature, which compiles summaries of analysts’ articles. A growing number of consumer-facing applications are boosting productivity, lifting sales, and/or lowering costs.
Now, don’t get me wrong. I was one of the first to warn investors that while generative AI presents a growth opportunity for cloud providers, the pace of this growth won’t match the hype. I was right. But that is entirely different from saying that it isn’t profitable.
The delay in AI upgrades for many popular apps such as Apple’s Siri and Amazon’s Alexa is likely because of low ROI. But that has more to do with the business model of these apps than with NVDA’s chips or the capabilities of AI technology. The good news is that NVDA has been lowering the price per computation with each product upgrade. For example, the A100, launched in 2020, delivers 19.5 teraflops and is leased for $1.89/hour by Data Crunch. The H100, launched in 2022, delivers 67 teraflops and is available for rent at $3.17/hour — a higher price that reflects the cloud provider’s higher equipment cost for the H100. Nonetheless, the performance per dollar is much more economical on NVDA’s H100 (roughly 21 TFLOPS per dollar of hourly rent) than on the prior A100 (roughly 10 TFLOPS per dollar).
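The performance-per-dollar figures above follow directly from the quoted specs and rental rates. A minimal check, using only the numbers cited in the text:

```python
# Performance per rental dollar, using the TFLOPS figures and hourly
# lease rates quoted above (A100: 19.5 TFLOPS at $1.89/hr; H100: 67
# TFLOPS at $3.17/hr).
def tflops_per_dollar_hour(tflops: float, price_per_hour: float) -> float:
    """Throughput purchased per dollar of hourly rent."""
    return tflops / price_per_hour

a100 = tflops_per_dollar_hour(19.5, 1.89)   # ~10.3 TFLOPS per $/hour
h100 = tflops_per_dollar_hour(67.0, 3.17)   # ~21.1 TFLOPS per $/hour
print(f"A100: {a100:.1f}  H100: {h100:.1f}  gain: {h100 / a100:.2f}x")
```

Despite the higher sticker price per hour, each rental dollar buys roughly twice the throughput on the H100 — the cost-per-computation decline the paragraph describes.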
As costs continue declining, it will become more economical to roll out AI features for freemium business models such as Siri and Alexa. One also can’t dismiss the fact that generative AI is a new and imperfect technology, which also factors into the pace of development of consumer-facing AI applications.
It’s More Than Large Language Models
I also cover Intel (INTC), and listening to management’s comments, I see that they have an odd perspective on their market position. They differentiate between traditional and AI data center spending, viewing the latter as a temporary surge that will eventually balance out.
“Big cloud customers, [in particular], have put a lot of energy into building out their high-end AI training environments. And that is putting more of their budgets focused or prioritized into the AI portion of their build-out. That said, we do think this is a near-term, right, surge that we expect will balance over time.” – Intel Q2 2023 Earnings Call.
I don’t believe the budget shift from traditional to AI racks in the data center market is temporary. Many traditional computing tasks, such as search engines and recommendation algorithms, that previously ran on data center CPUs have shifted to accelerated computing on GPUs. Wherever big data exists, the odds are that it runs more efficiently on an accelerated platform than on traditional infrastructure. I think Intel’s market share loss in the data center market is more permanent than what management conveys to its shareholders.
Valuation
I believe that we are in the early phase of a multi-year, if not multi-decade, AI secular tailwind that will transform the world as we know it. Emotionally intelligent digital assistants could very well become a reality. Brain implants like Neuralink’s that map our thoughts aren’t as fictional as they were 15 years ago. General-purpose humanoid robots are currently being developed. All these innovations will require immense computational power.
The current computational capacity is clearly not enough. Even trivial tasks such as creating AI images are too slow. Queries to ChatGPT’s latest version, which supports rudimentary reasoning, are rate-limited to ration computing power. All of these are signs that the current AI infrastructure needs more microchips. This year, Wall Street expects NVDA’s sales to increase 115%, hitting $129 billion. In 2025, sales are estimated at $192 billion, or 47% growth. I think NVDA could surprise us: the pace of AI innovation will accelerate exponentially, and NVDA’s sales will grow in tandem with these advancements.
The company’s 48x forward P/E ratio is based on 2024 earnings estimates. Using next year’s EPS forecast of $4.37 per share, the P/E drops to 32x. That still isn’t the cheapest among microchip companies, but it remains within the industry range. Equally important, I think NVDA will command a price premium for the foreseeable future, given its dominant position in the AI data center market.
| Company | 1-Year Forward P/E |
| --- | --- |
| Taiwan Semiconductor (TSM) | 21.39 |
| Broadcom (AVGO) | 26.51 |
| AMD (AMD) | 26.91 |
| Texas Instruments (TXN) | 32.62 |
| QUALCOMM (QCOM) | 32.62 |
| ARM Holdings (ARM) | 66.25 |
| Micron (MU) | 11.50 |
| Analog Devices (ADI) | 28.86 |
| Marvell (MRVL) | 36.77 |
| NXP Semiconductors (NXPI) | 17.09 |
| Nvidia (NVDA) | 32.46 |
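The two multiples cited above imply a share price and a 2024 EPS that can be backed out arithmetically. A quick sketch — note the price and 2024 EPS here are derived from the quoted multiples, not taken from market data:

```python
# Back out the implied figures from the multiples cited in the text:
# a 32x P/E on next year's EPS forecast of $4.37, and a 48x P/E on
# 2024 earnings estimates.
eps_next_year = 4.37                 # next-year EPS forecast cited above
pe_next_year = 32.0                  # quoted forward multiple on that EPS
pe_2024 = 48.0                       # quoted multiple on 2024 estimates

implied_price = pe_next_year * eps_next_year    # ~ $139.84 per share
implied_eps_2024 = implied_price / pe_2024      # ~ $2.91 per share
print(f"Implied price: ${implied_price:.2f}, implied 2024 EPS: ${implied_eps_2024:.2f}")
```

In other words, the drop from 48x to 32x reflects roughly 50% expected EPS growth between the two estimate years — the same growth story driving the sales projections.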
Q3 Results
Data center currently constitutes the majority of NVDA’s revenue, and half of it comes from a handful of public cloud providers, who in turn lease AI computing capacity to the public. Between Q4 2023 and Q3 2024, data center sales rose by nearly 10x. In the span of 24 months, the gap between data center and gaming revenue widened from $1.6 billion to roughly $27 billion, mirroring the “infrastructure-building” phase we’re currently experiencing.
What is also interesting is that NVDA’s gaming revenue nearly doubled, reflecting GPU advancements under the GeForce brand, but also the reputational gains from NVDA’s leading position in AI.
The virtual and augmented reality business, reported under the Professional Visualization segment, generated $486 million in Q3 2024, up 17% YoY — a fair performance that nonetheless doesn’t fully reflect the potential opportunity of the market.
The Auto segment grew 71% in Q3, and if this trend continues, it will exceed the Professional Visualization segment by Q4 2024. NVDA’s GPUs provide real-time imaging analysis in autonomous vehicles. Roughly 75 million cars were sold last year. As the likes of Tesla (TSLA), Cruise (GM), and Waymo (GOOG) (GOOGL) continue advancing autonomous driving and shaping its regulatory framework, we could see NVDA’s sales in this sector surge the same way its Data Center segment did two years ago.
Final Thoughts and How I Might Be Wrong
As with any company pioneering a new market, volatility is to be expected. However, I believe that we have entered a phase of rapid AI advancements that will transform every industry. Think autonomous driving, high-resolution virtual reality, and neural interfaces that connect our brains to computers. Growth won’t be linear, but for those willing to ride the wave, there is potential for significant returns.
Besides potential volatility and non-linearity of demand, competitive risk could also become more prominent in the medium and long run. I believe that in business, there is rarely a situation where the winner takes all. Different customers will choose different AI solutions, whether that be Intel’s Xeon and Gaudi accelerators or AMD’s EPYC and Instinct product suites.
For now, however, NVDA has the best GPUs, and its competitive moat is enhanced by its software ecosystem — frameworks such as CUDA, TensorRT, and Omniverse — which gives developers the ability to tune NVDA’s hardware to their specific needs. Equally important, these frameworks make it harder for data centers to switch to different hardware.
Analyst’s Disclosure: I/we have no stock, option or similar derivative position in any of the companies mentioned, and no plans to initiate any such positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Seeking Alpha’s Disclosure: Past performance is no guarantee of future results. No recommendation or advice is being given as to whether any investment is suitable for a particular investor. Any views or opinions expressed above may not reflect those of Seeking Alpha as a whole. Seeking Alpha is not a licensed securities dealer, broker or US investment adviser or investment bank. Our analysts are third party authors that include both professional investors and individual investors who may not be licensed or certified by any institute or regulatory body.