Oracle’s documents show financial challenges of renting out Nvidia’s chips: report

Oracle’s (NYSE:ORCL) internal documents showed that the fast-growing cloud business has had thin gross profit margins over the past year or so, lower than what many equity analysts have forecast, The Information reported.

This could raise questions about whether the AI cloud expansions undertaken by Oracle and its competitors will weigh on profitability and live up to investors’ expectations, the report added.

Oracle did not immediately respond to a request for comment from Seeking Alpha.

Last month, Oracle’s stock soared after the company said it will generate $381 billion in revenue from renting out specialized cloud servers to OpenAI (OPENAI) and other AI developers over the next five fiscal years.

In the three months ended August, Oracle recorded about $900M from rentals of servers powered by Nvidia (NASDAQ:NVDA) chips and saw a gross profit of $125M — equal to 14 cents for every $1 of sales, the report noted, citing the documents.
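As a quick sanity check on that arithmetic, here is a minimal sketch using the figures cited in the report (roughly $900M in rental revenue and $125M in gross profit for the quarter); the rounding is ours:

```python
# Rough sanity check of the figures cited in the report (assumed inputs:
# ~$900M in rental revenue and ~$125M in gross profit for the quarter).
revenue_m = 900.0        # revenue from Nvidia-powered server rentals, in $M
gross_profit_m = 125.0   # gross profit on those rentals, in $M

gross_margin = gross_profit_m / revenue_m
print(f"Gross margin: {gross_margin:.1%}")   # ~13.9%, i.e. roughly 14 cents per $1 of sales
```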

This is lower than the gross margins of many non-tech retail businesses. As sales from the business nearly tripled in the past year, the gross profit margin from those sales ranged between less than 10% and slightly more than 20%, averaging around 16%, according to the report.

In some cases, Oracle is losing money on rentals of small quantities of both newer and older versions of Nvidia’s chips, the report noted.

The 14% gross margin figure appears to account for the labor, power and other direct costs of running Oracle’s data centers, including depreciation expenses for some of the equipment. Other, unspecified depreciation expenses would take up another 7 percentage points of margin, the report added.
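To illustrate how that additional depreciation would compress the figure, a small hypothetical calculation (the 14% margin and the 7-percentage-point depreciation drag are from the report; the subtraction is ours):

```python
# Illustrative arithmetic: the ~14% gross margin less the ~7 percentage points
# of additional, unspecified depreciation cited by the report.
reported_gross_margin = 0.14
extra_depreciation_pts = 0.07

effective_margin = reported_gross_margin - extra_depreciation_pts
print(f"Margin after the additional depreciation: {effective_margin:.0%}")   # ~7%
```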

The internal documents offer a rare glimpse into Oracle’s dramatic transformation from a weak link of the cloud industry into an AI data center powerhouse. They also show that being in the business of buying and renting out Nvidia graphics processing units isn’t easy.

If Oracle’s revenue from the server rental business eventually surpasses its traditional software businesses, the weak margins on server rentals mean its overall gross profit margins will decline from the level they’ve been at in recent years, about 70%. Those margins have already declined from nearly 80% a decade ago as the company has increasingly turned to selling cloud services, the report noted.
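A toy illustration of that blending effect: the roughly 70% overall gross margin and the roughly 16% average rental margin come from the report, while the revenue mixes below are hypothetical and chosen only to show the direction of the dilution.

```python
# Toy blended-margin illustration. The ~70% software margin and ~16% rental
# margin come from the report; the revenue mixes below are hypothetical.
software_margin = 0.70
rental_margin = 0.16

def blended_margin(rental_share: float) -> float:
    """Overall gross margin when `rental_share` of revenue comes from GPU rentals."""
    return rental_share * rental_margin + (1 - rental_share) * software_margin

for share in (0.2, 0.5, 0.8):   # hypothetical shares of revenue from server rentals
    print(f"Rental share {share:.0%} -> blended gross margin {blended_margin(share):.0%}")
```

The larger the low-margin rental slice becomes, the further the blended figure falls below the roughly 70% level of recent years.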

The previously undisclosed Oracle figures are not comparable to what other AI cloud server competitors such as CoreWeave (NASDAQ:CRWV) and privately held Lambda report to investors. Each company seems to calculate its margins differently, the report added.

The cost of powering Nvidia server chips far surpasses that of running traditional servers, which the cloud computing industry relied on more heavily before the debut of ChatGPT. AI computing also requires other specialized hardware, such as networking equipment to connect the servers. In addition, to win big deals with AI customers, Oracle and other cloud providers have offered heavy discounts on rental prices for graphics processing units, or GPUs, compared with the prices they list for regular customers, the report added, citing people familiar with these deals.

This further diminishes the profitability of those contracts. Many big cloud providers have said their spending on such chips has hurt their margins in recent quarters, the report noted.

Larger cloud providers have been able to offset the margin hit from renting out Nvidia GPUs to some extent because most of their businesses do not depend on that expensive hardware.

Amazon’s (AMZN) Amazon Web Services saw a 33% net operating profit margin, and Alphabet’s (GOOG) (GOOGL) Google Cloud reported a 17% net operating margin in the quarter that ended in June. Neither company discloses a gross profit margin or breaks out financials for its GPU server rentals, according to the report.

Oracle is seeing a more direct impact on its margins from renting out GPUs. The company reported about $10B in sales from renting out cloud servers in the fiscal year that ended in May, with about 20% of that revenue coming from GPU servers, the report added.

In the most recent quarter, GPU servers’ share of cloud server sales grew to 27%. Oracle’s public disclosures imply its GPU cloud business could equal the company’s non-cloud revenue as soon as 2028, according to the report.
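Based on those figures, a rough sketch of the GPU slice of the rental business (the dollar estimate is our arithmetic on the report's numbers, not a disclosed figure):

```python
# Rough sketch of the GPU slice of Oracle's cloud server rental revenue,
# using the figures cited in the report.
fy_server_rental_revenue_b = 10.0   # ~$10B in the fiscal year ended in May
gpu_share_fy = 0.20                 # ~20% of that came from GPU servers
gpu_share_latest_quarter = 0.27     # GPU share reported for the most recent quarter

gpu_revenue_fy_b = fy_server_rental_revenue_b * gpu_share_fy
print(f"GPU server rental revenue for the fiscal year: ~${gpu_revenue_fy_b:.0f}B")  # ~$2B
print(f"GPU share in the most recent quarter: {gpu_share_latest_quarter:.0%}")
```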

Oracle’s margins take a hit whenever it installs Nvidia’s latest GPUs in its data centers. For example, in recent months, the GPU cloud gross margin declined from more than 20% to the low teens because of the rollout of new Nvidia chips in facilities in Abilene, Texas, that Oracle rents out to OpenAI, the report added.

Oracle could be taking an outsize hit to its margins because it does not own the data centers its customers use, unlike AWS and Google Cloud, which own the majority of theirs. Instead, Oracle mainly leases its data centers from third parties, such as Crusoe, the report noted.

Another challenge is how reliant its GPU cloud business could become on a single customer. Virtually all of the $317B worth of cloud deals it signed in the three months that ended in August came from OpenAI, the report added.

Oracle’s top five AI cloud customers account for about 80% of that business: ByteDance, Meta, xAI, OpenAI and Nvidia itself, which uses cloud-based GPUs for its own research and development. Other GPU cloud providers, such as CoreWeave, Nebius Group and Lambda, run a similar risk of concentrating most of their cloud revenue in a small number of customers, the report noted.

One benefit for Oracle’s GPU business is the amount of revenue it is generating from older-generation Nvidia chips, such as the Ampere chips introduced in 2020. These chips appear to be helping Oracle’s margins, while newer Nvidia chips strain them. This dynamic seems to contradict Nvidia CEO Jensen Huang’s recent comment that the introduction of newer chips would wipe out demand for older ones, the report added.

In addition, Oracle’s margins are affected by how many of its servers customers are actually using — and paying for. Utilization of Oracle’s GPU cloud servers ranges between 60% and 90%, depending on the type of Nvidia chip that powers them, the report noted.

In the three months that ended in August, Oracle lost about $100M on rentals of Nvidia’s Blackwell chips, which launched this year. That is partly because of the lag between when Oracle gets its data centers ready for customers and when customers start using and paying for them. It is not clear what causes the gap or how Oracle intends to shorten it, the report added.
