
Microsoft (NASDAQ:MSFT) is taking much longer than expected to design its latest-generation AI chip, The Information reported.
The delay increases the possibility that the new chips will be even less competitive with Nvidia’s (NASDAQ:NVDA) offerings by the time they enter mass production.
Microsoft’s next AI chip, dubbed Braga, is facing a delay of at least six months. This is pushing out its mass production to 2026 from 2025, the report added, citing people familiar with the effort.
When the chip enters mass production next year, it is expected to fall well short of the performance of Nvidia’s new Blackwell chips, which were released in late 2024, the report noted.
Microsoft had hoped to put the Braga chip into its data centers this year, the report added, citing a senior Microsoft executive who worked on the chip team.
Microsoft did not immediately respond to a request for comment from Seeking Alpha.
The delay has been attributed to unexpected changes to its design, staffing constraints and high turnover, according to the report.
Microsoft is not the only tech giant making its own AI chips in an effort to reduce its reliance on Nvidia. For example, Amazon (NASDAQ:AMZN) is working on its third-generation AI chip, Trainium 3, and the company has said it will be available to customers by the end of this year.
An Amazon spokesperson told the news outlet that the project is on track and the chip will provide twice the computing power of Trainium 2, the previous-generation chip.
Amazon did not immediately respond to a request for comment from Seeking Alpha.
Meanwhile, Alphabet’s (NASDAQ:GOOG) (NASDAQ:GOOGL) Google has long been developing its custom AI chip line, called tensor processing units, or TPUs. Google’s next-generation TPU, known as Ironwood, is expected to be manufactured in small quantities by the end of this year, with most of its mass production slated for next year, the report added, citing people familiar with the matter.
Google did not immediately respond to a request for comment from Seeking Alpha.
Google has faced a different type of issue. Last year, Google started collaborating with Taiwan’s MediaTek on the design of its next-generation TPUs. However, the collaboration suffered a setback when key people on the MediaTek team responsible for the TPU’s networking technology, a vital part of AI processing that allows multiple chips to work together in unison, left to join Nvidia, the report noted, citing people familiar with the matter.
MediaTek did not immediately respond to a request for comment from Seeking Alpha.
Nvidia CEO Jensen Huang told reporters at an Nvidia developer conference earlier this month that many of the rival chip projects big tech companies are pursuing would be abandoned. “What’s the point of building an ASIC if it’s not going to be better than the one you can buy?” Huang said, using the industry term for the chips, application-specific integrated circuits. An Nvidia spokesperson referred to these remarks when asked for comment about its customers’ in-house chip efforts, according to the report.
Nvidia did not immediately respond to a request for comment from Seeking Alpha.
In November 2023, Microsoft unveiled its Maia 100 chip, aimed at AI workloads. At the time, the company said the chip would be available to Azure cloud customers and was being tested with Bing and Office AI products. OpenAI, which is backed by billions of dollars of investment from Microsoft, has been testing the Maia 100.
The Information report said that Microsoft has relied on the Maia 100 mainly for internal testing rather than real-world usage. The chip is not powering any of the company’s AI services, according to several current and former Microsoft employees. This is largely because it was originally designed in 2019, before OpenAI released ChatGPT, and was meant for image processing rather than generative AI, the report added, citing a person involved in the project.
After the release of the Maia 100 in 2024, Microsoft launched a strategy to build three successor chips, dubbed Braga, Braga-R and Clea, with plans to deploy them in its data centers in 2025, 2026 and 2027, respectively, the report noted.
However, Braga’s delay to 2026 raises concerns about whether the company will be able to launch the remaining chips on time. Microsoft’s future road map and timelines for AI chips had not been previously reported.
The three chips are all meant for inference — the process of applying a trained model to new information to get responses or make decisions — according to the report.
Microsoft had planned to design a chip for training AI models but canceled the effort in early 2024, the report added.
Midway through the development of Braga, which is expected to be introduced publicly as the Maia 200, Microsoft asked for design changes to incorporate new features requested by OpenAI, the report added. Those changes made the chip unstable in simulations and set the project back several months as engineers worked out bugs.
Despite the significant changes to Braga’s design, Microsoft executives did not move the year-end deadline for completing the chip’s design. The deadline created so much stress that around one-fifth of the people on some of the teams working on the chip left, the report added, citing current and former team members.
Microsoft’s AI chips will not come close to competing with Nvidia’s until at least the Maia 300, dubbed Clea, which is based on a new design distinct from Braga’s, the report noted.
Until then, the performance of the Maia chips relative to their high electricity costs will lag that of comparable chips from Nvidia, the report added.