AMD dips after unveiling new AI chip, CPU at showcase event
Advanced Micro Devices’ (NASDAQ:AMD) stock fell about 4% on Thursday as CEO Lisa Su unveiled new artificial intelligence processors at the company’s Advancing AI 2024 event in San Francisco, stepping up efforts to challenge market leader Nvidia (NVDA).
The company showed off its new Turin EPYC data center CPUs and its Instinct MI325X AI accelerator. Su also unveiled the Ryzen AI PRO 300 Series, which the company said powers the first Microsoft (MSFT) Copilot+ laptops designed for the enterprise.
AMD touted the MI325X as outperforming Nvidia’s H200 HGX on several parameters. Meanwhile, the company called its fifth-generation EPYC the world’s best CPU for cloud, enterprise, and AI workloads.
AMD said the fifth-generation EPYC 9965 is the industry’s highest-performing server CPU, citing benchmarks in which it beat the company’s own fourth-generation EPYC 9754 and Intel’s (INTC) fifth-generation Xeon 8592+.
At the Computex 2024 show in Taipei in June, the company introduced its latest AI processors and provided a roadmap to develop AI chips over the next two years. Su had unveiled the AMD Instinct MI325X accelerator, and the AMD Instinct MI350 series accelerators based on AMD CDNA 4 architecture.
At the Advancing AI event, AMD reiterated its commitment to its GPU roadmap: the MI325X in 2024, the MI350 in the second half of 2025, and the MI400 in 2026. These would potentially compete with Nvidia’s upcoming Blackwell GPUs. At Computex, AMD had revealed plans to deliver performance and memory leadership on an annual cadence for generative AI, mirroring rival Nvidia’s move to shorten its release cycle to an annual basis.
Su noted that the data center AI accelerator market is expected to grow from $45 billion in 2023 to $500 billion in 2028, a compound annual growth rate of roughly 60%. She had previously forecast the market to reach $400 billion in 2027.
AMD also invited several speakers from its partners, including tech giants Microsoft (MSFT), Meta Platforms (META), and Oracle (ORCL).
Kevin Salvadore, VP of Infrastructure and Engineering at Meta, noted that the company has deployed over 1.5 million EPYC CPUs.
The race to develop generative AI products has driven rising demand for the advanced chips used in AI data centers. Companies developing large language models, or LLMs, rely on advanced processors to train these models, which require high computational power, among other resources.