Lisa Su displays an AMD Instinct MI300 chip while delivering a keynote address at CES 2023 on January 4, 2023 in Las Vegas, Nevada.
David Becker | Getty Images
Meta, OpenAI and Microsoft announced at an AMD investor event on Wednesday that they will use AMD’s latest AI chip, the Instinct MI300X. It’s the biggest sign yet that tech companies are looking for alternatives to the expensive Nvidia graphics processors that are essential to developing and deploying artificial intelligence programs like OpenAI’s ChatGPT.
If AMD’s latest high-end chip proves good enough for the technology companies and cloud service providers that build and deploy AI models when it ships early next year, it could lower the cost of developing AI models and put competitive pressure on Nvidia’s surging AI chip sales.
“All the interest is in big iron and big GPUs for the cloud,” AMD CEO Lisa Su said Wednesday.
According to AMD, the MI300X is based on a new architecture, which often yields significant performance gains. Its most distinctive feature is 192GB of HBM3, a cutting-edge, high-bandwidth memory type that transfers data faster and can accommodate larger AI models.
Su compared the MI300X and the systems built with it directly to Nvidia’s main AI GPU, the H100.
“This performance directly translates to a better user experience,” Su said. “When you ask a model something, you want it to answer faster, especially when the answers get more complicated.”
The main question for AMD is whether companies that have built on Nvidia will invest time and money to add another GPU provider. “It takes work to introduce AMD,” Su said.
AMD told investors and partners on Wednesday that it has improved its software suite, called ROCm, to compete with Nvidia’s industry-standard CUDA software. This addresses a key shortcoming that has been one of the main reasons AI developers have preferred Nvidia.
Price will also be important. AMD didn’t announce pricing for the MI300X on Wednesday, but a comparable chip from Nvidia can cost about $40,000, and Su told reporters that the AMD chip would have to cost less to buy and operate than Nvidia’s to persuade customers to switch.
Who says they will use the MI300X?
AMD MI300X artificial intelligence accelerator.
On Wednesday, AMD said it had already signed up some of the companies with the greatest need for GPUs to use the chip. According to a recent report from research firm Omdia, Meta and Microsoft were the two largest buyers of Nvidia H100 GPUs in 2023.
Meta said it will use MI300X GPUs for AI inference workloads such as processing AI stickers, image editing and operating its assistant.
Microsoft CTO Kevin Scott said the company will offer access to MI300X chips through its Azure web service.
Oracle’s cloud will also use the chips.
OpenAI said it will support AMD GPUs in one of its software products called Triton, which is not a large language model like GPT but is used in AI research to access chip features.
AMD isn’t yet forecasting massive sales for the chip, projecting only about $2 billion in total data center GPU revenue in 2024. Nvidia reported more than $14 billion in data center revenue in the last quarter alone, although that figure also includes chips other than GPUs.
However, AMD says the total market for AI GPUs could rise to $400 billion over the next four years, doubling the company’s previous forecast. This shows how high expectations are and how desirable high-end AI chips have become – and why the company is now drawing investor attention to the product line.
Su also pointed out to reporters that AMD doesn’t believe it has to beat Nvidia to succeed in the market.
“I think it’s clear to say that Nvidia has to account for the vast majority of this right now,” Su told reporters, referring to the AI chip market. “We believe it could be over $400 billion in 2027. And we could get a nice piece of it.”
Source: www.cnbc.com