AMD CEO Launches Nvidia Chip Rival, Gives Stunning Predictions

(Bloomberg) — Advanced Micro Devices Inc., targeting a burgeoning market dominated by Nvidia Corp., unveiled new so-called accelerator chips that it said will be able to run artificial intelligence programs faster than competing products.

The company introduced a much-anticipated lineup called the MI300 at an event held on Wednesday in San Jose, California. CEO Lisa Su also made a startling forecast for the size of the AI chip industry, saying it could rise to more than $400 billion in the next four years. That’s more than double the forecast AMD made in August, showing how quickly expectations for AI hardware can change.

The launch is one of the most significant in AMD’s five-decade history, setting up a showdown with Nvidia in the hot market for AI accelerators. Such chips help develop artificial intelligence models by bombarding them with data, a task they handle more skillfully than traditional computer processors.

AMD is showing growing confidence that its MI300 lineup can win over some of the biggest names in technology, potentially shifting billions in spending toward the company. AMD said customers using the processors will include Microsoft Corp., Oracle Corp. and Meta Platforms Inc.

Nvidia shares fell 1.4% to $458.98 in New York on Wednesday, while AMD shares fell less than 1% to $117.24.

Surging demand for Nvidia chips by data center operators has helped propel its stock this year, pushing the company’s market value past $1.1 trillion. The big question is how long it will keep the accelerator market to itself.

AMD sees an opportunity: Large language models — used by AI chatbots like OpenAI’s ChatGPT — need a lot of computer memory, and that’s where the chipmaker thinks it has an advantage.

AMD’s new chip has more than 150 billion transistors and 2.4 times the memory of Nvidia’s H100 chip, the current market leader. It also has 1.6 times the memory bandwidth, which further boosts performance, AMD said.

Su said the new chip is equivalent to Nvidia’s H100 chip in its ability to train AI programs and much better at inference — the process of running that software once it’s ready for use in the real world.

But Nvidia is developing its own next-generation chips. The H200 will replace the H100 in the first half of next year, providing access to a new, high-speed type of memory. That should match at least some of AMD’s offerings. Nvidia is then expected to come out with a completely new processor architecture later in the year.

AMD’s forecast that AI processors will grow into a $400 billion market underscores the boundless optimism in the AI industry. That compares to $597 billion for the entire chip industry in 2022, according to IDC.

Last August, AMD offered a more modest forecast of $150 billion over the same period. But it will take the company some time to capture a significant portion of this market. AMD has said its revenue from accelerators will exceed $2 billion in 2024, and analysts estimate the chip maker’s total sales will reach about $26.5 billion.

The chips are based on a type of semiconductor called graphics processing units, or GPUs, which are typically used by video gamers for a more realistic experience. Their ability to perform many calculations simultaneously has made them an ideal choice for training artificial intelligence programs.

©2023 Bloomberg L.P.
