Nvidia’s revenue doubles on demand for AI chips, and more growth is expected

In a sign that the technology industry’s next big boom is gaining steam, Nvidia on Wednesday predicted continued rapid growth in the already booming demand for the chips it makes for building artificial intelligence systems.

The Silicon Valley company’s chips, known as graphics processing units, or GPUs, are used to build the vast majority of AI systems, including the popular ChatGPT chatbot. Technology companies ranging from startups to industry giants are struggling to get their hands on them.

Nvidia said heavy demand from cloud computing services and other customers for chips to power artificial intelligence systems caused revenue for the second quarter, which ended in July, to jump 101% from a year earlier, to $13.5 billion, while profit rose more than ninefold to roughly $6.2 billion.

That was better than Nvidia expected in late May, when its $11 billion revenue estimate for the quarter surprised Wall Street and helped push Nvidia’s market value past $1 trillion for the first time.

Nvidia’s forecasts and soaring market value have become a symbol of the mounting excitement around artificial intelligence, which is transforming many computing systems and the way they are programmed. They also sharpened interest in what Nvidia would say about chip demand for its current quarter, which ends in October.

Nvidia projected third-quarter sales of $16 billion, nearly three times the level a year ago and $3.7 billion more than the average analyst forecast of about $12.3 billion.

Chipmakers’ financial performance is often seen as a harbinger for the rest of the tech industry, and Nvidia’s strong results could reignite enthusiasm for technology stocks on Wall Street. While other tech companies like Google and Microsoft are spending billions on AI and making relatively little from it so far, Nvidia is already profiting from it.


Nvidia CEO Jensen Huang said major cloud services and other companies are investing to bring Nvidia’s AI technology to every industry. “The trend is very clear now that we’re seeing a platform shift,” he said during a conference call with analysts.

Nvidia’s share price rose more than 9 percent in after-hours trading.

Until recently, Nvidia earned the largest share of its revenue from sales of GPUs used in video games. But AI researchers began using those chips in 2012 for tasks like machine learning, a trend Nvidia has seized on over the years by adding features to its GPUs and software that reduce the labor required of AI programmers.

Chip sales to data centers, where most AI training is done, are now the company’s biggest business. Nvidia said revenue from that business grew 171 percent to $10.3 billion in the second quarter.

Patrick Moorhead, an analyst at Moor Insights & Strategy, said the rush to add generative AI capabilities has become a necessity for corporate heads and boards. Right now, he said, Nvidia’s only constraint is its struggle to supply enough chips — a gap that could create opportunities for major chip companies such as Intel and Advanced Micro Devices and startups such as Groq.

Nvidia’s soaring sales contrast sharply with the fortunes of some of its chip-making peers, which have been hurt by weak demand for personal computers and data center servers used for general-purpose tasks. Intel said in late July that second-quarter revenue fell 15%, although the results were better than Wall Street had expected. Advanced Micro Devices’ revenue fell 18 percent in the same period.


Some analysts believe that spending on hardware for AI, such as Nvidia chips and the systems built around them, is pulling money away from spending on other data center infrastructure. The market research firm IDC estimates that cloud services will increase their spending on AI server systems by 68 percent over the next five years.

While Google, Amazon, Meta, IBM and others have also produced AI chips, Nvidia today accounts for more than 70% of AI chip sales and holds an even bigger position in training generative AI models, according to the research firm Omdia.

Demand is particularly high for the H100, Nvidia’s newest GPU for AI applications, which began shipping in September. Companies large and small are scrambling to find supplies of the chips, which are made with an advanced production process and require equally sophisticated packaging that combines the graphics processor with specialized memory chips.

Nvidia’s ability to ramp up deliveries of the H100 depends largely on Taiwan Semiconductor Manufacturing Company, which handles that packaging as well as fabricating the GPUs.

Industry executives expect the shortage of H100s to extend through 2024, a problem for AI startups and cloud service companies hoping to sell computing services built on the new GPUs.

Mr. Huang said the company is working hard with its production partners to bring more chips to the market. “Supplies will rise significantly during the rest of this year and next year,” he added.
