
Nvidia’s Q2 earnings prove it’s the big winner in the generative AI boom

Nvidia’s second-quarter earnings, which were reported Wednesday after markets closed, prove there is money to be made — and lots of it — selling the picks and shovels of the generative AI boom.

“A new computing era has begun. Companies worldwide are transitioning from general-purpose to accelerated computing and generative AI,” Nvidia founder and CEO Jensen Huang said in a statement.

Huang isn’t wrong. Nvidia has become the main supplier of the generative AI industry. The company’s A100 and H100 AI chips are used to build and run AI applications, notably OpenAI’s ChatGPT. Demand for these compute-intensive applications has grown steadily over the last year, and infrastructure is shifting to support them.

A number of cloud service providers recently announced plans to adopt Nvidia H100 AI hardware in their data centers, according to Huang, who added that enterprise IT system and software providers also announced partnerships to bring Nvidia AI to every industry.

“The race is on to adopt generative AI,” he said.

Nvidia reported revenue of $13.51 billion in the second quarter, a figure that crushed Wall Street expectations and was double the $6.7 billion it generated in the same period last year. Analysts polled by Yahoo Finance forecast Q2 revenue of $11.22 billion.

Nvidia reported GAAP net income of $6.18 billion, compared to the $656 million it earned in the same year-ago period — upwards of a ninefold gain. Nvidia’s net income skyrocketed even from the first quarter, when it reported earnings of $2.04 billion. Its earnings per diluted share for the quarter were $2.48, up 854% from the same period last year. Analysts polled by Yahoo Finance expected earnings per diluted share of $2.09.
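The headline multiples follow directly from the figures above. As a rough sanity check — a minimal sketch using only the dollar amounts reported here, not anything from Nvidia’s filing itself — the year-over-year math works out like this:

```python
# Sanity check of the growth multiples cited above, using only the
# dollar figures reported in the article (values in billions of USD).

q2_revenue = 13.51          # Q2 total revenue
q2_revenue_year_ago = 6.70  # same quarter last year

q2_net_income = 6.18            # Q2 GAAP net income
q2_net_income_year_ago = 0.656  # same quarter last year

revenue_multiple = q2_revenue / q2_revenue_year_ago
net_income_multiple = q2_net_income / q2_net_income_year_ago

print(f"Revenue: {revenue_multiple:.1f}x year over year")       # ~2.0x ("double")
print(f"Net income: {net_income_multiple:.1f}x year over year")  # ~9.4x ("upwards of a ninefold gain")
```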

The results show how dramatically its business has changed. The company’s gaming unit was once the main driver of revenue. And while gaming is growing — its Q2 revenue was $2.49 billion, up 22% from last year — it’s now overshadowed by its data center unit. Nvidia’s data center business generated $10.32 billion in revenue, up 141% from the previous quarter and up 171% from a year ago.

Huang said earlier this month during a keynote at SIGGRAPH in Los Angeles that the company made an existential business decision in 2018 to embrace AI-powered image processing in the form of ray tracing and intelligent upscaling: RTX and DLSS.

That bet has paid off. And Nvidia has forecast even more growth.

The company forecast revenue of $16 billion for the third quarter, plus or minus 2%.

“The world has something along the lines of about a trillion dollars worth of data centers installed in the cloud,” Huang said during the company’s earnings call Wednesday. “And that trillion dollars of data centers is in the process of transitioning into accelerated computing and generative AI. We’re seeing two simultaneous platform shifts at the same time.”

He said accelerated computing is the most cost-effective, most energy-efficient and most performant way of doing computing today. Now, a new kind of application, enabled by generative AI, has come along.

“This incredible application now gives everyone two reasons to transition to do a platform shift from general purpose computing — the classical way of doing computing — to this new way of doing computing, accelerated computing,” he said.



from TechCrunch https://ift.tt/C0eUdzW
via IFTTT
