Wednesday, March 19, 2025

Nvidia GTC Financial Analysts Q&A Transcript (NVDA)

Following is the (unembeddable) link to the video.

From Investing.com, March 19, 2025:

NVIDIA at GTC Financial Analyst Q&A: AI Infrastructure Expansion

On Wednesday, 19 March 2025, NVIDIA Corporation (NASDAQ: NVDA) held its GTC Financial Analyst Q&A, presenting a strategic shift towards AI infrastructure for cloud, enterprise IT, and robotics. Led by CEO Jensen Huang, the discussion highlighted NVIDIA’s ambitions to modernize enterprise IT and dominate the AI landscape. While the company aims for growth in AI infrastructure, it also faces challenges such as economic downturns and tariffs.

Key Takeaways

  • NVIDIA is focusing on AI infrastructure, partnering with Dell, HPE, and Cisco to transform enterprise IT.
  • The company’s robotics business, valued at $5 billion, is expanding rapidly with key partnerships, including GM.
  • NVIDIA anticipates improved gross margins with the ramp-up of the Grace Blackwell architecture.
  • The company is preparing for potential tariffs by planning onshore manufacturing in Arizona.
  • NVIDIA aims to maintain dominance over custom ASICs with its comprehensive AI platform.

Financial Results

  • Gross margins are expected to improve as the Grace Blackwell architecture scales up over the next three to four years.
  • NVIDIA foresees large AI factory projects, with investments potentially reaching hundreds of billions of dollars.
  • Data center spending is projected to reach a trillion dollars.

Operational Updates

  • NVIDIA is expanding its focus to include AI infrastructure for cloud, enterprise IT, and robotics, addressing computing, networking, and storage.
  • The emphasis on "AI factories" marks a shift towards single-function data centers dedicated to AI.
  • The company’s robotics business, valued at $5 billion, includes self-driving cars and robotic warehouses.

Future Outlook

  • NVIDIA is preparing for onshore manufacturing, leveraging TSMC’s investment in Arizona to mitigate potential tariff impacts.
  • The company laid out a three-year roadmap, emphasizing its role as an infrastructure provider rather than a consumer products business.
  • Companies are expected to increase their focus on AI investment if a recession occurs.

Q&A Highlights

  • NVIDIA’s GPUs have a longer lifecycle, lasting three to four years longer than competitors, and remain useful for data processing.
  • The company plans to continue using copper for NVLink as long as possible, with a shift to silicon photonics if necessary.
  • Homogeneous clusters are favored for higher data center performance, as every computer is fungible (see the sketch below).
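
To make the fungibility point concrete, here is a minimal, hypothetical scheduling sketch in Python (the node and job labels are invented; this is not NVIDIA's scheduler): with identical nodes any job can land on any free node, while specialized nodes strand capacity even when the cluster is not full.

    # Illustrative only: why homogeneous nodes are easier to keep busy.
    from collections import Counter

    def place(jobs, nodes):
        """Greedily place jobs; each job must run on its matching node type."""
        free = Counter(nodes)
        placed = 0
        for need in jobs:
            if free[need] > 0:
                free[need] -= 1
                placed += 1
        return placed

    # Homogeneous cluster: every node is the same, so every job fits somewhere.
    print(place(["gpu"] * 6, ["gpu"] * 6))        # 6 of 6 jobs placed

    # Mixed cluster: jobs pinned to specific node flavors strand capacity.
    nodes = ["train_node"] * 4 + ["infer_node"] * 2
    jobs = ["train_node"] * 2 + ["infer_node"] * 4
    print(place(jobs, nodes))                     # 4 of 6 placed; 2 nodes sit idle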

For a detailed understanding, please refer to the full transcript below.

Full transcript - GTC Financial Analyst Q&A:

Jensen: Sorry I’m late. I was on TV. I’m just kidding. Chuck and I were on TV.

Chuck Robbins, Cisco.

Unidentified speaker: All good. And also on Cramer this morning. So just a few interviews, right?

Jensen: Yes. That was fun.

Unidentified speaker: Okay. Great to see everybody, both yesterday as well as last night at our cocktail hour. This is an opportunity to speak with Jensen and really talk about what this meant to our investor community in terms of our announcements at GTC. I kindly remind you to look at our disclosure statement, the fine print in front of us, and then I want to make sure there’s one announcement for you all. Toshiya Hari is here with us, but he is with us as our new lead of investor relations.

He started just about yesterday and he’s now been here a good forty-eight hours, so please make sure you ask him a ton of questions. We’re really pleased to bring him on board here to California, with his many, many years working in semis at Goldman. Truly, truly excited to have him as part of the team. With that, I’m gonna turn the mic over to Jensen for some opening remarks.

Jensen: Good morning. Great to see all of you. Let’s see. We announced a whole lot of stuff yesterday, and, let me put it all in perspective. The first is, as you know, everybody’s expecting us to build AI infrastructure for cloud.

That, I think, everybody knows. And the good news is that the understanding of R1 was completely wrong. That’s the good news. And the reason for that is that reasoning should be a fantastic new breakthrough. Reasoning includes better answers, which makes AI more useful, solving more problems, which expands the reach of AI, and, of course, from a technology perspective, requires a lot more computation.

And so the computation demand for reasoning AIs is much, much higher than the computation demand of one-shot pre-trained AI. And so I think everybody now has a better understanding of that. That’s number one. Number two, and so the first thing is inference, Blackwell is incredibly good at it, building out AI clouds, the investments of all the AI clouds continue to be very, very high. The demand for computing continues to be extremely high.
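
A rough, back-of-the-envelope way to see the scaling Jensen is describing: generation cost grows roughly with the number of tokens produced (a common rule of thumb is about two times the parameter count in floating-point operations per generated token), so a model that reasons through thousands of tokens before answering needs many times the compute of a one-shot reply. The model size and token counts below are assumptions for illustration, not figures from the Q&A.

    # Illustrative arithmetic only; every number here is assumed, not from NVIDIA.
    PARAMS = 70e9                   # assumed model size: 70B parameters
    FLOPS_PER_TOKEN = 2 * PARAMS    # rough forward-pass cost per generated token

    one_shot_tokens  = 300          # assumed length of a direct, one-shot answer
    reasoning_tokens = 6_000        # assumed chain-of-thought plus final answer

    one_shot  = one_shot_tokens * FLOPS_PER_TOKEN
    reasoning = reasoning_tokens * FLOPS_PER_TOKEN

    print(f"one-shot:  {one_shot:.2e} FLOPs per query")
    print(f"reasoning: {reasoning:.2e} FLOPs per query "
          f"(~{reasoning / one_shot:.0f}x more)")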

Okay. So I think that’s the first part. The part that I think people are starting to learn about, and we announced yesterday, I could have done a better job explaining it, and so I’m gonna do it again. In order to bring AI to the world’s enterprise, we have to first recognize that AI has reinvented the entire computing stack. And if so, all of the data centers and all the computers in the world’s enterprises are obviously out of date.

And so just as we’ve been remodernizing the world’s AI clouds, all the world’s clouds for AI, it’s sensible we’re gonna have to re-rack, if you will, reinstall, modernize, whatever words, the world’s enterprise IT. And doing so is not just about a computer, but you have to reinvent computing, networking, and storage. And so I didn’t give it very much time yesterday because we had so much content, but that part, enterprise IT, represents about half of the world’s CapEx. That half needs to be reinvented, and our journey begins now. Our partnership with Dell and HPE and, this morning, the reason why Chuck Robbins and I were on CNBC together, is to talk about this reinvention of enterprise IT.

And Cisco is gonna be an NVIDIA networking partner. I announced yesterday, basically, the entire world’s storage companies have signed on to be NVIDIA’s storage technology and storage platform partners. And, of course, as you know, computing is an area that we’ve been working on for a long time, including building some new modular systems that are much more enterprise friendly. And so, we announced that yesterday: DGX Spark, DGX Station, and all of the different Blackwell systems that are coming from the OEMs.

Okay. So that’s second. So now we’re building AI infrastructure not just for cloud, but we’re building AI infrastructure for the world’s enterprise IT. And the third is robotics. When we talk about robotics, people think robots, and this is a great thing.

It’s fantastic. There’s nothing wrong with that. The world is tens of millions of workers short. We need lots and lots of robots. However, don’t forget the business opportunity is well upstream of the robot.

Before you have a robot, you have to create the AI for the robot. Before you have a chatbot, you have to create the AI for the chatbot. That chatbot is just the last drop, the last end of it. And so in order for us to enable the world’s robotics industry, upstream is a bunch of AI infrastructure we have to go create to teach the robot how to be a robot. Now teaching a robot how to be a robot is, in fact, much harder than even chatbots, for obvious reasons.

It has to manipulate physical things and it has to understand the world physically. And so we have to invent new technologies for that. The amount of data you have to train with is gigantic. It’s not words, it’s video. It’s not just words and numbers, it’s video and physical interactions, cause-and-effect physics.

And so that new adventure we’ve been on for several years, and now it’s starting to grow quite fast. Our robotics business includes self-driving cars, humanoid robotics, robotic factories, robotic warehouses, lots and lots of robotic things. That business is already many billions of dollars. It’s at least $5,000,000,000 today, the automotive industry, and it’s growing quite fast. Okay?

And yesterday, we also announced a big partnership with GM, who’s gonna be working with us across all of these different areas. And so we now have three AI infrastructure focuses, if you will: cloud data centers, enterprise IT, and robotic systems. I would say those three buckets, I talked about those things yesterday. And then foundationally, of course, we spoke about the different parts of the technology, pre-training and how that works, how pre-training works for reasoning AIs, how pre-training works for robotic AI, how reasoning inference impacts computing and, therefore, directly how it impacts our business. And then answering a very big question that a lot of people seem to have, and I never understood why, which is, how important is inference to NVIDIA?....

....MUCH MORE, the analysts have questions.

 Also at Investing.com, the transcript of Mr. Huang's March 18 GTC keynote speech:

NVIDIA at GTC 2025: AI and Accelerated Computing Innovations