Following on from the sentiment expressed in this morning's "Taiwan's National Science and Technology Council (NSTC) Minister Wu Cheng-wen Said Taiwan's TSMC Has A 10-Year Lead In Chips", here's a slightly different take on the not-quite-bleeding-edge parts of the industry, from five weeks ago.
Via Neue Zürcher Zeitung's TheMarket.ch, August 24:
The global semiconductor industry is undergoing a fundamental transformation. Jonathan Goldberg, Partner at Snowcloud Capital and Founder of D2D Advisory, talks about the competition for the fastest chips for artificial intelligence, China’s counter-offensive in the face of US sanctions, and Intel’s chances of survival....
... «In this class of compute we call AI, Nvidia is untouchable,» says industry expert Jonathan Goldberg. «On the other hand, for the stock to justify this valuation, we need to see exciting AI applications.» Goldberg knows what he is talking about. He is a partner at the venture capital firm Snowcloud Capital in Silicon Valley and advises companies in the electronics and semiconductor industries. He also knows his way around China, where he lived for around ten years and has just spent several weeks on a business trip.
In an in-depth interview with The Market NZZ, which has been lightly edited, the industry expert shares his thoughts on Nvidia and the competition for the fastest computer chips. He also comments on Beijing’s counter-offensive in the face of ever tougher US sanctions against the Chinese semiconductor sector, and on Intel’s chances of survival.
These days, everything in the tech sector revolves around artificial intelligence. What’s your approach to AI as a semiconductor investor?
The first thing I try to do when I talk about AI is to avoid using that term altogether. I grew up with science fiction, and I still read a lot of science fiction. So when I hear artificial intelligence I associate it with Star Trek or HAL, the sentient artificial general intelligence computer in 2001: A Space Odyssey; really cool things. The AI we have today is not that. Right now, it’s just linear algebra. It’s matrix multiplication, and doing that kind of computation can be very useful in some ways, but it’s not artificial general intelligence.
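To make the «just linear algebra» point concrete, here is a minimal sketch in Python with NumPy (our illustration, not Goldberg’s; the dimensions are invented toy values) of the core operation inside a neural-network layer: one matrix multiplication followed by a simple nonlinearity. Production models run this same operation at vastly larger scale.

```python
import numpy as np

# One fully connected layer, the basic unit of today's "AI".
# Dimensions here are toy values chosen purely for illustration.
x = np.random.randn(4, 512)     # a batch of 4 input vectors
W = np.random.randn(512, 2048)  # a learned weight matrix
b = np.zeros(2048)              # a learned bias vector

h = x @ W + b                   # the core computation: one matrix multiplication
h = np.maximum(h, 0.0)          # plus a simple nonlinearity (ReLU)

print(h.shape)                  # (4, 2048)
```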
How can the technology be better categorized at its current stage?
I sort of divide AI into three categories. The first is «AI as magic,» which refers to artificial general intelligence: the idea of computers that can think and learn like humans. There are people who believe that’s close. Other people think it will never happen. I don’t know, but I don’t think it’s coming anytime soon. And frankly, if it does happen, everything is going to change so radically that I’m not going to worry about it now.
What are the other two categories?
At the other end of the spectrum is what I call «AI as a feature»: using AI to improve things you are already doing anyway. That’s where most of the AI implementations we’ve seen so far have come into play. For instance, Meta says they improved ad matching by 10% by using AI. That’s a boring headline, but it’s also a lot of money, hundreds of millions of dollars. A few other software companies have said they’ve reduced their cost of service by 5-10%. Again, that doesn’t sound very exciting, but when you add it all up, it’s significant. We see this being played out across the tech sector: AI or neural networks are useful in small, unglamorous ways, and in aggregate, that’s good.
And the third category?
Right in between those two extremes, there’s AI as an application or as a platform. That’s the space Google, Meta, Microsoft, and everybody else are fighting over today: trying to build new things we can do with AI. To me, that falls in the category of «to be determined». ChatGPT came out almost two years ago, and they haven’t done much since. It’s gotten better, but it’s nothing really new or exciting anymore.
What do you mean by that?
ChatGPT is very useful in some niche areas, like copilot coding tools, or for writing things nobody is going to read, like spam. The problem is you still need a fair amount of human intervention. About a month ago, Google published an internal analysis of the adoption of AI coding tools. What they found was that about half of their software programmers were using these tools, and the gross impact was a 20% increase in raw code output. But their programmers then had to spend more time reviewing and editing code, so the net impact was only about a 10% increase in productivity. Again, that’s more like a feature, and beyond that narrow field, there’s not that much it’s useful for. We haven’t found a truly interesting, compelling consumer application for AI yet.
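As a back-of-envelope check on those Google figures, here is a short sketch in Python. The 20% gross and roughly 10% net gains are the cited numbers; the assumption that review and editing consume about half of the gross gain is ours, inferred from the gap between those two figures, not a detail from the study.

```python
# Back-of-envelope reading of the Google figures cited above.
# Assumption (ours, not the study's): extra time spent reviewing and
# editing generated code consumes roughly half of the gross output gain.
gross_output_gain = 0.20       # +20% raw code output among tool users (cited)
review_overhead_share = 0.50   # assumed share of the gain lost to review/editing

net_productivity_gain = gross_output_gain * (1 - review_overhead_share)
print(f"Net productivity gain: {net_productivity_gain:.0%}")  # 10%, matching the cited net figure
```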
However, tech giants such as Microsoft, Google, Meta and Amazon are building up their computing capacities for AI in a major way. Will these investments pay off?
As someone who works in and around semiconductor companies, I love to see those numbers. But there’s a reasonable amount of questioning, not quite skepticism, about the return on these investments. Even for these companies, this is a lot of money. OpenAI, Microsoft and Google, who are probably the leaders, have all announced new software applications using AI, but in every single case they had to pull back the most impressive ones. So it’s reasonable that we start to question what it’s good for. Some people out there are incredibly skeptical about it. They think it’s a pure bubble and it’s all going to end in failure and disaster. I’m not that pessimistic, but we need to start seeing some tangible value here.
The biggest beneficiary of this investment boom is Nvidia. How are you experiencing the euphoria surrounding the stock?
There are two sides. On one hand, in this class of compute we call AI, Nvidia is untouchable. AMD has some stuff, and there’s a group of startups chasing Nvidia, but they’re all far behind. Nvidia has this massive head start, incredible software, and it’s just a very solid story. I don’t see them losing share in that market. On the other hand, for the stock to justify this valuation, we need to see exciting AI applications. If we can get them, Nvidia will benefit, but we need something more; not AGI, but more than what we have today. Then again, if you want to be very cynical about what AI really means, it’s just a transition in the data center: from data centers where Intel is dominant to architectures where Nvidia is dominant. That’s all it is from a cold, hard-cash perspective, and I don’t think that’s going away.
What do you think will happen to Nvidia’s stock over a longer time horizon?
What I just described is sort of the bear case: Nvidia remains the leader in that space, but their market share goes from 80% to 50% or 45%. So they’re still the largest player, but they’re not as dominant. The bull case is that this just keeps going, and Nvidia’s share stays at 80% for the next five years. I think we will probably land somewhere in between those two extremes.
Then again, big tech companies are also increasingly relying on in-house chips, with Google and Amazon being the most advanced. How significant is this trend?
That’s a fair point. It’s much more likely that Nvidia loses share within the hyperscaler space, essentially to its customers instead of to its direct competitors. Today, about 60-70% of AI spend is Nvidia, about 20-30% is Google, and everyone else is small numbers. Google has a significant internal installed base of TPU chips. They’re still buying Nvidia chips, but that will probably go away. But Google, and Meta as well, is special because they control their own software stack. Amazon is different. AWS has to support everyone else’s silicon. They have to supply the silicon their customers want, and what they want is Nvidia. For instance, if a big bank is launching an AI chatbot, they’re not going to learn how to use Amazon’s Trainium and Inferentia environment.
What’s your take on end devices, i.e. AI-optimized PCs and smartphones?
This goes back to the three buckets of AI. All of last year and early this year, when I talked to chip vendors like Qualcomm, AMD or Intel, they kept promising this thing called an AI PC: it’s coming, and it’s going to have AI functionality built into the CPU. But when you asked them questions like «What are consumers going to do with that?» or «Why would I, as a consumer, spend $1 more for an AI PC?» you were always met with an awkward silence. I don’t think we have a good consumer use case for AI PCs yet.
What about the smartphone market?
We’ve had AI on phones for years now. Apple has had what they call neural engines in their A-series processors for six or seven years. It’s useful, it makes taking pictures better, but we didn’t get all excited about that. Nobody would talk about an AI revolution. We’ll see more of that. AI as a feature can make your phone a little bit better, but it’s not going to change the world. Interestingly, when Apple unveiled what they call Apple Intelligence in June, one of the things I noticed was that every single AI tool was integrated into something that already existed. So, AI is a feature, and as far as I know, they’re not planning to charge for any of it.

Let’s broaden our perspective: Which companies are best positioned for AI, considering all stages of value creation in the tech sector?....
....MUCH MORE