From Bloomberg Businessweek, March 14:
No one has benefited more from the AI boom than Nvidia CEO Jensen Huang. With troubling signs ahead, he’s trying to extend the good times.
On a Monday in mid-January, Jensen Huang held a party for a crowd of health-care and tech executives in San Francisco. As about 400 guests filled up the Fairmont hotel’s opulent Gold room, the Nvidia Corp. chief executive officer, wearing his default black leather jacket, worked through a routine of tech-themed dad jokes. “What do you call a robot that’s better at finding your pills than you are?” he asked. “Computer-aided drug discovery!” The night progressed, and Huang downed at least two glasses of red wine, a bit of the harder stuff and—to the amusement of many in attendance—let fly. Huang taunted Stripe Inc. CEO Patrick Collison, a Massachusetts Institute of Technology dropout, for not being as smart as his wife, and told Ari Bousbib, the CEO of health-care software provider Iqvia Inc., that his company’s name looked like “you fell asleep at the keyboard and then sent it in.”
Those who know Huang would recognize the style: self-assured, a bit guileless and goofy enough to come off as either charming or cringe. One key thing has changed, though: the size of the audience. The artificial intelligence boom has made Nvidia a multitrillion-dollar company, even if it’s not quite a household name. (There isn’t consensus on how to say the name. Nvidia’s official brand guidelines suggest pronouncing the first syllable “en,” but some people still use “in,” or even “nuh,” which is clearly wrong.) Still, it’s an undeniable force in global tech. Huang—now the world’s 15th-wealthiest person, according to the Bloomberg Billionaires Index—is constantly on the road evangelizing for Nvidia and AI. Sizable press gaggles have documented him slurping noodles in a Taipei night market; he’s held babies and signed countless autographs, thrown out first pitches at Major League Baseball games, led tech conference crowds in chants, appeared onstage with the CEOs of Goldman Sachs, Meta Platforms and Salesforce, and chatted privately at the White House with President Donald Trump.
Huang’s tendency, even when he’s away from the office and noticeably tipsy, is to always be closing. At the Fairmont, he’d open with a gentle ribbing followed by a considered pitch. He took a dig at Jacob Thaysen, CEO of biotech firm Illumina Inc., for being bald, then praised Illumina for adopting Nvidia’s signature graphics processing units (GPUs), a shift he says will accelerate the process of gene sequencing. After needling the Mayo Clinic’s chief administrative officer, Christina Zorn, for giving up on studying zoology to become a lawyer, Huang boasted that Nvidia chips and software were helping the medical center create AI that could offer diagnoses. Maybe, he suggested, others in the crowd would want a similar tool?
Huang’s world tour has the appearance of a victory lap, but there’s palpable anxiety there, too. He’s well aware that corporate fortunes can change, and he’s watched them do so with brutal speed in the semiconductor industry. There’s a history of booms and busts for tech infrastructure companies because their products tend to become commoditized. The moment of seemingly endless demand for Nvidia’s GPU chips—the brains of modern AI systems—from the world’s biggest cloud computing companies won’t last forever, and Huang is intent on securing a new foothold while his current position is strong.
There are plenty of factors threatening to slow Nvidia down. Competitors seek to undercut it on price, its biggest customers are trying to build their own AI chips, and Trump’s trade wars are complicating things in all sorts of ways. Most of Nvidia’s GPUs are manufactured in Taiwan before being shipped around the world, so they’re especially sensitive to tariffs. Because China is the world’s largest market for chips, special national security designations that restrict sales are also particularly threatening.
Even more alarming is that some investors are worried the AI boom has peaked. Two weeks after Huang’s closed-door event, the world became fixated on a new Chinese-made AI model called DeepSeek-R1, which its developer claims is almost as powerful as anything else out there yet was trained at a fraction of the cost of its US counterparts. Nvidia lost almost $600 billion in market value in a single day, the biggest drop for any company in history.
The stock has yet to recover. On Feb. 26, Nvidia posted quarterly revenue and profit that exceeded expectations and gave a bullish sales forecast for the current quarter. In a conference call with investors that day, Huang explained how models such as DeepSeek could end up expanding the need for Nvidia chips by increasing the overall amount of AI computing. Its share price dropped anyway—then plummeted further the following week as Trump whipsawed markets with his on-again, off-again tariff plans. For the first time since the debut of ChatGPT, which relies on Nvidia’s chips, being an Nvidia skeptic doesn’t seem so crazy.
The next major opportunity for Huang to sell his story comes next week, when Nvidia’s annual conference takes over downtown San Jose, California, and plays host to business partners, startups and other spectators. About 900 companies are giving presentations about their use of Nvidia tech or participating in activities during the six-day event. In a keynote speech, Huang is expected to outline the many directions he’s leading Nvidia in search of the next frontier in AI.
The company declined to make Huang available for this article, but interviews with his executives and business partners suggest he’s stoking a sense of extreme impatience at Nvidia. After a yearslong building boom in AI infrastructure, he’s getting itchy to see AI applications that matter beyond the tech industry. As Huang will tell anyone who will listen, Nvidia is building not only the chips but the software that will accelerate those shifts in huge fields including health care, logistics, manufacturing and robotics. Such changes would justify the immense investments its existing customers have already made in AI while also making Nvidia invaluable to a much wider range of companies.
“We need to have real applications for AI,” says Aaron Jacobson, a partner at New Enterprise Associates Inc., a venture capital firm that sometimes invests alongside Nvidia in AI startups. “We can’t just be selling picks and shovels if nobody ever finds any gold.” To continue the gold rush metaphor commonly applied to Huang, he doesn’t just want to be a shovel dealer. He wants to lead miners to the fields, sift through the first scoops of dirt and point to where the shiny spots are.
For most of the 30-odd years of its existence, Nvidia lived in the world created by Intel Corp., which was the undisputed leader of the semiconductor industry when Huang quit his job at LSI Logic Corp. to start a chipmaker with two other engineers. Intel’s main product is a central processing unit, a vital component for practically every laptop and server rack. A major virtue of the CPU is its versatility; Huang and his colleagues based their company on the idea that specialized chips would be better at certain tasks, such as film editing and video games. Their GPUs break down tasks into many small chunks, handing them off to an array of smaller processors that work through them in parallel.
In Nvidia’s early days, these tasks consisted almost entirely of producing video game graphics. The reason Nvidia is now worth 28 times more than Intel was Huang’s bet that, eventually, someone else would find a need for this type of chip. Making a GPU operate efficiently while doing more than enabling gamers to play Wolfenstein required physical changes to the chip itself. To help developers take advantage of its chips’ specific attributes, the company released a programming language, CUDA (short for “compute unified device architecture”), in 2006.
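The division of labor the article describes — breaking a job into many small chunks and handing them to an array of processors working in parallel — can be sketched in plain Python. This is an illustrative toy, not Nvidia’s actual CUDA API: the pixel data, chunk sizes, and `brighten` function are all made up for the example, and a thread pool stands in (in miniature) for the thousands of cores on a GPU.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(chunk, amount=40):
    # Per-chunk work: brighten each pixel, clamping to the 0-255 range.
    return [min(p + amount, 255) for p in chunk]

def brighten_parallel(pixels, workers=4):
    # Split the "image" into equal chunks and hand each to a worker --
    # the same fan-out pattern a GPU applies across many small cores.
    size = -(-len(pixels) // workers)  # ceiling division
    chunks = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(brighten, chunks)
    # Stitch the processed chunks back together in order.
    return [p for chunk in results for p in chunk]

print(brighten_parallel([0, 100, 230, 250, 17, 60, 200, 90]))
```

Each chunk’s work is independent of every other chunk’s, which is why adding more workers (or, on real hardware, more GPU cores) speeds the whole job up.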
Gamers, who simply wanted more powerful chips, and investors, who thought Nvidia should focus on selling such chips to those gamers, were perplexed. Huang wasn’t prioritizing the needs of some other group of customers; he was just hoping new markets would someday emerge. Nvidia began pitching the chips for various applications, including so-called high-performance computing, where machines run massively complex calculations to help with, say, oil and gas exploration or weather predictions. That proved to be a lucrative but very small niche, and modern-day AI, which would emerge many years later using completely different methods, is replacing it.
Among the early adopters of CUDA were researchers specializing in a then-out-of-favor branch of computer science known as deep learning. The technique relies on a computing architecture called neural networks, where information is pushed through rows of nodes modeled after the neurons of the human brain. The researchers realized that GPUs worked well in such systems and began designing them around Nvidia’s. When they did so, they saw startling improvements in performance, laying the groundwork for today’s systems, which can interpret images and produce human-sounding prose.
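The fit between neural networks and GPUs comes down to arithmetic: each layer of nodes is essentially one large batch of independent multiply-and-add operations. A toy forward pass, with made-up weights chosen purely for illustration, shows the shape of that work:

```python
import math

def dense_layer(inputs, weights, biases):
    # One layer of nodes: each node sums its weighted inputs, adds a
    # bias, and applies a nonlinearity. Every node's multiply-adds are
    # independent, which is exactly the work a GPU runs in parallel.
    outputs = []
    for node_weights, bias in zip(weights, biases):
        total = sum(w * x for w, x in zip(node_weights, inputs)) + bias
        outputs.append(math.tanh(total))
    return outputs

def forward(x, layers):
    # Push the input through successive layers, as in a neural network.
    for weights, biases in layers:
        x = dense_layer(x, weights, biases)
    return x

# A tiny two-layer network with invented weights (illustrative only).
layers = [
    ([[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1]),  # layer 1: 2 inputs -> 2 nodes
    ([[1.0, -1.0]], [0.0]),                   # layer 2: 2 inputs -> 1 node
]
print(forward([1.0, 2.0], layers))
```

Real systems do this with millions of nodes at once, as giant matrix multiplications — the workload the researchers found Nvidia’s GPUs handled so much faster than general-purpose CPUs.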
All of this initially happened without any prompting from Nvidia. But Huang realized he had a live one. Nvidia began meeting with researchers, many of whom went on to join big tech companies or start their own, and tailoring its products to the needs of the emerging AI industry. Because there wasn’t much money to be made in deep learning yet, Huang had the field to himself.
Nvidia’s AI business started growing slowly, then very, very fast. In 2014 the company first reported how much money it generated from selling chips to data centers, a decent measure of its AI business. At the time, the business accounted for less than 5% of Nvidia’s revenue. It passed 50% for the first time in 2023, and last year it accounted for almost 80%. Nvidia’s earnings reports have become a quarterly update for the entire AI world. Intel, meanwhile, is staring into the void, with expectations growing that it will either be taken over by a competitor or be sold off for parts.
In computer science, technologies are often discussed as layers arranged vertically. Apps make up the top of the stack. Below them are programming languages and other tools for developers. Another layer down is the firmware that controls basic functions of the hardware. At the bottom are the semiconductors, sending electrical signals through minuscule circuits that, in the case of modern computing, number in the hundreds of billions. This is where Nvidia lives.
Huang is interested in moving up a level or two. He and his deputies often talk about Nvidia being in the business not of chips but of computers—a range of whatever hardware and software any customer may want to purchase. “When I think about a computer, I’m not thinking about that chip,” Huang said in 2024 on the tech podcast BG2. “All the software, all the orchestration, all the machinery that’s inside, that’s my computer.”
Nvidia now designs equipment to house and cool its chips and to move information around, along with software libraries and AI models. At trade shows, the company shows off animations of sprawling AI factories seeming to construct themselves out of Nvidia products and services, starting with semiconductors and servers, then extending all the way up to AI models and eerily named “digital human” agents that interact with actual human customers…