From Bloomberg, March 20:
Since ChatGPT arrived, Nvidia and its key manufacturing partners have dominated the market for AI chips. Will it last?
Every time you type a question into ChatGPT, you are, probably without knowing it, making several monopolies richer.
Actually, it’s no different if you use one of ChatGPT’s many competitors. Nearly all of them use chips from Nvidia Corp., which sells around 92% of the particular components — called artificial intelligence accelerators — that make chatbots function. Nvidia relies on a trio of partners to produce its semiconductors: South Korea’s SK Hynix Inc., Taiwan Semiconductor Manufacturing Co. and ASML Holding NV of the Netherlands. Each supplier has a market position almost as fortified as Nvidia’s, or even more so.
In many industries, that kind of dominance might have antitrust watchdogs threatening a breakup. In technology, it’s long been accepted that important innovations can lead to companies dominating their markets and then staying on top for years by exploiting the laws of scale. It happened with mainframe and personal computers, web browsers, search engines, social networks and mobile software.
When some of those earlier monopolies ended, it was largely because rivals brought them down, not because government regulators took them apart, Standard Oil-style. It’s possible that AI will have its “iPhone moment,” when a new invention renders companies at the top of their market obsolete almost overnight. It’s also conceivable that AI simply won’t have the world-changing economic impact that the industry promises, ending the gold rush. For now, the monopolies of artificial intelligence are taking a star turn.
*****
Never has so much money been riding on the outcome. Together, Nvidia and its three critical partners had a combined market value of more than $4 trillion as of mid-March. Nvidia alone accounted for 6% of the S&P 500 Index of leading US stocks. TSMC and ASML have become the most valuable companies in their home countries. Those valuations are predicated largely on the idea that these companies will have this growth market to themselves for years to come.
Yet the artificial-intelligence boom is already proving unpredictable and erratic, and Nvidia’s rivals are spending fortunes to develop chips that can match its products on power, speed and reliability.
Here’s more on the new generation of tech monopolies, and the threats they face.
How they got so big
For decades, Nvidia was known for gaming, not AI. It designs graphics processing units, or GPUs — components that render realistic images in video games such as Call of Duty. The GPUs use a technique known as parallel computing, in which many processors work on pieces of a computational problem simultaneously, much faster than a traditional computer working through them one at a time. A little over a decade ago, some enterprising researchers discovered these chips were well suited for deep learning, a type of computing loosely modeled on the human brain that became the foundation for today’s ChatGPT boom.
Nvidia Chief Executive Officer Jensen Huang made an early bet on some of those researchers, delivering a set of chips costing $129,000 to nonprofit startup OpenAI in 2016, when it was a small lab. “At first, it was almost an accident,” said Jason Furman, an economic policy professor at Harvard University. “Then they shrewdly capitalized on that accident.”
Nvidia had already built an extensive library of code for its GPU chips centered on a programming platform called CUDA, for Compute Unified Device Architecture, which became the only practical way to use its chips for the new type of computing. Because so many AI engineers grew accustomed to using CUDA, alternative chips developed by well-funded startups and Google failed to make a dent. Even Intel Corp., the one-time chip king, couldn’t keep up.
On March 18, Huang introduced Nvidia’s latest line of more powerful chips, along with related software, called Dynamo, that he described as “the operating system of an AI factory.”
Who does what
For Nvidia GPUs to work, they need a powerful memory chip — an integrated circuit designed to hold data while a processor works through it. For this, Nvidia turns to SK Hynix, a Korean company that controls about four-fifths of the market for the most powerful high-bandwidth memory (HBM) chips. SK Hynix long labored in the shadow of local rival Samsung Electronics Co. Then, in 2019, its engineers devised a novel way to package the memory chips used in the data-heavy processing of AI without overheating. Samsung has yet to catch up....
....MUCH MUCH MORE