Tiernan Ray, formerly the head scribe at Barron's Tech Trader Daily, goes deep.
From ZD Net, Jan. 30:
Two expert reports this week from longtime chip observers show why the recent trend to design novel chips for AI will roll on for many years to come.
The march of specialized chips for artificial intelligence continues unabated, and reports from some luminaries of the semiconductor industry point to a broadening out of the movement of machine learning parts....MUCH MORE, including related stories
The well-regarded chip-industry newsletter Microprocessor Report notes this week that cloud computing operators such as Amazon and electronics giants such as Huawei are showing impressive results against the CPUs and graphics processing units (GPUs) that tend to dominate AI in the cloud. (Microprocessor Report articles are available only via subscription to the newsletter.)
And a think-piece this month in the Communications of the ACM, from two legends of chip design, John L. Hennessy and David A. Patterson, explains that circuits for machine learning represent something of a revolution in chip design broadly speaking. Hennessy and Patterson last year received the prestigious A.M. Turing award from the ACM for their decades of work on chip architecture design.
In the Microprocessor Report editorial, the newsletter's principal analyst, Linley Gwennap, describes the rise of custom application-specific integrated circuits for cloud with the phrase "when it rains, it pours." Among the rush of chips are Amazon's "Graviton" chip, which is now available in Amazon's AWS cloud service. Another is the "Kunpeng 920" from Chinese telecom and networking giant Huawei. Huawei intends to use the chips in both its line of server computers and as an offering in its own cloud computing service.
Both Amazon and Huawei intend to follow up with more parts: a "deep-learning accelerator" from Amazon called "Inferentia" and a part for neural network inference, the part of AI where the chip answers questions on the fly, called "Ascend 310." Another one, the "Ascend 910," is a "massive datacenter chip," as Gwennap describes it.
In the case of both Amazon and Huawei, the issue is Intel's Xeon lock on server processors and Nvidia's GPU lock on cloud-based training.
"Cloud-service providers are concerned about Intel's near-100% share of server processors and Nvidia's dominance in AI accelerators," writes Gwennap. "ASICs offer a hedge against price increases or a product stumble from either vendor."
While the ASICs won't easily match Intel's Xeon performance, "The strong performance of the Ascend 910 shows Nvidia is more vulnerable," he writes.
The essay by Hennessy and Patterson takes a bit of a longer view. The problem for the chip industry, they write, is the breakdown of Moore's Law, the famous law of transistor scaling, as well as the breakdown of Dennard Scaling, which held that as transistors shrink their power density stays roughly constant, so chips get more energy-efficient with each generation. At the same time, Amdahl's Law, a rule of thumb that says the bottleneck in processor performance is the sequential, rather than parallel, portion of the work that must be computed, is in full effect. All that means chip design is in something of a crisis, but one that also presents opportunity.
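As a quick numerical aside (our illustration, not from the article, using the standard textbook formulation): Amdahl's Law says overall speedup is 1 / ((1 − p) + p/n), where p is the fraction of work that can be parallelized and n is the number of parallel units. The serial fraction caps the gain no matter how much parallel hardware a chip adds, which is why specialized accelerators attack the parallel portion so aggressively:

```python
def amdahl_speedup(parallel_fraction: float, n_units: int) -> float:
    """Speedup of a task where `parallel_fraction` of the work
    can be spread across `n_units` processing elements."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_units)

# Even with 95% of the work parallelized, speedup saturates near 20x,
# because the 5% serial portion dominates as n grows:
for n in (10, 100, 1000):
    print(n, round(amdahl_speedup(0.95, n), 2))
# 10 6.9
# 100 16.81
# 1000 19.63
```

The limit as n grows without bound is 1 / (1 − p), here 20x, which is the "sequential bottleneck" the essay refers to.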
Basically, chip design has to move away from general-purpose parts, to specialization, they argue....
Previously on this trend:
Jan. 24
Watch Out Nvidia, Xilinx Is Performing (reports, beats, pops) XLNX; NVDA
Xilinx with their field-programmable gate array approach versus Nvidia's more generalist chips is an example of the type of competition experts were predicting NVDA would be facing, some over two years ago, links after the jump...Dec 2018
A Dip Into Chips: "AI Chip Architectures Race To The Edge"
Sept. 2018
"Why Alibaba is betting big on AI chips and quantum computing"
Sept 2018
Hot Chips 2018 Symposium on High Performance Chips
Sept 2018
Chips: "A Rogues Gallery of Post-Moore’s Law Options"
Aug. 2018
Ahead of NVIDIA Earnings: The Last of the "Easy" Comparisons (NVDA)
Beating that Q3 2017 EPS of 90 cents by double, i.e. $1.80 or more, is doable, but AI and data centers will have to pick up the slack from the Q1 and Q2 cryptocurrency bump, which started declining with Bitmain and other miners' use of ASICs rather than GPUs.
Aug 2018
Going forward, the trend toward specialist proprietary chips (see Tesla's development of its own chips, etc.) will leave NVIDIA with a couple of holes in the potential addressable markets it will want to fill.
Additionally, the smaller pups, some still in stealth, are nipping at the big dog's heels, making it more expensive for NVIDIA to maintain their edge in architecture....
Artificial Intelligence Chips: Past, Present and Future
May 2018
"Chipmakers Battle To Power Artificial Intelligence In Cloud Data Centers" (AMD; NVDA; XLNX; INTC)
May 2018
"Xilinx Analyst Day Plays Heavy on AI" (XLNX)
May 2018
"Intel vs. Nvidia: Future of AI Chips Still Evolving" (INTC; NVDA)
Dec. 2017
"'The Ultimate Trading Machine' from Penguin Computing sets Record for Low Latency"
Oct. 2017
"The Natural Evolution of Artificial Intelligence"
"Nvidia CEO is 'more than happy to help' if Tesla's A.I. chip doesn't pan out" (NVDA; TSLA)
And April 2017
We've said NVIDIA probably has a couple year head start but this bears watching, so to speak....
In May 2018 we met a person who knows about this stuff:
AI VC: "We Are Here To Create"
Sometimes the competition is just plain intimidating/scary/resistance-is-futile, smart.
June 2016
Machine Learning: JP Morgan Compares Google's New Chip With NVIDIA's (GOOG; NVDA)
And many more, use the "search blog" box, upper left if interested.