Wednesday, June 5, 2019

Chips: NVIDIA Begins To Embrace the Move Toward More Specialized Chips (NVDA; INTC; AMD; GOOG; XLNX)

See, I told you I wasn't crazy. Something we've been babbling about for a couple of years now.
Some links below.
From MIT's Technology Review:
Hardware design, rather than algorithms, will help us achieve the next big breakthrough in AI. That’s according to Bill Dally, Nvidia’s chief scientist, who took the stage Tuesday at EmTech Digital, MIT Technology Review’s AI conference. “Our current revolution in deep learning has been enabled by hardware,” he said.

As evidence, he pointed to the history of the field: many of the algorithms we use today have been around since the 1980s, and the breakthrough of using large quantities of labeled data to train neural networks came during the early 2000s. But it wasn’t until the early 2010s—when graphics processing units, or GPUs, entered the picture—that the deep-learning revolution truly took off.
“We have to continue to provide more capable hardware, or progress in AI will really slow down,” Dally said.

Nvidia is now exploring three main paths forward: developing more specialized chips; reducing the computation required during deep learning; and experimenting with analog rather than digital chip architectures.

Nvidia has found that highly specialized chips designed for a specific computational task can outperform GPU chips that are good at handling many different kinds of computation. The difference, Dally said, could be as much as a 20% increase in efficiency for the same level of performance.
Dally also referenced a study that Nvidia did to test the potential of “pruning”—the idea that you can reduce the number of calculations that must be performed during training, without sacrificing a deep-learning model’s accuracy. Researchers at the company found they were able to skip around 90% of those calculations while retaining the same learning accuracy. This means the same learning tasks can take place using much smaller chip architectures.
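
For readers who want a concrete picture of what pruning means in practice, here is a minimal NumPy sketch of generic magnitude-based weight pruning: zero out the smallest weights so the multiply-adds involving them can be skipped. It is purely illustrative and is not the study or method Dally described.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude weights, keeping the top (1 - sparsity) fraction.
    Generic magnitude pruning, for illustration only."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Toy example: prune a random 256x256 "layer" to roughly 90% sparsity.
rng = np.random.default_rng(0)
layer = rng.normal(size=(256, 256))
pruned, mask = magnitude_prune(layer, sparsity=0.9)
print(f"Weights kept: {mask.mean():.1%}")  # roughly 10% of the original weights remain
```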

Finally, Dally mentioned that Nvidia is now experimenting with analog computation. Computers store almost all information, including numbers, as a series of 0s or 1s. But analog computation would allow all sorts of values—such as 0.3 or 0.7—to be encoded directly. That should unlock much more efficient computation, because numbers can be represented more succinctly, though Dally said his team currently isn’t sure how analog will fit into the future of chip design.
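To make the digital-versus-analog contrast concrete, here is a toy Python sketch (entirely illustrative, with a made-up noise parameter, not any actual Nvidia design) contrasting storing 0.7 as a 4-bit binary code against storing it directly as a single continuous, slightly noisy level:

```python
import numpy as np

def digital_encode(x, bits=4):
    """Round x in [0, 1] to the nearest of 2**bits levels (fixed-point storage)."""
    levels = 2 ** bits
    code = round(x * (levels - 1))      # integer stored as `bits` binary digits
    return code, code / (levels - 1)    # the bit pattern and the value it represents

def analog_encode(x, noise_std=0.01, rng=None):
    """Store x directly as one continuous quantity (e.g. a charge or conductance),
    at the cost of a small amount of noise."""
    rng = rng or np.random.default_rng(0)
    return x + rng.normal(0.0, noise_std)

x = 0.7
code, digital_value = digital_encode(x, bits=4)
print(f"digital: 4-bit code {code:04b} -> {digital_value:.3f}")
print(f"analog : single noisy level   -> {analog_encode(x):.3f}")
```

The trade-off in the sketch mirrors the one Dally alluded to: the analog level carries more information per storage element, but it comes with noise, which is part of why his team isn't yet sure how analog fits into future chip designs.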

Naveen Rao, the corporate vice president and general manager of the AI Products Group at Intel, also took the stage and likened the importance of the AI hardware evolution to the role that evolution played in biology. Rats and humans, he said, are divergent in evolution by a time scale of a few hundred million years. Despite vastly improved capabilities, however, humans have the same fundamental computing units as their rodent counterparts....
....MORE

Previously on specialized chips:
March 2019
Chips: The Accelerator Wall—A New Problem for a Post-Moore’s Law World (GPU; ASIC; FPGA)
Feb. 2019 
Watch Out NVIDIA: "Amazon, Huawei efforts show move to AI-centric chips continues"
Jan. 24
Watch Out Nvidia, Xilinx Is Performing (reports, beats, pops) XLNX; NVDA

Xilinx, with its field-programmable gate array approach versus Nvidia's more generalist chips, is an example of the type of competition experts were predicting NVDA would face, some of them over two years ago; links after the jump...
Dec 2018
A Dip Into Chips: "AI Chip Architectures Race To The Edge"
Sept. 2018
"Why Alibaba is betting big on AI chips and quantum computing"
Sept 2018
Hot Chips 2018 Symposium on High Performance Chips
Sept 2018
Chips: "A Rogues Gallery of Post-Moore’s Law Options"
Aug. 2018
Ahead of NVIDIA Earnings: The Last of the "Easy" Comparisons (NVDA)
Beating that Q3 2017 EPS, 90 cents, by double, i.e. $1.80 or more, is doable, but AI and data centers will have to pick up the slack from the Q1 and Q2 cryptocurrency bump that started declining with Bitmain and other miners' use of ASICs rather than GPUs.

Going forward, the trend toward specialist proprietary chips (see Tesla's development of its own chips, etc.) will leave NVIDIA with a couple of holes in its potential addressable markets that it will want to fill.

Additionally, the smaller pups, some still in stealth, are nipping at the big dog's heels, making it more expensive for NVIDIA to maintain its edge in architecture....
Aug 2018
Artificial Intelligence Chips: Past, Present and Future
May 2018
"Chipmakers Battle To Power Artificial Intelligence In Cloud Data Centers" (AMD; NVDA; XLNX; INTC)
May 2018
"Xilinx Analyst Day Plays Heavy on AI" (XLNX)
May 2018
"Intel vs. Nvidia: Future of AI Chips Still Evolving" (INTC; NVDA)
Dec. 2017
“'The Ultimate Trading Machine' from Penguin Computing sets Record for Low Latency"
Oct. 2017
"The Natural Evolution of Artificial Intelligence" 

"Nvidia CEO is 'more than happy to help' if Tesla's A.I. chip doesn't pan out" (NVDA; TSLA)

And April 2017
We've said NVIDIA probably has a couple-year head start, but this bears watching, so to speak....
***
The only reason for Tesla to do this is that NVIDIA's chips are general purpose, whereas specialized chips are making inroads in areas such as crypto mining (ASICs), machine learning (Google's Tensor Processing Units, or TPUs), and Facebook's hardware efforts.

In May, 2018 we met a person who knows about this stuff:

AI VC: "We Are Here To Create"
Sometimes the competition is just plain intimidating/scary/resistance-is-futile, smart.

June 2016
Machine Learning: JP Morgan Compares Google's New Chip With NVIDIA's (GOOG; NVDA)

And many more; use the "search blog" box, upper left, if interested.