From ZDNet:
Google says 'exponential' growth of AI is changing nature of compute
Google software engineer Cliff Young explains how the explosion in deep learning algorithms is coinciding with a breakdown in Moore's Law, the decades-old rule of thumb of progress in computer chips, forcing radical new computer designs.
The explosion of AI and machine learning is changing the very nature of computing, says one of the biggest practitioners of AI: Google.
Google software engineer Cliff Young gave the opening keynote on Thursday morning at the Linley Group Fall Processor Conference, a popular computer-chip symposium put on by venerable semiconductor analysis firm The Linley Group, in Santa Clara, California.
Young said the use of AI has reached an "exponential phase" at the very same time that Moore's Law, the decades-old rule of thumb about semiconductor progress, has ground to a standstill.
"The times are slightly neurotic," he mused. "Digital CMOS is slowing down, we see that in Intel's woes in 10-nanometer [chip production], we see it in GlobalFoundries getting out of 7-nanometer, at the same time that there is this deep learning thing happening, there is economic demand." CMOS, or complementary metal-oxide semiconductor, is the most common material for computer chips.
As conventional chips struggle to achieve greater performance and efficiency, demand from AI researchers is surging, noted Young. He rattled off some stats: the number of machine learning papers listed on the arXiv pre-print server maintained by Cornell University is doubling every 18 months. The number of internal projects focused on AI at Google, he said, is also doubling every 18 months. Most intense of all, the number of floating-point operations needed to run machine learning neural networks is doubling every three and a half months.
All that growth in computing demand adds up to a "Super Moore's Law," said Young, a phenomenon he called "a bit terrifying," "a little dangerous," and "something to worry about."
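To see why Young calls this "Super Moore's Law," it helps to compare the doubling rates directly. Here is a back-of-the-envelope calculation: the 3.5-month figure is from the article, the 18-month Moore's Law baseline is the classic rule of thumb, and the five-year horizon is an arbitrary choice for illustration.

```python
# Back-of-the-envelope comparison of the growth rates Young cites.
# Doubling every d months means a growth factor of 2**(t/d) after t months.

def growth_factor(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months`, given a doubling period in months."""
    return 2 ** (months / doubling_period)

YEARS = 5
months = 12 * YEARS

moore = growth_factor(months, 18.0)       # classic Moore's Law rule of thumb
ml_compute = growth_factor(months, 3.5)   # ML compute demand, per Young's figure

print(f"Over {YEARS} years:")
print(f"  Moore's Law (double every 18 mo):   ~{moore:,.0f}x")
print(f"  ML compute  (double every 3.5 mo): ~{ml_compute:,.0f}x")
print(f"  Gap: ~{ml_compute / moore:,.0f}x more demand than silicon alone supplies")
```

Run over five years, the 18-month doubling yields roughly a 10x gain while the 3.5-month doubling yields gains in the hundreds of thousands, which is the gap Young is describing.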
"Why all this exponential growth?" in AI, he asked. "In part, because deep learning just works," he said. "For a long time, I spent my career ignoring machine learning," said Young. "It wasn't clear these things were going to take off."
But then breakthroughs in things such as image recognition started to come quickly, and it became clear deep learning is "incredibly effective," he said. "We have been an AI-first company for most of the last five years," he said. "We rebuilt most of our businesses on it," from search to ads and many more.
The demand from the Google Brain team that leads research on AI is for "gigantic machines," said Young. For example, neural networks are sometimes measured by the number of "weights" they employ: the learned variables that are multiplied with data as it flows through the network.
Whereas conventional neural nets may have hundreds of thousands of such weights that must be computed, or even millions, Google's scientists are saying "please give us a tera-weight machine," a computer capable of computing a trillion weights. That's because "each time you double the size of the [neural] network, we get an improvement in accuracy." Bigger and bigger is the rule in AI.
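For a rough sense of what a "tera-weight" target means, here is a toy parameter count for a plain fully connected network. The layer sizes are hypothetical, chosen only to illustrate scale, and biases are ignored for simplicity.

```python
# Toy weight count for a plain feed-forward (fully connected) network.
# A dense layer with n_in inputs and n_out outputs has n_in * n_out weights.

def count_weights(layer_sizes):
    """Total connection weights across consecutive dense layers."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical layer sizes for a modest image classifier.
small_net = [784, 512, 512, 10]
small = count_weights(small_net)
print(f"Small net:          {small:,} weights")        # ~0.7 million

# A "tera-weight" machine targets a trillion of these.
tera = 10**12
print(f"Tera-weight target: {tera:,} weights")
print(f"Ratio: ~{tera / small:,.0f}x larger")
```

Even this modest network carries about 670,000 weights, so a trillion-weight model is more than a million times larger, which is why Young frames it as a hardware problem rather than a software one.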
To respond, of course, Google has been developing its own line of machine learning chips, the "Tensor Processing Unit." The TPU, and parts like it, are needed because traditional CPUs and graphics chips (GPUs) can't keep up... MUCH MORE