Google’s Tensor Intriguing, But Nvidia is the Incumbent, Says JP Morgan
J.P. Morgan semiconductor analyst Joseph Moore today reflects on Alphabet’s (GOOGL) announcement last week, at its annual developer conference, Google I/O, that it has been developing in secret a chip of its own to speed up machine learning, called the “Tensor Processing Unit,” or “TPU,” which the company says is already “powering many applications at Google.”
Google has had input, I would note, from luminaries of the chip world, such as Dave Patterson of U.C. Berkeley, who took a sabbatical at the company a year ago, though he’s not mentioned in the Google materials.
Barron’s magazine’s print edition predicted a Google chip in a cover story last October, “Watch Out Intel, Here Comes Facebook.”
Moore writes that Nvidia (NVDA), a company that has lately seen tremendous results selling into the machine-learning market made up of Google, Facebook (FB), and others, is the “incumbent” and the one to beat:
Google “Tensor” machine learning chip raises eyebrows – Arguably a positive for NVIDIA despite competitive concerns, as Google investment validates the importance of specialized chips for AI applications. Google announced this week that it has been building applications around a custom chip called “Tensor” to support machine learning applications. There was minimal architectural detail provided, but there were quotes from Google in the press to the effect that a custom solution will be “more efficient than graphics chips” in those applications....