The author of this piece, Tiernan Ray, used to run Barron's 'Tech Trader' and 'Tech Trader Daily' columns. When he assumed those chores after Eric Savitz left we greeted him and wished him well: "Barron's and Journo Tiernan Ray are Class Acts," but he had some gargantuan shoes to fill and I wasn't sure Barron's would remain one of our sources for tech.
And then someone pointed out that Mr. Ray was getting answers out of NVIDIA's Jensen Huang that were head and shoulders above anything the NVDA CEO would tell anyone else, and I started watching for it and, son-of-a-gun, it was like being a fly on the wall. Tiernan knows this stuff and the tech guys recognize it.
Here he is at ZDNet, May 15, 2019:
Xilinx hopes to take a big chunk of the market for semiconductors that process machine learning inference tasks by convincing developers it's not only about neural network performance. It's about the entire application.
Chip maker Xilinx on Tuesday held its annual "analyst day" event in
New York, where it told Wall Street's bean counters what to expect from
the stock. During the event the company fleshed out a little more how it
will go after a vibrant market for data center chips, especially those
for machine learning.
That market is expected to rise to $6.1 billion by 2024 from $1.8 billion in 2020.
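A quick back-of-the-envelope check of what that forecast implies (the dollar figures are from the article; the calculation and variable names are mine):

```python
# Implied compound annual growth rate (CAGR) of the ML-inference chip
# market forecast cited above: $1.8 billion in 2020 rising to
# $6.1 billion by 2024, i.e. four years of compounding.
start_billions = 1.8
end_billions = 6.1
years = 4

cagr = (end_billions / start_billions) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 36% per year
```

That is, the forecast assumes the market more than triples, compounding at roughly 36% a year.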
The focus for Xilinx is a raft of new platform products that take its
capabilities beyond the so-called field-programmable gate arrays, or
FPGAs, that it has sold for decades. That requires selling developers of
AI applications on the notion there's more than just the neural network
itself that needs to be sped up in computers.
Data center is a
small part of Xilinx's overall revenue, at $232 million in the fiscal
year ended in March, out of a total of $3.1 billion in company revenue.
However, it is the fastest-growing part of the company, rising 46% last
year. The company yesterday said data center revenue growth is expected
to accelerate, rising in a range of 55% to 65% this fiscal year, versus
the compounded annual growth of 42% in the period 2017 through 2019.
Xilinx expects to gain ground in machine learning
inference by virtue of "tiles," compute blocks that connect to one
another over a high-speed memory bus, inside the "AI Engines" portion of
its "Versal" programmable chips.
To do so, Xilinx is moving past its heritage in FPGAs, to something more complex. FPGAs contain a vast field of logic gates that can be rearranged, so they can be tuned to a task, with the prospect of higher performance and greater energy efficiency.
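The "rearrangeable logic" idea can be illustrated in miniature: the basic building block of an FPGA is a lookup table (LUT) whose stored bits define which boolean function it computes, so reprogramming the chip is, at bottom, loading different bits into the same hardware. A toy sketch of that idea (mine, not Xilinx's, and greatly simplified):

```python
# Toy model of an FPGA logic element: a 2-input lookup table (LUT).
# The four configuration bits fully determine the boolean function the
# element computes; "reprogramming" is just loading different bits.
def make_lut(config_bits):
    """config_bits[i] is the output for inputs (a, b) where i = 2*a + b."""
    def lut(a, b):
        return config_bits[2 * a + b]
    return lut

and_gate = make_lut([0, 0, 0, 1])  # configured as AND
xor_gate = make_lut([0, 1, 1, 0])  # same structure, reconfigured as XOR

print(and_gate(1, 1))  # -> 1
print(xor_gate(1, 1))  # -> 0
```

A real FPGA wires millions of such elements together through a programmable interconnect, which is what makes tuning the whole fabric to one application possible.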
...
MUCH MORE
Also by Mr. Ray, commentary on the AI company co-founded by Elon Musk:
OpenAI has an inane text bot, and I still have a writing job