Wednesday, December 13, 2017

Markets and Waterfowl: Machine Learning Is Only As Good As the Training It Receives

"When Google was training its self-driving car on the streets of Mountain View, California, the car rounded a corner and
encountered a woman in a wheelchair, waving a broom, chasing a duck. The car hadn’t encountered this before so it stopped and waited."
Hell, I'd stop too.
From Quartz:

AI does not have enough experience to handle the next market crash
Artificial intelligence is increasingly used to make decisions in financial markets. Fund managers empower AI to make trading decisions, frequently by identifying patterns in financial data. The more data the AI has, the more it learns. And with the financial world producing data at an ever-increasing rate, AI should be getting better. But what happens if the data the AI encounters isn't normal, or if it represents an anomaly?

Globally, around 10 times more data (pdf) was generated in 2017 than in 2010. This means that the best-quality data is also highly concentrated in the recent past—a world that has been running on cheap money, supplied by central banks through purchases of safe securities, which is not a “normal” state for the market. This has had a number of effects, from causing a rise in “zombie” firms to creating generational lows in volatility to encouraging unusually large corporate buybacks (pdf).

With so much of the data coming from this era, AI might not know what a “normal” market actually looks like. Robert Kaplan, the president of the Federal Reserve Bank of Dallas, recently pointed out some of the market extremes that exist today. His essay included a caution that growing imbalances in the economy could increase the risk of a rapid adjustment....MUCH MORE
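The excerpt's core point, that a model calibrated only on the recent low-volatility regime has no notion of what a crash looks like, can be made concrete with a toy sketch. Everything below is hypothetical: synthetic "calm era" daily returns stand in for the post-2010 data, and a simple z-score stands in for whatever richer model a fund might actually train.

```python
import random
import statistics

random.seed(0)

# Hypothetical "training set": 2,000 daily returns from a calm, low-volatility
# regime (mean ~0.05%, stdev ~0.5%), loosely mimicking the cheap-money era
# the Quartz piece describes. These numbers are illustrative, not real data.
calm_returns = [random.gauss(0.0005, 0.005) for _ in range(2000)]

mu = statistics.fmean(calm_returns)
sigma = statistics.stdev(calm_returns)

def z_score(r):
    """Distance of a new return from the training mean, in training stdevs."""
    return abs(r - mu) / sigma

# An ordinary calm-regime day sits comfortably inside the learned distribution...
print(round(z_score(0.004), 1))

# ...but a crash-like -8% day, unremarkable by long-run market history, lands
# well over ten standard deviations out: the model has simply never seen
# anything like it, which is the article's "anomaly" problem in miniature.
print(round(z_score(-0.08), 1))
```

The broader model a fund would use is more sophisticated, but the failure mode is the same: any statistic fit exclusively to the recent regime will score a 2008-style move as a near-impossible outlier rather than a known state of the world.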