Saturday, February 17, 2018

"Scientists Can’t Replicate AI Studies. That’s Bad News"

We're with Popper and Feynman on the overarching premise: If what you're doing isn't falsifiable, if what you're doing isn't replicable, what you're doing isn't science.
And if what you're doing was funded by the public in any way, the law should consider the resulting code to be owned by the public.
There, three different concerns dispensed with in two sentences.

From Futurism:
In Brief
Most AI researchers don't report the source code of the AI programs they use, or the data those programs are trained on. That means other scientists can't reproduce their results, which may make it harder to implement AI more broadly.
The specter of replication
The field of artificial intelligence (AI) may soon have to face a ghost that’s haunted many a scientific field lately: the specter of replication. For a research study to be considered scientifically robust, the scientific method says that it must be possible for other researchers to reproduce its results under the same conditions. Yet because most AI researchers don’t publish the source code they use to create their algorithms, it’s been largely impossible for researchers to do that.

Science magazine reports that at a meeting of the Association for the Advancement of Artificial Intelligence (AAAI), computer scientist Odd Erik Gundersen shared a report that found only six percent of 400 algorithms presented at two AI conferences in the past few years included the algorithm’s code. Only one in three shared the data they used to test their program, and just half shared a summary that described the algorithm with limited detail — AKA “pseudocode.”
Gundersen says that a change is going to be necessary as the field grows. “It’s not about shaming,” he told Science. “It’s just about being honest.”

Harmful secrecy
Replication is essential to proving that the information an experiment produces can be used consistently in the real world, and that it didn’t result randomly; an AI that was only tested by its creators might not produce the same results at all when run on a different computer, or if fed different data. That wouldn’t be very helpful at all if you were asking that AI to do a specific task, whether that’s search for something on your phone or run a nuclear reactor. You want to be assured that the program you’re running will do what you want it to do....MORE
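The article's point about an AI behaving differently "when run on a different computer" often comes down to uncontrolled randomness in training and evaluation. As a minimal illustrative sketch (not from the article, and using only Python's standard library), pinning the random seed is the most basic step toward letting someone else replay a run exactly:

```python
import random

def set_seed(seed: int = 42) -> None:
    # Pin the pseudo-random number generator so that every run of the
    # experiment draws the same sequence of "random" numbers.
    random.seed(seed)

# Two runs with the same seed produce identical draws...
set_seed(42)
first_run = [random.random() for _ in range(3)]

set_seed(42)
second_run = [random.random() for _ in range(3)]
assert first_run == second_run

# ...while an unseeded (or differently seeded) run will generally not,
# which is one way "the same" experiment diverges on another machine.
set_seed(7)
third_run = [random.random() for _ in range(3)]
assert third_run != first_run
```

Real AI pipelines have many more sources of nondeterminism (library versions, hardware, parallelism), which is why sharing the full code and data, not just a seed or pseudocode, matters for the replication the article describes.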