From the journal Science:
About 40% of economics experiments fail replication survey
When a massive replicability study in psychology was published last year, the results were, to some, shocking: 60% of the 100 experimental results failed to replicate. Now, the latest attempt to verify findings in the social sciences—this time with a small batch from experimental economics—also finds a substantial number of failed replications. Following the exact same protocols of the original studies, the researchers failed to reproduce the results in about 40% of cases.
"I find it reassuring that the replication rate was fairly high," says Michael L. Anderson, an economist at the University of California, Berkeley, not involved with the study. But he notes that most of the failures came from studies using a 5% "p value" cut-off for statistical significance, suggesting "what some realize but fewer are willing to discuss: The accepted standard of a 5% significance level is not sufficient to generate results that are likely to replicate."
Psychology's high-profile replication efforts have triggered policy changes at some scientific journals and modified priorities at many funding agencies. But the overall failure rate has also been called into question, because most of the original studies were reproduced only once, often without strictly following the initial protocol.
The latest attempt at social science do-overs—a replication of 18 studies in experimental economics—went to great lengths to avoid such criticisms. "We did not want to pick out studies on any subjective basis," says lead author Colin Camerer, an economist at the California Institute of Technology in Pasadena. Instead, the team set its criteria based on the experimental setup and whether a study produced one central result. They combed through papers published from 2011 through 2014 in two of the field's top journals, American Economic Review and the Quarterly Journal of Economics, and came up with a set of 18 that met their criteria.
"Our approach was very lawyerly," Camerer says. Before starting, the researchers drew up a three-page "replication report" for each study, spelling out how it would be executed and interpreted. The report was sent out to the original authors for feedback. "The idea was that in retrospect nobody could say we were not clear about the replications [or that we] were being unfair." And it all went smoothly, he says. "To our pleasant surprise, basically all of them were a combination of flattered and happy we were going to replicate their study."
Eleven of the 18 economic replications succeeded, they report today in Science. "Our takeaway is that the replication rate is rather good," says Camerer, noting that the study topics from the successful replications reflect "most of the things we study in experimental economics [that] are replicated over and over: Do prices move toward where supply meets demand? Are there ‘price bubbles’ in artificial markets? Do people contribute in ‘public goods’ where spending some of your own money helps the group?"
"The authors were fair and collegial," says Homa Zarghamee, an economist at Barnard College in New York City whose 2011 study failed to replicate. "They took great care to exactly replicate our study methodologically," she says. But she adds that the failure doesn't mean the results from the original study were a false positive....MORE
*I may have gone a bit further than just mentioning. From June 17, 2014:
"Most Financial Economics Research is ‘Likely False’"
Reproducibility is pretty much the cornerstone of science. And yet some Bozo can come out and say:
People on all sides of the recent push for direct replication—a push I find both charming and naive—are angry...
...and keep his pathetic little job. As the young people used to phrase the rejoinder: L
By the way, that was James Coan, who calls himself Dr., although he apparently didn't have the intellectual horsepower to become a Chiropractor or D.D.S., writing in the journal Medium.
Rather than the two honorable professions named above, he's a freakin' Associate Professor of Clinical Psychology at the University of Virginia.
See, the thing is, if what one is writing about can't be reproduced, that kind of writing is called 'Literature'.
And, although gentle reader probably doesn't care, yes, I know the difference between replication and reproducibility.
With that note on the current state of the so-called soft sciences, here's Barron's Focus on Funds with a much more upbeat post:
You might have heard of the study “Why Most Published Research Findings are False.” But who knew Ph.D.s had this much righteous indignation?...MORE
Two months back, we heard the argument that tinkering with investment backtests amounts to fraud.
If interested see also the Federal Reserve paper embedded in October 2015's "Ha! Ahead of the 2015 Economics Nobel, The Federal Reserve Proves Economics Is NOT A Science"