Sunday, September 3, 2023

"Instinct Can Beat Analytical Thinking"

If we didn't use heuristics and rules of thumb, we'd never get anything done. It would simply take too long to formally analyze every decision we face.

From Justin Fox at the Harvard Business Review, June 20, 2014:

Researchers have confronted us in recent years with example after example of how we humans get things wrong when it comes to making decisions. We misunderstand probability, we’re myopic, we pay attention to the wrong things, and we just generally mess up. This popular triumph of the “heuristics and biases” literature pioneered by psychologists Daniel Kahneman and Amos Tversky has made us aware of flaws that economics long glossed over, and led to interesting innovations in retirement planning and government policy.

It is not, however, the only lens through which to view decision-making. Psychologist Gerd Gigerenzer has spent his career focusing on the ways in which we get things right, or could at least learn to. In Gigerenzer’s view, using heuristics, rules of thumb, and other shortcuts often leads to better decisions than the models of “rational” decision-making developed by mathematicians and statisticians. At times this belief has led the managing director of the Max Planck Institute for Human Development in Berlin into pretty fierce debates with his intellectual opponents. It has also led to a growing body of fascinating research, and a growing library of books for lay readers, the latest of which, Risk Savvy: How to Make Good Decisions, is just out.

During a visit to HBR’s New York office, Gigerenzer discussed his work for an Ideacast podcast, which you can listen to here:

*****

We then continued talking well past the Ideacast time limit. What follows is a much-edited rendition of the full conversation.

HBR: Most of us are used to hearing about how bad we are at making decisions under conditions of uncertainty, and how our intuitions often lead us astray. But that’s not entirely the direction your research has gone in, correct?

Gerd Gigerenzer: I always wonder why people want to hear how bad their own decisions are, or at least, how dumb everyone else is. That’s not my direction. I’m interested in helping people make better decisions, not in telling them that they have these cognitive illusions and are basically hopeless when it comes to risk.

But a lot of your research over the years has shown people making mistakes.

Just imagine: a few centuries ago, who would have thought that everyone would be able to read and write? Now, today, we need risk literacy. I believe that if we teach young people, children, the mathematics of uncertainty, statistical thinking, instead of only the mathematics of certainty – trigonometry, geometry, all beautiful things that most of us never need – then we can have a new society that is better able to deal with risk and uncertainty.

By teaching people how to deal with uncertainty, do you mean taking a statistics class, studying decision theory?

If you’re in a world where you can calculate the risk, then statistical thinking and logic are enough. If you go into a casino and play roulette, you can calculate how much you will lose in the long run. But most of our problems are about uncertainty. So, for instance, during the financial crisis it was said that banks play in the casino. If only that were true: then they could calculate the risks. But they play in the real world of uncertainty, where we do not know all the alternatives or the consequences, and where the risks are very hard to estimate because everything is dynamic, there are domino effects, surprises happen, all kinds of things happen.
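To make the roulette point concrete, here is a minimal sketch of the "calculable risk" case, assuming a European single-zero wheel and a repeated even-money bet (my illustration, not something from the interview). The odds are fully known, so the long-run loss can be computed exactly and verified by simulation.

```python
# The "calculable risk" case: an even-money bet on a European (single-zero)
# roulette wheel. The odds are known, so the long-run loss is known.

import random

P_WIN = 18 / 37          # 18 winning pockets out of 37 for an even-money bet
EXPECTED_VALUE = P_WIN * 1 + (1 - P_WIN) * (-1)   # = -1/37, about -2.7% per unit staked

def simulate(n_spins: int, seed: int = 0) -> float:
    """Average result per spin of repeatedly betting one unit on red."""
    rng = random.Random(seed)
    total = sum(1 if rng.random() < P_WIN else -1 for _ in range(n_spins))
    return total / n_spins

if __name__ == "__main__":
    print(f"exact expected value per spin: {EXPECTED_VALUE:.4f}")
    print(f"simulated over 1,000,000 spins: {simulate(1_000_000):.4f}")
```

Nothing comparable is available in the world of uncertainty Gigerenzer describes, where the relevant alternatives and consequences are not even enumerable.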

Risk modeling in the banks grew out of probability theory.

Right, and that’s the reason why these models fail. We need statistical thinking for a world where we can calculate the risk, but in a world of uncertainty, we need more. We need rules of thumb called heuristics, and good intuitions. That distinction is not made in most of economics and most of the other cognitive sciences, and people believe that they can model or reduce all uncertainty to risk.

You tell a story that I guess is borrowed from Nassim Taleb, about a turkey. What’s the problem with the way that turkey approached risk management?

Assume you are a turkey and it’s the first day of your life. A man comes in and you think, “He will kill me.” But he feeds you. The next day, he comes again and you fear, “He will kill me,” but he feeds you. The third day, the same thing. By any standard model, the probability that he will feed you and not kill you increases day by day, and on day 100 it is higher than ever before. And it’s the day before Thanksgiving, and you are dead meat. So the turkey confused the world of uncertainty with one of calculated risk. And the turkey illusion is probably found not so much in turkeys as in people.
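One common way to formalize "the probability increases day by day" is Laplace's rule of succession; the sketch below uses that assumption (my choice of model, not one Gigerenzer names here). After an unbroken run of n feedings, the estimated probability of being fed tomorrow is (n + 1) / (n + 2), so the turkey's confidence peaks on the worst possible day.

```python
# The turkey's inference under Laplace's rule of succession: confidence keeps
# rising right up to the day before Thanksgiving.

def prob_fed_tomorrow(days_fed_so_far: int) -> float:
    """Rule-of-succession estimate after an unbroken run of feedings."""
    return (days_fed_so_far + 1) / (days_fed_so_far + 2)

for day in (1, 10, 50, 100):
    print(f"after day {day:3d}: P(fed tomorrow) = {prob_fed_tomorrow(day):.3f}")

# The model is only valid if the process is stable (the farmer's behavior does
# not change); it says nothing about the regime shift on day 101.
```

The calculation is perfectly correct within its model; the error is treating an uncertain, non-stationary world as if that model applied.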

What kind of rule of thumb would help a person, or a turkey, in that sort of situation?

Let’s use people for that. Take, for instance, value at risk and the other standard models that rating agencies used before the crisis in 2008: the same thing happened there. Confidence increased year by year, and shortly before the crisis it was at its highest. These types of models cannot predict any crisis, and have missed every one. They work when the world is stable. They’re like an airbag in your car that works all the time except when you have an accident.

So we need to move away from probability theory and investigate smart heuristics. I have a project with the Bank of England called simple heuristics for a safer world of finance. We study what kinds of simple heuristics could make the world safer. When Mervyn King was still the governor, I asked him which simple rules could help. He said: start with no leverage ratio above 10 to one. Most banks don’t like this idea, for obvious reasons. They can do their own value-at-risk calculations with internal models, and there is no way for the central banks to check them. But these kinds of simple rules are not as easy to game; there are not so many parameters to estimate.
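For a sense of how little there is to game in such a rule, here is a minimal sketch of a 10-to-1 leverage cap check. The bank names and figures are invented for illustration, and this is not the Bank of England project's actual code; the point is only that the rule needs one ratio, not thousands of estimated parameters.

```python
# A simple leverage-cap heuristic: flag any bank whose total assets exceed
# ten times its equity. Figures below are made up for illustration.

def violates_leverage_cap(total_assets: float, equity: float, cap: float = 10.0) -> bool:
    """True if the leverage ratio (assets / equity) exceeds the cap."""
    return total_assets / equity > cap

banks = {
    "Bank A": (500.0, 60.0),   # roughly 8.3 : 1 -> within the cap
    "Bank B": (900.0, 45.0),   # 20 : 1         -> flagged
}

for name, (assets, equity) in banks.items():
    flag = "FLAGGED" if violates_leverage_cap(assets, equity) else "ok"
    print(f"{name}: leverage {assets / equity:.1f}:1 -> {flag}")
```

A supervisor can verify the two inputs from a balance sheet, whereas an internal value-at-risk model depends on choices the supervisor cannot easily audit.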

Here’s the general idea: in a big bank that needs to estimate maybe thousands of parameters to calculate its value at risk, the error introduced by these estimates is so big that you should make it simple. If you are in a small bank that doesn’t do big investments, you are in a much safer and more stable mode, and here the complex calculations may actually pay. So, in general: if you are in an uncertain world, make it simple. If you are in a world that’s highly predictable, make it complex....
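The estimation-error point can be illustrated with a toy simulation (my addition, not from the interview; the asset counts, sample sizes, and return distributions are all invented). A "complex" model that estimates one mean per asset from a short sample is compared with a "simple" rule that uses a single pooled estimate; with many parameters and little data, the simple rule typically has the smaller error, even though it ignores real differences between assets.

```python
# Toy illustration of the bias-variance point: estimating many parameters from
# little data can hurt more than the bias of one simple pooled estimate.

import random
import statistics

def experiment(n_assets: int = 1000, n_obs: int = 12, n_trials: int = 100,
               seed: int = 1) -> tuple[float, float]:
    rng = random.Random(seed)
    complex_mse, simple_mse = 0.0, 0.0
    for _ in range(n_trials):
        # True mean returns: similar across assets, small spread.
        true_means = [rng.gauss(0.05, 0.01) for _ in range(n_assets)]
        # Observed returns: a short, noisy sample for each asset.
        samples = [[rng.gauss(m, 0.20) for _ in range(n_obs)] for m in true_means]
        per_asset = [statistics.mean(s) for s in samples]        # "complex": one estimate per asset
        pooled = statistics.mean(x for s in samples for x in s)  # "simple": one pooled estimate
        complex_mse += statistics.mean((e - m) ** 2 for e, m in zip(per_asset, true_means))
        simple_mse += statistics.mean((pooled - m) ** 2 for m in true_means)
    return complex_mse / n_trials, simple_mse / n_trials

if __name__ == "__main__":
    c, s = experiment()
    print(f"mean squared error, one estimate per asset: {c:.5f}")
    print(f"mean squared error, single pooled estimate: {s:.5f}")
```

Reverse the setup by making the samples long and the true differences large, and the complex model wins, which matches the closing advice: make it simple in an uncertain world, complex in a predictable one.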

....MUCH MORE