Saturday, March 28, 2020

Complexity, Modeling, and Forecasting: Oxford's J. Doyne Farmer

Since it came to light that Edge.org was close enough to Jeffrey Epstein that there was no way they didn't know he was raping underage girls, we haven't linked to what had been a sometimes interesting, sometimes pretentious intellectual hangout. It is, simply put, bad for you to associate in any way with people who condone this behavior.

So, if you are not in a position to put a stop to this kind of thing, the very least you should do, simply for your own sake, is to avoid the enablers at all costs.

This is the case for the political crowd that hung out on pedo island; it is the case for the Rotherham authorities; and it is the case for the Hollywood crowd that covered for Weinstein.
They all knew, and if you have any association with those people it is just bad for you.
As the young people say: bad juju.

But I wanted to revisit the author of this piece for his thoughts on complexity as a setup for some posts on complexity risk next week. What to do?

That is what the Wayback Machine is for. I'll do something that is probably bad karma in its own right: link to the Internet Archive rather than give Edge.org the click.

Ha! You can justify pretty much anything you do if you think hard enough, just ask 90% of the people in prison. So without further ado, a repost from August 2018:

Complexity, Modeling, and Forecasting: Oxford's J. Doyne Farmer 
Not the other J. Doyne Farmer.
This 'un is Oxford's, thus the second comma in the headline.

From his Edge bio:

J. DOYNE FARMER is director of the Complexity Economics programme at the Institute for New Economic Thinking at the Oxford Martin School, professor in the Mathematical Institute at the University of Oxford, and an external professor at the Santa Fe Institute.

His current research is in economics, including agent-based modeling, financial instability and technological progress. He was a founder of Prediction Company, a quantitative automated trading firm that was sold to the United Bank of Switzerland in 2006. His past research includes complex systems, dynamical systems theory, time series analysis and theoretical biology.

During the eighties he was an Oppenheimer Fellow and the founder of the Complex Systems Group at Los Alamos National Laboratory. While a graduate student in the 70s, he built the first wearable digital computer, which was successfully used to predict the game of roulette.

From Edge.org:

Collective Awareness
Economic failures cause us serious problems. We need to build simulations of the economy at a much more fine-grained level that take advantage of all the data that computer technologies and the Internet provide us with. We need new technologies of economic prediction that take advantage of the tools we have in the 21st century.

Places like the US Federal Reserve Bank make predictions using a system that has been developed over the last eighty years or so. This line of effort goes back to the middle of the 20th century, when people realized that we needed to keep track of the economy. They began to gather data and set up a procedure for having firms fill out surveys, for having the census take data, for collecting a lot of data on economic activity and processing that data. This system is called “national accounting,” and it produces numbers like GDP, unemployment, and so on. The numbers arrive at a very slow timescale. Some of the numbers come out once a quarter, some of the numbers come out once a year. The numbers are typically lagged because it takes a lot of time to process the data, and the numbers are often revised as much as a year or two later. That system has been built to work in tandem with the models that have been built, which also process very aggregated, high-level summaries of what the economy is doing. The data is old fashioned and the models are old fashioned.

It's a 20th-century technology that's been refined in the 21st century. It's very useful, and it represents a high level of achievement, but it is now outdated. The Internet and computers have changed things. With the Internet, we can gather rich, detailed data about what the economy is doing at the level of individuals. We don't have to rely on surveys; we can just grab the data. Furthermore, with modern computer technology we could simulate what 300 million agents are doing, simulate the economy at the level of the individuals. We can simulate what every company is doing and what every bank is doing in the United States. The model we could build could be much, much better than what we have now. This is an achievable goal.
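[The agent-level simulation Farmer describes can be illustrated with a toy model. This is my own minimal sketch, not his group's methodology; every name and parameter below is invented for illustration. A population of agents makes random pairwise payments, and aggregate quantities like total wealth and inequality emerge from the individual-level rules.]

```python
import random

def simulate_exchange_economy(n_agents=1000, n_steps=10_000, seed=42):
    """Toy agent-based model: agents start with equal wealth and make
    random pairwise payments of one unit. Aggregate behavior emerges
    from individual transactions rather than being assumed up front."""
    rng = random.Random(seed)
    wealth = [100.0] * n_agents
    for _ in range(n_steps):
        a, b = rng.randrange(n_agents), rng.randrange(n_agents)
        if a != b and wealth[a] >= 1.0:
            wealth[a] -= 1.0   # agent a pays one unit to agent b
            wealth[b] += 1.0
    return wealth

wealth = simulate_exchange_economy()
total = sum(wealth)  # conserved: the micro rules fix the macro total
# Gini coefficient: a macro-level inequality statistic that the
# modeler reads off the simulation rather than assuming.
gini = sum(abs(x - y) for x in wealth for y in wealth) / (2 * len(wealth) * total)
```

[Even this trivial rule set produces measurable inequality from an initially equal population; a serious model would replace the random-payment rule with empirically grounded behavior for households, firms, and banks.]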

But we're not doing that, nothing close to that. We could achieve what I just said with a technological system that’s simpler than Google search. But we’re not doing that. We need to do it. We need to start creating a new technology for economic prediction that runs side-by-side with the old one, that makes its predictions in a very different way. This could give us a lot more guidance about where we're going and help keep the economic shit from hitting the fan as often as it does.
***
COLLECTIVE AWARENESS
I'm thinking about collective awareness, which I think of as the models we use to collectively process information about the world, to understand the world and ourselves. It's worth distinguishing our collective awareness at three levels. The first level is our models of the environment, the second level is our models of how we affect the environment, and the third level is our models of how we think about our collective effect on ourselves.

Understanding the environment is something we've been doing better and better for many centuries now. Celestial mechanics allows us to understand the solar system. It means that if we spot an asteroid, we can calculate its trajectory and determine whether it's going to hit the Earth, and if it is, send a rocket to it and deflect it.

Another example of collective awareness at level one is weather prediction. It's an amazing success story. Since 1980, weather prediction has steadily improved, so that every ten years the accuracy of weather prediction gets better by a day, meaning that if this continues, ten years from now the accuracy for a two-day weather forecast will be the same as that of a one-day weather forecast now. This means that the accuracy of weather prediction has gotten dramatically better. We spend $5 billion a year to make weather predictions and we get $30 billion a year back in terms of economic benefit.

The best example of collective consciousness at level two is climate change. Climate change is in the news, it's controversial, etc., but most scientists believe that the models of climate change are telling us something that we need to pay serious attention to. The mere fact that we're even thinking about it is remarkable, because climate change is something whose real effects are going to be felt fifty to 100 years from now. We're making a strong prediction about what we're doing to the Earth and what's going to happen. It's not surprising that there's some controversy about exactly what the outcome is, but we intelligent people know it's really serious. We are going to be increasingly redirecting our efforts to deal with it through time.

The hardest problem is collective awareness at level three—understanding our own effects on ourselves. This is because we're complicated animals. The social sciences try to solve this problem, but they have not been successful in the dramatic way that the physical and natural sciences have. This doesn’t mean the job is impossible, however.

Climate prediction had the big advantage that it could piggyback on weather prediction. As weather predictions got more accurate, climate models automatically got more accurate, too. There is a way in which climate prediction is actually easier than weather prediction. You don't try to say what's going to happen three days in the future, you try to say what's going to happen, on average, if things change. If we pump 100 parts per million more CO2 into the atmosphere, how much is that going to warm things up? A climate model is just a simulation of the weather for a long time, but under conditions that are different from those now. You inject some greenhouse gases into the atmosphere, you simulate the world, and you measure the average temperature and the variability of the weather in your simulation.
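[Farmer's distinction between unpredictable day-to-day weather and a predictable shift in the long-run average can be sketched with a toy model. This is my own construction, nothing like a real climate model: a persistent random process whose individual values are noisy but whose long-run mean is pinned down by an external "forcing" term.]

```python
import random

def simulate_climate(forcing, a=0.9, n_days=100_000, seed=0):
    """Toy 'weather': tomorrow's temperature anomaly is persistence
    (a * today) plus external forcing plus noise. Any single day is
    dominated by noise, but the long-run mean is forcing / (1 - a)."""
    rng = random.Random(seed)
    t, total = 0.0, 0.0
    for _ in range(n_days):
        t = a * t + forcing + rng.gauss(0.0, 1.0)
        total += t
    return total / n_days

baseline = simulate_climate(forcing=0.0)
warmed = simulate_climate(forcing=0.1)   # stand-in for added greenhouse gases
# The *shift* in the average is predictable even though single days are
# not: expected shift = 0.1 / (1 - 0.9) = 1.0.
shift = warmed - baseline
```

[This is the conditional-forecast idea in miniature: you don't predict Tuesday's weather, you predict how the average changes when you change a parameter of the system.]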

Climate predictions get a huge benefit from all the effort that's gone into weather prediction. I've been trying to get a good number on how much we've invested in weather prediction, but it is certainly $100 billion dollars or more. Probably more. It's probably closer to $1 trillion that we've invested since 1950, when we did the first numerical weather predictions. It sounds like a lot of money, but the benefits are enormous.

I've been thinking about how we can make better economic models, because a lot of the problems we're having in the world right now are at least in part caused by economics and the interaction of economics with mass sociology. Our cultural institutions are lagging technological change and having a difficult time keeping pace with it. The economy plays a central role. Since the '70s, the median wage in the US has been close to flat. At the same time, the rich have been getting richer at a rate of two or three percent per year. A lot of the factors that are driving the problems we're having involve the interaction of the economy with everything else. We need to pursue some radically different approaches to making economic models.

It's interesting to reflect on the way we do economic modeling now. How do those models work? What are the basic ideas they're built on? We got an unfortunate taste of the ways in which they don't work in 2006, when some prescient economists at the New York Fed asked FRB/US, the leading econometric model, "What happens if housing prices drop by twenty percent?" This was 2006—their intuition was right on target—over the next two years, housing prices dropped by almost thirty percent. FRB/US said there'd be a little bit of discomfort in the economy, but not much. The answer FRB/US gave them was off by a factor of twenty. It made such bad forecasts because the model didn’t have the key elements that caused the crisis to happen.

Since then, economists have focused a lot of effort on adding these key elements, for example, by coupling financial markets to the macroeconomy. FRB/US didn’t model the banking system, and couldn’t even think about the possibility that banks might default. Issues like that are now in those models. The models have gotten better. But there is still a good chance that when we have the next crisis, we'll get similarly bad answers. The question is, how can we do better?

The first thing one has to say is that it's a hard problem. Economics is a lot harder than physics because people can think. If you make a prediction about the future of the economy, people may respond to your prediction, automatically invalidating it by behaving in a way that creates a different future. Making predictions about economics is a lot harder than using physics to predict the behavior of the natural world.

Fortunately, the most interesting things we want to do aren't to predict what GDP is going to do next month, but to make predictions about what happens if we tinker with the system. If we change the rules so that, say, people can't use as much leverage, or if we put interest rates at level X instead of level Y, what happens to the world? These are conditional forecasts, in contrast to predicting tomorrow's weather, which is an unconditional forecast. It's more like climate prediction. It’s an easier problem in some ways and harder in others because it is necessary to simulate a hypothetical world and take into account how people will behave in that hypothetical world. If you have a system like the economy that depends on thinking people, you have to have a good model for how they think and how they're going to respond to the changes you're making.
*** 
When I was a graduate student, Norman Packard and I decided to take on the problem of beating roulette. We ended up building what turned out to be the first wearable digital computer. We were the first people to take a computer into a casino and successfully predict the outcome of roulette and make a profit. We were preceded by Claude Shannon and Ed Thorp, who built an analog computer that allowed them to predict roulette in Shannon's basement, but they never successfully took it into the casino. My roulette experience changed the rest of my life because it set me on a career path in which I became an expert on prediction. This never would have occurred to me before that.

If a system is chaotic it means that prediction is harder than it is for a system that isn’t chaotic....
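[The sensitivity Farmer alludes to is easy to demonstrate with the logistic map, the standard textbook example of chaos; this is my illustration, not part of the interview. Two trajectories that start a hair apart track each other briefly, then diverge until they bear no relation to one another.]

```python
def logistic(x, r=4.0):
    """One step of the logistic map, which is chaotic at r = 4."""
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-10          # nearly identical initial conditions
gaps = []
for step in range(60):
    x, y = logistic(x), logistic(y)
    gaps.append(abs(x - y))
# Early on the gap stays microscopic: the map's slope is at most 4,
# so after 5 steps the gap is below 1e-10 * 4**5, about 1e-7. The
# error then compounds exponentially, and by the end of the run the
# two trajectories have completely decorrelated.
```

[This is why chaos caps the useful horizon of a forecast: a measurement error of one part in ten billion grows to macroscopic size in a few dozen steps.]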
Wayback Machine link

original Edge.org link

Previously on Birds of a Feather:
"The MIT-Epstein debacle shows ‘the prostitution of intellectual activity’. Time for a radical agenda: close the Media Lab, disband Ted Talks and refuse tech billionaires money" 
Following up on yesterday's "What Do You Get When You Cross Jeffrey Epstein With MIT's Media Lab? Apparently Something Like Theranos"