This week’s guest blogger is David Orrell, an author, founder of Systems Forecasting, and an Honorary Visiting Research Scholar at the Smith School of Enterprise and the Environment in Oxford. The UK Kindle edition of Economyths is available for a limited time at the highly economical price of 99p.
In January 2009, in the immediate aftermath of the credit crunch, the physicist and hedge fund manager J.P. Bouchaud wrote in the pages of Nature that "economics needs a scientific revolution".
Economyths is an attempt to spell out what such a revolution might look like, and document the exciting developments taking place in economics.
It too is written from an outsider's perspective – that of an applied mathematician working mostly in computational biology. Many of the techniques used in that field, such as network theory and agent-based modelling, are beginning to find widespread application in economics. But the assumptions they are based on are completely different from those of mainstream economics.
Consider, for example, the idea that the "invisible hand" of the marketplace drives prices to an optimal equilibrium. This idea is usually attributed to Adam Smith, though as the Czech economist Tomáš Sedláček argues, it actually goes back much further.
In the 19th century, neoclassical economists such as William Stanley Jevons and Léon Walras attempted to demonstrate this principle mathematically, based on the idea of Homo economicus, or rational economic man. In the 1950s, economists finally managed to prove that markets would indeed reach a Pareto-optimal equilibrium. But to do so, they had to make numerous assumptions – including rational utility-maximising behaviour, coupled with perfect information, and infinite computational capacity.
In the 1960s, efficient market theory was proposed as an explanation for why the economy was impossible to predict. Again, it assumed that market participants were rational and acted independently of one another to optimise their own utility. In the 1970s, rational expectations theory was in vogue. Tools in use today – such as the risk models relied on by banks, or the general equilibrium models called on by policy makers – continue to make these assumptions, with at best small modifications.
Of course, no one thinks that people are perfectly rational or independent, or that the economy reaches a perfect equilibrium – but it has been generally believed that these assumptions were good enough to capture the overall behaviour. They could be viewed as representing an ideal economy, to which the actual economy can at least aspire. And they provided policy makers with an excuse for dangerous deregulation of the financial sector – what Adair Turner has called “regulatory capture through the intellectual zeitgeist.”
Unfortunately, as illustrated most recently by the credit crunch, this picture of the economy is highly unrealistic. The behaviour of home owners during the US credit crunch – or for that matter large firms like Lehman Brothers – hardly conforms to the model of rational economic man. And if stock markets are really governed by the invisible hand, then it has a bad case of the shakes.
So-called heterodox economists have long questioned the assumptions behind mainstream economics. But following the credit crunch, there has been an even more concerted effort to develop alternative models which can address issues such as economic inequality, environmental sustainability, human wellbeing, and financial instability. Many of the new ideas are coming from areas of applied mathematics such as nonlinear dynamics, complexity, and network theory.
An example is the agent-based models being used by complexity researchers such as Doyne Farmer to simulate the economy. Models have been developed of artificial stock markets in which hundreds of simulated traders buy and sell stocks. Each of the trader "agents" has its own strategy, which adapts in response to both market conditions and the influence of other agents. Instead of settling on a stable equilibrium, prices undergo periodic booms and busts as investors flock in and out of the market. Agent-based models are also used to simulate the highly skewed distribution of wealth in many economies, in which a small percentage of the population sequesters most of the wealth.
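To give a flavour of the approach, here is a deliberately minimal sketch of such a market – a toy in the spirit of the Farmer-style models described above, not a reconstruction of any published model. All parameter values, and the momentum-based trading rule, are illustrative assumptions; real models also let the strategies adapt over time, which this sketch omits.

```python
import random

class Trader:
    """A simulated trader whose rule mixes trend-following and contrarian behaviour."""
    def __init__(self):
        # Positive weight: trend-follower; negative: contrarian (assumed, for illustration).
        self.trend_weight = random.uniform(-1.0, 1.0)

    def order(self, returns):
        """Return +1 (buy) or -1 (sell) based on recent price momentum plus noise."""
        momentum = sum(returns[-5:])
        signal = self.trend_weight * momentum + random.gauss(0, 0.1)
        return 1 if signal > 0 else -1

def simulate(n_agents=100, n_steps=200, seed=42):
    random.seed(seed)
    traders = [Trader() for _ in range(n_agents)]
    price, returns, prices = 100.0, [0.0] * 5, []
    for _ in range(n_steps):
        # Net demand moves the price: excess buyers push it up, excess sellers down.
        demand = sum(t.order(returns) for t in traders)
        r = 0.001 * demand
        price *= (1 + r)
        returns.append(r)
        prices.append(price)
    return prices

prices = simulate()
```

Even this crude setup produces an irregular price series driven by the interacting agents rather than settling to a fixed equilibrium, which is the qualitative point of the models above.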
Another rich source of new ideas is those other life sciences, biology and ecology. The ecologist Robert May recently joined forces with the Bank of England’s Andrew Haldane to analyse the financial network from a systems perspective. They found that risk metrics used for individual institutions such as banks fail to account for systemic risk.
The financial system has become increasingly interconnected in recent decades. This is good for short-term efficiency, but also means there is an increased risk of contagion from one area to another, which does not register with conventional risk models. As ecologists know, robust ecosystems tend to be built up of smaller, weakly connected sub-networks. Maybe financial regulators can learn a trick from nature, by introducing a degree of modularity and redundancy. An even more urgent issue, of course, is how to make the human economy fit in with the global ecosystem which contains it.
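The effect of modularity on contagion can be illustrated with a toy interbank network. The cascade rule used here – a bank fails whenever any counterparty fails – is a deliberate simplification assumed for illustration, not a calibrated risk model; it simply makes the structural point that contagion follows the links.

```python
from collections import deque

def cascade(n_banks, edges, initial_failure):
    """Return the set of banks that fail after a shock to one bank,
    assuming failure propagates along every exposure link (toy rule)."""
    neighbours = {i: set() for i in range(n_banks)}
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    failed, queue = {initial_failure}, deque([initial_failure])
    while queue:
        bank = queue.popleft()
        for other in neighbours[bank] - failed:
            failed.add(other)
            queue.append(other)
    return failed

# Fully interconnected network of 12 banks.
dense = [(i, j) for i in range(12) for j in range(i + 1, 12)]

# Modular network: three cliques of 4 banks each, with no links between modules.
modular = [(i, j) for m in range(3)
           for i in range(4 * m, 4 * m + 4)
           for j in range(i + 1, 4 * m + 4)]

dense_failures = cascade(12, dense, 0)      # all 12 banks fail
modular_failures = cascade(12, modular, 0)  # only the 4 banks in one module fail
```

In the dense network a single shock brings down the whole system, while in the modular one it is contained within a single sub-network – the ecological intuition behind May and Haldane's argument.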
One of the lessons of complex systems research is that it requires collaborations between people from a broad mix of backgrounds. Another is that models are only imperfect approximations of a system, so accurate prediction will always remain elusive. However, we can at least base our models on realistic assumptions. And even if we cannot predict the exact timing of the next financial crisis any better than we could the last one, we can at least learn how to make the system more robust in the first place.