Working In Uncertainty

Straighten out your thinking on ‘risk aversion’, ‘risk appetite’, ‘risk tolerance’, ‘risk limits’, and all that

by Matthew Leitch, first published 24 March 2010


In our society today there are lots of ideas floating around about the way people feel and should act in the face of ‘risk’. Some are built into our language. Others come from popular psychology, from theories abandoned long ago, from natural but wrong inferences, even from astrology. Many have no identifiable origin; they just seem to be things a lot of people think.

These ideas seep into our heads from an early age, usually without conscious scrutiny. Some would say they are ‘memes’ transmitted through society that find a home in our minds. Some are more ‘viral’ than others and logical consistency is not the only thing that makes them contagious. Often the main thing in favour of them is that they seem to be held by lots of other people.

And this is where our problems start, because some of these ideas are wrong and lead to behaviours that are not in our best interests. In the popular TV game QI contestants lose marks when they give an answer that is wrong but widely believed to be true. The researchers who create the questions identify these popular but wrong answers, and if you give one a loud siren goes off and you lose 10 points. Many QI contestants end up with negative scores!

If you are serious about understanding the thinking around phrases like ‘risk aversion’, ‘risk appetite’, ‘risk tolerance’, ‘risk limits’ and so on then spend a few minutes reading this article. Expect an imaginary QI siren to go off, and to lose 10 points, at least once while reading it! But before you write to me in angry disbelief to point out where my logic is wrong, take a few minutes to wonder why you think what you do. Did you choose that belief? Does it truly agree with everything you know? Is there really no chance at all that it is wrong? What would you gain from quietly thinking again?

With those warnings in mind, read on if you dare.

Ideas that are wrong

Idea 1: Although most people are risk averse, there is still a significant minority who are risk seekers, meaning that they prefer to risk loss, and perhaps even enjoy the danger.

The problem here is caused by two different definitions of ‘risk’ and two different definitions of ‘risk averse’. For most people ‘a risk’ is some event/outcome that might happen and would be unwelcome if it did. In this sense ‘risk aversion’ is all but universal, at least among sane people. If we do not like an outcome then we do not like the possibility of that outcome either. Possibilities like being killed or crippled, losing a lot of money, or losing a good job are classic risks.

However, scientists and mathematicians became interested in explaining something much more subtle. Why do people usually prefer not to accept a bet with equal odds of losing an amount of money and gaining the same amount? In purely financial terms, they reasoned, people should be indifferent to this bet because, in the long run, on average, you end up no richer or poorer. In fact most people are averse to the bet. This was one of a number of related puzzles to be explained.

One of the theories put forward to explain this was that we do not value each unit of money equally. If we have just a small amount of money to live on then each pound is important. If we have millions then each pound matters less. To put it another way, each extra pound added to our fortune adds a smaller amount of ‘utility’. So, although winning 100 and losing 100 involve the same amount of money (100), they do not involve the same amount of utility. Graphed, the relationship between money and utility is a curve that starts at the bottom left, rises rapidly, and flattens out towards the upper right. Starting from the same level of wealth, the utility lost by losing 100 is greater than the utility gained by winning 100.

This shape of curve, flattening out for higher amounts of money, is not the only possibility and in some situations it makes sense for our curve to get steeper instead of flatter. Confusingly, the usual flattening curve was labelled ‘risk averse’ while the steepening curve was labelled ‘risk seeking’.
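The effect of the two curve shapes can be seen in a few lines of arithmetic. This is a minimal sketch, not part of any particular theory's formal apparatus: the logarithm stands in for a flattening (‘risk averse’) curve and the square for a steepening (‘risk seeking’) one, and the wealth and stake figures are made up for illustration.

```python
import math

def expected_utility(wealth, stake, utility):
    """Expected utility of a 50/50 bet to win or lose `stake`."""
    return 0.5 * utility(wealth + stake) + 0.5 * utility(wealth - stake)

wealth, stake = 1000.0, 100.0

# A flattening curve: each extra pound adds less utility.
concave = math.log
# A steepening curve: each extra pound adds more utility.
convex = lambda w: w ** 2

# With the flattening curve the bet's expected utility falls below the
# utility of simply keeping the current wealth, so the bet is declined
# ('risk averse' in the technical sense).
print(expected_utility(wealth, stake, concave) < concave(wealth))  # True

# With the steepening curve the bet's expected utility is higher, so
# the bet is welcomed ('risk seeking' in the technical sense).
print(expected_utility(wealth, stake, convex) > convex(wealth))    # True
```

Note that in both cases the person still dislikes the losing outcome itself; only the technical labels differ.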

This abuse of language has been damaging. The phrase ‘risk seeking’ just meant welcoming a bet with an equal chance of losing and gaining the same amount of money. Someone who is ‘risk seeking’ in this sense still does not like the possibility of losing money (or anything else for that matter). In the usual sense they are still averse to risk.

Sadly, the graphs with their labels have been repeated many times in textbooks and on websites so that many people have been exposed to this deceptive wording, and repeated it to others.

Idea 2: If someone seems more willing to do something risky then that means they have a lower aversion to risk, or perhaps a greater preference for risk.

The truth is that they might have but it is not proven by their behaviour.

First, their willingness to do the apparently risky thing may be for other reasons. They may see the act as being safer than others do, or see the rewards from it as being larger. Indeed, it may be that for them the act really is unusually safe, or the rewards unusually large.

Second, a tendency to do more risky things in one area of your life does not necessarily mean you will do more risky things in all areas of your life.

Even in carefully designed scientific experiments it is extremely difficult and perhaps impossible to establish that two people really do have different preferences for risk per se, even if their lifestyles and personalities seem very different. (I have yet to see research that convincingly does it. Weber and Milliman (1997) cover many studies on this topic.) In contrast, different perceptions of risks, even if those risks seem very simple and clearly stated, are easy to find, and account for a lot of the differences between people in their willingness to do risky things.

Willingness to do risky things is driven by many factors and differs across situations even for one person. If you want people to avoid taking bad risks then helping them improve their understanding of risks and rewards is much more likely to work than trying to influence their personal preference for risk, if that is possible at all, which is far from clear.

This may surprise you. The stereotype of the risk-seeking adrenaline junkie skydiving down towards a snow-covered mountain as part of some extreme sporting stunt seems like proof that there are people who really do seek risk. It is not. They like the adrenaline but not the risk per se, and they take safety precautions to avoid being killed or crippled. Furthermore, the ones who do really dangerous stunts are typically professionals, getting paid for it.

Many people enjoy a ride on a roller coaster, which is a ride that makes you feel you are going to die even if, intellectually, you know you are not. If the thrill seekers learned that there were real safety issues with a ride do you think the queue to go on it would shorten or get longer?

Idea 3: The importance we attach to risk in decision making is primarily or solely a matter of personality and/or mood, and that’s how it should be.

A corollary of this is the idea that risk aversion/preference is a personal opinion or characteristic and, as such, cannot be wrong, cannot be challenged on rational grounds, and indeed it is impolite to try.

Personality and mood seem to have an influence on lots of things related to decision making in the face of risk. However, the strength of their influence on risk aversion/preference specifically, relative to the influence of objective circumstances, is not well understood. Furthermore, in rational decision making there is little reason why personality and mood should influence risk aversion/preference at all.

For example, if the risk is financial, some people will have greater financial reserves than others, or more predictable future earnings, or a lifestyle that is easier to adapt to a sudden loss. Having to settle for a nearly-new Bentley instead of a brand new one is a bit disappointing, but nowhere near as stressful as having to move your children from an expensive private school and sell your house under severe time pressure. Some people have stress related illnesses and try to avoid stress as a result. These are the sort of objective circumstances that should influence personal decision making under uncertainty, and there are analogous circumstances for organizations.

From a practical point of view it is a serious mistake to think that decisions under uncertainty are in some way a matter of personal opinion. It means we are more likely to leave bad decisions unchallenged and to fail to consider relevant objective circumstances.

Idea 4: There are times when we pursue risk itself.

If we just start with the usual, traditional view that risk is to do with bad things that might happen, then sane people generally do not pursue it. If we are averse to an outcome then we will also be averse to the possibility of that outcome. What we might do is pursue rewards that are impossible or nearly impossible, in practice, to separate from the risk. In this situation it feels as if we are pursuing risk, even though that is not what is happening.

For example, in financial markets the statistical connection between risk and prices is often seen as so strong that the two are inseparable. However, imagine that a bank has all but signed a deal involving expected returns and potential losses, and then someone thinks of a cost-free, instant way to reduce the risk in the deal. The bank will normally use it. To put it another way, all other things being equal a bank will prefer a deal with low risk to one with high risk.

If we use the unfamiliar technical definition of risk as including good things that might happen too (e.g. in ISO 31000:2009) then of course there will be times when we pursue ‘risk’, but this is not ‘risk’ as most people understand it.

The buzz phrase ‘risk appetite’ and the confusion over ‘risk’ mentioned under Idea 1 may have encouraged more people to think that we pursue risk itself.

Idea 5: The most risk we will put up with before changing our minds in a decision is the same for all decisions.

Imagine someone offers you a bet. You can win 200 by driving your car across town in moderate traffic if you can do it within a certain amount of time. Driving normally this would take you about 30 minutes. What is the shortest amount of time you would agree to in this bet?

Now imagine the same bet, but this time you are offered 20,000. What is the shortest amount of time you would agree to now?

Nearly everyone would be prepared to drive a bit faster for that greater incentive. The target time is our indicator of risk in this situation so you can see that the level of risk at which your decision to accept or reject such a bet would flip depends on the incentive.

Although this is obvious, it is surprising how often organizations design systems for managing risk that involve setting fixed limits that are to be applied across different situations and regardless of the incentives involved.
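The dependence of the flip point on the incentive can be made concrete with a simple break-even sketch. This is an expected-value illustration only, with a made-up fixed cost of failure (a crash or fine); it is not a model of how anyone actually decides.

```python
def max_acceptable_risk(reward, cost_of_failure):
    """Highest probability of failure at which the gamble still breaks
    even in expectation: accept while (1 - p) * reward > p * cost."""
    return reward / (reward + cost_of_failure)

# Hypothetical fixed cost of failing (crash, fine, etc.).
cost = 1000.0

# The tolerable failure probability rises with the prize on offer.
print(max_acceptable_risk(200.0, cost))    # 1/6, about 0.167
print(max_acceptable_risk(20000.0, cost))  # 20/21, about 0.952
```

A fixed risk limit applied regardless of incentive corresponds to ignoring the `reward` argument entirely.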

Idea 6: The most risk we will put up with before changing our minds in a decision is solely determined by our goals.

In the driving bet above the incentive for you to drive faster is provided by the reward you anticipate, not any reward you might like to get. If a genie popped out of a lamp and granted you a wish how much money would you ask for? Most of us would think of very big numbers indeed, which gives some idea of how far our goals can differ from our realistic expectations.

Nevertheless, the idea that our goals are all that matters in deciding the limit on risk is common in attempts to define the phrase ‘risk appetite’ in terms of an overall limit on risk.

Idea 7: The most risk we will put up with before changing our minds in a decision is solely determined by our strategy.

Once again, the top limit on risk depends on the incentives we anticipate, not just on what we intend to do. The results we anticipate depend on circumstances as well as our actions.

When an organization chooses a strategy and some goals it can also choose a maximum level of risk (such that any more and something has to change). At that point it may well be that the goals are exactly the same as the anticipated results, so it can easily look as if the anticipated results do not matter. However, as time passes, circumstances change and so do expectations of results, even if goals and plans do not. Is the risk limit still correct? No. If conditions deteriorate we could be running a risk that is no longer justified by the poor returns now anticipated.

Idea 8: It is appropriate for a leader's personal risk propensity to set or influence an organization's weighting of risk in decisions.

It is not.

Imagine an organization run by a man who is nearing retirement, looking forward to a generous pension if he can hang on another year, has a long term stress related illness, and has never been happy with change. Would it be good for the organization for his personal attitude to risk to be an important factor in deciding its strategy?

He retires and is replaced by a man who is independently wealthy thanks to marrying an heiress. He has no money or pension worries and for him the job is about getting famous or he is not interested. Would it be good for the organization for his personal attitude to risk to be an important factor in deciding its strategy?

The difference between what people see as good for them personally and what is good for the organization and its stakeholders as a whole is a major concern in governance. The importance attached to risk in decision making on behalf of the organization should be set on the basis of objective circumstances and stakeholder interests, as rationally as is practical.

Idea 9: Setting limits prevents bad risk taking.

Setting limits can prevent excessive risk taking, if the limits are enforced and if risk assessments are reliable. What limits do not prevent is taking risks that are not justified. People can make all sorts of silly choices with scant regard for the risk involved so long as they are within their limit.

In principle, clever people should take care about the risk they take on even when they are far short of their limit, in order to give themselves room for more investment later. In practice people do not always behave cleverly and there are occasions where people do something rash and comfort themselves with the thought that ‘it's within our overall risk appetite’.

Setting fixed limits can also lead to turning down good business opportunities. Suppose you face a situation where you have to choose between two alternative investments. One offers excellent returns but its risk level takes you just beyond your limit. The other offers poor returns but its risk level is just within your limit. It's an agonising decision and I suspect many of us would be tempted to ask how firm the limit is. That limit was most likely set on the basis of forecasts (or at least gut feeling) about what investments would be available and what risks and returns they would offer. If the actual deal with excellent returns offers better returns for its risk than was envisaged when the limit was set then the limit is obsolete and enforcing it will mean making a bad decision.
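The perverse effect of a rigidly enforced limit can be shown in a few lines. The figures below are invented for illustration: two deals sit either side of a fixed risk limit, and the rule picks the worse one.

```python
def choose(investments, risk_limit):
    """Pick the highest-return investment whose risk is within the limit."""
    allowed = [i for i in investments if i['risk'] <= risk_limit]
    return max(allowed, key=lambda i: i['return']) if allowed else None

investments = [
    {'name': 'excellent', 'return': 0.30, 'risk': 0.21},  # just over the limit
    {'name': 'poor',      'return': 0.04, 'risk': 0.19},  # just under the limit
]

# With a fixed limit of 0.20 the rule forces the poor deal, even though
# the excellent deal offers far better return per unit of risk.
print(choose(investments, 0.20)['name'])  # poor
```

If the limit were revisited in the light of the deals actually on offer, the excellent deal would win easily.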

Ideas that may be wrong

In addition to ideas that are clearly wrong there are some others that are far less certain than many people think. These are ideas that could be wrong.

Idea 10: Risk exists as a naturally occurring quantity in our minds.

Once again we have to be clear about which ‘risk’ concept we are thinking about. It seems obvious that we think about bad things that might happen. In this sense of ‘risk’ it is obvious that these ideas appear naturally in our thinking.

However, we also use the word ‘risk’ to refer to an amount representing the importance or ‘size’ of these risks.

The idea that we naturally have a quantity in our minds representing ‘risk’ in this second sense is, I suspect, a relatively new invention, quite possibly stemming from Markowitz’s use of expected value and variance as measures of reward and risk respectively. This was perhaps done for mathematical convenience, but the idea caught on anyway and now there are several mathematical formulae vying to be regarded as representing ‘risk’. None so far exactly matches the answers people give when asked to rate the ‘riskiness’ of alternative actions.

It may be that in our minds what we have is thoughts about possible futures that might occur, but no quantity representing an amount of ‘risk’. It could be that when someone asks us to say how ‘risky’ something is we half-consciously scratch around to create some kind of feeling or view about it to satisfy our questioner, but that opinion did not exist until we were asked.

Idea 11: We are averse to risk (in the sense of an amount) per se.

In part this idea rests on the previous idea, for without a quantity corresponding to ‘risk’ we cannot be averse to it. An alternative version is that we are averse to uncertainty.

(Note that there is no real question that we are averse to ‘risks’ in the usual sense of aversive outcomes that might happen.)

Experiments have shown that the way we value alternative outcomes does not fully explain actual decision making under uncertainty. It is unclear why. It could be that actual decision making is irrationally flawed in some way, or that the experiments are confusing and unrealistic, or that there is something more interesting going on.

It could be that we really are averse to uncertainty itself, or to a quantity representing ‘risk’.

However, it could also be that the values we place on possible outcomes are not the only factors in our evaluation. What about the implications of having to change to adapt to those outcomes? And what about the implications of advance knowledge of what will happen, or lack of it?

  • Sudden changes in fortune, in either direction, can involve some disruption to our lifestyle which is aversive; and

  • the more certainly we can predict the future the better we can prepare for it, even if only mentally.

In an influential experiment Keller (1985) asked students to consider various hypothetical situations. In one they had to decide the maximum fare they would be prepared to pay for bus journeys of different durations to the same destination. Generally they were prepared to pay more for a faster journey. They were also asked how much they would pay for a 50% chance of one duration combined with a 50% chance of another. From this information the experimenter worked out the effect of the journey time being uncertain.

This is very mathematical but from a common sense point of view it is obvious that if your journey might take a long time then you cannot linger comfortably in bed but will need to be at the bus stop early to be sure not to be late. If the journey turns out to be quicker than you feared there is not much you can do with the time you saved. Clearly, a guaranteed journey time has a special attraction, but only because of what you can do with the knowledge that it is certain. If there was nothing you could do differently then the certainty gives no practical advantage.
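The arithmetic behind this kind of inference can be sketched with invented figures (the fares below are hypothetical, not Keller's data): compare what someone will pay for the 50/50 gamble with the average of what they pay for the two sure journeys.

```python
# Maximum fares (hypothetical figures) a traveller states for:
fare_30min = 5.00   # a guaranteed 30-minute journey
fare_60min = 3.00   # a guaranteed 60-minute journey
fare_gamble = 3.50  # a 50/50 chance of 30 or 60 minutes

# If only the outcomes mattered, the gamble would be worth the average
# of the two sure fares. Any shortfall reflects the cost of not knowing
# which duration will occur, e.g. having to leave early to be safe.
expected_fare = 0.5 * (fare_30min + fare_60min)    # 4.00
uncertainty_penalty = expected_fare - fare_gamble  # 0.50
print(uncertainty_penalty)
```

Here the traveller pays 0.50 less for the gamble than the average of the sure fares, consistent with the certain journey times carrying extra practical value.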

In this hypothetical but realistic situation there is clearly more to it than just the value of the outcomes. Some other laboratory tasks are just concerned with bets on small amounts of money and it is hard to know how much our decision making is influenced by considerations other than the valuation of outcomes. It may be that we respond to these factors so habitually (perhaps even instinctively) that we do it even in artificial laboratory scenarios where they should not be important.

And finally

How was that? The interesting thing about the wrong ideas is that it is not too hard to see that they are wrong when the logic is spelled out and clarified with examples. However, when a lot of what you read implicitly assumes some of them it is hard not to be sucked in.

It takes a prolonged, determined effort to keep on recognizing ideas like these in their many guises and to avoid being influenced by them.


ISO 31000:2009 Risk Management - Principles and Guidelines, International Organization for Standardization, Geneva, 2009.

Keller, R.L., ‘An Empirical Investigation of Relative Risk Aversion’ in IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-15, no. 4, pp. 475-482, July/August 1985.

Weber, E.U. and Milliman, R.A., ‘Perceived Risk Attitudes: Relating Risk Perception to Risky Choice’ in Management Science, vol. 43, no. 2, pp. 123-144, February 1997.



Words © 2010 Matthew Leitch.