Working In Uncertainty

How to embed risk management into performance management and strategy making

by Matthew Leitch, first published 6 December 2006.


An important achievement

This article is about how to ‘embed’ risk management into performance management, planning, strategy making, budgeting, and other management exercises that involve objective setting (in the widest sense), identifying performance measures, planning improvements, creating action plans, or trying to improve alignment.

The idea is to find ways to get risk and uncertainty considered in these processes, and to do so in a genuine way, respecting the intended process as far as possible rather than offering an alternative.

The importance of doing this depends on the potential importance of the strategy/planning/alignment process involved, not its actual importance at present. That's because one of the main reasons these processes sometimes fail to make a helpful impact is that they do not do enough to cope with our impossibly complex, unpredictable, largely uncontrollable business world. Injecting some risk management could be the salvation of the whole process and of an organisation whose course is influenced by it.

Here's a test to see if you will like the style of embedding I describe in this article. Imagine a board of company directors who are making strategic plans. They have a choice between (1) forming a basic strategy then brainstorming things that could go wrong and writing a list of them with responses, or (2) doing scenario planning where they will consider a wide range of possible futures and make plans accordingly. Which would you say was more embedded?

To put it another way, which procedure will best ensure that risk and uncertainty are understood and responded to fully? If you are like most people and your answer is ‘Obviously the second one, the scenario planning’ then please read on. If not, read on at your peril.

(This is not an article about scenario planning specifically, though it is one element that can be helpful.)

The secret of embedding risk management

The key to embedding successfully is to:

  • stop looking for somewhere to write lists of risks; and

  • find every place where people could act as if they know the future for certain and change the instructions, procedures, tools, etc used so that people are more aware of their uncertainties and act on them.

Here's an example. Suppose a list of strategic goals has been drawn up and a high level group has ‘signed off’ that list. Does that mean the list is definitely the best one possible from now until the next time the group is presented with a list of goals? That would be acting as if we knew the future for certain. Why not make the sign-off a qualified one that invites research, review, and revision in a controlled way according to a predefined schedule or triggers for change? Why not identify the points that are most uncertain at this stage so that the doubts we have are not forgotten?

There are many, many more opportunities to embed open mindedness and proper responses to risk and uncertainty. This article describes some of the most important. I've concentrated on the ones that have a helpful psychological impact in some way, because otherwise people just go through the motions. With these controls it is hard to go through the motions without your views being altered for the better.

Organizing frameworks

There are many elements to most planning/strategy/alignment processes and many opportunities to inject a little bit of risk management. To apply the controls in this article you will need to translate the points made in principle into specific words, layouts, and actions that fit into what your organisation is doing. (I don't think a list that can be copied verbatim is likely to be useful.)

That could be a lot of detail to work on. Here are two frameworks to help stay organised.

First, the typical activities in a planning/strategy/alignment process include:

  • Starting

    • Devising the process itself.

    • Launching the process.

  • The deep thinking phase

    • Surveying the external environment and developing views about the future.

    • Taking stock of the current position and resources.

    • Developing systems of goals/objectives/visions/targets/etc.

    • Aligning/negotiating actions, resources, and targets.

    • Choosing what to measure over time.

  • Continuing

    • Communicating strategies.

    • Ongoing use of measures, monitoring of progress, and revisions.

The controls described in this article are organised in part using this list.

However, whether you can divide the activities up in this way or not may have little practical importance. What really matters is the people and things that need to be influenced or adjusted to inject risk management successfully. Here is a list, though some of these items will not be present in all cases:

  • Intangible but helpful:

    • Beliefs, preferences, and intended process of leaders of the approach.

    • Beliefs, preferences, and intended process of leaders of the organisation.

  • Tangible:

    • Presentation slides about the approach. (Inevitable in the PowerPoint era.)

    • Emails, letters, and video clips explaining the approach, or launching it.

    • Procedure manuals about the approach.

    • Training materials for explaining the approach to participants.

    • Training materials for explaining the approach to facilitators/trainers.

    • Forms (on paper or in software) to support the approach.

    • Checklists and other tools used in the approach.

    • Progress reports generated by the approach.

    • Reports of performance/progress measures.

    • Internal audit programmes.

A later section discusses the problem of persuading people to accept the refinements that will embed risk management. Before that, here are the suggested controls.

Starting well

Controls over devising the process itself

Control 1.1.1: Appoint a controls/risk specialist to represent risk management during the design of the process.

This person is responsible for making sure every point in the strategy/planning/alignment process where people might ignore their uncertainty is identified and a positive response to uncertainty is encouraged by specific, practical, action-oriented controls that have a helpful psychological impact.

In short, they are responsible for doing what is described in this article.

Control 1.1.2: Write and adopt a policy about managing risk and uncertainty within the process.

This is just a short document saying, in effect, that it is policy to ensure that the strategy/planning/alignment process will properly take account of risk and uncertainty, and will do so by ensuring that people are encouraged to identify and respond positively to uncertainty, and discouraged from behaving as if they know the future for sure, or have complete control over it.

The policy should be endorsed by a senior person or group for maximum effect.

Launching the process

Control 1.2.1: Be open about risk and uncertainty when kicking off the planning/alignment/strategy process.

The lead given by senior people and people in a position of authority or expertise can have a great influence on how other people behave, particularly in the early stages. If senior people are open about the uncertainties involved and describe the future of the process in ways that clearly recognise the need for flexibility, revisions of thinking, working with doubt and disagreement, and the need for continual learning then people are more likely to go into the process with realistic expectations.

In contrast, if the talk is of unanimous agreement, locked in place for a year at a time, with details cascaded, and no mention of learning or adapting, then the scene is set for uncertainty to be ignored, like a blocked safety valve, until finally it explodes into consciousness with painful results.

Example: The email from the Chief Executive expressing her full support for the strategy initiative and stressing its importance might include statements like: ‘Clearly we are working in a dynamic market and cannot assume we know what the future holds.’, ‘We need to be flexible as we go through this process and recognise that we will never have all the answers we might like to have.’, ‘Where we find we disagree on something this is most likely to be a sign that we are unsure and will need to plan a way forward that allows us to resolve our uncertainty and respond quickly to what we discover.’

Control 1.2.2: Introduce an ongoing activity rather than a project.

If what you launch is an activity that is planned to continue indefinitely, or at least for the foreseeable future, then it is obvious that a long term approach is needed with adaptations and improvements. In contrast, if the language used is about projects then this tends to prompt people to take a shorter term view with less interest in long term sustainability.

Example: The launch presentation by the leader of the strategy initiative might include references to ongoing development of the approach and a diagram showing how it will be reviewed and improved over the years ahead.

Controls over the deep thinking phase

There are some controls that apply to every activity in this phase.

Control 2.0.1: Record agreements in writing, and record the ways in which they are provisional or temporary.

A very traditional control is to get people involved in a decision to show their agreement or acceptance with a signature or otherwise in writing.

A useful refinement is to capture in writing the ways that those agreements are provisional or temporary. Otherwise, there is a tendency to think that no improvement or adaptation is possible, or to throw an agreement away entirely once conditions change. Neither is ideal and it makes more sense to leave the door open for sensible, gradual, controlled change.

Example: The minutes of a meeting to approve a document setting out priorities for service improvement might record that ‘it was agreed that the priorities set out in the draft were (a) reasonable at this stage (b) an acceptable basis for initial planning, and (c) should be revised through the year under the standard revisions procedure in the light of actual progress and resource consumption.’ This could be the default wording.

Control 2.0.2: Record or mark uncertainties arising during the process.

Uncertainties will come to light in many ways as you continue with the strategy/planning/alignment process. These should be recorded or marked in documentation, somehow, and followed up to make sure they are not ignored.

Example: The minutes of a meeting to discuss strategy might include points that were agreed and also points that were discussed but could not be agreed due to uncertainties. The minutes would include notes on the issues, uncertainties, and any actions agreed to resolve or at least manage those uncertainties. There could be a standard heading for such items.

Control 2.0.3: Show alternatives, where identified.

When recording uncertainties it is sometimes useful to list all the alternatives that were thought of.

Example: Three ways of measuring customer satisfaction may have been considered without a decision being reached on which to use. This uncertainty needs to be recorded, along with the three alternatives considered.

Keeping the alternatives in mind helps reinforce the existence of potentially useful alternatives, and may help prompt practical actions to design a better approach.

A less obvious place to capture alternatives is in a strategy map/logic model/rationale. People will almost certainly have different views about how the world works. Some of these differences will be important enough to capture as alternative drawings of part of the model.

Control 2.0.4: Record disagreements about objectives, rationales, and measures and identify which are evidence of uncertainty.

Disagreement can be informative.

Pushing everyone in a meeting to agree for the sake of consensus is a recipe for creating hidden resentments and disagreement. More importantly, that disagreement may be a sign that the group is uncertain and some action is needed to respond to that uncertainty beyond agreeing to go with one assumption and hoping it's right.

The records could be, for example, notes about alternatives that were considered or ideas for actions that would help to resolve the uncertainty/risk.

Example: Imagine that a strategy map is being debated and some people think customer satisfaction above 90% drives sales growth while others say that it does not. After a few minutes of debate it should be clear that the future impact of customer satisfaction above 90% on sales growth is uncertain, so this is recorded as an open question. If it is an important one then something needs to be done about it. Perhaps the answer is to continue striving for happy customers but analyse purchases by customers at each satisfaction rating to see how satisfaction at various levels really relates to purchases. All this would be recorded in meeting minutes, or in documents arising from the discussions.

Control 2.0.5: Check important facts.

Some decisions will be heavily influenced by certain ‘facts’. The decisions may be important ones and now we know how much rides on those ‘facts’ we know we need to check them, along with the inferences made from them.

Example: A systematic marketing strategy development process might well have a lot of data gathering and analysis initially. As this gives way to considering potential strategies and attention focuses on certain specific strategies the strategy development process should have a sub-step that involves identifying key beliefs underpinning the choice of strategy and the evidence that led to those beliefs, followed by additional work to consider that evidence and investigate it further if there is any room for doubt.

Control 2.0.6: If possible, iterate.

Not all strategy/planning/alignment processes are designed to be iterative, but if there is a choice then iterate. At each stage, consider what is uncertain in some way, which uncertainties are important, what can be done about them before the next round of thinking, and who will do it.

This is a great way to direct time and energy towards the most important activities that will improve the strategy/process/etc.

Control 2.0.7: If possible, do scenario planning.

One of the big difficulties in getting people to manage risk and uncertainty effectively is that we have a psychological bias towards overconfident forecasts and a tendency to think we have more control over the future than we really have. In short, we don't see the risks or uncertainties so we don't think we need to do anything about them.

Scenario planning tackles this directly by insisting that we consider possible futures a little way outside those we would consider ‘likely’. If you've read about scenario planning the examples given were probably very grand, long term ones like ‘climate change’ or ‘the future of the pharmaceuticals industry’. However, it is easier and more useful to do it for mundane, everyday purposes where we are just as prone to blinkered thinking.

If the process being followed is not one of scenario planning, it might still be acceptable to consider scenarios as part of it, even if this is not followed up with the usual style of planning.

Controls over surveying the external environment and developing views about the future

Control 2.1.1: Do it!

Including a look at the outside world and possible futures is a key part of these processes and obviously very risk management friendly. If possible it should be included.

Control 2.1.2: Use ranges and alternative scenarios.

Building up one projection of the future is not helpful for managing risk and uncertainty. The methods and tools should encourage people to give ranges or probability distributions for key variables, and/or to talk about alternative scenarios.

Control 2.1.3: Use forced spreads.

Making people express views about the future in terms of spreads and scenarios helps them keep an open mind but is not enough to counter our natural tendency towards narrow mindedness about the future.

One way to counter the bias is to force consideration of potential futures outside what we would otherwise consider plausible or likely.

Example: Scenario planning does this by assuming combinations of alternative extremes on key environmental drivers.

Example: When it comes to estimating numbers, such as future interest rates, a similar thing can be done by asking people to think about how rates outside their expectations could arise. A facilitator might say ‘I know it is unlikely that rates will rise beyond x%, but imagine that in two years this has happened. What could be the possible reasons for that?’ Once the interest rates expert has finished answering that question the perceived likelihood of the extreme rates will usually have risen.

This is a good example of a psychologically powerful control where it is hard to go through the motions without being influenced in a helpful way.

Control 2.1.4: Critically review facts and opinions about the environment and potential futures.

Our capacity for holding opinions far outstrips our capacity for finding and using the facts that would inform them. Including some activities specifically designed to probe the facts or logic, if any, supporting opinions can be helpful.

The result of this may be a feeling of taking a step backwards, but of course the truth is the opposite. Only when we know we are not sure will we see the value of doing something positive about our ignorance.

Controls over taking stock of the current position and resources

Control 2.2.1: Critically review facts and opinions about current position and resources.

How many employees do we have? How fast is our customer base growing? Are we really more efficient than we were last year? Do our patents have the value we imagine? Include activities that prompt people to review, critically, the facts and arguments.

Example: Imagine that a process requires managers to contribute a written description of the position of their part of the company and they have to do this by completing a document which has standard headings. Implementing this control could be as simple as including headings requiring an assessment of the evidence for key assertions and the potential value of doing more work to establish the truth.

Developing systems of goals/objectives/visions/targets/etc

Control 2.3.1: Deliberately analyse uncertainties by deriving them from strategic models.

Exactly what form this takes depends on the form of the strategy/planning/alignment process. Getting the details right can be important. In particular it is important not to miss out model uncertainty. Here are some examples to show what it means in practice.

  • Strategy map/rationale/logic map: This is where the technique works best because there's more thinking on show. Given a causal model of some kind the areas of uncertainty (or ‘risks’ if you prefer) that need to be considered carefully one by one are:

    • An area for each node of the model (i.e. box, oval, etc) representing uncertainty about future values of that variable.

    • An area for each link of the model (i.e. line, arrow, etc) representing uncertainty about the characteristics of that link such as the strength of the relationship and any time delays involved. (This is part of model uncertainty.)

    • An area representing uncertainty about the structure of the model overall, for example what nodes to include and how to define them.

    • An area representing uncertainty about what to value and how much. Which nodes do you consider to be ones whose value you are ultimately hoping to improve, and how would you value different levels of improvement in them?

    To analyse these areas think about the data or other evidence that you have already that relates to each one and how this has been used so far. Some nodes will be part of the business environment and hard or impossible to influence. What do you see as the future for these, what is that view based on, and how uncertain is it? Always give a best guess reluctantly. It is much better to write about ranges and probabilities to remind people that the future is not known.

    Doing this kind of analysis will quickly show where key assumptions are less well supported than had been imagined and encourage people to be more open minded.

    Considering uncertainty about the links is particularly interesting. Why do we think that doing something will have the desired effect? So often this is just gut feel and it would be worthwhile recognising this and planning to get data, probably in the course of ordinary operations, that will provide facts.

  • Outputs and outcomes: A common form of analysis in the charities sector in the UK is to think in terms of outputs by the charity and the outcomes they should lead to for the people (or animals) the charity is trying to help. This is so much like the strategy map example that the same procedure can be applied.

  • Goal hierarchies: A less clear cut situation occurs where the strategy/planning process involves a hierarchy of objectives in the widest sense but these don't necessarily reflect how the world works. Instead, some or all of the links are just categorising detailed objectives under general headings. In these pictures there is less rationale on display than at first seems to be the case.

  • Unsupported scorecard: Although leading practice is to develop a rationale that supports choices of key performance measures the fact is that most organisations today still don't bother. In this situation the areas of uncertainty to consider are:

    • An area for each performance measure representing uncertainty about future values of that measure.

    • An area representing uncertainty about the best choice of measures to use and how to define them.

    • An area representing uncertainty about what to value and how much. Which KPIs do you consider to be ones whose value you are ultimately hoping to improve, and how would you value different levels of improvement in them?
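The node/link/structure/value breakdown described above can be derived mechanically once a strategy map exists. As an illustrative sketch (not from the original article; the map and all names are hypothetical), in Python:

```python
# Sketch: deriving areas of uncertainty from a simple causal strategy map,
# following the node/link/structure/value breakdown. Names are illustrative.

# A strategy map as nodes (variables) and directed links (believed causes).
nodes = ["staff training", "service quality", "customer satisfaction", "sales growth"]
links = [("staff training", "service quality"),
         ("service quality", "customer satisfaction"),
         ("customer satisfaction", "sales growth")]

def uncertainty_areas(nodes, links):
    """Return one uncertainty area per node, one per link, plus the two
    model-level areas (model structure, and what to value)."""
    areas = []
    for n in nodes:
        areas.append(("node", n, "uncertainty about future values of this variable"))
    for a, b in links:
        areas.append(("link", f"{a} -> {b}",
                      "uncertainty about the strength and time delay of this relationship"))
    areas.append(("model", "structure",
                  "uncertainty about which nodes to include and how to define them"))
    areas.append(("model", "values",
                  "uncertainty about what to value, and how much"))
    return areas

for kind, name, note in uncertainty_areas(nodes, links):
    print(f"{kind:6} {name:45} {note}")
```

The point of generating the register this way is completeness: no node or link is skipped, and the two model-level areas (the part people most often forget) are always present.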

Control 2.3.2: If you can, talk about performance levels rather than success or failure.

The way the strategy/planning/alignment process is designed may insist on categorising everything as either success or failure. For example, some like to do this because they are using targets while others find it allows fancy software to operate more easily.

However, if you have a free choice of whether to think of performance using scales or as success/failure you should choose the former. The success/failure simplification has a number of problems but from a risk management point of view the biggest problem is that it eliminates any awareness of upside risk.

Upside risk is the risk of something happening that is better than we plan, expect, or think correct for some reason. (Which it is in any particular case is a matter of choice or culture.) If the best that can be achieved is ‘success’ then there is nothing beyond that.

If, instead, achievement is on a scale of measure then the full range can be imagined and recorded more easily.

Example: Many public sector organisations in the UK have a vast number of targets imposed on them by central government. It is very tempting to classify performance as success or failure, where success means meeting a target and failure means not meeting it. However, this means that slightly missing a target is no better than missing by a lot, which is irrational and can lead people to focus on improvements where the existing level of performance is close to the target rather than on the areas where the existing level of performance is poor. This is usually the reverse of what ought to happen.
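The difference between the two views can be shown with a few lines of arithmetic. This is an illustrative sketch with made-up targets and results, not data from the article:

```python
# Sketch (hypothetical numbers): why pass/fail classification can reverse
# sensible priorities. All targets here are 90 for simplicity.
targets = {"waiting times": 90, "cleanliness": 90, "complaints handled": 90}
actuals = {"waiting times": 89, "cleanliness": 55, "complaints handled": 95}

# Pass/fail view: both misses look equally bad, so the near-miss
# ("waiting times") competes for attention with the serious shortfall.
failures = [k for k in targets if actuals[k] < targets[k]]

# Scale view: rank by size of shortfall, so the worst performer comes first.
shortfalls = sorted(((targets[k] - actuals[k], k) for k in targets
                     if actuals[k] < targets[k]), reverse=True)

print("Failures (pass/fail view):", failures)
print("Priorities (scale view):  ", [k for _, k in shortfalls])
```

On the scale view, ‘cleanliness’ (35 points short) clearly outranks ‘waiting times’ (1 point short); the pass/fail view treats them identically.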

Controls over aligning/negotiating actions, resources, and targets

Control 2.4.1: If possible, get people to express their views about potential achievements in probabilistic terms.

Many performance management/planning/alignment processes involve some sort of negotiation about targets or other kinds of future achievement level. Cynics would say this involves the boss saying to the subordinate ‘Please give me an idea of what is achievable’ and when this is done saying ‘No, that's not enough. Give me another number.’ before finally imposing one.

I have not seen anything quite so one-sided. Negotiations about targets usually rumble on indefinitely, restarting at the slightest opportunity, such as when forecasting.

If the process allows it, a good way to curb this behaviour is to give people the opportunity to be uncertain about what they can achieve. They are uncertain, and have every right to be, which is one of the reasons they struggle with the whole process.

Example: You could ask ‘How confident are you of achieving an x% increase this year?’ or make it more informative by asking the same question for more than one potential level of achievement.

Control 2.4.2: If resources are under negotiation, get people to express their views about potential achievements in probabilistic terms for given levels of support.

In practice if you ask people how confident they are of achieving a level of performance one of the most common answers is ‘Very confident, provided you give me the support I need.’ Quite often this is a logical answer, but it is also just another step in the negotiations about targets and resources.

A way to get round this is to ask for confidence levels for various levels of achievement given various levels of support e.g. money, staff, senior time, elapsed time.

This condenses several steps of horse trading into one relatively scientific exercise, provided the people concerned have the intellect to do it.

Example: A step in budget negotiations might require budget holders to fill in matrices giving their confidence of reaching three increasing levels of performance given three levels of financial or other support. Once the matrices have been completed and the reasoning discussed and clarified it is for the higher level of management to decide how to allocate the resources, given the levels of risk that will result from different levels of support for particular activities.
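Such a matrix is easy to represent and sanity-check in software. The following sketch uses hypothetical probabilities (none come from the article); the coherence check catches the most common filling-in errors:

```python
# Sketch: a budget holder's confidence matrix, as in the example above.
# Rows are performance levels, columns are levels of support (e.g. budget).
# All probabilities are hypothetical.
performance_levels = ["+2%", "+5%", "+10%"]
support_levels = ["low", "medium", "high"]

# confidence[perf][support] = probability of reaching at least that level
confidence = {
    "+2%":  {"low": 0.90, "medium": 0.95, "high": 0.98},
    "+5%":  {"low": 0.50, "medium": 0.75, "high": 0.90},
    "+10%": {"low": 0.10, "medium": 0.30, "high": 0.60},
}

def check_coherent(confidence):
    """More support should never reduce confidence, and a harder target
    should never be more likely than an easier one."""
    for perf in performance_levels:
        row = [confidence[perf][s] for s in support_levels]
        assert row == sorted(row), f"confidence should rise with support: {perf}"
    for s in support_levels:
        col = [confidence[p][s] for p in performance_levels]
        assert col == sorted(col, reverse=True), f"harder targets should be less likely: {s}"

check_coherent(confidence)
print("Matrix is coherent.")
```

A coherence check like this turns the discussion away from haggling over single numbers and towards whether the pattern of beliefs makes sense.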

Control 2.4.3: When discussing future achievement levels, record the empirical support for projections made.

Statements about what is ‘achievable’ and other projections about future achievement are, inevitably, uncertain to some degree. We feel under pressure to be precise and minimise the uncertainty we express, and we often feign confidence.

Something that can help counter that tendency is to record the empirical support for estimates made. Total guesswork is very different from a statistical extrapolation based on a huge amount of accurate past data. Just capturing that simple difference can be very helpful.

To someone at a very senior level in a large organisation, results can look stable and predictable, in percentage terms. For example, ‘How much software are we going to sell next quarter?’ is a question that the Sales Director can analyse. In contrast, someone at a low level has a harder forecasting task. A salesperson selling software to a large customer may toil for months and get no reward, but then achieve a fantastic and unexpected breakthrough. The forecasting error in percentage terms will be much larger at this lower level. This is not just a matter of data, but also a consequence of the law of large numbers.
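The effect of aggregation on percentage forecasting error is easy to demonstrate with a small simulation. This is an illustrative sketch with made-up numbers (each salesperson is assumed to have a 20% chance per quarter of one deal worth 100):

```python
# Sketch: why percentage forecasting error shrinks at higher levels of
# aggregation (the law of large numbers). Each salesperson's quarterly
# sales are lumpy: usually nothing, occasionally a big win.
import random
random.seed(0)

def salesperson_quarter():
    # 20% chance of one deal worth 100, else nothing (illustrative).
    return 100 if random.random() < 0.2 else 0

def relative_error(n_salespeople, trials=2000):
    """Average |actual - expected| / expected for a team of n salespeople."""
    expected = 20 * n_salespeople  # 0.2 * 100 per person
    errs = []
    for _ in range(trials):
        total = sum(salesperson_quarter() for _ in range(n_salespeople))
        errs.append(abs(total - expected) / expected)
    return sum(errs) / trials

print("1 salesperson:   ", round(relative_error(1), 2))
print("100 salespeople: ", round(relative_error(100), 2))
```

The percentage error for the team of 100 comes out roughly an order of magnitude smaller than for the individual, even though every individual faces exactly the same uncertainty.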

Control 2.4.4: Establish priorities within selected actions.

Many strategy/planning/alignment processes involve prioritisation. However, once the final selection of actions or measures is made they often all have the same high priority.

This control involves establishing priorities even within these select items. This is a recognition that things may change and we may not be able to do as many actions as we fondly imagine at this stage. The priorities serve as a guide to how to react if this happens.

Control 2.4.5: Plan to gather more information, perhaps through ordinary operations, pilots, etc.

As the strategy/planning/alignment process continues there should be actions arising that involve getting or using more information, and these should be carried out promptly.

Example: If plans are documented on a form there might be a cell/column to state when an action is intended to gather information, and if so what.

Example: An internal audit of planning could include a study of actions planned that identifies whether these include items specifically to gain information (among other things) or consist entirely of actions that assume there is no more to learn.

Controls over choosing what to measure over time

Control 2.5.1: Involve data experts in deciding what to measure.

One of the common problems with performance measures is that they are incorrectly recorded, calculated, and/or presented. Financial measures are usually the least likely to be beset with these problems, but even here there are problems with different definitions and time wasting arguments about whose numbers are right.

Therefore, include someone who knows what data is already readily available and can guide marginal decisions about performance measures towards numbers that can easily be obtained and are likely to be reliable.

The best choice of measures will always be decided by considering what is possible and what is desirable. Don't ignore either.

[Incidentally, the value of including users of KPIs in their selection is so obvious and widely regarded as important that I won't mention it again. In one survey I conducted people were so keen to say their users had been involved in selecting KPIs that some even claimed this when KPIs had been entirely imposed by central government!]

Control 2.5.2: Have short term measures as well as long term measures.

People who know it all do not need short term measures to tell them if their plans are having different results from those expected. For the rest of us a bit of early feedback is essential.

This is much more than just checking that people are doing what they are supposed to be doing. One of the biggest areas of uncertainties in most planning is the result of actions planned. For example, we may think that if we do certain things then services will improve, but will they, and by how much? Finding out quickly and efficiently exactly what impact our new approach has is crucial.

Control 2.5.3: Have secondary indicators too, with mechanisms to use them efficiently.

The easy part is to have secondary indicators, by which I mean management information beyond Key Performance Indicators or whatever you call the top level scorecard. Many businesses have more measurements than they know what to do with because they are built into software packages.

The more challenging design problem is to find ways to be alerted to interesting messages from these indicators. One theory is that a set of Key Performance Indicators provides a complete view of performance, to the extent that nothing important happens without an indication on the KPIs. Once some indication is seen then more detail can be obtained by looking at other measures if necessary.

That may or may not be possible for you, but how will you know when you have a set of KPIs with that property? You won't. That's why you must consider the possibility that your KPIs miss something and look beyond them to secondary indicators.

Keeping on top of many secondary indicators is possible through delegation and automation.
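One simple form that automation can take is to flag only those indicators whose latest value sits far outside their own recent history. The following is an illustrative sketch (the indicator names, data, and threshold are all hypothetical):

```python
# Sketch: automating attention to many secondary indicators by flagging
# only those whose latest value is unusual relative to their own history.
from statistics import mean, stdev

def flag_unusual(history_by_indicator, threshold=3.0):
    """Return indicators whose latest value is more than `threshold`
    standard deviations from the mean of the earlier values."""
    flagged = []
    for name, series in history_by_indicator.items():
        past, latest = series[:-1], series[-1]
        if len(past) < 2:
            continue  # not enough history to judge
        sd = stdev(past)
        if sd > 0 and abs(latest - mean(past)) / sd > threshold:
            flagged.append(name)
    return flagged

history = {
    "returns rate %": [2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 6.5],  # sudden jump
    "web enquiries":  [510, 495, 520, 505, 500, 515, 508],   # stable
}
print(flag_unusual(history))
```

The design choice here is that management attention is rationed: hundreds of indicators can be scanned every period, but only genuine surprises reach a human.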

Control 2.5.4: Have mechanisms to catch unexpected news that would not show up in the identified KPIs or perhaps even secondary indicators.

Beyond KPIs and even beyond secondary indicators there are snippets of news that may well be the first indication of something important in the wind. What mechanisms will bring these to management's attention?

Continuing well

Controls over communicating strategy

Control 3.1.1: Ask people inside the organisation to help resolve uncertainties and keep things up to date.

Most strategy/planning/alignment exercises involve some initial work by a relatively small and senior group, followed by a phase where more people are involved. Frequently people talk about ‘cascading’.

As everyone knows, getting people to ‘buy in’ to these processes is not always easy. They need to have a genuine contribution to make or it feels false to them and cynicism spreads.

If the areas of uncertainty encountered in the initial work have been carefully identified and recorded then it should be easy enough to explain these areas to others when telling them about their involvement, and to ask for their help in reducing or managing those uncertainties.

Example: If the top factors affecting buyers' decisions at present were uncertain when the senior team had their workshop then it makes sense to let people know this and invite them to contribute their knowledge, and go about their work in ways that will help resolve this uncertainty, or at least keep it under constant review.

Ideally, the emphasis should be on gathering and using facts rather than just more opinions.

Control 3.1.2: If possible, communicate strategy without killing options or making unnecessary promises.

Even if a strategy has been devised that is appropriately flexible and open ended it is possible to undo all this good work by communicating the strategy in the wrong way. If the strategy is set out as one path to a single destination then all other potential paths and destinations seem to lose value.

Example: Suppose you are running a university department with three research specialisms. All have value, potentially, which is why you have all three. However, right now it is team A that is in the hottest demand, with evidence to suggest this is only the beginning of something even bigger. Suppose you announce a strategy that says investment will be focused on the hot research team. The response by people in the other two teams may well be to seek employment elsewhere.

In this simple example, the consequence of expressing strategy inappropriately is to kill off options (i.e. the other two research teams) that have value because things might change in the future.

Only slightly less painful is the embarrassment of making a promise about the future and then later having to say things will be different. If the strategy is communicated as one path to one destination then embarrassment is the usual consequence.

There are several techniques that can be used to communicate strategy safely; for more details read ‘Writing about flexible plans’.

Controls over ongoing use of measures, monitoring of progress, and revisions

Control 3.2.1: Plan, in writing, to revise targets as frequently as possible, and allocate responsibility for making sure it happens.

Of all the outputs from strategy/planning/alignment processes, targets have the shortest shelf life. Targets are the cream cakes of management - always there, not very good for you, and fresh for only a day. After all the discussion to try to reach target performance levels that perfectly balance achievability with stretch, things usually change before the agreed targets have even been filed. What seemed just right yesterday now seems too easy, too hard, or just irrelevant.

Everyone knows that targets must be revised. The debate is just about how often. If a huge negotiation process were needed every time targets were adjusted it would be hard to justify doing so even annually. But a huge negotiation is not what actually happens.

The more often targets are adjusted the easier it becomes. You can imagine that targets people set or adjust daily are agreed quickly and easily, responding to the latest information. At the other extreme, strategy exercises done every 3 years have to be launched with training or a video just to remind everyone what to do.

Control 3.2.2: Plan, in writing, to revise priorities and allocate responsibility for making sure it happens.

Just because certain actions seemed to be high priority when last planned does not mean they still are or will stay high priority. Plan - in writing - to revise priorities regularly and in response to significant changes and discoveries.

Control 3.2.3: Plan, in writing, to revise measures and allocate responsibility for making sure it happens.

Similarly, the measures that seemed ‘key’ a few months ago may no longer be the best choice. Plan, again in writing, to review and probably revise measures. Make sure the reporting templates and supporting systems are able to cope with changes.

Control 3.2.4: Plan, in writing, to revise procedures and allocate responsibility for making sure it happens.

The process currently in use to set strategy/priorities/etc seems a good one today, but will it seem ideal next quarter, let alone next year? Quite probably not, and indeed if it still seems perfect surely that will be an indication that management have stopped thinking?

Make a written plan to review and revise the process regularly and in response to significant changes and discoveries.

Control 3.2.5: Design and implement controls over the reliability and rapidity of performance measurements.

As a minimum these will include scrutiny of the numbers, but usually this is not enough on its own. A frighteningly high proportion of non-financial indicators in organisations are wrong, often by a lot.

Control 3.2.6: Audit the processes and systems for producing KPIs to confirm they are reliable.

At least some audit coverage of the KPIs is essential.

Control 3.2.7: Show the measurement uncertainty alongside the measurements.

This simple step is a powerful example of embedded risk management at work. Measurement uncertainty refers to the fact that measurements are not usually entirely accurate or reliable. Much of this is unavoidable, at least in the short term. Showing measurement uncertainty is routine in science and engineering, but in business it is more common to present numbers as if they were totally accurate and reliable.

In truth very little management information is accurate or reliable. Even independently audited financial numbers contain more guesswork than most people realise; the final result is just one from a range that would have been acceptable to the auditors. A more obvious example of measurement uncertainty is customer satisfaction derived from samples of customers using questionnaires whose validity has not been tested.

Examples: Measurement uncertainty does not have to be quantified to be expressed in a helpful way. Quantify if possible, of course, but otherwise some caveats about factors affecting accuracy, or simply information about the source of the data, procedures used, or the size of samples will convey the necessary sense of caution.
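Where quantification is possible, even a simple confidence interval conveys the point. Here is a minimal sketch, assuming satisfaction is measured as the proportion of satisfied respondents in a sample (the figures are invented, and the normal approximation is only one of several ways to compute such an interval):

```python
# Sketch: reporting a sampled satisfaction score with its
# measurement uncertainty, instead of as a bare point figure.
import math

def proportion_with_interval(successes, sample_size, z=1.96):
    """Point estimate plus an approximate 95% confidence interval
    for a proportion measured from a sample (normal approximation)."""
    p = successes / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), min(1.0, p + margin)

p, lo, hi = proportion_with_interval(156, 200)
print(f"Customers satisfied: {p:.0%} (95% CI roughly {lo:.0%} to {hi:.0%})")
```

A reader of ‘78% (roughly 72% to 84%)’ is far less likely to over-react to a two-point movement than a reader of ‘78%’ alone.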

Control 3.2.8: Show the empirical support and uncertainty alongside all forecasts.

Forecasts are usually wrong; the only thing we don't know in advance is how wrong. Presenting a single point, ‘best guess’ style of estimate fails to convey the uncertainty inherent in the forecast. As with measurement uncertainty, forecasting error can be expressed in many ways, not just as prediction intervals (i.e. numbers giving a range with a probability attached).
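One simple, non-statistical way to express this empirically is to show the range implied by recent forecast errors alongside the point forecast. A minimal sketch, with invented figures:

```python
# Sketch: attaching an empirical uncertainty range to a point forecast,
# based on the spread of past forecast errors. All numbers are invented.

def forecast_with_interval(point_forecast, past_errors):
    """Express a forecast as a range derived from past (actual - forecast)
    errors, rather than as a single 'best guess' figure."""
    worst_under = min(past_errors)
    worst_over = max(past_errors)
    return (point_forecast + worst_under,
            point_forecast,
            point_forecast + worst_over)

past_errors = [-120, -40, 15, 60, 90]   # actual minus forecast, last 5 periods
low, mid, high = forecast_with_interval(1000, past_errors)
print(f"Sales forecast: {mid} (recent errors suggest {low} to {high})")
```

This also carries the ‘empirical support’ part of the control: the range is derived from how wrong past forecasts actually were, not from anyone's optimism.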

Control 3.2.9: If possible, appraise and reward performance against the latest thinking.

A common assumption is that a person's performance must be appraised against the targets originally agreed. This is not true. There are companies who appraise performance taking into account the conditions that actually unfolded, and many managers do this informally despite the policies of their employer. This is more likely to be fair.

A further refinement is to appraise performance against the latest views of what is important and using the latest selection of performance measures. This means that anyone attempting to game the system or distort measures to gain personal advantage has to bear in mind the risk that the measurement system will be reformed and their tricks will rebound on them.

A very common alternative to both of these is to reward people according to the economic value of their contribution, regardless of performance against targets. All sole traders are rewarded this way and most if not all feel no resentment towards their employer as a result.

Control 3.2.10: Use data to refine the selection of KPIs.

Many ‘key’ performance indicators are chosen because they are believed to be linked to the outcomes that ultimately people are interested in. For example, customer satisfaction metrics are interesting to most people because they believe higher satisfaction leads to growth and profits.

The problem is that sometimes those hypothesised links do not exist or are unimportant.

One way to refine the selection of performance indicators is to use statistical tools on operational data to find out what links really exist and quantify their properties.

With care it is possible to do this even when there are ‘stocks’ such as learning and financial resources that complicate the links. The details are beyond the scope of this article but ‘Better management of large scale financial and business processes using predictive statistics’ explains this idea applied to key risk indicators.
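As a trivial sketch of the basic idea (far simpler than the predictive statistics the linked article describes), one could check whether a candidate KPI is even correlated with the outcome one period later. All figures below are invented:

```python
# Sketch: testing whether a candidate KPI really leads the outcome it is
# supposed to drive, using a lagged correlation. All figures are invented.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Quarterly satisfaction scores, paired with revenue one quarter later.
satisfaction   = [71, 74, 73, 78, 80, 79, 83]
revenue_next_q = [1.02, 1.05, 1.04, 1.10, 1.13, 1.11, 1.16]

r = pearson(satisfaction, revenue_next_q)
print(f"Lag-1 correlation: {r:.2f}")  # a value near 1.0 would support the link
```

Correlation is of course not proof of a causal link, but a persistently weak or absent correlation is a strong hint that the ‘key’ indicator is not driving the outcome people care about.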

Assessing the need to embed risk management

Every one of the above controls that appears in your strategy/planning/alignment process is a step in the right direction, and unless you really do have everything in place there is scope for some worthwhile improvement.

Having said that, it is still worth making an initial assessment to see where the current process stands. If it currently contains almost none of the controls listed above then it is too brittle to succeed at all. Apparent progress so far must be viewed cautiously because it is just a matter of time before unexpected events burst through its rigid defences.

Gaining acceptance for risk management controls

There are two things to bear in mind about the psychology of people responsible for devising or running strategy/planning/alignment processes.

Firstly, devising strategy/planning/alignment processes is not easy. There have been countless different designs, each with its own supporters, consultants, and business books. People care a lot about their design and can react quite strongly to what they see as misconceptions or misused words. The integrity of their process needs to be respected.

Secondly, anyone responsible for making one of these processes happen will probably also be facing a lot of cynicism, reluctance, and game playing. There will be happy enthusiasts who think the process is fantastic, but nevertheless there will be a fair number of people who think it a ridiculous waste of time.

Consequently, expect any suggestions of change or additions, even to details of a process, to be resisted almost by default. Give reassurance that you are looking for only minor changes and that you will suggest nothing that is unreasonable.

One step forward is to get an official role representing risk management through the design of the process. (See control 1.1.1 above.)

Always ask for things that are reasonable and common sense, and start from simple points that are easily agreed. For example, here's a progression towards planning to review and possibly revise something (e.g. priorities):

  1. ‘Do you think surprising developments or discoveries may occur in future?’

  2. ‘Do you think the choice made initially will benefit from review and revision from time to time?’

  3. ‘If no reviews or revisions were done could that undermine the value of the process?’

  4. ‘Would it be ok to include a point in the plan about reviewing and, if necessary, updating this?’

  5. ‘How long do you think it would be safe to go without any review at all?’

  6. ‘If the review procedure was lightweight and did not reopen everything every time, would it be reasonable to have a review every quarter, say?’

This kind of thinking is so sensible it is difficult to argue against, and why would anyone do so? Strategy/planning/alignment processes that ignore uncertainty will fail, so if you help inject some robustness you are doing everyone a favour.

Final words

This article has described a large number of potential controls and some of them would be unacceptable to some strategy process theorists. Is it a problem if not all the controls get implemented?

No. Each control inserted is one more step in the right direction.

Further reading

This limited survey gives some idea of how frequently the main controls described above are used in practice at the moment: ‘Research on risk management within performance management’.

‘Designing intelligent internal control systems’ introduces the idea of intelligent controls and is the foundation of the approach to embedding used in this article.

‘Results of a survey on internal control and risk management recommendations’ shows that most people already recognise intelligent controls as a good idea, but organisations more often lack them.

The trouble with embedding is that anyone can claim they've done it. ‘Of course we manage risk. We do it all the time,’ they say. How can you tell if this is true? ‘So embedded it's disappeared’ offers a long list of things to look for.

‘Open and honest about risk and uncertainty’ explains more of the psychology behind this form of embedding and discusses controls that have a helpful psychological impact.

‘Better management of large scale financial and business processes using predictive statistics’ shows statistical methods applied to learning what drives errors and backlogs in large scale financial and business processes.

‘Writing about flexible plans’ explains various techniques for writing about plans safely.

‘What is attractive about embedded risk management?’ was an early attempt to find out what people liked about embedding. In retrospect only the third scenario is really relevant, but at least it shows a healthy preference among all groups of respondents for the truly embedded approach.

Words © 2006 Matthew Leitch. First published 6 December 2006.