No Silver Bullet

There are so many management terms and tools that it can be more than a little difficult to see the wood for the trees.  Vision, purpose, strategy, objectives, plans, programmes, projects, performance, change, risk, outputs, outcomes… And then there are the relationships, dependencies and even overlaps and contradictions that link these things together.  So that's a lot of potential activity between defining a purpose and achieving an outcome.

There are even published compilations of "Management Models", including one from the Financial Times which lists just the 60 every manager should know.  Of course there is no silver bullet, rather the prudent application of some fundamentals, and an overall grip on those so that they integrate and pull in the same direction.  So definitely not "off the shelf", but rather a set of tools relevant to specific circumstances.  Less "set menu", or even "à la carte", more "mezze".

So here’s my one-page take on the key components, and how they broadly relate. Not a panacea, but a sense of order and structure.  At least a map by which to navigate that wood of trees…

So that's the broad framework.  Here's the mapping of (1) leadership business, (2) managing business, (3) managing business change and (4) delivering business.

1. Leadership Business: Why and What

So some top-level leadership defining the purpose, plus the objectives to achieve that purpose, ensuring that these are sufficiently clear and specific, and engaging people on this. From these all else flows.  There may even be some preferred values which shape and steer subsequent behaviours.  Leadership is probably less head and more heart, art rather than science, and character rather than personality.  So in simple terms it’s about strategy, the why (…we do what we do) and the what (…we are going to do).

2. Managing Business: How

Management is then about the planned delivery of that purpose and objectives.  That’s more head than heart and more science than art.  So it’s the plan – the how – that makes a reality of that purpose (why) and objectives (what).

So a bit like strategy in military conflict: at least there’s a clear starting point, even though this will adapt and adjust with application in the operational environment.  After all, the term strategy is derived from the military – the plan of action to achieve a specific objective.

3. Managing Business Change

As the Greek philosopher Heraclitus stated around 500 BC, "The only thing that is constant in life is change".  And as echoed by Disraeli, “Change is inevitable. Change is constant”.  Whether that’s (a) a virtuous circle of continual improvement, (b) a neutral circle of change for the sake of change, (c) a repetitive circle of history forgotten, or (d) a vicious circle of decline, it all needs to be managed while still delivering.

4. Delivering Business: Doing

That plan is implemented through processes, people and projects to deliver outputs and outcomes to customers.  So this is the actual doing.   That activity is probably monitored and steered through some combination of performance, portfolio, programme and project management, especially where things are changing.


It’s often the case that many of these management tools operate independently of the others (and there may even be a specific post holder for each of these roles).  But where finance, risk, performance and change are all aligned and focused in the same direction, they respond collectively and efficiently to that purpose and objectives.

In fact one of the simplest and best macro approaches is the well-established police approach of gold/silver/bronze command.  Not least because it has stood the test of time as fashionable models come and go.  The gold commander makes a decision about what needs doing (…seal the football ground).  The silver commander works out how to achieve that (…a team on each entrance/exit, plus a mobile spare).  And the bronze commander(s) actually do it (…present and responsive on each entrance/exit).  The gold commander will also openly seek and take expert advice from specialist advisers (negotiation, public order…) as part of that decision-making process.  A simple and effective framework to structure the broader suite of components.

So a map by which to navigate that wood of trees… not the only map, nor indeed the perfect map.  But at least a functional map, with a clearer view of the landscape, with some places visited and some places yet to be explored.

A Framework for Performance Information

There are frameworks and there are frameworks. The latest best practice data management framework is that provided by the Code of Practice for Official Statistics published by the UK Statistics Authority (see Statutory Statistical for the overview).

While designed for the public sector, it's based on more generally applicable principles which hold fast in other sectors.  However it is quite detailed, and while outcome focussed, there are 74 business practices to consider. So it's worth being open to some of the predecessors which facilitated the preliminary debate and did some of the earlier consolidation in a simpler way.  Not necessarily simple, just simpler.

Notable among these is the Framework for Performance Information – "FABRIC" – a joint product from the National Audit Office, Audit Commission, Office for National Statistics, Cabinet Office and HM Treasury.  The irony is that a decade on, the Cabinet Office and HM Treasury are now legally bound to be compliant with the UK Statistics Authority Code of Practice. What goes around comes around.

Like all such frameworks, it tends to give you the "test answers" rather than the "workings" to get there.  But perhaps more frustratingly, there can be lots of good content that can seem more than a little unconnected… framework, criteria, components and so on.  So here are the parts of that Framework for Performance Information summarised, and the big picture distilled.

A. Framework for Performance Information

How performance data relates to the business...

B. Criteria for Individual Performance Measures

So what makes a good measure...

C. Components for Managing Performance Measures

Planning, assuring and using measures.  And mapping the relationship with the Framework (A) above…

D. Corporate Performance Framework - Distilled.

So this is my take on what that framework might look like if integrated and presented as a whole, to get a better sense of the relationship of those parts.  Developed from the various elements, it is structured around the nature of business activity – which is, after all, the focus of performance measurement – in terms of corporate activity (resources, inputs, processes, outputs and outcomes) and corporate success (economy, efficiency and effectiveness).
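The three E's of corporate success are, at heart, simple ratios over corporate activity. Here's a minimal illustrative sketch of that idea (the function names and all figures are my own invention, not part of the framework itself):

```python
# Illustrative sketch only: expressing the "three E's" as simple ratios.
# All figures below are hypothetical.

def economy(budgeted_cost, actual_cost):
    """Economy: spending carefully -- budgeted vs actual input cost."""
    return budgeted_cost / actual_cost

def efficiency(outputs, inputs):
    """Efficiency: producing more with less -- outputs per unit of input."""
    return outputs / inputs

def effectiveness(outcomes_achieved, outcomes_targeted):
    """Effectiveness: achieving the objective -- outcomes vs target."""
    return outcomes_achieved / outcomes_targeted

print(economy(100_000, 95_000))      # > 1.0 means under budget
print(efficiency(500, 95_000))       # outputs per pound spent
print(effectiveness(42, 50))         # proportion of target outcomes achieved
```

Crude, of course, but it shows why the distilled framework hangs the success measures off the activity measures: each E is meaningless without the underlying resource, output or outcome figures.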

Statistically Speaking

Do numbers really speak for themselves?  Sometimes. Mostly it's the words that really speak… "That's the best satisfaction we've achieved in the last 15 years, and well above the typical level for the sector."

So here's a dip into my collection of statistically related quotes.  Can these words capture the essence of the world of numbers?

Analytical Insight

In the world of analysis, it’s common to have some data and some outputs from the analysis of that data.  There might even be a structured process for that analysis (even an analysis strategy with some direction, structure and flexibility).  So usually some inputs (data), a process (analysis) and outputs (data).  But the real purpose of all of this is to make something happen or change, to make things better – and that’s the eventual purpose, or outcome.
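That input → process → output chain can be sketched in a few lines. This is purely my own illustration (the data and the `analyse` function are invented for the purpose); the outcome, being about people and change, deliberately doesn't appear in the code:

```python
# A minimal sketch of inputs (data), process (analysis) and outputs (data).
# The "outcome" -- engagement and change -- sits beyond what code can show.

raw_data = [12, 15, 11, 14, 18, 13]           # inputs: some data

def analyse(data):                            # process: the analysis
    n = len(data)
    mean = sum(data) / n
    spread = max(data) - min(data)
    return {"n": n, "mean": round(mean, 2), "range": spread}

outputs = analyse(raw_data)                   # outputs: more data
print(outputs)                                # {'n': 6, 'mean': 13.83, 'range': 7}
```

The point of the road map below is everything this sketch glosses over: where the inputs came from, whether the process was the right one, and what happens to the outputs next.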

So here’s the overall road map for the journey, through data, analysis, and the often plentiful outputs.  Not quite so many squares as a Monopoly board, but a journey all the same.

[The data components are described at…Data Provenance. And the analysis process at What a Performance, and the propensity for that analysis to be successful at Analytical Insight Index.]

The reality is that there’s a lot to get right.  That’s 12 data activities that need to be dealt with before the analysis, either explicitly or – more worryingly – implicitly.  Then 6 factors that influence the success of an analysis, including 6 components to consider in the analysis strategy.  So 24 activities by this point.  Then there are the outputs.  Many more options here, but often multiple products combining words, numbers and visuals.

The key point is that shortcuts, omissions or mistakes affect the validity of everything that follows.  And it’s a long one-way street.  Decide to categorise in a specific way for data collection and you’re probably stuck with it… ask individuals their age and record it in 10-year age bands, and you can’t differentiate into 5-year bands later down the line.  Equally, poor validation means that the associated errors become implicitly (and even unknowingly) present in the analysis and the messages that emerge.  In short, errors compound during the journey.  In the words of Saint Thomas Aquinas in the 13th century, “A small error at the outset can lead to great errors in the final conclusions”.  The principle is also nicely illustrated by the WW1 quote (although of questionable authenticity) “Send reinforcements, we're going to advance” which became “Send three and fourpence, we're going to a dance”.
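The age-band example above is worth making concrete. Here's a small sketch (my own toy example, with invented ages) showing why the one-way street can't be walked backwards: once only the 10-year bands are recorded, the 5-year detail is unrecoverable.

```python
# Illustration of irreversible categorisation: band ages at collection time,
# and the finer breakdown is gone for good.

def band(age, width):
    """Label an age with its band of the given width, e.g. 23 -> '20-29'."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

ages = [23, 27, 31, 34, 38]            # what respondents actually said

ten_year = [band(a, 10) for a in ages]   # what was recorded
five_year = [band(a, 5) for a in ages]   # what we later wish we had

print(ten_year)    # ['20-29', '20-29', '30-39', '30-39', '30-39']
print(five_year)   # ['20-24', '25-29', '30-34', '30-34', '35-39']

# Given only ten_year, both 23 and 27 collapse to "20-29" -- no amount of
# later analysis can tell them apart again.
```

The reverse direction works fine (5-year bands can always be merged into 10-year ones), which is the practical argument for collecting at the finest granularity you can afford.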

So if all that’s done well enough and we have some confident outputs, it can still be quite a leap from outputs to making things happen.  How do those outputs turn into outcomes?  To get real outcomes means really engaging people, and that is perhaps the realistic proxy outcome here: getting sufficient engagement to facilitate and drive those changes for the better.

That’s about getting understanding or, more specifically, insight, defined variously as:
..... to perceive clearly or deeply;
..... a penetrating (and often sudden) understanding of a complex situation or problem;
..... the act or result of understanding the inner nature of things;
..... the power of acute observation and deduction, penetration, discernment;
..... Psychology… the capacity for understanding one’s own mental processes;
..... Psychiatry… the ability to understand one's own problems.

Insight is where things can get a bit more “heart” and a bit less “head”, more “art” and less “science”, and perhaps more fundamentally more “internal” and less “external”.  Turning those physical products full of words, numbers, graphics or visualisation into personal understanding in the minds of others. But insight is more than understanding observations; it’s about understanding messages and meaning in a broader context.

Those output products emerge from a broadly internal process to become external product(s) – basically inside-out.  Insight is more of an outside-in take on that analysis: looking at it from the point of view of its context, and that of key stakeholders.

A useful approach, seen from the stakeholder’s point of view, is to:

(1) test or challenge a message to check its underlying robustness – if necessary, unpacking those preceding data and analysis steps.  This could be considered a “depth” check.

(2) check the scope of the analysis to see that the relevant factors are included and the relevant context reflected, tested against some external wisdom or benchmark.  This could be considered a “breadth” check.
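Both checks can be sketched in code, if only to show how mechanical the first pass can be. This is my own framing with invented figures; in practice the "external wisdom" for the breadth check is a judgement call, not a pair of numbers:

```python
# A toy sketch of the "depth" and "breadth" checks on a claimed message.

def depth_check(claimed_mean, data, tolerance=0.01):
    """Depth: unpack the preceding step -- does the data support the message?"""
    recomputed = sum(data) / len(data)
    return abs(recomputed - claimed_mean) <= tolerance

def breadth_check(value, benchmark_low, benchmark_high):
    """Breadth: does the figure sit plausibly against an external benchmark?"""
    return benchmark_low <= value <= benchmark_high

# Hypothetical satisfaction scores and a claimed headline figure
data = [3.1, 3.4, 2.9, 3.2]
message_mean = 3.15

print(depth_check(message_mean, data))         # the message matches the data
print(breadth_check(message_mean, 2.0, 4.0))   # and sits within sector norms
```

A message that fails the depth check sends you back down the one-way street; one that fails the breadth check may be arithmetically sound but missing context.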

So it’s helpful to look at those analysis products in a different way, with increasing stakeholder interest and engagement. So:

Step 1. Cover the Angles: ensuring that the scope of consideration is complete, priorities are reflected and the analysis is robust.

Step 2. Synthesis: consolidating that analysis into a single integrated, coherent, even holistic, perspective.  More than simply the whole being greater than the sum of the parts – it’s about making connections to see relationships, influences and dependencies as clearly and simply as possible.

Step 3. Key Messages: distilling the complexity and meaning into the smallest number of most strategic messages.

So in short: take an outside-in view and make messages as simple as they can be, while respecting and reflecting the underlying complexity, relationships and context.  To paraphrase Einstein, “Make things as simple as possible, but not simpler.”

Analysis Recipe Card

There are real similarities in language between cooking and data analysis… as previously described (Slow Roast Data).

To take this relationship a step further, it’s quite possible to consider the analysis process to be just like a food recipe.  They are basically the same process: some ingredients, and a method for working with those ingredients, to create something that brings out the best in them.

Perhaps this is really just a project plan, with some inputs, a process and outputs. The data are the ingredients, the analysis of that data is the cooking process, and the finished meal is the output.  To push this further, we might like to think beyond output to outcome.  So not just a meal that provides functional energy, but one which is a pleasure for the senses – sight, smell, touch (texture), taste and even hearing (think of a sizzling steak or peppers, or an Indian tandoor).  So an analysis that really delights, finds new truths and drives progress.

After all, for the food that hits the spot, it’s not unusual to hear “Can I have the recipe?”.  Perhaps this is the test we might like to hear passed for a good piece of analysis.  It also provides that sense of repeatability, the rigour of science, and potentially a useful audit trail in the right circumstances.
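What might a recipe card for an analysis actually look like? Here's a hypothetical sketch (the class, its fields and the example content are all my own invention, loosely shaped like the real analysis mentioned below): ingredients, method and serving recorded up front, so the whole thing can be repeated and audited.

```python
# A hypothetical "recipe card" structure for a piece of analysis:
# write down the ingredients (data), method (steps) and serving (audience)
# so the analysis is repeatable and leaves an audit trail.

from dataclasses import dataclass

@dataclass
class RecipeCard:
    title: str
    ingredients: list      # the data sources
    method: list           # ordered analysis steps
    serves: str            # intended audience / output format

    def as_text(self):
        """Render the card as plain text, like the paper version."""
        lines = [self.title, "Ingredients:"]
        lines += [f"  - {i}" for i in self.ingredients]
        lines += ["Method:"]
        lines += [f"  {n}. {s}" for n, s in enumerate(self.method, 1)]
        lines += [f"Serves: {self.serves}"]
        return "\n".join(lines)

card = RecipeCard(
    title="Salary vs financial responsibility",
    ingredients=["salary data", "financial responsibility data"],
    method=["validate and link records", "compute ratios", "chart the spread"],
    serves="senior management, as a one-page visual",
)
print(card.as_text())
```

Anyone handed the card can re-run the method on the same ingredients and expect the same meal – which is precisely the "Can I have the recipe?" test.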

So here’s my recipe card.  In fact it refers to some real analysis looking at senior civil service salaries and the corresponding variation in financial responsibility… (For Whom the Buck Stops)