99% Disambiguation

Disambiguous.  That's the plural of sale.  Also defined as the clarification that follows from the removal of ambiguity.

Given the grand and overt promotion that goes with sales, there might well be some tension between these two definitions. The proportionality implied in those marketing materials might raise an eyebrow at the very least.

Here are some sale posters for a leading high street clothing retailer of over 100 years' standing...

Notice the "selected lines" tucked in at the bottom under the "50% off sale".  Not much emphasis there.

Might we reasonably expect the space given over to the "50% discount", compared to the space given to "selected lines", to be in some way proportionate?  If so, the spatial emphasis gives us a fair sense of how many lines are in the "50% off sale".

Let's measure.

So in comparative terms, the "50% off sale" gets 100.8 sq units compared to 1.08 sq units for "Selected lines".

That works out at 99% of the spatial emphasis going to the "50% off sale". Assuming some reasonable sense of proportionality, that might well lead us to assume that the discount applies to 99% of lines.  That's nearly everything (assuming of course that all "lines" have a similar number of items).  Perhaps we might be surprised if all bar 1% of the lines had 50% off.  Perhaps we're so used to this that we just know that's not going to be the case.....
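As a quick sanity check, the proportionality arithmetic can be sketched in a few lines; the square-unit figures are the ones measured above, not re-measured:

```python
# Poster measurements from above (square units, as measured in the post).
sale = 100.8       # area given to "50% off sale"
selected = 1.08    # area given to "selected lines"

# Share of the total emphasis given to the discount message.
share = sale / (sale + selected)
print(f"{share:.0%}")  # 99%
```

So 99% of the spatial emphasis, versus (we suspect) considerably less than 99% of the lines.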

So perhaps "Disambiguation" might well be the right term after all, though intuitively it seems to be more about disingenuous ambiguity.....

With the shift to more overtly ethically produced clothing, how about a shift to more proportionate marketing of those clothes....


I am what I am

Whatever the balance between nature and nurture, this phrase has some provenance.

Whether it's from the song made famous by Shirley Bassey and more recently John Barrowman ... "I am what I am, my own special creation", a notable Popeye phrase from the 1930s, or earlier biblical references too.

We are all different, and that's reflected in every personal interaction, and hence every work interaction too. That's why in the world of work there's a lot of time and effort expended on that understanding of self, as a stepping stone to understanding others.

One of the most popular models of our types is the Myers-Briggs Type Indicator, based on the work of Carl Jung. Here's my ready reckoner that lays this out in a visual way to get a personal sense of where our preferences lie.



This is one of my terms.

This is used to describe a number that should be berated for being presented out of its context.  Most typically this is a numerator, presented in the absence of any denominator context.


Project Traction

Everything is a project, with some combination of purpose, time and resources.  This applies to life in general too.  Going on holiday is a project: some purpose (maybe sun or surf or serenity), then some planning, then using resources and doing activities.  We think of projects more often in the workplace context, especially where project management methodologies start to be wheeled out.

Once project management methodologies get wheeled out, even the simpler projects can seem more complex, because with project management methodologies there's simply more to do.  So not just achieving the purpose of the project, but now also constructing and feeding the project management machinery along the way.

So at this point it can be helpful to distinguish between what the project delivers to achieve its purpose and what the project delivers to support the project management.  After 20+ years, PRINCE2 is the de facto project management standard for the public and not-for-profit sectors (originally developed through the Office of Government Commerce (OGC), now subsumed within the Department for Business, Innovation and Skills).

It can be helpful to think of this in a more accessible way, and to maintain some explicit air gap between these things.  The things that the project delivers to fulfil its purpose are deliverables, products or outputs.  The things that the project delivers to support the project management can be more simply described as the project tools.  In the same way that a carpenter's toolbox contains things that enable the carpenter to achieve some purpose with wood, the tools are the means to that end, and while important - even critical - to achieving that purpose, those tools are a means to an end rather than an end in themselves.  (Sure, for someone else their purpose is to make tools for the carpenter in the first place - probably making tools with machinery.)

It's the project management that can reliably provide the traction for the project.  The way a tractor is designed to provide traction in a range of circumstances - to pull different tools.  The tractor is that generic tool, just like those project management methodologies. Both can be applied to different purposes in different circumstances....


Risky Business

Risk is an integral part of our world and our lives and so too of the workplace. 

Risk management, an industry in its own right, is defined as “the identification, assessment, and prioritisation of risks (defined in ISO 31000 as the effect of uncertainty on objectives, whether positive or negative) followed by coordinated and economical application of resources to minimise, monitor, and control the probability and/or impact of unfortunate events or to maximize the realisation of opportunities”. Source: Wikipedia

Corporately, risk is often seen in the use of ‘risk logs’ – a project management tool used to list, assess and (even) manage risk – used to support larger streams of work.  Then of course there are the more implicit approaches in activities like SWOT analysis – strengths, weaknesses, opportunities and threats.  Those threats being potential risks.  However those opportunities present a more positive approach to risk – the risk of not doing positive things.

There’s some really helpful pointers to getting to grips with risk from David Spiegelhalter, who is Professor of Public Understanding or Risk at Cambridge University.  The author of some really useful materials, not just studying public understanding of risk, but also enabling that understanding too.  So some really helpful materials.   Having seen the talk (the University’s Department of Continuing Education)  on communication risk and uncertainty explaining “how people's perceptions of risk and uncertainty are influenced by the words, numbers and pictures used to communicate them”, there’s some really helpful observations and pointers to help get to grips with practically thinking about risk.  Here my key observations from that talk….

Approaching risk…  

We have two approaches to risk.  Firstly there’s the feeling approach.  That’s the gut reaction, fuelled by intuition, emotion and culture.  Secondly there’s the rational approach, weighing things up in a structured way, with careful analysis and measurement.  We seem to be comfortable with the more intuitive approach, especially for the more personal matters.   When it comes to decisions taken on our behalf – for example by local and national government – we of course expect to see the more rational approach.  Therein lies the world of business cases and cost benefit analysis.

Framing this issue… 

The purpose or context in which risks are presented is key.  It might simply be to attract attention to the issue, perhaps to inform or educate, or even change behaviour.


Risk is often reported as tables of numbers, but increasingly as graphics too.  Interestingly, the smallest number actually represents the biggest risk when presented this way: 1 in 100 (higher risk) vs 1 in 10,000 (lower risk).  It’s proven that we react in some predisposed ways to how risks or probabilities are presented.  For example, Ratio Bias is where we focus on the numerator more so than the denominator.  So take those two black bags... bag ‘A’ with one white and two red balls, and bag ‘B’ with two white and four red balls.  If the challenge is to draw a white ball then we’ll opt for bag B – the chances are exactly the same, but the fact that we know there are more white balls is too much to resist.  While that’s subtle, then how about Denominator Neglect, where we ignore the denominator entirely.  So with those balls in the bag, we behave as if the red balls don’t exist at all.
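Ratio bias is easy to check with exact fractions; note that for the two bags' chances to be exactly equal, bag B needs two white and four red balls (the counts here are for illustration):

```python
from fractions import Fraction

# Bag A: one white, two red.  Bag B: two white, four red.
p_a = Fraction(1, 1 + 2)  # chance of a white ball from bag A
p_b = Fraction(2, 2 + 4)  # chance of a white ball from bag B

print(p_a, p_b, p_a == p_b)  # 1/3 1/3 True
```

Same odds, but the bigger numerator in bag B is what catches the eye.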

Evidence and consensus… 

Certainly for some of the big areas of risk, with lots of focus by lots of experts, we can look to a “meta” assessment of the risk, to get an overall impression of the understanding of the risk.  The two dimensions here are (a) the robustness of the evidence and (b) the degree of consensus.   So the robustness covers type, amount, quality and consistency, and the consensus covers the extent of agreement amongst experts.  Here’s how the quality of evidence might be assessed, based on confidence in estimates of medical treatment effects:
  • High quality – further research is very unlikely to change our confidence in the estimate of the effect.

  • Moderate quality – further research is likely to have an important impact on our confidence in the estimate and may change the estimate.

  • Low quality – further research is very likely to have an important impact on our confidence in the estimate and is likely to change the estimate.

  • Very low quality - any estimate of the effect is uncertain.

Of course the same evidence can be interpreted in different ways.  A favourite example of mine is the measurement of “staff turnover”, which was for several years a national performance indicator for all police forces in England.  So a low turnover either means “staff are very happy here and are not drawn elsewhere” or “staff are not good enough to achieve employment elsewhere”.  It's probably the former that holds sway by default.

So arguably what we seek is a strong expert consensus based on robust evidence, rather than disagreement based on weak evidence.  That said, the world of consensus has had plenty of wake-up calls, or “paradigm shifts”, which can turn current thinking on its head….the world is now officially round.


There’s interesting ways to help present risk to ease understanding.  The idea of the “micro mort” which is a 1 in a million chance of dying, being used to more easily compare probabilities.  Examples of experiencing a micromort are travelling 6,000 miles in a train, driving 230 miles in a car, riding six miles on a motorbike.

Then there are the visual tools.  For example, a risk ladder shows the levels of risk on a log scale to make comparing bigger and smaller risks easier.  Arrays of icons – grids of icons – help us see a risk relative to a larger group.  Blocked arrays (where risk icons are clustered) are good for comparing, while scattered arrays (where risk icons are scattered) are good for seeing personal risk.
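An icon array of the kind described can be sketched in a few lines; the grid size and characters here are my own assumptions for illustration:

```python
import random

def icon_array(risk_per_100: int, scattered: bool = False) -> str:
    """Render a 'N in 100' risk as a 10x10 grid of icons."""
    icons = ["X"] * risk_per_100 + ["."] * (100 - risk_per_100)
    if scattered:
        random.shuffle(icons)  # scattered array: good for sensing personal risk
    # blocked array (default): icons clustered together, good for comparing risks
    return "\n".join("".join(icons[i:i + 10]) for i in range(0, 100, 10))

print(icon_array(13))  # a 13-in-100 risk as a blocked array
```

Swapping `scattered=True` turns the same data into the scattered form, where any one of the hundred could be you.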


Meeting Measure

Meeting…. defined as the act or process of coming together as an assembly for a common purpose. A very laudable and effective exercise when done well.  However, the practice of meetings can seem a little less rigorous at times, and not as effective as it could be.  There are some key factors at play here, and here are some thoughts specifically around preparation and purpose.

In practice preparation can be quite variable.  Here’s a simple meetings measure to reflect the degree of preparation.  At one end of the continuum there is “Nightclub” style…just turn up and play it by ear.  Might even leave to go to something more interesting part way through.  At the other end of the continuum is “Bookclub” style…pre-read with insightful collective reflection.  Materials are read and reflected on in advance, and the meeting is the chance to share perspectives.  Most practical meetings are somewhere in between, but there’ll be a tipping point where just the right amount of preparation can maximise effectiveness.

In practice the purpose might not always be explicit enough.   Perhaps we need a more dynamic term for the person in charge, rather than the ‘chair’ – not a great metaphor, being a passive functional object.  How about the “purposer”: someone who is relentlessly focussed on delivering the purpose or objective of the meeting…


Data Quality Standards

This is a helpful framework for explicitly thinking about data quality.

There have been plenty of frameworks for performance and data, see for example A Framework for Performance Information or the Code of Practice for Official Statistics.  However this one, although not the most recent, has a unique depth of focus on the quality of data as an end in itself, and on the corporate standards to help achieve this.

A joint product from the collective auditing bodies of the UK.....

There are two broad sets of considerations here.  The first are the characteristics which are displayed by good quality data.  These are the outcome of the second set of considerations, which concern the corporate arrangements which directly influence the quality of data, and are described as the standards.

So the characteristics of good quality data....

And those characteristics are delivered by these five standards, the corporate arrangements to secure good quality data, and summarised as follows....

And each of these standards has some exacting questions to test against, each of which is listed below.

Not surprisingly perhaps, the emphasis is on the governance and leadership for data quality, from which all else follows.  It's worth noting that this is not describing the leadership for data, rather the leadership specifically for data quality, for which there might even be different leadership roles.

It might be helpful to consider policies-and-procedures and systems-and-processes more as a whole, given these terms are not explicitly defined.  In simple terms, policies-and-procedures can be considered as the guidance, and systems-and-processes as the actual data activity.

And here are all those 30 questions as a wordcloud...so that data quality emerges from the predominant themes of management, reporting, staff, recording, procedures....


Decisions Decisions

So what’s in a decision?  It would appear to come down to judgement, helpfully defined as… “the evaluation of evidence in the making of a decision“.

So this means there are two parts to that judgement: (1) the evidence itself and (2) the process of evaluation.   So a good judgement needs the right evidence (or information) to be evaluated (or analysed) in the right way.   So that’s the right information AND the right analysis for a good decision.  Hence either poor information OR poor analysis can lead to a poor decision.

So judgement tends to be the shorthand for using evidence and evaluating it.   The evidence is perhaps clear enough – some history and context.  Of course it needs to be the right evidence (the scope or breadth of evidence) and there needs to be enough of it (the depth of evidence).    The evaluation might well be a structured process, weighing up (even weighting) different evidence, but perhaps more often than not a more ephemeral exercise.

So there is a helpful balance to be struck between what might be called “process judgement” and that more ephemeral “judgement by osmosis”.    The process judgement is really what’s described above, a transparent approach to think about evidence and its evaluation in coming to a decision.     So not just the oft quoted “evidence based” decision making - that’s not enough - but rather evidence and evaluation based.   More science than art, and perhaps best typified by the “business case” approach.

The “judgement by osmosis” is the more attractive proxy for all of the process business.   The right decision just permeates through all of the potential complexity of evidence and evaluation.   In some cases a fantastic short cut to assimilate and distil the information and its analysis, and providing a natural blending with experience and risk appetite.  In other cases less so...  Overall probably more art than science.

So all decisions are judgement based, but how many of those judgements are evidence and evaluation based?

Let’s not forget it’s also quite possible to get the right decision randomly, “more by luck than judgement”.  So judgement is just the helping hand to increase the chances of making the right decision first time, by using sufficient evidence and evaluating it systematically.


Context is King

As the future King of England is wed, it's actually the context of the wedding which is the big attraction.  After all the marriage, a personal commitment between two people, is legally just the same as any other (well, actually multiple marriage registers rather than just one).  It's all the context that provides the whole picture....royalty, romance, and hence the ensuing super spectacle.

And so it is that context is king in any effective analysis and interpretation, whether strategic or operational.  Here's a couple of examples at the other extreme...

The "12 inch pizza" from the supermarket looked a little small, and on measuring, not only was it smaller than its backing, but the backing itself was not the full 12" (being only 30cm wide rather than the expected 30.5cm).  So we're missing at least 0.5cm of pizza!

If we were missing that 0.5cm from the middle of the pizza, that's only 0.2 square centimetres.  But from the edge of the pizza, missing all the way round, that's a different story....a whole 24 sq.cm missing, around 120 times more pizza than if it were missing from the middle...
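The pizza arithmetic is easy to verify (the 24 sq cm figure rounds up from 23.8):

```python
import math

# The same 0.5 cm shortfall in diameter, lost from the middle vs from the edge.
full_r = 30.5 / 2   # expected radius (cm)
short_r = 30.0 / 2  # actual radius (cm)

hole = math.pi * (0.5 / 2) ** 2                # a 0.5 cm hole in the middle
ring = math.pi * (full_r**2 - short_r**2)      # a 0.25 cm ring lost all round

print(f"middle: {hole:.2f} sq cm")    # middle: 0.20 sq cm
print(f"edge:   {ring:.1f} sq cm")    # edge:   23.8 sq cm
print(f"ratio:  {ring / hole:.0f}x")  # ratio:  121x
```

Same half-centimetre, two very different amounts of pizza, all down to where it goes missing. Context is king.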

Another simple example of the importance of context...In need of a garage or loft conversion, then how about some recommendations...some quotes, proudly displayed on the side of a van....

"The best builder I have ever used".

Well if it's the first builder I have ever used then.... that also makes it the worst builder I have ever used.

"Standard of finish was beyond my expectation".  

Well if the expectation was very poor, the standard of finish could be poor, and this would still hold true.


Communicate to Connect

Communication is of course important to the success of many activities.  Not least the larger scale projects which will have their own communications plan, and may even be driven by a stakeholder plan. 

However, if communication is important, it’s connection that’s paramount.  It’s that difference between hearing (communicating) and listening (connecting).   If hearing is detecting sound, then listening is understanding and interpreting that sound.

Lots of communication planning and development gets built around the physical and tangible tools – the media, the mechanisms.  “We need a website” or “we need a newsletter” or “we need a {blank}”.  That’s quite a practical way to get started, but it needs some mitigation.  The risk is that the mechanism becomes an end in itself rather than a means to an end.

Those mechanisms are simply the things that join a message with people.   After all, people might well communicate via a mechanism, but will connect with a message.   While we can generalise about groups of people using phraseology like “stakeholder segments”, they are individual people.  So perhaps we tend to think about communicating with impersonal stakeholder groups, rather than actually connecting with real people – real persons, in fact.  Think less 'group of people', more 'collection of individuals'.  So that's communication with groups to connect with individuals. So a new term perhaps... 'communect'... communication that connects.

The real issue with starting with the mechanism is that this is a uni-dimensional approach to a (typically) multi-dimensional situation.  We have (1) message(s) to share, (2) mechanisms to communicate those messages, and (3) the people with whom to connect.   While that mechanism centric approach can work well enough with one message for one group of people, it’s not such a sensible starting point for multiple messages or multiple stakeholders, and even more tricky for multiple messages and multiple stakeholders.
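One way to make that multi-dimensional structure concrete is to treat the plan as a mapping from (message, stakeholder) pairs to mechanisms, so the mechanisms fall out as a consequence rather than the starting point. A minimal sketch; every name here is an illustrative assumption, not a prescribed format:

```python
# A hypothetical comms plan: each (message, stakeholder) pair is mapped
# to the mechanisms that carry it.
comms_plan = {
    ("project launch", "residents"): ["newsletter", "website"],
    ("project launch", "funders"): ["briefing note"],
    ("progress update", "residents"): ["website"],
    ("progress update", "funders"): ["briefing note", "meeting"],
}

def mechanisms_in_use(plan: dict) -> list:
    """The mechanism-centric view, derived as a query rather than a premise."""
    return sorted({m for mechanisms in plan.values() for m in mechanisms})

print(mechanisms_in_use(comms_plan))
# ['briefing note', 'meeting', 'newsletter', 'website']
```

Starting from the pairs rather than the mechanisms keeps the message-to-person connection primary, with the mechanism as the means between them.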

So the mechanism is simply the means to connect messages to people. Here’s a take on that which can help identify, clarify and structure those multi-dimensional situations.  So a framework for planning project comms....

That message might simply be awareness raising, information giving, or about brand presence, through to a stronger sense of engagement.  That said, this tends towards the traditional ‘broadcast’ approach rather than more two-way engagement, but there’s still a place for that, and a structured approach to managing it.  After all, engagement is built on communication and connection.


No Silver Bullet

There are so many management terms and tools that it can be more than a little difficult to see the wood for the trees.   Vision, purpose, strategy, objectives, plans, programmes, projects, performance, change, risk, outputs, outcomes….. And then there are the relationships, dependencies and even overlaps and contradictions that link these things together.  So that’s a lot of potential activity between defining a purpose and an outcome.

There are even published compilations of “Management Models”, including the one from the Financial Times which lists just the 60 every manager should know.  Of course there is no silver bullet, rather the prudent application of some fundamentals, and an overall grip on those so that they integrate and pull in the same direction.  So definitely not “off the shelf”, but rather a set of tools relevant to specific circumstances…less “set menu”, or even “à la carte”, more “mezze”.

So here’s my one page take on the key components, and their broad relativity. Not a panacea but a sense of order and structure.  At least as a map by which to navigate that wood of trees…

So that's the broad framework.  Here's the mapping of (1) leadership business, (2) managing business, (3) managing business change and (4) delivering business.

1. Leadership Business: Why and What

So some top level leadership defining the purpose, plus the objectives to achieve that purpose, ensuring that these are sufficiently clear and specific, and engaging people on this. From these all else flows.  There may even be some preferred values which shape and steer subsequent behaviours.  Leadership is probably less head and more heart, more art than science, and more character than personality.  So in simple terms it’s about strategy, the why (…we do what we do) and the what (…we are going to do).

2. Managing Business: How

Management is then about the planned delivery of that purpose and objectives.  That’s more head than heart and more science than art.  So it’s the plan  - the how - that makes a reality of that purpose (why) and objectives (what).

So a bit like a strategy in military conflict, at least there’s a clear starting point, even though this will adapt and adjust with application in the operational environment.  After all the term strategy is derived from the military – the plan of action to achieve a specific objective.

3. Managing Business Change

As the Greek philosopher Heraclitus stated in 500BC, "The only thing that is constant in life is change".  And as echoed by Disraeli, “Change is inevitable. Change is constant”.  Whether that’s (a) a virtuous circle of continual improvement, (b) a neutral circle of change for the sake of change, (c) a repetitive circle of history forgotten, or (d) a vicious circle of decline, it all needs to be managed while still delivering.

4. Delivering Business: Doing

That plan is implemented through processes, people and projects to deliver outputs and outcomes to customers.  So this is the actual doing.   That activity is probably monitored and steered through some combination of performance, portfolio, programme and project management, especially where things are changing.


It’s often the case that many of these management tools operate too independently of the others (and there may even be a specific post holder for each of these roles).   Where finance, risk, performance and change are all aligned and focused in the same direction, they respond collectively and efficiently to that purpose and objectives.

In fact one of the simplest and best macro approaches is the well-established police approach of gold/silver/bronze command.  Most notably because this has stood the test of time as fashionable models come and go.   The gold commander makes a decision about what needs doing (…seal the football ground).  The silver commander works out how to achieve that (…a team on each entrance/exit, plus a mobile spare).  And the bronze commander(s) actually do it (…present and responsive on each entrance/exit).  The gold commander will also openly seek and take expert advice from specialist advisers (negotiation, public order…) as part of that decision-making process.  A simple and effective framework to structure the broader suite of components.

So a map by which to navigate that wood of trees…not the only map nor indeed the perfect map.  At least a functional map, with a clearer view of the landscape, with some places visited and some places yet to be explored.


A Framework for Performance Information

There are frameworks and there are frameworks. The latest best practice data management framework is that provided by the Code of Practice for Official Statistics published by the UK Statistics Authority (see Statutory Statistical for the overview).

While designed for the public sector, it's based on more generally applicable principles which hold fast in other sectors.  However it is quite detailed, and while outcome focussed, there are 74 business practices to consider. So it's worth being open to some of the predecessors who facilitated the preliminary debate and did some of the earlier consolidation in a simpler way.  Not necessarily simple, just simpler.

Notable is the Framework for Performance Information - "FABRIC".  A joint product from the National Audit Office, Audit Commission, Office for National Statistics, Cabinet Office and HM Treasury.  The irony is that a decade on, the Cabinet Office and HM Treasury are now legally bound to be compliant with the UK Statistics Authority Code of Practice. What goes around, comes around.

Like all such frameworks they tend to give you the "test answers" rather than the "workings" to get there.  But perhaps more frustratingly, there can be lots of good content that can seem more than a little unconnected...framework, criteria, components and so on.  So here's that Framework for Performance parts summarised, and the big picture distilled.

A. Framework for Performance Information

How performance data relates to the business...

B. Criteria for Individual Performance Measures

So what makes a good measure...

C. Components for Managing Performance Measures

Planning, assuring and using measures.   And mapping on the relationship with Framework (A) above...

D. Corporate Performance Framework - Distilled.

So this is my take on what that framework might look like if integrated and presented as a whole, to get a better sense of the relationship of those parts.  Developed from the various elements, it is structured around the nature of business activity - which after all is the focus of performance measurement - in terms of corporate activity (resources, inputs, processes, outputs and outcomes) and corporate success (economy, efficiency and effectiveness).