This is a helpful framework for explicitly thinking about data quality.
There have been plenty of frameworks for performance and data; see, for example, A Framework for Performance Information or the Code of Practice for Official Statistics. This one, however, although not the most recent, has a quite distinctive depth of focus on the quality of data as an end in itself, and on the corporate standards that help achieve this.
A joint product from the collective audit bodies of the UK...
There are two broad sets of considerations here. The first is the characteristics displayed by good quality data. These are the outcome of the second set of considerations: the corporate arrangements which directly influence the quality of data, described as the standards.
So the characteristics of good quality data....
And those characteristics are delivered by these five standards, the corporate arrangements to secure good quality data, summarised as follows....
And each of these standards has some exacting questions to test against, each of which is listed below.
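As an aside, one way to make that self-assessment concrete is to capture the standards and their test questions as a simple checklist. The sketch below is a hypothetical Python illustration: the sample questions and the elided entries are placeholders, not the framework's actual wording.

```python
# A hypothetical checklist structure for the five standards and their
# test questions. The sample questions are illustrative placeholders.
standards = {
    "Governance and leadership": [
        "Is there a designated lead for data quality?",
        # ...remaining questions for this standard...
    ],
    "Policies and procedures": [
        "Are data quality policies documented and communicated?",
    ],
    "Systems and processes": [
        "Do systems capture data accurately at the point of entry?",
    ],
    # ...plus the other two standards and their questions...
}

# Record a simple yes/no self-assessment against each question.
responses = {question: None for qs in standards.values() for question in qs}
```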
Not surprisingly perhaps, the emphasis is on the governance and leadership for data quality, from which all else follows. It's worth noting that this is not describing the leadership for data, but rather the leadership specifically for data quality, for which there might even be different leadership roles.
Given these terms are not explicitly defined, it might be helpful to consider policies-and-procedures and systems-and-processes more as a whole. In simple terms, policies-and-procedures can be considered the guidance, and systems-and-processes the actual data activity.
And here are all those 30 questions as a word cloud... so that data quality emerges from the predominant themes of management, reporting, staff, recording, procedures....
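For anyone wanting to reproduce that image, here's a minimal sketch of how such a word cloud might be generated, assuming the 30 questions have been saved to a plain-text file (questions.txt is a hypothetical path) and the third-party wordcloud and matplotlib packages are installed.

```python
import matplotlib.pyplot as plt
from wordcloud import WordCloud

# Read the 30 questions from a hypothetical plain-text file.
with open("questions.txt", encoding="utf-8") as f:
    text = f.read()

# The most frequent words (management, reporting, staff, recording,
# procedures...) dominate the image, so the predominant themes emerge.
cloud = WordCloud(width=800, height=400, background_color="white").generate(text)

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```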
Decisions, Decisions
So what’s in a decision? It would appear to come down to judgement, helpfully defined as… “the evaluation of evidence in the making of a decision”.
So this means there are two parts to that judgement: (1) the evidence itself and (2) the process of evaluation. So a good judgement needs the right evidence (or information) to be evaluated (or analysed) in the right way. So that’s the right information AND the right analysis for a good decision. Hence either poor information OR poor analysis can lead to a poor decision.
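That AND/OR framing can be made explicit. A minimal sketch, assuming judgement really can be reduced to two boolean inputs (a deliberate simplification):

```python
def good_judgement(right_evidence: bool, right_analysis: bool) -> bool:
    """A good judgement needs BOTH the right evidence and the right analysis."""
    return right_evidence and right_analysis

# Either failure on its own is enough to undermine the decision.
assert good_judgement(True, True) is True
assert good_judgement(False, True) is False  # poor information -> poor decision
assert good_judgement(True, False) is False  # poor analysis -> poor decision
```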
So judgement tends to be the shorthand for using evidence and evaluating it. The evidence is perhaps clear enough – some history and context. Of course it needs to be the right evidence (the scope or breadth of evidence) and there needs to be enough of it (the depth of evidence). The evaluation might well be a structured process, weighing up (even weighting) different evidence, but perhaps more often than not a more ephemeral exercise.
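The structured end of that spectrum is easy to sketch. Here's a hedged illustration of weighting different pieces of evidence and combining them into a single score; the evidence items, weights, and scores are all hypothetical.

```python
# Hypothetical evidence items, each with a weight and a score out of 10.
evidence = {
    "historical performance": (0.5, 7),
    "current context":        (0.3, 4),
    "stakeholder views":      (0.2, 8),
}

def weighted_evaluation(items: dict[str, tuple[float, float]]) -> float:
    """Combine weighted evidence scores into a single figure (weights sum to 1)."""
    return sum(weight * score for weight, score in items.values())

print(f"Overall evaluation: {weighted_evaluation(evidence):.1f} / 10")  # 6.3 / 10
```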
So there is a helpful balance to be struck between what might be called “process judgement” and that more ephemeral “judgement by osmosis”. The process judgement is really what’s described above, a transparent approach to thinking about evidence and its evaluation in coming to a decision. So not just the oft-quoted “evidence based” decision making - that’s not enough - but rather evidence and evaluation based. More science than art, and perhaps best typified by the “business case” approach.
The “judgement by osmosis” is the more attractive proxy for all of that process business. The right decision just permeates through all of the potential complexity of evidence and evaluation. In some cases it is a fantastic short cut to assimilate and distil the information and its analysis, providing a natural blend with experience and risk appetite. In other cases, less so... Overall, probably more art than science.
So all decisions are judgement based, but how many of those judgements are evidence and evaluation based?