icm2re (I Changed My Mind Reviewing Everything) is an ongoing web column by Brunella Longo

This column deals with aspects of the change management processes experienced in almost any industry impacted by the digital revolution: how to select, create, gather, manage, interpret and share data and information, either because of internal and usually incremental scope - such as learning, educational and re-engineering processes - or because of external forces, like mergers and acquisitions, restructuring goals, new regulations or disruptive technologies.

The title - I Changed My Mind Reviewing Everything - is a tribute to authors and scientists from different disciplinary fields who have illuminated my understanding of intentional change and decision making processes during the last thirty years, explaining how we think - or how we think about the way we think. The logo is a bit of a divertissement, from the Latin divertere, meaning to turn in separate ways.


When the accuracy page is not found

A case of arguable data (or systems) engineering

How to cite this article?
Longo, Brunella (2018). When the accuracy page is not found. A case of arguable data (or systems) engineering. icm2re [I Changed my Mind Reviewing Everything ISSN 2059-688X (Print)], 7.6 (June).

We have to recognise others as both different to us and the same as us. If we see others as entirely different we cannot understand them. If we see them as entirely the same we cannot see what makes them original and different.
Edgar Morin

London, 26 September 2018 - Another page of mine has recently disappeared from the Web and there is no trace of it anywhere but in my own records, thanks to the fact that I printed it when I created it in November 2017, upon publishing Decluttering machine learning through accuracy. About changing perceptions and achieving global consensus on data quality, icm2re 6.4.

No, not another case of induced self censorship like the ones I re-published last year in The Neglected Librarian: Seven articles about cataloguing big data 2010-2011.

On this occasion, the content I put forward was woven into a small collection of conversations about changes to be made to one of those apparently just technical IT and HR standards, SFIA 7.

It consisted of a quite simple proposal and definition, which I reproduce here:

Accuracy can be defined as a core competence that affects performance in terms of behaviour knowledge and professionalism and is possibly enabled and developed through qualifications and experience.
The demand for accuracy should be explicitly addressed at each level of responsibilities - answering a global call for more data quality.
This core competence should be further declined throughout all the categories and skills of the framework.

All gone from the web: you can find neither my page nor any of the other proposals submitted for version 7 of the standard. The entire collection of submissions for changes has simply disappeared.

The invisible shame of invisible censorship

This invisible censorship method is nothing new. Out of sight, out of mind. Terra bruciata, they say in Italy.

In 1660 Charles II ordered the burning of John Milton's books at the Bodleian Library on suspicion of treason (Milton had defended the execution of Charles I); only a librarian tried to hide them. What surprises me is that, more than three centuries on, the embarrassment of historians, chroniclers and journalists still takes its toll, sweeping a layer of cognitive fog across the literature. The impression has an objective ground, in the sense that people, including many scholars, find it difficult to talk about, investigate and comment on historical cases of censorship when they cannot find or interpret the sources (I learned of this episode thanks to The Oxford Times, which published the news on 16th September 2010 - better late than never - under an absolutely appropriate headline, Destroying ideas remains hot topic).

In our digital age, burning books and libraries is not materially necessary anymore. Censorship consists simply in preventing some - writers and journalists, but also scientists and artists - from expressing their thoughts and spreading their ideas, publishing or broadcasting their work or, if that happens, in deleting unwanted contents such as files, websites and electronic records from the internet as quickly and thoroughly as possible, making any trace pretty much unfindable. Between 2008 and 2012 I considered the role this technique plays in the growing phenomenon of cyber crime and in legal information management - and, quite a miracle, I saved and published something in spite of being subject to immense psychological and material pressure. If that is good, it may be because good is always the product of skill, as Charles Baudelaire once said.

Coping with censorship is hard. Creating new artefacts or manufactures and seeing them destroyed causes great suffering: a sensation of empty and deserted spaces grows in the mind, which can easily be colonised by delusions, paranoia, depression or other unhealthy mental conditions. It takes time to manage the thoughts and feelings of absolute despair caused even by the suppression of subtle, pitiful details in an artist's work. This time, the hole was big.

I delayed the publication of new articles of icm2re once again. I felt hopeless and isolated. I know that anger against social constraints and socially determined laws feeds such feelings; I know that inhibitions induced by peer pressure and group norms delay and halt innovations. What I do not know is why the call for accuracy has failed.

The SFIA patchwork

In this instance of invisible censorship, the librarians' cousins coalesced in the SFIA Foundation acted straightforwardly.

The SFIA Foundation is a self-proclaimed community of experts supported by training companies, a private company registered in England and Wales that has succeeded for the last ten years in marketing itself as the not-for-profit global edifice for the definition of what constitutes a skill or a competency in the digital and information technologies industry.

They removed not just my page from their website but the whole thread containing over one hundred other proposals for changes to the standard, so that there is no trace at all of what had been said, by whom and with what justifications or motivations.

There is no way to grant me authorship of an idea I shared openly and that, although submitted as a simple request for change to the core competencies of the SFIA standard, is very likely to become a very big one anyway.

I see data accuracy as a central, foundational issue for the digital civilisation together with what Morin called the challenge of global thinking. Everything in data - or systems - engineering should start from it or face unwanted, unintentional and dysfunctional consequences.

On the SFIA Foundation website there is no way to trace back any of the changes that were actually put forward, approved or rejected: this deeply contradicts other shared and declared values about openness of knowledge, accountability, integrity, intellectual responsibility, traceability and more.

Nobody has disputed such a method of making terra bruciata of all the submitted requests for change to the standard, as far as I can see. Invisible censorship can be quite comprehensible and acceptable to many people. It is, in fact, the usual, normalised response, the status quo preceding any form of educated data, information and knowledge management: nobody wants to challenge anything at first, fearing the consequences of upsetting the kings and barons of the time.

The authors of the various requests for change that were approved are satisfied, because their arguments and stances have been endorsed and made official by the organisation. Among the authors of the ignored or rejected submissions, the majority is obviously keen on accepting Chatham House rules that ensure no loom will remain visible in the public domain: in fact, a non-detrimental frame for the exposure of the rejected or ignored weavings does not exist.

The invisible censorship makes the losers retire silently, burying their ideas, especially if they had put them forward as clever and quick copycats, for the sole purpose of grabbing some free publicity from the open process - which is in fact what can happen as soon as a catchy and powerful new idea comes out of the blue.

Where are the rules and the information management activities to ensure that the creation of new knowledge and the subsequent decisions made on it are not subject to predictable levels of common disgraceful judgements and acts (including abuse of process, vandalism, libel, bias, subjectivity, arbitrary treatment, unfair competition, abuse of copyright and authors’ moral rights)?

For an IT skills standard that in its latest version has surprisingly opened the door to knowledge management advocated by librarians and information specialists (just in time to censor one of its champions?) it seems something went utterly wrong this time.

The competency model

When I recovered from the shock, I must say I looked at the whole matter still with a bit of resentment but in a new way, pretty much as I do when I prepare the Sunday roast with a substitute vegetable for potatoes - which are delicious, but not as good as turnips or swede if you suffer from autoimmune diseases.

Perhaps because I looked at the subject from the perspective of a systems engineer, I saw there is plenty of redundant and possibly unhealthy starch in SFIA as well as in other competence frameworks. Verbally overweight in many circumstances, even obese at times, these standards respond to the overarching need of their stakeholders at large to have represented in one independent place not just their current and forthcoming inventories of desired skills but also the legacy of their HR and corporate cultures, with all the attached strings of marketing and publicity vested interests.

Colleagues beware: systems engineering - and I am quoting an official definition here - focuses on ensuring that pieces work together to achieve the objective of the whole. So, there is no point in rejecting turnips because they do not add any flavour and pretending you are enjoying roasted potatoes when you know potatoes cause systemic pain and inflammation in the body.

The competency model was invented in the 1970s and early 1980s by consultants working on ways to motivate and develop people and to manage effective performance. In spite of the absolute evidence that there is no scientific method useful to measure, assess and above all forecast human capabilities other than by referring to a precise context, goal, behaviour or outcome, the strength of the concept is still fascinating as a motivational lever for human resources departments and line managers, an inspirational tool for educational experts and, above all, an assessment framework for recruiters and professional bodies.

I played with competency models (also called schemes, or frameworks) in the 1990s, when I was struck by the dramatic skills shortage that seemed to be holding back the development of new businesses in the looming digital economy. I was asked to contribute to the definition of skills and competency schemes useful for training and development purposes, and I did so with great intellectual and practical engagement, advising, tutoring, writing, designing and selling training services.

The number of existing competence schemes is nowadays uncountable. Every organisation or professional association has developed its own as a human capital shop window, almost always in connection with the offer of qualifications, professional registrations and certifications. It is not unusual to find that, in the same company or division, different schemes are used to recruit people said to have a common set of skills but diverse personalities, expertise or attitudes. In sum, there is an immense fragmentation of tools used to administer and apply what is still, essentially, a highly subjective and totally discretionary playground. The so-called competence model, invented within controlled environments in the 1970s by researchers led by D.C. McClelland, has lost method and, with that, credibility, becoming a field of abstract reasoning, creative elaborations and verbal climbing over the walls that usually separate academic disciplines. It looks fine, everybody can argue. But it goes nowhere.

By the way, if you are not familiar with the vocabulary of the field, it is useful to know that the words competence, competency, skill, attitude, capability - plus possibly others more attractive to its bored audiences - are used with peculiarly different meanings among experts, sectioning the abstract notion of “what we need to know and master” to ensure a certain degree of predictable performance at all possible levels, beyond solely declarative or disciplinary knowledge. But all these juicy terms are in fact very similar and used interchangeably in everyday dialogue - a case in which accuracy is possibly good enough at a very low level of semantic differentiation, my opponents could argue!

However, different functional points of view (human resources, personal development, training, organisational culture, recruitment etc.) keep these distinctions alive; they have become inevitably politically relevant, even when they are substantially superseded or preposterous, and glued to the corporate culture's spreadsheets or organigrams. It is indeed very rare, difficult and perhaps even absurdly slow to get rid of skills and definitions that survive by inertia, or are dictated by a few monopolists, or sponsored by industry moguls and accepted for general marketing convenience. The same is true when we try to introduce new skills into the labour market - and particularly new ones framed not as skills achievable through formal training but as capabilities that come from the core competency level.

In sum, my censored contribution triggered a reflection on the whole of the weave. I may be wrong, but I am convinced I am not. I have found in the maturity of the competence model, as it is reflected in the SFIA framework, something very immature, due to its lack of focus on the core competences we need at work, for business development and for the application and management of innumerable technologies.

Accuracy is here to stay

There are great socio-economic expectations from data management, design traceability, model-based design and other complex innovations across all sectors, which do seem to me to make the case for data and systems engineering: all these visions, automated workflows, plans and designs rely increasingly and strongly on the fundamental accuracy of data, processes and the integration of different systems, for which the perception of good enough is simply not enough.

So I returned to my discarded definition of accuracy as a key, core competence that affects performance, and I saw it for what it is: an absolute primer calling for a more systems-engineering approach to the definition of skills, schemes and frameworks.

There is an aspect of the notion that is implicit in information security, up to the point that when we refer to data integrity we assume, in practice, that certain data are necessarily kept as accurate as they were created, intended or designed to be. For instance, think about registers or inventories of assets.

But without explicitly calling for accuracy as a basic, foundational competence we undermine the reliability of the whole of information security products and services, no matter how much we insist on detailing levels of responsibility and quality assurance audits - for which we will never have two outperforming individuals acting in exactly the same (predictable) way.

Information security controls and information governance, in a wider sense, serve to ensure that data are and remain accurate, because the reliability of such information is critical for human life and artefacts. But the point I wanted to make is that data must be born accurate for any complex system of systems to work.
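As a hedged sketch of that principle: rather than cleaning records after the fact, a record type can refuse to exist in an inaccurate state, so every record in the register is accurate from birth. The AssetRecord type and its validation rules below are hypothetical examples, not taken from any real register or standard.

```python
# Illustrative sketch of data "born accurate": validation happens at
# creation time, so an inaccurate record is never admitted at all.
# AssetRecord and its rules are hypothetical examples.
from dataclasses import dataclass


@dataclass(frozen=True)
class AssetRecord:
    asset_id: str
    quantity: int

    def __post_init__(self):
        # Accuracy enforced at birth: invalid values never become records.
        if not self.asset_id.strip():
            raise ValueError("asset_id must not be empty")
        if self.quantity < 0:
            raise ValueError("quantity must be non-negative")


ok = AssetRecord("PUMP-001", 3)   # created only because it is accurate
try:
    AssetRecord("", -1)           # rejected at creation
except ValueError as error:
    print("rejected:", error)
```

The frozen dataclass also prevents later mutation, so the accuracy established at creation cannot silently degrade afterwards.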

IT applications and systems have been standardised internationally on the grounds that there must be a way to ensure predictability of execution and outcomes. For this purpose, entire IT processes rely on the accuracy of technical audits. Checks of requirements and results create the need for configuration management processes that would not be possible without measurements of accuracy. And yet, pretending that measurements and accuracy are the same concept seems such a terribly uneducated error.

When talking standards, a definition of accuracy has already been clearly achieved, comprising all the various facets of the notion I mentioned above - and it is very distinct from the contiguous concepts of precision, quality, measurement or risk.

Several international standards, from metrology to software engineering, deal with it. The latter has actually adopted the practical American and international PMI BOK definition of accuracy, which I find absolutely aligned with my own view of accuracy as a competence. It says that the term accuracy is commonly used to refer to both an activity and an indicator, defining it as:

  1. qualitative assessment of correctness, or freedom from error
  2. quantitative measure of the magnitude of error
  3. within the quality management system, accuracy is an assessment of correctness [that would be expressed through a degree of precision].

Recognising accuracy

So, all in all, I remain convinced, and I insist, that accuracy is a competence that enables individuals and systems to make qualitative assessments of correctness and to measure errors and precision in the first place.

I have no doubt, on the grounds of my own experience of working with people from all walks of life, that it can also be developed, acquired as a procedural ability in many contexts, transferred between sectors or roles and activities and constructed through algorithms, so that we can also talk about an artificial or systems accuracy obtained via automation. This is nothing new.

But what I see as very relevant to reiterate, after such an unexpected episode of censorship of the idea of accuracy as a competence, is … exactly that: accuracy is not a given. It does not come infused into an organisation’s processes and workflows just because there are quality controls in place, or formal certifications of quality, quality assurance, quality management directorates and so on and so forth.

Some people possess what is commonly referred to as “attention to detail” more consistently and evidently than others: in such a natural form of intelligence, accuracy is easily revealed by formal intelligence tests as a distinctive aptitude to solve problems, to make quick decisions without losing concentration, to spot errors and patterns. It is a form of natural intelligence that very often belongs to people capable of distinctive achievements in software development, analysis, design, financial or legal advice, no less than in other traditional and manual fields of human activity, from gardening to carpentry.

We would all advance in handling technological innovations and changes in every sector if we recognised accuracy with the definition I offered and reiterated above.

Skills acquired in the workplace, volition and lifelong learning can help a wider range of people, with different forms of intelligence and communication styles (or personality types), to do very well in many roles without possessing a pure or natural tendency to be very accurate. And there are innumerable activities and contexts - mostly outside the digital world, though - where people are generally quite happy with good-enough products and services, from bakeries to hospitals, where details and particulars are not so important, because they trust their feeling that perfection is the enemy of the good. Do we want these people, who I am sure excel for other qualities, to work on heart bypasses?

Databases and data modelling are at the heart of the digital civilisation.