Relevance paradox

From Wikipedia, the free encyclopedia

The relevance paradox occurs because people and organisations seek only the information that they perceive as relevant to them. However, there may be information (in the widest sense: data, perspectives, general truths, etc.) that is not perceived as relevant because the information-seeker does not already have it; its relevance becomes apparent only after it has been acquired. Thus, the information-seeker is trapped in a paradox.[1] The book The Filter Bubble deals with a similar phenomenon.



Definition

In many cases in which action or a decision is required, it is obvious what relevant information may be lacking: a military attack may lack maps, so reconnaissance is undertaken; an engineering project may lack ground condition details, so these are ascertained; a public health programme will require a survey of which illnesses are prevalent; and so on.

However, in many significant instances across a wide range of areas, decision makers are unaware of the relevance of readily available information because they lack the very information that would make its relevance clear, so they do not look for it. This situation has been referred to as the relevance paradox.[2] It occurs when an individual or a group of professionals are unaware of certain essential information that would guide them to better decisions and help them avoid unintended and undesirable consequences. Such professionals seek only the information and advice they believe to be the bare minimum required, as opposed to what they actually need to fully meet their own or the organisation's goals.

An analogy is that of a short-sighted person who is unaware of the condition. They cannot see the glasses they need to improve their sight until they have the glasses on. Such a situation can be resolved only by a clear-sighted assistant handing them the glasses and explaining how they are to be used.

The relevance paradox has been cited as a cause of the increase in disease in developing countries even as more money is spent on them: "Relevance paradoxes occur because of implementation of projects without awareness of the social or individual tacit knowledge within a target community. The understanding of the individual and the social tacit knowledge in a given community, which is a function of knowledge emergence, is the foundation of effectiveness in leadership practice."[3]

Examples

From the 1950s onwards, civil engineers unwittingly caused a massive increase in the debilitating water-borne infection schistosomiasis (bilharzia) among local populations, because their irrigation schemes lacked simple, low-cost countermeasures that could have been built in; the engineers simply had no knowledge of them. Yet the United Nations had already published guidelines explaining cheap countermeasures and how to incorporate them into the design of irrigation schemes: matters as simple as keeping flow velocities above a certain level to prevent the disease vector (a water snail) from attaching to the conduits. The civil engineers were victims of the relevance paradox: they thought they needed to know only about concrete, water flows, and the like, not how to control flow velocities to stop the snail species that carried the disease from multiplying, so they never went looking for the information.[4]

Another example is the NASA engineers who, having spent a fortune unsuccessfully developing the complex sliding and articulating inside knee joint needed for space suits, eventually went to the Tower of London and ruefully copied the armour of Henry VIII, which had just such a joint. They stated, "[w]e wish we had known about this earlier!"[5]

The relevance paradox can, and usually does, apply to all professional groups and individuals in numerous ways.[6] While there are many examples of willful ignorance, there are also many cases where people do not look outside the paradigms within which they operate and thus fail to see the long-term consequences, as in the two examples above. Many such cases are cited in the book The IRG Solution - Hierarchical Incompetence and How to Overcome It. An overall example is the present environmental and financial situation, in which the expansion of the world's economies was planned by people blissfully, as opposed to willfully, unaware of the impossibility of continued credit and growth given fundamental resource limitations.

Some user manuals for equipment exhibit this condition: the manual makes clear how to operate the equipment only if one already has a fairly good idea of how to operate it. A reader with no idea how the equipment works will not be helped, because the manual is written for someone who already has that knowledge. Hence, they will never work out how to use it, even though they possess the instructions.

Another example is wiki technology, such as that used by Wikipedia. Wikis contain a great deal of information, but they are most useful to someone who already knows that the desired information is in the wiki and where it is located. One can look something up quickly only if one already knows where it is; it is the first search for a given piece of information that the wiki does not help with.

Avoidance

The notions of Information Routing Groups (IRGs) and Interlock research were designed to counter this paradox by promoting lateral communication and the flow of tacit knowledge, which in many cases consists of the unconscious and unwritten knowledge of what is relevant.

A related point is that, despite the existence of good library indexing systems and search engines, the terms under which a specific piece of knowledge is described are often not obvious unless one already has that knowledge.


References

  1. ^ Andrews, David (1984). The IRG Solution. London: Souvenir Press. Chapter 5, p. 87.
  2. ^ "The Importance of Knowing the Right People". The Guardian. 1980-03-20. 
  3. ^
  4. ^ Charnock, Anne (1980-08-07). "Taking Bilharziasis out of the irrigation equation". New Civil Engineer 1 (8). 
  5. ^ Andrews, D. (February 1986). "Information routing groups - towards the global superbrain: or how to find out what you need to know rather than what you think you need to know". Journal of Information Technology 1 (1): 22–35. doi:10.1057/jit.1986.5.
  6. ^ Andrews, David (1984). The IRG Solution - Hierarchical Incompetence and How to Overcome It. London: Souvenir Press. pp. 200–220. ISBN 0285626620.