Perspectives on professional practice

Bo Strangert (RD2)


I like this phrase of Taleb's (2010, p. xxvi): "We do not spontaneously learn that we don't learn that we don't learn …". For example, we often do not change our habits or thinking after making mistakes. Learning from the history of accumulated experience is also said to be infrequent. Nor do we spontaneously reflect upon this incapacity. Taleb asserts that this is a consequence of not understanding the meaning of uncertainty and the randomness of events. This is true of individuals as well as institutions, and it makes it difficult to predict events and to be prepared for innovations.


Taleb's thesis is that most cutting-edge events are improbable – judged beforehand as unlikely to be true or to happen. Some disparate examples are the collapse of the Soviet Union (unpredicted by the CIA), the invention of the internet, the Chernobyl disaster, the murder of Olof Palme, the terrorist attacks of 11 September 2001, the tsunami in the Indian Ocean in 2004, the global financial crisis, Russia's invasion of Georgia in 2008, Steve Jobs's business achievements, the Arab Spring, and the Quick case – the scandalous legal process of "creating" an alleged serial killer.


But what is the use of understanding this tenet of randomness and predictability? Kahneman (2011, p. 209ff) tells a story that is germane to this topic. He and an associate once observed soldiers and rated their eligibility for officer training. Later, when they compared the rating scores with the cadets' actual performance at the officer-training school, they realized that their ability to forecast that performance was negligible. This feedback was quite discouraging, because they felt they had compelling impressions of the validity of their ratings. How did the feedback influence their continued ratings of new candidates? Not at all: it had no effect on how they evaluated new candidates and very little effect on their confidence in the ratings. Kahneman coined the term illusion of validity for this experience.
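
To make "negligible" concrete: predictive validity of this kind is commonly summarized as the correlation between the ratings and the later performance scores. The small Python sketch below uses invented numbers – not Kahneman's data – purely to show the computation; a correlation near zero means the ratings carry almost no information about later performance, however compelling the impressions behind them felt.

    # Illustration only: the scores below are invented, not Kahneman's data.
    ratings = [7, 4, 8, 5, 6, 3, 9, 5, 6, 7]        # assessors' eligibility ratings
    performance = [6, 7, 4, 3, 6, 5, 6, 4, 5, 4]    # later results at officer-training school

    n = len(ratings)
    mean_r = sum(ratings) / n
    mean_p = sum(performance) / n
    cov = sum((r - mean_r) * (p - mean_p) for r, p in zip(ratings, performance)) / n
    sd_r = (sum((r - mean_r) ** 2 for r in ratings) / n) ** 0.5
    sd_p = (sum((p - mean_p) ** 2 for p in performance) / n) ** 0.5
    correlation = cov / (sd_r * sd_p)

    # A value near zero: the ratings barely predict later performance at all.
    print(round(correlation, 2))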


A psychological perspective on practitioners' dilemmas

The inability to learn from experience in spite of statistical evidence is detrimental in project management. Of course, there are many psychological explanations of this phenomenon; Kahneman himself is a leading authority on the topic. Argyris and Schön (1974) presented another, still useful, view of theory in practice for increasing professional effectiveness. It is essentially a normative model of action that describes in detail the defensive mechanisms (closed, single-loop control) that cause invalid inferences, but it also suggests ways of changing this behavior into strategies that maximize valid information (open, double-loop control).


The purpose of the model may be illustrated by a decision maker's dilemma taken from the current political debate in Sweden about the capability of the armed forces: how should future threats be judged, and what should be done? Are we helped by historical lessons of false predictions, for example Sweden's low defense preparedness at the outbreak of World War II? Do we have sufficient time now to prepare for possible threats over the next decades? Or are there no military threats against the country in the foreseeable future?


To structure the questions on reasoning and action, I will paraphrase Taleb's expression: We do not spontaneously know that we don't know that we don't know! First, there is the fact that we currently have no definitive facts about future military threats on which to base a prediction. This is of course something everybody must agree upon. The next step of reasoning, however, cannot be left blank, for political or psychological reasons. As a decision maker, you will therefore construct mental scenarios about possible courses of events, based on assumptions about a potential enemy's motives and capability. These scenarios are generalized into a mental belief structure that is used to formulate predictions and to suggest contingent preventive actions, or non-action. Thus, psychologically, a transition occurs from an objectively random state of prediction to a qualified personal guess with an appreciable subjective probability of being right.


Due to the factual uncertainty, many possible threat and action scenarios may be constructed. Uncertainty is always a source of divergent opinions among decision makers. Unfortunately, for political and other reasons, the grounds for an opinion may be completely irrelevant to the original prediction problem. It seems to be a sad empirical fact that validation of different standpoints or predictions seldom occurs spontaneously. Accordingly, a third, higher level of valid reflection about the state of "not knowing" and its consequences for action will not be attained.


How do Argyris and Schön explain the dilemmas and the possible psychological solutions for the decision maker? The subject's actual reasoning and behavior are commonly governed by implicit goals and assumptions, and by a defensive strategy for achieving them; this is called the subject's theory-in-use. For example, assume that your implicit goal is not to waste money on increased armaments but instead to increase the resources for environmental technology. You will therefore select and communicate information that supports your beliefs and disregard the rest. However, if you are asked how you would think and behave under certain circumstances, you will present an espoused theory of goals, assumptions, and values that sounds publicly appropriate in the situation.


Your espoused theory may then be used to disguise your theory-in-use, for example by elaborating arguments about Russia's lack of motive and capability to attack Sweden. The defensive strategy relies on selectively biasing your information processing, which seals you off from incongruent information. Your implicit goal will not be challenged, in particular because it is congruent with the arguments of your espoused theory.


We can thus notice two different mechanisms, or sources, of invalid information processing. One is the self-deceiving form of biased information selection and assessment in the decision maker's private sphere. The other occurs in the intersubjective arena, where the decision maker can use communicative tactics and skills to avoid or divert focus from incongruities in his or her own standpoints.


Argyris and Schön next describe guidelines for effectively learning a non-defensive strategy of collecting and acting on valid information, represented as an open, double-loop control model (Model II). Its aim is to ensure a rational assessment of the implicit goals and dilemmas. For example, the priority of your implicit goal has to be challenged against an unbiased set of evidence for and against increased armament, evidence that is not contingent on your hidden agenda. The transition from Model I to Model II consists of explicit and detailed qualitative guidelines for instructors and trainees, though the training should be individualized and build on personally generated experience. This involves very time-consuming sessions, and the approach cannot be generalized except with rather loose specifications.


Single-loop learning is of course sufficient for all decisions that concern established and well-functioning tasks and routines. However, most activities and their rationales should be evaluated from time to time, which requires at least some double-loop control. Under great uncertainty, this should be a continuing obligation.
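
Read as a control model, the distinction can be given a schematic form. The Python sketch below is my own shorthand reading of the single-loop/double-loop idea, not Argyris and Schön's formulation: the single loop corrects only the action against a fixed goal, whereas the double loop also puts the goal and its governing assumptions up for revision when valid information keeps contradicting them.

    # A schematic sketch of the single-loop/double-loop distinction
    # (my own shorthand, not Argyris and Schön's notation).

    def single_loop(goal, action, observe, adjust):
        """Closed control: the goal stays fixed; only the action is corrected."""
        result = observe(action)
        if result != goal:
            action = adjust(action, goal, result)
        return goal, action

    def double_loop(goal, action, observe, adjust, question_goal):
        """Open control: the goal itself may be revised before the action is."""
        result = observe(action)
        if result != goal:
            goal = question_goal(goal, result)      # the extra, outer loop
            action = adjust(action, goal, result)
        return goal, action

The only difference between the two functions is the call to question_goal: in Model II the governing goal is itself part of what gets examined.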


Some tentative conclusions

A complex system or course of events is by definition uncertain. Its development or impact is difficult to predict. Many, perhaps most, innovative advances have these characteristics. Unfortunately, the same is true of disasters and other undesirable outcomes.


Two important circumstances are apparent when we focus specifically on complex social systems or courses of events in the information age. One is that complexity is an outcome of human design – not of deliberate attempts but as an undesirable side effect. The other is that human efforts to understand and control the positive and negative consequences of complexity are of prime concern – but also strongly blocked by the inherent psychological, social, and cultural resistance of individuals as well as of organizations.


My purpose with this little paper is to continue the discussion about the role of behavioral science as a vigilant power in development projects. In an earlier paper (Strangert, 2013), I summarized a critical review from 1964 of the usefulness of psychological research for solving practical problems. Here I have referred to Argyris and Schön's very ambitious and relevant case research from 1974 on learning to increase professional effectiveness. Their experiences demonstrated the extreme importance of individualization, contextualization, and dynamic factors – that is, the complexity of the issue.


My bold guess is that not much of theoretical or methodological significance has influenced the situation since then. On the contrary, I fear that the market of pseudoscientific knowledge artifacts has exploded and tends to block the view of the rare scientific attempts to clarify the complexity of professional effectiveness. I hope that I am wrong.


What is left of rational – not commercial or career-driven – approaches to increasing professional effectiveness in complex contexts? The answer is the view of science as a human endeavor to counteract human errors and illusory validation. Psychology is the branch of science that has generated a wealth of knowledge about human errors in observation and thinking. Unfortunately and self-evidently, this knowledge may sometimes explain complex phenomena but does not allow us to predict and control them. Nonetheless, the basic methodological principles of science should always apply. Therefore, the proposals referred to here – by Scriven, Argyris and Schön, Taleb, and many others – to concentrate R&D on practical problems and on the practitioner's active collaboration should be followed.


References


Argyris, C. & Schön, D.A. (1974). Theory in practice: Increasing professional effectiveness. San Francisco: Jossey-Bass.


Kahneman, D. (2011). Thinking, Fast and Slow. London: Penguin Books.


Scriven, M. (1964). Views of human nature. In Wann, T.W. (Ed.), Behaviorism and phenomenology. Chicago: The University of Chicago Press.


Strangert, B. (2013). Kan beteendevetenskaplig metodik förbättra professionellt utvecklingsarbete? [Can behavioral science methodology improve professional development work?] www.arborg.se/


Taleb, N.N. (2010). The black swan: The impact of the highly improbable. London: Penguin Books.