
Social Friction

Have you ever had a fantastic idea that got crushed in discussions with peers and colleagues? Perhaps the idea was not completely crushed, but it became necessary to compromise, modifying or delaying its implementation in order to reach enough consensus for the organization to act on it. Or perhaps you later concluded that an idea you once thought was great turned out to be not so wise, and you thanked your lucky stars that resistance from within the organization prevented you from making a big mistake. Almost any idea that requires an organization to change course or to try something new or different will come up against resistance; it will experience social friction.

In some cases, the social friction will alter or delay implementation of an innovative good idea. In other cases, the friction will result in constructive improvements to the initial idea. And in still other cases, the friction will prevent the organization from implementing a change that might appear to be a good idea to some, but that would actually have led the organization down a risky or dangerous path.

Essential Friction

Imagine trying to walk or stand on a surface with minimal friction (e.g., on very slippery ice). Under such conditions, maintaining stability can be problematic. On the one hand, friction or drag is considered an obstacle or cost when it comes to movement (e.g., a waste of energy). On the other hand, friction can be essential to controlling motion (e.g., maintaining balance, walking to a goal, and stopping without sliding past it). Thus, zero friction is not a desirable condition when it comes to controlling locomotion. As Gene Rochlin has noted:

Without the damping effect of friction, we would live in an impossibly kinetic world in which the consequences of every action would persist and multiply to the point of insanity (p. 132)

Of course, there can be too much friction, such that the energy costs of motion are prohibitive. The bottom line, however, is that some friction is essential for stable, controlled locomotion.

As in the physical world, Rochlin suggests that friction may also be essential to sanity in the social world.

In the realm of the social and political, morals, ethics, knowledge, history, and memory may all serve as sources of "social friction," by which gross motions are damped, impetuous ones slowed, and historical ones absorbed. Such friction is essential to prevent the persistence and multiplication of social and political movements once their driving force is removed (p. 132)

In the social context, an analog to friction might be the opposition and second-guessing that tend to arise with any new idea or prospect of change within an organization. A new idea that might at first appear to be an innovation will gradually lose steam in the face of opposition and critique. Thus, ideas that are not continually pushed or infused with energy will dissipate, perhaps without ever being implemented. Some of these ideas might have been positive innovations; others might simply have been ill-formed or bad ideas.

Incrementalism

If the abduction or adaptive control logic illustrated in the previous blog (#7) is representative of everyday sensemaking, and if the underlying dynamic is essentially muddling through, then a natural question to ask is: What does skilled muddling look like?

Lindblom's term 'muddling' suggests a messy, chaotic process - a kind of meandering with little chance of convergence. It stands in stark contrast to the term 'control' and to the servomechanism that is the typical image of a control system in the social sciences. The servomechanism metaphor implies a well-defined goal and well-defined criteria for comparing the current state to the goal state, yielding a well-defined error signal for guiding activity.

However, Lindblom notes that for many public policy decisions there is no single, well-specified goal. Rather, there will typically be many competing goals or value systems. In many cases these will be incommensurate with each other and only tenuously linked to actions or outcomes, making it difficult even to know when you are on the right track. Thus, with regard to public policy and many important personal decisions (e.g., buying a home, choosing a profession, wooing a mate, voting for a president), there is no a priori well-specified goal or performance standard to specify the right path to a satisfying end. In fact, one might claim that the only reliable metric for judging the quality of a decision or action is the degree of satisfaction with the result. At best, we can recognize a satisfactory solution when we get there - but even that might be questioned (e.g., sour grapes).

However, Lindblom's term 'incrementalism' does suggest something about what quality muddling might look like: quality muddling results from making small (incremental) changes. This is a conservative approach that progresses through small tweaks to policies that have worked in the past, rather than through dramatic innovations.

Stability in Closed-Loop Systems

For the social sciences, the simple servomechanism (e.g., thermostatic control of room temperature) is the prototype of a control system. From the perspective of control theory, however, the simple servomechanism is only one of many solutions for regulating processes. Regulating complex (e.g., multi-dimensional) processes requires more complex control strategies (e.g., multiple sources of feedback that must be integrated in ways consistent with the process dynamics). In many cases (e.g., when there are long lags in the process or uncertainties about the process dynamics), simple compensatory control (i.e., control based on current error feedback) may not yield a stable solution.

In assessing alternative solutions - particularly for complex processes - the first priority of control theory tends to be stability or robustness. Typical ways to increase the stability or robustness of a control solution are to lower the gain or to add damping. In many respects this is analogous to adding friction. The lower gain or added damping makes the system more conservative - less responsive to error or deviations from a goal or ideal. It makes the system resistant to change and reduces any tendency to follow the 'noise' down a garden path to catastrophe.

With respect to gain, good designs trade off speed for accuracy and stability. Lower gain means slower responses, but it also reduces susceptibility to noise and to overshooting the target. With respect to robustness, good designs typically trade off local optimality for stability. That is, a robust controller may not be optimal for any particular situation, but it will typically be satisfactory over a wider range of situations than a controller tuned to be optimal for specific situations.

The prototypical example of a control system that is not conservative enough is pilot-induced oscillation. This is a situation where the pilot's gain is too high: the pilot overreacts to errors, and as a result his actions actually produce divergence from the intended target state - often with calamitous results.
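To make the gain trade-off concrete, here is a minimal sketch of a discrete-time proportional controller acting on a process with a one-step actuation lag (my illustration; the process model and the gain values are assumptions, not taken from the post). The lower gains settle on the goal, while the highest gain overshoots and oscillates with growing amplitude - the signature of pilot-induced oscillation.

```python
# Minimal sketch: proportional control of a lagged process.
# The one-step actuation delay and the gain values are illustrative assumptions.

def simulate(gain, steps=20, delay=1, goal=0.0, x0=1.0):
    """Proportional controller acting on a process whose output
    responds to commands only after a fixed delay."""
    x = x0
    pending = [0.0] * delay           # commands queued up by the lag
    trajectory = [x]
    for _ in range(steps):
        command = -gain * (x - goal)  # react to the current error
        pending.append(command)
        x = x + pending.pop(0)        # the delayed command finally acts
        trajectory.append(x)
    return trajectory

for k in (0.3, 0.8, 1.6):
    tail = [round(v, 2) for v in simulate(k)[-5:]]
    print(f"gain={k}: last five states {tail}")
# The lower gains converge toward the goal; the highest gain oscillates
# with growing amplitude -- the pilot-induced-oscillation pattern.
```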

Skilled Muddling

Thus, Lindblom's intuitions about incrementalism as a good strategy for dealing with complex sociotechnical problems are consistent with principles derived from control theory. The point is not for organizations to be rigid or completely averse to change. Change is necessary to keep up with the demands of a changing ecology. However, skepticism and checks and balances with respect to radical new ideas can be essential to skilled muddling. In other words, the friction associated with building consensus within an organization of diverse people with conflicting opinions and values can be essential to the long-term success (stability) of the organization.

In order for a control system to be robust in a complex world, a conservative approach to change is generally a good strategy. This helps ensure that the actions of the organization respond to the signals (i.e., actual changes in the ecology) rather than the noise (i.e., imagined changes), and that the system achieves satisfactory performance over a wider range of situations. This strategy allows good ideas that are persistently advocated to eventually influence the direction of the organization, while protecting the system against the risks associated with bad ideas and misplaced enthusiasm. The bottom line is that slow and steady progress (i.e., the turtle's strategy) is what usually wins the race in a complex, risky world.


Rochlin, G.I. (1998). Essential friction: Error-control in organizational behavior. In N. Ackerman (Ed.), The necessity of friction (pp. 132-163). Boulder, CO: Westview Press.


Abduction

Peirce introduced Abduction (or Hypothesis) as an alternative to the classical forms of rationality (induction, deduction). I contend that this alternative is more typical of everyday reasoning or common sense, and further, that it is a form of rationality particularly well suited to both the dynamics of circles and the challenges of complexity. However, my understanding of Abduction may not be representative of how many philosophers or logicians think about it.

In my view, what Peirce was describing is what in more contemporary terms would be called an adaptive control system, as illustrated in the following figure. The figure represents medical treatment/diagnosis as an adaptive control system with two coupled loops.

[Figure: medical treatment/diagnosis as an adaptive control system with two coupled loops]

The Lower or Inner Loop - Assimilation

The lower loop is akin to what Piaget described as assimilation, or what control theorists would describe as a feedback control system. This system begins by treating the patient based on existing schemas (e.g., internal models of typical conditions, or standard procedures). If the consequences of those actions are as expected, then the physician will continue to follow the standard procedures until the 'problem' is resolved. However, if the consequences of following the standard procedures are 'surprising' or 'unexpected' and the standard approaches are not leading to the desired outcomes, then the second loop becomes important.

The Upper or Outer Loop - Accommodation

The upper loop is akin to what Piaget described as accommodation, and it is what makes the system 'adaptive' from the perspective of control theory. Other terms for this loop from cognitive psychology are 'metacognition' and 'situation awareness.'

The primary function of the upper loop is to monitor the performance of the lower loop for deviations from expectations. Basically, its function is to evaluate whether the hypotheses guiding action are appropriate to the situation. Are the physician's internal model and expectations consistent with the patient's actual condition? In other words, is the patient's condition consistent with the expectations underlying the standard procedures?

If the answer is no, then the function of the upper loop is to alter the hypothesis or hypothesis set to find one that is a better match to the patient's actual condition. In other words, the function of the upper loop is to come up with an alternative to the standard treatment plan. In Piaget's terms, the function is to alter the internal schema guiding action.
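As a rough sketch of this two-loop dynamic (my own illustration, not part of the original figure; the schemas, success probabilities, and surprise threshold are purely hypothetical), the inner loop applies the current schema and counts 'surprises,' while the outer loop revises the hypothesis set only when surprises accumulate:

```python
import random

# Hypothetical sketch of the two coupled loops: an inner (assimilation)
# loop that applies the current schema, and an outer (accommodation)
# loop that replaces the schema when outcomes keep violating expectations.

SCHEMAS = {
    "standard treatment": 0.2,   # assumed probability of success per trial
    "alternative A": 0.5,
    "alternative B": 0.9,
}

def inner_loop(schema, trials=5):
    """Assimilation: apply the current schema and count surprises."""
    return sum(random.random() > SCHEMAS[schema] for _ in range(trials))

def outer_loop(max_revisions=3, surprise_threshold=3):
    """Accommodation: monitor the inner loop and revise the hypothesis set."""
    candidates = list(SCHEMAS)
    schema = candidates.pop(0)            # anchor on the standard procedure
    for revision in range(max_revisions + 1):
        if inner_loop(schema) < surprise_threshold:
            return f"'{schema}' retained after {revision} revision(s)"
        if not candidates:
            break
        schema = candidates.pop(0)        # revise: try the next hypothesis
    return "no satisfactory schema found"

random.seed(1)
print(outer_loop())
```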

Muddling Through

The dynamic of the abductive system as illustrated here is very much like what Lindblom described as 'muddling through' or 'incrementalism.' In other words, the logic of this system is trial and error. In facing a situation, decisions and actions are typically guided by generalization from past successes in similar situations (i.e., the initial hypothesis or schema, or the standard procedure). If the consequences are as expected, then the schema guiding behavior is confirmed, and the experience of the physician is not one of decision making or problem solving, but rather of "just doing my job."

If the consequences of the initial trials are not as expected, then skepticism is raised with respect to the underlying schemas and alternatives will be considered. The physician experiences this as problem solving or decision making - "What is going on here? What do I try next?" This process is continued iteratively until a schema or hypothesis leads to a satisfying outcome.

This dynamic is also akin to natural selection. In this context the upper loop is the source of variation and the lower loop provides the fitness test. The variations (i.e., hypotheses) that lead to success (i.e., good fits) will be retained and will provide the basis for generalizing to future situations. When the ecology changes, new variations (e.g., new hypotheses or schemas) may gain a selective advantage.

Lindblom's term 'incrementalism' reflects the intuition that the process of adjusting the hypothesis set should be somewhat conservative. That is, the adjustments to the hypothesis set should typically be small. In other terms, the system will tend to anchor on hypotheses that have led to success in the past. From a control theoretic perspective this would be a very smart strategy for avoiding instability, especially in risky or highly uncertain environments.

19th International Symposium on Aviation Psychology

Proposal Submission is Open! – New submission deadline is Nov 4, 2016

The 19th ISAP will be held in Dayton, Ohio, U.S.A., May 8-11, 2017. Proposals are sought for posters, papers, symposia, and panels. Any topic related to the field of aviation psychology is welcome. Topics addressing human performance problems and opportunities within aviation systems, as well as design solutions that best utilize human capabilities to create safe and efficient aviation systems, are all appropriate. Any basic or applied research domain that generalizes from or to the aviation domain will be considered.

The proposal submission deadline is now November 4, 2016. We have revised the deadline due to our delay in opening the Proposal Submission link.

Please see Author Info at https://isap.wright.edu/conferences/author-info for more information about the submission requirements. Contact isap2017@isap.wright.edu if you have any questions.

Thank you for your interest in ISAP and for your patience while the submission link was made functional.

John Flach (Symposium Chair), Michael Vidulich and Pamela Tsang (Program Co-Chairs)


In his 1985 book Surely You're Joking, Mr. Feynman!, Richard Feynman describes what he calls 'Cargo Cult' Science:

In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas - he's the controller - and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land. (p. 310 - 311)

I often worry that academic psychology is becoming a Cargo Cult Science. Psychologists have mastered the arts of experimental design and statistical inference. They do everything right. The form is perfect. But I don't see many airplanes landing. That is, I see lots of publications of clever paradigmatic experiments, but have difficulty extracting much value from this literature for understanding human experience, particularly in the context of complex work - such as clinical medicine. This vast scientific literature does not seem to generalize in ways that suggest practical ways to improve the quality of human experience.

On the surface, these papers appear to be addressing practical issues associated with cognition (e.g., decision making, trust, team work, etc.), but when I dig a bit deeper I am often disappointed, finding that these phenomena have been trivialized in ways that make it impossible for me to recognize anything that aligns with my life experiences. Thus, I become quite skeptical that the experiments will generalize in any interesting way to more natural contexts. Often the experiments are clever variations on previous research. The experimental designs provide tight control over variables and minimize confounds. The statistical models are often quite elegant. Yet, ultimately, the questions asked are simply uninteresting, with no obvious implications for practical applications.

Not everyone seems to be caught in this cult. However, those who choose to explore human performance in more natural settings, more representative of the realities of everyday cognition, are often marginalized within the academy, and their work is typically dismissed as applied. For all practical purposes, when an academic psychologist says 'applied science,' s/he generally means 'not science at all.'

Perhaps, I have simply gotten old and cynical. But I worry that in the pursuit of getting the form of the experiments to be perfect, the academic field of psychology may have lost sight of the phenomenon of human experience.

A new edition of our What Matters? book is now available online.

In the new version, an acknowledgment section, endorsements, indexes, and a back cover have been added, and a number of typos have been corrected.

The paperback edition is now available for purchase through What Matters on Lulu

[Figure: (A) The cognitive system as an open-loop dynamic. (B) The cognitive system as a closed-loop dynamic.]

Open- versus Closed-loop Systems

Another very important distinction between how the dyadic and triadic frames for psychology have developed is that the dyadic frame tends to view the cognitive dynamic as an open-loop causal system.  In this open-loop perspective, a causal sequence, akin to a sequence of dominos, is typically assumed:

stimuli --> sensations --> perception --> decision --> response.

In this framework, the key is to describe the internal computations (i.e., transfer functions) that translate input to output for each of the distinct stages of information processing. There is at least an implication that each of the distinct stages can be understood in isolation from the other stages (e.g., as modules within a computer program), and researchers typically identify with specific stages in this sequence (e.g., one might describe herself as a perceptual researcher, another might call himself a decision researcher, while another might be referred to as a motor control researcher).

In contrast, the triadic frame tends to view the cognitive dynamic as a closed-loop system. In a closed-loop system, the precedence relationships in time that have typically been used to differentiate causes (prior events) from effects (later events) are lost. For example, in a circular system responses can be both causes of stimuli (e.g., looking around) and effects of stimuli (e.g., orienting to a sound). In a circular system, there is no sense in which any portion of the circle is logically prior to any other portion. Thus, causal explanations and parsings based on a domino model (i.e., on sequence in time) make no sense for a closed-loop dynamic.
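To make the structural contrast concrete, here is a toy sketch (my illustration, not from the post; the stage functions and numbers are placeholders): the open-loop view composes stages as a one-way pipeline, while in the closed-loop view the response at one moment changes the stimulus at the next, so neither is logically prior.

```python
# Toy contrast between the two framings; the stage functions below are
# placeholders, not models of real perception or action.

def open_loop(stimulus):
    """Dyadic framing: a one-way pipeline of stages."""
    sensation = stimulus                          # sense
    percept = sensation                           # perceive
    decision = 1.0 if percept > 0.5 else 0.0      # decide
    response = decision                           # respond
    return response

def closed_loop(stimulus, steps=10, gain=0.5, goal=0.5):
    """Triadic framing: the response acts on the ecology, which in turn
    changes the next stimulus -- no stage is logically prior."""
    history = []
    for _ in range(steps):
        response = gain * (goal - stimulus)   # action based on the current stimulus
        stimulus = stimulus + response        # action changes the ecology, hence the stimulus
        history.append(round(stimulus, 3))
    return history

print(open_loop(0.8))     # a single pass: stimulus in, response out
print(closed_loop(0.8))   # stimulus and response co-determine each other over time
```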

In a closed-loop dynamic there are constraints at the system level (stability) that determine relations that must be satisfied by the components. Thus, for the system to be stable (i.e., to survive), certain relations among the components must be satisfied. This is the reverse of the open-loop system, where the behavior of the whole is determined by the behavior of the parts. In circular systems, the circular dynamics of the whole (i.e., the organization) create constraints that the components must satisfy, or the system will go out of existence.

For example, in the circular dynamic associated with predator-prey systems, it makes no sense to isolate either the prey or the predators as the cause of the pattern of population levels. Depending on the relations that hold between the prey and the predators (e.g., as described by a differential equation), the system will converge on stable populations, oscillate, or collapse into extinction. It is important to emphasize that a differential equation expresses constraints over time associated with relations among the components. The equations describe the coupling of prey and predators. In the dynamics of cognition we are interested in the coupling of perception and action through an ecology.
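A minimal simulation of the textbook Lotka-Volterra coupling (my illustration; the coefficients, initial populations, and step size are arbitrary choices, not taken from the post) shows how the pattern of population levels emerges from the relation between the two populations rather than from either population alone.

```python
# Euler integration of the Lotka-Volterra predator-prey equations:
#   d(prey)/dt = a*prey - b*prey*pred
#   d(pred)/dt = c*prey*pred - d*pred
# The coefficients, initial populations, and step size are illustrative.

def lotka_volterra(prey, pred, a=1.0, b=0.5, c=0.2, d=0.6,
                   dt=0.01, steps=5000):
    trajectory = []
    for _ in range(steps):
        d_prey = (a * prey - b * prey * pred) * dt
        d_pred = (c * prey * pred - d * pred) * dt
        prey, pred = prey + d_prey, pred + d_pred
        trajectory.append((prey, pred))
    return trajectory

# The oscillation is a property of the coupling, not of either population alone.
history = lotka_volterra(prey=4.0, pred=2.0)
print("prey range:", round(min(p for p, _ in history), 2),
      "to", round(max(p for p, _ in history), 2))
print("pred range:", round(min(q for _, q in history), 2),
      "to", round(max(q for _, q in history), 2))
```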

Events in Time versus Constraint Over Time

In the open-loop dynamic, the focus tends to be on events in time, and the challenge is to identify the aspects of prior events that cause later events (e.g., finding the root cause of an accident by tracing back in time to the initiating fault). However, in dealing with closed-loop dynamics, explanations tend to be better framed in terms of constraints over time. For example, the laws of motion are constraints over time that are typically expressed in the form of a differential equation. Constraints over time do not determine events; rather, they set limits on the field of possible events. For example, the laws of motion set constraints on the possible trajectories a body might take (e.g., aerodynamics). Similarly, a goal (e.g., to land safely at a specific airport) or a value system (e.g., a desire to minimize energy consumption) will not determine the path of an aircraft. However, these constraints will limit the set of possible paths.
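One toy way to picture constraints as limits on a field of possibilities rather than as causes of single events (my illustration; the candidate paths and the limits are made up) is to treat each constraint as a filter over candidate paths:

```python
# Toy illustration: constraints prune the field of possible paths;
# they do not pick out one caused event. All values below are invented.

candidate_paths = [
    {"name": "direct descent", "max_bank_deg": 20, "fuel_used": 300, "lands_at_goal": True},
    {"name": "steep spiral",   "max_bank_deg": 60, "fuel_used": 250, "lands_at_goal": True},
    {"name": "long downwind",  "max_bank_deg": 15, "fuel_used": 500, "lands_at_goal": True},
    {"name": "overflight",     "max_bank_deg": 10, "fuel_used": 200, "lands_at_goal": False},
]

# Law-like constraint (e.g., aerodynamics/structure limits bank angle).
flyable = [p for p in candidate_paths if p["max_bank_deg"] <= 30]

# Goal constraint: must land at the intended airport.
goal_satisfying = [p for p in flyable if p["lands_at_goal"]]

# Value constraint: prefer (but do not require) lower fuel consumption.
ranked = sorted(goal_satisfying, key=lambda p: p["fuel_used"])

# What remains is a reduced field of possibilities, not a single determined event.
print([p["name"] for p in ranked])
```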

The term 'circular causality' has typically been used to indicate that the logic of circular dynamics requires new ways of thinking about causality and explanation. However, this term does not identify the key distinction from typical causal reasoning. I prefer to say that when dealing with circular systems, it is necessary to dispense with the notion of causality altogether and to replace it with the construct of constraint.

I think this shift has some similarities with the shift from particle-based explanations to field-based explanations in physics. Thus, rather than framing psychology in terms of discovering the causes of behavior, the focus shifts to understanding the dynamic constraints on behavior. In this context the terms affording, specifying, and satisfying refer to different sources of constraint on the cognitive dynamic. Affording refers to the constraints on action (e.g., laws of motion). Specifying refers to the constraints on information (e.g., laws of optics). Finally, satisfying refers to the constraints on value (e.g., principles of reinforcement and punishment).

The figure below suggests how these three sources of constraint map onto the triadic semiotic. Note that these are constraints over the components of the system, but they tend to be grounded in different components. Affording is grounded in the physics of the ecology (e.g., the nature of the gravitational field, the surfaces of support, vehicle dynamics, etc.). Specifying is grounded in the properties of the interface or representation (e.g., the optical flow field, the acoustic field, a computer interface). Satisfying is grounded in the intentions and preferences of the cognitive agent.

Note that there is a parallel structure in the way Rasmussen has framed Cognitive Systems Engineering (CSE). His Abstraction Hierarchy (AH) focuses on how the domain constraints shape the affordances (i.e., determine the field of possibilities) relative to the goals and capabilities of an agent. The SRK model reflects the internal strategies and expectations of an agent in terms of Skills, Rules, and Knowledge relative to the problem constraints and the agent's intentions. Finally, Ecological Interface Design (EID) focuses on the design of interfaces that specify the field of possibilities in ways that are consistent with the capabilities of an agent. I will talk more about CSE in future blogs. For those interested in going deeper, I suggest the link to Bennett & Flach's Interface Design book.

[Figure: affording, specifying, and satisfying constraints mapped onto the triadic semiotic]

A Triadic Semiotics

Inspired by the computer metaphor and the developing field of linguistics (e.g., Chomsky), the mainstream of cognitive science was framed as a science in which mind was considered a computational, symbol-processing device evaluated relative to the norms of logic and mathematics. However, there were a few, such as James Gibson, who followed a different path.

Gibson followed a path that was originally blazed by functional psychology (e.g., James, Dewey) and pragmatist philosophers (e.g., Peirce). Along this path, psychology was framed in the context of natural selection and the central question was to understand the capacity for humans to intelligently adapt to the demands of survival. Thus, the question was framed in terms of the pragmatic consequences of human thinking (e.g., beliefs) for successful adaptation to the demands of their ecology.

An important foundation for Gibson's ecological approach was Peirce's triadic semiotics. In contrast with Saussure's dyadic approach, Peirce framed the problem of semiotics as a pragmatic problem rather than as a symbol-processing problem. Saussure was impressed by the arbitrariness of signs (e.g., C-A-T) and their ultimate interpretation by an observer (e.g., a kind of house pet). In contrast, Peirce was curious about how our interpretation of a sign (e.g., a pattern of optical flow) provides the basis for beliefs that support successful action in the world (e.g., braking in time to avoid a collision). In addition to considering the observer's interpretation of the sign, this required a consideration of the relation of the sign to the functional ecology (e.g., how well the pattern specifies the relative motion of the observer with respect to obstacles - the field of safe travel), and of the ultimate pragmatic consequences of the belief or interpretation relative to adaptations to the ecology (e.g., how skillfully the person controls locomotion).
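As a concrete illustration of how an optical pattern can specify something actionable (my sketch, not from the post; the object size, distance, speed, and sampling interval are made-up numbers), the optical angle subtended by an approaching obstacle and its rate of expansion specify time-to-contact (roughly the angle divided by its rate of change) without requiring knowledge of distance or speed:

```python
import math

# Sketch: the optical angle subtended by an approaching obstacle and its
# rate of expansion specify time-to-contact (tau = theta / theta_dot),
# without knowledge of the obstacle's actual size, distance, or speed.
# The numbers below (size, distance, speed, dt) are illustrative only.

def optical_angle(size, distance):
    return 2 * math.atan(size / (2 * distance))

size, distance, speed, dt = 2.0, 50.0, 10.0, 0.1   # meters, m/s, seconds

theta_now = optical_angle(size, distance)
theta_next = optical_angle(size, distance - speed * dt)
theta_dot = (theta_next - theta_now) / dt

tau = theta_now / theta_dot
print(f"estimated time to contact: {tau:.1f} s")   # close to distance/speed
print(f"actual time to contact:    {distance / speed:.1f} s")
```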

The figure below illustrates the two views of the semiotic system. In comparing these two systems, it is important to keep in mind Peirce's admonition that the triadic system has emergent properties that can never be discovered from analyses of any of the component dyads. For Peirce the triad was a fundamental primitive with respect to human experience: the whole of human experience is more than the sum of the component dyads.

[Figure: the dyadic and triadic views of the semiotic system]

Mace's Contrast

William 'Bill' Mace provided a clever way to contrast the dyadic framework of conventional approaches to cognition with the triadic framework of ecological approaches to cognition.

The conventional (dyadic) approach frames the question in terms of computational constraints, asking:

                            How do we see the world the way we do?

The ecological (triadic) approach frames the question in terms of the pragmatic constraints, asking:

                            How do we see the world the way it is?

What Matters?

For a laboratory science of mind, either framing of the question might lead to interesting discoveries and eventually some of the discoveries may lead to valuable applications. However, for those with a practical bent, who are interested in a cognitive science that provides a foundation for designing quality human experiences, the second question provides a far more productive path. For example, if the goal is to increase safety and efficiency and to support problem solving in complex domains such as healthcare or transportation, then the ecological framing of the question will be preferred! You can't design either training programs or interfaces to improve piloting without some understanding of the dynamics of flight.

If the goal is to discover what matters in terms of skillful adaptations to the demands of complex ecologies, then a triadic semiotic frame is necessary. To understand skill, it is not enough to know what people think (i.e., awareness), it is also necessary to know how that thinking 'fits' relative to success in the ecological context (i.e., the functional demands of situations).

Why Wundt?

The development of psychology as a science has tended to buy into and to reinforce the dichotomy of mind and matter. In most histories of psychology, Wilhelm Wundt's lab is identified as the first experimental psychology lab - as the birthplace of a scientific psychology. However, certainly there were others who had experimental programs before Wundt (e.g., Fechner and Helmholtz).

Perhaps the reason is that whereas Fechner and Helmholtz were studying relations between mind and matter (i.e., psychophysics), Wundt, with his emphasis on introspection, framed psychology as a mental chemistry. This methodology emphasized the distinction between the stimulus as an object in the ecology and the stimulus as a property of mind. And there was a clear understanding that only the properties of mind were of interest to the 'science' of psychology. In fact, Titchener would characterize associations between introspections and the ecological object as 'stimulus errors.' And Ebbinghaus would focus on nonsense syllables in an attempt to isolate the mental chemistry of memory from experiences outside the experimental context.

Of course, not everyone bought into this. William James characterized the experimental work of Wundt and Titchener as 'brass instrument psychology.' In framing a functionalist psychology, James was particularly interested in mind as a capacity for adaptation in relation to the dynamics of natural selection.  In this context, the pragmatic relations between mind and matter (satisfying the demands of survival) were a central concern.

Note that Wundt's research program was very broad, particularly if you consider his Volkerpsychologie. Thus, the key point is not to criticize his choice of focus or specialization. Rather, the point is that the later field of psychology, by choosing this focus as the 'birthplace' of the science, reinforced the idea that the science of psychology should be framed exclusively in terms of the mind, in isolation from matter (e.g., a physical ecology).

While Behaviorism brought the methodology of introspection under suspicion and shifted attention to 'behavior,' the idea of the 'stimulus' remained psychological (if not mentalistic) in that the nature of the stimulus (e.g., reinforcement versus punishment) was derived from its impact on behavior (e.g., increasing or reducing its likelihood), rather than from its physical attributes. Thus, the laws of learning could be pursued independently of any physical principles (e.g., the laws of motion).

The Computer Metaphor and Symbol Processing

With the development of information technologies, the mind again became a legitimate object of study. However, now the topic was not mental chemistry, but mental computation. The computer metaphor added new legitimacy to the separation of mind (i.e., software) from matter (i.e., hardware). And the new science of linguistics, with its basis in a dyadic model of semiotics (Saussure) shifted the focus to symbol processing in a way that made the link between the symbol and the ecology completely arbitrary.  The focus was on the internal computations - the rules of grammar, the 'interpretation' that resulted from the mental computations.  It became apparent to many that the stimuli for mental computations were arbitrary signs (e.g. C-A-T) and that the 'meanings' of these arbitrary signs were constructed through mental computations.

In this climate, people such as James Gibson, who followed the Functionalist traditions of William James in pursuing the significance of mind for adapting to an ecology, were marginalized. The field of psychology became the study of internal computational mechanisms for processing arbitrary signs. The focus of psychology was to identify the internal constraints of the computational mechanisms. In this context, the most interesting phenomena were errors, illusions, and biases, because these might give hints to the internal constraints of the computations.  A mind that was successful or situations where people behaved skillfully tended to be ignored - because the internal constraints were not salient when the mind worked well.

Neuroscience

Ironically, in linking mental computations to brain structures, the dichotomy between mind and matter continues to be reinforced, at least to the extent that 'matter' reflects the physical constraints in an ecology.  While neuroscience involves the admission that the hardware matters, by isolating the computation to the 'brain' there remains a strong tendency for psychology to ignore the role of other physical properties of the body and ecology in shaping human experience.  For many, neuroscience effectively reduces psychology and cognition back to a mental chemistry or to brain mechanisms that can be understood independently  from the pragmatic aspects of experience in a complex ecology.  In this regard, I fear that increased enthusiasm for neuroscience is a backward step or an obstacle to progress toward a science of human experience.