

Four years ago, I left academia to join a small design startup, Mile Two. Ironically, after more than 30 years in academia teaching about applied cognitive psychology, human factors and cognitive systems engineering, it feels like my education on design has just begun. I discovered that essential people on our design teams were trained very differently than I was and brought skills and talents to the team that I was lacking. I recently collaborated with some of these teammates on a chapter to reflect on some of the different perspectives that contribute to the development of software to support cognitive work. The first figure was our attempt to summarize some of the different disciplinary perspectives and the second figure is a more elegant revision of that figure suggested by my friend and colleague Fred Voorhorst. 

Figure 1. Multiple perspectives on design from Flach, Bennett, Butler & Heroux (In press Handbook of Human Factors). 

Figure 2. Multiple perspectives on design by Fred Voorhorst.

These diagrams suggest four classes of constraint that must be considered in the design and development process:

  • Agency Constraints: These are properties of the "agents" in the system. Traditionally, the agents have been humans, but as machines become increasingly autonomous, we now have to consider the internal constraints of these autonomous agents. 
  • Implementation Constraints: These reflect properties of the computers that will host the software. They include things such as the types of displays and the control/input devices.
  • Functional Constraints: These are properties of the work domains or the problem spaces that are the targets for the cognitive work. These include the physical, regulatory, and organizational dynamics that limit the range of possible actions and consequences. 
  • Aesthetic Constraints: These are properties of the psychological and cultural values that determine the degree of satisfaction or frustration associated with using the software products. 

All of the four disciplines, at least implicitly, recognize the full range of constraints, however, each discipline tends to emphasize a subset of these constraints in training and practice. 

Human Factors [HF] tends to emphasize the constraints associated with human agency. That is, the focus is on specifying the physical [e.g., reach envelopes], perceptual [e.g., visual acuity] and cognitive [e.g., working memory capacity] limitations of humans, so that these limitations are respected in the final design. 

User Interface Design [UI] or Human Computer Interaction [HCI] tends to emphasize the impact of various computer capabilities on performance. For example, considerations include types of displays [e.g., table-top, large-screen, hand-held, or head-mounted], types of controls [e.g., keyboard, mouse, game controller, touchscreen], and types of mediation [e.g., co-located or distributed via internet]. 

Cognitive Systems Engineering [CSE] focuses on the problem or work domain constraints. This involves describing the means-ends relations in terms of affordances and exploring alternative strategies or heuristics that might be used to efficiently navigate the functional space. 

User-eXperience Design [UX] tends to consider the larger psychological and cultural context of use that impacts people's self-image and sense of satisfaction when using the design product. 

I come from an HF perspective and was involved in the early development of the CSE perspective. Thus, much of my career has been spent arguing for the value of a CSE perspective for filling gaps in the design space that I felt were not adequately addressed by the HF perspective. The point was not that the HF perspective was wrong, but that it was not sufficient. 

However, over the last few years spent developing software products, I have come to realize that even the combination of HF and CSE leaves many gaps in the design space that require skills and perspectives associated with UI and UX design. I realize that my sense of the full demands of design was still quite naive and inadequate. 

I now regret that I did not have, and was not able to give my students, a broader sense of the full breadth of the design challenge. Yet, I realize that it would be impractical to cover the full breadth of design in any reasonable course of study. Thus, much of the education must continue beyond the formal course work. And ultimately, I have come to the conclusion that design is a team sport that requires a diverse set of perspectives and people who are open to, and who welcome, alternative perspectives. To function on such a team I have had to dampen my tendency to advocate for a particular perspective, and I have had to heighten my ability to listen to and appreciate the value of alternative perspectives. 


I've spent much of my career exploring how representations can be designed to help people manage and control complex systems. The goal has been to use representations to increase perspicacity and to help people 'see' what matters with respect to the situations they are managing.

So it makes me sad and angry to see people intentionally use graphics to hide information and to purposely mismanage situations - in this case, in a way that can have severe consequences. Look at these two graphics and note how the images are manipulated to misrepresent the COVID situation in the state of Georgia. This manipulation is designed to create the impression that there is no increasing risk of COVID in Georgia, and it is being used to justify policies and decisions that threaten the health of people in the state.

The graphic below is a more current version. Note that the numbers continue to increase, but the graphic remains much the same - small red spots in a sea of blue. Note that the stated intention of the graph is to "aid understanding whether the outbreak is growing, leveling off, or declining...." But of course, the increases across these graphs will remain hidden as long as the color-to-number mapping changes to accommodate the increasing number of cases.
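This color-rescaling trick is easy to reproduce. Here is a minimal sketch (the county names and counts below are invented for illustration, not Georgia's actual data) showing that when each map's color bins are normalized to that map's own maximum, uniform growth leaves every color unchanged:

```python
# Hypothetical counts illustrating how re-normalizing a map's color
# scale to the current maximum hides growth.

def color_bin(cases, max_cases, n_bins=5):
    """Assign a color bin (0 = lightest, n_bins - 1 = darkest) by
    normalizing the count to the current map's maximum."""
    if max_cases == 0:
        return 0
    return min(int(n_bins * cases / max_cases), n_bins - 1)

week_1 = {"County A": 100, "County B": 400, "County C": 1000}
week_4 = {name: n * 3 for name, n in week_1.items()}  # every county tripled

bins_1 = {name: color_bin(n, max(week_1.values())) for name, n in week_1.items()}
bins_4 = {name: color_bin(n, max(week_4.values())) for name, n in week_4.items()}

# bins_1 == bins_4: the map is colored identically in both weeks,
# even though every count has tripled.
```

Holding the color thresholds fixed across weeks (binning against a constant reference maximum) would make the growth immediately visible.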

In my view, the public officials responsible for these graphics are criminally liable for intentionally misleading the public. I don't understand why every newspaper in Georgia is not calling out this deception and publishing corrected graphics that show the true state of the COVID situation. This is not the first time that state officials have presented misleading data. The image below is from a press conference in May to help make the case that things were getting better. Note that the dates on the x-axis are not chronological.

For those involved in designing technology - this illustrates the powerful impact that representations can have on how people think. And this is an important reminder that we have an ethical responsibility to use that power to enhance and improve human experience.


Cartoon created by Fred Voorhorst

Last week, in the middle of the pandemic, the highly publicized police killings of black men, and the resulting protests and demonstrations I learned of the death of Professor Anders Ericsson. Professor Ericsson was a preeminent psychologist who studied the development of expertise. He was interested in the development of high levels of skill that allow 'experts' to do things that are beyond the capacity of most humans. In particular, his work was instrumental in illustrating that deliberate practice was critical in developing the heuristics that allow experts to become both faster and more accurate in processing information. In essence, because of these tricks of the trade, experts are able to avoid the information limits that bound the performance of most humans. This is the positive, good aspect of heuristics. These heuristics allow experts to focus on the patterns (or chunks) that specify significant aspects of situations and to coordinate their action to respond automatically - quickly and accurately. 

The term heuristic is also used to describe biases in decision making. As the extensive research of Kahneman and Tversky demonstrates, heuristics can lead people to make choices that violate the conventions of traditional logic. Heuristics are a form of bounded rationality that apply within certain domains of activity. Thus, the automatic responses that work in one set of situations can seem illogical or mindless when they are applied in situations outside that domain. This is the negative, bad side of heuristics - they can lead to performance that Jim Reason referred to as 'strong, but wrong.' 

The ugly side emerges when we put people in domains where the circumstances lead them to develop heuristics or implicit biases that not only violate logic, but that violate our cultural values!  

My father entered the Marines when he was 17 at the end of WW II. He only served for two years and never left the country. But in his forties, when someone tried to mug him on the street one night, he automatically responded as he had been trained - slash, kick, gouge! It thwarted the attack and may have saved his life.

There are many parallels in the training of police and soldiers - to prepare people to respond automatically to defend themselves and to survive potentially deadly situations. It is very tempting to attribute these mindless responses to evil intent of individuals, but we might consider that these implicit biases are a product of training and socialization. These biases are the result of years of deliberate practice!

In some cases, the violent acts that we see on the cell phone videos are the result of many years of deliberate practice that results in mindless responses to situations. The implication is that the problem of police violence is not simply a function of a few bad apples, but the product of a system that deliberately trains mindless responses to threatening situations. 

In hindsight, it is easy to attribute the clearly mindless violence to evil intent. But, at least in some cases, consider that these mindless responses are the product of a system of training designed for soldiers, not for peace officers. We want to blame the officer, but this will not ultimately solve the problem of police violence. Ultimately, we must change the system and reconsider what type of skill training is most important for creating expert peace keepers, rather than expert soldiers. 


It's the same temperature, as measured by the thermometer, but one person experiences miserable cold, while another experiences a refreshing chill. Which experience is 'true' or 'real'? Or are both experiences illusions (i.e., purely subjective)? 

Most introductory psychology texts describe an experiment in which the participant leaves one hand in a bucket of ice water and the other hand in a bucket of hot water for a few minutes, then simultaneously puts both hands into a bucket of water at room temperature. What happens? One hand experiences the water as being warm, while the other hand experiences it as being cold. 

Does this demonstrate that perception is illusory? Or does it suggest that the experience of temperature is dynamic - that it reflects change, or relations over time (as opposed to isolated events in time)? 

Imagine two intersecting lines on a graph, one sloping down and the other sloping up. At the intersection, both point values are identical - but the slopes are different. Classically, we have tended to treat human experience as if it were a collection of isolated points in time. Thus, we have failed to consider that the slopes may be different. 

If experience is dynamic, then it is essential that we consider the points as integrated components of the line. Rather than isolating behavior in time, we need to examine behavior over time (we need to consider the lines). 

From the perspective of dynamics, the experiences of both people can be real. Both experiences are partial functions of the current physical temperature (as measured by a thermometer), but they are also functions of different past histories and different potential futures (e.g., intentions). Though the temperature 'points' may be the same for both, the slopes may be very different. The experiences may be situated on different trajectories. 
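The two-lines picture can be made concrete with a toy calculation (the trajectories and rates below are invented for illustration): two linear temperature histories that pass through the same value at the same instant, but with opposite slopes:

```python
# Hypothetical skin-temperature trajectories, degrees C as a function of
# minutes elapsed. One hand is cooling after the hot bucket; the other
# is warming after the ice bucket.
def cooling(t):
    return 40.0 - 2.0 * t

def warming(t):
    return 5.0 + 2.0 * t

t_cross = 8.75  # 40 - 2t = 5 + 2t  =>  t = 35/4

# Same 'thermometer reading' at the crossing...
point_c = cooling(t_cross)  # 22.5
point_w = warming(t_cross)  # 22.5

# ...but opposite slopes (finite-difference estimate):
dt = 1e-3
slope_c = (cooling(t_cross + dt) - cooling(t_cross)) / dt  # about -2.0
slope_w = (warming(t_cross + dt) - warming(t_cross)) / dt  # about +2.0
```

An observer who records only the point values sees identical experiences; an observer who records the trajectories sees two very different ones.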

When the different experiences are dismissed as 'subjective,' there is an implication that all experience is illusory. That experiences are groundless with respect to the objective physical situations (e.g., the objective temperature). That experiences are 'in the head.' However, if you view experience as a dynamic property over time (e.g., with both a position and a velocity or slope), then you can see that the differences may be due to the fact that both are grounded, but in different ways. In this context, both experiences can be considered to be 'real' (in the sense that they are grounded, but with respect to objectively different situations).

Note that (as with constructs such as affordance) the 'situation' is not purely physical (objective) or purely mental (subjective). Both the situation and the experience are dynamical phenomena reflecting both where agents are coming from and where agents are going. And that trajectory is shaped by both objective physical properties (e.g., the temperature) and by mental properties (e.g., an intention).

Cognitive Science will resolve many mysteries and contradictions, if it will only connect the dots and begin to consider trajectories as the fundamental units of analysis. 


Ever since the Cybernetic Hypothesis was introduced to Psychology, there has been greater appreciation of the "intentional" nature of cognitive systems. Yet, despite this awareness, causal (or stimulus-response) forms of explanation continue to dominate the way many people think about how humans (and other animals) process information. For example, most cognition texts begin with sensation and then follow 'stimulation' through successively deeper levels of processing (perception, decision-making ...).

One result of this framing is an implicit suggestion that sensations cause action. And there is a danger that people fail to appreciate many of the significant aspects of the circular coupling of perception and action (e.g., self-organization to skillfully and creatively adapt to the ecology) that differentiate animals from plants. 

While it is true that in a circular coupling, there is no sense in which any element in the circle must be given priority as "the cause," I wonder if a simple reorganization of how we depict the dynamic would help people to break away from conventional notions of causality and to better appreciate the intentional dynamic of cognitive systems.

Perhaps, the most important implication of the Cybernetic Hypothesis is that 'action' becomes the prime mover of the dynamic. In a framing that gives priority to action, looking becomes the prerequisite for seeing, and the function of our senses is to serve, rather than to cause action.  

Ideally, science is motivated by the curiosity of individuals, and success depends on their ability to formulate well-structured questions about important phenomena. But in practice, science requires resources, and these resources depend on the ability of individuals to convince those with the resources that they have an answer to some important contemporary problem. 

The process of convincing the people with the resources to fund your curiosity often hinges on the ability to provide a simple, easy-to-understand answer to a complex problem. This is where having the right buzz word can make all the difference. For example, in seeking funds to explore the teaming of humans with autonomous systems, one might frame the problem in terms of self-organizing dynamics or trust.

I think a strong case can be made that both words are valid descriptions of important aspects of the natural phenomenon. And conversely, a case can be made that both are 'buzz' words. That is, they are fashionable terms or jargon that tend to be open to a broad range of interpretations and uses. 

As buzz words, both terms tend to suggest ways to reduce the complex problem into simpler terms. For example, the term self-organizing systems can suggest reducing the phenomenon to a particular model (e.g., coupled pendulums) or to a particular methodology (e.g., 1/f).  Similarly, trust can suggest reducing the problem of human-technology interactions to simple analogs of human-human interactions. In both cases, the buzz words tend to reduce the problem and to narrow attention to specific dimensions that are familiar and potentially manageable. Somehow the problem becomes less mysterious and there appear to be obvious solutions. 

While the reductions and suggestions of solutions are extremely useful for marketing work to funders and gaining resources, there are also obvious dangers. Too often, the reductions associated with buzz words tend to hide the natural complexity - trivializing the natural phenomenon. Thus, if researchers get caught up in the 'buzz' of marketing, then research programs can end up being framed around the trivializations, rather than the real phenomenon. In the worst case, the buzz words become the answers, rather than the questions; experiments become demonstrations of trivial relations, rather than tests of interesting hypotheses; and the results have little practical value relative to the actual problem (e.g., how to improve the performance of human-autonomy teams). 

For me, self-organization and trust suggest important questions about the nature of human-autonomy teaming. However, I get worried when I see them being marketed as 'answers.'




In discussions about the nature of cognition, a central question focuses on how meaning emerges from interactions between agents and their environments. It seems clear that the 'meaning' of any object depends in part on properties of the object, in part on the observer, and in part on the situation. For example, consider the following observations from Rasmussen (1986):

The way in which the functional properties of a system are perceived by a decision maker very much depends upon the goals and intentions of the person. In general, objects in the environment in fact only exist isolated from the background in the mind of a human, and the properties they are allocated depend on the actual intentions. A stone may disappear unrecognized into the general scenery; it may be recognized as a stone, maybe even a geologic specimen; it may be considered an item suitable to scare away a threatening dog; or it may be a useful weight that prevents manuscript sheets from being carried away by the wind - all depending on the needs or interests of a human subject. Each person has his own world, depending on his immediate needs.

(p. 13)

There are two subtly different ways to think about the dynamics of experience that underlie the emergence of meaning. Conventionally, constructivist approaches to cognition talk about making meaning. This makes a lot of sense in the context of language, where arbitrary signs such as a sequence of marks on a page (e.g., C - A - T) are interpreted relative to prior learning about alphabets and word definitions. The suggestion is that meaning is the result of adding prior knowledge to the arbitrary sign to make (or construct) meaning. The implication is that the symbols are meaningless until they are interpreted.

An alternative way to think about the dynamic of experience, that reflects ecological or situated perspectives on experience, is that meaning is discovered. This perspective makes a lot of sense in terms of perceptual-motor skills. For example, we discover affordances like graspable and reachable by interacting with the objects in the environment. The underlying relations that determine whether an object will fit comfortably in the hand are not arbitrary (though the affordances of a specific object like a basketball may vary from individual to individual as a function of hand sizes). Affordances reflect meaning-full properties of the ecology - that exist independent from perception or interpretation. The intention will not be realized if the affordance is not detected, but the affordance exists and can be specified objectively, whether or not it is ever realized in action. Further the meaning can be mis-perceived, but will be corrected through the feedback that results from acting on the misperception.

The framework of meaning making makes sense if you think about the stimuli of experience as punctate instances in time (e.g., isolated frames in a movie reel). In this case, experiencing a melody requires that the significance of a particular note be constructed by retrieving the prior notes from memory and mentally adding them together to re-construct the melody.

In contrast, the framework of meaning discovery suggests that perceptions are not punctate, but that they are extended over time so that the pattern of notes is experienced as a whole (as a chunk). This extension may go beyond the notes heard to include prior experiences with a particular melody that allow prediction or anticipation of the entire melody. The metaphor does not have to invoke memory in terms of adding the prior notes. Rather, the metaphor is one of attuning or resonating to a pattern - and recognizing a melody.

Note that the meaning discovery framework does suggest the existence of mental structures (schema or frames) - but these structures function more like filters - that resonate to some properties or patterns, as a function of prior experience. In this framework, the function of experience or learning is not about storing past instances (that can be added to new instances to construct meaning), rather it is about tuning attention to those properties of experience that have functional significance (e.g., tuning the weights in a neural net).

Back to the processing of (C-A-T). These symbols may be arbitrary in that there is no obvious physical or analogical relation to the animal that they represent. But they are NOT arbitrary in a cultural sense. If we assume that the meaning of C-A-T is created by a culture - not by an individual mind - then the meaning discovery framework can be sensibly applied to language as well as to perceptual-motor skills. In this sense, learning language is not about creating meaning from arbitrary signs - but about discovering the cultural significance of the signs (in the same way that discovering affordances is about discovering the significant action properties of an object).

The danger of the constructivist framework where minds make meaning is the implication that everything is meaningless until it comes in contact with a mind. There is a subtle implication that we live in a meaningless world. I can't accept that implication - and thus prefer to think of the dynamic of learning and experience as one of discovering meaning. There is a subjective dimension to meaning, but I can't accept that meaning is purely subjective.


Pirsig "Zen and the Art of Motorcycle Maintenance"

The development and standardization of metrics were critical to the development of science. Standard metrics provided "objective" standards for describing events and experiments to ensure that they could be replicated and generalized appropriately. Without objective standards of measurement there could be no science.

Development of objective, observer independent standards of measurement was essential to the success of the physical sciences.

However, the great error of Western Science was to take the description of the world in terms of these metrics as an objective reality - in opposition to a subjective reality! The implication is that the objective distance in terms of meters is true, but that functional relations such as graspable, reachable, near, or far are 'subjective.' This implies that the variability associated with individual differences along such dimensions is "noise" with regard to the "true" reality. And there is an implication that this "noise" has to be somehow filtered and added to in order to construct a mental model of the objective truth - in relation to the standard metrics (e.g., the size in meters).

One implication is that, since people and animals are not well calibrated to the standard metrics, their perceptions of the world must be 'indirect,' and it is therefore necessary for them to reconstruct the true world (recover the correct standard) in order to act appropriately.

Another implication is that many of the relations that directly impact how people make judgements about graspability (e.g., their own hand size), reachability (their arm length or height), or closeness (e.g., available modes of transportation) are less real - less basic - or that they are derivative. But of course, these relations are every bit as 'real' and every bit as specifiable as the elements comprising these relations.

These relations are part of a "whole" that cannot be discovered in the components. These relations are 'emergent properties' of the whole. A central premise of ecological psychology is that these emergent properties are 'essential and fundamental' elements for a science that hopes to describe how people adapt to their ecologies. Ecological Psychology argues that the size of an object relative to a hand or the distance to a cliff relative to your height is every bit as objective as the size relative to a meter stick.

Further, ecological psychology argues that these functional relations exist in the world to be discovered and perceived directly. And that there is information (e.g., structure in optical arrays) that specifies these emergent properties. Thus, there is no need for internal processing to construct or reconstruct these relations. These are NOT mental constructions - they are functional properties of the coupling of an animal with its ecology - they are properties of the umwelt. They are affordances that can be directly experienced.

Figure. 'Too close' as dependent on height and specified as a visual angle.

The mistake that Western Science has made is that it has taken the arbitrary metrics created to aid formal scientific enterprises as 'fundamental' and it has taken the relations that emerge from the functional interactions of people with their ecology to be 'derivative.' However, I think there is little doubt that the experiences of graspable, reachable, near or far are fundamental primitives of the human-ecology system. These pragmatic/functional relations are the raw primitives of experience. They are REAL! The metrics of objective science are also real - but they are the wrong level of description for exploring how people adapt to the functional demands of everyday living.

As Protagoras claimed: Man is the measure of all things.

In our everyday lives we directly experience the ecology in terms of the REAL properties that emerge as a function of the perception-action coupling with our ecology! We will never construct a satisfying understanding of human performance if we start by denying the reality of these essential emergent properties. Thus, the claim is that a science of human performance must be built using different bricks than those used to construct an 'objective' physical science. These bricks, these essential elements are different from those used by physicists, but they are no less real.

The essential elements for building a science of human experience are different than those that have been used successfully in building a science of an observer independent physical world. However, these elements are no less real.

The irony of using different bricks or working at different levels of description is that this may be the path that allows us to escape from a collection of little sciences to a single, unified science that spans the field of possibilities reflecting the joint constraints of mind and matter.

See What Matters for an exploration of the implications of these ideas for cognitive science and experience design.

Although I have used the term "wicked problems" in my writing, I only recently read Rittel & Webber's (1973) original description of this concept along with an editorial by Churchman (1967) commenting on his hearing Rittel talk about this construct.

I have little to add to the original formulation and encourage others to access and read both papers.

Rittel, H.W. & Webber, M.M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155-169.

Churchman, C.W. (1967). Wicked problems. Management Science, 14(4), B141-B142.

Rittel and Webber list 10 attributes of wicked problems, which I will list here, but I encourage readers to go to the original source for further explication.

  1. There is no definitive formulation of a wicked problem.
  2. Wicked problems have no stopping rule.
  3. Solutions to wicked problems are not true-or-false, but good-or-bad.
  4. There is no immediate and no ultimate test of a solution to a wicked problem.
  5. Every solution to a wicked problem is a "one-shot operation"; because there is no opportunity to learn by trial-and-error, every attempt counts significantly.
  6. Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.
  7. Every wicked problem is essentially unique.
  8. Every wicked problem can be considered to be a symptom of another problem.
  9. The existence of a discrepancy representing a wicked problem can be explained numerous ways. The choice of explanation determines the nature of the problem's solution.
  10. The planner has no right to be wrong.

From the Churchman article:

... the term "wicked problem" refers to that class of social system problems which are ill-formulated, where the information is confusing, where there are many clients and decision makers with conflicting values, and where the ramifications in the whole system are thoroughly confusing. The adjective "wicked" is supposed to describe the mischievous and even evil quality of these problems, where proposed "solutions" often turn out to be worse than the symptoms.

p. B141

Churchman raises some ethical issues, in the context of operations research (OR), associated with approaching wicked problems piecemeal, that I think apply far more broadly than to just OR:

A better way of describing the OR solution might be to say that it tames the growl of the wicked problem: the wicked problem no longer shows its teeth before it bites.

Such a remark naturally hints at deception: the taming of the growl may deceive the innocent into believing that the wicked problem is completely tamed. Deception, in turn, suggests morality: the morality of deceiving people into thinking something is so when it is not. Deception becomes an especially strong moral issue when one deceives people into thinking that something is safe when it is highly dangerous.

The moral principle is this: whoever attempts to tame a part of a wicked problem, but not the whole, is morally wrong.

p. B141 - B142

A consequence of an increasingly networked world is that our problems are becoming increasingly wicked. These two papers should be required reading for anyone involved in management or design.

Fred Voorhorst has created a poster to help us organize our thoughts with respect to the design of representations that help smart people to skillfully muddle through wicked problems. In the case of wicked problems, there is no recipe that will guarantee success - but there are things we can do to improve our muddling skill and to shape our thinking in more productive directions.