A look at why intuitive reactions can hinder success

Firefighters sometimes lose their lives because of a loss of Situational Awareness. [1]

Situational Awareness is the ability to capture the clues and cues around you and see bad things coming in time to change the outcome. In IT support, Situational Awareness can be hard to maintain in high-stress situations, yet some simple changes to a team's environment and working practices can help drive more successful outcomes.

Customer satisfaction depends heavily on the speed with which incidents are resolved. Yet we must be careful not to rush to an answer when an incident is reported to us. Our brain can trick us into jumping to incorrect conclusions. Nobel laureate Daniel Kahneman advocates 'slow' thinking under certain circumstances, many of which arise during the lifecycle of an incident.

For example, how often in resolving an incident have you thought: "How could I have taken that wrong turn?" To what extent did your intuition let you down? Incident Management relies on intuitive thinking based on knowledge and experience, but once an issue has escalated in complexity, a different kind of thought process is needed.

In our consultancy practice, we frequently encounter organizations paying insufficient attention to critical moments in the lifecycle of an incident. These organizations often fail to gain:

  • An accurate understanding of the priority of an incident (not just “P1,” which is meaningless on its own, but an actual description of the incident’s effects);
  • A quality understanding about why the issue/product/situation is failing and the way in which it’s failing;
  • A well thought out risk analysis of actions that are being planned.

When a problem is not clearly described, the search for a solution is hindered. Resolution slows down, which is not only expensive but also leaves everyone involved unhappy.

The Trouble with Intuition
Daniel Kahneman, who in 2002 became the first psychologist to win the Nobel Prize in Economics, explains why an intuitive reaction is not always the best one.

In his groundbreaking book Thinking, Fast and Slow [2], Kahneman distinguishes between intuitive ('fast') thinking and rational ('slow') thinking. He illustrates how intuitive reactions can lead to problems and explains the limitations of our common sense.

We tend to believe that we assess problems correctly, with a quick and accurate understanding. As a result, we often respond quickly and intuitively. But beware: as Kahneman warns, our brain plays games.

The following five examples illustrate why intuition is not always a trustworthy advisor:

1. The Halo Effect: [3] Believing that when a certain quality is present, it indicates other qualities are present as well. For example, a child is good at languages and reading and will therefore probably also do well in other subjects.

2. WYSIATI (What You See Is All There Is): Due to tunnel vision, we are not open to other observations. A familiar example is the video in which a gorilla walks through a scene and viewers fail to notice, because they have been instructed to focus their attention on another activity.

3. Framing: The same information can be framed as both positive and negative, depending on how the message is stated. For example, which product would you prefer: one that is 1% contaminated or 99% pure?

4. The Anchoring Effect: We make decisions relative to a certain reference point, the anchor. We are strongly influenced by the way facts and figures are presented to us, even when they are not really relevant to the issue.

5. The Availability Bias: We consider an event more likely if we can readily recall a clear example of it. We suffer from selective memory and remember the impactful, unusual occurrences. For example, because the media report heavily on kidnappings, we believe more kidnappings have occurred this year.

In the next article in this two-part series, we’ll take a look at how to distinguish between decisions that require fast and slow thinking—and how this impacts your customers.

[1] FireChief.com, November 14, 2012.
[2] Kahneman, Daniel. Thinking, Fast and Slow. Penguin Books, 2011.
[3] The Halo Effect is a term coined by psychologist Edward Thorndike.

Part 2: How Our Brain Plays Games
