A Stanford undergraduate working on a case analysis of intuition versus systematic analysis emailed me last night to ask for my thoughts on the difference between the two, especially in light of the work that Jeff Pfeffer and I did on evidence-based management. Below is my lightly edited response. This is just off the top of my head (is it mostly intuition?). I would love to hear your thoughts on this distinction -- whether it is useful, how the two concepts fit together, when one is more useful than the other, and so on:
I don't think that intuition and evidence-based management are at odds. There are many times when decision-makers don't have very good data because something is new or the situation has changed (e.g., where do you invest money right now?), or because what might seem like intuition is really well-rehearsed behavior that comes from years of experience at something; even though people can't articulate the pattern they recognize, they are still acting on a huge body of experience and knowledge. And at the other extreme of experience, there are virtues to the gut reactions of naive people, as those who are not properly brainwashed may see things and come up with ideas that expertise drives out of their brains (e.g., that is part of why Jane Goodall was hired to observe chimps -- because she knew nothing).
The trouble with intuition is that we now have a HUGE pile of research on cognitive biases and related flaws in decision-making showing that "gut feelings" are highly suspect. Look up confirmation bias --- people have a very hard time believing and remembering evidence that contradicts their beliefs. There is also the fallacy of centrality, which is far more obscure but important: people -- especially those in authority -- believe that if something important happens, they will know about it.
My belief -- and it is only partially evidence-based, so this is conjecture more than hard facts -- is that intuition works best in the hands of wise people. When people have the mindset to "act on their beliefs, while doubting what they know," so that they are always looking for contradictory evidence, encouraging those around them to challenge what they believe, and constantly updating (but always moving forward), then I think that intuition -- acting on incomplete information, hunches, and tentative conclusions -- works well. Here is one place I've talked about it. Brad Bird of Pixar is a good example of someone with this mindset, as we learned when we interviewed him for the McKinsey Quarterly. So is Andy Grove. I think the most interesting cases to look at are those where people with a history of good guesses or gut decisions go wrong -- what mistakes has Steve Jobs made? What about Google? Indeed, it is interesting that they believed they were going to crush Firefox with Chrome, but their market share remains modest a year later. My point here isn't to say anything negative about Jobs or Google -- they have impressive track records, plus some history of the usual failures that all humans and human organizations suffer from. Rather, my point is that by looking at errors made by people and firms with generally good track records, you can learn a lot about the conditions under which judgment fails, because you can rule out the explanation that they generally suffer from bad judgment.
There is a lot written on intuition and the related topic of quick assessments --- see Blink -- and some evidence behind it (although Gladwell exaggerates the virtues of snap judgments, as the best are often made by people with deep experience in the domain; as always, though, he makes wonderful points). Also see this book by David Myers for a balanced and evidence-based perspective on intuition. My view is that intuition and analysis are not opposing perspectives, but tag team partners. Under the best conditions, hunches are followed and then evaluated with evidence, both quantitative and qualitative (that is another issue -- qualitative data are different from intuition, and often better). Under the worst conditions, hunches and ingrained behaviors are mindlessly followed, impervious to clear signs that they are failing.
Work Matters readers: Again, I would appreciate your thoughts, as this is one of those core challenges for every boss and for a lot of behavioral scientists too!
"Informed Intuition"
That's the term we use for the strength of experience-led intuition. It's even a term we included in our Glossary of the Future because we believe it will be ever more important to develop this skill. We've been polishing our Glossary and this is just the link we needed to top it all off; definitely was worth waiting for! Now we can launch it with extra confidence next week. Thx! ;)
Posted by: Nancy Giordano | August 20, 2011 at 07:50 PM
Intuition or 'it feels right' is a product of deliberate practice. Story on chess prodigy - http://scienceblogs.com/cortex/2010/01/chess_intuition.php?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+scienceblogs%2FwDAM+%28The+Frontal+Cortex%29&utm_content=Google+Reader
Posted by: Account Deleted | January 22, 2010 at 04:03 AM
@Blink
beware Gladwell.
http://www.psychologytoday.com/blog/extreme-fear/200912/gladwells-stickiness-problem
Posted by: Raoul Duke | December 31, 2009 at 12:59 PM
Very important concepts...thank you for sharing. I agree with Sutton, this is not a dichotomy - intuitive OR data driven approaches. Like the saying goes...."the harder you work (data gathering), the luckier you get (better intuition)"!
Posted by: Aravind Manohar | November 11, 2009 at 12:26 AM
Ken Hammond's Cognitive Continuum Theory gives a framework for discussions like this (and more).
Posted by: BrianSJ | November 09, 2009 at 05:30 AM
Hi Bob,
A very thought-provoking article and some great comments by the readers. I sometimes wonder what percentage of our decisions is intuitive as opposed to evidence-based. I do believe most managers' decisions are intuitive, based on their experiences. Has there been any research done to determine this?
Posted by: Sudhir Mathew | November 04, 2009 at 11:05 PM
I have been considering the similarities between being evidence-based and test (assessment) validity, which is defined by Messick as:
. . . an integrated evaluative judgement of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of interpretations and actions based on test scores or other modes of assessment . . . (Messick, 1990: ED395031 at http://www.eric.ed.gov:80/).
I think validity relates to evidence-based practice because it links actions with measurement (leading to) evidence and theory. Guided by this perspective I follow with three points relating to your post:
1. All practice involves making judgements, which fall somewhere on a continuum between unsupported and well-supported. This fits (in my view) with your overall approach; I just think painting the choice as between judgement and evidence obscures the nature of decision-making. Evidence does not lead directly to action without passing through judgement.
2. How evidence and expertise become useful is highly dependent on the situational contexts involved. Sometimes it's search for evidence then decide, sometimes it's decide and then search for evidence, and sometimes it's necessary to decide and move on when forced by morals and time constraints. (Kahneman, D. & Klein, G. (2009). Conditions for Intuitive Expertise: A Failure to Disagree. American Psychologist, 64(6), 515-524, has an interesting take. Sorry, thanks to the APA this is not openly accessible. I briefly reviewed it here if anyone does not have access: http://howardjohnson.edublogs.org/2009/09/22/naturalistic-decision-making-or-algorithmic-practice-which-is-appropriate-and-when/ )
3. All evidential reasoning is multifaceted (a nomological network, even if imperfect) and, especially if it involves humans, a moral network. (Orton builds an interesting case here: http://www.ed.uiuc.edu/eps/PES-Yearbook/1998/orton.html)
Posted by: Howard | November 04, 2009 at 09:18 AM
"Intuition" and more formal methods work together in specific ways. It is not hard to find real situations where either class of approach can mislead us. "Intuition" has its strengths in things for which we have deep expertise or some sort of biological preparation. Formal systematic methods are strongest in areas where our experience fails us and where we have reliable ways of framing the problem and choosing between methods. In real problems, both of these situations tend to occur.
My general impression so far is that the two critical leverage points here are: (1) the skill of distinguishing situations where expertise and rapid cognition are most likely to be reliable, and (2) learning how to size up situations and choose systematic formal methods based on expertise.
The literature on naturalistic decision making is useful here. I've found books and journal articles by Gary Klein particularly relevant.
kind regards.
Posted by: Todd I. Stark | November 03, 2009 at 06:29 PM
I make every decision with my gut. My gut, however, makes better decisions with the benefit of fact-based input. So it's a matter of applying the discipline of passion-free analysis, then trusting your judgment.
Posted by: www.facebook.com/profile.php?id=745791008 | November 03, 2009 at 08:34 AM
So true. No one wants to be guided by the intuition of unwise people.
Posted by: working girl | November 03, 2009 at 07:08 AM
My undergrad physics prof. defined intuition as what you get from solving lots of problems.
Posted by: Michael F. Martin | November 02, 2009 at 03:05 PM
Don't forget the context of these decisions: I don't know what evidence exists, but I know of a lot of anecdotal evidence that entrepreneurs are often horrible about overvaluing intuition -- which is how they "sniff out" opportunities and create their valuable organizations -- and undervaluing a lot of the data around them -- which is itself typically commonly accepted knowledge rather than factual data, anyway. That's not to say that after a while the entrepreneur doesn't have to start relying heavily on data for decisions to be successful, only that in a lot of circumstances the context the decision is being made in is incredibly important.
It's funny, also, that in a lot of ways the assertion that we should act on our beliefs while doubting what we know could be counter-productive in a number of entrepreneurial settings as well. It's what ultimately makes most entrepreneurs incapable of running the companies they found--and can make them intolerable asses to work with--but it's also a lot of why they succeed, initially, in creating something out of nothing.
I agree that they often work together, but I'd say that my experience has been that the context of the decision has an incredibly important role to play in how to evaluate their success or failure.
Posted by: Andrew | November 02, 2009 at 08:21 AM
It's fascinating to study Silicon Valley CEOs and how they make decisions. They see possibilities long before anyone else sees them. But what's more important, they act on the possibilities they see. Even if other people see the same possibility, they don't believe in themselves enough to make it happen, or they don't have the resources or connections. It's hard to base that kind of decision making on data. The only thing you can do is to make a thorough analysis and ask yourself: does it make sense, do I have a good feeling about this? As a designer and programmer, I believe in gut feeling. If you get excited about your own application and you get that special feeling in your body, you are probably on the right track. Is it a guarantee? Absolutely not.
Posted by: Jan | November 02, 2009 at 12:03 AM
Bob - Gary Klein's work here is definitely worth a mention. I think we have to confront the fact that we are intuitive beings who can choose to use analysis when we want to (rather than the other way round).
Posted by: Matt Moore | November 01, 2009 at 08:09 PM
For Steve Jobs "mistakes" see the early history of Pixar, NeXTSTEP, Apple TV, the Apple cube…
Who said Chrome was going to crush Firefox? And Chrome market share is actually quite significant given the state of its development and age.
Posted by: Nivi | November 01, 2009 at 07:27 PM
Hi Bob,
Wonderful and thought-provoking, as always. I'm struggling with a related question right now, which is how best to acquaint students in leadership/ethics classes with common cognitive errors in ways that they can learn about their own tendencies and find ways to build checks and balances into their working styles as they embark upon the rest of their business careers. I'd welcome your thoughts as well as those of your readers.
CKG
Posted by: CKG | November 01, 2009 at 06:14 PM
I rely very heavily on my intuition but, perhaps, in a slightly unusual way. I seldom use it to make decisions, rather, I have learned to listen for that little niggle that is my intuition telling me I've missed something or I'm going down the wrong route. Once I've identified the niggle, I can figure out why it's there using research and evidence and finally make my decision.
In other words, I use my intuition not to make the right decision, but to save me from the wrong one.
Posted by: Ellie | November 01, 2009 at 05:19 PM
Very interesting. I've been toying lately with the notion that most cognitive biases have roots in functional behaviors.
In the same way that optical illusions are interesting not just because they are wacky or mind-bending, but because they reveal things to us about how the brain is adapting or interpreting the visual image, cognitive distortions are interesting in that they make explicit and visible the cognitive shorthand that we are using to interpret the world all the time.
I think it's primarily an efficiency of the brain -- if we couldn't automate some (or even the majority) of our decision-making, we'd never be able to get things done with any efficiency (similar to the phenomenon that Antonio Damasio described around the difficulty of decision making in the absence of emotion).
The disconnect shows up when those automated patterns of thinking become calcified or lazy, or when the thought pattern is so ingrained that it's completely transparent to the individual. I'm particularly interested in the cognitive distortion you described in the Flawed Self-Evaluations post last week. The prevalence of the inflated self-view (the majority of people believing that they are of above-average intelligence, or better-than-average drivers, etc. -- all statistical impossibilities) makes me wonder if there isn't some functional basis for those beliefs (although it could be as simple as needing to protect one's self-esteem, or statistical illiteracy).
Similar to the idea from evolutionary psychology that an attraction to foods with high caloric density (sugar or fats) once conveyed an evolutionary advantage but now works against us in our food-abundant societies, it seems like many cognitive biases have functional roots in the right context, and it might be useful to identify those contexts to better understand where those same behaviors are then misapplied. With a number of cognitive biases, it's fairly easy to hypothesize a context where that behavior could be valuable (the fallacy of centrality, for example -- in most cases, people are probably *right* to think that if something was going on, they'd know about it, and if they couldn't use this mental shorthand, they would be hopelessly mired in detail or wild goose chases).
The question remains -- what to do about it? Evidence-based management is definitely one key tool to check against intuition or habit. Also, I think the mindset you described of "act on their beliefs, while doubting what they know" is very useful. But because the behaviors are so automated, it becomes particularly difficult to recognize and question them. It might be useful to have some predefined criteria that trigger specific analytical activities to guard against it. Some people (as you've described) seem to do it naturally, but the rest of us may need to define implementation intentions (http://bit.ly/2JkQuw) for our own behavior (If I find myself doing X, I will sit down and do Y).
After all, which of us haven't had the experience where you were absolutely certain you were right, had no reservations about expressing your *rightness*, and then found later you were...um...yeah...completely wrong?
Thanks for sharing your thoughts on this -- very helpful (and thought-provoking - clearly it's been on my mind...).
Posted by: usablelearning.wordpress.com | November 01, 2009 at 02:31 PM
As always, thank you so much for the thoughtful post! I've only got one slight suggestion, and it's a difference in shading only. You mention that intuition and analysis might play as a tag team, but maybe that divide isn't so great. In some cases, like some great moments in professional wrestling, the two partners might both jump into the ring simultaneously. This is not that far away from improvisation in jazz, where what appears spontaneous might actually be the result of major practice and study -- brought to life by carefully listening and acting within the same moment (Weick, 1998).
And of course, the real question on my mind (and why I love your work) isn't just about how this plays out in the minds of individual actors or managers, but how managers might promote improvisation, listening, and application of skill for others in their organizations. This interesting TED talk about musical conductors brought this idea to life for me recently. http://www.ted.com/talks/itay_talgam_lead_like_the_great_conductors.html
Weick, K.E. (1998). Improvisation as a mindset for organizational analysis. Organization Science, 9(5), 543-555.
Posted by: Vincent Cho | November 01, 2009 at 12:14 PM