Cognitive quirks impair our ability to reach correct conclusions – filters and interpretation

“Humans have biases that underlie how information is filtered, interpreted and often bolstered.” That summary, from an article in the MIT Sloan Management Review, Vol. 50, Winter 2009, at 43, provides a three-part framework for my posts about the cognitive distortions we all face. For general counsel – and indeed all lawyers – these cognitive traps create an unconscious tendency to frame complex or ambiguous issues in a certain way, without fully appreciating other possible perspectives, and then to become overconfident in that particular view.

First, we filter out much of what impinges on us. What we pay attention to is heavily influenced by what we expect or want to see (See my post of May 14, 2006: fundamental attribution error; July 10, 2007: fundamental attribution bias; and Nov. 21, 2008: value attribution that distorts perceptions.). Cognitive bias researchers refer to this filtering as “selective attention.” Whereas selective attention draws us toward what makes us feel correct, cognitive dissonance drives us away from thoughts that cause conflict and uncertainty (See my post of April 5, 2007: cognitive dissonance; and Nov. 21, 2008: cognitive dissonance and law firms we like.).

Second, even if we had no filters and perceived accurately, we would still suffer from all kinds of distorted interpretations (See my post of April 8, 2008: overconfidence, salience and confirmation biases; Jan. 18, 2008 #4: fallacy of misplaced concreteness; and April 5, 2007: diagnosis momentum.). We certainly can’t trust much-vaunted intuition (See my post of May 1, 2005: two limits of intuition; March 18, 2005: decision-making partakes of cerebral rationalizing as well as genetic hardwiring, chemical reactions, and bodily signals that guide us faster than thought; Sept. 4, 2005: lawyers over-rate their intuitive judgment and should use metrics more; and Dec. 3, 2007 #3: intuition and emotional intelligence.). Then, too, we listen selectively to advice (See my post of April 4, 2006: we over-value advice if a problem is hard, and undervalue it if the problem seems easy.).

Third, even if we filter fairly and interpret wisely, we tend to imprudently bolster the conclusions we reach (See my post of April 17, 2006: our proclivity to seek confirming evidence; July 10, 2007: confirmation bias; April 17, 2006: we uncritically accept information that supports our view; and Jan. 28, 2008: availability and premature solutions.).

Other cognitive traps include the sunk-cost fallacy (See my post of March 23, 2006: sunk-cost fallacy; and Aug. 5, 2005: illustration of the sunk-cost fallacy with facilities charges.), risk aversion as a rational blind spot (See my post of Aug. 24, 2008: lawyers and risk-averse behavior with 11 references.) and peer pressure (See my post of Jan. 15, 2006: groupthink.). Some of the chinks in the armor of our presumed rationality show up more generally (See my post of April 27, 2005: three key information failures.).

