Biases and Lies, Told By Us, To Us

A while back Jaybird brought up “priming” while discussing the Yanny vs. Laurel debate. Cognitive bias, however, influences just about every thought you have in one way or another, and moving beyond merely managing its effects on behavior to actually understanding it is a complex endeavor.
“Why You Lie to Yourself,” by Ben Yagoda, writing in The Atlantic, takes on the subject in a new piece that gives us plenty to think about.

Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length. Here’s the key: Even after we have measured the lines and found them to be equal, and have had the neurological basis of the illusion explained to us, we still perceive one line to be shorter than the other.

[Image: Müller-Lyer illusion]

At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception. But that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”

Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves. Instead, it has been devoted to changing behavior, in the form of incentives or “nudges.” For example, while present bias has so far proved intractable, employers have been able to nudge employees into contributing to retirement plans by making saving the default option; you have to actively take steps in order to not participate. That is, laziness or inertia can be more powerful than bias. Procedures can also be organized in a way that dissuades or prevents people from acting on biased thoughts. A well-known example: the checklists for doctors and nurses put forward by Atul Gawande in his book The Checklist Manifesto.

Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative. These experiments are based on the reactions and responses of randomly chosen subjects, many of them college undergraduates: people, that is, who care about the $20 they are being paid to participate, not about modifying or even learning about their behavior and thinking. But what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?

The whole article is worth a read. So what do you think? Comment below, share and discuss.

Please do be so kind as to share this post.

  7 thoughts on “Biases and Lies, Told By Us, To Us”

  1. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,”

    Save while you’re young and can benefit from compound interest. Lots of other examples.

    Great subject.

    That Codex is unreadable if you click on it.


  2. Looking forward to reading the article. I read Kahneman’s book – which Yagoda references heavily – a while back and had very conflicting opinions.

    On the one hand the basic arguments are very sound and well backed. I do not doubt “system 1” and “system 2” as good ways of describing a true thing that happens.

    On the other hand a lot of the broad and overreaching claims he makes in the book generalizing from very specific situations were based on studies with an n of 20 or less, and other design flaws.

    As someone trained in biology, where n=20 is basically something you do because you’re curious about a beer bet, or a single run of something you’re going to repeat another 50 times … and NOT a real experiment, I see that, and I see him making a great deal of how scientific and amazing he and his partner are, and my hackles go way up. There were some other issues that I’ve forgotten too.

    I actually trusted Kahneman less by the end of the book than I did at the beginning.

    (end rant)

    Yagoda, on the other hand, I love reading and love his thinking patterns. So, looking forward to the article :D.


  3. This is based on little more than the article itself and no other expertise, but I get the sense that the two sides aren’t really disagreeing very much. Kahneman is mainly saying that you can’t change the biases of system 1 itself and it’s very difficult for system 2 to consistently account for them and override. Nisbett’s examples (and Yagoda’s experience) seem to be just cases where people have trained their system 2 to account for the system 1 biases in some specific circumstances, but not actually eliminated their bias. Given the 100 different kinds of biases mentioned, the training seems like just a drop in the bucket that picks off some low-hanging fruit, and no real threat to Kahneman’s belief.

    It’s like trying to write legibly while looking in a mirror — with a lot of practice you can eventually learn to do it reasonably well, but the end result is not that you’ve changed your perception to not see the reversed images anymore, just that you’ve learned how to temporarily override your existing habits in that context.

