Second-Guessing Confirmation Bias

Vikram Bath

Vikram Bath is the pseudonym of a former business school professor living in the United States with his wife, daughter, and dog. (Dog pictured.) His current interests include amateur philosophy of science, business, and economics. Tweet at him at @vikrambath1.

9 Responses

  1. greginak says:

    I can’t tell what agreeing with this piece means, or whether I should feel confident that I’m not sure or more uncertain. And if I’m uncertain about my possible agreement, is that good or bad?

    • Road Scholar in reply to greginak says:

      And so it begins…

      • Stillwater in reply to Road Scholar says:

        I see the bias greg is exhibiting all the time: bias-uncertainty. Sometimes a person is biased towards uncertainty and they see confirming evidence in just about everything. They’re even uncertain about essays that confirm bias-uncertainty.

        • Stillwater in reply to Stillwater says:

          Note that bias-uncertainty shouldn’t be confused with bias-curious, a phenomenon in which people temporarily “try on” various biases as a conscious rejection, on both a personal and a public level, of any identification with bias-confirmation.

          • Gabriel Conroy in reply to Stillwater says:

            Greg seemed to be making a good point, but then I read Road Scholar’s comment and that seemed spot on. Now I read yours, and I’m like, “What was I thinking?”

            • Stillwater in reply to Gabriel Conroy says:

              The technical name for what you’re experiencing is “bias-agreement”, in which a person’s bias is towards agreeing with the most recent information they’ve internalized. Btw, the type of bias I’m exhibiting is known as “bias-labeling”, in which a person’s proclivity to categorize and name various states of affairs in the world is confused with understanding how the world actually works.

  2. Guy says:

    I’ve read things and failed to be convinced by them. There are feelings associated with them, generally mediated by the style of the piece and the distance between its conclusion and my prior opinion. The feelings range from a slight sense of wrongness in the piece, to smug superiority about my own opinion, to visceral anger at the author for writing whatever it is they wrote.

    These feelings never seem to relate to a conscious process. I suspect they are associated with my tendency to nitpick – my tendency to say “I agree, except…”. I sometimes wind up accidentally arguing myself out of that initial “I agree”. But again, from the inside, it never feels conscious.

  3. Doctor Jay says:

    Well, just to be clear, nothing that goes before the Supreme Court is easy, or obvious. So yeah, it makes sense that both sides of the argument are going to sound pretty good.

    When we look at stuff that is more empirical, I think the bedrock is data. What does the data say? What are the limits to what the data can tell us? I keep a mental category called “Things I don’t understand” around. Sometimes data that doesn’t quite conform to what I think is true goes into this category, hopefully to be hauled out and reexamined later when even more data becomes available.

  4. DavidTC says:

    I don’t think what’s going on is ‘confirmation bias’ anymore.

    Confirmation bias is when you hear ten things and *remember* the two that confirm what you already thought, but not really the other eight, even if two of those eight argued against it.

    If ten stories about a part of town you think is dangerous happened to pass through your knowledge, you’ll remember the two that are about crime there. Meanwhile, if someone else is actively trying to fight the perception that that part of town is dangerous, they’ll only notice the two heartwarming stories.

    But that’s not what happens anymore. What happens *now* is selective pre-filtering, where people only *learn* things that confirm what they already thought.

    This always used to happen a little, of course. But it would happen in the form of rumors and whatnot, and at least part of our minds would put qualifiers on it. Maybe it’s true, maybe that guy is spinning it.

    But now…it’s actual news stories, on reputable news sites. And it’s mostly real facts we’re learning…we’re just not learning *opposing* facts. Nor do we receive any correction when it turns out the facts were wrong. (And if we do…traditional confirmation bias kicks in.)

    And our minds don’t seem to qualify those things with ‘Wait a minute…if all my friends have the same biases as me, isn’t it possible they’re only *sharing* stories that agree with my biases? When was the last time *I* read something that argued against what I believe, and then *I* shared it?’

    This trap seems a lot harder to escape than ‘confirmation bias’.