Let’s Deal the Cards!

50 Responses

  1. James Hanley says:

    which makes North the dummy.

    Good lord, I hope our dear Canadian/Minnesotan friend misses this post. That was totally uncalled for.

    The funny thing about the Monty Hall problem is that I’ve seen those clear and obvious solutions a dozen times, and I get them, but the answer still escapes my intuition. That and I have a tendency to confuse them with Monte Carlo simulation. (Not to mention the Full Monty.)Report

  2. Jason Kuznicki says:

    A great post on an important topic. Wikipedia’s “other host behaviors” table is quite interesting.

    To get a really thorough handle on the probability from a Bayesian standpoint, see here, then return to the Bayesian account of the Monty Hall problem.

    If you prefer to do without the math, just consider intuitively that the initial offer gave you a one in three chance of picking the winning door, while the second offer (switching after Monty reveals a goat) gives you a two in three chance of picking the winning door… only you haven’t taken that second offer yet, and you still need to do so.Report
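
    For anyone who wants the arithmetic spelled out, here’s a minimal Bayesian sketch in Python (the door labels, and the assumption that Monty flips a coin when he has two goat doors to choose from, are mine, just for illustration):

        # Bayes' rule for the classic game: you pick door 1, Monty opens door 3.
        from fractions import Fraction

        prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

        # P(Monty opens door 3 | where the car is), given that you picked door 1
        # and that Monty never opens your door or the car's door.
        likelihood = {
            1: Fraction(1, 2),  # car behind your door: he opens 2 or 3 at random
            2: Fraction(1),     # car behind door 2: he has to open door 3
            3: Fraction(0),     # car behind door 3: he never reveals the car
        }

        evidence = sum(prior[d] * likelihood[d] for d in prior)
        posterior = {d: prior[d] * likelihood[d] / evidence for d in prior}
        for door in sorted(posterior):
            print(door, posterior[door])  # 1 -> 1/3, 2 -> 2/3, 3 -> 0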

  3. Jaybird says:

    The explanation for the Monty Hall problem that finally broke through my stupid, stupid brain was this one:

    Imagine that there are 100 doors. Behind 99 of them, there is a goat. You pick your door… and Monty Hall shows you what’s behind 98 of the other doors. Goats galore… then he asks you if you want to trade for the last one.Report

    • James Hanley in reply to Jaybird says:

      my stupid, stupid brain

      I thought you were an egghead.Report

    • Chris in reply to Jaybird says:

      Yeah, multiplying the doors always seems to help people see the way the probabilities work. I used to tell students to imagine a million doors. That always seemed to make it clear.

      In grad school, I took a modeling course with 10 students: 2 cognitive psychology students (one of them me), 7 robotics guys (they were all really into this), and 1 applied math guy. One of our first assignments (maybe our first assignment, I can’t remember) was to do a Monte Carlo simulation of the Monty Hall problem (this will make James’ head explode). The applied math guy did the simulation, got what we all got (you should switch), and still refused to believe it. He ended up arguing with the professor for most of the subsequent class about the mathematics of the problem. It was really amusing for the two innumerate cog psych guys (one of them me, in case you’d forgotten).Report
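
      For anyone who wants to try that homework themselves, a bare-bones Monte Carlo sketch of the three-door game looks something like this (Python; an illustrative rewrite, not the code from that class):

          import random

          def play(switch, n_trials=100_000):
              # Simulate the classic game and return the fraction of wins.
              wins = 0
              for _ in range(n_trials):
                  car = random.randrange(3)
                  pick = random.randrange(3)
                  # Monty opens a door that is neither your pick nor the car.
                  opened = random.choice([d for d in range(3) if d not in (pick, car)])
                  if switch:
                      # Move to the one door that is still closed and isn't yours.
                      pick = next(d for d in range(3) if d not in (pick, opened))
                  wins += (pick == car)
              return wins / n_trials

          print("stay:  ", play(switch=False))  # comes out near 1/3
          print("switch:", play(switch=True))   # comes out near 2/3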

      • Pat Cahalan in reply to Chris says:

        It’s pretty amazing how often math people get that one wrong.

        I think it goes to show you that mathematicians love applied computational math (diff eq and orbitals and stuff like that) a bunch more than theoretical math (number theory and abstract algebra and stuff like that) and most of them glide through statistics and forget it immediately.

        Stats is a boring subject to a surprising number of mathematicians. The stats guys are usually baseball nuts.Report

    • Fnord in reply to Jaybird says:

      The hundred or million doors version of the problem helped, but that still wasn’t enough to intuitively get the difference between the conventional problem and the “Monty Acts Randomly” problem, where switching doesn’t matter.

      What finally made it make sense to me is that, if Monty is acting randomly and he opens 98 doors (or 999,998 doors, or even just one; the evidence is weaker in that case but the principle is the same) without finding the car, that is real evidence in favor of the door you initially picked: your door’s probability climbs from its tiny starting value all the way up to even odds with the one door he happened to leave closed, which is exactly why switching stops mattering. But if he knows which door the car is behind, and specifically doesn’t choose it (as in the conventional problem), there’s no such evidence, and your door stays stuck at its original probability. Intuitively, it’s obvious in the random case that it doesn’t matter which door you pick. Once I grasped that the random case gives you significant information that the conventional case doesn’t, it made sense that switching becomes a better choice in the conventional case.Report
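
      A quick way to check that is to simulate the random-host variant and keep only the games where Monty happens not to reveal the car. A rough sketch (Python, three doors, my own setup):

          import random

          def random_monty(n_trials=200_000):
              # Monty opens one of the two unchosen doors at random; games where
              # he accidentally reveals the car are thrown out, since the question
              # is conditional on your having seen a goat.
              stay_wins = switch_wins = kept = 0
              for _ in range(n_trials):
                  car = random.randrange(3)
                  pick = random.randrange(3)
                  opened = random.choice([d for d in range(3) if d != pick])
                  if opened == car:
                      continue  # car revealed; this game doesn't count
                  kept += 1
                  other = next(d for d in range(3) if d not in (pick, opened))
                  stay_wins += (pick == car)
                  switch_wins += (other == car)
              return stay_wins / kept, switch_wins / kept

          print(random_monty())  # roughly (0.5, 0.5): switching no longer helps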

      • Kazzy in reply to Fnord says:

        I’m not understanding it any better with more doors. Can someone break it down for me? Thanks.Report

        • Jaybird in reply to Kazzy says:

          Monty Hall is giving you information when he shows you what’s behind the doors.

          I’ll use the 101 doors example because, hey, that’s how I roll.

          You’ve got your hundred-and-one doors in front of you. Pick one. Any one. Got it? Now I’m going to show you 99 doors that were *NOT* the car, leaving one, and only one, other door standing in front of you.

          Out of the 101 doors, what are the odds that you just happened to pick the car?

          1 out of 101, right?

          So what Monty Hall is, essentially, asking you is “would you rather stick with the *ONE* door you just picked… or the 100??? Here, I’ll get rid of 99 goats for you.”Report

          • Jaybird in reply to Jaybird says:

            IF YOU HAVIN GOAT PROBLEMS I FEEL BAD FOR YOU SON
            I GOT 99 PROBLEMS BUT A GOAT AIN’T ONEReport

          • Kazzy in reply to Jaybird says:

            Dude! I get it! Thanks!

            So, imagine the 101 doors, labeled, conveniently, 1 through 101. I choose #1. Monty shows me 2-100. All goats. I can stick with #1 or I can choose #101.

            I think the confusion stems from thinking, “The odds of it being behind #101 are 1/101, the same as being behind #1.” Which, technically, is true, at least at the start. But really the choice I’m being offered is door #1 OR doors #2-101. Do I have that right?Report

            • Jaybird in reply to Kazzy says:

              Yes.

              Paring it down to a choice between door #1 and doors #2 and #3 (and, for the record, #3 sucks, so the choice is really between door #1 and #2) shrinks the numbers enough that it “feels” like a coin flip when, really, switching is a 2 out of 3 chance.Report

        • Chris in reply to Kazzy says:

          OK Kazzy, let’s play a game. Here are one MILLION doors (imagine I said that in my best Dr Evil accent). Behind one of these doors is a fabulous beach getaway! Behind each of the other 999,999 doors is a pile of poop. I want you to pick one of the doors. Go ahead, pick, I’ll wait…
          … … … … … OK, pick already!

          Great, thank you! You’ve selected door #371,619. What is the probability that the door you selected has the beach getaway behind it? One in a million, right? Now, I know which of the doors has the getaway behind it, so what I’m going to do is open all but one of the remaining 999,999 doors and show you a pile of poop behind each of them.

          OK, it’s two and a half weeks later. All of the doors are open, the entire room smells like poop, and you’re probably thinking that no beach getaway is worth all of this, but bear with me. There are two doors remaining: the door you selected, which had a 1 in 1,000,000 chance of being the getaway door, and the one of the remaining 999,999 doors that I didn’t open. Behind one of these two doors is a vacation getaway, and behind the other, a pile of poop. You can either stick with the door you initially chose, or you can switch to the one door you didn’t choose that remains unopened. What do you want to do?

          Is it clear now what you should do? If not, think about it like this. When you chose the 1 door, there was a 1 in a million chance that you chose right. Just because I opened the other doors does not change the probability that you chose correctly. There is still a 1 in a million chance that the door you chose has the beach getaway behind it. There was a 999,999/1,000,000 chance (a probability of 0.999999) that the vacation getaway was behind one of the remaining doors. I opened all but one of those doors. So the probability that the beach getaway is behind the one remaining door is 0.999999 (999,999/1,000,000). The probability associated with all of the remaining doors attaches itself to the one door I didn’t open, because the probability that the door you chose has the getaway doesn’t change.

          If this doesn’t help, think of it like this: imagine we do this several times. If, as most people initially think, the probability that the door you selected has the getaway is .5 (50/50 chance) after I’ve opened all but one of the remaining doors, then that means that you should get it right about half of the time if we run it multiple times. So even though the probability that you’ve chosen the right door is 1 in a million, initially, just by opening 999,998 of the remaining 999,999 doors, I’ve made you right 50% of the time. That’s obviously wrong, right?Report
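
          And if you’d rather let a computer do the repetition, here’s a rough sketch of that experiment (Python, with 1,000 doors standing in for the million so it runs quickly; the numbers are illustrative):

              import random

              def n_door_game(n_doors=1_000, n_trials=20_000):
                  # The host knows where the prize is and opens every other door but one.
                  stay = switch = 0
                  for _ in range(n_trials):
                      prize = random.randrange(n_doors)
                      pick = random.randrange(n_doors)
                      # The host leaves closed: your pick, plus the prize door (or, if
                      # you already picked the prize, one arbitrary other door).
                      other = prize if prize != pick else (pick + 1) % n_doors
                      stay += (pick == prize)
                      switch += (other == prize)
                  return stay / n_trials, switch / n_trials

              print(n_door_game())  # roughly (0.001, 0.999), nowhere near (0.5, 0.5)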

    • Chris in reply to Michael Cain says:

      If it’s any consolation, it looks like it takes the pigeons a few thousand trials to figure it out (here’s the paper), whereas someone just has to tell us how to do it. Yay humans!Report

      • James Hanley in reply to Chris says:

        But if nobody tells us how to do it, we apparently can’t learn from empirical experience as well as pigeons do. That’s a bit depressing, but perhaps that immunity to empirical learning explains why so many students never learn that regular study leads to better grades.Report

        • Chris in reply to James Hanley says:

          I wonder if pigeons can do the Wason Selection Task.Report

          • James Hanley in reply to Chris says:

            Time for a J-Stor search. Maybe after class.Report

            • Chris in reply to James Hanley says:

              Heh… I’m trying to think about how you’d set it up for pigeons. First, they’d have to learn the rule, which, even though it’s a simple one (something like, “If a card has a square on one side, it is blue on the other side”), is already asking a lot of a pigeon. Then you’d give them trials to see if they can learn the minimum number of cards to turn over in order to test the rule. Something tells me no one’s tried this with pigeons, but maybe with primates.Report
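
              For anyone who hasn’t seen the task, here’s the abstract version of that rule worked through in code, just to show why only two of the four cards ever need turning over (a sketch with made-up card faces):

                  # Rule: "If a card has a square on one side, it is blue on the other side."
                  # Four cards on the table show: square, circle, blue, red.  A card needs
                  # to be turned over only if some hidden face could violate the rule.
                  SHAPES, COLORS = {"square", "circle"}, {"blue", "red"}

                  def violates(shape, color):
                      return shape == "square" and color != "blue"

                  def must_flip(visible):
                      if visible in SHAPES:
                          return any(violates(visible, hidden) for hidden in COLORS)
                      return any(violates(hidden, visible) for hidden in SHAPES)

                  for face in ["square", "circle", "blue", "red"]:
                      print(face, must_flip(face))
                  # square True, circle False, blue False, red True: only the square card
                  # and the red card can falsify the rule (the modus tollens step people miss).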

          • Mopey Duns in reply to Chris says:

            I would be amazed if a pigeon could do that, since it is purely ‘if…then’ thinking.

            Honestly, I am actually surprised that Wikipedia says most people have trouble on it. Anyone with even a little formal logic training should be able to suss out the correct answer quickly. Monty Hall is a different kettle of fish; formal training seems to make people worse at it, most of the time.

            The fact that pigeons perform better than humans on Monty Hall just shows that intelligence isn’t always what it’s cracked up to be. Sometimes it pays to be dumb.Report

            • Chris in reply to Mopey Duns says:

              Yeah, I meant it as a joke.

              If I remember correctly, professional philosophers do better at the Wason Selection Task than most people, which does suggest that a little training in logic helps. But in essence, it just shows that while modus ponens seems to come naturally to us, modus tollens has to be beaten into us, at least as an abstract principle (people do just fine with the Wason Selection Task when you make it relevant to them).Report

              • Mopey Duns in reply to Chris says:

                Guess my sarcasm filter is on the fritz.

                It saddens me that professional philosophers don’t perform essentially perfectly.

                I bet if you parsed out philosophers by specialty, you would see a difference though.Report

        • I suppose that the human response to the situation is more likely to be “Food comes out of this gizmo; therefore there must be food inside the gizmo; how do I take the gizmo apart?” Intelligence comes in different flavors…

          I always liked the story about the chimpanzee. Researchers hung a bunch of bananas higher than the chimpanzee could reach or jump. After a while, they provided a pole long enough to reach the bananas; the question they had in mind was, “How long does it take a chimp to figure out it can use the pole to knock down the bananas?” After a while, the chimp stood the pole on end directly beneath the bananas, used its four “hands” to scamper up the pole while keeping it balanced, grabbed the bananas and dropped to the ground.Report

        • Fnord in reply to James Hanley says:

          Allow me to present an alternative hypothesis. The students are there in order to get course credit for a psychology course, and they presumably get it regardless of how well they do at solving the MHP. They’re doing 200 trials of a boring “game”, and the only reward for success is an electronic congratulations message from the computer. What they’re optimizing for is probably not obtaining a maximum score.

          Connecting that to “why so many students never learn that regular study leads to better grades” is left as an exercise for the reader.Report

  4. Fnord says:

    Looking at your contract bridge version, it seems to rely on the fact that, if a player has both Jack and Queen, they pick which one to play randomly. If, for some reason, a player always plays the Jack first out of the Jack-Queen double (and you know that), then you have no additional information. Of course, if you know that, and they play the Queen, then you know with certainty that they don’t have the Jack, which is presumably why sophisticated players pick randomly.

    I suppose the analogous Monty Hall-type problem is if Monty always opens the lowest-numbered door that does not contain a prize. In that case, when the player picks door number 1, if Monty opens door number 2, switching or not makes no difference, whereas if he opens door number 3, switching is always correct.Report
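
    That variant is easy to check numerically, too. A rough sketch under exactly that “always open the lowest-numbered goat door” policy (Python; trial counts are arbitrary):

        import random

        def lowest_door_monty(n_trials=300_000):
            # You always pick door 1; Monty opens the lowest-numbered door that is
            # neither your pick nor the car.  Tally results by which door he opened.
            tallies = {2: [0, 0, 0], 3: [0, 0, 0]}  # opened -> [games, stay wins, switch wins]
            for _ in range(n_trials):
                car = random.randrange(1, 4)
                opened = 2 if car != 2 else 3
                other = 3 if opened == 2 else 2
                t = tallies[opened]
                t[0] += 1
                t[1] += (car == 1)      # staying with door 1 wins
                t[2] += (car == other)  # switching wins
            for d in (2, 3):
                games, stay, switch = tallies[d]
                print(d, round(stay / games, 2), round(switch / games, 2))

        lowest_door_monty()  # door 2 opened: about 0.5 / 0.5; door 3 opened: 0.0 / 1.0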

  5. Jaybird says:

    This makes me wonder if there’s an optimal strategy for Deal or No Deal.

    I think it’s “Always trade suitcases” but I’ve no idea what to do before that.Report

    • Michael Cain in reply to Jaybird says:

      I never watched much, but I always wondered if the person up in the booth knew where the money was. As they came down to the end, sometimes the offers were pretty close to the expected value of choosing one of the remaining suitcases at random (which would be reasonable given no knowledge of where the big prize actually was) and sometimes they were way below the expected value (which would be reasonable if they knew the big prize wasn’t in the contestant’s suitcase, and wanted the player to hold on to the one they had).

      One of those tough situations for the contestant — they get to play the game exactly once. The person in the booth gets to play many times.Report

      • Patrick Cahalan in reply to Michael Cain says:

        Michael’s right; the optimal strategy depends on whether or not the guy in the booth knows anything, and the way they make the offers indicates that maybe they know and maybe they don’t.

        Which leads me to believe that they *do* know, but they have a good statistician who sets the value of the offer based upon a randomizing algorithm that pretends that they don’t always know. Just sometimes.

        Otherwise, a smart guy would have a major advantage in the game: the offer would always be a tell.Report

    • Ramblin' Rod in reply to Jaybird says:

      That show fascinated me when it first came on (and not for the 30 obvious reasons, well at least not JUST for those reasons). I actually worked up a spreadsheet to figure the expected value of the unopened cases. As near as I could tell, the offer was always lower than that number. But I don’t know what, if anything, that means other than they wanted the contestants to “Go On!” because it made for better television.

      BTW, did you know that game show winnings are typically paid from an insurance policy taken out by the producers? So someone’s figured out all the stats.Report
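
      The spreadsheet calculation is easy to redo in a few lines, for what it’s worth. A rough sketch (Python; the amounts left on the board and the below-expected-value offer are hypothetical stand-ins, not the show’s actual figures):

          def expected_value(remaining):
              # Average of the dollar amounts still in play: the risk-neutral value
              # of holding (or blindly swapping) a case.
              return sum(remaining) / len(remaining)

          # Hypothetical endgame: four amounts left on the board.
          remaining = [1, 100, 10_000, 1_000_000]
          fair = expected_value(remaining)
          print(fair)         # 252525.25
          print(0.80 * fair)  # a below-EV offer of the sort the spreadsheet kept finding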

  6. DensityDuck says:

    What’ll really bake your noodle is the Two Envelopes Problem.

    I have two envelopes. One contains n dollars, the other contains 2n. You pick an envelope, but before you open it I give you the opportunity to open the other one instead. Is there a logical proof that you should (or should not) switch, as there is with the Monty Hall Problem?

    *****

    Rather easier is The Missing Dollar. Three men stay at a hotel; the manager charges them thirty dollars, and they each pay ten dollars. He later changes his mind and refunds them five dollars. The busboy, taking the money to the room, thinks “it’s hard to split five dollars three ways, so I’ll just keep two, and pretend that the manager gave them each a dollar back.”

    So each man has paid nine dollars, plus two that the busboy kept; that adds up to twenty-nine dollars. Where did the other dollar go?Report

    • BlaiseP in reply to DensityDuck says:

      Oh shut up. The busboy only refunded three dollars to the men, a dollar each. 30 – 5 + 3 + 2 is still 30 dollars.Report

      • wardsmith in reply to BlaiseP says:

        And yet programmers specifically make the contra mistake: multiplying 9 x 3 = 27, adding 2 to get 29, and getting stuck. Now ask yourself why.Report

        • BlaiseP in reply to wardsmith says:

          A programmer wouldn’t make that mistake. A programmer would store the money in a numeric total. Nowhere in the spec was the programmer asked to multiply anything.Report

        • Fnord in reply to wardsmith says:

          They get hung up on 9 x 3 + 2 because you told them it was important. It’s not a mathematical puzzle, it’s a test of their ability to stop dealing with the stated problem in good faith and say “you’re full of shit and your question is nonsensical.”Report

    • MikeSchilling in reply to DensityDuck says:

      Obviously, there’s no reason either to switch envelopes or not to: you have no information to distinguish one envelope from the other. Is the argument about expectation, like

      I currently have N. There’s a fifty percent chance that the other envelope has N/2 and fifty that it has 2N, so my expectation after I switch is 5N/4. I should switch. OK, now I’m good. Except that that same logic still applies! I should switch back! Please, can I … Oh, thank you. Good, back to the first envelope. Man, good thing that same logic doesn’t still … arrrrrrrrrrrrrgh!

      If so, it’s silly because it completely overvalues the concept of “expectation” when important information is missing.Report
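
      One way to see where that 5N/4 argument slips is to fix an actual pair of amounts in advance and simulate it (a sketch with an arbitrary n of my choosing):

          import random

          def two_envelopes(n=100, n_trials=200_000):
              # One envelope holds n, the other 2n; you grab one at random.
              # Compare always keeping with always switching.
              keep_total = switch_total = 0
              for _ in range(n_trials):
                  envelopes = [n, 2 * n]
                  random.shuffle(envelopes)
                  chosen, other = envelopes
                  keep_total += chosen
                  switch_total += other
              return keep_total / n_trials, switch_total / n_trials

          print(two_envelopes())  # both averages come out near 1.5 * n

      Both strategies average the same 1.5n; the “fifty percent chance the other envelope has N/2 or 2N, whatever N happens to be” step is the part that doesn’t survive once a real pair of amounts is pinned down.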

    • Ramblin' Rod in reply to DensityDuck says:

      It’s the way it’s stated that makes the missing dollar problem perplexing. It suggests the wrong answer and it’s hard to un-suggest it in your head. But really the math is 3 x 9 = 27 = 25 (in the till) + 2 (in the bus-boy’s pocket). This is more of a brain bug than a logic problem.Report