26 March, 2012 — Menlo Park, CA

All my formal statistics training has come from the frequentist perspective, and lately I’ve been taking steps to remedy that.

So it’s not too surprising that, as my mind drifted awake in the shower this morning, I began to think about Monty Hall. Not the person, but the famously counterintuitive problem. That is:

You are presented with three doors: B1, B2, and B3. Behind one of these doors is a car; the other two conceal goats. Monty asks you to choose a door, and says that you will be able to keep whatever is behind your selection. So you make your choice.

Monty then smiles mischievously, and opens one of the doors you did not choose. The door Monty opens will always contain a goat. Now Monty offers you the option to switch doors. Presuming you want a car, what should you do?

Naturally, you should always switch doors. But is that intuitive?
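If you don't trust the logic, the claim is easy to check empirically. Here's a quick Monte Carlo sketch (function and variable names are my own) that plays the game many times under each strategy:

```python
import random

def play(switch, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of wins."""
    wins = 0
    doors = [0, 1, 2]
    for _ in range(trials):
        car = random.choice(doors)
        choice = random.choice(doors)
        # Monty opens a door that is neither your choice nor the car
        opened = random.choice([d for d in doors if d != choice and d != car])
        if switch:
            # Switch to the one remaining unopened door
            choice = next(d for d in doors if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")  # close to 1/3
print(f"switch: {play(switch=True):.3f}")   # close to 2/3
```

Staying wins about a third of the time; switching wins about two thirds.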

Instructors present this problem as a parlor trick to just about every undergraduate who’s ever taken a math class, and so I’d long since internalized the careful logic that leads to an appropriate answer. But today something new occurred to me.

I realized that if I’d been “brought up” as a Bayesian, the Monty Hall problem wouldn’t be counterintuitive at all. On the contrary, the value of the additional information revealed by Monty’s actions should be very clear. Consider:

We have some priors: P(B1), P(B2), P(B3)
B(i) is the event "car behind door B(i)"
P(B(i)) is the prior probability that the car is behind door B(i)
A reasonable assignment of priors is: P(B1) = P(B2) = P(B3) = 1/3
Let's say, without loss of generality, that we choose door B1.
Again, without loss of generality, say Monty opens door B2. 
Call this event A.
So we want to know the posterior: P(B1|A)
(the probability that the car is behind B1 given that A happened)
By Bayes' theorem:
P(B1|A) = P(A|B1) * P(B1) / 
                (P(A|B1) * P(B1) + P(A|B2) * P(B2) + P(A|B3) * P(B3))
Fill in the priors:
P(B1|A) = P(A|B1) * (1/3) / 
                (1/3)*(P(A|B1) + P(A|B2) + P(A|B3))
Now for the likelihoods:
P(A|B1) = 1/2 (if the car is behind B1, Monty can open either B2 or B3)
P(A|B2) = 0 (Monty will not open door B2 if the car is in B2)
P(A|B3) = 1 (can't open B1 (since you chose it) or B3 (it hides the car))
Fill in the likelihoods:
P(B1|A) = (1/2) * (1/3) / 
                (1/3)*((1/2) + 0 + 1)
So: P(B1|A) = (1/6) / (1/2) = 2/6 = 1/3
We also know P(B2|A) = 0
Obviously then, P(B3|A) = 2/3 and we should switch doors.
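The whole update fits in a few lines of Python. This is just the arithmetic above restated (door labels match the derivation; exact fractions keep the numbers transparent):

```python
from fractions import Fraction

# Priors: the car is equally likely behind each door
prior = {"B1": Fraction(1, 3), "B2": Fraction(1, 3), "B3": Fraction(1, 3)}

# Likelihoods P(A | B_i): the probability that Monty opens B2,
# given that we chose B1 and the car is behind B_i
likelihood = {"B1": Fraction(1, 2), "B2": Fraction(0), "B3": Fraction(1)}

# P(A) = sum of likelihood * prior over all doors
evidence = sum(likelihood[d] * prior[d] for d in prior)

# Posterior P(B_i | A) by Bayes' theorem
posterior = {d: likelihood[d] * prior[d] / evidence for d in prior}

for door, p in posterior.items():
    print(door, p)  # B1 1/3, B2 0, B3 2/3
```

The posterior comes out to exactly 1/3, 0, and 2/3, matching the derivation.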

Although writing all this out looks rather messy, the intuition is very straightforward: Monty gives you information, and you can use that information to update your beliefs.

You can improve your model of the world, as justified by the data. And that’s what Bayesian reasoning is about.