All right, to work through another example here, let's take a particular case of people being audited by the IRS. So in this example, let's suppose we've got an accountant with three clients who each filed estate returns of more than $5 million, and there's a 50% chance that each of these clients is going to be audited. Well, what are the chances that all three of them are going to be audited? That's client one and client two and client three all audited. A 50% chance for client one, a 50% chance for client two, and a 50% chance for client three; multiplying those together, the joint probability of all three being audited turns out to be 12.5%. All right, what about none of them being audited? That's the probability that client one is not audited, multiplied by the probability that client two is not audited, multiplied by the probability that client three is not audited. It actually turns out to be the same 12.5%, but only because we're dealing with a 50% probability here. And then how likely is it that at least one of these clients is audited? We're not saying exactly one is audited, we're saying at least one: it could be one, it could be two, it could be all three of them. Well, we can use the complement rule here: the complement of at least one of them being audited is that none of them are audited. So 1 minus 12.5% gives us the 87.5%. The big assumption that we're making here is that these observations are independent of each other. If we have reason to believe that the likelihood that client one is audited is linked to the likelihood that client two is audited, this breaks down, and we're going to have to use the more general multiplication rule.
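The arithmetic above can be sketched in a few lines of Python; the 50% figure is from the example, and the variable names are my own:

```python
# Probability that each of three independent clients is audited
p_audit = 0.5

# Multiplication rule for independent events: all three audited
p_all = p_audit ** 3          # 0.5 * 0.5 * 0.5 = 0.125

# All three NOT audited (same value here only because p = 0.5)
p_none = (1 - p_audit) ** 3   # 0.125

# Complement rule: at least one audited = 1 - P(none audited)
p_at_least_one = 1 - p_none   # 0.875

print(p_all, p_none, p_at_least_one)
```

Changing `p_audit` to anything other than 0.5 shows that "all audited" and "none audited" only coincide in this special case.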
Again, as another example of where these addition and multiplication rules come into play, let's look at service plans. Retailers like to offer them: you buy a TV, you should buy a service plan to make sure it's covered in case anything goes wrong. So let's suppose that 20% of LCD TVs need service once, 10% need service twice, and 5% have to be serviced three times or more. All right, what's the probability that your TV never needs to be serviced? It means it's not serviced once, it's not serviced twice, and it's not serviced three times or more. Since those three events are mutually exclusive, the complement rule gives us one minus the sum of those probabilities. What about the probability of your television requiring at least two service calls? That means it could need two service calls, or it could need three or more, and that "or" is the tip-off for the addition rule. So the 10% chance of needing two service calls plus the 5% chance of needing three or more gives us a 15% chance of needing at least two service calls. Now, in terms of pricing out this warranty plan, and we'll look at this as an example a little bit later on in the course, what else do you need to know? This tells you how frequently televisions are going to need to be repaired; you're probably also going to need some information about what it costs to do the repairs. From the customer's standpoint, to make the decision of whether to pay for the warranty plan, I probably need some price information: how much is it going to cost me to pay for this plan versus how much would it cost me just to go out and get a new TV? All right, so let's go over to Vegas for a little bit. If we think about having a pair of standard dice, what I've put in this table are the possible ways of rolling different combinations, everywhere from rolling snake eyes, a two, all the way up to rolling a 12, a six and a six.
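Here's a minimal Python sketch of the two service-plan calculations, using the percentages from the example:

```python
# Service-call probabilities from the example (mutually exclusive events)
p_once = 0.20        # serviced exactly once
p_twice = 0.10       # serviced exactly twice
p_three_plus = 0.05  # serviced three times or more

# Complement rule: never serviced = 1 - P(serviced at least once)
p_never = 1 - (p_once + p_twice + p_three_plus)   # 0.65

# Addition rule for mutually exclusive events: at least two calls
p_at_least_two = p_twice + p_three_plus           # 0.15

print(p_never, p_at_least_two)
```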
So in the middle of this table are the possible pairs that I've enumerated: all the different ways you could roll a particular sum. With a pair of dice there are 36 possible rolls, so our probabilities are all based out of 36: how many different pairs add up to a particular number, divided by 36, gives me the probabilities. If you're playing a game like craps, this is how we'd go about calculating the likelihood that you'd win on a particular roll. Now, if we look at this table, what I'm going to look at for a second is the lower corner: these one-roll bets. Rolling snake eyes pays out 30 to 1 for every dollar bet, and rolling two sixes also pays out 30 to 1. Is that a good bet for you? Or, to look at it a different way, where does the casino, where does the house, make its money? Well, that's where we've got to look at the idea of the expected value, or the expected payout. Mathematically, your expected payout is the probability of each event occurring multiplied by the payout associated with that event, added up over all the possible outcomes. So if we again go to the lower corner of the table, rolling snake eyes: the chance of that happening is one out of 36, but they're only paying you $30 for every dollar that you bet. That difference, the payout not matching the odds, is where the casinos are making their money, because once you take your bet into account, your expected payout is negative: you're actually going to be losing money on that one. Yes, it's paying out 30 to 1, but the chances of getting it are only one out of 36. As an exercise, you may want to go through the field-bet area and take a look at the probabilities of rolling those different numbers.
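The snake-eyes calculation can be checked in Python; this is a sketch of the expected-profit idea (enumerating all 36 rolls and then applying the 30-to-1 payout from the example):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Enumerate all 36 equally likely rolls of a pair of dice
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
p_snake_eyes = Fraction(counts[2], 36)   # one way to roll a 2, so 1/36

# One-roll bet paying 30-to-1: win $30 with prob 1/36, lose the $1 stake otherwise
expected_profit = p_snake_eyes * 30 - (1 - p_snake_eyes) * 1
print(expected_profit)   # -5/36, about -$0.14 lost per $1 bet
```

If the payout matched the true odds (35 to 1), the expected profit would be exactly zero; the gap between 30 and 35 is the house's edge.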
What's the payout associated with rolling these numbers, and am I making any money off of that bet? Another game where we can calculate expected payouts relatively easily is roulette. In roulette we've got red and black, numbers one through 36, plus zero and double zero. If we take a look at the roulette table and you bet on any one of these columns (in this layout they look like rows) containing 12 numbers, the payout is 2 to 1 for every dollar you bet. On the surface that doesn't seem bad, but your chances aren't 12 out of 36; they're actually 12 out of 38 because of the 0 and 00. Likewise, if you were to bet red or black, or one through 18 versus 19 through 36, it's close to a 50-50 coin flip in terms of whether you're going to win, but the odds are actually in the house's favor because the probabilities aren't based on the numbers one through 36, they're based on 38 numbers once you factor in zero and double zero. That's what tilts things in the casino's favor. All right, so let's go back to this question of investing in our customers. How do we make this decision of how much we should expect a new customer to be worth, and how much we should be willing to invest in that customer? Well, if we go through our expected payout calculation, it's actually the average customer lifetime value, our expectation of what a customer is going to be worth to us, that gives us the upper bound. In this case that average is around $1,200, and that gives us the upper bound for how much we should be willing to spend in terms of managing the relationship. Now obviously, we're going to want to leave room so that we're profiting off of these relationships, but it does establish that upper bound for us.
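The roulette expected values above work out the same way; here's a short sketch using the American wheel described in the example:

```python
from fractions import Fraction

# American roulette wheel: numbers 1-36 plus 0 and 00, so 38 pockets in all
POCKETS = 38

# Column bet: 12 numbers, pays 2-to-1
p_column = Fraction(12, POCKETS)
ev_column = p_column * 2 - (1 - p_column) * 1   # expected profit per $1 bet

# Red/black (or 1-18 vs. 19-36): 18 numbers, pays 1-to-1
p_red = Fraction(18, POCKETS)
ev_red = p_red * 1 - (1 - p_red) * 1

print(ev_column, ev_red)   # both -1/19, about -$0.053 lost per $1 bet
```

Both bets have the same expected loss, which is exactly the effect of basing payouts on 36 numbers while the wheel actually has 38.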
So let me go through an example where, again, we're going to apply everything we've talked about so far in terms of the probability rules. This is referred to as the Monty Hall problem, popularized by game shows; if you've seen the movie 21, about the MIT card counters going to Vegas, there's a scene with the actor Kevin Spacey that popularized this a little bit more. Let me walk through the basics of the game. The Monty Hall problem is a simple game: there are three doors, behind two of those doors are goats, and behind one of those doors is a new car. You as a contestant don't know what's behind any one of these doors, so you're going to have to make your best guess as to which door you want to select: door 1, door 2, or door 3. Let's say you select door number two. Once you've selected a door, the game show host is going to reveal one of the goats; in this case, the host tells you there's a goat behind door number one. You now have the option of changing the door that you've selected: okay, I've selected door number two, now I know there's a goat behind door number one, should I change over to door number three? Does it help me? It's tempting to say that, no, it doesn't make any difference to me: I've still got two doors, it's a coin flip. So that's one argument. But let's work through what happens based on the information that we've been provided. The way that we're going to formulate this problem is to ask: how likely is it that I win by changing the door that I initially chose? Well, there are two ways that you could potentially win by changing. One possibility is, you guessed right initially and you win on the change. The other possibility is, you guessed wrong initially and you win on the change. All right, so we've just decoupled this based on whether I guessed right or wrong initially. Let's break that down a little bit further; we're going to use the multiplication rule here.
So what's the probability that I guessed right initially and I win on a change? Well, it's the probability that I win on a change given that I guessed right initially, multiplied by the probability that I guessed right initially. If we start filling in some numbers: if I guessed right initially and I changed, I now lose, so that first probability is zero. And what's the probability that I guessed right initially? Three doors, only one chance for me to guess right, so that's 1/3. All right, let's look at it the other way. What's the probability that I win on a change given that I guessed wrong initially? Well, if I change, the other goat's been revealed and only the car remains, so there's a 100% chance I win. And the probability of selecting a goat initially is 2/3. So if we work through the math, the probability of me winning when I change my initial selection is 0 × 1/3 + 1 × 2/3, which is two-thirds. All we've done is populate the appropriate figures here. Now we can go through very similar math and ask: what's the probability that I win by not changing? Well, how do I do that? The only way is that I guessed right initially and win on not changing, and that only happens the one-third of the time that I guess right initially. Because if I guessed wrong initially and I don't change, then I lose, and I guess wrong initially two-thirds of the time. So the probability of winning on a change is two-thirds, and the probability of winning without making a change is one-third. You're actually better off taking advantage of the information that's provided to you and making that change; you're actually twice as likely to win. All right, so why does this matter? Well, when we're evaluating the quality of information, if we're looking at analyst predictions, if we're making an investment in acquiring additional information, how valuable is that information going to be to me?
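The two-thirds result can also be checked by simulation. Here's a short Python sketch (not from the lecture; the function and variable names are my own):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One play of the game; returns True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Host opens a door that is neither the contestant's pick nor the car
    opened = next(d for d in doors if d != pick and d != car)
    if switch:
        # Switch to the one remaining unopened door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

random.seed(0)  # fixed seed so the run is reproducible
trials = 100_000
wins_switch = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
wins_stay = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
print(wins_switch, wins_stay)   # close to 2/3 vs. close to 1/3
```

Over many trials, switching wins about twice as often as staying, matching the calculation above.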
It's going to depend on how reliable the information is, and it's also going to depend on how much that information costs me to acquire.