Let me remind you once more of the concept of independence.
Independence means that the occurrence of one event does not affect
the chances of another event occurring.
That was: the probability of A is equal to the probability of A given B, P(A) = P(A|B).
If that's not the case, if they're unequal, we say the events are dependent.
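To make that definition concrete, here is a minimal Python sketch (my own illustration, not from the lecture): it enumerates two fair dice and compares P(A) with P(A|B) for one pair of events that happens to be independent.

```python
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of rolling two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

A = {o for o in outcomes if o[0] == 6}         # event A: first die shows a 6
B = {o for o in outcomes if o[0] + o[1] == 7}  # event B: the two dice sum to 7

p_A = Fraction(len(A), len(outcomes))          # P(A) = 6/36 = 1/6
p_A_given_B = Fraction(len(A & B), len(B))     # P(A|B) = 1/6

print(p_A, p_A_given_B, p_A == p_A_given_B)    # 1/6 1/6 True -> independent
```

Since P(A) equals P(A|B) here, these two events are independent by the definition above.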
Now what happens if we take our multiplication rule and
assume that A and B are independent?
In that case,
the conditional probability of A given B is just the original probability of A.
Now replace the conditional probability in that general multiplication rule and
you get a specialized multiplication rule, and that's the multiplication rule I
showed you in the previous module when we talked about independent events.
So the probability of A intersection B is equal to P(A) times P(B): P(A ∩ B) = P(A) × P(B).
Look, that's much easier than the general multiplication rule.
That's why people like the independence assumption and this rule.
Let's look at an example where we can easily use it.
Let's say you play a dice game with three dice.
What's the probability of three ones?
A one on the first roll, a one on the second roll, and a one on the third roll.
So you want the probability of a one, and a one, and a one.
Rolling three dice, they are independent, so I'm allowed to multiply the probabilities:
one-sixth times one-sixth times one-sixth is one over two hundred sixteen, and
there is nothing special about one, one, one.
I can ask you, what's the probability of first a one, then a three, then a five?
Same math.
So you see, even rather large, complex events like 3 numbers in a row:
you can do this for 20 numbers in a row, for 200 numbers in a row.
Suddenly, this gets very, very easy under the assumption of independence.
We can just multiply their probabilities.
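If you want to see this on a computer, here is a minimal Python sketch (my own illustration, not from the lecture): it computes (1/6)^3 exactly and checks both three ones and one-three-five by simulation.

```python
import random
from fractions import Fraction

# Exact answer under independence: P(a one, then a one, then a one) = (1/6)^3.
p_exact = Fraction(1, 6) ** 3
print(p_exact)                      # 1/216, about 0.00463

# Monte Carlo check, seeded for reproducibility.
random.seed(42)
trials = 1_000_000
hits_111 = 0
hits_135 = 0
for _ in range(trials):
    dice = (random.randint(1, 6), random.randint(1, 6), random.randint(1, 6))
    hits_111 += dice == (1, 1, 1)
    hits_135 += dice == (1, 3, 5)   # "nothing special about one, one, one"
print(hits_111 / trials, hits_135 / trials)  # both close to 1/216
```

Both relative frequencies come out close to 1/216, just as the multiplication rule predicts.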
As simple as this is, we have to be careful in real-world applications.
There, we often have to ask ourselves: is that assumption reasonable?
Can we really assume independence?
If yes, great for you:
we can use the independence multiplication rule.
If however, the answer is no, you are not allowed to use this rule.
You may get into real trouble.
And later on in this course, I will show you some devastating applications
where people assumed independence and terrible real world things happened.
Here now, I want to give you a very simple example.
Let's say, you have a machine in an assembly line and
that machine carries a heavy load.
And as a result, it breaks down on average in one out of ten days.
On nine out of ten days, it can handle the workload and it works fine.
So if we use this historical data now and
probability concept number two, the empirical definition,
we can now say the probability of a good working day is 0.9,
of a breakdown 0.1. And here's now the question.
What is the probability that this machine works two days in a row?
So if I don't know anything else,
I would have to say: use the general multiplication rule.
The probability of working well on the first day and
working well on the second day.
That is, the probability of working well on the first day times the conditional
probability of working well on the second day,
given it worked well on the first day.
One probability I know: P(working well on the first day) is 0.9.
That's from my data, but what's the conditional probability?
Now, I'm in trouble.
So now, I would love to assume independence.
If I have independence, then I can use the simpler rule for
independent events at the bottom of the slide:
0.9 times 0.9, and 0.9 squared is 0.81.
But if I cannot assume independence, then I need more data.
So now, which is it?
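To close, here is a minimal Python sketch of this machine example (my own illustration; the 0.95 conditional probability below is a made-up value, not from the data, just to show how the general rule can disagree with the independence answer).

```python
# Historical data from the example: the machine works on 9 out of 10 days.
p_work = 0.9

# Independence assumption: P(works two days in a row) = P(work) * P(work).
p_two_days_indep = p_work * p_work
print(p_two_days_indep)        # 0.81

# General multiplication rule: P(day1 and day2) = P(day1) * P(day2 | day1).
# Without independence we need P(day2 | day1) from data; 0.95 here is a
# purely hypothetical stand-in.
p_work_given_worked = 0.95
p_two_days_general = p_work * p_work_given_worked
print(p_two_days_general)      # 0.855, not 0.81
```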