Hi and welcome back. In this video, we're going to continue our study of discrete random variables. Specifically, we're going to study Bernoulli and geometric random variables. Discrete random variables can be categorized into many different types or classes, and each class models many different real-world situations. These groupings occur so frequently that they get names.

The first one is the Bernoulli random variable, sometimes called a binary random variable. It's any random variable that has only two possible outcomes: zero or one, false or true, failure or success. Just two outcomes. The probability mass function is given by P(X = 1) = p, where p is the success probability, and consequently P(X = 0) = 1 - p.

We can also construct the cumulative distribution function, F(x), and recall that's the same as the probability that X is less than or equal to little x. So F(x) = 0 if x < 0, F(x) = 1 - p if 0 <= x < 1, and F(x) = 1 if x >= 1. Graphically, the CDF is a step function: it's zero all the way up to x = 0, then it jumps to 1 - p and stays there until you get to x = 1, and thereafter it's always 1.

Notationally, we write X ~ Bernoulli(p). The little tilde symbol means "has the distribution of," so X has the distribution of a Bernoulli random variable with success probability p. When we write this, we mean all of the above: the probability mass function and the cumulative distribution function are all encapsulated in this one little symbol.

Now let's talk about the geometric random variable, and I want to start with a motivating example. Suppose a patient needs a kidney transplant and is waiting for a matching donor. Successive prospective donors are tested until a match is found. The probability that a randomly selected person is a match is p. What's the sample space? What's an appropriate random variable? What's the probability mass function? How can we organize this motivating example into a probabilistic situation?

Let's start with the sample space. If the first person is a match, the outcome is 1. If the first person is not a match but the second person is, the outcome is 01, and so on, forever, because we don't know in advance when we'll stop. The most reasonable random variable is X, the number of donors tested until a match is found, so X can be 1, 2, 3, and so on.

Now we can calculate the probability mass function. P(X = 1) = p: the very first person tested is a match. P(X = 2) = (1 - p)p: a non-match for the first person, then a match for the second. P(X = 3) = (1 - p)^2 p: two non-matches, then the third person tested is a match. Continuing in this manner, P(X = k) = (1 - p)^(k-1) p: the first k - 1 people tested are not matches, and the kth person is a match. This holds for k = 1, 2, 3, and so on, and it's our probability mass function for the geometric random variable. You might recognize each of these probabilities as a term in a geometric series; that's where the name "geometric random variable" comes from.
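To make these formulas concrete, here's a minimal Python sketch. This isn't part of the lecture, and the function names are just illustrative; it simply encodes the Bernoulli PMF and CDF and the geometric PMF exactly as written above.

```python
def bernoulli_pmf(x, p):
    # P(X = 1) = p and P(X = 0) = 1 - p; any other value has probability 0.
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0

def bernoulli_cdf(x, p):
    # F(x) = P(X <= x): 0 for x < 0, 1 - p for 0 <= x < 1, 1 for x >= 1.
    if x < 0:
        return 0.0
    if x < 1:
        return 1 - p
    return 1.0

def geometric_pmf(k, p):
    # P(X = k) = (1 - p)^(k - 1) * p for k = 1, 2, 3, ...:
    # the first k - 1 trials are failures, the kth is the first success.
    if k < 1:
        return 0.0
    return (1 - p) ** (k - 1) * p

# Example with p = 0.3: the chance the third donor tested is the first match
# is (0.7)^2 * 0.3 = 0.147.
print(geometric_pmf(3, 0.3))  # ~0.147
```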
So what I'd like to do now is a short review of geometric series. What does a geometric series look like? It's usually written as a + ar + ar^2 + ..., which in summation notation is the sum from k = 1 to infinity of a r^(k-1). We know from calculus that this converges to a / (1 - r) if the absolute value of r is less than one, and it diverges if the absolute value of r is greater than or equal to one.

From the previous slide, the probability mass function for a geometric random variable is P(X = k) = (1 - p)^(k-1) p. Here the p acts as a, and the 1 - p acts as r. What we'd like to do is verify that we indeed have a probability mass function, meaning all the probability over the sample space adds up to one. So we have to verify that the sum from k = 1 to infinity of P(X = k) equals 1. This is the same as the sum from k = 1 to infinity of (1 - p)^(k-1) p, and we note that r = 1 - p is less than 1, so we can use the summation formula from the geometric series. We get p / (1 - (1 - p)) = p / p, and that is indeed equal to 1. So we've verified that we actually have a probability mass function for our random variable.

Let's summarize where we are. A geometric random variable arises from independent Bernoulli trials, each with the same probability of success p, repeated until the first success is obtained. We can break that definition down. Each trial is identical and can result in a success or a failure. The probability of success, p, is constant from one trial to the next. The trials are independent, so the outcome of any particular trial does not influence the outcome of any other trial. And the trials are repeated until the first success. If any one of these four criteria is violated, then we don't have a geometric random variable; we have some other kind of random variable, and sometimes we'll have to study those separately.

To summarize: the sample space for a geometric random variable is S = {1, 01, 001, ...}; the probability mass function is P(X = k) = (1 - p)^(k-1) p for k = 1, 2, 3, and so on; and our notation is X ~ Geometric(p), meaning X has the distribution of a geometric random variable with success probability p. This notation encapsulates the definition of a geometric random variable and what we understand the probability mass function to be. Before we wrap up, let's make the series argument and the trial-by-trial definition concrete with the short sketch below.
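Here's another short Python sketch, again my own illustration rather than anything from the lecture (the name sample_geometric is made up). It sums the PMF numerically, showing the partial sums approach 1 as the geometric series predicts, and simulates Bernoulli trials until the first success to show the empirical frequencies track the PMF.

```python
import random

def sample_geometric(p, rng=random):
    # Run independent Bernoulli(p) trials until the first success;
    # return the number of trials needed (1, 2, 3, ...).
    k = 1
    while rng.random() >= p:  # rng.random() < p counts as a success
        k += 1
    return k

p = 0.3

# Partial sums of (1 - p)^(k - 1) * p approach p / (1 - (1 - p)) = 1.
partial = sum((1 - p) ** (k - 1) * p for k in range(1, 200))
print(partial)  # very close to 1.0

# Empirical check: simulated frequencies should track the PMF.
n = 100_000
counts = {}
for _ in range(n):
    k = sample_geometric(p)
    counts[k] = counts.get(k, 0) + 1
for k in range(1, 6):
    print(k, counts.get(k, 0) / n, (1 - p) ** (k - 1) * p)
```

In the next video, we'll continue our study of random variables, and we'll talk about the binomial random variable. See you then.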