When we talked about discrete random variables,
we said that the way we're going to work with probability for
them is to assign a probability to every value that they can take.
So why don't we just turn that assignment into a function?
We'll call it the probability mass function, or PMF.
And this is simply the function that takes any value that a discrete random variable
can take, and assigns the probability that it takes that specific value.
So a PMF for a die roll would assign one sixth to the value one,
one sixth to the value two, one sixth to the value three, and so on.
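To make that concrete, here's a minimal sketch in Python (not from the lecture itself, just an illustration) of the die-roll PMF written as a function:

```python
# A minimal sketch of the die-roll PMF: each of the six faces
# gets probability 1/6, and anything else gets probability 0.
def die_pmf(x):
    return 1 / 6 if x in {1, 2, 3, 4, 5, 6} else 0.0

print(die_pmf(3))  # 0.16666..., one sixth
print(die_pmf(7))  # 0.0, since a die cannot show a 7
```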
Now, you can write down rules that a PMF must satisfy in order to be
consistent with the basic rules of probability that we outlined at the beginning of the class.
First, it must always be larger than or
equal to zero, because we've already seen that a probability has to
be a number between zero and one, inclusive.
Second, the probabilities across all the possible values that the random variable can
take have to add up to one, just like if I add up the probability that the die takes
the value one, plus the probability that it takes the value two,
plus the probability that it takes the value three, and likewise for four, five,
and six, that has to add up to one.
Otherwise, the probability of something happening,
that the die takes one of its possible values, would not be one,
which would violate one of our basic tenets of probability.
So all a PMF has to satisfy is these two rules.
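Written out formally (a standard formulation, with p denoting the PMF and x ranging over the values the variable can take), the two rules are:

```latex
p(x) \ge 0 \ \text{for all } x,
\qquad
\sum_{x} p(x) = 1.
```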
We won't worry too much about these rules.
Instead, we will work with probability mass functions that
are particularly useful, like the binomial one, the canonical one for
flipping a coin, and the Poisson one, the canonical one for modelling counts.
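For concreteness, here's a short sketch evaluating those two PMFs (this assumes scipy is available; the parameter values are purely illustrative, not anything fixed by the lecture):

```python
# Evaluating the binomial and Poisson PMFs with scipy
# (the parameter choices below are just for illustration).
from scipy.stats import binom, poisson

# Probability of exactly 3 heads in 10 flips of a fair coin
print(binom.pmf(3, n=10, p=0.5))

# Probability of counting exactly 2 events when the rate is 4
print(poisson.pmf(2, mu=4))

# The binomial PMF sums to one over its support, as the rules require
print(sum(binom.pmf(k, n=10, p=0.5) for k in range(11)))
```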