Discrete Random Variables
Introduction
Given a sample space $\Omega$, a random variable $X$ with values in some set $E$ is a function:

$$X : \Omega \to E$$

where $E$ is usually $\mathbb{R}$.
Probability mass function
Each value of a random variable has its own probability of occurring. Think of the random variable taking a particular value as an event. We calculate this probability using a formula called the probability mass function:

$$p_X(x) = P(X = x)$$
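As a small illustration, the Python sketch below stores the PMF of a fair six-sided die as a dictionary mapping each value to its probability (the die and the name `pmf` are just examples, not anything from the lecture):

```python
from fractions import Fraction

# PMF of a fair six-sided die: maps each value x to P(X = x).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

assert sum(pmf.values()) == 1   # every PMF must sum to 1
print(pmf[3])                   # P(X = 3) = 1/6
```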
For a full list of discrete distributions, see the next topic.
Stirling’s approximation
Many of the distribution functions involve factorials in some way. These can be approximated using Stirling's approximation, which the lecturer insists is very important:

$$n! \approx \sqrt{2\pi n}\left(\frac{n}{e}\right)^n$$
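As a rough sketch of how close the approximation gets, the snippet below compares $n!$ with Stirling's formula for a few values of $n$ (the function name `stirling` is just illustrative):

```python
import math

def stirling(n: int) -> float:
    # Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    exact = math.factorial(n)
    approx = stirling(n)
    print(f"{n}! = {exact}, Stirling gives {approx:.1f} "
          f"(relative error {abs(approx - exact) / exact:.2%})")
```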
Expected value
This topic has explained how to calculate the probability of individual values of a random variable, but you may also be asked to calculate its expected value. Think of this as a weighted average of the possible values:

$$E[X] = \sum_x x \, p_X(x)$$
If a random variable is given as some function $g(X)$ of another random variable, you can still calculate its expected value:

$$E[g(X)] = \sum_x g(x) \, p_X(x)$$
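A short Python sketch of both formulas, reusing the fair-die PMF from earlier (the choice of die and of $g(x) = x^2$ are only examples):

```python
from fractions import Fraction

# Fair six-sided die again: PMF as a dict of value -> probability.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum of x * p(x): a weighted average of the values.
e_x = sum(x * p for x, p in pmf.items())

# E[g(X)] = sum of g(x) * p(x), here with g(x) = x**2 as an example.
e_x_squared = sum(x**2 * p for x, p in pmf.items())

print(e_x)          # 7/2
print(e_x_squared)  # 91/6
```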
Joint distributions
So far we have examined the possible values of a single random variable. Now we need to think about what to do when we want to know the probability of two random variables taking certain values at the same time, $P(X = x, Y = y)$.
If these two random variables are independent then:

$$P(X = x, Y = y) = P(X = x) \, P(Y = y)$$
Otherwise, we need to look at the marginal distributions, which are essentially the probability mass function of each random variable laid out in a table:

$$p_X(x) = \sum_y P(X = x, Y = y) \qquad p_Y(y) = \sum_x P(X = x, Y = y)$$
For example, let’s take a case where we have 2 red pens, 1 green pen and 1 blue pen. Two pens are picked at random without replacement. Let $X$ be the number of red pens chosen and $Y$ be the number of green pens chosen. We can calculate the marginal distributions as:

| $x$        | 0             | 1             | 2             |
|------------|---------------|---------------|---------------|
| $P(X = x)$ | $\frac{1}{6}$ | $\frac{4}{6}$ | $\frac{1}{6}$ |

| $y$        | 0             | 1             |
|------------|---------------|---------------|
| $P(Y = y)$ | $\frac{3}{6}$ | $\frac{3}{6}$ |
All values in each table must sum to 1.
These can then be joined into a joint distribution table:

| $P(X = x, Y = y)$ | $x = 0$       | $x = 1$       | $x = 2$       |
|-------------------|---------------|---------------|---------------|
| $y = 0$           | $0$           | $\frac{2}{6}$ | $\frac{1}{6}$ |
| $y = 1$           | $\frac{1}{6}$ | $\frac{2}{6}$ | $0$           |
Each row and column must sum to the equivalent value in its respective marginal distribution table.
Some properties of joint distributions include:

- $E[X + Y] = E[X] + E[Y]$
- $E[aX + bY] = a\,E[X] + b\,E[Y]$
- $E[XY] = \sum_x \sum_y x \, y \, P(X = x, Y = y)$

The last of these is the sum of the product of $x$, $y$ and $p$ for each element in the joint table.
You can also use the following to calculate joint distribution table values:

$$P(X = x, Y = y) = P(X = x \mid Y = y) \, P(Y = y)$$

Or rearrange it to calculate conditional probabilities:

$$P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}$$
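The Python sketch below rebuilds the pen example: it enumerates the six equally likely draws, builds the joint table, sums it into the marginals, and applies the rearranged formula to get a conditional probability (names such as `joint` and `p_y` are just illustrative):

```python
from collections import Counter
from fractions import Fraction
from itertools import combinations

# Pen example: 2 red, 1 green, 1 blue; draw 2 without replacement.
# X = number of red pens drawn, Y = number of green pens drawn.
pens = ["R", "R", "G", "B"]
draws = list(combinations(pens, 2))            # 6 equally likely draws

joint = Counter()                              # (x, y) -> P(X = x, Y = y)
for draw in draws:
    x, y = draw.count("R"), draw.count("G")
    joint[(x, y)] += Fraction(1, len(draws))

# Marginals: sum the joint table over the other variable.
p_x, p_y = Counter(), Counter()
for (x, y), p in joint.items():
    p_x[x] += p
    p_y[y] += p

# Conditional probability via P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y).
print(joint[(1, 1)] / p_y[1])                  # P(X = 1 | Y = 1) = 2/3
```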
Variance & covariance
The variance of a discrete random variable is a measure of spread: it describes the degree to which the values of the random variable differ from its expected value. It can be calculated using:

$$\operatorname{Var}(X) = E\!\left[(X - E[X])^2\right] = E[X^2] - E[X]^2$$
The covariance, on the other hand, measures how much two random variables $X$ and $Y$ vary together. It can be calculated using:

$$\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$$
As a side note, the standard deviation can be calculated by:

$$\sigma_X = \sqrt{\operatorname{Var}(X)}$$
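Continuing the pen example, this sketch computes $E[X]$, $E[Y]$, $E[X^2]$ and $E[XY]$ from the joint table and plugs them into the formulas above (the hard-coded `joint` dictionary is the table derived earlier):

```python
from fractions import Fraction
from math import sqrt

# Joint table for the pen example, (x, y) -> P(X = x, Y = y).
joint = {(0, 1): Fraction(1, 6), (1, 0): Fraction(2, 6),
         (1, 1): Fraction(2, 6), (2, 0): Fraction(1, 6)}

e_x  = sum(x * p for (x, y), p in joint.items())      # E[X]   = 1
e_y  = sum(y * p for (x, y), p in joint.items())      # E[Y]   = 1/2
e_x2 = sum(x * x * p for (x, y), p in joint.items())  # E[X^2] = 4/3
e_xy = sum(x * y * p for (x, y), p in joint.items())  # E[XY]  = 1/3

var_x  = e_x2 - e_x**2          # Var(X)    = E[X^2] - E[X]^2  = 1/3
cov_xy = e_xy - e_x * e_y       # Cov(X, Y) = E[XY] - E[X]E[Y] = -1/6
std_x  = sqrt(var_x)            # standard deviation of X ~ 0.577

print(var_x, cov_xy, std_x)
```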
Independent variables
By definition, two random variables $X$ and $Y$ are independent if, for all $x$ and $y$:

$$P(X = x, Y = y) = P(X = x)\,P(Y = y)$$
An implication of this, using the identities in the section above, is:

$$E[XY] = E[X]\,E[Y] \quad \text{and hence} \quad \operatorname{Cov}(X, Y) = 0$$
Be aware that this statement only works in one logical direction: covariance being zero does not imply independence. However, by contraposition, you can prove that two random variables are not independent if:

$$\operatorname{Cov}(X, Y) \neq 0$$
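For the pen example, the covariance worked out above is $-\frac{1}{6} \neq 0$, so $X$ and $Y$ cannot be independent; the sketch below confirms this by checking whether the joint table factorises into the product of the marginals:

```python
from fractions import Fraction

# Joint table and marginals for the pen example.
joint = {(0, 1): Fraction(1, 6), (1, 0): Fraction(2, 6),
         (1, 1): Fraction(2, 6), (2, 0): Fraction(1, 6)}
p_x = {0: Fraction(1, 6), 1: Fraction(4, 6), 2: Fraction(1, 6)}
p_y = {0: Fraction(3, 6), 1: Fraction(3, 6)}

# Independence requires P(X = x, Y = y) = P(X = x) P(Y = y) for every pair.
independent = all(joint.get((x, y), Fraction(0)) == p_x[x] * p_y[y]
                  for x in p_x for y in p_y)
print(independent)   # False, consistent with Cov(X, Y) = -1/6 != 0
```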