Probability Revision
Sample space
For any probabilistic experiment or process, the set of all its possible outcomes is called its sample space, usually denoted $S$.
In general, sample spaces need not be finite, and they need not even be countable. In this course, we focus on finite and countable sample spaces. This simplifies the axiomatic treatment needed to do probability theory.
In any repeated situation, such as flipping a coin $n$ times, the sample space is given by:
$$S = \{H, T\} \times \{H, T\} \times \cdots \times \{H, T\} = \{H, T\}^n$$
So the size of the sample space is the number of options in each trial raised to the power of the number of trials; here $|S| = 2^n$.
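As a quick sanity check (my own sketch, not from the notes; the labels `H`/`T` and $n = 3$ are arbitrary), Python's `itertools.product` builds exactly this sample space:

```python
from itertools import product

outcomes = ["H", "T"]  # options in each trial
n = 3                  # number of trials

# The sample space of n repeated trials is the n-fold Cartesian product.
sample_space = list(product(outcomes, repeat=n))

print(len(sample_space))                        # 8
print(len(sample_space) == len(outcomes) ** n)  # True: |S| = 2^3
```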
Probability distributions
A probability distribution over a finite or countable set $S$ is a function:
$$P : S \to [0, 1]$$
such that $\sum_{s \in S} P(s) = 1$.
That is, it assigns a probability to each of the outcomes in the sample space, and these probabilities sum to 1.
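For example, a fair six-sided die has the uniform distribution $P(s) = \frac{1}{6}$ for each $s \in \{1, 2, \dots, 6\}$, and indeed $\sum_{s \in S} P(s) = 6 \cdot \frac{1}{6} = 1$.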
Events
For a countable sample space $S$, an event $E$ is simply a subset of the set of possible outcomes: $E \subseteq S$. Given a probability distribution $P$, we define the probability of the event as $P(E) = \sum_{s \in E} P(s)$.
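A minimal Python sketch of this sum (illustrative only; the die distribution and the event are my own choices):

```python
# Fair six-sided die: uniform distribution over S = {1, ..., 6}.
P = {s: 1 / 6 for s in range(1, 7)}

# Event E: "the roll is even", a subset of the sample space.
E = {2, 4, 6}

# P(E) is the sum of the probabilities of the outcomes in E.
prob_E = sum(P[s] for s in E)
print(prob_E)  # 0.5
```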
Conditional probability
Let $P$ be a probability distribution, and let $A, B$ be two events such that $P(B) > 0$. The conditional probability of $A$ given $B$, denoted $P(A \mid B)$, is defined by:
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
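For example, rolling a fair die, let $A = \{2\}$ and $B = \{2, 4, 6\}$ (the roll is even). Then $P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{1/6}{1/2} = \frac{1}{3}$.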
Independence
Events $A$ and $B$ are called independent if $P(A \cap B) = P(A)\,P(B)$.
This means that if $A$ and $B$ are independent and $P(B) > 0$, then:
$$P(A \mid B) = \frac{P(A)\,P(B)}{P(B)} = P(A)$$
That is, knowing that $B$ occurred tells us nothing about the probability of $A$.
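For example, rolling a fair die, let $A = \{2, 4, 6\}$ and $B = \{1, 2, 3, 4\}$. Then $P(A \cap B) = P(\{2, 4\}) = \frac{1}{3} = \frac{1}{2} \cdot \frac{2}{3} = P(A)\,P(B)$, so $A$ and $B$ are independent, and indeed $P(A \mid B) = \frac{1/3}{2/3} = \frac{1}{2} = P(A)$.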
Bernoulli trials
A Bernoulli trial is a probabilistic experiment that has two possible outcomes: success (with some probability $p$) or failure (with probability $1 - p$).
Binomial distribution
The binomial distribution gives the probability of exactly $k$ successes in $n$ independent Bernoulli trials, each with success probability $p$:
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}$$
Take the PwA distributions sheet into the exam with you.
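For example, the probability of exactly $2$ heads in $3$ fair coin flips is $\binom{3}{2} \left(\frac{1}{2}\right)^2 \left(\frac{1}{2}\right)^1 = \frac{3}{8}$.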
Random variables
A random variable is a function $X : S \to \mathbb{R}$ that assigns a real value to each outcome in a sample space $S$.
We can define a probability distribution over the possible values of a random variable. This is often denoted as:
$$P(X = r) = \sum_{s \in S \,:\, X(s) = r} P(s)$$
The ‘range’ of a random variable is the set of all the possible values it can have.
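A short Python sketch (my own, for illustration) of a random variable as a function on outcomes, and the distribution it induces, using the sum of two dice:

```python
from collections import defaultdict

# Sample space: ordered pairs of two fair die rolls, each with probability 1/36.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = {s: 1 / 36 for s in S}

# Random variable X: the sum of the two rolls (a function S -> R).
def X(s):
    return s[0] + s[1]

# P(X = r) sums P(s) over all outcomes s with X(s) = r.
dist = defaultdict(float)
for s in S:
    dist[X(s)] += P[s]

print(sorted(dist))  # the range of X: [2, 3, ..., 12]
print(dist[7])       # 6/36 ≈ 0.1667
```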
Bayes’ theorem
Extending the conditional probability ideas above: we often know $P(B \mid A)$ but want $P(A \mid B)$, or vice versa. Bayes' theorem converts between the two:
$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$
Which form you apply depends on the information you are given.
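For example, rolling a fair die, let $A = \{6\}$ and $B = \{2, 4, 6\}$. We know $P(B \mid A) = 1$, $P(A) = \frac{1}{6}$, and $P(B) = \frac{1}{2}$, so $P(A \mid B) = \frac{1 \cdot 1/6}{1/2} = \frac{1}{3}$.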
Expected Values
The expected value, or expectation, of a random variable is defined by:
$$E(X) = \sum_{s \in S} P(s)\,X(s)$$
where $P$ is the underlying probability distribution and $X(s)$ is the value assigned to the outcome $s$ by the random variable.
For example, let $X$ be the output value of rolling a fair six-sided die. Then the expected value of $X$ is given by:
$$E(X) = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = \frac{21}{6} = 3.5$$
However, this method is not always feasible: the sum may have thousands of terms, and remember that you don't get a calculator in this exam. So instead we can use the following standard results:
The expected number of successes in $n$ independent Bernoulli trials, with probability $p$ of success in each, is $np$. This is the expectation of a binomial distribution.
The expected number of trials needed to obtain a success, with probability $p$ of success in each, is $\frac{1}{p}$. This is the expectation of a geometric distribution.
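A quick Monte Carlo sanity check of both results (my own sketch, not from the notes; `n`, `p`, and the number of runs are arbitrary):

```python
import random

random.seed(0)
n, p, runs = 20, 0.3, 100_000

# Mean number of successes in n Bernoulli(p) trials: should approach n*p = 6.
mean_successes = sum(
    sum(random.random() < p for _ in range(n)) for _ in range(runs)
) / runs
print(mean_successes)  # ~6.0

# Mean number of trials until the first success: should approach 1/p ≈ 3.33.
def trials_until_success():
    t = 1
    while random.random() >= p:
        t += 1
    return t

print(sum(trials_until_success() for _ in range(runs)) / runs)  # ~3.33
```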
Linearity of expectation
Apparently, this is very important.
For any random variables $X_1, X_2, \dots, X_n$ on $S$:
$$E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n)$$
Furthermore, for any $a, b \in \mathbb{R}$:
$$E(aX + b) = a\,E(X) + b$$
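For example, let $X$ be the number of heads in $n$ fair coin flips, and write $X = X_1 + \cdots + X_n$, where $X_i$ is $1$ if the $i$-th flip is heads and $0$ otherwise. Each $E(X_i) = \frac{1}{2}$, so by linearity $E(X) = \frac{n}{2}$, with no need to sum over all $2^n$ outcomes. (The $X_i$ do not even need to be independent for this to work.)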
Independent RVs
Two random variables $X$ and $Y$ are called independent if, for all $r_1, r_2$:
$$P(X = r_1 \text{ and } Y = r_2) = P(X = r_1) \cdot P(Y = r_2)$$
If $X$ and $Y$ are independent random variables on the same space $S$, then:
$$E(XY) = E(X)\,E(Y)$$
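For example, if $X$ and $Y$ are the values of two independent fair die rolls, then $E(XY) = E(X)\,E(Y) = 3.5 \times 3.5 = 12.25$.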