Continuous Random Variables
Introduction
Where discrete random variables take values at individual points, continuous random variables take values over an interval. This allows us to calculate the probability of a random variable lying between two values.
Consider an example where $X$ can be any real number between 0 and 1; we can then ask for probabilities such as $P(a \le X \le b)$ for $0 \le a \le b \le 1$.
Cumulative distribution
Similar to a probability mass function, when examining the probability that $X$ is greater than or less than a given value, we use a distribution function called the cumulative distribution:
$$F(x) = P(X \le x)$$
Combining everything above, we can now see that:
$$P(a < X \le b) = F(b) - F(a)$$
$F(x)$ can also be graphed, as it is a continuous function that is non-decreasing from 0 to 1.
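As a minimal sketch of reading interval probabilities off the cumulative distribution, here the uniform-on-$[0,1]$ example above is assumed, where $F(x) = x$ on the interval:

```python
# Sketch: the CDF of the uniform random variable on [0, 1] from the
# example above is F(x) = x on that interval, so interval probabilities
# follow directly from F(b) - F(a).

def F(x: float) -> float:
    """CDF of the uniform distribution on [0, 1]."""
    if x < 0:
        return 0.0
    if x > 1:
        return 1.0
    return x

# P(0.2 <= X <= 0.5) = F(0.5) - F(0.2) = 0.3
print(F(0.5) - F(0.2))
```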
Probability density function
As well as the cumulative distribution, we also define a function called the probability density function. This function is the derivative of the cumulative distribution (equivalently, the cumulative distribution is its antiderivative) and tells us more about a continuous random variable:
$$f(x) = \frac{d}{dx}F(x), \qquad F(x) = \int_{-\infty}^{x} f(t)\,dt$$
Referring this back to our original premise at the top of the page:
$$P(a \le X \le b) = \int_a^b f(x)\,dx$$
A key fact to remember is that for a continuous random variable, any single point has zero probability:
$$P(X = a) = 0$$
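A short numerical check of this density/distribution relationship, again assuming the uniform example (the `integrate` helper is a simple midpoint rule written for illustration, not a library routine):

```python
# Sketch: f is the derivative of F, so integrating f back over [a, b]
# should recover F(b) - F(a). Checked here for the uniform example,
# where f(x) = 1 on [0, 1].

def f(x: float) -> float:
    """PDF of the uniform distribution on [0, 1]."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a: float, b: float, n: int = 10_000) -> float:
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# P(0.2 <= X <= 0.5) again, now via the density: should print ~0.3.
# A single point contributes zero area, matching P(X = a) = 0.
print(integrate(f, 0.2, 0.5))
```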
Expected value
The expected value of a continuous random variable is defined as:
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$$
A moment of a continuous random variable is the expected value of a given power of $X$. The $n$th moment of a continuous random variable is defined as:
$$E[X^n] = \int_{-\infty}^{\infty} x^n f(x)\,dx$$
Variance and covariance work the same way as for discrete random variables, with sums replaced by integrals; for example, $\operatorname{Var}(X) = E[X^2] - (E[X])^2$.
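A sketch of the expected value, second moment, and variance computed by numerical integration for the uniform example; the analytic values $1/2$, $1/3$, and $1/12$ are what the output should approximate:

```python
# Sketch: E[X], the second moment E[X^2], and Var(X) for the uniform
# example, computed by integrating against the density f(x) = 1 on [0, 1].

def integrate(g, a: float, b: float, n: int = 10_000) -> float:
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mean = integrate(lambda x: x, 0.0, 1.0)                    # E[X]   = 1/2
second_moment = integrate(lambda x: x ** 2, 0.0, 1.0)      # E[X^2] = 1/3
variance = second_moment - mean ** 2                       # Var(X) = 1/12
print(mean, second_moment, variance)
```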
Gaussian random variables
A random variable is considered Gaussian if it has the density function:
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$
Gaussian distributions are also referred to as normal distributions and are given the notation $N(\mu, \sigma^2)$.
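As a quick sanity check on the density formula, this sketch evaluates the $N(\mu, \sigma^2)$ density directly and verifies numerically that it integrates to approximately 1:

```python
import math

# Sketch: the N(mu, sigma^2) density from the formula above, evaluated
# directly. Integrating it over a wide interval should give ~1.

def gaussian_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Density of a Gaussian (normal) random variable."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def integrate(g, a: float, b: float, n: int = 10_000) -> float:
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Total probability mass: should print ~1.0
print(integrate(gaussian_pdf, -10.0, 10.0))
```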
De Moivre-Laplace theorem
This is essentially a method of using the central limit theorem to approximate binomial distributions as normal distributions:
$$P\!\left(a \le \frac{S_n - np}{\sqrt{np(1-p)}} \le b\right) \to \Phi(b) - \Phi(a) \quad \text{as } n \to \infty$$
where $S_n = X_1 + \dots + X_n$ and the $X_i$ are Bernoulli trials with success probability $p$.
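A small sketch comparing the exact binomial probability with the De Moivre-Laplace normal approximation; the continuity correction and the parameter values $n = 100$, $p = 0.5$ are illustrative choices, not from the notes above:

```python
import math

# Sketch: exact binomial probability vs. the De Moivre-Laplace normal
# approximation for n Bernoulli trials with success probability p.

def binom_prob(n: int, p: float, lo: int, hi: int) -> float:
    """Exact P(lo <= S_n <= hi) for S_n ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(lo, hi + 1))

def phi(z: float) -> float:
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_approx(n: int, p: float, lo: int, hi: int) -> float:
    """De Moivre-Laplace approximation with a continuity correction."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return phi((hi + 0.5 - mu) / sigma) - phi((lo - 0.5 - mu) / sigma)

n, p = 100, 0.5
print(binom_prob(n, p, 45, 55))     # exact:  ~0.7287
print(normal_approx(n, p, 45, 55))  # approx: ~0.7287
```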
Hazard rate functions
Also known as the mortality or failure rate, this is a function that gives the instantaneous rate of failure at time $t$, given survival up to time $t$:
$$\lambda(t) = \frac{f(t)}{1 - F(t)}$$
where $f(t)$ is the probability density function and $F(t)$ is the cumulative distribution.
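A sketch of the hazard rate computed from $f(t)$ and $F(t)$, assuming an exponential lifetime with rate $\lambda = 2$ (an illustrative choice); the exponential distribution's hazard rate is constant, which the output confirms:

```python
import math

# Sketch: hazard rate lambda(t) = f(t) / (1 - F(t)) for an exponential
# lifetime with rate lam. For the exponential distribution the hazard
# rate is constant and equal to lam at every t.

def f(t: float, lam: float = 2.0) -> float:
    """Exponential PDF: f(t) = lam * e^(-lam * t) for t >= 0."""
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def F(t: float, lam: float = 2.0) -> float:
    """Exponential CDF: F(t) = 1 - e^(-lam * t) for t >= 0."""
    return 1.0 - math.exp(-lam * t) if t >= 0 else 0.0

def hazard(t: float, lam: float = 2.0) -> float:
    """Hazard rate lambda(t) = f(t) / (1 - F(t))."""
    return f(t, lam) / (1.0 - F(t, lam))

for t in (0.5, 1.0, 2.0):
    print(t, hazard(t))  # prints 2.0 at every t
```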