Continuous Probability Distributions

Probability Density Function (PDF) f(x)=\frac{dF(x)}{dx}=F'(x)

(Cumulative) Distribution Function (CDF) F(x)=P(X\leq x)=\int_{-\infty}^{x}f(t)dt

P(a<X<b)=\int_{a}^{b} f(x)dx

E(X)=\int_{-\infty}^{\infty} xf(x)dx

V(X)=E(X^2)-[E(X)]^2 where E(X^2)=\int_{-\infty}^{\infty} x^2f(x)dx
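
A quick numerical sketch of these definitions in Python with SciPy's quad integrator; the density f(x)=2x on [0,1] is a made-up illustrative example, not one of the distributions below.

from scipy.integrate import quad

# Example density f(x) = 2x on [0, 1] (zero elsewhere), purely illustrative
f = lambda x: 2 * x

EX, _ = quad(lambda x: x * f(x), 0, 1)       # E(X)   = 2/3
EX2, _ = quad(lambda x: x**2 * f(x), 0, 1)   # E(X^2) = 1/2
VX = EX2 - EX**2                             # V(X) = E(X^2) - [E(X)]^2 = 1/18
print(EX, VX)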

Uniform Probability Distribution

f(x)=\begin{cases} \frac{1}{\theta_2-\theta_1} \text{ if } \theta_1 \leq x \leq \theta_2 \\ 0 \text{ elsewhere} \end{cases}

E(X)=\frac{\theta_1 + \theta_2}{2}

Var(X)=\frac{(\theta_2-\theta_1)^2}{12}

M_x(t)=\frac{e^{t\theta_2}-e^{t\theta_1}}{(\theta_2-\theta_1)t}
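
A sketch checking the mean and variance formulas with SciPy, assuming scipy.stats.uniform with loc = \theta_1 and scale = \theta_2-\theta_1; the endpoints 2 and 5 are arbitrary.

from scipy.stats import uniform

theta1, theta2 = 2.0, 5.0                     # arbitrary example endpoints
U = uniform(loc=theta1, scale=theta2 - theta1)
print(U.mean(), (theta1 + theta2) / 2)        # both 3.5
print(U.var(), (theta2 - theta1) ** 2 / 12)   # both 0.75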

Normal Probability Distribution (a.k.a. Gaussian Distribution) X \sim N(\mu , \sigma^2)

f(x)=\frac{1}{\sigma \sqrt{2 \pi}}e^{-(x-\mu)^2/(2\sigma^2)}

E(X)=\mu

V(X)=\sigma^2

A normal random variable X can always be transformed into a standard normal random variable Z by:

Z=\frac{X-\mu}{\sigma}

For Z we have \mu = 0 and \sigma = 1
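
A small check of the standardisation, assuming SciPy's norm; the numbers are arbitrary.

from scipy.stats import norm

mu, sigma, x = 10.0, 2.0, 13.0
z = (x - mu) / sigma                      # Z = (X - mu) / sigma
print(norm(loc=mu, scale=sigma).cdf(x))   # P(X <= 13) for X ~ N(10, 4)
print(norm.cdf(z))                        # P(Z <= 1.5), the same value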

Gamma Probability Distribution X \sim \Gamma (\alpha , \beta)

f(x)=\begin{cases} \frac{1}{\beta^\alpha \Gamma(\alpha)}x^{\alpha-1}e^{-x/\beta} \text{ if } x \geq 0 \\ 0 \text{ elsewhere} \end{cases}

where \Gamma(\alpha)=\int_{0}^{\infty} x^{\alpha-1}e^{-x}dx

and \Gamma(n)=(n-1)!, provided that n is a positive integer.

\mu=E(X)=\alpha\beta

\sigma^2=V(X)=\alpha\beta^2
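
A sketch checking the gamma moments with SciPy, assuming scipy.stats.gamma's shape argument plays the role of \alpha and its scale argument the role of \beta (matching the scale parameterisation above); the parameter values are arbitrary.

from scipy.stats import gamma

alpha, beta = 3.0, 2.0                # arbitrary example parameters
G = gamma(alpha, scale=beta)
print(G.mean(), alpha * beta)         # E(X) = 6.0
print(G.var(), alpha * beta ** 2)     # V(X) = 12.0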

Chi-square Probability Distribution X \sim \chi^2(v)

A chi-square probability distribution with v degrees of freedom is a gamma probability distribution with \alpha = v/2 and \beta=2.

\mu=E(X)=v

\sigma^2=V(X)=2v
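
A sketch comparing chi-square with v degrees of freedom against the gamma distribution with \alpha=v/2, \beta=2, assuming SciPy; v=7 is arbitrary.

from scipy.stats import chi2, gamma

v = 7
X = chi2(v)
G = gamma(v / 2, scale=2)
print(X.mean(), G.mean(), v)       # all 7.0
print(X.var(), G.var(), 2 * v)     # all 14.0
print(X.cdf(5.0), G.cdf(5.0))      # same CDF value at x = 5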

Exponential Probability Distribution X \sim \text{Exp}(\lambda )

Used to model waiting times or lifetimes, i.e. the continuous counterpart of the Poisson probability distribution.

Gamma probability distribution with \alpha=1 and \beta=\frac{1}{\lambda}.

\lambda = \frac{1}{\mu}

PDF f(x)=\begin{cases} \lambda e^{-\lambda x} \text{ if } x\geq 0 \\ 0 \text{ if } x<0\end{cases}

CDF F(x)=\int_{-\infty}^{x}f(t)dt=\begin{cases} 1-e^{-\lambda x} \text{ if } x\geq 0 \\ 0 \text{ if } x<0 \end{cases}

\mu=E(X)=\frac{1}{\lambda}

\sigma^2=V(X)=\frac{1}{\lambda^2}

To find the median \eta, solve 1-e^{-\lambda \eta}= \frac{1}{2}, which gives \eta = \frac{\ln 2}{\lambda}
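
A sketch of the mean, variance and median with SciPy; note that scipy.stats.expon is parameterised by scale = 1/\lambda rather than by \lambda itself, and \lambda=0.5 is arbitrary.

from math import log
from scipy.stats import expon

lam = 0.5
E = expon(scale=1 / lam)
print(E.mean(), 1 / lam)           # E(X) = 2.0
print(E.var(), 1 / lam ** 2)       # V(X) = 4.0
print(E.median(), log(2) / lam)    # median = ln(2)/lambda ≈ 1.386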

Beta Probability Distribution X \sim Be(\alpha, \beta)

f(x)=\begin{cases} \frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha, \beta)} \text{ if } 0 \leq x \leq 1 \\ 0 \text{ elsewhere} \end{cases}=\frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1}(1-x)^{\beta -1}

where B(\alpha, \beta)=\int_{0}^{1} x^{\alpha -1}(1-x)^{\beta -1}dx=\frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha + \beta)}

NB If \alpha and \beta are both integers, the CDF is a sum of probabilities for the Binomial Distribution:

F(x)=\int_{0}^{x} \frac{t^{\alpha-1}(1-t)^{\beta-1}}{B(\alpha , \beta)}dt=\sum_{i=\alpha}^{n} {n \choose i}x^i(1-x)^{n-i}

where n=\alpha + \beta - 1

\mu=E(X)=\frac{\alpha}{\alpha + \beta}

\sigma^2=V(X)=\frac{\alpha\beta}{(\alpha + \beta)^2(\alpha + \beta + 1)}
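
A sketch of the beta moments and of the integer-parameter identity above, checked against a binomial tail sum with n=\alpha+\beta-1, assuming SciPy; \alpha=3, \beta=4, x=0.4 are arbitrary.

from scipy.stats import beta, binom

a, b, x = 3, 4, 0.4
B = beta(a, b)
print(B.mean(), a / (a + b))                           # both 3/7
print(B.var(), a * b / ((a + b) ** 2 * (a + b + 1)))   # both 12/392
n = a + b - 1                                          # n = 6
tail = sum(binom.pmf(i, n, x) for i in range(a, n + 1))
print(B.cdf(x), tail)                                  # both ≈ 0.4557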