In statistics, the **gamma distribution** is a two-parameter family of continuous probability distributions. As with the log-normal distribution, the random variable X and both of the parameters m and p must be positive.

Of the two parameters:

- p is the shape parameter.
- m is the inverse scale parameter.

The probability density function for the Gamma distribution with shape parameter $\alpha$ > 0 and scale parameter $\lambda$ > 0 is

f(x) = $\frac{1}{\Gamma(\alpha)\lambda^\alpha }x^{\alpha -1}e^{-\frac{x}{\lambda }}$

where $\Gamma(\alpha) = \int_{0}^{\infty}u^{\alpha -1}e^{-u}\,du$ is called the gamma function. The domain is 0 $\leq$ x < $\infty$. More generally, the density can be defined on any domain of the form $\gamma \leq x < \infty$ where $\gamma \geq 0$.
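As a quick numerical sanity check, here is a short Python sketch that evaluates this density with the standard library's `math.gamma` and confirms it integrates to approximately 1. The function name `gamma_pdf`, the parameter values, and the midpoint-rule cutoff are illustrative choices, not part of the original text.

```python
import math

def gamma_pdf(x, alpha, lam):
    """Gamma density with shape alpha and scale lam, as defined above."""
    return x ** (alpha - 1) * math.exp(-x / lam) / (math.gamma(alpha) * lam ** alpha)

# Midpoint-rule check that the density integrates to ~1 over [0, 60).
alpha, lam = 3.0, 2.0
h = 0.001
total = sum(gamma_pdf((i + 0.5) * h, alpha, lam) for i in range(60_000)) * h
print(round(total, 4))  # ~1.0
```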

If $\alpha$ = 1, the Gamma pdf reduces to the pdf of an exponential distribution.
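This reduction can be verified directly: with $\alpha$ = 1 the density collapses to $\frac{1}{\lambda}e^{-x/\lambda}$. A minimal Python check (helper names are illustrative):

```python
import math

def gamma_pdf(x, alpha, lam):
    return x ** (alpha - 1) * math.exp(-x / lam) / (math.gamma(alpha) * lam ** alpha)

def exp_pdf(x, lam):
    # Exponential density with mean lam (scale parameterization).
    return math.exp(-x / lam) / lam

lam = 2.0
# With alpha = 1 the gamma density equals the exponential density.
diffs = [abs(gamma_pdf(x, 1.0, lam) - exp_pdf(x, lam)) for x in (0.1, 1.0, 5.0)]
print(max(diffs))
```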

In terms of the probability density function, the **gamma distribution with parameter p**, for a continuous random variable X, can be written as

f(x) = $\left\{\begin{matrix} \frac{x^{p - 1} e^{-x}}{\Gamma(p)} & p > 0,\ 0 \leq x < \infty \\ 0 & \text{otherwise} \end{matrix}\right.$

The probability density function can also be written in the form

f(x) = $\frac{m^{p}}{\Gamma(p)}\, e^{-mx}\, x^{p - 1}$, $0 \leq x < \infty$

This is called a **Gamma distribution with parameters m and p**.

The probability density function of a Gamma distribution with shape parameter $\alpha$ and scale parameter $\beta$ can likewise be written as

f(x) = $\frac{1}{\beta ^{\alpha }\Gamma(\alpha )}x^{\alpha -1}e^{\frac{-x}{\beta }}$

where $\Gamma(\alpha )$ is the Gamma function

$\Gamma(\alpha ) = \int_{0}^{\infty}t^{\alpha -1}e^{-t}\, dt$

The mean of the Gamma distribution can be computed directly from the single-parameter density:

E(x) = $\int_{0}^{\infty} \frac{e^{-x} x^{p - 1}}{\Gamma(p)}\, x\, dx$

= $\frac{1}{\Gamma(p)} \int_{0}^{\infty} e^{-x}\, x^{p}\, dx$

= $\frac{\Gamma(p + 1)}{\Gamma(p)}$

= $\frac{p!}{(p - 1)!}$

= p

Gamma Distribution Mean, E(x) = p

The variance of the gamma distribution can be found from the second moment:

E(x$^{2}$) = $\int_{0}^{\infty} \frac{e^{-x} x^{p - 1}}{\Gamma(p)}\, x^{2}\, dx$

= $\frac{1}{\Gamma(p)} \int_{0}^{\infty} e^{-x}\, x^{p + 1}\, dx$

= $\frac{\Gamma(p + 2)}{\Gamma(p)}$

= $\frac{(p + 1)!}{(p - 1)!}$

= p(p + 1)

Hence, variance = $E(x^{2}) - [E(x)]^{2}$

where mean E(x) = p.

Variance = p(p + 1) - $p^{2}$

= p

Variance of Gamma Distribution, Variance = p
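The results E(x) = p and Variance = p can be confirmed by numerically integrating the one-parameter density. This Python sketch uses a simple midpoint rule; the step size, cutoff, and the value of p are arbitrary choices.

```python
import math

def pdf(x, p):
    # One-parameter gamma density f(x) = x^(p-1) e^(-x) / gamma(p).
    return x ** (p - 1) * math.exp(-x) / math.gamma(p)

p = 4.0
h = 0.001
xs = [(i + 0.5) * h for i in range(60_000)]  # midpoints covering [0, 60)
mean = sum(x * pdf(x, p) for x in xs) * h
second = sum(x * x * pdf(x, p) for x in xs) * h
var = second - mean ** 2
print(round(mean, 3), round(var, 3))  # both close to p
```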

Let X_{1} and X_{2} be independent random variables following gamma distributions with parameters p_{1} and p_{2} respectively. Then the moment generating function of their sum is the product of the two individual moment generating functions, which is the moment generating function of a gamma distribution with parameter p_{1} + p_{2}.

So, the sum of two independent gamma random variables also follows a gamma distribution, with parameter equal to the sum of the parameters of the two variables. This is known as the **additive property**. The result extends to any number of independent gamma variables: if X_{1}, X_{2}, X_{3}, …, X_{n} are independent random variables following gamma distributions with parameters p_{1}, p_{2}, …, p_{n} respectively, their sum X = X_{1} + X_{2} + … + X_{n} follows the Gamma distribution with parameter p = p_{1} + p_{2} + … + p_{n}.
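The additive property can be illustrated by simulation. Python's standard library provides `random.gammavariate(alpha, beta)`; the sketch below sums independent Gamma(p1) and Gamma(p2) draws and checks that the sample mean and variance are close to p1 + p2 (sample size, seed, and parameter values are arbitrary):

```python
import random

random.seed(0)
p1, p2 = 2.0, 3.0
n = 200_000
# Sum of independent Gamma(p1) and Gamma(p2) draws, both with scale 1.
sums = [random.gammavariate(p1, 1.0) + random.gammavariate(p2, 1.0)
        for _ in range(n)]
m = sum(sums) / n
v = sum((s - m) ** 2 for s in sums) / n
# A Gamma(p1 + p2) variable has mean and variance both equal to p1 + p2 = 5.
print(round(m, 2), round(v, 2))
```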

The gamma distribution can be described as the sum of a fixed number of exponential random variables, provided the shape and scale parameters are positive. The distribution is completely determined by these two parameters, the scale factor and the shape factor.

**Exponential:** If the shape parameter equals one and the scale parameter equals the mean interval between the defined events, the gamma distribution reduces to the exponential distribution.

**Erlang:** The shape parameter is the number of events and the scale parameter is the mean interval between events. This is mostly used to model the total waiting time across several Poisson events.

**Chi Squared:** If the shape parameter is taken as the degrees of freedom divided by 2 and the scale parameter is taken as 2, the gamma distribution becomes the chi-squared distribution.
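The chi-squared case can be checked numerically: a gamma density with shape d/2 and scale 2 coincides with the chi-squared density with d degrees of freedom. A small Python sketch (function names and test points are illustrative):

```python
import math

def gamma_pdf(x, shape, scale):
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

def chi2_pdf(x, d):
    # Chi-squared density with d degrees of freedom.
    return x ** (d / 2 - 1) * math.exp(-x / 2) / (2 ** (d / 2) * math.gamma(d / 2))

d = 5
diffs = [abs(gamma_pdf(x, d / 2, 2.0) - chi2_pdf(x, d)) for x in (0.5, 2.0, 7.5)]
print(max(diffs))  # the two densities coincide
```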

The normal-gamma distribution (or Gaussian-gamma distribution) is a bivariate four-parameter family of continuous probability distributions. Separately, the gamma distribution itself admits a normal approximation. Let X_{i}, i = 1, 2, 3, …, n be independent, identically distributed exponential random variables with rate $\lambda$, and let Y be their sum.

Each X_{i} has mean $\mu$ = $\frac{1}{\lambda}$ and variance $\sigma^{2}$ = $\frac{1}{\lambda^{2}}$.

Then, for large n, Y = $\sum$ X_{i} $\sim$ N $\left(\frac{n}{\lambda}, \frac{n}{\lambda^{2}} \right)$ approximately.

Here Y, which follows $\Gamma{(n, \lambda)}$, has a normal approximation for large values of n.
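The normal approximation can be illustrated by sampling Y ~ $\Gamma(n, \lambda)$ and checking that about 68% of the draws fall within one standard deviation of n/$\lambda$, as the normal law predicts. A Python sketch (sample size, seed, and parameters are arbitrary):

```python
import math
import random

random.seed(1)
n, lam = 100, 2.0
mu = n / lam                # mean of the approximating normal
sigma = math.sqrt(n) / lam  # square root of the variance n / lam^2
N = 100_000
# Y = sum of n Exp(lam) variables follows Gamma(n, 1/lam); sample it directly.
ys = [random.gammavariate(n, 1.0 / lam) for _ in range(N)]
inside = sum(1 for y in ys if abs(y - mu) <= sigma) / N
print(round(inside, 2))  # close to 0.68, the one-sigma normal probability
```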

The bivariate gamma distribution is only one possibility for extending the gamma distribution. One construction forms it from two univariate gamma distributions with fixed shape parameters, where the scale parameter takes one of two values according to a generalized Bernoulli distribution. Several multivariate generalizations of the standard gamma distribution are available; prominent among them are distributions whose marginals are standard gamma. A related family, the multivariate stable distributions, represents a multivariate stable random variable as a weighted sum of a univariate stable random variable times a point on the unit sphere. The Wishart distribution is taken as a **multivariate generalization of the gamma distribution**.

The inverse gamma distribution is a two-parameter family of continuous probability distributions on the positive real line, which is the distribution of the reciprocal of a variable distributed according to the gamma distribution.

The **inverse gamma distribution's** probability density function is expressed as

f(x) = $\frac{m^{p}}{\Gamma(p)}\, x^{-p-1}\, e^{-\frac{m}{x}}$, where m and p are parameters and x > 0.
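As a sanity check, this density integrates to 1 over (0, $\infty$); the Python sketch below verifies this numerically (the cutoff, step size, and parameter values are arbitrary choices):

```python
import math

def inv_gamma_pdf(x, p, m):
    # Inverse gamma density, as given above.
    return (m ** p / math.gamma(p)) * x ** (-p - 1) * math.exp(-m / x)

p, m = 3.0, 2.0
h = 0.001
# Midpoint rule over the truncated range (0, 100); the tail beyond is negligible.
total = sum(inv_gamma_pdf((i + 0.5) * h, p, m) for i in range(100_000)) * h
print(round(total, 3))  # ~1.0
```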

Now, starting from the integral definition of the factorial,

$\int_{0}^{\infty} e^{-t}\, t^{n}\, dt = n!$

When n is replaced by a real variable x, the function generated is

f(x) = $\int_{0}^{\infty} e^{-t}\, t^{x}\, dt$

Consider the behaviour over the limits 0 to $\infty$. Near t = 0 we have $e^{-t} \approx 1$,

so $e^{-t}\, t^{x} \sim t^{x} = \frac{1}{t^{-x}}$

At +$\infty$ this improper integral converges whatever the value of x, since $e^{-t}$ dominates; near 0 it converges only when x > -1.

So the domain of f(x) must be (-1, $\infty$).

As we need (0, +$\infty$) as the required domain, the argument is shifted to define a new function

$\Gamma$(x) = f(x - 1) = $\int_{0}^{\infty} e^{-t}\, t^{x - 1}\, dt$

This is called the **Gamma function or Euler's second integral**.
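Euler's integral can be checked against the standard library's `math.gamma`; the sketch below approximates it with a midpoint rule (the cutoff and step count are arbitrary choices):

```python
import math

def gamma_numeric(x, upper=50.0, steps=300_000):
    # Midpoint-rule approximation of Euler's integral of e^(-t) t^(x-1).
    h = upper / steps
    return sum(math.exp(-(i + 0.5) * h) * ((i + 0.5) * h) ** (x - 1)
               for i in range(steps)) * h

for x in (1.0, 2.5, 5.0):
    print(x, round(gamma_numeric(x), 4), round(math.gamma(x), 4))
```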

Regression models can be fit with the generalized gamma distribution, using the following probability density function f(t).

f(t) = $\frac{|\lambda|}{t\sigma \Gamma(\lambda^{-2})}(\lambda^{-2})^{\lambda^{-2}}\exp\left[\lambda^{-2}\left(\lambda\left(\frac{\log(t)-\mu}{\sigma}\right)-\exp\left(\lambda\left(\frac{\log(t)-\mu}{\sigma}\right)\right)\right)\right]$

If a lifetime T has the generalized gamma distribution, then the logarithm of the lifetime, X = log(T), has the generalized log-gamma distribution, with a corresponding probability density function g(x). When the gamma distribution is specified, the logarithms of the lifetimes are used as responses, and the generalized log-gamma distribution is used to estimate the parameters by maximum likelihood.

The formula $\Gamma$(x) = f(x - 1) = $\int_{0}^{\infty} e^{-t}\, t^{x - 1}\, dt$ can also be expressed using the logarithm function.

$\Gamma(x) = \int_{0}^{\infty} e^{-t}\, t^{x - 1}\, dt = \int_{0}^{1} \left(\ln\frac{1}{t}\right)^{x - 1} dt$

So this form can be taken as the **logarithmic representation of the gamma function**. The graph of the gamma function over the interval z $\in$ [-6, 6] shows poles at zero and the negative integers.

The Poisson-Gamma distribution is an application of the negative binomial distribution: the negative binomial can be viewed as a Poisson distribution whose Poisson parameter is itself a random variable, distributed according to a Gamma distribution. The following graphs show examples of the gamma distribution as m and p change.

When p = 2 and m = 4

When p = 4 and m = 4
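The gamma-mixed Poisson construction can be verified numerically: integrating the Poisson pmf against a Gamma density for $\lambda$ (shape r, scale $\theta$) reproduces the negative binomial pmf with success probability p = 1/(1 + $\theta$). A Python sketch (parameter choices are illustrative):

```python
import math

def mixture_pmf(k, r, theta, h=0.001, steps=100_000):
    # P(K = k) = integral of Poisson(k; lam) * Gamma(lam; shape r, scale theta),
    # approximated by a midpoint rule over (0, 100).
    total = 0.0
    for i in range(steps):
        lam = (i + 0.5) * h
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        gamma_d = lam ** (r - 1) * math.exp(-lam / theta) / (math.gamma(r) * theta ** r)
        total += poisson * gamma_d
    return total * h

def neg_binomial_pmf(k, r, p):
    return math.comb(k + r - 1, k) * p ** r * (1 - p) ** k

r, theta = 3, 2.0
p = 1.0 / (1.0 + theta)  # success probability implied by the mixture
for k in (0, 2, 5):
    print(k, round(mixture_pmf(k, r, theta), 5), round(neg_binomial_pmf(k, r, p), 5))
```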

The following are the properties of the Gamma distribution:

- $\Gamma{(n)}$ = (n - 1)! when n = 1, 2, 3, …
- $\Gamma{(x + 1)}$ = x$\Gamma{(x)}$ for any x > 0
- $\Gamma{(1)}$ = 1
- $\Gamma{(x)}$ = $\lim_{n \to +\infty}$ $\frac{n^{x}\, n!}{x(x + 1) \cdots (x + n)}$ for every x > 0
- Relation between the mean and variance of a Gamma distribution with shape $\alpha$:

$\sigma^2$ = $\frac{\mu^2}{\alpha}$
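The first two properties, and the mean-variance relation, are easy to confirm with the standard library's `math.gamma` (the numeric values chosen here are arbitrary):

```python
import math

# Gamma(n) = (n - 1)! for positive integers n.
print(all(abs(math.gamma(n) - math.factorial(n - 1)) < 1e-9 for n in range(1, 8)))

# Recurrence Gamma(x + 1) = x * Gamma(x).
x = 3.7
print(abs(math.gamma(x + 1) - x * math.gamma(x)) < 1e-9)

# For shape alpha and scale lam: mean mu = alpha*lam, variance = alpha*lam^2,
# so variance = mu^2 / alpha.
alpha, lam = 4.0, 2.0
mu, var = alpha * lam, alpha * lam ** 2
print(var == mu ** 2 / alpha)
```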

- One of the main applications is to intervals between events, when the quantity of interest is the sum of one or more exponentially distributed variables. Examples include queuing models, the flow of items through manufacturing and distribution processes, the load on web servers, and various kinds of telecom exchange.
- Because it is moderately skewed, the gamma distribution is a workable model in many areas: for climatic quantities such as rainfall, and in financial services for modelling patterns of insurance claims and the sizes of loan defaults, and hence in probability-of-ruin and value-at-risk calculations.

When the gamma distribution is represented as the sum of exponential random variables:

- The shape factor denotes the number of exponential variables summed.
- The scale factor becomes the mean of the exponential distribution.

For any non-negative continuous random variable, the probability density function of the generalized gamma distribution is given as follows:

f(x; a, d, p) = $\frac{\frac{p}{a^{d}}\, x^{d - 1}\, e^{-(\frac{x}{a})^{p}}}{\Gamma{(\frac{d}{p})}}$

The generalized gamma distribution has three parameters, a > 0, d > 0, p > 0.
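Setting p = 1 in this density recovers the ordinary gamma distribution with shape d and scale a, which gives a quick consistency check. A Python sketch (function names and test points are illustrative):

```python
import math

def gen_gamma_pdf(x, a, d, p):
    # Generalized gamma density (p / a^d) x^(d-1) e^{-(x/a)^p} / Gamma(d/p).
    return (p / a ** d) * x ** (d - 1) * math.exp(-((x / a) ** p)) / math.gamma(d / p)

def gamma_pdf(x, shape, scale):
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

# With p = 1 the generalized form reduces to a Gamma(shape d, scale a) density.
a, d = 2.0, 3.0
diffs = [abs(gen_gamma_pdf(x, a, d, 1.0) - gamma_pdf(x, d, a)) for x in (0.5, 2.0, 8.0)]
print(max(diffs))
```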
