
Gamma Distribution

In statistics, the gamma distribution is a two-parameter family of continuous probability distributions. As with the log-normal distribution, the variable X and both parameters m and p must be positive.
Of the two parameters:

  • p is the shape parameter.
  • m is the inverse scale parameter.
The mean of this distribution is $\frac{p}{m}$ and the variance is $\frac{p}{m^{2}}$. If p is greater than 1, the mode is $\frac{p - 1}{m}$.
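These moment formulas can be spot-checked numerically. The sketch below is an illustration, not part of the original text; it uses SciPy's gamma distribution, in which the inverse scale m enters as scale = 1/m.

```python
# Numeric spot check of mean = p/m, variance = p/m^2, mode = (p-1)/m,
# using scipy.stats.gamma with shape a = p and scale = 1/m.
import numpy as np
from scipy.stats import gamma

p, m = 4.0, 2.0             # example shape and inverse scale (rate)
dist = gamma(a=p, scale=1.0 / m)

mean = dist.mean()          # expected p / m = 2.0
var = dist.var()            # expected p / m**2 = 1.0

# Locate the mode by scanning the pdf on a fine grid.
xs = np.linspace(0.01, 10.0, 100_000)
mode = xs[np.argmax(dist.pdf(xs))]   # expected (p - 1) / m = 1.5

print(mean, var, round(mode, 2))
```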


Gamma Distribution Function

The probability density function for the Gamma distribution with shape parameter $\alpha$ > 0 and scale parameter $\lambda$ > 0 is

f(x) = $\frac{1}{\Gamma(\alpha)\lambda^\alpha }x^{\alpha -1}e^{-\frac{x}{\lambda }}$

where $\Gamma(\alpha) = \int_{0}^{\infty}u^{\alpha -1}e^{-u}$du is called the gamma function. The domain is 0 $\leq$ x < $\infty$; more generally, the density can be defined on any domain of the form $\gamma$ $\leq$ x < $\infty$ where $\gamma$ $\geq$ 0.

If $\alpha$ = 1, the Gamma pdf reduces to the pdf of an exponential distribution.
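This special case is easy to verify numerically; the sketch below (an illustration using SciPy, not part of the original text) compares the two densities at a few points.

```python
# When the shape parameter is 1, the gamma pdf coincides with the
# exponential pdf with the same scale parameter lambda.
from scipy.stats import gamma, expon

lam = 2.0
diffs = [abs(gamma.pdf(x, a=1.0, scale=lam) - expon.pdf(x, scale=lam))
         for x in (0.1, 0.5, 1.0, 3.0)]
print(max(diffs))  # essentially zero
```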

Gamma Distribution Formula

In terms of its probability density function, the gamma distribution with parameter p, for a continuous random variable X, can be written as

f(x) = $\begin{cases} \frac{x^{p - 1} e^{-x}}{\Gamma(p)} & p > 0,\ 0 \leq x < \infty \\ 0 & \text{otherwise} \end{cases}$

The probability density function can also be written in the form

f(x) = $\frac{m^{p}}{\Gamma(p)}\, e^{-mx}\, x^{p - 1}$, 0 $\leq$ x < $\infty$

This is called a Gamma distribution with parameters m and p.
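The two-parameter density above can be checked against a library implementation. The comparison below is an illustration (not part of the original text); in SciPy the rate m corresponds to scale = 1/m.

```python
# The density m^p / Gamma(p) * e^{-m x} * x^{p-1} is the gamma pdf
# with shape p and rate m; compare with scipy.stats.gamma (scale = 1/m).
import math
from scipy.stats import gamma

p, m = 2.5, 1.5

def pdf(x):
    return m**p / math.gamma(p) * math.exp(-m * x) * x**(p - 1)

diffs = [abs(pdf(x) - gamma.pdf(x, a=p, scale=1.0 / m))
         for x in (0.2, 1.0, 3.0)]
print(max(diffs))  # essentially zero
```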

Gamma Distribution Cdf

The cumulative distribution function of a Gamma distribution with shape parameter $\alpha$ and scale parameter $\beta$ is

F(x) = $\frac{\gamma(\alpha, \frac{x}{\beta})}{\Gamma(\alpha)}$, where $\gamma(\alpha, s) = \int_{0}^{s}t^{\alpha -1}e^{-t}$ dt is the lower incomplete gamma function

where $\Gamma(\alpha )$ is the Gamma function

$\Gamma(\alpha ) = \int_{0}^{\infty}t^{\alpha -1}e^{-t} $ dt
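The gamma CDF equals the regularized lower incomplete gamma function evaluated at x/β. The check below is an illustration with SciPy, not part of the original text.

```python
# Compare the regularized lower incomplete gamma function gammainc(alpha, x/beta)
# with the gamma CDF from scipy.stats.
from scipy.special import gammainc
from scipy.stats import gamma

alpha, beta = 3.0, 2.0
diffs = [abs(gammainc(alpha, x / beta) - gamma.cdf(x, a=alpha, scale=beta))
         for x in (0.5, 2.0, 5.0, 10.0)]
print(max(diffs))  # essentially zero
```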

Gamma Function Properties

The following are the properties of the Gamma function:
  • $\Gamma{(n)}$ = (n - 1)! when n = 1, 2, 3, ...
  • $\Gamma{(x + 1)}$ = x$\Gamma{(x)}$ for any x > 0
  • $\Gamma{(1)}$ = 1
  • $\Gamma{(x)}$ = $\lim_{n \to +\infty}$ $\frac{n^{x}\, n!}{x(x + 1)\cdots(x + n)}$ for any x > 0
  • Relation between mean and variance of Gamma distribution

$\sigma^2$ = $\frac{\mu^2}{\alpha}$
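The factorial identity and the recurrence can be spot-checked with the standard library; this snippet is an illustration, not part of the original text.

```python
# Spot checks of the gamma-function properties listed above.
import math

# Gamma(n) = (n - 1)! for positive integers n
fact_ok = all(math.isclose(math.gamma(n), math.factorial(n - 1))
              for n in range(1, 10))

# Recurrence Gamma(x + 1) = x * Gamma(x) for x > 0
rec_ok = all(math.isclose(math.gamma(x + 1), x * math.gamma(x))
             for x in (0.5, 1.3, 4.7))

print(fact_ok, rec_ok, math.gamma(1))  # True True 1.0
```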

Gamma Distribution Uses

The gamma distribution's main uses are:
  1. Modeling the interval between events, since a sum of one or more exponentially distributed waiting times follows a gamma distribution. Examples include queuing models, the flow of items through manufacturing and distribution processes, the load on web servers, and traffic in telecom exchanges.
  2. Because it is moderately skewed, it serves as a workable model for skewed quantities such as rainfall, and in financial services for modeling the sizes of insurance claims and loan defaults; for this reason it also appears in probability-of-ruin and value-at-risk calculations.

Gamma Distribution Mean

The mean of the gamma distribution can be determined directly or by expanding the moment generating function; it is also called the expected value of the gamma distribution.
Using the direct way, with the one-parameter density $\frac{e^{-x} x^{p - 1}}{\Gamma(p)}$:

E(x) = $\int_{0}^{\infty}$ x $\frac{e^{-x} x^{p - 1}}{\Gamma(p)}$ dx

= $\frac{1}{\Gamma(p)}$ $\int_{0}^{\infty}$ $e^{-x}\, x^{p}$ dx

= $\frac{\Gamma(p + 1)}{\Gamma(p)}$

= $\frac{p\, \Gamma(p)}{\Gamma(p)}$ (since $\Gamma(p + 1)$ = p $\Gamma(p)$)

= p

Gamma Distribution Mean, E(x) = p
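The result E(X) = p can be reproduced by direct numerical integration; the snippet below is an illustration (not part of the original text).

```python
# Reproduce E(X) = p by numerically integrating x * f(x) for the
# one-parameter gamma density e^{-x} x^{p-1} / Gamma(p).
import math
from scipy.integrate import quad

p = 3.5
mean, _ = quad(lambda x: x * math.exp(-x) * x**(p - 1) / math.gamma(p),
               0, math.inf)
print(mean)  # close to p = 3.5
```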

Variance of Gamma Distribution

Variance of gamma distribution can be found as,

E(x$^{2}$) = $\int_{0}^{\infty}$ x$^{2}$ $\frac{e^{-x} x^{p - 1}}{\Gamma(p)}$ dx

= $\frac{1}{\Gamma(p)}$ $\int_{0}^{\infty}$ $e^{-x}\, x^{p + 1}$ dx

= $\frac{\Gamma(p + 2)}{\Gamma(p)}$

= $\frac{(p + 1)\, p\, \Gamma(p)}{\Gamma(p)}$

= p(p + 1)

Hence, variance = E(x$^{2}$) - [E(x)]$^{2}$

where, mean E(x) = p

Variance = p(p + 1) - p$^{2}$

= p

Variance of Gamma Distribution, Variance = p

Additive Property of the Gamma Distribution

Let X1 and X2 be independent random variables following gamma distributions with parameters p1 and p2 respectively. The moment generating function of X1 + X2 is then the product of the individual moment generating functions, which is the moment generating function of a gamma distribution with parameter p1 + p2.

So, the sum of two independent random variables following gamma distributions also follows the gamma distribution, and its parameter is the sum of the parameters of the two variables. This property is known as the additive property. The result may be extended to any number of independent gamma variables: if X1, X2, X3, ..., Xn are independent random variables following gamma distributions with parameters p1, p2, ..., pn respectively, their sum X = X1 + X2 + ... + Xn follows the Gamma distribution with parameter p = p1 + p2 + ... + pn.
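The additive property can be checked empirically by simulation; the sketch below (an illustration, not part of the original text) compares the sample moments of a sum of two gamma variables against those of a gamma with the summed parameter.

```python
# Empirical check of the additive property: samples of gamma(p1) + gamma(p2)
# (independent, unit scale) should have the moments of gamma(p1 + p2),
# i.e. mean p1 + p2 and variance p1 + p2.
import numpy as np

rng = np.random.default_rng(0)
p1, p2 = 2.0, 3.0
n = 200_000
x = rng.gamma(p1, size=n) + rng.gamma(p2, size=n)

print(x.mean(), x.var())  # both close to p1 + p2 = 5
```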

Generalized Gamma Distribution

The gamma distribution can be expressed as the sum of a fixed number of exponential random variables, provided that the shape and scale parameters take only positive values. The gamma distribution is completely determined by these two parameters, the scale factor and the shape factor. When the gamma distribution is represented as the sum of exponential variables,
  • Shape factor denotes the number of variables present
  • The scale factor becomes the mean of the exponential distribution.

For any non negative continuous random variable, the probability density function in terms of generalized gamma distribution is given as follows:
f(x; a, d, p) = $\frac{\frac{p}{a^{d}}\, x^{d - 1}\, e^{-(\frac{x}{a})^{p}}}{\Gamma{(\frac{d}{p})}}$

The generalized gamma distribution has three parameters: a > 0, d > 0, p > 0.
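This density can be matched to SciPy's generalized gamma implementation. The mapping below (an illustration, not part of the original text) uses `scipy.stats.gengamma` with shape a_scipy = d/p, exponent c = p, and scale = a.

```python
# The generalized gamma pdf (p/a^d) x^{d-1} e^{-(x/a)^p} / Gamma(d/p)
# matches scipy.stats.gengamma with a = d/p, c = p, scale = a.
import math
from scipy.stats import gengamma

a, d, p = 2.0, 3.0, 1.5

def gg_pdf(x):
    return (p / a**d) * x**(d - 1) * math.exp(-(x / a)**p) / math.gamma(d / p)

diffs = [abs(gg_pdf(x) - gengamma.pdf(x, a=d / p, c=p, scale=a))
         for x in (0.5, 1.5, 4.0)]
print(max(diffs))  # essentially zero
```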

Different Cases of Gamma Distribution

Given below are the different cases of gamma distribution.

Exponential: If the shape parameter equals one and the scale parameter equals the mean interval between the defined events, then the gamma distribution reduces to the exponential distribution.

Erlang: The shape parameter will be the number of events defined and the scale parameter becomes the average or mean interval between the events. This is mostly used to model the total intervals connected with Poisson events.

Chi Squared: If the shape parameter is taken as the degree of freedom divided by 2 and if the scale parameter is taken as 2, then the gamma distribution is changed to chi squared distribution.
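The chi-squared case is easy to verify numerically; the comparison below is an illustration with SciPy, not part of the original text.

```python
# Chi-squared special case: a gamma with shape df/2 and scale 2 has
# the same pdf as the chi-squared distribution with df degrees of freedom.
from scipy.stats import gamma, chi2

df = 6
diffs = [abs(gamma.pdf(x, a=df / 2, scale=2) - chi2.pdf(x, df))
         for x in (0.5, 2.0, 5.0, 12.0)]
print(max(diffs))  # essentially zero
```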

Normal Gamma Distribution

The normal-gamma distribution (or Gaussian-gamma distribution) is a bivariate four-parameter family of continuous probability distributions. A closely related fact is the normal approximation of the gamma distribution. Let Xi, i = 1, 2, 3, ..., n be independent, identically distributed exponential random variables with rate $\lambda$, and let Y be their sum.

Each Xi has mean $\mu$ = $\frac{1}{\lambda}$ and variance $\sigma^{2}$ = $\frac{1}{\lambda^{2}}$.

Then, for large n, Y = $\sum$ Xi $\sim$ N $\left(\frac{n}{\lambda}, \frac{n}{\lambda^{2}} \right)$ approximately.
Here, Y, which follows $\Gamma{(n, \lambda)}$, has a normal approximation for large values of n.
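The quality of this normal approximation can be checked by comparing CDFs; the snippet below is an illustration with SciPy (the particular n, λ, and evaluation points are arbitrary choices, not from the original text).

```python
# For large n, the gamma(n, lambda) CDF is close to the CDF of
# N(n/lambda, n/lambda^2); compare the two at a few points near the mean.
import math
from scipy.stats import gamma, norm

n, lam = 400, 2.0
exact = gamma(a=n, scale=1.0 / lam)
approx = norm(loc=n / lam, scale=math.sqrt(n) / lam)

max_diff = max(abs(exact.cdf(x) - approx.cdf(x))
               for x in (190.0, 200.0, 210.0))
print(max_diff)  # small for large n
```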

Bivariate Gamma Distribution

The bivariate gamma distribution is only one possibility for extending the gamma distribution. One construction forms it from two univariate gamma distributions with fixed shape parameters, where the scale parameter takes one of two values according to a generalized Bernoulli distribution.

Multivariate Gamma Distribution

Several multivariate generalizations of the standard gamma distribution are available; prominent among them are distributions whose marginals are standard gamma. A related construction uses multivariate stable distributions, in which the multivariate stable random variable is a weighted sum of a univariate stable random variable times a point on the unit sphere. The Wishart distribution is often taken as a multivariate generalization of the gamma distribution.

Inverse Gamma Distribution

The inverse gamma distribution is a two-parameter family of continuous probability distributions on the positive real line, which is the distribution of the reciprocal of a variable distributed according to the gamma distribution.

The probability density function of the inverse gamma distribution is

f(x) = $\frac{m^{p}}{\Gamma(p)}\, x^{-p - 1}\, e^{-\frac{m}{x}}$, where m and p are parameters and x > 0.
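This density can be checked against SciPy's inverse-gamma implementation; the snippet below is an illustration (m plays the role of SciPy's scale parameter), not part of the original text.

```python
# Check the inverse-gamma pdf m^p / Gamma(p) * x^{-p-1} * e^{-m/x}
# against scipy.stats.invgamma with shape a = p and scale = m.
import math
from scipy.stats import invgamma

p, m = 3.0, 2.0

def inv_gamma_pdf(x):
    return m**p / math.gamma(p) * x**(-p - 1) * math.exp(-m / x)

diffs = [abs(inv_gamma_pdf(x) - invgamma.pdf(x, a=p, scale=m))
         for x in (0.5, 1.0, 2.0)]
print(max(diffs))  # essentially zero
```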

Now, start from the factorial identity

$\int_{0}^{\infty}$ $e^{-t}\, t^{n}$ dt = n!

Replacing the integer n by a real variable x gives the function

f(x) = $\int_{0}^{\infty}$ $e^{-t}\, t^{x}$ dt

Near t = 0 we have $e^{-t}$ $\approx$ 1, so the integrand behaves like $t^{x}$ and the integral converges at 0 only when x > -1; at +$\infty$ the factor $e^{-t}$ dominates, so the integral converges there whatever the value of x is.

So, the domain of f(x) is (-1, $\infty$).

As we want (0, +$\infty$) as the domain, the argument is shifted by one, giving

$\Gamma$(x) = f(x - 1) = $\int_{0}^{\infty}$ $e^{-t}\, t^{x - 1}$ dt

This is called the Gamma function or Euler's second integral.

Log Gamma Distribution

Regression models can be fit with the generalized gamma distribution, which has the following probability density function f(t).

f(t) = $\frac{|\lambda|}{t\sigma\, \Gamma(\lambda^{-2})}(\lambda^{-2})^{\lambda^{-2}}\exp\left[\lambda^{-2}\left(\lambda w - e^{\lambda w}\right)\right]$, where w = $\frac{\log(t)-\mu}{\sigma}$

If a lifetime T has the generalized gamma distribution, then the logarithm of the lifetime, X = log(T), has the generalized log-gamma distribution with probability density function g(x). When this model is specified, the logarithms of the lifetimes are used as responses and the parameters of the generalized log-gamma distribution are estimated by maximum likelihood.

The formula $\Gamma$(x) = $\int_{0}^{\infty}$ $e^{-t}\, t^{x - 1}$ dt can also be written using the logarithm function:

$\Gamma$(x) = $\int_{0}^{\infty}$ $e^{-t}\, t^{x - 1}$ dt = $\int_{0}^{1}$ $\left(\ln \frac{1}{t}\right)^{x - 1}$ dt

So, it can be taken as the logarithm gamma distribution.

[Graph of the gamma function on the interval z $\in$ [-6, 6] omitted.]
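Both integral forms of the gamma function can be verified numerically; the snippet below is an illustration (not part of the original text), with an arbitrary test value of x.

```python
# Numeric check that both integral forms give the gamma function:
# integral_0^inf e^{-t} t^{x-1} dt  =  integral_0^1 (ln(1/t))^{x-1} dt.
import math
from scipy.integrate import quad

x = 2.5
first, _ = quad(lambda t: math.exp(-t) * t**(x - 1), 0, math.inf)
second, _ = quad(lambda t: math.log(1.0 / t)**(x - 1), 0, 1)
print(first, second, math.gamma(x))  # all three agree
```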

Gamma Poisson Distribution

The gamma-Poisson distribution is an application of the negative binomial distribution: the negative binomial can be viewed as a Poisson distribution whose Poisson parameter is itself a random variable, distributed according to a gamma distribution.
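This mixture relationship can be checked by simulation; the sketch below is an illustration (the choice r = 3, θ = 2 is arbitrary), using the standard identification of a gamma(r, scale=θ) mixing distribution with a negative binomial with n = r and success probability 1/(1+θ).

```python
# Gamma-Poisson mixture check: a Poisson count whose rate is drawn from
# gamma(shape=r, scale=theta) follows a negative binomial with
# n = r and success probability prob = 1 / (1 + theta).
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(1)
r, theta = 3.0, 2.0
lam = rng.gamma(r, theta, size=500_000)  # random Poisson rates
k = rng.poisson(lam)                     # mixed Poisson counts

prob = 1.0 / (1.0 + theta)
max_err = max(abs(np.mean(k == v) - nbinom.pmf(v, r, prob))
              for v in (0, 2, 5))
print(max_err)  # small: empirical pmf matches negative binomial
```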

Gamma Distribution Example

The following examples illustrate how the gamma distribution changes with m and p; the original page plotted the density for p = 2, m = 4 and for p = 4, m = 4 (graphs omitted).