
Sum of Random Variables

In probability theory, the concept of a random variable plays a vital role. Random variables are also known as stochastic variables or aleatory variables. A random variable is a variable whose value is subject to chance; it possesses the quality of randomness. Like any other variable, a random variable may take on any of a set of different values, but each possible value is associated with a probability. The function that describes all possible values of a random variable and their associated probabilities is termed the probability distribution function.

There are two types of random variables - discrete and continuous. A discrete random variable takes values from a finite or countable set, and its distribution is described by a probability mass function. A continuous random variable, on the other hand, takes any numerical value within a specific interval and is described by a probability density function. Probability mass and density functions are the characteristics of probability distributions.

At times, we are required to work with sums of random variables, for example the sum of two incomes. Let $S_{n}$ be the sum of n statistically independent random variables $x_{1}, x_{2}, \ldots, x_{n}$. Then, in general, the sum is denoted by the following relation :
$S_{n} = \sum_{i=1}^{n} x_{i}$

In this section, we will examine the behavior of sums of random variables in different contexts.
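As a quick illustration (this simulation is a sketch, not part of the original text; the choice of uniform variables is an assumption made for concreteness), summing independent random variables is easy to experiment with numerically:

```python
# A minimal sketch: simulating the sum S_n = x_1 + ... + x_n of n independent
# random variables with NumPy, and checking that means and variances add.
import numpy as np

rng = np.random.default_rng(0)

n = 5                      # number of independent random variables
trials = 100_000           # number of simulated realisations of S_n

# Each x_i ~ Uniform(0, 1), drawn independently; sum across the n variables.
x = rng.uniform(0.0, 1.0, size=(trials, n))
s_n = x.sum(axis=1)

# For independent variables, expectations and variances are additive.
print("E[S_n]   ~", s_n.mean(), " (theory:", n * 0.5, ")")
print("Var[S_n] ~", s_n.var(),  " (theory:", n / 12, ")")
```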


Sum of Two Random Variables

Under the topic "sum of random variables", we learn about two main concepts - the sum of two random variables and the sum of more than two random variables. We may study the sum of two random variables in different contexts. In probability theory, the sum of two independent variables usually leads us to the probability mass function or probability density function of the sum of two discrete or continuous variables. When we talk about the sum of two gamma random variables, we refer to the distribution of the sum of two gamma-distributed variables. Thus, sums of two random variables are quite commonly seen in mathematics, especially in statistics and probability theory.

Distribution of Sum of Random Variables

Let us consider two independent random variables (generally discrete) A and B, both of which take only non-negative integer values. Let the function $f_{A}(a)$ represent the probability distribution function of A, and let $f_{B}(b)$ represent the probability distribution function of B. Let their sum be denoted by C, i.e. C = A + B. The distribution of C is then expressed by the following discrete convolution formula.
If the probability distribution function of C is $f_{C}(c)$, then

$f_{C}(c) = f_{A+B} (c) = P(C = c) = \sum_{a=0}^{c} f_{A}(a)\ f_{B}(c - a)$
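As a rough illustration of this convolution formula (the function and variable names below are hypothetical, not part of the article), here is a short Python sketch that computes the pmf of C = A + B for two independent non-negative integer variables:

```python
# Sketch: the discrete convolution f_C(c) = sum_{a=0}^{c} f_A(a) * f_B(c - a).
import numpy as np

def pmf_of_sum(f_a, f_b):
    """Return the pmf of C = A + B for independent, non-negative integer A and B.

    f_a[a] = P(A = a), f_b[b] = P(B = b); the result has length len(f_a) + len(f_b) - 1.
    """
    f_c = np.zeros(len(f_a) + len(f_b) - 1)
    for c in range(len(f_c)):
        for a in range(c + 1):
            if a < len(f_a) and (c - a) < len(f_b):
                f_c[c] += f_a[a] * f_b[c - a]
    return f_c

# Example: A and B are fair six-sided dice taking values 0..5 here
# (shift by 1 if you prefer 1..6); the sum follows the familiar triangular pmf.
die = np.full(6, 1 / 6)
print(pmf_of_sum(die, die))      # pmf of A + B
print(np.convolve(die, die))     # NumPy's convolution gives the same result
```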

Sum of Independent Random Variables

In probability theory, two variables are said to be statistically independent (also randomly independent or stochastically independent) when the value of one variable does not affect the probability distribution of the other. In other words, two random variables are independent of each other if the realization of the former has no effect on the probability distribution of the latter, and vice versa.

Two random variables are said to be independent when all the elements of the systems they generate are independent. More than two random variables are called pairwise independent when each pair of random variables is independent. The distribution of the sum of independent random variables can be calculated from their individual probability distribution functions. In probability theory, the probability that two independent events A and B occur together is called their joint probability. It is denoted by P(A, B) and is given by: P(A, B) = P(A) P(B)
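As a small sketch (the dice events below are illustrative choices, not from the article), the product rule for independent events can be checked by simulation:

```python
# Checking the product rule P(A, B) = P(A) * P(B) for two independent events.
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000

# Two independent dice; event A = "first die shows 6", event B = "second die is even".
d1 = rng.integers(1, 7, size=trials)
d2 = rng.integers(1, 7, size=trials)

a = (d1 == 6)
b = (d2 % 2 == 0)

p_a = a.mean()
p_b = b.mean()
p_ab = (a & b).mean()

print("P(A) * P(B) ~", p_a * p_b)   # about 1/6 * 1/2 = 1/12
print("P(A, B)     ~", p_ab)        # close to the product, as independence predicts
```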

Sum of Gamma Random Variables

Gamma random variables are often seen in statistics and probability theory. The gamma distribution is a family of two-parameter continuous probability distributions. The chi-squared distribution and the exponential distribution are special cases of the gamma distribution.
A gamma-distributed random variable, say P, with shape k and scale $\theta$ is written as below :

$P \sim \Gamma(k, \theta) \equiv \textrm{Gamma}(k, \theta)$

The sum of independent gamma random variables with the same scale parameter is again gamma distributed, i.e. if $P \sim \Gamma(p, \theta)$ and $Q \sim \Gamma(q, \theta)$,

where P and Q denote independent random variables, then

$P + Q \sim \Gamma(p + q, \theta)$
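The following sketch (assuming NumPy and SciPy are available; the parameter values are arbitrary choices, not from the article) empirically checks this closure property of the gamma family:

```python
# Empirically checking that Gamma(p, theta) + Gamma(q, theta) follows
# Gamma(p + q, theta) when P and Q are independent with the same scale theta.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p_shape, q_shape, theta = 2.0, 3.0, 1.5
n = 100_000

p = rng.gamma(shape=p_shape, scale=theta, size=n)
q = rng.gamma(shape=q_shape, scale=theta, size=n)
total = p + q

# Compare the simulated sum against the theoretical Gamma(p + q, theta) law.
ks = stats.kstest(total, "gamma", args=(p_shape + q_shape, 0, theta))
print("KS statistic:", ks.statistic)   # small values indicate a good match
print("sample mean:", total.mean(), " theory:", (p_shape + q_shape) * theta)
```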

Sum of Random Variables Example

Have a look at the following example, which works out the probability density function of the sum of two random variables.
Example : 

If $X_{1}$ and $X_{2}$ are two independent random variables with the same probability density function given by

f(x) = $2e^{-2x}$ if x $\geq$ 0 and f(x) = 0 elsewhere,

Determine the pdf of Y = $X_{1} + X_{2}$

Solution : 

The pdf of the sum of two independent random variables is given by the following convolution integral -

f(y) = $\int_{0}^{y} f_{1}(x)\ f_{2}(y-x)\ dx$ ; y $\geq$ 0 _____(1)

Since 0 $\leq$ x $\leq$ y (the limits of integration run from 0 to y),

Hence y - x $\geq$ 0

So, using f(x) = $2e^{-2x}$ for x $\geq$ 0, we have

$f_{2}(y - x) = 2e^{-2(y-x)}$

Also, since x $\geq$ 0 over the range of integration, we take $f_{1}(x) = 2e^{-2x}$.

Plugging these values in equation (1), we obtain

f(y) = $\int_{0}^{y} 2e^{-2x}\ 2e^{-2(y-x)}\ dx$

= $4 \int_{0}^{y} e^{(-2x -2y + 2x)}\ dx$

= $4 \int_{0}^{y} e^{-2y}\ dx$

= $4e^{-2y} \int_{0}^{y} dx$

= $4e^{-2y} (y)$

f(y) = $4y e^{-2y}$ for y $\geq$ 0

Also, f(y) = 0 for y < 0

Hence the pdf of Y is as shown below.

[Graph of the pdf f(y) = $4y e^{-2y}$ for y $\geq$ 0]
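As a sanity check (not part of the original solution), the derived density can be compared with a simulation of Y = $X_{1} + X_{2}$, where each $X_{i}$ is exponential with rate 2:

```python
# Checking the derived pdf f(y) = 4y * exp(-2y) by simulating Y = X1 + X2
# with X1, X2 ~ Exponential(rate 2), i.e. scale 1/2.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

x1 = rng.exponential(scale=0.5, size=n)   # rate 2  <=>  scale 1/2
x2 = rng.exponential(scale=0.5, size=n)
y = x1 + x2

# Compare a histogram of Y with the derived density at a few points.
hist, edges = np.histogram(y, bins=200, range=(0, 5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

for g in np.linspace(0.05, 3.0, 10):
    i = np.argmin(np.abs(centers - g))
    print(f"y={g:.2f}  simulated={hist[i]:.3f}  4y*exp(-2y)={4 * g * np.exp(-2 * g):.3f}")
```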
