In probability theory and statistics, a **probability distribution** assigns either a probability to each value of a random variable (when the variable is discrete) or a probability to the value falling within a particular interval (when the variable is continuous). Random variables are therefore of two types, discrete and continuous, and the formula for the expectation changes accordingly. The expectation, or mean, of a random variable X is denoted E[X].


For a **discrete random variable**, the mathematical expectation is defined as the sum of the products of the values of the random variable and the probability mass function at those values.

For a **continuous random variable**, the mathematical expectation is the integral of the product of the variable and its probability density function.

Consider a random variable X, and let us assume it takes the values $x_1, x_2, \ldots, x_k$ with probabilities $f(x_1), f(x_2), \ldots, f(x_k)$. Its mathematical expectation, also called the expected value, is obtained as

$E(x) = \sum_{i = 1}^{k} x_i f(x_i)$

$= x_1 f(x_1) + x_2 f(x_2) + \cdots + x_k f(x_k)$
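This sum is straightforward to compute directly. A minimal sketch, using an illustrative pmf (the values below are not from the text):

```python
# E(X) = sum of x_i * f(x_i) for a discrete random variable.
# Illustrative values and pmf; any valid pmf (probabilities summing to 1) works.
values = [1, 2, 3]
pmf = [0.2, 0.5, 0.3]

expectation = sum(x * p for x, p in zip(values, pmf))
print(expectation)  # approximately 2.1
```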

The expectation of the sum of two random variables is the sum of their expectations, provided each exists. If X and Y are any two random variables, then

E [X + Y] = E[X] + E[Y]

This generalizes: if $X_1, X_2, \ldots, X_k$ are a finite number of random variables, then

$E[X_1 + X_2 + \cdots + X_k] = E[X_1] + E[X_2] + \cdots + E[X_k]$

That is, the expectation of the sum of k random variables equals the sum of their individual expectations.

More generally, if a and b are two constants and f and g are two functions, then

E[af(X) + bg(X)] = aE[f(X)] + bE[g(X)]
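A quick numeric sketch of this linearity property, using an illustrative pmf and the example functions f(x) = x² and g(x) = x (none of these values come from the text):

```python
# Check E[a f(X) + b g(X)] = a E[f(X)] + b E[g(X)] on a small pmf.
pmf = {1: 0.3, 2: 0.7}   # illustrative pmf
a, b = 2.0, 5.0          # illustrative constants

def expect(h):
    """E[h(X)] for the discrete pmf above."""
    return sum(h(x) * p for x, p in pmf.items())

lhs = expect(lambda x: a * x**2 + b * x)
rhs = a * expect(lambda x: x**2) + b * expect(lambda x: x)
assert abs(lhs - rhs) < 1e-9
print(lhs)  # approximately 14.7
```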

The expectation of the product of two **independent** random variables equals the product of their individual expectations. If X and Y are two independent random variables, then

E[XY] = E[X] * E[Y]

This property also has a generalized form: the expectation of the product of n mutually independent random variables equals the product of their individual expectations.

$E(X_1 X_2 \cdots X_n) = E(X_1)\, E(X_2) \cdots E(X_n)$, provided all the expectations exist.
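The product rule for independent variables can be checked exactly by enumeration. A sketch for two independent fair dice (an illustrative check of the identity, not a proof):

```python
from fractions import Fraction

# For two independent fair dice X and Y, E[XY] = E[X] * E[Y].
# All 36 outcomes are equally likely, so we can enumerate them exactly.
faces = range(1, 7)
E_X = Fraction(sum(faces), 6)                               # 7/2
E_XY = Fraction(sum(x * y for x in faces for y in faces), 36)

assert E_XY == E_X * E_X   # both sides equal 49/4
print(E_XY)
```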

### Mean of X

The expected value of X, E[X], is also called the mean of X and is denoted by the Greek letter $\mu$:

E[X] = $\mu$

It is also known as the __first moment about the origin__.

### Variance of X

When f(X) = $(X - \mu)^2$, then

E[f(X)] = $E[(X - \mu)^2]$

= $\sum_{x \in S} (x - \mu)^2 f(x)$ is defined as the variance of X. It is denoted by either Var(X) or $\sigma^2$.

### Solved Example

**Question:** Consider the following table of the probability mass function for Y:

| y | f(y) |
|---|------|
| 2 | 0.4  |
| 3 | 0.4  |
| 4 | 0.2  |

Find the variance and the standard deviation of the function.

**Solution:**

Now E(Y) = 2(0.4) + 3(0.4) + 4(0.2) = 2.8

So, $\mu$ = 2.8

So, $\sigma^{2}$ = $E[(Y - \mu)^{2}]$

= $(2 - 2.8)^{2}(0.4) + (3 - 2.8)^{2}(0.4) + (4 - 2.8)^{2}(0.2)$

= 0.256 + 0.016 + 0.288

= 0.56

So the standard deviation,

$\sigma$ = $\sqrt{0.56}$

$\approx$ 0.748
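A quick numeric cross-check of this example, using the pmf from the table (f(2) = 0.4, f(3) = 0.4, f(4) = 0.2):

```python
# Mean, variance, and standard deviation of Y from its pmf.
pmf = {2: 0.4, 3: 0.4, 4: 0.2}

mu = sum(y * p for y, p in pmf.items())
var = sum((y - mu) ** 2 * p for y, p in pmf.items())
sigma = var ** 0.5

print(round(mu, 3), round(var, 3), round(sigma, 3))  # 2.8 0.56 0.748
```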

Variance can also be computed more conveniently as

$\sigma^{2}$ = $E(X^{2}) - \mu^{2}$

By definition,

Var[X] = E [(X - $\mu$)^{2}]

= E[X^{2}] – (E[X])^{2}

So, Var[aX + b] = E[(aX + b)^{2}] - (E[aX + b])^{2}

= E[a^{2}X^{2} + 2abX + b^{2}] - (aE[X] + b)^{2}

= a^{2}E[X^{2}] + 2abE[X] + b^{2} - a^{2}(E[X])^{2} - 2abE[X] - b^{2}

= a^{2}E[X^{2}] - a^{2}(E[X])^{2}

= a^{2}(E[X^{2}] - (E[X])^{2})

= a^{2} Var[X]

Var [aX + b] = a^{2} Var[X]
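The identity Var[aX + b] = a²Var[X] is easy to verify numerically. A sketch using an illustrative pmf and constants (the specific values are assumptions, not from the derivation above):

```python
# Check Var[aX + b] = a^2 * Var[X] on a small discrete pmf.
pmf = {0: 0.1, 1: 0.5, 2: 0.2, 3: 0.2}   # illustrative pmf
a, b = 3.0, 4.0                           # illustrative constants

def expect(g):
    """E[g(X)] for the discrete pmf above."""
    return sum(g(x) * p for x, p in pmf.items())

mu = expect(lambda x: x)
var_X = expect(lambda x: (x - mu) ** 2)

mu_ab = expect(lambda x: a * x + b)
var_aXb = expect(lambda x: (a * x + b - mu_ab) ** 2)

assert abs(var_aXb - a * a * var_X) < 1e-9
print(round(var_X, 3), round(var_aXb, 3))  # 0.85 7.65
```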

The extent to which any two random variables change together, or vary together, is called their covariance. It measures the association between them.

### Covariance Equation

When we consider n pairs of observations of two random variables X and Y, $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, the covariance between X and Y, denoted Cov(X, Y), can be measured using the covariance equation.

Cov(X, Y) = E {[X - E(X)] [Y - E(Y)]}

where E(X) and E(Y) are the means of X and Y, respectively.

So, it is the expected value of the product $\bar{X}\bar{Y}$, where $\bar{X}$ is the deviation of X from its mean, that is X - E(X), and $\bar{Y}$ is the deviation of Y from its mean, that is Y - E(Y).

1. $\bar{X}\bar{Y}$ is **positive** when

- both X and Y are above their respective means, or
- both are below their respective means.

2. $\bar{X}\bar{Y}$ is **negative** when

- one of them is above and the other is below their respective means.
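The covariance equation can be sketched directly, treating each observed pair as equally likely (the data below is illustrative):

```python
# Cov(X, Y) = E[(X - E(X)) * (Y - E(Y))] over equally likely pairs.
xs = [1.0, 2.0, 3.0, 4.0]   # illustrative observations of X
ys = [2.0, 4.0, 6.0, 8.0]   # illustrative observations of Y

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n

print(cov)  # positive, since ys rises with xs
```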

Given below are some examples on mathematical expectation.

### Solved Examples

**Question 1:** If the random variable X is the up face of an unbiased die, then the probability mass function of X is

f(x) = $\frac{1}{6}$ for x = 1, 2, 3,…....6.

**Solution:**

The expected value of X is

E(X) = 1 $\left(\frac{1}{6}\right)$ + 2 $\left(\frac{1}{6}\right)$ + 3 $\left(\frac{1}{6}\right)$ + 4 $\left(\frac{1}{6}\right)$ + 5 $\left(\frac{1}{6}\right)$ + 6 $\left(\frac{1}{6}\right)$

= 3.5

The expected value of X is a theoretical value; it need not equal any value the die can actually show.

**Question 2:** Consider the table that contains the probability mass function of the discrete random variable X:

| x | f(x) |
|---|------|
| 0 | 0.1  |
| 1 | 0.5  |
| 2 | 0.2  |
| 3 | 0.2  |

Calculate E(3), E(3X), E(4 + X), and E(4 + 3X).

**Solution:**

1. Calculate E(3)

E(3) = $\sum_{i = 0}^{3} 3 f(x_i)$

= 3 $\sum_{i = 0}^{3} f(x_i)$

= 3[0.1 + 0.5 + 0.2 + 0.2]

= 3[1.0]

= 3

The expected value of a constant is the constant value itself.

If c is any constant, then E(c) = c.

2. Calculate E(3X)

E(3X) = $\sum_{i = 0}^{3} 3 x_i f(x_i)$

= 3 $\sum_{i = 0}^{3} x_i f(x_i)$

= 3[0(0.1) + 1(0.5) + 2(0.2) + 3(0.2)]

= 3[0 + 0.5 + 0.4 + 0.6]

= 3[1.5]

= 4.5

This can also be calculated by multiplying E(X) by 3:

So E (3X) = 3E(X)

By the property that if c is a constant and f is any function, then

E [cf(X)] = cE[f(X)]

3. Calculate E(4 + X)

Now by definition

E(4 + X) = $\sum_{i = 0}^{3} (4 + x_i) f(x_i)$

= [(4 + 0)(0.1) + (4 + 1)(0.5) + (4 + 2)(0.2) + (4 + 3)(0.2)]

= [(4)(0.1) + (5)(0.5) + (6)(0.2) + (7)(0.2)]

= [0.4 + 2.5 + 1.2 + 1.4]

= 5.5

This can also be calculated by adding 4 to E(X):

So, E (4 + X) = 4 + E(X)

Again by the property that if c is a constant and f is any function, then

E [c + f(X)] = c + E[f(X)]

4. Calculate E (4 + 3X)

Now, by definition

E(4 + 3X) = $\sum_{i = 0}^{3} (4 + 3x_i) f(x_i)$

= [(4 + 3(0))(0.1) + (4 + 3(1))(0.5) + (4 + 3(2))(0.2) + (4 + 3(3))(0.2)]

= [(4)(0.1) + (7)(0.5) + (10)(0.2) + (13)(0.2)]

= [0.4 + 3.5 + 2 + 2.6]

= 8.5

This can also be calculated by adding 4 to 3E(X):

So E(4 + 3X) = 4 + 3E(X),

because $E(c_1 X + c_2) = c_1 E(X) + c_2$.
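The four results of Question 2 can all be re-derived from the pmf in its table (x = 0, 1, 2, 3 with f(x) = 0.1, 0.5, 0.2, 0.2) and the linearity property:

```python
# Verify E(3X) = 3E(X), E(4 + X) = 4 + E(X), E(4 + 3X) = 4 + 3E(X)
# using the pmf from Question 2.
pmf = {0: 0.1, 1: 0.5, 2: 0.2, 3: 0.2}

E_X = sum(x * p for x, p in pmf.items())   # 1.5

assert abs(sum(3 * x * p for x, p in pmf.items()) - 3 * E_X) < 1e-9           # E(3X) = 4.5
assert abs(sum((4 + x) * p for x, p in pmf.items()) - (4 + E_X)) < 1e-9       # E(4+X) = 5.5
assert abs(sum((4 + 3 * x) * p for x, p in pmf.items()) - (4 + 3 * E_X)) < 1e-9  # E(4+3X) = 8.5
print("linearity checks pass")
```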

