
Mathematical Expectation

In probability theory and statistics, a probability distribution assigns either a probability to each value of a random variable (when the variable is discrete) or a probability to the value falling within a particular interval (when it is continuous). Random variables are therefore of two types, discrete and continuous, and the formula for the expectation changes according to the type. The expectation, or mean, of a random variable X is denoted E[X].


Mathematical Expectation Definition

In the discrete type of random variable, the definition of mathematical expectation is given as the sum of the product of the random variables and the probability mass function of those random variables.

For the continuous type of random variable, the mathematical expectation can be written as the integral of the product of the random variables and the probability density function of those random variables.

Consider a random variable X and let us assume it takes on the values $x_1, x_2, \ldots, x_k$ with probabilities $f(x_1), f(x_2), \ldots, f(x_k)$. Then its mathematical expectation, also called the expected value, is obtained as

$E(x) = \sum_{i = 1}^{k} x_i f(x_i)$

$= x_1 f(x_1) + x_2 f(x_2) + \ldots + x_k f(x_k)$
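The sum above translates directly into code. The helper below is an illustrative sketch (the function name and example distribution are choices made here, not taken from the text):

```python
# E(X) = x1*f(x1) + x2*f(x2) + ... + xk*f(xk) for a discrete random variable.
def expectation(values, probs):
    """Return the expected value of a discrete distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

# Example: a fair coin coded as 0/1 has E(X) = 0*(0.5) + 1*(0.5) = 0.5.
print(expectation([0, 1], [0.5, 0.5]))  # 0.5
```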

The expectation of the sum of two random variables is the sum of their expectations, provided each of them exists. If X and Y are any two random variables, then

E [X + Y] = E[X] + E[Y]

More generally, if $X_1, X_2, \ldots, X_k$ are a finite number of random variables, then

$E[X_1 + X_2 + \ldots + X_k] = E[X_1] + E[X_2] + \ldots + E[X_k]$

That is, the mathematical expectation of the sum of k random variables is equal to the sum of their individual expectations.

In the case of two functions, if a and b are two constants and f and g are two functions, then
E[af(X) + bg(X)] = aE[f(X)] + bE[g(X)]
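This linearity property can be sketched numerically; the pmf and the choices of a, b, f, and g below are arbitrary examples:

```python
# Check E[a*f(X) + b*g(X)] = a*E[f(X)] + b*E[g(X)] on a small discrete pmf.
values, probs = [1, 2, 3], [0.2, 0.5, 0.3]
a, b = 2.0, -1.0
f = lambda x: x * x   # an arbitrary function f
g = lambda x: x + 1   # an arbitrary function g

lhs = sum((a * f(x) + b * g(x)) * p for x, p in zip(values, probs))
rhs = (a * sum(f(x) * p for x, p in zip(values, probs))
       + b * sum(g(x) * p for x, p in zip(values, probs)))
print(abs(lhs - rhs) < 1e-9)  # True
```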

Multiplication Theorem of Expectation

The mathematical expectation of the product of two independent random variables is equal to the product of their individual expectations. If X and Y are two independent random variables, then

E[XY] = E[X] * E[Y]

This property also has a generalized form, which states that the mathematical expectation of the product of n independent random variables is equal to the product of their individual expectations:

$E(X_1 X_2 \ldots X_n) = E(X_1)\, E(X_2) \ldots E(X_n)$, provided all the expectations exist.
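A sketch of the theorem with two independent fair dice (a standard example, not from the text): each joint outcome has probability 1/36, and E[XY] matches E[X]·E[Y] = 3.5 × 3.5:

```python
from itertools import product

faces = range(1, 7)
p = 1 / 6  # probability of each face of a fair die

e_x = sum(x * p for x in faces)  # E[X] = 3.5
# Independence: the joint probability factors as p * p for every pair.
e_xy = sum(x * y * p * p for x, y in product(faces, repeat=2))
print(abs(e_xy - e_x * e_x) < 1e-9)  # True
```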

Mean of X

The expected value of X, E[X], is also called the mean of X and is denoted by the Greek letter $\mu$:

E[X] = $\mu$

It is also known as the first moment about the origin.

Variance of X

When $f(X) = (X - \mu)^{2}$, then

$E[f(X)] = E[(X - \mu)^{2}]$

$= \sum_{x \in S} (x - \mu)^{2} f(x)$

is defined as the variance of X. It is denoted by either Var(X) or $\sigma^{2}$.
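The definition translates directly into code; the helper below is a hypothetical sketch over a finite pmf:

```python
# Var(X) = E[(X - mu)^2], summed over the support of a discrete pmf.
def variance(values, probs):
    mu = sum(x * p for x, p in zip(values, probs))  # E[X]
    return sum((x - mu) ** 2 * p for x, p in zip(values, probs))

# Example: a fair coin coded as 0/1 has variance 0.25.
print(variance([0, 1], [0.5, 0.5]))  # 0.25
```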

Solved Example

Question: Consider the following table of probability mass function for Y

y     f(y)
2     0.4
3     0.4
4     0.2

Find the variance and the standard deviation of the function.
Solution:


Now E(Y) = 2(0.4) + 3(0.4) + 4(0.2) = 2.8

So, $\mu$ = 2.8

So, $\sigma^{2}$ = $E[(Y - \mu)^{2}]$

= $(2 - 2.8)^{2}(0.4) + (3 - 2.8)^{2}(0.4) + (4 - 2.8)^{2}(0.2)$

= 0.256 + 0.016 + 0.288

= 0.56

So the standard deviation,

$\sigma$ = $\sqrt{0.56}$

= 0.748
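The example can be checked mechanically from the pmf in the table, f(2) = 0.4, f(3) = 0.4, f(4) = 0.2:

```python
import math

# Mean, variance, and standard deviation of Y from its pmf.
values, probs = [2, 3, 4], [0.4, 0.4, 0.2]
mu = sum(y * p for y, p in zip(values, probs))
var = sum((y - mu) ** 2 * p for y, p in zip(values, probs))
sd = math.sqrt(var)
print(round(mu, 4), round(var, 4), round(sd, 4))  # 2.8 0.56 0.7483
```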

Variance can also be written in a more convenient form:

$\sigma^{2} = E(X^{2}) - \mu^{2}$

By definition,
$Var[X] = E[(X - \mu)^{2}]$

$= E[X^{2}] - (E[X])^{2}$

So, $Var[aX + b] = E[(aX + b)^{2}] - (E[aX + b])^{2}$

$= E[a^{2}X^{2} + 2abX + b^{2}] - (E[aX] + E[b])^{2}$

$= E[a^{2}X^{2}] + E[2abX] + E[b^{2}] - (E[aX])^{2} - 2E[aX]E[b] - (E[b])^{2}$

$= E[a^{2}X^{2}] - (E[aX])^{2}$

$= a^{2}E[X^{2}] - a^{2}(E[X])^{2}$

$= a^{2}(E[X^{2}] - (E[X])^{2})$

$= a^{2} Var[X]$

$Var[aX + b] = a^{2} Var[X]$
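A quick numerical check of this identity, using an arbitrary pmf and arbitrary constants a and b:

```python
# Check Var[aX + b] = a^2 * Var[X] on a small discrete pmf.
values, probs = [1, 2, 3], [0.2, 0.5, 0.3]
a, b = 3.0, 7.0  # arbitrary constants

def var(vals, ps):
    mu = sum(v * p for v, p in zip(vals, ps))
    return sum((v - mu) ** 2 * p for v, p in zip(vals, ps))

# Transforming X to aX + b shifts the mean but scales the variance by a^2.
lhs = var([a * v + b for v in values], probs)
rhs = a ** 2 * var(values, probs)
print(abs(lhs - rhs) < 1e-9)  # True
```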

Covariance

The extent to which two random variables change together, or vary together, is called covariance. It is a measure of the association between them.

Covariance Equation

When we consider paired observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ of two random variables X and Y, the covariance between X and Y, denoted Cov(X, Y), can be measured using the covariance equation.

Cov(X, Y) = E {[X - E(X)] [Y - E(Y)]}

where E(X) and E(Y) are the means of X and Y, respectively.

So it is the expected value of the product $\bar{X}\bar{Y}$, where $\bar{X}$ is the deviation of X from its mean, X - E(X), and $\bar{Y}$ is the deviation of Y from its mean, Y - E(Y).

1. $\bar{X}\bar{Y}$ is positive in two cases:
• both variables are above their respective means, or
• both variables are below their respective means.
In either case, $\bar{X}$ and $\bar{Y}$ have the same sign when the deviations from the means are calculated.

2. $\bar{X}\bar{Y}$ is negative when:
• one variable is above and the other is below its respective mean.
Then $\bar{X}$ and $\bar{Y}$ have opposite signs when the deviations from the means are calculated. Hence, the covariance between two variables X and Y provides a measure of the degree to which X and Y tend to move together.
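This sign behaviour can be seen numerically. The joint pmf below is a made-up illustration in which X and Y tend to move together, so the covariance comes out positive:

```python
# Cov(X, Y) = E[(X - E[X]) * (Y - E[Y])], computed over a joint pmf.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
cov = sum((x - e_x) * (y - e_y) * p for (x, y), p in joint.items())
print(cov > 0)  # True: X and Y tend to be above/below their means together
```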

Mathematical Expectation Examples

Given below are some of the examples on Mathematical Expectation.

Solved Examples

Question 1: If the random variable X is the up face of an unbiased die, then the probability mass function of X is
f(x) = $\frac{1}{6}$ for x = 1, 2, 3, ..., 6.
Solution:

The expected value of X, is

E(X) = 1 $\left(\frac{1}{6}\right)$ + 2 $\left(\frac{1}{6}\right)$ + 3 $\left(\frac{1}{6}\right)$ + 4 $\left(\frac{1}{6}\right)$ + 5 $\left(\frac{1}{6}\right)$ + 6 $\left(\frac{1}{6}\right)$

= 3.5

The expected value of X is a theoretical long-run average; it need not be a value the die can actually show.

Question 2: Consider the table that contains the probability mass function of the discrete random variable X.

x     f(x)
0     0.1
1     0.5
2     0.2
3     0.2

Calculate E(3), E(3X), E(4 + X), and E(4 + 3X).

Solution:

1. Calculate E(3)

E(3) = $\sum_{i = 0}^{3}$ 3 f(x$_i$)

= 3 $\sum_{i = 0}^{3}$ f(x$_i$)

= 3[0.1 + 0.5 + 0.2 + 0.2]

= 3[1.0]

= 3

The expected value of a constant is the constant value itself.
If c is any constant, then E(c) = c.

2. Calculate E(3X)

E(3X) = $\sum_{i = 0}^{3}$ 3x$_i$ f(x$_i$)

= 3 $\sum_{i = 0}^{3}$ x$_i$ f(x$_i$)

= 3[0(0.1) + 1(0.5) + 2(0.2) + 3(0.2)]

= 3[0 + 0.5 + 0.4 + 0.6]

= 3[1.5]

= 4.5

This can also be calculated by multiplying E(X) by 3.

So E (3X) = 3E(X)

So by property if c is a constant and if f is any function, then
E [cf(X)] = cE[f(X)]

3. Calculate E(4 + X)

Now by definition
E(4 + X) = $\sum_{i = 0}^{3}$ (4 + x$_i$) f(x$_i$)

= [(4 + 0)(0.1) + (4 + 1)(0.5) + (4 + 2)(0.2) + (4 + 3)(0.2)]

= [(4)(0.1) + (5)(0.5) + (6)(0.2) + (7)(0.2)]

= [0.4 + 2.5 + 1.2 + 1.4]

= 5.5

This can also be calculated by adding 4 to E(X).

So, E (4 + X) = 4 + E(X)

So, by property if c is a constant and if f is any function, then
E [c + f(X)] = c + E[f(X)]

4. Calculate E (4 + 3X)

Now, by definition
E(4 + 3X) = $\sum_{i = 0}^{3}$ (4 + 3x$_i$) f(x$_i$)

= [(4 + 3(0))(0.1) + (4 + 3(1))(0.5) + (4 + 3(2))(0.2) + (4 + 3(3))(0.2)]

= [(4)(0.1) + (7)(0.5) + (10)(0.2) + (13)(0.2)]

= [0.4 + 3.5 + 2 + 2.6]

= 8.5

This can also be calculated by adding 4 to 3E(X).

So, E(4 + 3X) = 4 + 3E(X).

Because $E(c_1 X + c_2) = c_1 E(X) + c_2$.
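All four results in this question follow from the property E(c₁X + c₂) = c₁E(X) + c₂, which can be verified against the pmf from the table:

```python
# pmf of X from Question 2; E(X) = 0(0.1) + 1(0.5) + 2(0.2) + 3(0.2) = 1.5.
values, probs = [0, 1, 2, 3], [0.1, 0.5, 0.2, 0.2]
e_x = sum(x * p for x, p in zip(values, probs))

def e_linear(c1, c2):
    """E(c1*X + c2) computed directly from the pmf."""
    return sum((c1 * x + c2) * p for x, p in zip(values, probs))

# E(3), E(3X), E(4 + X), E(4 + 3X) expressed as (c1, c2) pairs.
for c1, c2 in [(0, 3), (3, 0), (1, 4), (3, 4)]:
    assert abs(e_linear(c1, c2) - (c1 * e_x + c2)) < 1e-9
print(round(e_linear(3, 4), 10))  # 8.5
```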
