Among the various notions of convergence studied in probability theory, convergence in probability is one of the most frequently encountered. It captures the idea that the probability of an unusual outcome becomes smaller and smaller as the sequence progresses. Problems involving convergence in probability appear throughout statistics and probability theory; in particular, it is the mode of convergence asserted by the weak law of large numbers. Let us go ahead and learn about convergence in probability in this article.


Convergence in probability is denoted by writing "P" (or "p") over the arrow that denotes convergence:

$X_{n}\ \overset{P}{\rightarrow}\ X$

or by

$\text{plim}_{n \rightarrow \infty}\ X_{n} = X$

A sequence of random variables $X_{1}, X_{2}, ..., X_{n}, ...$ is said to converge in probability to a random variable X if $\lim_{n \rightarrow \infty} P(|X_{n}-X| \geq \epsilon) = 0$ for every $\epsilon > 0$.
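The definition above can be illustrated with a small simulation. This is a minimal sketch, not from the article: by the weak law of large numbers, the sample mean of i.i.d. fair-coin flips converges in probability to 0.5, so the deviation probability $P(|\bar{X}_n - 0.5| \geq \epsilon)$ should shrink as n grows. The function name and parameter values are illustrative choices.

```python
import random

def prob_deviation(n, eps=0.1, trials=2000, seed=0):
    """Estimate P(|sample mean of n fair-coin flips - 0.5| >= eps) by simulation."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        # Sample mean of n Bernoulli(0.5) variables.
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            count += 1
    return count / trials

# The deviation probability decreases as n increases, as the
# definition of convergence in probability requires.
p_small = prob_deviation(10)
p_large = prob_deviation(1000)
```

For n = 10 the sample mean still deviates from 0.5 by 0.1 or more quite often, while for n = 1000 such deviations are extremely rare.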

Following are some properties of convergence in probability.

**(1)** Convergence in probability does not mean that the individual realizations of the sequence $X_{n}$ converge to those of X as n tends to infinity.

**(2)** It is possible that $|X_{n}-X| > \epsilon$ for infinitely many n (n ranging over the positive integers), even though the probability of such deviations tends to zero.

**(3)** Almost sure convergence is NOT implied by convergence in probability: convergence in probability requires $\lim_{n \rightarrow \infty} P(|X_{n}-X| \geq \epsilon) = 0$ for every $\epsilon > 0$, whereas almost sure convergence requires $P(\lim_{n \rightarrow \infty} X_{n} = X) = 1$.

If $X_{n}\ \overset{P}{\rightarrow}\ X$ and $m$ is a continuous function, then $m(X_{n})\ \overset{P}{\rightarrow}\ m(X)$ (the continuous mapping theorem).

Convergence in probability always implies convergence in distribution. On the other hand, convergence in distribution implies convergence in probability only when the limiting random variable is a constant. Hence, if X is constant, convergence in probability and convergence in distribution are equivalent.

When the sequence $X_{n}$ converges in probability to a random variable X, it means that $X_{n}$ gets closer and closer to X, with very high probability, as n increases. Intuitively, when we say $X_{n}$ converges in distribution to X, we mean that the distribution of $X_{n}$ gets closer and closer to the distribution of X as n increases.

An example of convergence in probability is discussed below. Let $X_{1}, X_{2}, ...$ be random variables, each uniformly distributed (say on the interval [0, 1]). Could $X_{i}$ converge in probability to a constant c? That would require

$\lim_{i \rightarrow \infty} P(|X_{i}-c| \geq \epsilon) = 0$ ; $\forall \ \epsilon > 0$

which means that, with very high probability, we would find $X_{i}$ concentrated only around the number c. But the given distribution is uniform and is the same for every i, so this assumption is not possible. Hence, $X_{i}$ cannot converge to c in probability.
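The argument above can be checked numerically. This is a sketch under an assumption: we take the $X_{i}$ to be i.i.d. Uniform(0, 1), a concrete choice consistent with the text. Since the distribution of $X_{i}$ does not depend on i, the deviation probability $P(|X_{i} - c| \geq \epsilon)$ is a fixed positive number and cannot tend to zero.

```python
import random

def deviation_prob(c, eps, trials=100000, seed=1):
    """Estimate P(|X - c| >= eps) for X ~ Uniform(0, 1) by simulation.

    Because every X_i has this same distribution, the estimate does not
    depend on i: the limit over i is this constant, not 0.
    """
    rng = random.Random(seed)
    hits = sum(abs(rng.random() - c) >= eps for _ in range(trials))
    return hits / trials

# For c = 0.5 and eps = 0.25 the exact value is
# P(X <= 0.25) + P(X >= 0.75) = 0.5, a positive constant.
p = deviation_prob(0.5, 0.25)
```

Since the deviation probability stays near 0.5 for every i, the limit in the definition is 0.5 rather than 0, confirming that $X_{i}$ does not converge to c in probability.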

Now consider instead a sequence $Y_{i} = \frac{1}{i}$, so that $|Y_{i}| \leq \frac{1}{i}$.

Eventually $Y_{i}$ is very close to zero, as i goes to infinity.

Thus, $Y_{i}$ should converge to zero in probability.

If this is the case, then

$P(|Y_{i} - 0| \geq \epsilon)$

$= P(|Y_{i}| \geq \epsilon)$

$= P\left(\frac{1}{i} \geq \epsilon\right) = \left\{\begin{matrix} 1 & \text{if}\ i \leq \frac{1}{\epsilon} \\ 0 & \text{if}\ i > \frac{1}{\epsilon} \end{matrix}\right.$

This indicates that as soon as i is large enough compared to $\frac{1}{\epsilon}$, the quantity $P(|Y_{i}| \geq \epsilon)$ equals zero.

Hence, $\lim_{i \rightarrow \infty} P(|Y_{i}-0| \geq \epsilon) = 0$,

which shows that $Y_{i}$ converges to zero in probability.
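The piecewise formula above can be sketched in code. This assumes, as the formula does, that $Y_{i} = \frac{1}{i}$ with probability one, so the deviation probability is just an indicator of whether $\frac{1}{i} \geq \epsilon$; the function name is illustrative.

```python
def deviation_indicator(i, eps):
    """P(|Y_i - 0| >= eps) when Y_i = 1/i with probability one.

    This is 1 while 1/i >= eps (i.e. i <= 1/eps) and 0 once i > 1/eps,
    matching the piecewise formula in the text.
    """
    return 1 if 1.0 / i >= eps else 0

eps = 0.01
probs = [deviation_indicator(i, eps) for i in (1, 50, 100, 101, 1000)]
# 1/i >= 0.01 exactly when i <= 100, so probs is [1, 1, 1, 0, 0]:
# the deviation probability is eventually 0, giving convergence to zero
# in probability.
```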
