# Convergence in Probability

Among the various notions of convergence studied in probability theory, convergence in probability is one of the most frequently encountered. It is based on the idea that the probability of an unusual outcome becomes smaller as the sequence progresses. Problems involving convergence in probability appear throughout statistics and probability theory; it is the mode of convergence established by the weak law of large numbers. In this article we define convergence in probability, list its main properties, and work through an example.


## Definition

Convergence in probability is denoted by writing "p" or "P" over the arrow that denotes convergence:

$X_{n}\ \overset{P}{\rightarrow}\ X$

or by

$\text{plim}_{n \rightarrow \infty}\ X_{n} = X$

A sequence of random variables $X_{1}, X_{2}, \ldots$ is said to converge in probability to a random variable $X$ if $\lim_{n \rightarrow \infty} P(|X_{n}-X| \geq \epsilon) = 0$ for every $\epsilon > 0$.
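The definition can be checked numerically. The sketch below is a small Monte Carlo illustration of the weak law of large numbers: the sample mean of $n$ independent $\text{Unif}[-1, 1]$ draws converges in probability to the true mean $0$, so the estimated deviation probability shrinks as $n$ grows. The function name `prob_deviation` and the specific parameters are illustrative choices, not part of the original text.

```python
import random

def prob_deviation(n, eps, trials=2000, seed=0):
    """Monte Carlo estimate of P(|mean(X_1..X_n) - 0| >= eps)
    for independent X_i ~ Uniform(-1, 1); the true mean is 0."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        mean = sum(rng.uniform(-1, 1) for _ in range(n)) / n
        if abs(mean) >= eps:
            count += 1
    return count / trials

# The estimated probability shrinks as n grows (weak law of large numbers).
for n in (10, 100, 1000):
    print(n, prob_deviation(n, eps=0.1))
```

Running this shows the deviation probability dropping toward zero as $n$ increases, which is exactly the statement $\bar{X}_{n} \overset{P}{\rightarrow} 0$.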

## Properties

The following are key properties of convergence in probability.

(1) Convergence in probability does not mean that the realized sequence $X_{n}(\omega)$ converges to $X(\omega)$ for every outcome $\omega$ as $n$ tends to infinity.

(2) Even when $X_{n} \overset{P}{\rightarrow} X$, it is possible that $|X_{n}-X| > \epsilon$ for infinitely many $n$; convergence in probability only requires that such deviations become increasingly unlikely.

(3) Convergence in probability does NOT imply almost sure convergence: convergence in probability requires $\lim_{n \rightarrow \infty} P(|X_{n}-X| \geq \epsilon) = 0$ for every $\epsilon > 0$, while almost sure convergence requires $P(\lim_{n \rightarrow \infty} X_{n} = X) = 1$.

(4) Let $m(\cdot)$ be a continuous function. Then, by the continuous mapping theorem:

if $X_{n}\ \overset{P}{\rightarrow}\ X$, then $m(X_{n})\ \overset{P}{\rightarrow}\ m(X)$.

(5) Convergence in probability defines a topology on the space of random variables over a fixed probability space.
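The continuous mapping theorem in property (4) can be illustrated with a small simulation. In this sketch (the construction and the helper name `deviation_prob` are illustrative assumptions) we take $X \sim \text{Unif}[-1, 1]$ and $X_{n} = X + U_{n}/n$ with $U_{n} \sim \text{Unif}[-1, 1]$ independent, so $|X_{n} - X| \leq 1/n$ and $X_{n} \overset{P}{\rightarrow} X$; the theorem then predicts $X_{n}^{2} \overset{P}{\rightarrow} X^{2}$.

```python
import random

def deviation_prob(n, eps, m=lambda x: x, trials=2000, seed=1):
    """Estimate P(|m(X_n) - m(X)| >= eps) where X ~ Uniform(-1, 1)
    and X_n = X + U_n / n with U_n ~ Uniform(-1, 1) independent.
    Here |X_n - X| <= 1/n, so X_n -> X in probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.uniform(-1, 1)
        xn = x + rng.uniform(-1, 1) / n
        if abs(m(xn) - m(x)) >= eps:
            hits += 1
    return hits / trials

# Both the raw sequence and its image under m(x) = x^2 converge in probability.
for n in (1, 10, 100):
    print(n, deviation_prob(n, 0.1), deviation_prob(n, 0.1, m=lambda x: x * x))
```

For large $n$ both estimated probabilities hit zero, matching the theorem's conclusion for the continuous map $m(x) = x^{2}$.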

## Relation to Convergence in Distribution

Convergence in probability always implies convergence in distribution. Conversely, convergence in distribution implies convergence in probability only when the limiting random variable is a constant. Hence, if $X$ is constant, the two notions of convergence are equivalent.

When the sequence $X_{n}$ converges in probability to a random variable $X$, it means that for large $n$, $X_{n}$ is close to $X$ with very high probability. Intuitively, when $X_{n}$ converges in distribution to $X$, it means that the distribution of $X_{n}$ gets closer and closer to the distribution of $X$ as $n$ increases.

## Examples

An example of convergence in probability is worked out below.

Example: Let $X$ be a random variable with the uniform distribution on $[-1, 1]$, i.e. $X \sim \text{Unif}[-1, 1]$. Let $X_{1}, X_{2}, \ldots$ be a sequence of independent random variables, each with the same distribution as $X$. Determine the following:

(a) Does the sequence $X_{i}$ converge in probability to some number $c$, i.e. $X_{i} \overset{P}{\rightarrow} c$ as $i \rightarrow \infty$?

(b) Does the sequence $Y_{i}$ defined by $Y_{i} = \frac{X_{i}}{i}$ converge in probability?

Solution :

(a) Suppose $X_{i}$ converges to $c$ in probability. Then, by definition, we should have

$\lim_{i \rightarrow \infty} P(|X_{i}-c| \geq \epsilon) = 0 \ ; \ \forall \ \epsilon > 0$

which means that, for large $i$, $X_{i}$ would be concentrated around the number $c$ with very high probability. But every $X_{i}$ has the same uniform distribution on $[-1, 1]$, so $P(|X_{i}-c| \geq \epsilon)$ does not depend on $i$ and cannot shrink to zero. Hence $X_{i}$ cannot converge in probability to any constant $c$.
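The key step in part (a) is that $P(|X_{i}-c| \geq \epsilon)$ is the same for every $i$, so it cannot tend to zero. The sketch below estimates this probability for $c = 0$ and $\epsilon = 0.5$ (the helper name `dev_prob` and these parameter choices are illustrative); by symmetry of $\text{Unif}[-1, 1]$, the exact value is $0.5$ for every $i$.

```python
import random

def dev_prob(i, c, eps=0.5, trials=5000, seed=2):
    """Estimate P(|X_i - c| >= eps) for X_i ~ Uniform(-1, 1).
    Because every X_i has the same distribution, this probability
    does not depend on i, so the index i is unused in the sampling."""
    rng = random.Random(seed)
    return sum(abs(rng.uniform(-1, 1) - c) >= eps for _ in range(trials)) / trials

# The estimate stays near 0.5 no matter how large i is,
# so the limit condition in the definition fails.
print(dev_prob(10, 0.0), dev_prob(10000, 0.0))
```

Because the estimate is bounded away from zero for all $i$, no constant $c$ can be the limit in probability.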

(b) It is clear that
$|X_{i}| \leq 1$

So, $|Y_{i}| \leq \frac{1}{i}$

Eventually $Y_{i}$ is very close to zero as $i$ goes to infinity.

Thus, we expect $Y_{i}$ to converge to zero in probability.

To verify this, fix $\epsilon > 0$. Then

$P(|Y_{i} - 0| \geq \epsilon) = P(|Y_{i}| \geq \epsilon) \leq P\left(\frac{1}{i} \geq \epsilon\right) = \left\{\begin{matrix} 1 & \text{if } i \leq \frac{1}{\epsilon} \\ 0 & \text{if } i > \frac{1}{\epsilon} \end{matrix}\right.$

This shows that as soon as $i$ is large compared to $\frac{1}{\epsilon}$, the probability $P(|Y_{i}| \geq \epsilon)$ equals zero.

Hence, $\lim_{i \rightarrow \infty} P(|Y_{i}-0| \geq \epsilon) = 0$

which shows that $Y_{i}$ converges to zero in probability.
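The argument in part (b) can be checked numerically. The sketch below (the helper name `tail_prob` and the parameter values are illustrative) estimates $P(|Y_{i}| \geq \epsilon)$ for $Y_{i} = X_{i}/i$; since $|Y_{i}| \leq 1/i$, the probability is exactly zero once $i > 1/\epsilon$.

```python
import random

def tail_prob(i, eps=0.05, trials=4000, seed=3):
    """Estimate P(|Y_i| >= eps) with Y_i = X_i / i, X_i ~ Uniform(-1, 1).
    Since |Y_i| <= 1/i, this probability is exactly 0 once i > 1/eps."""
    rng = random.Random(seed)
    return sum(abs(rng.uniform(-1, 1) / i) >= eps for _ in range(trials)) / trials

# The tail probability collapses to zero as i grows past 1/eps = 20.
for i in (1, 10, 100):
    print(i, tail_prob(i))
```

The estimates match the case analysis above: nonzero for small $i$, identically zero once $i$ exceeds $1/\epsilon$.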