
# Random Variables

A real-valued function defined over the sample space of a random experiment is called a random variable associated with that experiment. That is, the values of the random variable correspond to the outcomes of the random experiment. Random variables may be either discrete or continuous.


## Random Variable Definition

A random variable is a rule that assigns a numerical value to each outcome in a sample space. A random variable is said to be discrete if it assumes only specified values in an interval; otherwise it is continuous. When X takes only values such as 1, 2, 3, …, it is a discrete random variable.

## Variate

A variate can be defined as a generalization of the random variable. It has the same properties as a random variable without being tied to any particular probabilistic experiment, but it always obeys a particular probabilistic law.

The numbers showing on repeated independent tosses of a die are all values of the same variate, since the same probabilistic law governs each of them.
1. A variate is called a discrete variate when it cannot assume all the values in the given range.
2. If the variate can assume all the numerical values in the whole range, it is called a continuous variate.

## Continuous Random Variable

If the random variable X can assume an infinite and uncountable set of values, it is said to be a continuous random variable. When X can take any value in a given interval (a, b), it is said to be a continuous random variable on that interval.

### Continuous Random Variable Example

Given below is an example of a continuous random variable.

### Solved Example

Question: Find the mean value of the continuous random variable f(x) = x, 0 $\leq$ x $\leq$ 2.

Solution:

The given continuous function is f(x) = x, 0 $\leq$ x $\leq$ 2.

The formula used for the mean value is

E(X) = $\int_{-\infty}^{\infty} x f(x)\, dx$

E(X) = $\int_{0}^{2} x \cdot x\, dx$

E(X) = $\int_{0}^{2} x^{2}\, dx$

E(X) = $\left[\frac{x^{3}}{3}\right]_{0}^{2}$

E(X) = $\frac{8}{3}$ - 0

E(X) = $\frac{8}{3}$

The mean value of the continuous random variable f(x) = x, 0 $\leq$ x $\leq$ 2, is 8/3.
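The integral above can be checked numerically. The sketch below approximates $E(X) = \int x f(x)\, dx$ with a midpoint Riemann sum (a hand-rolled helper for illustration, not a library routine):

```python
# Numerically verify E(X) = ∫ x·f(x) dx for f(x) = x on [0, 2]
# using a simple midpoint Riemann sum.

def mean_value(f, a, b, n=100_000):
    """Approximate the integral of x*f(x) over [a, b] with a midpoint sum."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) * h for i in range(n))

mean = mean_value(lambda x: x, 0.0, 2.0)
print(round(mean, 4))  # ≈ 8/3 ≈ 2.6667
```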

## Random Variables and Probability Distribution

The probability distribution of a random variable can be
1. A theoretical listing of outcomes and their probabilities.
2. An empirical listing of outcomes and their observed relative frequencies.
3. A subjective listing of outcomes and their subjective probabilities.
The probability that a random variable X takes the value x is called the probability function of X and is denoted by f(x) = P(X = x).

A probability distribution always satisfies two conditions
1. $f(x)\geq 0$
2. $\sum f(x)=1$
The important distributions are given below:
• Binomial distribution
• Poisson distribution
• Bernoulli distribution
• Multinomial distribution
• Hypergeometric distribution
• Exponential distribution
• Normal distribution

## Sum of Random Variables

The sum of random variables is a kind of transformation of probability distributions. Let X and Y be two independent random variables with probability density functions fX(x) and fY(y) respectively. Then the sum Z = X + Y is a random variable with density function fZ(z), where fZ is the convolution of fX and fY.
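The discrete analogue of this convolution is easy to compute directly. The sketch below convolves two probability mass functions, using two fair dice as an illustrative pair of distributions:

```python
# Distribution of Z = X + Y for independent discrete random variables,
# computed as the convolution of their pmfs (dicts of value -> probability).

def convolve_pmf(pX, pY):
    """Return the pmf of X + Y for independent X, Y."""
    pZ = {}
    for x, px in pX.items():
        for y, py in pY.items():
            pZ[x + y] = pZ.get(x + y, 0.0) + px * py
    return pZ

die = {k: 1 / 6 for k in range(1, 7)}   # a fair six-sided die
total = convolve_pmf(die, die)
print(round(total[7], 4))  # P(sum = 7) = 6/36 ≈ 0.1667
```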

Let X1, X2, …, Xn denote a sequence of real-valued random variables and let S = $\sum_{i = 1}^n X_i$. Whenever the distribution of X is absolutely continuous, the distribution of S may be determined as follows.
Let X = (X1, X2, …, Xn). Let $\phi_X$ denote the characteristic function of X and let $\phi_S$ denote the characteristic function of S = $\sum_{i = 1}^n X_i$. Then

$\phi_S$(t) = $\phi_X$(tv), t ∈ R,

where v = (1, 1, …, 1) ∈ R^n.

### Poisson Distribution

A random variable is said to have a Poisson distribution if it takes the values 0, 1, 2, … with probability $P(x) = \frac{m^{x}e^{-m}}{x!}$, x = 0, 1, 2, ….

The Poisson distribution is the limiting form of the binomial distribution when n tends to infinity and p tends to 0 in such a way that np stays finite, say np = m. The Poisson distribution can be used to describe discrete random variables where the probability of occurrence of the event is very small and the total number of possible cases is very large. As such, the Poisson distribution has found applications in queuing theory and in insurance, physics, biology, economics, industry, etc.

### Characteristics of Poisson Distribution

• Poisson distribution is a discrete probability distribution
• If x follows a Poisson distribution, then x takes values 0, 1, 2,… to infinity.
• Poisson distribution has a single parameter, m. When m is known all the terms can be found out.
• Mean and variance of Poisson distribution are equal to m.
• Poisson distribution is a positively skewed distribution.

## Sum of Poisson Variables

The sum of independent Poisson variates is also a Poisson variate. If $x_i$, i = 1, 2, …, are independent Poisson variates with parameters $m_i$ respectively, then $\sum x_i$ is also a Poisson variate with parameter $\sum m_i$.
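This property can be verified numerically: convolving the pmfs of Poisson(m1) and Poisson(m2) over a truncated support should match the pmf of Poisson(m1 + m2) directly. A sketch (a numerical check, not a proof):

```python
import math

def poisson_pmf(x, m):
    return (m ** x) * math.exp(-m) / math.factorial(x)

# Convolve Poisson(m1) and Poisson(m2), then compare with Poisson(m1 + m2).
m1, m2, K = 2.0, 3.0, 60  # illustrative parameters; support truncated at K
conv = [sum(poisson_pmf(j, m1) * poisson_pmf(k - j, m2) for j in range(k + 1))
        for k in range(K)]
direct = [poisson_pmf(k, m1 + m2) for k in range(K)]
print(max(abs(a - b) for a, b in zip(conv, direct)))  # ≈ 0
```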

### Uniform Distribution

A random variable is said to have a (discrete) uniform distribution if its probability function is f(x) = 1/k, where x = a, a + h, a + 2h, …, a + (k - 1)h, with a and h fixed real numbers and k a fixed positive integer.

This distribution occurs in practice when, under the given experimental conditions, the different values of the random variable are equally likely. Thus, if a true die is thrown, the number of points on the uppermost face takes the values 1, 2, …, 6, which are equally probable. Again, if digits are selected from a true random number table, the values 0, 1, …, 9 are equally probable.

### Characteristics of Uniform Distribution

• The uniform distribution is symmetrical.
• The uniform distribution is highly platykurtic.
• All the odd central moments of the uniform distribution are zero.

## Sum of Uniform Variables

The sum of independent uniform random variates is not a uniform variate. If X and Y are two independent Uniform(0, 1) variates, then the sum Z = X + Y has the triangular probability density

f(z) = z if 0 $\leq$ z $\leq$ 1

= 2 - z if 1 < z $\leq$ 2

= 0 otherwise.
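A quick sanity check on this triangular density is that it integrates to 1. The sketch below evaluates the piecewise formula and integrates it with a midpoint Riemann sum:

```python
# Density of Z = X + Y for independent Uniform(0, 1) variables:
# f(z) = z on [0, 1], 2 - z on (1, 2], 0 otherwise.

def f(z):
    if 0 <= z <= 1:
        return z
    if 1 < z <= 2:
        return 2 - z
    return 0.0

n = 200_000            # midpoint Riemann sum over [0, 2]
h = 2.0 / n
total = sum(f((i + 0.5) * h) * h for i in range(n))
print(round(total, 6))  # ≈ 1.0
```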

### Exponential Distribution

A random variable is said to have an exponential distribution if it takes values from 0 to infinity with density $f(x) = \lambda e^{-\lambda x}$, 0 < x < $\infty$ and $\lambda$ > 0.

### Characteristics of Exponential Distribution

• The exponential distribution has the lack-of-memory (memoryless) property.
• The exponential distribution is positively skewed.
• It is leptokurtic.

## Functions of Random Variable

Expectation of X, or the expected value of X, for a discrete variable: let the random variable X assume the values x1, x2, … with corresponding probabilities P(x1), P(x2), …. Then the expected value of the random variable is given by

Expectation of X, E(X) = $\sum$ x P(x).
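The formula $E(X) = \sum x P(x)$ is a one-line computation once the pmf is tabulated. A minimal illustration with a fair six-sided die:

```python
# E(X) = Σ x·P(x) for a discrete random variable;
# here X is the score of a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 3.5
```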

## Transformation of Random Variables

A transformation of a random variable reassigns its values to another variable. The transformation remaps the number line from x to y via a transformation function y = g(x).

### Expectation of X or Expected Value of X for a Continuous Variable:

Let the continuous random variable X have probability density function f(x). Then the expected value of the random variable is given by
Expectation of X, E(X) = $\int$ x f(x) dx

## Convergence of Random Variables

Convergence of random variables in distribution is defined as follows:
Let X and Xn, n ∈ N, be random variables with cdfs F and Fn respectively. Then the sequence Xn converges to X in distribution, written Xn → X, if

$\lim_{n \to \infty }$ Fn(x) = F(x), for every x ∈ R at which F is continuous.

## Geometric Random Variable

Geometric random variables are said to be memoryless: the probability that the first success occurs n trials from now does not depend on how many failures have already occurred. If the random variable X has probability function given by

P(X = k) = $p(1 - p)^{k} = pq^{k}$ for k = 0, 1, 2, …, where q = 1 - p and 0 < p < 1, then X is called a geometric random variable with parameter p. For the geometric random variable X to equal k, exactly k failures must occur before the first success.
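The memoryless property can be checked directly from the tail probabilities $P(X \geq n) = (1-p)^n$. A sketch verifying $P(X \geq m + n \mid X \geq m) = P(X \geq n)$ for illustrative values of p, m, and n:

```python
# Geometric pmf P(X = k) = p(1 - p)^k, k = 0, 1, 2, ...
# Check the memoryless property numerically.
p = 0.3  # illustrative success probability

def tail(n):
    """P(X >= n) = (1 - p)^n for the geometric distribution."""
    return (1 - p) ** n

m, n = 4, 5
lhs = tail(m + n) / tail(m)  # P(X >= m + n | X >= m)
rhs = tail(n)                # P(X >= n)
print(abs(lhs - rhs))  # ≈ 0
```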

## Poisson Random Variable

If the process that generates random events in a given unit of time is a Poisson process, and the discrete random variable X counts the number of successes (X = x) occurring in that unit, then X is called a Poisson random variable.

### Sum of Poisson Random Variables

The sum of Poisson random variables is one of the important properties of Poisson distributions. If S is the sum of two independent Poisson random variables, the mean rate of occurrence for the sum, E(S) = $\lambda_s$, is the sum of the contributing mean rates. Let Z1 and Z2 be independent Poisson random variables with means $\lambda_i$ for i = 1, 2.

Then Z = Z1 + Z2 is a Poisson random variable with mean

E(Z) = E(Z1 + Z2) = $\lambda_1$ + $\lambda_2$

## Binomial Random Variable

The random variable of a binomial experiment is defined as the number of successes in the n trials; it is called the binomial random variable.

### Sum of Binomial Random Variables

A linear combination of binomial random variables is generally not a binomial variable, but there is one situation where it is. The sum of independent binomial variables, each with the same success probability, has a binomial distribution. Binomial random variables with same success probability can be added as follows:

If X, Y are independent binomial random variables with nx, ny trials and all have the same success probability p, then the sum X + Y is a binomial random variable with n = nx + ny and success probability p.

If the success probabilities differ, the sum is not a binomial random variable.
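This closure property can be verified numerically: convolving the pmfs of Bin(n1, p) and Bin(n2, p) should reproduce the pmf of Bin(n1 + n2, p). A sketch with illustrative parameters:

```python
import math

def binom_pmf(k, n, p):
    """Binomial pmf P(X = k) for n trials with success probability p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Convolve Bin(n1, p) and Bin(n2, p), then compare with Bin(n1 + n2, p).
n1, n2, p = 4, 6, 0.3  # same success probability p, as the text requires
conv = [sum(binom_pmf(j, n1, p) * binom_pmf(k - j, n2, p)
            for j in range(max(0, k - n2), min(n1, k) + 1))
        for k in range(n1 + n2 + 1)]
direct = [binom_pmf(k, n1 + n2, p) for k in range(n1 + n2 + 1)]
print(max(abs(a - b) for a, b in zip(conv, direct)))  # ≈ 0
```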

## Bernoulli Random Variable

A Bernoulli trial is a random experiment in which there are only two possible outcomes, success or failure. A Bernoulli random variable describes a Bernoulli trial and assumes only two values: 1 for success with probability p, and 0 for failure with probability q = 1 - p.

## Normal Random Variable

The normal distribution is one of the class of continuous distributions. The standard normal distribution has mean 0 and variance 1, and its probabilities can be looked up directly. Any normal random variable can be transformed to a standard normal, and this transformation is used to determine the probabilities of that variable. A normal random variable is also called a Gaussian random variable.
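The standardization step is $z = (x - \mu)/\sigma$. The sketch below applies it and evaluates the standard normal cdf via the error function from the standard library (the parameter values are hypothetical, for illustration only):

```python
import math

# Transform X ~ N(mu, sigma^2) to standard normal Z and evaluate
# P(X <= x) through the erf-based standard normal cdf.

def std_normal_cdf(z):
    """Cdf of the standard normal: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def normal_prob_below(x, mu, sigma):
    z = (x - mu) / sigma  # standardization
    return std_normal_cdf(z)

# Half the mass lies below the mean, whatever mu and sigma are.
print(normal_prob_below(100, mu=100, sigma=15))  # 0.5
```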

## Expected Value of a Random Variable

A random variable, X, is a numerical measure of the outcomes of an experiment. The mean of a random variable is also known as its expected value: the weighted average of all possible values that the random variable can take on. The expected value of a random variable X is denoted by E(X).

## Gamma Random Variable

A random variable X is said to have a gamma distribution with parameters r and $\lambda$ if its probability density function (pdf) is given by:

f(x) = $\begin{cases} \frac{\lambda ^{r}x^{r-1}e^{-\lambda x}}{\Gamma (r)} & x\geq 0\\ 0 & x < 0 \end{cases}$

where $\Gamma$ denotes the gamma function, r and $\lambda$ both are positive real numbers.
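The pdf can be evaluated with the gamma function from the standard library. A sketch checking, with illustrative parameters, that it integrates to approximately 1 over a truncated range:

```python
import math

# Gamma pdf with shape r and rate lambda, using math.gamma for Γ(r).
def gamma_pdf(x, r, lam):
    if x < 0:
        return 0.0
    return (lam**r) * (x**(r - 1)) * math.exp(-lam * x) / math.gamma(r)

r, lam = 3.0, 2.0          # illustrative shape and rate
n, upper = 200_000, 30.0   # midpoint Riemann sum on [0, upper]
h = upper / n
total = sum(gamma_pdf((i + 0.5) * h, r, lam) * h for i in range(n))
print(round(total, 4))  # ≈ 1.0
```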

## Exponential Random Variable

Exponential random variable X with parameter $\lambda$, the probability that an event occurs no later than time t is given by

P[X $\leq$ t] = F(t) = 1 - $e^{-\lambda t}$

The exponential random variable X with parameter $\lambda$ describes the interval between occurrences of the events counted by a Poisson random variable.
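The cdf $F(t) = 1 - e^{-\lambda t}$ is a one-liner; solving $F(t) = 0.5$ gives the median waiting time $\ln 2 / \lambda$. A sketch (the rate value is hypothetical, for illustration):

```python
import math

# Exponential cdf F(t) = 1 - e^{-lambda t}: probability of an
# arrival within time t, plus the median waiting time.
lam = 0.5  # illustrative rate parameter

def expo_cdf(t):
    return 1 - math.exp(-lam * t)

median = math.log(2) / lam  # solves F(t) = 0.5
print(round(expo_cdf(median), 6))  # 0.5
```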

### Sum of Exponential Random Variables

The sum of independent exponential variates is a gamma variate. If $x_i$, i = 1, 2, …, n, are independent exponential variates with parameter $\lambda$, then $\sum x_i$ is a gamma variate with parameters n and $\lambda$. That is, this transformation turns exponential variables into gamma variables.
In some cases the exponential distribution is considered a special case of the gamma distribution. For example, if X and Y are independent exponential random variables with parameter $\beta$ = 1, the sum S = X + Y is a gamma random variable with parameters $\alpha$ = 2 and $\beta$ = 1.

## Jointly Distributed Random Variables

The joint cumulative distribution function of two random variables X and Y is defined as

F(x, y) = P(X $\leq$ x, Y $\leq$ y), (x, y) ∈ R^2.