A real-valued function defined over the sample space of a random experiment is called a random variable associated with that experiment. That is, the values of the random variable correspond to the outcomes of the random experiment. Random variables may be either discrete or continuous.


A random variable is a rule that assigns a numerical value to each outcome in a sample space. A random variable is said to be discrete if it assumes only specified values in an interval; otherwise it is continuous. When X takes only the values 1, 2, 3, …, it is a discrete random variable.

A variate can be defined as a generalization of the random variable. It has the same properties as a random variable without being tied to any particular type of probabilistic experiment, and it always obeys a particular probabilistic law.

For example, the numbers showing in independent tosses of a die are all observations of the same variate, since the same probabilistic law governs each of them.

A numerically valued variable is said to be continuous if, in any unit of measurement, whenever it can take on the values a and b, it can also take on all values between a and b. If the random variable X can assume an infinite and uncountable set of values, it is said to be a continuous random variable. When X takes any value in a given interval (a, b), it is said to be a continuous random variable in that interval.

### Continuous Random Variable: Solved Example

**Question:** Find the mean value for the continuous random variable f(x) = x, 0 $\leq$ x $\leq$ 2.

**Solution:**

The given continuous function is f(x) = x, 0 $\leq$ x $\leq$ 2.

The formula used for the mean value is

E(X) = $\int_{-\infty}^{\infty} x \, f(x) \, dx$

E(X) = $\int_{0}^{2} x \cdot x \, dx$

E(X) = $\int_{0}^{2} x^{2} \, dx$

E(X) = $\left[ \frac{x^{3}}{3} \right]_{0}^{2}$

E(X) = $\frac{8}{3}$ - 0

E(X) = $\frac{8}{3}$

The mean value for the continuous random variable f(x) = x, 0 $\leq$ x $\leq$ 2 is $\frac{8}{3}$.
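The integral above can be sanity-checked numerically; here is a minimal sketch using a midpoint Riemann sum (the helper `mean_value` and its parameters are illustrative, not part of the original text):

```python
# Numerical check of E(X) = integral of x * f(x) over [0, 2] with f(x) = x,
# using a midpoint Riemann sum (standard library only).

def mean_value(f, a, b, n=100_000):
    """Approximate the integral of x * f(x) over [a, b] by the midpoint rule."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h   # midpoint of the i-th subinterval
        total += x * f(x) * h
    return total

print(mean_value(lambda x: x, 0.0, 2.0))  # close to 8/3 ≈ 2.6667
```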

The probability distribution of a random variable can be a theoretical, an empirical, or a subjective listing of outcomes and their probabilities. A probability distribution always satisfies two conditions: $f(x) \geq 0$ and $\sum f(x) = 1$.

Suppose X and Y are independent random variables with density functions f_{X}(x) and f_{Y}(y) respectively, defined for all x. Then the sum Z = X + Y is a random variable with density function f_{Z}(z), where f_{Z} is the convolution of f_{X} and f_{Y}.
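The convolution can be approximated on a grid. A sketch, assuming f_X = f_Y = uniform density on (0, 1), in which case f_Z should be the triangular density on (0, 2) (the grid resolution is an arbitrary choice):

```python
# Grid approximation of the convolution f_Z = f_X * f_Y for two
# uniform(0, 1) densities; f_Z should be triangular on (0, 2).

N = 500                 # grid points per unit length (arbitrary resolution)
dx = 1.0 / N
fx = [1.0] * N          # uniform(0, 1) density sampled on the grid
fy = [1.0] * N

# f_Z(z) is approximated by the sum over x of f_X(x) * f_Y(z - x) * dx
fz = [0.0] * (2 * N)
for i in range(N):
    for j in range(N):
        fz[i + j] += fx[i] * fy[j] * dx

print(fz[N])            # density near z = 1: close to 1.0 (the peak)
print(sum(fz) * dx)     # total probability: close to 1.0
```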

Let X_{1}, X_{2}, …, X_{n} denote a sequence of real-valued random variables and let S = $\sum_{i = 1}^n X_i$. Whenever the distribution of each X_{i} is absolutely continuous, the distribution of S may be determined as follows.

Let X = (X_{1}, X_{2}, …, X_{n}), where X_{1}, X_{2}, …, X_{n} are the random variables. Let $\phi_x$ denote the characteristic function of X and let $\phi_s$ denote the characteristic function of S = $\sum_{i = 1}^n X_i$. Then

$\phi_s$(t) = $\phi_x$(tv), t ∈ R,

where v = (1, 1, …, 1) ∈ R^{n}.

### Poisson Distribution

A random variable is said to have a Poisson distribution if it takes the values x = 0, 1, 2, … with probability P(x) = m^{x} e^{-m} / x!.

The Poisson distribution is the limiting form of the binomial distribution when n tends to infinity and p tends to 0 in such a way that np remains finite, say np = m. The Poisson distribution can be used to explain the behavior of discrete random variables where the probability of occurrence of the event is very small and the total number of possible cases is sufficiently large. As such, the Poisson distribution has found applications in queuing theory and in fields such as insurance, physics, biology, economics, and industry.
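This limiting relationship can be checked directly by comparing Binomial(n, m/n) probabilities with Poisson(m) probabilities; a sketch (the choices n = 10, 100, 10 000 and m = 3.0 are arbitrary):

```python
# Compare Binomial(n, m/n) pmf to Poisson(m) pmf as n grows with np = m fixed.
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, m):
    return m**k * exp(-m) / factorial(k)

def max_gap(n, m=3.0, kmax=10):
    """Largest pointwise difference between the two pmfs over k = 0..kmax."""
    p = m / n
    return max(abs(binom_pmf(k, n, p) - poisson_pmf(k, m)) for k in range(kmax + 1))

for n in (10, 100, 10_000):
    print(n, max_gap(n))   # the gap shrinks as n grows
```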

### Characteristics of Poisson Distribution

If X_{i}, i = 1, 2, … are independent Poisson variates with parameters m_{i} respectively, then $\sum$ X_{i} is also a Poisson variate with parameter $\sum$ m_{i}.

### Uniform Distribution

A random variable is said to have a uniform distribution if its probability function is f(x) = 1/k, where x = a, a + h, a + 2h, …, a + (k - 1)h, with a and h fixed real numbers and k a fixed positive integer.

This distribution will occur in practice if, under the given experimental conditions, the different values of the random variable happen to be equally likely. Thus, if a true die is thrown, the variable which is the number of points on the uppermost face takes the values 1, 2, …, 6, which are equally probable. Again, if digits are selected from a true random number table, the values 0, 1, …, 9 are equally probable.

### Characteristics of Uniform Distribution

If X and Y are independent uniform variates on (0, 1), the density of their sum is

P (x + y) = x + y if 0 $\leq$ x + y $\leq$ 1

= 2 - x - y if 1 < x + y $\leq$ 2,

= 0 otherwise.
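A Monte Carlo check of this triangular density; a sketch (the seed and sample size are arbitrary choices):

```python
# Simulate X + Y for independent uniform(0, 1) variates and check two
# probabilities implied by the triangular density.
import random

random.seed(42)
n = 200_000
samples = [random.random() + random.random() for _ in range(n)]

# Under the triangular density: P(X + Y <= 1) = 1/2 and P(X + Y <= 1/2) = 1/8.
p_half = sum(s <= 1.0 for s in samples) / n
p_eighth = sum(s <= 0.5 for s in samples) / n
print(p_half, p_eighth)   # close to 0.5 and 0.125
```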

### Exponential Distribution

A random variable is said to have an exponential distribution if it takes values in (0, $\infty$) with probability density P(x) = $\lambda$ e^{-$\lambda$ x}, 0 < x < $\infty$, $\lambda$ > 0.

### Expected Value of X for a Discrete Variable

Let the random variable X assume the values x_{1}, x_{2}, … with corresponding probabilities P(x_{1}), P(x_{2}), …; then the expected value of the random variable is given by

Expectation of X: E(X) = $\sum$ x P(x).
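As a small illustration of this formula, the expectation of a fair die can be computed exactly; a sketch using Python's `fractions` module for exact arithmetic:

```python
# Expected value of a fair die from E(X) = sum of x * P(x).
from fractions import Fraction

values = [1, 2, 3, 4, 5, 6]
prob = Fraction(1, 6)            # fair die: each face equally likely

expectation = sum(x * prob for x in values)
print(expectation)               # 7/2, i.e. 3.5
```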

The transformation of a random variable reassigns its values to another variable. A transformation remaps the number line from x to y through a transformation function y = g(x).

### Transformation of X or Expected Value of X for a Continuous Variable:

Let the continuous random variable X have probability density function f(x); then the expected value of the random variable is given by

Expectation of X: E(X) = $\int$ x f(x) dx.

Convergence of random variables in distribution is defined as follows:

Let X and X_{n}, n ∈ N, be random variables with cdfs F and F_{n} respectively. Then the sequence X_{n} converges to X in distribution, written X_{n} $\to$ X, if

$\lim_{n \to \infty }$ F_{n}(x) = F(x) for every x ∈ R at which F is continuous.

Geometric random variables are said to be memoryless because the probability that the first success occurs k trials from now does not depend on how many failures have already occurred. If the random variable X has probability function given by

Pr(X = k) = p(1 - p)^{k} = p q^{k} for k = 0, 1, 2, …, where q = 1 - p and 0 < p < 1,

then X is called a geometric random variable with parameter p.
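The memoryless property Pr(X ≥ n + k | X ≥ n) = Pr(X ≥ k) can be verified from this pmf; a sketch (p, n, and k are arbitrary illustrative choices):

```python
# Verify memorylessness of the geometric pmf Pr(X = k) = p * q**k.
p = 0.3
q = 1 - p

def pmf(k):
    return p * q**k

def tail(n, terms=2000):
    """Pr(X >= n), truncating the infinite sum (q**2000 is negligible)."""
    return sum(pmf(k) for k in range(n, terms))

n, k = 4, 3
conditional = tail(n + k) / tail(n)   # Pr(X >= n + k | X >= n)
print(conditional, tail(k))           # the two values agree
```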

If the process that generates random events in a given unit of time is a Poisson process, and the discrete random variable X counts the number of successes (X = x) occurring in that unit, then X is called a Poisson random variable.

### Sum of Poisson Random Variables

The sum of Poisson random variables is one of the important properties of Poisson distributions. If S is the sum of two independent Poisson random variables, the mean rate of occurrence for the sum, E(S) = $\lambda_s$, is the sum of the contributing mean rates. Let Z_{1} and Z_{2} be independent Poisson random variables with means $\lambda_i$ for i = 1, 2.

Then Z = Z_{1} + Z_{2} is a Poisson random variable with mean

E(Z) = E(Z_{1} + Z_{2}) = $\lambda_1$ + $\lambda_2$
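This additivity can be checked by convolving two Poisson pmfs and comparing against the Poisson pmf with the summed mean; a sketch (the two rate parameters are arbitrary choices):

```python
# Convolution of Poisson(2.0) and Poisson(3.5) pmfs vs. the Poisson(5.5) pmf.
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

lam1, lam2 = 2.0, 3.5

def conv_pmf(k):
    """Pr(Z1 + Z2 = k) for independent Z1 ~ Poisson(lam1), Z2 ~ Poisson(lam2)."""
    return sum(poisson_pmf(j, lam1) * poisson_pmf(k - j, lam2) for j in range(k + 1))

for k in range(6):
    print(k, conv_pmf(k), poisson_pmf(k, lam1 + lam2))   # the two columns agree
```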

The random variable of a binomial experiment is defined as the number of successes in the n trials. It is called the binomial random variable.

### Sum of Binomial Random Variables

A linear combination of binomial random variables is generally not a binomial random variable, but there is one situation where it is: the sum of independent binomial variables that all share the same success probability has a binomial distribution. Binomial random variables with the same success probability can be added as follows.

If X and Y are independent binomial random variables with n_{x} and n_{y} trials respectively and the same success probability p, then the sum X + Y is a binomial random variable with n = n_{x} + n_{y} trials and success probability p.

If the success probabilities differ, the sum is not a binomial random variable.
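Both statements can be checked numerically by convolving pmfs; a sketch (the trial counts and success probabilities are arbitrary illustrative choices, and the "different p" case is compared against the moment-matched binomial):

```python
# Convolve two binomial pmfs to check when the sum is binomial.
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def conv(k, nx, px, ny, py):
    """Pr(X + Y = k) for independent X ~ B(nx, px), Y ~ B(ny, py)."""
    return sum(binom_pmf(j, nx, px) * binom_pmf(k - j, ny, py)
               for j in range(max(0, k - ny), min(k, nx) + 1))

# Same p: the sum matches Binomial(nx + ny, p) exactly.
same = max(abs(conv(k, 4, 0.3, 6, 0.3) - binom_pmf(k, 10, 0.3)) for k in range(11))
# Different p: even the mean-matched Binomial(10, 0.38) does not fit.
diff = max(abs(conv(k, 4, 0.2, 6, 0.5) - binom_pmf(k, 10, 0.38)) for k in range(11))
print(same, diff)   # same is ~0, diff is clearly nonzero
```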

A Bernoulli trial is a random experiment in which there are only two possible outcomes, success or failure. A Bernoulli random variable describes a Bernoulli trial and assumes only two values: 1 for success with probability p and 0 for failure with probability q = 1 - p.

The normal distribution is one of the class of continuous distributions. The standard normal distribution has mean 0 and variance 1, and a normal random variable can be transformed to a standard normal; this transformation is used to determine the variable's probabilities. A normal random variable is also called a Gaussian random variable.

A random variable, X, is a numerical measure of the outcomes of an experiment. The mean of a random variable is also known as its expected value: the weighted average of all possible values that the random variable can take on. The expected value of a random variable X is denoted by E(X).

A random variable X is said to have a gamma distribution with parameters r and $\lambda$ if its probability density function (pdf) is given by:

f(x) = $\begin{cases} \frac{\lambda ^{r}x^{r-1}e^{-\lambda x}}{\Gamma (r)} & x\geq 0\\ 0 & x < 0 \end{cases}$

where $\Gamma$ denotes the gamma function, r and $\lambda$ both are positive real numbers.
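One can check numerically that this pdf integrates to 1; a sketch (the parameters r = 2.5 and λ = 1.5 are arbitrary, and `math.gamma` supplies Γ(r)):

```python
# Numerical check that the gamma pdf integrates to 1.
from math import exp, gamma

r, lam = 2.5, 1.5

def pdf(x):
    return lam**r * x**(r - 1) * exp(-lam * x) / gamma(r)

# Midpoint rule on [0, 40]; the tail beyond 40 is negligible for these parameters.
n, b = 100_000, 40.0
h = b / n
total = sum(pdf((i + 0.5) * h) * h for i in range(n))
print(total)   # close to 1.0
```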

For an exponential random variable X with parameter $\lambda$, the probability that an event occurs no later than time t is given by

P[X $\leq$ t] = F(t) = 1 - e^{-$\lambda$ t}

The exponential distribution with parameter $\lambda$ describes the interval between occurrences of the events counted by a Poisson random variable.
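The cdf, and the memoryless property it implies, can be evaluated directly; a sketch (λ, s, and t are arbitrary illustrative choices):

```python
# Exponential cdf F(t) = 1 - e^(-lam * t) and a memorylessness check.
from math import exp

lam = 0.5

def cdf(t):
    return 1 - exp(-lam * t)   # P(X <= t)

print(cdf(2.0))                # P(X <= 2) = 1 - e^{-1}, about 0.632

s, t = 3.0, 2.0
lhs = (1 - cdf(s + t)) / (1 - cdf(s))   # P(X > s + t | X > s)
print(abs(lhs - (1 - cdf(t))) < 1e-9)   # memoryless: equals P(X > t)
```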

### Sum of Exponential Random Variables

The sum of independent exponential variates is a gamma variate. If X_{i}, i = 1, 2, …, n are independent exponential variates with parameter $\lambda$, then $\sum X_i$ is a gamma variate with parameters n and $\lambda$. That is, this function of the random variables transforms exponential variables into a gamma variable.

In some cases the exponential probability distribution is considered a special case of the gamma distribution. Suppose two independent random variables X and Y are exponential with parameter $\beta$ = 1. Then the sum S = X + Y is a gamma random variable with parameters $\alpha$ = 2 and $\beta$ = 1.

The behavior of a system may depend upon two or more random variables defined over the same sample space. Suppose that X and Y are two random variables. The joint (cumulative) distribution function of X and Y is the function on R^{2} defined by

F(x, y) = P(X $\leq$ x, Y $\leq$ y), (x, y) ∈ R^{2}.

Random variables X_{1}, X_{2}, …, X_{n} defined on a common probability space are said to be jointly Gaussian, and X = (X_{1}, X_{2}, …, X_{n}) is termed a Gaussian random vector, if every linear combination of these random variables is a Gaussian random variable: for any scalar constants a_{1}, a_{2}, …, a_{n}, the random variable a_{1}X_{1} + a_{2}X_{2} + … + a_{n}X_{n} is Gaussian. Two jointly Gaussian random variables are statistically independent if and only if they are uncorrelated.

A variate may be either discrete or continuous:

- A variate is called a discrete variate when it is not capable of assuming all the values in the provided range.
- If the variate is able to assume all the numerical values in the whole range, it is called a continuous variate.


The probability distribution of a random variable can be:

- a theoretical listing of outcomes and the probabilities of the outcomes;
- an empirical listing of outcomes with their observed relative frequencies;
- a subjective listing of outcomes with their subjective probabilities.

A probability distribution always satisfies two conditions:

- $f(x)\geq 0$
- $\sum f(x)=1$

Important probability distributions include:

- Binomial distribution
- Poisson distribution
- Bernoulli distribution
- Multinomial distribution
- Hypergeometric distribution
- Exponential distribution
- Normal distribution


Characteristics of the Poisson distribution:

- The Poisson distribution is a discrete probability distribution.
- If X follows a Poisson distribution, then X takes the values 0, 1, 2, … to infinity.
- The Poisson distribution has a single parameter, m. When m is known, all the terms can be found out.
- The mean and variance of the Poisson distribution are both equal to m.
- The Poisson distribution is positively skewed.


Characteristics of the uniform distribution:

- The uniform distribution is symmetrical.
- The uniform distribution is highly platykurtic.
- All the odd moments of the uniform distribution are zero.


Characteristics of the exponential distribution:

- The exponential distribution has the lack-of-memory (memoryless) property.
- The exponential distribution is positively skewed.
- It is leptokurtic.
