This is part of the moments math collection.

Random Variables

For a space of elementary events, say \Omega=\{\omega\}, a random variable X is a real-valued function X=X(\omega) defined on the set \Omega.

Essentially, X may be considered a quantity which takes its values (say x_i) from a subset R of the real numbers.

We note that if X is a random variable, then a function g(X) of it is also a random variable.

Random variables are further quantified and classified on the basis of their distribution functions.

Distribution Law

A rule (tabular, functional, graphical, etc.) which permits one to find the probabilities of the events associated with a random variable is called the distribution law of that random variable.

Distribution Functions

Every random variable is defined in terms of its probabilities, i.e., it is characterized by the likelihood of taking a particular value.

Mathematically, the cumulative distribution function of a random variable X is the function F(x) whose value at every point x equals the probability of the event \{X < x\}:

F(x)=P\{X < x\}

The distribution function has the following properties:

  • 0 \leq F(x) \leq 1
  • \lim_{x\to -\infty}F(x)=F(-\infty)=0 and \lim_{x\to\infty}F(x)=F(\infty)=1
  • \forall x_1, x_2: x_2>x_1 \implies F(x_2)\geq F(x_1)
  • P(x_1 \leq X < x_2)=F(x_2)-F(x_1)
  • F(x) is left continuous, i.e., \lim_{x\to x_0-0}F(x)=F(x_0)
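These properties can be checked numerically for a concrete distribution function. The sketch below uses the CDF of a rate-1 exponential variable, F(x)=1-e^{-x} for x \geq 0, purely as an assumed illustrative example:

```python
import math

# CDF of a rate-1 exponential random variable (an assumed example):
# F(x) = 1 - exp(-x) for x >= 0, and F(x) = 0 otherwise.
def F(x: float) -> float:
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

xs = [-5.0, -1.0, 0.0, 0.5, 1.0, 3.0, 10.0]

# Property 1: 0 <= F(x) <= 1 everywhere.
assert all(0.0 <= F(x) <= 1.0 for x in xs)

# Property 2: behavior toward -infinity and +infinity.
assert F(-1e9) == 0.0 and abs(F(1e9) - 1.0) < 1e-12

# Property 3: monotonicity, x2 > x1 implies F(x2) >= F(x1).
assert all(F(b) >= F(a) for a, b in zip(xs, xs[1:]))

# Property 4: P(x1 <= X < x2) = F(x2) - F(x1), e.g. P(0 <= X < 1).
p = F(1.0) - F(0.0)
print(round(p, 4))  # 1 - e^{-1} ≈ 0.6321
```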

Types of Random Variables

On the basis of the above concepts, we now classify random variables as:

X \to \begin{cases} F(x)=P\{X < x\}=\sum_{x_n<x}p_n & \text{Discrete} \\ F(x)=\int\limits_{-\infty}^{x} p(z)\, dz \text{ OR } p(x)=F^\prime(x) & \text{Continuous} \end{cases} \quad (1)
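Formula (1) can be illustrated with a small sketch: the discrete case uses a fair die (an assumed example), and the continuous case recovers F(x) by numerically integrating an assumed density p(z)=e^{-z}, z \geq 0:

```python
import math

# Discrete case: a fair die (assumed example); F(x) = sum of p_n over x_n < x
# (note the strict inequality in the definition).
values = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6

def F_discrete(x: float) -> float:
    return sum(pn for xn, pn in zip(values, p) if xn < x)

# F jumps at each atom: F(3) accumulates only the outcomes 1 and 2.
assert abs(F_discrete(3) - 2 / 6) < 1e-12
assert abs(F_discrete(3.5) - 3 / 6) < 1e-12

# Continuous case: density p(z) = e^{-z} for z >= 0 (assumed example);
# F(x) is the integral of p from -infinity to x, approximated here with a
# midpoint Riemann sum.
def F_continuous(x: float, n: int = 100_000) -> float:
    if x <= 0:
        return 0.0
    dz = x / n
    return sum(math.exp(-(k + 0.5) * dz) * dz for k in range(n))

# Compare with the closed form 1 - e^{-x}.
assert abs(F_continuous(2.0) - (1 - math.exp(-2.0))) < 1e-6
print(round(F_continuous(2.0), 4))
```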


The expectation (expected value) E\{X\} of a discrete or continuous random variable X is mathematically defined by:

E\{X\}= \begin{cases} \sum_i x_i p_i & \text{Discrete} \\ \int\limits_{-\infty}^{\infty} x\, p(x)\, dx & \text{Continuous} \end{cases} \quad (2)

For the expectation to exist, the corresponding integral (continuous case) or series (discrete case) must converge absolutely.
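As a numerical sketch of formula (2), using a fair die for the discrete case and the rate-1 exponential density for the continuous one (both assumed examples):

```python
import math

# Discrete: expectation of a fair die (assumed example), E{X} = sum_i x_i p_i.
values = [1, 2, 3, 4, 5, 6]
E_discrete = sum(x * (1 / 6) for x in values)
assert abs(E_discrete - 3.5) < 1e-12

# Continuous: E{X} = integral of x p(x) dx for the assumed density
# p(x) = e^{-x}, x >= 0, approximated with a midpoint Riemann sum on a
# truncated interval [0, 50] (the tail beyond 50 is negligible).
n, b = 200_000, 50.0
dx = b / n
E_continuous = sum(((k + 0.5) * dx) * math.exp(-(k + 0.5) * dx) * dx
                   for k in range(n))
assert abs(E_continuous - 1.0) < 1e-4  # the exact value here is 1

print(E_discrete, round(E_continuous, 4))
```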

In generic terms, the expectation is the main characteristic defining the "position" of a random variable, i.e., the number near which its possible values are concentrated.

Similarly, since a function of a random variable is itself a random variable, given a random variable Y related to a random variable X by a functional dependence Y=\phi(X), we have:

E\{Y\}=E\{\phi(X)\}= \begin{cases} \sum_i \phi(x_i)\, p_i & \text{Discrete} \\ \int\limits_{-\infty}^{\infty} \phi(x)\, p(x)\, dx & \text{Continuous} \end{cases} \quad (3)
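Formula (3) says the expectation of \phi(X) can be computed directly from the distribution of X. A minimal sketch, assuming \phi(x)=x^2 on a fair die:

```python
# E{phi(X)} for phi(x) = x^2 on a fair die (assumed example), via formula (3):
values = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6

E_phi = sum(x**2 * pi for x, pi in zip(values, p))  # sum_i phi(x_i) p_i
assert abs(E_phi - 91 / 6) < 1e-12  # (1+4+9+16+25+36)/6 = 91/6

# Note this differs from (E{X})^2 = 3.5^2 = 12.25: the expectation of a
# function is generally not the function of the expectation.
print(round(E_phi, 4))  # ≈ 15.1667
```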


The variance \mathrm{Var}\{X\} is the measure of the deviation of a random variable X from the expectation E\{X\}, as determined by:

\mathrm{Var}\{X\}=E\{(X-E\{X\})^2\}=E\{X^2\}-(E\{X\})^2

The variance characterizes the spread in values of the random variable X about its expectation.
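A minimal sketch of the variance computation, again assuming a fair die, using both the defining form E\{(X-E\{X\})^2\} and the equivalent identity E\{X^2\}-(E\{X\})^2:

```python
# Variance of a fair die (assumed example) via Var{X} = E{X^2} - (E{X})^2.
values = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6

EX = sum(x * pi for x, pi in zip(values, p))       # E{X} = 3.5
EX2 = sum(x**2 * pi for x, pi in zip(values, p))   # E{X^2} = 91/6
var = EX2 - EX**2

# Cross-check against the defining form E{(X - E{X})^2}.
var_direct = sum((x - EX)**2 * pi for x, pi in zip(values, p))
assert abs(var - var_direct) < 1e-12

print(round(var, 4))  # 35/12 ≈ 2.9167
```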

Graphical Preliminaries

Having introduced the density function and the distribution function, it is straightforward to interpret the curves in the figure below. Note that the probability P(X\leq x)=F(x) may be represented as the area between the density function f(t) and the x-axis on the interval -\infty<t\leq x.

Probability as an area.1

Often a probability value \alpha is given (frequently as a percentage).

If P(X > x) = \alpha holds, the corresponding value of the abscissa x = x_\alpha is called the quantile (or fractile) of order \alpha.

This means the area under the density function f(t) to the right of x = x_\alpha is equal to \alpha.

Remark: In the literature, the area to the left of x = x_\alpha is also used for the definition of the quantile.

In mathematical statistics, for small values of \alpha, e.g., \alpha = 5\% or \alpha = 1\%, the notions of significance level or type 1 error rate are also used.
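For a distribution whose tail probability has a closed form, the quantile can be computed directly. A sketch assuming a rate-1 exponential variable, where P(X > x) = e^{-x}, so solving e^{-x_\alpha} = \alpha gives x_\alpha = -\ln\alpha:

```python
import math

# Quantile of order alpha for a rate-1 exponential variable (assumed example):
# the tail probability is P(X > x) = e^{-x}, so x_alpha = -ln(alpha).
def quantile(alpha: float) -> float:
    return -math.log(alpha)

x_05 = quantile(0.05)  # alpha = 5%

# Check that the area to the right of x_alpha is indeed alpha.
assert abs(math.exp(-x_05) - 0.05) < 1e-12

# Under the left-area convention mentioned in the remark, the same point is
# the quantile of order 1 - alpha, since F(x_alpha) = 1 - alpha.
assert abs((1 - math.exp(-x_05)) - 0.95) < 1e-12

print(round(x_05, 4))  # -ln(0.05) ≈ 2.9957
```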


Bronshtein, I.N., K.A. Semendyayev, G. Musiol, and H. Mühlig. 2015. Handbook of Mathematics. Springer Berlin Heidelberg. https://books?id=5L6BBwAAQBAJ.

  1. Bronshtein et al. (2015)