This is part of the moments math collection.


Moments are defined for every distribution. They are generalized expectations that find use in a variety of statistical and experimental data analyses. In particular, we briefly elucidate how various moments are used to characterize the shape of a distribution.

Finally, we note in passing that a distribution may be completely described via its moments, provided a determinacy condition holds.


The expectation E\{(X-a)^k\} is called the k^{th} moment of the random variable X about the number a.

Initial Moments

Moments about zero are often referred to as the moments of a random variable, or the initial moments.

The k^{th} moment satisfies the relation:

\alpha_k=E\{X^k\}= \begin{cases} \sum_i x_i^k p_i & Discrete \\ \int\limits_{-\infty}^{\infty} x^k p(x)\, dx & Continuous \end{cases} \quad (1)
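As a quick numerical sketch of the discrete case of (1) (the fair six-sided die here is an illustrative assumption, not from the text):

```python
# Sketch: k-th initial (about-zero) moment of a discrete distribution,
# alpha_k = sum_i x_i^k p_i, illustrated with a fair six-sided die.

def initial_moment(xs, ps, k):
    """k-th moment about zero of a discrete random variable."""
    return sum(x ** k * p for x, p in zip(xs, ps))

faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

alpha_1 = initial_moment(faces, probs, 1)  # the expectation, 3.5
alpha_2 = initial_moment(faces, probs, 2)  # 91/6
print(alpha_1, alpha_2)
```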

Central Moments

When a=E\{X\}, the k^{th} moment of the random variable X about a is called the k^{th} central moment.

The k^{th} central moment satisfies the relation:

\mu_k=E\{(X-E\{X\})^k\}= \begin{cases} \sum_i (x_i-E\{X\})^k p_i & Discrete \\ \int\limits_{-\infty}^{\infty} (x-E\{X\})^k p(x)\, dx & Continuous \end{cases} \quad (2)

Remark: We note that \mu_0=1 and \mu_1=0 for any random variable.
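A minimal sketch of (2) in the discrete case, again assuming a fair die as the example; it also confirms the remark that \mu_0=1 and \mu_1=0:

```python
# Sketch: k-th central moment of a discrete distribution,
# mu_k = sum_i (x_i - E{X})^k p_i, with a fair die as an assumed example.

def central_moment(xs, ps, k):
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean) ** k * p for x, p in zip(xs, ps))

faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mu_0 = central_moment(faces, probs, 0)  # always 1
mu_1 = central_moment(faces, probs, 1)  # always 0
mu_2 = central_moment(faces, probs, 2)  # the variance, 35/12
print(mu_0, mu_1, mu_2)
```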

Central and Initial Moment Relations

We have:

\mu_k=\sum^k_{m=0} C^m_k\alpha_m(-\alpha_1)^{k-m}


\alpha_k=\sum_{m=0}^k C^m_k\mu_m(\alpha_1)^{k-m}
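These relations can be checked numerically; a sketch, assuming a Bernoulli distribution as the test case (the helpers just implement the discrete cases of the moment definitions):

```python
# Sketch: numerically checking the relation
# mu_k = sum_{m=0}^{k} C(k, m) * alpha_m * (-alpha_1)^(k-m)
# on an assumed discrete distribution, a Bernoulli(0.3) random variable.
from math import comb

def initial_moment(xs, ps, k):
    return sum(x ** k * p for x, p in zip(xs, ps))

def central_moment(xs, ps, k):
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean) ** k * p for x, p in zip(xs, ps))

xs, ps = [0, 1], [0.7, 0.3]      # illustrative choice
a1 = initial_moment(xs, ps, 1)

for k in range(5):
    lhs = central_moment(xs, ps, k)
    rhs = sum(comb(k, m) * initial_moment(xs, ps, m) * (-a1) ** (k - m)
              for m in range(k + 1))
    assert abs(lhs - rhs) < 1e-12
print("relation verified for k = 0..4")
```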

Also we note that, for distributions symmetric about the expectation, the existing central moments \mu_k of odd order k are zero.
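A brief numerical check of this symmetry property, assuming a fair die (symmetric about its expectation 3.5) as the example:

```python
# Sketch: for a distribution symmetric about its expectation,
# odd-order central moments vanish; a fair die is the assumed example.

def central_moment(xs, ps, k):
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean) ** k * p for x, p in zip(xs, ps))

faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mu_3 = central_moment(faces, probs, 3)
mu_5 = central_moment(faces, probs, 5)
print(mu_3, mu_5)  # both ~0 up to floating-point error
```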

Condition for Unique Determinacy

A probability distribution may be uniquely determined by its moments \alpha_0, \alpha_1, \ldots provided that they all exist and the series

\sum_{m=0}^{\infty} |\alpha_m|\frac{t^m}{m!}

converges for some t>0.
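As an illustration, for the standard normal distribution the initial moments are \alpha_{2j}=(2j-1)!! with vanishing odd moments (a textbook fact assumed here), and the series above converges to e^{t^2/2} for every t, so the normal distribution is uniquely determined by its moments. A sketch:

```python
# Sketch: partial sums of sum_m |alpha_m| t^m / m! using the standard-normal
# moments alpha_{2j} = (2j-1)!!, alpha_{2j+1} = 0 (assumed textbook values).
# The series converges to exp(t^2 / 2), so the determinacy condition holds.
from math import exp, factorial

def double_factorial(n):
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def partial_sum(t, terms):
    total = 0.0
    for m in range(0, terms + 1, 2):      # odd moments vanish
        total += double_factorial(m - 1) * t ** m / factorial(m)
    return total

t = 1.0
print(partial_sum(t, 40), exp(t ** 2 / 2))  # partial sum approaches exp(t^2/2)
```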

Additional Moments

Absolute Moments

The k^{th} absolute moment of X about a is defined by:

E\{|X-a|^k\}
The existence of a k^{th} moment \alpha_k or \mu_k implies the existence of the moments \alpha_m and \mu_m of all orders m<k.
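A small sketch of the absolute moment E\{|X-a|^k\} about a=E\{X\}; for k=1 this is the mean absolute deviation. The fair die is again an assumed example:

```python
# Sketch: k-th absolute moment E{|X - a|^k} of a discrete distribution,
# here about a = E{X}; a fair die is the assumed example.

def absolute_moment(xs, ps, k, a):
    return sum(abs(x - a) ** k * p for x, p in zip(xs, ps))

faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
mean = sum(x * p for x, p in zip(faces, probs))

beta_1 = absolute_moment(faces, probs, 1, mean)  # mean absolute deviation, 1.5
print(beta_1)
```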

Mixed Moments

We note in passing that the mixed second moment is better known as the covariance of two random variables; it is defined as the central moment of order (1+1):

\mu_{1,1}=E\{(X-E\{X\})(Y-E\{Y\})\}


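A minimal sketch of the covariance as this mixed central moment, E\{(X-E\{X\})(Y-E\{Y\})\}, on an assumed toy joint distribution:

```python
# Sketch: covariance as the (1+1)-order mixed central moment,
# cov(X, Y) = E{(X - E{X})(Y - E{Y})}, on an assumed joint pmf.

def covariance(pairs, ps):
    ex = sum(x * p for (x, _), p in zip(pairs, ps))
    ey = sum(y * p for (_, y), p in zip(pairs, ps))
    return sum((x - ex) * (y - ey) * p for (x, y), p in zip(pairs, ps))

# Joint pmf where X and Y always agree on a fair coin flip (illustrative):
pairs = [(0, 0), (1, 1)]
ps = [0.5, 0.5]
print(covariance(pairs, ps))  # 0.25
```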
Moment Interpretations

We note the following:

  • The first initial moment is the expectation.
  • The second central moment is the variance.
  • The third central moment is related to the skewness.
  • The fourth central moment is related to the kurtosis.

Skewness

A measure of the lopsidedness of a distribution; it is 0 for symmetric distributions (a.k.a. the asymmetry coefficient).

Mathematically: \gamma_1=\frac{\mu_3}{\mu_2^{3/2}}
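A sketch computing \gamma_1 for an assumed asymmetric example, a Bernoulli(0.25) random variable, whose skewness has the closed form (1-2p)/\sqrt{p(1-p)}:

```python
# Sketch: skewness gamma_1 = mu_3 / mu_2^(3/2) for an assumed asymmetric
# distribution, a Bernoulli(0.25) random variable.

def central_moment(xs, ps, k):
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean) ** k * p for x, p in zip(xs, ps))

xs, ps = [0, 1], [0.75, 0.25]
mu_2 = central_moment(xs, ps, 2)
mu_3 = central_moment(xs, ps, 3)
gamma_1 = mu_3 / mu_2 ** 1.5
print(gamma_1)  # positive: the mass leans toward 0 with a right tail
```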

Figure 1: Relationship of the distribution curve and the asymmetry coefficient.

Kurtosis

A measure of the heaviness of the tails of the distribution, compared to the normal distribution of the same variance (a.k.a. excess, or excess coefficient). Essentially a measure of the tails of the distribution compared with the tails of a Gaussian random variable (Florescu and Tudor 2013).

Mathematically (Polyanin and Chernoutsan 2010): \gamma_2=\frac{\mu_4}{\mu_2^2}-3
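A sketch computing \gamma_2 for an assumed example, a fair coin; its two-point shape attains the minimum possible excess kurtosis of -2:

```python
# Sketch: excess kurtosis gamma_2 = mu_4 / mu_2^2 - 3 for an assumed
# example, a fair coin (Bernoulli(0.5)); its value is the minimum, -2.

def central_moment(xs, ps, k):
    mean = sum(x * p for x, p in zip(xs, ps))
    return sum((x - mean) ** k * p for x, p in zip(xs, ps))

xs, ps = [0, 1], [0.5, 0.5]
mu_2 = central_moment(xs, ps, 2)
mu_4 = central_moment(xs, ps, 4)
gamma_2 = mu_4 / mu_2 ** 2 - 3
print(gamma_2)  # -2.0
```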


Florescu, I., and C.A. Tudor. 2013. Handbook of Probability. Wiley Handbooks in Applied Statistics. Wiley. https://books.google.com/books?id=2V3SAQAAQBAJ.

Polyanin, A.D., and A.I. Chernoutsan. 2010. A Concise Handbook of Mathematics, Physics, and Engineering Sciences. CRC Press. https://books.google.com/books?id=ejzScufwDRUC.