
The degree of tailedness of a distribution is measured by kurtosis. It tells us the extent to which the distribution is more or less outlier-prone (heavier- or lighter-tailed) than the normal distribution. Three different types of curves, courtesy of Investopedia, are shown as follows −

It is difficult to discern different types of kurtosis from the density plots (left panel) because the tails are close to zero for all distributions. But differences in the tails are easy to see in the normal quantile-quantile plots (right panel).

The normal curve is called a mesokurtic curve. If the curve of a distribution is more outlier-prone (heavier-tailed) than a normal, or mesokurtic, curve, it is referred to as a leptokurtic curve. If a curve is less outlier-prone (lighter-tailed) than a normal curve, it is called a platykurtic curve. Kurtosis is measured by moments and is given by the following formula −
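These three shapes can be checked numerically. A short sketch (not part of the original example; sample sizes and seed are arbitrary) using `scipy.stats.kurtosis` with `fisher=False`, so the value returned is Pearson's ${\beta_2}$ rather than excess kurtosis:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n = 100_000

# Large samples from a light-tailed (uniform), normal,
# and heavy-tailed (Laplace) distribution.
samples = {
    "uniform (platykurtic)": rng.uniform(size=n),
    "normal (mesokurtic)": rng.normal(size=n),
    "laplace (leptokurtic)": rng.laplace(size=n),
}

# fisher=False returns Pearson's beta_2, which is 3 for a normal curve.
for name, x in samples.items():
    print(f"{name}: beta_2 = {kurtosis(x, fisher=False):.2f}")
```

With samples this large, the printed values land near the theoretical ${\beta_2}$ of 1.8 (uniform), 3 (normal) and 6 (Laplace).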

${\beta_2 = \frac{\mu_4}{\mu_2^2}}$

Where −

${\mu_4 = \frac{\sum(x- \bar x)^4}{N}, \quad \mu_2 = \frac{\sum(x- \bar x)^2}{N}}$

The greater the value of ${\beta_2}$, the more peaked or leptokurtic the curve. A normal curve has a value of 3, a leptokurtic curve has ${\beta_2}$ greater than 3, and a platykurtic curve has ${\beta_2}$ less than 3.
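The formula translates directly into code. A minimal numpy sketch (the `beta2` helper and the sample data are illustrative, not from the original):

```python
import numpy as np

def beta2(x):
    """Moment-based kurtosis: beta_2 = mu_4 / mu_2**2."""
    x = np.asarray(x, dtype=float)
    mu_2 = np.mean((x - x.mean()) ** 2)   # second central moment
    mu_4 = np.mean((x - x.mean()) ** 4)   # fourth central moment
    return mu_4 / mu_2 ** 2

# Arbitrary demonstration data.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(beta2(data))
```

`scipy.stats.kurtosis(data, fisher=False)` computes the same biased moment estimate and can be used as a cross-check.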

**Problem Statement:**

The data on daily wages of 45 workers of a factory are given below. Compute ${\beta_1}$ and ${\beta_2}$ using moments about the mean and comment on the results.

Wages (Rs.) | Number of Workers |
---|---|
100-120 | 1 |
120-140 | 2 |
140-160 | 6 |
160-180 | 20 |
180-200 | 11 |
200-220 | 3 |
220-240 | 2 |

**Solution:**

Wages (Rs.) | Number of Workers (f) | Mid-point m | ${d = \frac{m-170}{20}}$ | ${fd}$ | ${fd^2}$ | ${fd^3}$ | ${fd^4}$ |
---|---|---|---|---|---|---|---|
100-120 | 1 | 110 | -3 | -3 | 9 | -27 | 81 |
120-140 | 2 | 130 | -2 | -4 | 8 | -16 | 32 |
140-160 | 6 | 150 | -1 | -6 | 6 | -6 | 6 |
160-180 | 20 | 170 | 0 | 0 | 0 | 0 | 0 |
180-200 | 11 | 190 | 1 | 11 | 11 | 11 | 11 |
200-220 | 3 | 210 | 2 | 6 | 12 | 24 | 48 |
220-240 | 2 | 230 | 3 | 6 | 18 | 54 | 162 |
Total | ${N=45}$ |  |  | ${\sum fd = 10}$ | ${\sum fd^2 = 64}$ | ${\sum fd^3 = 40}$ | ${\sum fd^4 = 340}$ |

Since the deviations have been taken from an assumed mean, we first calculate moments about the arbitrary origin 170 (with class width ${i = 20}$) and then moments about the mean. Moments about the arbitrary origin −

${\mu'_1 = \frac{\sum fd}{N} \times i = \frac{10}{45} \times 20 = 4.44 \\[7pt]
\mu'_2 = \frac{\sum fd^2}{N} \times i^2 = \frac{64}{45} \times 20^2 = 568.88 \\[7pt]
\mu'_3 = \frac{\sum fd^3}{N} \times i^3 = \frac{40}{45} \times 20^3 = 7111.11 \\[7pt]
\mu'_4 = \frac{\sum fd^4}{N} \times i^4 = \frac{340}{45} \times 20^4 = 1208888.89}$

Moments about the mean −

${\mu_2 = \mu'_2 - (\mu'_1)^2 = 568.88-(4.44)^2 = 549.16 \\[7pt]
\mu_3 = \mu'_3 - 3(\mu'_1)(\mu'_2) + 2(\mu'_1)^3 \\[7pt]
\, = 7111.11 - 3(4.44)(568.88) + 2(4.44)^3 \\[7pt]
\, = 7111.11 - 7577.48 + 175.05 = -291.32 \\[7pt]
\\[7pt]
\mu_4 = \mu'_4 - 4(\mu'_1)(\mu'_3) + 6(\mu'_1)^2(\mu'_2) - 3(\mu'_1)^4 \\[7pt]
\, = 1208888.89 - 4(4.44)(7111.11) + 6(4.44)^2(568.88) - 3(4.44)^4 \\[7pt]
\, = 1208888.89 - 126293.31 + 67288.03 - 1165.87 \\[7pt]
\, = 1148717.74}$

From the values of the moments about the mean, we can now calculate ${\beta_1}$ and ${\beta_2}$:

${\beta_1 = \frac{\mu_3^2}{\mu_2^3} = \frac{(-291.32)^2}{(549.16)^3} = 0.00051 \\[7pt]
\beta_2 = \frac{\mu_4}{(\mu_2)^2} = \frac{1148717.74}{(549.16)^2} = 3.81}$

From the above calculations, it can be concluded that ${\beta_1}$, which measures skewness, is almost zero, indicating that the distribution is almost symmetrical. ${\beta_2}$, which measures kurtosis, is greater than 3, implying that the distribution is leptokurtic.
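The grouped-data computation can also be reproduced programmatically. A sketch using the table's frequencies and midpoints, carried out without intermediate rounding (so the figures agree with the hand calculation to about two decimal places):

```python
import numpy as np

# Frequencies and class midpoints from the wage table.
f = np.array([1, 2, 6, 20, 11, 3, 2])
m = np.array([110, 130, 150, 170, 190, 210, 230])
N = f.sum()            # 45 workers
i = 20                 # class width
d = (m - 170) // i     # step deviations about the assumed mean 170

# Raw moments about the arbitrary origin 170.
mu1p, mu2p, mu3p, mu4p = ((f * d**k).sum() / N * i**k for k in (1, 2, 3, 4))

# Convert to central moments (moments about the mean).
mu2 = mu2p - mu1p**2
mu3 = mu3p - 3 * mu1p * mu2p + 2 * mu1p**3
mu4 = mu4p - 4 * mu1p * mu3p + 6 * mu1p**2 * mu2p - 3 * mu1p**4

beta1 = mu3**2 / mu2**3    # skewness measure
beta2 = mu4 / mu2**2       # kurtosis measure
print(f"beta_1 = {beta1:.5f}, beta_2 = {beta2:.2f}")
```

Running it gives ${\beta_1 \approx 0.00054}$ and ${\beta_2 \approx 3.81}$, confirming the near-symmetric, leptokurtic shape found above.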
