Variance and Standard Deviation
The standard deviation (SD) of a dataset is a measure of its average amount of variability. It indicates how far the data values in a given distribution typically deviate from the mean, or center, of the distribution. For a normal distribution, a larger standard deviation means that the values are generally spread far from the mean, while a smaller standard deviation indicates that the values are clustered closely around it.
Variance is the average of the squared deviations from the mean. To compute it, first subtract the mean from each value and square the results to obtain the squared differences. The average of these squared differences is the variance.
Note − Standard deviation is calculated by taking the square root of the variance.
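As a quick illustration, here is a minimal Python sketch that follows these steps directly: subtract the mean, square the differences, average them, and take the square root. The data values are made up for the example.

```python
import math

def variance_and_sd(values):
    """Population variance and standard deviation, following the steps above."""
    mean = sum(values) / len(values)                    # center of the data
    squared_diffs = [(x - mean) ** 2 for x in values]   # squared deviations from the mean
    variance = sum(squared_diffs) / len(squared_diffs)  # average squared deviation
    return variance, math.sqrt(variance)                # SD is the square root of the variance

data = [4, 8, 6, 5, 3]                                  # illustrative values only
var, sd = variance_and_sd(data)
print(f"variance = {var:.2f}, standard deviation = {sd:.2f}")
```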
A Simple Example
Variance and standard deviation help us analyze things that cannot be captured by averages alone. As an example, imagine that you have three cousins: one is 13 and the other two are twins aged 10. The average age of the cousins is 11. Now imagine instead that your three cousins are aged 17, 12, and 4. The average age is still 11, but the variance and standard deviation are much larger, because the ages are spread farther from the mean, as the sketch below shows.
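A short Python comparison of the two sets of cousins makes the difference concrete. Here the standard library's pvariance and pstdev treat the three ages as a full population:

```python
from statistics import mean, pvariance, pstdev

cousins_a = [13, 10, 10]   # one cousin aged 13 and twins aged 10
cousins_b = [17, 12, 4]    # same mean age, but much more spread

for ages in (cousins_a, cousins_b):
    print(f"ages={ages}  mean={mean(ages):.1f}  "
          f"variance={pvariance(ages):.2f}  sd={pstdev(ages):.2f}")
```

Both groups have a mean age of 11, yet the second group's variance and standard deviation are far higher.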
Sample Versus Population
Knowing the difference between a population and a sample is important when working with statistical measurements. For example, to compute the standard deviation (or variance) of an entire population, you would need to collect data for everyone in the group; to estimate it from a sample, you only need measurements from a subset of that population.
If we treat a group of values as the entire population, we divide by the number of values, n, when computing the variance. However, if we treat it as a sample, the calculation changes slightly: instead of dividing by the sample size, we subtract 1 from the sample size and divide by that smaller number (n − 1). The sample standard deviation is then the square root of this sample variance.
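The difference shows up directly in Python's statistics module, where pvariance and pstdev divide by n while variance and stdev divide by n − 1. A minimal comparison on the same made-up ages:

```python
from statistics import pvariance, pstdev, variance, stdev

data = [17, 12, 4]   # same illustrative ages as above

# Population formulas divide by n; sample formulas divide by n - 1.
print(f"population: variance={pvariance(data):.2f}, sd={pstdev(data):.2f}")
print(f"sample:     variance={variance(data):.2f}, sd={stdev(data):.2f}")
```

Because the sample formulas divide by a smaller number, the sample variance and standard deviation come out slightly larger than their population counterparts.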
Importance of Variance and Standard Deviation
Variance and standard deviation serve as the basis of many statistical calculations. For example, the standard deviation is needed for converting test scores into Z-scores, and both measures are used when conducting statistical tests such as t-tests. Therefore, SD and variance play an important role in statistics as well as in finance; a small sketch of the Z-score conversion follows.
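As an illustration of one of these uses, the snippet below converts a set of hypothetical test scores into Z-scores by subtracting the mean and dividing by the standard deviation (the score values are made up for the example):

```python
from statistics import mean, pstdev

scores = [72, 85, 90, 64, 79]          # hypothetical test scores
mu, sigma = mean(scores), pstdev(scores)

# A Z-score expresses each score as a number of standard deviations from the mean.
z_scores = [(x - mu) / sigma for x in scores]
print([round(z, 2) for z in z_scores])
```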
