Standard Deviation vs. Variance: What’s the Difference?

Standard Deviation vs. Variance: An Overview

Standard deviation and variance are two basic mathematical concepts that have an important place in various parts of the financial sector, from accounting to economics to investing. Both measure the variability of figures within a data set using the mean of a certain group of numbers. They are important to help determine volatility and the distribution of returns. But there are inherent differences between the two. While standard deviation is the square root of the variance, the variance is the average of the squared differences of each data point from the mean.

Key Takeaways

• Standard deviation and variance are two key measures commonly used in the financial sector.
• Standard deviation is the spread of a group of numbers from the mean.
• The variance measures the average degree to which each point differs from the mean.
• While standard deviation is the square root of the variance, variance is the average of the squared difference of each data point from the mean.
• The two concepts are useful and significant for traders, who use them to measure market volatility.

Standard Deviation

Standard deviation is a statistical measurement that looks at how far a group of numbers is from the mean. Put simply, standard deviation measures how far apart numbers are in a data set.

This metric is calculated as the square root of the variance, which means you first have to figure out how far each data point lies from the mean. The calculation of variance uses squared differences because squaring weights outliers more heavily than data points close to the mean. Squaring also prevents differences above the mean from canceling out those below, which would otherwise sum to zero.
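The cancellation point above can be verified with a short Python sketch (the data set here is an illustrative assumption, not from the article):

```python
data = [2, 4, 6, 8]
mean = sum(data) / len(data)                    # 5.0

raw = [x - mean for x in data]                  # [-3.0, -1.0, 1.0, 3.0]
sum_raw = sum(raw)                              # 0.0 -- deviations above and
                                                # below the mean cancel out

squared = [(x - mean) ** 2 for x in data]       # [9.0, 1.0, 1.0, 9.0]
sum_squared = sum(squared)                      # 20.0 -- squaring prevents
                                                # the cancellation
```

Note how the squared deviations also give the two outliers (2 and 8) nine times the weight of the inner points, rather than three times.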

But how do you interpret standard deviation once you figure it out? If the points are further from the mean, there is a higher deviation within the data; if they are closer to the mean, there is a lower deviation. So the more spread out the group of numbers is, the higher the standard deviation.

Variance

A variance is the average of the squared differences from the mean. To figure out the variance, calculate the difference between each point within the data set and the mean. Once you figure that out, square and average the results. Using software like Excel can help you in this process.

For example, if a group of numbers ranges from one to 10, you get a mean of 5.5. If you square the differences between each number and the mean and find their sum, the result is 82.5. To figure out the variance:

• Divide the sum, 82.5, by N − 1, where N is the sample size (in this case, 10).
• The result is a variance of 82.5/9 = 9.17.

Note that the standard deviation is the square root of the variance so the standard deviation is about 3.03.
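The worked example above can be reproduced with a few lines of Python (the numbers match those in the text, using the sample variance with the N − 1 divisor):

```python
data = list(range(1, 11))                 # the numbers 1 through 10
n = len(data)
mean = sum(data) / n                      # 5.5

squared_diffs = [(x - mean) ** 2 for x in data]
sum_sq = sum(squared_diffs)               # 82.5

variance = sum_sq / (n - 1)               # sample variance: 82.5 / 9 ≈ 9.17
std_dev = variance ** 0.5                 # square root of variance ≈ 3.03
```

Python's standard library gives the same results via `statistics.variance(data)` and `statistics.stdev(data)`.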

Key Differences

Other than how they’re calculated, there are a few other key differences between standard deviation and variance. Here are some of the most basic ones.

• Standard deviation measures how far apart numbers are in a data set. Variance, on the other hand, gives an actual value to how much the numbers in a data set vary from the mean.
• Standard deviation is the square root of the variance and is expressed in the same units as the data set. Variance is expressed in the square of those units, which is harder to interpret directly (in finance, a variance of percentage returns comes out in percent squared).
• Standard deviation is greater than the variance when the variance is less than one (e.g., 0.36 or 36%), since the square root of a number between zero and one is larger than the number itself.
• Standard deviation is smaller than the variance when the variance is greater than one (e.g., 1.2 or 120%).
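Both cases in the bullets above can be checked numerically; the two data sets below are illustrative assumptions chosen so that one has variance below one and the other above one:

```python
import statistics

low_spread = [0.1, 0.2, 0.3]      # tightly clustered: variance < 1
high_spread = [1.0, 5.0, 9.0]     # widely spread: variance > 1

var_low = statistics.pvariance(low_spread)    # ≈ 0.0067
sd_low = statistics.pstdev(low_spread)        # ≈ 0.0816 -- larger than variance

var_high = statistics.pvariance(high_spread)  # ≈ 10.67
sd_high = statistics.pstdev(high_spread)      # ≈ 3.27 -- smaller than variance
```

The population forms (`pvariance`/`pstdev`) are used here for simplicity; the same relationship holds for the sample versions.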

The table below summarizes some of the key differences between standard deviation and variance.