B. Standard Deviation
So far you have looked at a variety of distributions and described them using the shape of their histograms. It is also possible to describe a distribution numerically. Mean, median, mode, and range are all values you have used in the past to describe a set of data.
Recall that the range is a measure of dispersion. Another measure of dispersion is the standard deviation. Just as the mean, median, and mode were different ways of measuring the "centre" of a data set, the range and standard deviation are different ways of measuring how spread out the data is (the dispersion of the data).
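To see why range and standard deviation are genuinely different measures, it helps to compare two data sets that have the same range but different amounts of clustering. The sketch below uses hypothetical data (the values are illustrative, not from the text) and Python's standard-library `statistics` module:

```python
import statistics

# Two hypothetical data sets with the same range (80) but different clustering
data_a = [10, 10, 10, 50, 90, 90, 90]  # most values far from the centre
data_b = [10, 50, 50, 50, 50, 50, 90]  # most values near the centre

for name, data in (("data_a", data_a), ("data_b", data_b)):
    data_range = max(data) - min(data)
    sigma = statistics.pstdev(data)  # population standard deviation
    print(f"{name}: range = {data_range}, standard deviation = {sigma:.1f}")
```

Both sets have a range of 80, yet the first has a much larger standard deviation, because the range looks only at the two extreme values while the standard deviation accounts for every value in the set.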
Open the Standard Deviation Histogram applet (4 March 2013, Created with GeoGebra) to see how the shape of a histogram and the standard deviation of a set of data are related.
- Try recreating the histograms from the previous Practice Run. What do you think standard deviation is measuring?
- Try to make the standard deviation as large and as small as you can.
- Knowing that the standard deviation is a measure of dispersion, what does a large standard deviation mean about the data? What does a small standard deviation mean?
Using the Standard Deviation Histogram applet, you may have found that when the data is fairly clustered, the standard deviation is small, whereas when the data is more spread out, the standard deviation is larger. Standard deviation is a measure of how closely the data clusters around the mean of the data. The following table shows some histograms representing data sets with small and large standard deviations.
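The same relationship can be checked numerically. The sketch below (with made-up data sets chosen to have the same mean) shows that tightly clustered data produces a small standard deviation and widely spread data a large one:

```python
import statistics

# Hypothetical data sets, both with mean 50, chosen for illustration
clustered = [48, 49, 50, 50, 51, 52]  # values bunched near the mean
spread = [20, 35, 50, 50, 65, 80]     # values far from the mean

for name, data in (("clustered", clustered), ("spread", spread)):
    mu = statistics.mean(data)
    sigma = statistics.pstdev(data)  # population standard deviation
    print(f"{name}: mean = {mu:.0f}, standard deviation = {sigma:.2f}")
```

Both sets have a mean of 50, but the clustered set's standard deviation is roughly 1.3 while the spread set's is roughly 19.4, mirroring what the applet shows visually.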
| Distributions with Smaller Standard Deviations | Distributions with Larger Standard Deviations |
|---|---|
| (histogram with data clustered near the mean) | (histogram with data spread far from the mean) |
| (histogram with data clustered near the mean) | (histogram with data spread far from the mean) |
The mean of a set of data is often represented using the lowercase Greek letter mu, µ, while the standard deviation is often represented using the lowercase Greek letter sigma, σ. In the next example you will see how the mean and standard deviation can help you interpret a set of data.
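The section above names µ and σ but does not show how σ is calculated. As a sketch (assuming the population standard deviation, the usual convention at this level), σ is the square root of the average squared deviation from the mean µ; the data set below is made up for illustration:

```python
import math
import statistics

# Hypothetical data set for illustration
data = [2, 4, 4, 4, 5, 5, 7, 9]

mu = sum(data) / len(data)                        # mean (mu)
squared_devs = [(x - mu) ** 2 for x in data]      # squared deviation of each value
sigma = math.sqrt(sum(squared_devs) / len(data))  # population standard deviation (sigma)

print(f"mu = {mu}, sigma = {sigma}")  # → mu = 5.0, sigma = 2.0

# The hand calculation agrees with the standard library's pstdev
assert sigma == statistics.pstdev(data)
```

Note that `statistics.pstdev` divides by the number of values N (the population formula); the related `statistics.stdev` divides by N − 1, which is used when the data is a sample rather than the whole population.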