The standard deviation is a measure of how much a group of numbers with a bell-shaped (Gaussian, or normal) distribution varies around its average. It is defined as
$$\sigma=\sqrt{\frac{1}{N-1}\sum_{i=1}^{N} (x_i-\bar{x})^2}$$
This tells us that if the data has a Gaussian distribution, then about 68% of the numbers should fall between ${\bar x}-\sigma$ and ${\bar x}+\sigma$. Let's see if this holds for the data set we're using here.
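As a concrete illustration, here is a minimal Python sketch that computes the average and standard deviation straight from the formula above. The list `data` is a made-up example; in the exercise you will use the tutorial's own data set.

```python
import math

# A made-up example data set; substitute the tutorial's data here.
data = [4.8, 5.1, 5.0, 4.9, 5.3, 4.7, 5.2, 5.0, 4.9, 5.1]

N = len(data)
average = sum(data) / N

# Standard deviation: square root of the sum of squared deviations, divided by N - 1
sigma = math.sqrt(sum((x - average) ** 2 for x in data) / (N - 1))

print("average =", average)
print("standard deviation =", sigma)
```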
Now you try. Fix the code to compute the average and standard deviation, and then tally the results. If you get stuck, a sketch of one possible approach appears after the results box.
Type your code here:
See your results here:
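Here is a sketch of one possible way to do the tally. It assumes the hypothetical `data` list and the `average` and `sigma` values from the sketch above.

```python
# Tally how many values fall within one standard deviation of the average.
count = 0
for x in data:
    if average - sigma <= x <= average + sigma:
        count += 1

fraction = count / len(data)
print("fraction within 1 standard deviation:", fraction)
```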
Theory tells us that about 68% of the numbers should fall within 1 standard deviation of the average. Is this true for the data?
Look up the expected fractions for $2\sigma$ and $3\sigma$. See if this data has the proper fractions for these as well.
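For a Gaussian distribution, roughly 68% of the values fall within $1\sigma$ of the average, about 95% within $2\sigma$, and about 99.7% within $3\sigma$. One way to check all three at once, again assuming the `data`, `average`, and `sigma` from the earlier sketches:

```python
# Compare the observed fractions to the Gaussian expectations.
expected = {1: 0.68, 2: 0.95, 3: 0.997}

for k in (1, 2, 3):
    count = sum(1 for x in data if abs(x - average) <= k * sigma)
    fraction = count / len(data)
    print(f"within {k} sigma: observed {fraction:.2f}, expected about {expected[k]}")
```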
Share your code
Show a friend, family member, or teacher what you've done!