by StaceyKoprince Thu May 22, 2008 3:20 pm
Actually calculating standard deviation requires you to:
1) find the average of the entire set
2) subtract the average (from step 1) from each individual term
3) square each result from step 2
4) average the results from step 3
5) take the square root of the result from step 4
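The five steps above can be sketched in a few lines of Python (the 3-term set is just a made-up example, and this uses the population version of standard deviation):

```python
# Sketch of the five steps above, using a made-up 3-term set.
terms = [1, 3, 5]

mean = sum(terms) / len(terms)            # step 1: average of the set
diffs = [t - mean for t in terms]         # step 2: difference between each term and the average
squares = [d ** 2 for d in diffs]         # step 3: square each difference
variance = sum(squares) / len(squares)    # step 4: average the squared differences
std_dev = variance ** 0.5                 # step 5: square root of that average

print(round(std_dev, 2))  # 1.63
```

Even with only 3 small numbers, that's a lot of arithmetic for 2 minutes without a calculator, which is exactly the point.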
If the test actually required you to do this in 2 minutes without a calculator (unless it only gave you like 3 numbers)... I'll eat my hat. (I don't actually have a hat... but you know what I mean!)
The test can ask you to interpret something about a standard deviation, such as what could change a standard deviation. So you could certainly be doing some calculations, but not the full set of calculations listed above!
To address Jimmy's question, let's say I tell you that you have a set of 5 terms (numbers) with an average of 3 and a standard deviation of about 1.4. Then I ask you to add a 6th term that will INCREASE the standard deviation (make it larger than 1.4).
Standard deviation is a measure of how far the terms are from the average. So:
- Anything that falls within one standard deviation of the average will decrease the standard deviation (in this case, anything between about 1.6 and 4.4).
- Anything that falls exactly one standard deviation from the average will maintain the same standard deviation (in this case, about 1.6 or 4.4).
- Anything that falls outside of one standard deviation will increase the standard deviation (in this case, anything smaller than about 1.6 or larger than about 4.4).
In this case, the right answer will be something that fits with the third bullet.
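As a quick check of that behavior, here's a sketch using a hypothetical set matching the numbers in the example: {1, 2, 3, 4, 5} has an average of 3 and a population standard deviation of √2 ≈ 1.4.

```python
import statistics

terms = [1, 2, 3, 4, 5]                      # average 3, population SD ≈ 1.4
original_sd = statistics.pstdev(terms)       # sqrt(2) ≈ 1.414

# A 6th term inside one SD of the average shrinks the spread...
sd_with_3 = statistics.pstdev(terms + [3])   # ≈ 1.291 (decreased)
# ...while a 6th term outside one SD widens it.
sd_with_6 = statistics.pstdev(terms + [6])   # ≈ 1.708 (increased)

print(sd_with_3 < original_sd < sd_with_6)   # True
```

So on a real problem, you'd scan the answer choices for the one that falls below about 1.6 or above about 4.4.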
Stacey Koprince
Instructor
Director, Content & Curriculum
ManhattanPrep