Why Standard Deviations Do Not Add
A common mistake is to add standard deviations directly when combining measurements, totals, or averages. That is not how spread works. Variances add cleanly under the right assumptions, not standard deviations.
The reason is geometric: standard deviation is the square root of variance. Once uncertainty comes from more than one source, you combine the squared contributions first and only take the square root at the end.
If you want to verify the arithmetic after reading, use the site's variance calculator, sample standard deviation calculator, and standard error of the mean calculator. For the theory behind the formulas, this article pairs naturally with Understanding Variance, Standard Error vs Standard Deviation, and Pooled Standard Deviation.
Core Rules for Sums, Differences, and Averages
Assume X and Y are independent random variables with standard deviations sd(X) and sd(Y). Then the following rules hold:
- Sum of independent variables
- Difference of independent variables
- Average of n independent observations with common SD sigma
| Goal | Correct variance rule | Resulting standard deviation |
|---|---|---|
| Add two independent quantities | Var(X + Y) = Var(X) + Var(Y) | SD(X + Y) = √(sd(X)^2 + sd(Y)^2) |
| Subtract two independent quantities | Var(X - Y) = Var(X) + Var(Y) | SD(X - Y) = √(sd(X)^2 + sd(Y)^2) |
| Multiply one variable by a constant c | Var(cX) = c^2 Var(X) | SD(cX) = \|c\| SD(X) |
| Average n independent values | Var(mean) = sigma^2 / n | SD(mean) = sigma / √n |
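The table above can be sketched as a few small helper functions. This is an illustrative sketch, not code from the article; the function names are assumptions.

```python
import math

def sd_sum(sd_x, sd_y):
    """SD of X + Y (or X - Y) for independent X, Y: sqrt(sd_x^2 + sd_y^2)."""
    return math.sqrt(sd_x**2 + sd_y**2)

def sd_scaled(c, sd_x):
    """SD of c*X: |c| * sd_x, since Var(cX) = c^2 Var(X)."""
    return abs(c) * sd_x

def sd_mean(sigma, n):
    """SD of the mean of n independent observations with common SD sigma."""
    return sigma / math.sqrt(n)

print(sd_sum(1.2, 0.8))    # ~1.44, not 1.2 + 0.8 = 2.0
print(sd_scaled(-3, 2.0))  # 6.0, the sign of the constant drops out
print(sd_mean(12, 9))      # 4.0
```

Note that `sd_sum` is the same for sums and differences, which is exactly the point of the subtraction row.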
Why subtraction still adds variance
Writing X - Y as X + (-1)Y, the constant rule gives Var(-Y) = (-1)^2 Var(Y) = Var(Y), so Var(X - Y) = Var(X) + Var(Y). The minus sign shifts the mean, not the spread.
Worked Example: Combining Two Measurements
Suppose a lab reports two independent sources of variability for a final reading: instrument noise with SD 1.2 units and sampling noise with SD 0.8 units. The total measurement error is the sum of those two independent error terms.
| Component | Standard deviation | Variance |
|---|---|---|
| Instrument noise | 1.2 | 1.44 |
| Sampling noise | 0.8 | 0.64 |
| Combined total | √2.08 ≈ 1.44 | 2.08 |
The combined variance is 1.44 + 0.64 = 2.08. Taking the square root gives a combined SD of √2.08 ≈ 1.44. If you had added the SDs directly, you would get 2.0, which overstates the true spread.
1. Square each SD.
2. Add the variances.
3. Take the square root.
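The three steps above can be traced directly in code. This is a sketch of the worked example; the dictionary layout is an assumption for readability.

```python
import math

# SDs of the two independent error sources from the worked example
components = {"instrument noise": 1.2, "sampling noise": 0.8}

# Step 1: square each SD to get variances
variances = {name: sd**2 for name, sd in components.items()}

# Step 2: add the variances
total_var = sum(variances.values())

# Step 3: take the square root
combined_sd = math.sqrt(total_var)

print(round(total_var, 2))    # 2.08
print(round(combined_sd, 2))  # 1.44
```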
How Averages Reduce Standard Deviation
Averages are less variable than individual observations because independent noise partly cancels out. If each observation has population SD sigma, the variance of the sample mean is sigma^2 / n, so the standard deviation of the mean becomes sigma / √n.
This is exactly why standard error shrinks as sample size grows. If a process has SD 12 and you average 9 independent observations, the SD of that average is 12 / √9 = 4.
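A quick simulation makes the sigma / √n rule concrete. The sketch below (an assumption, not from the article) draws many sample means of 9 observations from a process with SD 12 and compares their empirical spread to the predicted value of 4.

```python
import math
import random
import statistics

random.seed(42)
sigma, n, trials = 12.0, 9, 20000

# Spread of the mean of n independent observations, estimated by simulation
means = [statistics.fmean(random.gauss(0, sigma) for _ in range(n))
         for _ in range(trials)]

observed = statistics.stdev(means)
predicted = sigma / math.sqrt(n)  # 12 / sqrt(9) = 4.0

print(round(observed, 2), predicted)
```

The observed SD of the simulated means should land very close to 4, while each individual observation still has SD 12.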
If you want the inferential version of this idea, continue with Standard Error vs Standard Deviation and Confidence Intervals.
When Covariance Changes the Answer
The rules above require independence or, more generally, zero covariance. When variables move together, covariance changes the combined variance.
General two-variable rule
Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y). Setting Cov(X, Y) = 0 recovers the independence rules above.
Positive covariance makes the total spread larger. Negative covariance makes it smaller. This is why portfolio risk, repeated measurements on the same subject, and paired experimental designs cannot be handled by the independence shortcut alone.
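The covariance-adjusted combination can be sketched with correlation rho, using Cov(X, Y) = rho · sd(X) · sd(Y). The helper name and signature are illustrative assumptions.

```python
import math

def combined_sd(sd_x, sd_y, rho=0.0, sign=1):
    """SD of X + Y (sign=1) or X - Y (sign=-1) when corr(X, Y) = rho.

    Uses Var(X +/- Y) = Var(X) + Var(Y) +/- 2 * rho * sd_x * sd_y.
    """
    var = sd_x**2 + sd_y**2 + sign * 2 * rho * sd_x * sd_y
    return math.sqrt(var)

print(combined_sd(1.2, 0.8, rho=0.0))   # ~1.44, the independence answer
print(combined_sd(1.2, 0.8, rho=0.6))   # ~1.80, positive covariance widens the spread
print(combined_sd(1.2, 0.8, rho=-0.6))  # ~0.96, negative covariance narrows it
```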
Do not assume independence blindly: paired, repeated, or pooled measurements usually carry nonzero covariance, so check the study design before applying the shortcut.
Practical Checklist
- Identify whether you are combining totals, differences, or averages.
- Check whether the variables are reasonably independent or whether covariance must be modeled.
- Convert each standard deviation into a variance before combining terms.
- Apply any constant multiplier before taking the final square root: Var(cX) = c^2 Var(X).
- Interpret the final SD in the original units of the combined quantity.
Common Mistakes
| Mistake | Why it is wrong | What to do instead |
|---|---|---|
| Add standard deviations directly | Standard deviations are square-root quantities, so direct addition misstates uncertainty | Add variances, then take the square root |
| Assume subtraction cancels variability | Direction affects the mean, not the variance contribution | Use the same variance-addition rule for X - Y when variables are independent |
| Use sigma / n for the SD of an average | Averages reduce variance by n, not SD by n | Use sigma / sqrt(n) |
| Ignore covariance in paired or correlated data | Correlation changes the combined spread materially | Use the covariance-adjusted formula or a model that captures dependence |
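The first mistake in the table is easy to demonstrate empirically. This simulation (an illustrative sketch, not from the article) sums two independent error sources and shows that the variance rule matches the data while direct SD addition does not.

```python
import math
import random
import statistics

random.seed(0)
N = 50000

# Two independent error sources with SDs 1.2 and 0.8, as in the worked example
totals = [random.gauss(0, 1.2) + random.gauss(0, 0.8) for _ in range(N)]

empirical = statistics.stdev(totals)          # measured spread of the sum
variance_rule = math.sqrt(1.2**2 + 0.8**2)    # ~1.44, the correct answer
naive_sum = 1.2 + 0.8                         # 2.0, the common mistake

print(round(empirical, 2), round(variance_rule, 2), naive_sum)
```

The empirical SD sits near 1.44, well below the 2.0 produced by adding SDs directly.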
For direct calculation practice, compare results in the mean calculator, variance calculator, and sample standard deviation calculator. If your problem involves combining group summaries instead of random variables, the relevant next stop is Pooled Standard Deviation for Multiple Groups, not the formulas in this article.
Sources
References and further authoritative reading used in preparing this article.
- NIST/SEMATECH e-Handbook of Statistical Methods (NIST)
- Variance (Wikipedia)
- Propagation of uncertainty (Wikipedia)