SDCalc · Intermediate · Applications · 12 min

Combining Standard Deviations: Sums, Differences, and Averages

Learn how to combine standard deviations correctly for sums, differences, and averages, why variances add instead of standard deviations, and when covariance changes the result.

By Standard Deviation Calculator Team (Data Science Team)

Why Standard Deviations Do Not Add

A common mistake is to add standard deviations directly when combining measurements, totals, or averages. That is not how spread works. Variances add cleanly under the right assumptions, not standard deviations.

The reason is geometric: standard deviation is the square root of variance. Once uncertainty comes from more than one source, you combine the squared contributions first and only take the square root at the end.

Variance-standard deviation relationship

variance = (standard deviation)^2

Quick rule

For independent quantities, convert each standard deviation to variance, add the variances, and then take the square root. Do not add the standard deviations themselves.
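The quick rule can be sketched in a few lines of Python (the function name is illustrative, not part of this site's tools):

```python
import math

def combine_independent_sds(*sds):
    """Combine SDs of independent quantities: square, sum, square-root."""
    return math.sqrt(sum(sd ** 2 for sd in sds))

# Two independent error sources with SDs 1.2 and 0.8:
print(round(combine_independent_sds(1.2, 0.8), 2))  # 1.44, not 1.2 + 0.8 = 2.0
```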

If you want to verify the arithmetic after reading, use the site's variance calculator, sample standard deviation calculator, and standard error of the mean calculator. For the theory behind the formulas, this article pairs naturally with Understanding Variance, Standard Error vs Standard Deviation, and Pooled Standard Deviation.

Core Rules for Sums, Differences, and Averages

Assume X and Y are independent random variables with standard deviations sd(X) and sd(Y). Then the following rules hold:

Sum of independent variables

Var(X + Y) = Var(X) + Var(Y)

Difference of independent variables

Var(X - Y) = Var(X) + Var(Y)

Average of n independent observations with common SD sigma

SD(mean) = sigma / √n
  • Add two independent quantities: Var(X + Y) = Var(X) + Var(Y), so SD(X + Y) = √(sd(X)^2 + sd(Y)^2).
  • Subtract two independent quantities: Var(X - Y) = Var(X) + Var(Y), so SD(X - Y) = √(sd(X)^2 + sd(Y)^2).
  • Multiply one variable by a constant c: Var(cX) = c^2 Var(X), so SD(cX) = |c| SD(X).
  • Average n independent values: Var(mean) = sigma^2 / n, so SD(mean) = sigma / √n.

Why subtraction still adds variance

Spread measures uncertainty around the center, not direction. A negative sign flips the mean but does not make uncertainty cancel out, so the variance contribution from Y stays positive in X - Y.
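A quick simulation makes this concrete (a sketch using Python's standard library; the SDs 1.5 and 0.5 are arbitrary choices):

```python
import random
import statistics

random.seed(0)
n = 200_000
x = [random.gauss(0, 1.5) for _ in range(n)]  # Var(X) = 2.25
y = [random.gauss(0, 0.5) for _ in range(n)]  # Var(Y) = 0.25

# Variance of the difference: close to 2.25 + 0.25 = 2.5,
# not 2.25 - 0.25 — the minus sign does not cancel spread.
diff = [a - b for a, b in zip(x, y)]
print(statistics.pvariance(diff))
```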

Worked Example: Combining Two Measurements

Suppose a lab reports two independent sources of variability for a final reading: instrument noise with SD 1.2 units and sampling noise with SD 0.8 units. The total measurement error is the sum of those two independent error terms.

Component | Standard deviation | Variance
Instrument noise | 1.2 | 1.44
Sampling noise | 0.8 | 0.64
Combined total | √2.08 ≈ 1.44 | 2.08

The combined variance is 1.44 + 0.64 = 2.08. Taking the square root gives a combined SD of √2.08 ≈ 1.44. If you had added the SDs directly, you would get 2.0, which overstates the true spread.

1. Square each SD: convert 1.2 and 0.8 into variances, 1.44 and 0.64.

2. Add the variances: for independent error sources, the total is 2.08.

3. Take the square root: return to the original units, √2.08 ≈ 1.44 units.
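The three steps above translate directly into code:

```python
import math

sd_instrument = 1.2
sd_sampling = 0.8

# Step 1: square each SD to get variances
var_instrument = sd_instrument ** 2   # 1.44
var_sampling = sd_sampling ** 2       # 0.64

# Step 2: add the variances (independent error sources)
var_total = var_instrument + var_sampling  # 2.08

# Step 3: take the square root to return to the original units
sd_total = math.sqrt(var_total)
print(round(sd_total, 2))  # 1.44
```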

How Averages Reduce Standard Deviation

Averages are less variable than individual observations because independent noise partly cancels out. If each observation has population SD sigma, the variance of the sample mean is sigma^2 / n, so the standard deviation of the mean becomes sigma / √n.

This is exactly why standard error shrinks as sample size grows. If a process has SD 12 and you average 9 independent observations, the SD of that average is 12 / √9 = 4.
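The 12 / √9 = 4 example can be checked by simulating many averages of 9 draws (a sketch; the normal distribution and mean of 100 are illustrative choices):

```python
import random
import statistics

random.seed(1)
sigma, n = 12.0, 9

# SD of the mean of n observations, estimated from 100,000 simulated means
means = [statistics.fmean(random.gauss(100, sigma) for _ in range(n))
         for _ in range(100_000)]
print(statistics.pstdev(means))  # close to sigma / sqrt(n) = 4
```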

Single observation

Spread is measured by the ordinary standard deviation of the process or dataset.

Average of many observations

Spread is measured by the standard deviation of the mean, often called the standard error when estimated from sample data.

If you want the inferential version of this idea, continue with Standard Error vs Standard Deviation and Confidence Intervals.

When Covariance Changes the Answer

The rules above require independence or, more generally, zero covariance. When variables move together, covariance changes the combined variance.

General two-variable rule

Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)

Positive covariance makes the total spread larger. Negative covariance makes it smaller. This is why portfolio risk, repeated measurements on the same subject, and paired experimental designs cannot be handled by the independence shortcut alone.
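The covariance-adjusted rule fits in one small function (a sketch; the `sign` parameter distinguishing sums from differences is an illustrative interface choice):

```python
import math

def combined_sd(sd_x, sd_y, cov=0.0, sign=+1):
    """SD of X + Y (sign=+1) or X - Y (sign=-1) with covariance cov.

    Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y)
    """
    var = sd_x ** 2 + sd_y ** 2 + sign * 2 * cov
    return math.sqrt(var)

# Independent case matches the earlier shortcut:
print(round(combined_sd(1.2, 0.8), 2))           # 1.44
# Positive covariance widens the spread of a sum:
print(round(combined_sd(1.2, 0.8, cov=0.5), 2))  # 1.75
```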

Do not assume independence blindly

Measurements taken on the same device, from the same person, or from the same time-series window are often correlated. In those cases, the simple variance-addition rule can be wrong in either direction.

Practical Checklist

  • Identify whether you are combining totals, differences, or averages.
  • Check whether the variables are reasonably independent or whether covariance must be modeled.
  • Convert each standard deviation into a variance before combining terms.
  • Apply any constant multiplier before taking the final square root: Var(cX) = c^2 Var(X).
  • Interpret the final SD in the original units of the combined quantity.
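The checklist for independent inputs can be wrapped into one helper (the name and the (sd, multiplier) interface are illustrative):

```python
import math

def checklist_combine(terms):
    """Combine independent (sd, multiplier) pairs into one SD.

    Each term contributes Var(c * X) = c^2 * sd^2; variances are
    summed and the square root is taken last, per the checklist.
    """
    total_var = sum((c * sd) ** 2 for sd, c in terms)
    return math.sqrt(total_var)

# A total of 3*X minus Y, with sd(X) = 0.5 and sd(Y) = 2.0:
# the minus sign enters as c = -1 and does not reduce the spread.
print(checklist_combine([(0.5, 3), (2.0, -1)]))  # sqrt(2.25 + 4.0) = 2.5
```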

Common Mistakes

Mistake | Why it is wrong | What to do instead
Add standard deviations directly | Standard deviations are square-root quantities, so direct addition misstates uncertainty | Add variances, then take the square root
Assume subtraction cancels variability | Direction affects the mean, not the variance contribution | Use the same variance-addition rule for X - Y when variables are independent
Use sigma / n for the SD of an average | Averages reduce variance by n, not SD by n | Use sigma / √n
Ignore covariance in paired or correlated data | Correlation changes the combined spread materially | Use the covariance-adjusted formula or a model that captures dependence

For direct calculation practice, compare results in the mean calculator, variance calculator, and sample standard deviation calculator. If your problem involves combining group summaries instead of random variables, the relevant next stop is Pooled Standard Deviation for Multiple Groups, not the formulas in this article.

Sources

References and further authoritative reading used in preparing this article.

  1. NIST/SEMATECH e-Handbook of Statistical Methods (NIST)
  2. Variance (Wikipedia)
  3. Propagation of uncertainty (Wikipedia)