Two Ways to Measure Spread
Both range and standard deviation measure how spread out data is, but they capture fundamentally different aspects of dispersion. Understanding when to use each is essential for proper data analysis.
Range tells you about the extremes—how far apart the highest and lowest values are. Standard deviation tells you about the typical spread around the average. Both are useful, but for different purposes.
Definitions and Formulas
Range
The range is the difference between the largest and smallest values: Range = max − min. It depends on only two data points, so a single extreme value determines it completely.
Standard Deviation
The sample standard deviation measures typical deviation from the mean: s = √( Σ(xᵢ − x̄)² / (n − 1) ). Because every observation contributes, it summarizes the whole distribution rather than just its endpoints.
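As a concrete sketch, both measures can be computed from scratch in a few lines of Python (the dataset here is purely illustrative):

```python
import math
import statistics

data = [4, 8, 6, 5, 3, 7, 9]  # illustrative sample

# Range: distance between the two extremes
data_range = max(data) - min(data)

# Sample standard deviation: typical spread around the mean,
# using the n - 1 (Bessel-corrected) denominator
mean = sum(data) / len(data)
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (len(data) - 1))

print(data_range)                  # 6
print(abs(sd - statistics.stdev(data)) < 1e-12)  # True: matches the stdlib
```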
Head-to-Head Comparison
Range Advantages & Disadvantages
Advantages: trivial to compute, easy to explain, and it directly answers questions about extremes. Disadvantages: it uses only two data points, a single outlier controls it entirely, and it tends to grow with sample size, which makes comparisons across differently sized samples misleading.
SD Advantages & Disadvantages
Advantages: it uses every observation, is stable across sample sizes, and underpins most inferential statistics (confidence intervals, hypothesis tests). Disadvantages: it is harder to compute and explain, and squaring deviations still gives outliers disproportionate weight, though less so than with range.
When to Use Each
Use Range when:
- You need a quick, rough estimate of spread
- Extreme values are what matters (e.g., temperature range for HVAC design)
- Data is known to be clean with no outliers
- Communicating with audiences unfamiliar with statistics
- Sample size is small and fixed (same size for all comparisons)
Use Standard Deviation when:
- Performing statistical analysis or hypothesis testing
- Comparing variability across different sample sizes
- Computing confidence intervals or p-values
- Assessing typical variation rather than extremes
- Data may contain outliers that shouldn't single-handedly determine the measure (as a single extreme value does with range)
Practical Examples
Example: Daily Temperatures
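A sketch of this scenario in Python (the temperature readings are made up for illustration): the range answers the HVAC-style question of what extremes the system must handle, while the SD describes a typical day-to-day swing.

```python
import statistics

# Hypothetical daily high temperatures (degrees C) over one week
temps = [21, 23, 22, 25, 19, 24, 22]

temp_range = max(temps) - min(temps)   # extremes the system must handle
temp_sd = statistics.stdev(temps)      # typical day-to-day variation

print(f"Range: {temp_range} C")        # Range: 6 C
print(f"SD: {temp_sd:.2f} C")          # roughly 2 C of typical variation
```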
Example: Test Scores with Outlier
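A sketch of the outlier scenario (the scores are hypothetical): one anomalous score stretches both measures, but the range is set entirely by that single point, whereas the SD still averages over every score.

```python
import statistics

scores = [82, 85, 88, 84, 86, 87, 83]  # tight cluster of class scores
scores_outlier = scores + [20]          # one anomalous score added

for label, s in [("clean", scores), ("with outlier", scores_outlier)]:
    r = max(s) - min(s)
    sd = statistics.stdev(s)
    print(f"{label}: range={r}, sd={sd:.1f}")
# clean:        range=6,  sd=2.2
# with outlier: range=68, sd=23.1
```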
Advanced Considerations
Relationship Between Range and SD: For approximately normal data, the expected range grows slowly with sample size: roughly 4 × SD for samples around n = 30, rising toward 5-6 × SD for larger samples. This allows rough conversion between the two measures.
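This rule of thumb is easy to check empirically with simulated normal data (the sample size and seed below are arbitrary choices):

```python
import random
import statistics

random.seed(42)  # arbitrary seed, for reproducibility only

# Draw one normal sample and compare its range to its SD
n = 100
sample = [random.gauss(0, 1) for _ in range(n)]

ratio = (max(sample) - min(sample)) / statistics.stdev(sample)
print(f"range / sd = {ratio:.2f}")  # typically around 5 for n = 100
```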
Interquartile Range (IQR): A compromise that uses Q3 - Q1 instead of max - min. It's more robust than range while simpler than SD.
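A minimal IQR computation using only the standard library (the data is illustrative): statistics.quantiles returns the three quartile cut points, and the IQR ignores both tails, so the outlier that dominates the range leaves it untouched.

```python
import statistics

data = [3, 5, 7, 8, 9, 11, 13, 14, 80]  # 80 is an outlier

q1, _, q3 = statistics.quantiles(data, n=4)  # default 'exclusive' method
iqr = q3 - q1

print(f"range = {max(data) - min(data)}")  # range = 77, set by the outlier
print(f"IQR   = {iqr}")                    # IQR   = 7.5, unaffected by it
```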
Best Practice
In most analyses, report the standard deviation (or the IQR for skewed data) as the primary measure of spread, and quote the range as supplementary context about the extremes.