Is there a general rule of thumb for how accurate a gauge should be compared to the parts being checked? In my chemistry class, I remember making standards where the instrument needed one more significant figure than the standard solution we were creating. Is it the same rule for gauges?
1 Answer
A measuring device has two main contributions to its accuracy and precision: a calibration uncertainty and a scale (device) uncertainty.
Consider a ruler. It expands and contracts as it is heated and cooled. Now consider a beam being measured by that ruler. The beam does not expand and contract to the same extent as the ruler, so measurements at two different temperatures will give two different lengths. This is the calibration uncertainty. It causes readings to be inaccurate.
Consider a ruler marked in millimeters and one marked in inches. The former has a scale (device) uncertainty of $\pm 0.5$ mm. The latter has a device uncertainty of $\pm 0.5$ in $\approx 13$ mm. Device uncertainties affect the precision of the measurements, not the accuracy.
The best approach to obtain a rule of thumb is to use relative uncertainties rather than absolute.
Consider that you will measure a beam that is EXACTLY 1 m long regardless of temperature. You measure on a cold day and on a hot day using a ruler that expands and contracts by, say, 100 microns. Your relative calibration uncertainty is $100/10^6 = \pm 100$ ppm. Using the millimeter ruler, your relative device uncertainty is $0.5/10^3 = \pm 0.05\%$. Using the inch ruler, your relative device uncertainty is $13/10^3 = \pm 1.3\%$.
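To make the arithmetic explicit, here is a minimal sketch of those calculations. The numbers (the hypothetical 1 m beam, the 100 micron drift, the half-division scale uncertainties) are the illustrative values above, not real gauge data:

```python
# Relative uncertainty = absolute uncertainty / measured length (dimensionless).
# All values are the illustrative numbers from the example above.

beam_length_mm = 1000.0                    # the "exactly 1 m" beam
calibration_uncertainty_mm = 0.100         # ruler expands/contracts by ~100 microns
device_uncertainty_mm_ruler = 0.5          # half the smallest division, mm ruler
device_uncertainty_in_ruler = 0.5 * 25.4   # half the smallest division, inch ruler, in mm

def relative(uncertainty_mm: float, length_mm: float) -> float:
    """Return the dimensionless relative uncertainty."""
    return uncertainty_mm / length_mm

print(f"calibration: {relative(calibration_uncertainty_mm, beam_length_mm) * 1e6:.0f} ppm")
print(f"mm ruler:    {relative(device_uncertainty_mm_ruler, beam_length_mm) * 100:.2f} %")
print(f"inch ruler:  {relative(device_uncertainty_in_ruler, beam_length_mm) * 100:.2f} %")
```

This prints roughly 100 ppm, 0.05 %, and 1.27 % (rounded to 1.3 % in the text), matching the figures quoted above.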
As a rule of thumb, I routinely teach (in a colloquial manner) that better than $\pm 0.5\%$ is national-laboratory standard, better than $\pm 1\%$ is calibration standard, better than $\pm 5\%$ is highly regarded, better than $\pm 10\%$ is routinely reproducible, better than $\pm 15\%$ is engineering seat of the pants, and anything higher than $\pm 20\%$ is gossip.
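Purely as an illustration of that colloquial grading, one might bin a relative uncertainty like this. The cut-offs and labels are the ones just stated; the function name is my own, and the band between 15 % and 20 % is not named in the rule of thumb:

```python
def grade_relative_uncertainty(rel_pct: float) -> str:
    """Map a relative uncertainty (in percent) to the colloquial grade above."""
    if rel_pct < 0.5:
        return "national-laboratory standard"
    if rel_pct < 1:
        return "calibration standard"
    if rel_pct < 5:
        return "highly regarded"
    if rel_pct < 10:
        return "routinely reproducible"
    if rel_pct < 15:
        return "engineering seat of the pants"
    if rel_pct <= 20:
        return "unnamed band"  # the 15-20 % range is not labeled in the rule of thumb
    return "gossip"

print(grade_relative_uncertainty(1.3))   # inch ruler on the 1 m beam -> "highly regarded"
print(grade_relative_uncertainty(0.05))  # mm ruler -> "national-laboratory standard"
```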
Are these rules codified in some resource? Yes, in my hand-written or printed-out notes for some of my classes. Otherwise, no. The requirement for measurement tolerance (overall uncertainty) may be codified in a document that defines how the component is to be used. At that point, your job is to find a measuring device (gauge) whose uncertainty is no greater than that overall uncertainty, and presumably better.
In summary, it is NOT about an "extra" number of significant digits. It is about accepting that you will have an uncertainty and that you will need to define and control its extent.