Accuracy and Uncertainty in Performance Specifications

Accuracy is a measure of a calibration product’s performance and quality. While accuracy is the most popular indicator of quality, not all manufacturers use it; some use the term uncertainty to describe the quality of their calibration equipment.

The waters get even muddier when these two terms are used (incorrectly) interchangeably. While accuracy is how close a reading is to its true value, uncertainty pertains to the outliers and anomalies that would otherwise skew accuracy figures.

Read "Accuracy vs. Precision" to learn more about these two common terms.

Generally, uncertainty is a measure of the statistical dispersion of the measured values. All measurements, from weight to temperature to pressure, are subject to some level of uncertainty. Why is this important for determining accuracy?
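
For reference, this dispersion is commonly quantified as the experimental standard deviation of n repeated readings, from which a standard uncertainty of the mean follows (this is the standard Type A evaluation described in the GUM):

```latex
s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},
\qquad
u(\bar{x}) = \frac{s}{\sqrt{n}}
```

Here the x_i are the individual readings and x̄ is their mean.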

How Accuracy, Uncertainty and Deviation Connect

While accuracy indicates how close a measurement is to its true value, uncertainty takes into account any statistical outliers that don’t conform. These may arise from anomalies, adjustments or other outside factors. Factoring these anomalies directly into an instrument’s accuracy would be misleading; treating the uncertainty values as a whole and calculating them as a component of accuracy gives a better indicator of an instrument’s overall performance.

To better determine measurement uncertainty, you must also calculate the deviation of a reading. Deviation is the difference between the measured value and the true or expected value. For example, the limited resolution of a display leads to a reading error that contributes to the measurement uncertainty. The deviation reflects both the random and systematic components of a measurement. Accuracy is inversely related to deviation: the greater the deviation, the higher the measurement uncertainty, and the less accurately the instrument performs.
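
To make this concrete, here is a minimal Python sketch that computes the deviation and dispersion of a set of repeated readings against a known reference value; the readings and the reference value are invented for illustration, not taken from any real instrument:

```python
import math
import statistics

# Hypothetical data: five repeated pressure readings (bar) from an
# instrument under test, compared against a known reference value.
true_value = 10.000
readings = [10.012, 10.008, 10.015, 10.009, 10.011]

mean_reading = statistics.mean(readings)

# Deviation: difference between the measured value and the true value.
# The deviation of the mean estimates the systematic (bias) component.
systematic_error = mean_reading - true_value

# The spread of the readings estimates the random component:
# experimental standard deviation s, and standard uncertainty s / sqrt(n).
s = statistics.stdev(readings)
u_mean = s / math.sqrt(len(readings))

print(f"mean reading:       {mean_reading:.4f} bar")
print(f"systematic error:   {systematic_error:+.4f} bar")
print(f"standard deviation: {s:.4f} bar")
print(f"std. uncertainty:   {u_mean:.4f} bar")
```

Both components matter: a large systematic error with a small spread points to an adjustment problem, while a large spread drives up the measurement uncertainty itself.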

Guidelines for Performance Specifications

In the world of measurement technology, there are numerous guidelines for evaluating the performance of calibration instruments. These are generally aligned with the International Vocabulary of Metrology (VIM) and the Guide to the Expression of Uncertainty in Measurement (GUM). However, the guidelines are often formulated in general terms and do not include all terms from the measurement technology standards. The European Association of National Metrology Institutes (EURAMET) has therefore drawn up guidelines to harmonize measurements in the pressure measurement and calibration industry. These guidelines are meant to support laboratories and establish practical procedures.

Typical parameters of calibration, such as measurement uncertainty and measuring error, are dependent upon the following factors:

  • Reference instrument and its characteristics
  • Repeatability
  • Hysteresis properties of test item
  • Resolution
  • Offset error

From these characteristics, a mathematical model can be created to determine the measurement uncertainty and measuring error.
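
As a sketch of what such a model can look like, the snippet below combines the factors listed above in quadrature (root sum of squares) and expands the result with a coverage factor, following the approach of the GUM. Every contribution value here is an illustrative placeholder, assumed to be already expressed as a standard uncertainty in the same unit:

```python
import math

# Illustrative standard uncertainty contributions (all in bar).
# Placeholder values, not the specification of any real instrument.
contributions = {
    "reference_instrument": 0.0050,  # from the reference's calibration certificate
    "repeatability":        0.0020,  # Type A: dispersion of repeated readings
    "hysteresis":           0.0030,  # up-scale vs. down-scale difference of the test item
    "resolution":           0.0010 / (2 * math.sqrt(3)),  # rectangular distribution
    "offset_error":         0.0015,  # residual zero offset
}

# Combined standard uncertainty per the GUM: root sum of squares,
# assuming the contributions are uncorrelated.
u_combined = math.sqrt(sum(u ** 2 for u in contributions.values()))

# Expanded uncertainty with coverage factor k = 2
# (roughly 95 % coverage for a normal distribution).
U = 2 * u_combined

print(f"combined standard uncertainty: {u_combined:.4f} bar")
print(f"expanded uncertainty (k=2):    {U:.4f} bar")
```

In practice, it is this expanded uncertainty, rather than any single contribution, that typically appears on calibration certificates.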

For pressure calibration instruments with complex capabilities, such as deadweight testers and controllers, and for instruments with minute measuring deviations, measurement uncertainty plays an important role in the accuracy of the measurement. For these instruments, the systematic components, the uncertainty of the reference instrument, the ambient conditions and the uncertainty contributions of the instrument itself together form the components of the instrument's accuracy.

So, uncertainty and accuracy are NOT the same, especially when comparing instruments from various manufacturers who might be misinterpreting these terms themselves!

Download 'Performance Specifications of Calibration Instruments' Trade Article