Resolution in metrology, quite like the resolutions we make on New Year’s Day, can be a key indicator of the overall value of a measurement. However, we often fixate on the number of counts or digits our measurement instrument reads out without asking whether those digits are valid output values or simply noise appended to the signal.
Before we go 10 digits deep into resolution (get it?), let’s first understand what it means in the pressure calibration industry. According to the International Vocabulary of Metrology (VIM), resolution is the smallest change in a quantity being measured that causes a perceptible change in the corresponding indication.
Let’s take the modern-day silicon pressure sensor. Technically speaking, the resolution of a raw silicon sensing element is effectively infinite, limited only by the electronics used to read it. Packaged pressure transducers, however, include electronics that interpret the signal, such as A/D converters, and these set the resolution that can actually be derived from the transducer. Mensor’s CPT9000 Premium Pressure Transducer has an integrated 24-bit A/D converter with 16,777,216 steps or counts, the maximum number of discrete values or divisions that can be produced from that number of bits. So does this mean you can get better than 0.1 ppm of resolution when reading the transducer? Theoretically, yes. Practically, it depends…
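The arithmetic behind that claim can be sketched in a few lines (the helper name here is ours, not part of any instrument API):

```python
# Smallest step an ideal N-bit A/D converter can resolve, expressed
# in parts per million (ppm) of full span: one count out of 2**N.
def adc_resolution_ppm(bits: int) -> float:
    steps = 2 ** bits          # number of discrete counts
    return 1e6 / steps         # one count, in ppm of span

print(adc_resolution_ppm(24))  # ~0.0596 ppm, i.e. better than 0.1 ppm
```

For a 24-bit converter, one count is 1/16,777,216 of span, which is why the theoretical figure lands under 0.1 ppm.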
Theoretically, you can get better than 0.1 ppm resolution out of a digital pressure instrument, but its validity depends on the process of measurement, the overall uncertainty of the instrument, and the readout limitations set by the manufacturer of the instrument. Let’s evaluate these one by one.
Process of measurement
Consider reading out a single pressure value from a pressure transducer. That reading is part true signal and part noise. Because the noise is random, you can’t assign a high level of confidence to the trueness of a single value.
Now, if this process is repeated “n” times under the same conditions, you can average the readings to get significantly closer to the true value. And assuming the noise is random, the standard deviation of the mean, which shrinks in proportion to 1/√n, quantifies how much your confidence in the trueness of the readout has grown.
So imagine a 0 … 1 psi full-span transducer with 7-digit resolution, wherein you can resolve steps as small as 0.1 × 10⁻⁵ psi. The confidence in the overall signal-to-noise ratio of a single reading is significantly poorer than that of a larger sample.
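A quick simulation illustrates the point. The true pressure and noise level below are made-up values, not specifications of any transducer; the idea is only that averaging n readings shrinks the standard deviation of the mean by √n:

```python
import random
import statistics

random.seed(1)
true_pressure = 0.5000000   # psi, hypothetical true value
noise_sd = 2e-5             # psi, assumed random noise on a single reading

def reading() -> float:
    """One noisy readout from the hypothetical transducer."""
    return true_pressure + random.gauss(0.0, noise_sd)

samples = [reading() for _ in range(400)]
mean = statistics.fmean(samples)
# Standard deviation of the mean: sample stdev divided by sqrt(n).
sd_of_mean = statistics.stdev(samples) / len(samples) ** 0.5

print(f"single reading noise : ±{noise_sd:.1e} psi")
print(f"mean of 400 readings : ±{sd_of_mean:.1e} psi")
```

With 400 samples the scatter on the averaged value is roughly 20 times smaller than that of a single reading, so the extra digits of the average actually mean something.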
Uncertainty of the instrument
Resolution uncertainty is a factor that contributes to uncertainty in measurement. Unlike Type A components, whose contributions are obtained by statistical analysis of repeated readings under defined conditions, the resolution component is typically evaluated as a Type B contribution, modeled as a rectangular distribution, and combined with the other components in the overall root sum square (RSS) equation.
This means for the same 0 … 1 psi transducer with 7 digits of resolution, there is an uncertainty contribution tied to the 0.1 × 10⁻⁵ psi minimum resolution step. Its overall impact, though, depends greatly on how it compares with the other factors in both Type A (deviation) and Type B (reference uncertainty, linearity, repeatability, hysteresis, etc.) uncertainties.
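A sketch of how that resolution step enters an RSS budget, assuming a rectangular distribution (the reference and repeatability values below are invented for illustration, not taken from any data sheet):

```python
import math

# Resolution step of the example 0 ... 1 psi, 7-digit transducer.
resolution_step = 0.1e-5                  # psi

# Rectangular distribution: the standard uncertainty is the
# half-width of the step divided by sqrt(3).
u_resolution = (resolution_step / 2) / math.sqrt(3)

# Illustrative (made-up) standard uncertainties for other components.
u_reference = 2.0e-5                      # psi
u_repeatability = 1.5e-5                  # psi

# Root-sum-square combination of the components.
u_combined = math.sqrt(u_resolution**2 + u_reference**2 + u_repeatability**2)

print(f"u(resolution) = {u_resolution:.2e} psi")
print(f"u(combined)   = {u_combined:.2e} psi")
```

In this illustration the resolution term is two orders of magnitude below the other components, so it barely moves the combined figure, exactly the "overall impact depends on the other factors" point above.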
Limitations set by the manufacturer
In response to resolution’s contribution to the overall uncertainty, device manufacturers restrict the readout of their instruments (specifically instruments with a digital display indication) to reduce displayed noise and make the display easier to read.
In the previous example, the overall stated uncertainty of the transducer is 0.01% FS (reported at k = 2, for a confidence level of ~95%). This implies that readings with a sensitivity finer than 0.1 × 10⁻³ psi do not carry the same confidence in their trueness. So in this case, even though the resolution is capable of 0.1 × 10⁻⁵ psi, it is unclear whether the last two digits of the recorded measurement are of any significance.
This problem is somewhat mitigated by reading filters (exponential, boxcar or Butterworth) deployed to counteract the noisy output. However, it is imperative that the filters don’t make the display reading lag behind the true reading.
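A minimal sketch of one such filter, a first-order exponential smoother; the smoothing factors are arbitrary choices here, and the heavier one demonstrates the lag problem the paragraph above warns about:

```python
def exponential_filter(readings, alpha):
    """First-order exponential smoothing: y = alpha*x + (1 - alpha)*y_prev.
    Smaller alpha gives a smoother display but more lag behind the signal."""
    y = readings[0]
    out = [y]
    for x in readings[1:]:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# A pressure step from 0 to 1 psi: light filtering follows it quickly,
# heavy filtering is still visibly short of the true value.
step = [0.0] * 5 + [1.0] * 15
light = exponential_filter(step, alpha=0.5)
heavy = exponential_filter(step, alpha=0.05)
print(round(light[-1], 3), round(heavy[-1], 3))
```

The trade-off is plain: the lightly filtered display has essentially reached 1 psi by the end of the step, while the heavily filtered one is still lagging around half the true value.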
For pressure instruments like the Precision Pressure Indicator CPG2500, the reported measurement uncertainty is 0.008% (IntelliScale accuracy), or 80 ppm, while the resolution is limited to 6-7 digits. This is done mostly to ensure that the signal-to-noise ratio of the pressure output stays high.
So in summary, more resolution doesn’t necessarily mean a truer value, especially when the sample size of the reading itself is limited. That said, it is important to remember that resolution does factor into the overall uncertainty of the instrument.