In calibrating pressure transducers and transmitters, a standard field practice is to perform a simple one-point calibration. This is done at the point of maximum deviation, typically the "zero" or "span" point. The aim is to correct the drifting point in the measurement range while retaining the transducer's overall specification throughout the range. It is widely assumed that a one-point calibration does not affect other measuring points in the range, but this is not the case!
Let us take the example of a one-point calibration performed to "zero" a pressure transducer. The calibration applies a zero offset to the transducer: an adjustment at the zero point that shifts the entire calibration curve. The shift is especially significant for low-range transducers, where it moves the span point dramatically. If, on the other hand, the one-point calibration is performed at the "span" point, the adjustment causes the curve to rotate around the zero point. This happens because a span adjustment is essentially a multiplier, so it produces a greater offset the farther a point lies from zero. So how do we rectify this undesired shift across the calibration range?
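The two behaviors can be sketched with a simple linear transducer model. This is an illustration only; the function names and the example readings (in percent of full scale) are assumptions, not Mensor firmware behavior.

```python
def apply_zero_offset(reading, offset):
    # A zero offset shifts every point in the range by the same amount.
    return reading + offset

def apply_span_multiplier(reading, multiplier):
    # A span multiplier scales the reading, so the resulting error
    # grows the farther the point lies from zero.
    return reading * multiplier

readings = [0.0, 25.0, 50.0, 75.0, 100.0]  # % of full scale

# Zero-only adjustment: the whole curve shifts uniformly by +0.5
shifted = [apply_zero_offset(r, 0.5) for r in readings]
print(shifted)  # [0.5, 25.5, 50.5, 75.5, 100.5]

# Span-only adjustment: the curve rotates around the zero point,
# leaving zero untouched but moving the top of the range the most
rotated = [apply_span_multiplier(r, 1.01) for r in readings]
print(rotated)  # [0.0, 25.25, 50.5, 75.75, 101.0]
```

Note how the zero offset disturbs every point equally, while the span multiplier leaves zero alone and grows toward full scale: exactly the shift and rotation described above.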
The ideal solution is to linearize the transducer throughout its range. Unlike in a typical calibration lab environment, however, this is not feasible in a field setting because of the time and resources required. The practical, field-friendly solution to this problem is to perform a two-point calibration!
A two-point calibration allows the user to adjust two points in the transducer's range, close to the zero and span points. This gives the user both a "zero offset" and a "span multiplier," ensuring the validity of the complete calibration curve. The calibration is performed by comparing the reference values at the high and low points to the corresponding measured values. The span adjustment is then used to calculate the zero offset. This adjustment ensures the performance of the transducer is met throughout the calibration curve.
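The arithmetic behind a two-point calibration can be sketched as follows. This is a minimal illustration, assuming a linear transducer response; the function names and the example values are hypothetical, not taken from any specific instrument.

```python
def two_point_calibration(ref_low, meas_low, ref_high, meas_high):
    # Span multiplier: ratio of the reference span to the measured span.
    span = (ref_high - ref_low) / (meas_high - meas_low)
    # Zero offset: derived from the span adjustment so that the
    # corrected low point lands exactly on its reference value.
    zero = ref_low - span * meas_low
    return zero, span

def correct(reading, zero, span):
    # Apply the span multiplier and zero offset to a raw reading.
    return span * reading + zero

# Hypothetical example: the transducer reads 0.2 at a reference of 0.0
# and 99.6 at a reference of 100.0 (units arbitrary).
zero, span = two_point_calibration(0.0, 0.2, 100.0, 99.6)
print(round(correct(0.2, zero, span), 6))   # 0.0  (low point restored)
print(round(correct(99.6, zero, span), 6))  # 100.0 (high point restored)
```

Because both the slope and the intercept are corrected, every point between (and reasonably near) the two calibration points is pulled back toward the reference line, rather than one end of the range being sacrificed for the other.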
Mensor’s pressure controllers have a dedicated application for easy setup and access to two-point calibrations. The paper below explains the best practices to perform a two-point calibration and the ill effects of incorrect adjustments on the performance of the transducer.
- How to Calibrate Pressure Instruments
- 10 Reasons to Calibrate Your Instruments
- An Explanation of Calibration Traceability
- How to Choose the Correct Type of Calibration
- What Does As-Found and As-Left Data Mean in a Calibration?
- What is the Difference Between NIST Traceable and ISO/IEC 17025 Accredited Calibrations?