When considering how to calibrate pressure instruments in house, there are a few things to think about before setting up the lab and committing to a scope of calibration:
- Review or create in-house quality procedures
- Determine the level of calibration required:
  - Do you need traceability to NIST or other national institutes?
  - Is a calibration accredited to ISO 17025 necessary?
  - Or can you verify without traceability or accreditation?
- Are there requirements for an as-found calibration and/or an adjusted as-left calibration?
- Do you have personnel trained in these procedures?
Once these questions are answered, you must choose a pressure standard to use in the calibration. The standard can be any pressure sensing device (Controller, Transducer, Digital Gauge, Deadweight Tester, Portable) that has a range and accuracy capable of covering all the devices being tested. The standard's accuracy should be four times better than the accuracy of the device under test (DUT), and the standard's range must at least encompass the highest-ranged device you plan to calibrate.
Accuracy specifications are sometimes reported as a percent of full scale range, or as a percent of reading down to a specific percent of full scale. In either case, the accuracy of the standard must be compared against the accuracy of every DUT you plan to calibrate to make sure the standard is sufficient in every case. For example, a standard with a full scale range of 3000 psi and an accuracy of ±0.01% of full scale is accurate to 3000 × ±0.01% = ±0.3 psi. This standard would be inadequate for calibrating a transducer with a full scale range of 15 psi and an accuracy of ±0.1% of full scale, which equates to ±0.015 psi: the standard's ±0.3 psi accuracy is not even as good as the DUT's, let alone four times better.
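The comparison above can be sketched as a simple check. This is an illustrative sketch of the 4:1 accuracy-ratio rule, not any particular product's specification; the function names and values are assumptions for the example.

```python
def fs_accuracy(full_scale_psi: float, pct_of_fs: float) -> float:
    """Absolute accuracy (psi) for a '% of full scale' specification."""
    return full_scale_psi * pct_of_fs / 100.0

def standard_is_adequate(std_accuracy_psi: float, dut_accuracy_psi: float,
                         required_ratio: float = 4.0) -> bool:
    """True if the standard is at least `required_ratio` times more
    accurate (i.e. has a smaller accuracy figure) than the DUT."""
    return dut_accuracy_psi / std_accuracy_psi >= required_ratio

std = fs_accuracy(3000, 0.01)  # ±0.3 psi for the 3000 psi standard
dut = fs_accuracy(15, 0.1)     # ±0.015 psi for the 15 psi transducer
print(standard_is_adequate(std, dut))  # False: standard is inadequate here
```

Running the same check with each DUT on your calibration schedule tells you whether an existing standard covers your whole workload or a lower-range standard is needed.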
Many pressure calibrators are equipped with multiple internal transducer ranges and multiple pressure controlling channels for exactly this reason. This information will be your guide in determining whether you already have an appropriate pressure standard or will need to purchase additional equipment.
The calibration will consist of generating a range of pressure points determined by your procedure. These points must be stable enough to record a simultaneous reading from the standard and the DUT. The pressure difference between the standard and the DUT is recorded as the error. Generation of a common pressure to the standard and to the DUT can be accomplished using various devices such as:
- a deadweight tester
- a manual pressure pump and volume controller connected to the DUT and a transducer or indicator
- an automated pressure controller
In any case, the pressure difference between the standard and the DUT for each calibration point is recorded as the error. These data points can be compiled into a record that can serve as the calibration report. The initial calibration or "as-found" calibration shows the initial state of the DUT. If the DUT is found to have errors exceeding its accuracy specification, corrections can be made to bring it into compliance (reference the manufacturer's manual for instructions). After corrections have been made, an as-left calibration can also be done to verify the corrected values.
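The steps above — recording simultaneous readings, computing the error at each point, and checking it against the tolerance — can be sketched as follows. The point values and tolerance here are made up for illustration, not from any real calibration.

```python
def as_found_report(points, tolerance_psi):
    """points: list of (applied_std_psi, dut_reading_psi) pairs.
    Returns rows of (applied, reading, error, in_tolerance)."""
    report = []
    for applied, reading in points:
        error = reading - applied  # error = DUT reading minus standard
        report.append((applied, reading, error, abs(error) <= tolerance_psi))
    return report

# Hypothetical as-found data for a 15 psi DUT with ±0.015 psi tolerance
data = [(0.0, 0.002), (7.5, 7.520), (15.0, 15.004)]
for applied, reading, error, ok in as_found_report(data, tolerance_psi=0.015):
    print(f"{applied:6.3f}  {reading:6.3f}  {error:+.3f}  "
          f"{'PASS' if ok else 'FAIL'}")
```

In this hypothetical data set the mid-scale point fails, so the DUT would be adjusted per the manufacturer's manual and the same routine rerun to produce the as-left record.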
For a detailed overview of how to calibrate pressure instruments, check out our video explaining the calibration setup for a multi-transducer pressure controller connected to a fully automated calibration system.