Calibration of Pressure Gauge, How to Determine Calibration Tolerance



Hi All,

My company just bought a new digital pressure gauge with a regulator. It can display pressure in kPa or mmHg; the range is 0~100 kPa or 0~750 mmHg.

In our process we use mmHg, and we run at a constant (fixed) setting of 150 mmHg.

Here are my questions:
1. How do we determine the calibration tolerance when the process setting is constant (fixed)?
2. How should this equipment be calibrated?

Thank You


Looking for Reality
Trusted Information Resource

There are typically two ways to choose a calibration tolerance... factory specification, and process need.

For a pressure gage (which is part of the regulation system), I would almost always use an outside service for calibration... simply ask whether the service can calibrate pressure gages over the needed range before engaging them.

1. Factory: If you don't tell the calibration company what your acceptance criteria are, they will typically judge the gage against the manufacturer's works-as-new specification: either it passes or it fails. That's fine, but it can be expensive if the gage fails the factory spec while still meeting your needs.

2. Need: You leave the gage at 150 mmHg all the time... but what does your process actually NEED? Will your process still run fine at 140 mmHg? At 180 mmHg?
Basing your calibration acceptance criteria on what actually matters to your process is a more effective way to judge whether the gage is good enough.

Example: Factory acceptance as new requires +/- 1 mmHg accuracy.
Your process needs +/- 20 mmHg.
- Your gage meets factory specs... great, move on.
- Your gage cannot hold even +/- 30 mmHg... get a new gage.
- Your gage holds +/- 2 mmHg, but no better... it fails the factory spec, yet easily meets your process need.
In this third situation, using the factory spec as your acceptance criterion means you have to buy a new gage... but your real process need says the gage is still perfectly fine for that use.

Note that the third case gives a 10:1 accuracy ratio (process need of +/- 20 mmHg vs. gage error of +/- 2 mmHg).
If your process needs +/- 20 mmHg and your gage can only hold +/- 15 mmHg, you may be tempted to keep using it... but you'd be better off getting a new gage...
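The need-based acceptance check above can be sketched in a few lines. This is a minimal illustration, not any calibration standard's method; the helper name `gage_ok` and the default 10:1 target ratio are assumptions for the sketch:

```python
# Hypothetical sketch of the acceptance logic described above:
# compare the gage's demonstrated error band against the process
# need, using a target accuracy ratio (10:1 here, by assumption).

def gage_ok(process_tol_mmHg: float, gage_error_mmHg: float,
            target_ratio: float = 10.0) -> bool:
    """Return True if the gage's error band is at least
    `target_ratio` times tighter than the process tolerance."""
    if gage_error_mmHg <= 0:
        raise ValueError("gage error must be positive")
    return process_tol_mmHg / gage_error_mmHg >= target_ratio

# Worked numbers from the example above:
print(gage_ok(20.0, 1.0))   # meets factory spec of +/- 1 mmHg -> True
print(gage_ok(20.0, 2.0))   # holds +/- 2 mmHg (10:1)          -> True
print(gage_ok(20.0, 15.0))  # holds only +/- 15 mmHg           -> False
```

The point of parameterizing `target_ratio` is that the ratio you require is a judgment call; a looser ratio would pass the +/- 15 mmHg gage, which is exactly the temptation warned against above.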
