Hoping someone can help clear this up. We have a 0-3000 psi gauge with a tolerance of +/-3% FS. Our QE's understanding of Full Scale is that it means the full range of the gauge (i.e., 3000 psi), so the tolerance should be +/-3% of 3000 psi. Is this correct? (Our calibration supplier says the tolerance is based on 1000 psi.) So, if the calibration cert indicates applied pressures of 100, 1500 and 3000 psi, and the results indicate 105, 1507 and 3009 psi, respectively, is the gauge within tolerance? Can someone explain how this is calculated?
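For reference, here is the arithmetic as I understand it under our QE's interpretation (a sketch only, assuming FS = 3000 psi, so the allowed error is a fixed +/-90 psi at every test point):

```python
# Check gauge readings against a full-scale (%FS) tolerance.
# Assumption: "FS" = full range of the gauge (3000 psi), so the
# allowed error band is the same at every applied pressure.

FULL_SCALE_PSI = 3000
TOLERANCE_FRACTION = 0.03  # +/-3% FS

max_error = TOLERANCE_FRACTION * FULL_SCALE_PSI  # +/-90 psi

# (applied pressure, indicated reading) pairs from the cal cert
test_points = [(100, 105), (1500, 1507), (3000, 3009)]

for applied, indicated in test_points:
    error = indicated - applied
    verdict = "PASS" if abs(error) <= max_error else "FAIL"
    print(f"{applied:>5} psi: error {error:+d} psi -> {verdict}")
```

Under that reading, the errors are +5, +7 and +9 psi against a +/-90 psi limit, so every point would pass. If the tolerance were instead a percent of reading, the band would shrink at low pressures, which may be why the supplier's numbers differ.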
Much thanks!