Calibration of a Coating Thickness Gage



We just got our coating thickness gage back from calibration. The unit only reads to one decimal place, yet the certificate lists the nominal to 3 decimal places and the actual to only 1 decimal place. On the low end the unit cannot even resolve within the tolerance range (nominal: 2.041 mil, tol ±3% ≈ ±0.061 mil, smaller than the 0.1 mil display step). Is it even possible to calibrate a digital coating thickness gage to more decimal places than it can display? The gage is a Phase II PTG-4000.
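To make the mismatch concrete, here is a minimal sketch of the arithmetic behind the complaint, assuming the figures above (2.041 mil nominal, ±3% tolerance, 0.1 mil display resolution):

```python
# Sketch: compare the gage's display resolution to the tolerance band
# on the calibration standard. Values are taken from the post above.
nominal = 2.041    # mil, nominal of the calibration standard
tol_pct = 0.03     # +/-3% tolerance
resolution = 0.1   # mil, one decimal place on the display

tol_half_width = nominal * tol_pct
print(f"tolerance half-width: {tol_half_width:.3f} mil")      # ~0.061 mil
print(f"display step resolves the band: {resolution <= tol_half_width}")
```

Since the 0.1 mil display step is larger than the ±0.061 mil tolerance band, a single displayed digit cannot distinguish in-tolerance from out-of-tolerance readings near the low nominal.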

Thanks in advance for any insight on this.


Englishman Abroad

Involved In Discussions

What happens if you change units and display the thickness in microns instead of thousandths of an inch? It seems you are limited by the display resolution in mils.
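A quick sketch of why switching units could help, assuming (hypothetically) the display keeps one decimal place in micron mode as well:

```python
# 1 mil (thousandth of an inch) = 25.4 micrometres exactly.
MIL_TO_UM = 25.4

nominal_mil = 2.041
print(f"{nominal_mil} mil = {nominal_mil * MIL_TO_UM:.2f} um")  # 51.84 um

# One display digit (0.1 um) in micron mode is a much finer step,
# expressed in mil, than the 0.1 mil step of the inch display:
print(f"0.1 um = {0.1 / MIL_TO_UM:.4f} mil")
```

Whether the PTG-4000 actually offers that micron resolution would need to be checked against its manual; the point is only that a unit change can expose more of the instrument's internal resolution.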


Hello Steve!

I'm trying to picture this scenario, and how a calibration source provided data with greater resolution than the device can display.

Is it possible that the calibration source used some kind of reverse calibration process? So for, say, 1 inch on your thickness gauge, they report what their standard read?

Also, verify the operating settings on your thickness gauge (if it's digital). The resolution may be adjustable.


The results are reported to only 1 decimal place, while the nominal values are given to 3 decimal places. It just looked wrong at first but, as BradM touched on, the lab cannot report accuracy beyond the unit's ability to display, which is 1 decimal place. I guess I was looking at it from the wrong frame of mind and it seemed off.

Thanks for the help