Hi Gordon,
Did you make an analysis of your measurement system for suitability, or an analysis of the measurement uncertainty?
I think NO.
Some points for M16, PD tolerance = 0.16 mm acc. to VDA 5, MSA 3, FORD, BOSCH:
1. Resolution <= 5% of work piece tolerance = 0.008 mm; recommended resolution 0.005 mm, yours is 0.01 mm.
2. Suitability of the measuring instrument - uncertainty sources: reference standard, bias, linearity, repeatability, …
Umi <= 0.2*0.33*T = 0.011 mm; the minimum Umi to be expected for your instrument is 0.04 mm.
3. Suitability of the measurement system - uncertainty sources: Umi, temperature, operator, method, …
Ums <= 0.2*T = 0.08 mm; the minimum Ums to be expected for your system is 0.12 mm.
You must perform the analysis.
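A minimal numeric sketch of these limits in Python, using the T = 0.16 mm PD tolerance and the figures exactly as quoted above (an illustration only, not an implementation of VDA 5 or MSA):

```python
# Acceptance limits quoted above for an M16 pitch diameter, T = 0.16 mm.
# The factors and values are taken from the post itself; note that 0.2 * T
# evaluates to 0.032 mm, while the post quotes 0.08 mm for the Ums limit,
# so the quoted figure is used here.

T = 0.16  # pitch diameter tolerance in mm

checks = [
    # (check, limit in mm, value attributed to the caliper-insert system in mm)
    ("Resolution (<= 5% of T)",   0.05 * T,       0.01),
    ("Umi (<= 0.2 * 0.33 * T)",   0.2 * 0.33 * T, 0.04),
    ("Ums (quoted limit)",        0.08,           0.12),
]

for name, limit, actual in checks:
    verdict = "suitable" if actual <= limit else "NOT suitable"
    print(f"{name:28s} limit {limit:.3f} mm, actual {actual:.3f} mm -> {verdict}")
```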
Hi Stefan,
I've just logged in to apologize if you felt what I wrote was criticism. It wasn't intended as such - it was written to emphasize that we were looking at the "problem" from different angles. The information Wes wrote was the type of thing I was looking for, i.e. practical information for a machinist. Your information is for the calibration guy sitting in his measuring environment.
Re your remarks on uncertainty, reliability etc.: one of the secrets of obtaining certainty and reliability is to measure a component with a known pitch diameter before starting. Personally I have a calibration piece with a known external and internal pitch diameter with me when I measure thread pitch diameter. It's only really necessary when measuring internal pitch diameter as, like virtually all internal measuring instruments, they should be checked (or calibrated) before use.
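As a minimal sketch of that kind of pre-use check (the reference value, the reading and the acceptance band below are hypothetical, purely for illustration):

```python
# Minimal sketch of a pre-use check against a calibration piece with a known
# (certified) pitch diameter. The reference value, the reading and the
# acceptance band are invented for illustration only.

def preuse_check(reading_mm: float, reference_mm: float,
                 max_dev_mm: float = 0.02) -> float:
    """Return the bias (reading - reference); flag it if outside the band."""
    bias = reading_mm - reference_mm
    status = "OK" if abs(bias) <= max_dev_mm else "RE-CHECK SET-UP"
    print(f"bias {bias:+.3f} mm (band +/-{max_dev_mm} mm) -> {status}")
    return bias

# e.g. an internal-PD reference ring measured before measuring real parts
bias = preuse_check(reading_mm=14.712, reference_mm=14.701)
# The bias can be subtracted from subsequent readings, or simply treated as a
# go/no-go check on the instrument and the measuring pressure before use.
```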
The second, and main, secret is that I always use a pressure device to ensure uniform measurement pressure, as that is the main reason for digital caliper inaccuracy - too much or too little pressure! I'd feel more comfortable if you looked at the website again - and especially the Test Report from an authorized lab. I've measured thousands of times and never had a deviation greater than 0.02 mm.
Remember, the product is intended for setting up a machine and for checking at intervals to see movement and deviation. The GO thread gauge need only be used to verify the full thread profile - due to possible cutting-tool wear or breakage.
Again, the measuring inserts are made for the machinist - not the gauge calibration guy.
I also have a digital caliper with a 0.005 mm display (twice as accurate as a standard digital caliper), plus another with a 0.001 mm display that is guaranteed to be as accurate as a micrometer.
Finally, I've personally measured the same calibrated thread gauges with 3-wires, micrometer thread inserts and my thread inserts - I got the same value within 0.01 mm on each! It would be suicide or madness for me to claim this accuracy if I wasn't certain.
If our paths ever cross I'd love to demonstrate.
I can't resist this - measuring thread pitch diameter with 3-wires is regarded as the most accurate method. What if the flank angle isn't exactly as specified? Even thread gauge flank angles have a tolerance!
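As a rough illustration of that point, here is the simplified 3-wire conversion (ignoring the lead-angle correction) evaluated with the assumed flank angle as a variable; the M16x2 readings and the +/-0.5 degree perturbation are hypothetical:

```python
import math

def pd_over_wires(M_mm, wire_mm, pitch_mm, included_angle_deg=60.0):
    """Pitch diameter from a measurement over three wires.

    Simplified conversion, ignoring the lead-angle correction:
        E = M - w*(1 + 1/sin(a/2)) + (P/2)/tan(a/2)
    which for a = 60 deg reduces to the familiar E = M - 3w + 0.866025*P.
    """
    h = math.radians(included_angle_deg) / 2.0
    return M_mm - wire_mm * (1.0 + 1.0 / math.sin(h)) + (pitch_mm / 2.0) / math.tan(h)

# Hypothetical M16x2 numbers (pitch 2 mm): the same readings evaluated with a
# slightly different assumed flank angle. The +/-0.5 deg spread is arbitrary,
# only to show the sensitivity; gauge flank tolerances are much tighter.
P = 2.0
for wire, M in ((1.1547, 16.4330),   # roughly best-size wire
                (1.0500, 16.1190)):  # off-size wire, same (hypothetical) part
    for angle in (59.5, 60.0, 60.5):
        pd = pd_over_wires(M, wire, P, angle)
        print(f"wire {wire:.4f} mm, assumed angle {angle:4.1f} deg -> PD {pd:.4f} mm")
```

With a best-size wire the two angle-dependent terms nearly cancel, so the result barely moves; with an off-size wire the same +/-0.5 degree shifts the computed pitch diameter by a couple of microns - one reason the best-size wire is recommended.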
Now I'd better be careful or I'll invoke the wrath of Gageguy (Wayne)
