Some very basic questions whose answers are eluding me:
1. How does one define a gage's tolerance (i.e., arbitrarily or based on a spec)?
1.1 From the manufacturer's documents (they don't seem to like to specify this)?
1.2 From a specification (e.g., GGG-C-105C is obsolete; is there a replacement or other, more comprehensive document)?
1.3 What are the best resources for learning more about gage tolerance?
1.4 Is it acceptable to apply the 1:10 rule based on the tolerances of the parts I will be inspecting? Can anyone provide a citation that invokes the 1:10 rule? (A worked sketch of my understanding follows this list.)
2.0 The basis for my questions includes 0-1" Digimatic micrometers with .00005" resolution (blade, 1/4" barrel, and .080" step barrel); a 0-1" Digimatic height gage; an SJ-400 surface finish gage; and even a Nikon vision system. The manuals for these instruments speak to "accuracy" but are silent about uncertainty (at least in numeric terms, inches or mm).
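To make 1.4 concrete, here is the 1:10 rule arithmetic as I understand it, in a short Python sketch. The part tolerance and the function name are made up for illustration; please correct me if I have the rule wrong:

```python
def required_gage_accuracy(part_tolerance_band, ratio=10):
    """1:10 rule of thumb (as I understand it): the gage's allowable error
    should be no more than 1/ratio of the part's total tolerance band."""
    return part_tolerance_band / ratio

# Hypothetical part: 0.500" +/- 0.001", so the total tolerance band is 0.002"
part_band = 0.002  # inches
print(required_gage_accuracy(part_band))  # 0.0002" maximum allowable gage error
# My micrometer's .00005" resolution can display to that level, but
# resolution is not the same thing as accuracy or uncertainty.
```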
Thank You,
Jim