I have an audit next week. Please help!
I want to use a gauge block to calibrate a micrometer.
My product's minimum width is 0.002 mm.
My micrometer reads to 4 decimal places in steps of 0.0005 mm, i.e. 0.0005, 0.0010, 0.0015, etc.
I have a standard 0.002 mm gauge block that was measured by an external lab; the lab report shows it is 0.00205 mm.
A consultant told me I can't use the gauge block to calibrate the micrometer because it is out of tolerance.
His explanation is below:
min spec = 0.002 mm
micrometer's resolution = 0.0005 mm (4 digits at least)
micrometer's tolerance = 0.0005/2 = 0.0002 mm (at least)
gauge block's resolution from lab report = 0.00005 mm (5 digits at least)
gauge block's accepted tolerance = 0.00005/2 = 0.00002 mm (at least)
He said the gauge block is out of tolerance, since 0.00205 − 0.002 = 0.00005 mm, which is outside the 0.00002 mm accepted tolerance.
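To be sure I copied his numbers correctly, I sketched his arithmetic in Python (the variable names are mine; note that 0.0005/2 is actually 0.00025 and 0.00005/2 is 0.000025, so I assume he rounded down):

```python
# Consultant's rule (as I understand it): tolerance = resolution / 2.
mic_resolution = 0.0005             # mm, micrometer's smallest step
mic_tolerance = mic_resolution / 2  # 0.00025 mm (he wrote 0.0002)

gb_resolution = 0.00005             # mm, last digit on the lab report
gb_tolerance = gb_resolution / 2    # 0.000025 mm (he wrote 0.00002)

# Deviation of the gauge block from nominal, per the lab report:
deviation = 0.00205 - 0.002         # 0.00005 mm

# By his rule the block fails, since the deviation exceeds the tolerance:
print(deviation > gb_tolerance)     # True -> "out of tolerance"
```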
--------------------------------------------------------------------
I don't understand his calculation. If it were right, it would be almost impossible to pass any calibration.
By the 1/2 accuracy rule (1/10 is best), I suppose:
min spec = 0.002 mm
micrometer's resolution = 0.0005 mm (4 digits at least)
micrometer's tolerance = 0.0020/2 = 0.0010 mm (at least)
gauge block's resolution from lab report = 0.0005 mm (4 digits at least)
gauge block's accepted tolerance = 0.0010/2 = 0.0005 mm (at least)
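My own reading of the 1/2 rule, sketched the same way (this is my interpretation, where the tolerance chain starts from the product spec rather than the display resolution, so please correct me if the rule is applied differently):

```python
# 1/2 accuracy rule as I understand it: each level of the chain gets
# half the tolerance of the level above it (1/10 would be better).
min_spec = 0.002                    # mm, product minimum width
mic_tolerance = min_spec / 2        # 0.0010 mm for the micrometer
gb_tolerance = mic_tolerance / 2    # 0.0005 mm for the gauge block

# Deviation of the gauge block from nominal, per the lab report:
deviation = 0.00205 - 0.002         # 0.00005 mm

# By this reading the block passes, ten times inside its tolerance:
print(deviation <= gb_tolerance)    # True -> acceptable
```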
Am I correct?
Can anyone explain how to define the tolerance for an internal MMD and for an external standard?