Calibrating .001" resolution Dial Calipers - Calibration and safeguarding procedure

apestate

Quite Involved in Discussions
Calibrating .001" res dial calipers

Hello..

could someone familiar with the handy tool indicator called "dial calipers" read this & tell me if there are any special calibration and safeguarding procedures to follow when using them for inspection?

product is screw machine parts for a TS 16949 organization. We use dial calipers like the Mitutoyo 505 or Brown & Sharpe equivalents to verify tolerances of .003" -- after that we go to snap gauges, geometric gauges, or micrometers.

Are dial calipers scrutinized for their accuracy and R&R more so than micrometers or geometric gauges (go/no-go, thread rings & plugs, et cetera)?

Any special procedures you can relate?

How do we meet ISO 9001:2000 7.6 d) "be safeguarded from adjustments that would invalidate the measurement result;" ?
 

D.Scott

atetsade said:
Hello..

could someone familiar with the handy tool indicator called "dial calipers" read this & tell me if there are any special calibration and safeguarding procedures to follow when using them for inspection?

product is screw machine parts for a TS 16949 organization. We use dial calipers like the Mitutoyo 505 or Brown & Sharpe equivalents to verify tolerances of .003" -- after that we go to snap gauges, geometric gauges, or micrometers.

Are dial calipers scrutinized for their accuracy and R&R more so than micrometers or geometric gauges (go/no-go, thread rings & plugs, et cetera)?

Any special procedures you can relate?

How do we meet ISO 9001:2000 7.6 d) "be safeguarded from adjustments that would invalidate the measurement result;" ?

I would recommend you look through MSA Third Edition. It gives great guidance on system analysis.

Just as a rule of thumb, your measurement system increments should be 1/10 of your tolerance so if your calipers read to .0001" you should be fine with them as long as your R&R results look good.
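That rule of thumb is easy to check with a little arithmetic. A minimal sketch in Python (illustrative only; the function name, and reading the ".003"" tolerance as a ±0.003" spec, i.e. a 0.006" total band, are my assumptions):

```python
def meets_ten_to_one(tolerance_band, resolution):
    """Rule of thumb: the gage's resolution (smallest increment)
    should be no more than 1/10 of the total tolerance band."""
    return resolution <= tolerance_band / 10

# A +/-0.003" tolerance is a 0.006" total band:
meets_ten_to_one(0.006, 0.001)    # 0.001" dial caliper -> False
meets_ten_to_one(0.006, 0.0005)   # 0.0005" caliper -> True
```

Read this way, a 0.001"-resolution dial caliper falls just short of 10:1 for a ±0.003" tolerance, which is why the R&R results still need to back up the choice of gage.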

I don't think any measurement system is subject to more scrutiny than another. As long as you do your work up front on the MSA and can justify the use of the gage, the requirement is that it is suitable for the measurement being made. After that, it needs to be calibrated and reviewed just like the micrometers or anything else.

Most adjustable gages have an adjustment access (sometimes a screw, other times a cover). If you cover the access point with the calibration sticker (on your ring gages you will see the adjustment hole filled with wax), the equipment cannot be adjusted without breaking the "seal". Some measurement equipment might require a password to access a calibration program, and other equipment may need to be covered by procedure and training (this is always good no matter what else you do).

I am sure others will have additional guidance for you but in the meantime, I hope this helps.

Dave
 

Mike S.

Happy to be Alive
Trusted Information Resource
Generally speaking, I don't think you will find calipers with .0001" resolution -- .0005" is the best I've seen. I calibrate with gage blocks to +/- .001".

If I had a +/- .003" tolerance I would not use calipers unless all the parts were within +/- .002", as I have found +/- .001" is about as good as calipers can do. Maybe, if calipers are preferred for ease of use, you could use them as long as the parts measure within +/- .002", and any parts closer to the spec edge get looked at with micrometers.
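That guard-band idea can be sketched as a simple decision rule (Python; the function and names are made up, and the ±0.003" tolerance and ±0.001" guard are taken from the numbers in this thread):

```python
def choose_gage(reading, nominal, tol=0.003, guard=0.001):
    """Guard-band sketch: accept caliper readings only well inside
    the tolerance; anything nearer the spec limits gets re-checked
    with a micrometer. Illustrative only."""
    deviation = abs(reading - nominal)
    if deviation <= tol - guard:    # within +/-0.002": caliper is enough
        return "accept (caliper)"
    if deviation <= tol:            # in spec, but near the edge
        return "re-measure with micrometer"
    return "reject"

choose_gage(1.0015, 1.000)  # -> "accept (caliper)"
choose_gage(1.0025, 1.000)  # -> "re-measure with micrometer"
```

The design point is that the caliper's own uncertainty (roughly ±0.001") is subtracted from the acceptance zone, so a caliper reading near the limit is never the final word.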

One thing to beware of with calipers: the tips of the OD measuring faces and the ID measuring faces, both of which are thin, will wear much more quickly than the wide part of the OD measuring faces.
 

mooser

I would agree with Mike on the accuracy of the caliper for what you are checking. A 4-to-1 tolerance-to-accuracy ratio can be used when you have no other choice, but it shouldn't be the standard you use all the time.
But I read some other issues in your question:
1. How is a caliper calibrated?
Check the caliper through almost all of its range on the outside blades. Check the inside blades as well, unless you have documented that they are not used. Also check the parallelism of each set of blades.
2. The reference to 7.6 d), in the case of the caliper, I think means that the calipers should be verified before each use, or that you make sure nothing has happened to the caliper between uses (not dropped or damaged).
mooser
 
B

ben sortin

Brown & Sharpe model 577-1 vernier calipers are my personal choice. I would tend to use micrometers for anything less than a 0.010 inch tolerance. Calibrate across the range of the instrument either way.
 

Wes Bucey

Prophet of Profit
atetsade said:
Hello..

"dial calipers"
calibration and safeguarding procedures to follow when using them for inspection?

"safeguarded " ?
Frequency of calibration is directly related to frequency of use. Some organizations I am aware of use the number of measurements as the frequency guide rather than a time period.

Vernier and dial calipers are pretty rugged, but they have little idiosyncrasies which can collect damage or debris and throw readings off. Cleanliness is important. Electronic-readout calipers can be affected by humidity (use new batteries at each calibration).

Note: "stabilizing" or "normalizing" can be accomplished by keeping the test instrument and standards (blocks, etc.) in the same room for 24 hours before performing the calibration procedure, to ensure thermal expansion is uniform for everything.

"safeguard" usually means putting tape or wax over adjustment screws.

When using them to measure piece parts, always ensure the part and the instrument are clean (maybe swish screw machine parts in mineral spirits to remove cutting fluid and metal shavings).

Here's a copy of one of my work procedures for calibrating dial caliper:
Calibration Frequency: 12 months

(can be modified according to stability, purpose and usage)

1.0 Scope:

This method describes the calibration of calipers, inside and outside dimensions

up to 72 inches (1800 mm) and depth dimension up to 12 inches (300 mm).

Instrument resolution: 0.001 or 0.0005 inch (0.02 or 0.05 mm).

2.0 References: This document is based on the DOD procedures.

3.0 Definitions: TI: Test Instrument

DOD: Department of Defense

4.0 General Requirements:

Environment:

· Temperature: Change should not exceed 2 deg. F/hr (1 deg. C/hr).

· Humidity: No excessive humidity.

· Air quality: N/A.

Stabilization:

· Stabilize equipment and standards at ambient temperature.

Preliminary Operations:

· Clean TI.

· Inspect TI for damage such as nicks or burrs.

· Zero TI.

Standards and Calibrating Equipment:

· The standards and equipment used must have a valid calibration certificate.

5.0 Equipment: The following equipment is considered a minimal requirement

and any equivalent equipment may be used.

1. Gage blocks, grade 4 (B).

2. Precision pin.

3. Calibrated ring gage or gage blocks with caliper jaws.

4. Grade A / Class 1 surface plate.

5. Lapping kit.

6.0 Calibration Process:

Item # 1

Test Characteristics----- Graduations

Acceptance Limit----- Good contrast

Test Method----- Visual

Item # 2

Test Characteristics----- Sliding jaw free play

Acceptance Limit----- No perceptible free play; (adjust if required)

Test Method----- Visual

Item # 3

Test Characteristics----- Sliding jaw movement

Acceptance Limit----- Smooth along full length of the beam

Test Method----- Visual

Item # 4

Test Characteristics----- Synchronization of the zero and the dial (if applicable)

Acceptance Limit----- At zero ± one grad.; At 12 o’clock ± 1/16 rev.; (synchronize gear if req'd)

Test Method----- Visual

Item # 5

Test Characteristics----- Wear and parallelism of the jaws

Acceptance Limit----- Reading to be within nominal ± .001 inch (0.02 mm)

Test Method-----(see items 6, 7, 8)

Item # 6

Test Characteristics----- Outside jaws

Acceptance Limit----- Three places

Test Method-----Precision pin or gage blocks



Item # 7

Test Characteristics----- Inside jaws

Acceptance Limit----- Two places

Test Method----- Ring gage or gage blocks



Item # 8

Test Characteristics----- Depth Gage

Acceptance Limit----- One place only

Test Method----- Surface plate or gage block



Item # 9

Test Characteristics----- Linearity of scale (outside jaws only).
Verify at (see Note 1): 0.5 inch (10 mm), then 25%, 50% and 100% of full scale range.
Example: 0 to 6 in. = 0.5, 1.5, 3.0 and 6.0; 0 to 150 mm = 10, 40, 75 and 150.

Acceptance Limit----- Inches: 0.001 or 0.0005 resolution = ± 0.001/10.
Millimeters: 0.02 resolution = ± 0.02/300; 0.05 resolution = ± 0.05/300.

Test Method----- Gage blocks
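The Item # 9 verification points can be generated for any instrument range. A minimal sketch (Python, illustrative only; note that the procedure's metric example rounds 25% of 150 mm up to a convenient 40 mm block stack, where this sketch returns the exact 37.5):

```python
def linearity_points(full_scale, near_zero_point):
    """Item 9 check points: one near-zero point, then 25%, 50%
    and 100% of the full scale range (outside jaws only)."""
    return [near_zero_point] + [full_scale * f for f in (0.25, 0.5, 1.0)]

linearity_points(6.0, 0.5)     # [0.5, 1.5, 3.0, 6.0] for a 6 in. TI
linearity_points(150.0, 10.0)  # [10.0, 37.5, 75.0, 150.0] for a 150 mm TI
```

In practice you would round each point to whatever gage-block stack is easiest to wring up, as the procedure's own examples do.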

7.0 Notes:

1. If the TI has both inch and millimeter scales, both scales are to be calibrated.
For a digital TI, only one scale should be calibrated.
For a dial TI, change the scale values to read at 25%, 50% and 75% of the dial.
Example: 0.5, 1.25, 2.75 and 5.950 inches.
If the TI has a scale graduated in 1/100 or 1/128 inch, this scale should be
calibrated only if required.


 

Ryan Wilde

atetsade said:
How do we meet ISO 9001:2000 7.6 d) "be safeguarded from adjustments that would invalidate the measurement result;" ?

Everything has been covered except this clause in depth. On calipers, you will find two tiny screws that are adjusted to keep the carriage (the moving jaw) tight against the body of the caliper without binding. As you look at the dial, they will be on the edge closest to the "0" marking. Maladjustment of these screws will cause the carriage to be loose and will give inaccurate results at the ends of the jaws. After calibration, these screws should be sealed in place (I use red fingernail polish - it's $0.99, lasts a few years, and is very obvious if tampered with). If you find during the next calibration that the seal has been tampered with, you know that someone has invalidated the prior calibration, and heads should roll.

Ryan
 