Micrometer Calibration Procedure - Please critique

apestate

Quite Involved in Discussions
Hello forum

This procedure has some logical problems, but setting aside the mistakes that come from writing quickly, I was hoping someone would be interested in a general critique of 1) style and 2) technical content.

The calibration procedure isn't written for accredited calibration services or ISO 17025--it would simply be used in a job shop registered to ISO 9001:2000.

How do you feel about the content? Measurement uncertainty has been left out entirely. Anything involving the torque applied when taking measurements has been left out, leaving that up to the operator. Gage block wringing and stacking has also been left out. I haven't included anything about optical flats, and I treated spindle play, perpendicularity, flatness, etc. as attribute (pass/fail) judgements.

Technically, are any of the instructions inappropriate?

Are there any instructions that should be included?

Is the basic outline style ugly or inefficient? Has too much detail been included in the steps? Would a flowchart and data tables be easier to follow?

Feel free to hack away, I'm looking for good general suggestions. I'd also be interested in anyone's take on the technical content.

--Erik
 

Attachments

  • Micrometer.doc
    68.5 KB

apestate

Quite Involved in Discussions
The bullets & numbering are now top of the line.
 

Attachments

  • Micrometer.doc
    111 KB

sonflowerinwales

In the country
Looks OK to me.
It may be an idea in clause 4.3 to add 4.3.10, a recheck of zero, and likewise 4.8.11, a recheck of zero. I would avoid removing the spindle to oil it (4.5), as I've had more problems removing spindles than leaving them in place. Sometimes, for no real reason, the spindle gets tight when you reassemble it, and then you have problems. I would also move this paragraph earlier in the procedure, before calibration starts, around paragraph 4.2. One other point: what about metric equipment?
Paul
 

apestate

Quite Involved in Discussions
thanks sonflowerinwales.

I was a little concerned about spindle removal and adjustments such as tightening the spindle, as things can get messy. I certainly won't be writing instructions on how to disassemble a dial caliper; that never works.

I do like the idea of taking measurements before doing any calibration work, even cleaning.

This is a draft, of course. In the second version posted above, the formatting of the outline numbering has been changed so that it's easy to tab over to a bulleted point or a NOTE:, which seems to work nicely.

what's metric? :biglaugh:
 

gpainter

Quite Involved in Discussions
Think about making it shorter. We did most of our own calibration, no WI was over one page, and we had no problems.
 

CarolX

Trusted Information Resource
gpainter said:
Think about making it shorter. We did most of our own calibration, no WI was over one page, and we had no problems.

I have to concur on this...my cal procedure for micrometers is a single page.

Remember the KIS method: keep it simple.
 

Kevin H

Some random thoughts - do you really want to specify temperature as 68 F, +/- 1.8 F? What are you using to measure and control the temperature that closely, and is it traceable to NIST? The same comments apply to relative humidity - if you specify it, you have to measure it, control it, and trace it back to NIST. Is it even relevant to micrometer calibration? (My personal opinion is that, as written, it's an easy nonconformance for an auditor unless you really are controlling conditions that closely.)

With temperature, you can justify a much wider range, especially as you are only going out 3 decimal places - get the coefficient of thermal expansion for the material your gauge blocks are made of and calculate the effect of temperature on them over the temperature range you'd like to work to. (We had blocks made of 52100 steel, and the effect of temperature on them over a 40 degree F range was negligible compared to the .001 we were measuring to.)
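That check can be sketched as a quick calculation. This is illustrative only: the coefficient of thermal expansion used here (roughly 12 x 10^-6 per degree C for 52100 steel) is an assumed typical handbook figure, not a value from the procedure.

```python
# Rough effect of temperature on a 1-inch steel gage block (illustrative).
# CTE for 52100 steel is ASSUMED ~12e-6 per deg C (typical handbook figure).
CTE_PER_C = 12e-6                    # /degC, assumed
length_in = 1.0                      # nominal block length, inches
delta_t_f = 40.0                     # the 40 degF working range mentioned above
delta_t_c = delta_t_f * 5.0 / 9.0    # convert a temperature *interval* to degC

growth_in = length_in * CTE_PER_C * delta_t_c
print(f"Length change over {delta_t_f} degF: {growth_in:.5f} in")
# About 0.00027 in on a 1 in block -- small against a .001 in tolerance,
# but significant if you are trying to discriminate to .0001 in.
```

Running the numbers for your own block material and temperature band gives you the justification an auditor would want to see for a wider environmental spec.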

I'd try to shorten it as much as possible - substitute training/competency for extremely detailed instructions. With detailed instructions, if they are not followed and an auditor observes them not being followed, you get a nonconformance. If memory serves, the one I used for the mechanical lab I ran was about 1-1/2 pages in a larger font size - it passed muster for ISO 9001:1994, QS-9000, and ISO Guide 25 (the predecessor to ISO 17025).
 

Hershal

Metrologist-Auditor
Trusted Information Resource
Good beginning.....

I would add "exercising" the mic to watch for skipping or sticking, which indicates the need for cleaning.....

Add the need for white gloves and have the instructions state that they must be used to pick up the gage blocks - NEVER touch gage blocks with your fingers.....

Add the requirement to document the specific path for comparison and where to find the procedure to calculate measurement uncertainty.....under international definitions of traceability, there must be an unbroken chain of comparisons to national or international standards, and from there to the SI, with stated uncertainties at each step, or you do not have traceability.....

As always, I strongly suggest sending items for calibration to a cal lab accredited to ANSI/ISO/IEC 17025.....while I understand cost issues, metrology professionals are the best folks to handle your calibration.....

Hope this helps.

Hershal
 

apestate

Quite Involved in Discussions
Thank you for your comments, everyone. I'll try to address some of the good advice offered on this procedure...

There is no need for a tightly controlled environment, and temperature in the main lab runs around 70-72 F with little control over humidity, so that part will likely have to be cut. That aspect of temperature control is probably less important than heat from the operator's hands, as mentioned by Hershal.

While I think a basic calibration procedure for a mic would suffice in most cases, I really believe being able to discriminate to +/- .0001" requires a higher class of gage discipline.

Hershal, I agree that measurement uncertainty should be assessed during calibration and also at different stages of manufacturing. It shouldn't take too long to adjust a collection of factors for different situations, for example when the calibration tech changes, and come up with a measurement uncertainty.
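One common way to roll such a collection of factors together is a root-sum-square (RSS) combination of standard uncertainties with a k=2 coverage factor. The factor names and magnitudes below are hypothetical placeholders for illustration, not values from the procedure:

```python
import math

# Hypothetical uncertainty budget for a mic calibration.
# Each entry is an ASSUMED standard uncertainty in inches, for illustration only.
budget = {
    "gage block": 0.000005,
    "temperature effects": 0.00003,
    "operator repeatability": 0.00005,     # would change when the cal tech changes
    "resolution": 0.0001 / math.sqrt(12),  # rectangular distribution, .0001" readout
}

# Root-sum-square combination, then a k=2 coverage factor (~95 % confidence).
u_combined = math.sqrt(sum(u ** 2 for u in budget.values()))
U_expanded = 2 * u_combined
print(f"combined u = {u_combined:.6f} in, expanded U (k=2) = {U_expanded:.6f} in")
```

Keeping the budget as a simple table like this makes it easy to swap one factor (say, repeatability for a different tech) and recompute without redoing the whole analysis.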

It was interesting to read the ISO 9001:2000 clause 7.6 requirement that measurement tools are to be calibrated by comparison to traceable standards... I wonder if this can be read as a lack of any requirement to do MU analysis on tools which may not be traceable themselves, but have been calibrated to a traceable set of gage blocks...

However, with our product and good quality record, I've been fortunate and happy to drum up interest in calibration procedures at this level of detail. It may not be the most useful or vital investment to make, in the eyes of my manager, but it should get rid of problems such as micrometers in use that are .0003" off, or calipers cycled through calibration with dented tips. We have a good company and a good product, and I want the machinists to have finely tuned instruments.

Thank you all for the great comments, the procedure has been much improved because of it.

--Erik
 

SpaceMan

I know it is a little late, but I wanted to add a few observations.

The tools and supporting equipment are a little too generic. They need to explicitly state enough information so that items may be substituted if the original item is not available. The specific item I am talking about is the gage blocks. There needs to be a grade, class, or minimum accuracy requirement.

The checkpoints established do a good job of checking the barrel and spindle throughout the range; however, they could be consolidated into one set of measurements that does both (for example X.000, X.195, X.390, X.585, X.780, and X+1.000, which check both the barrel and the spindle at evenly spaced points).
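What makes a point set like that cover both scales is that the evenly spaced lengths also land at evenly spaced spindle rotation angles. A quick check, assuming a standard inch micrometer with a 40 TPI spindle (.025" per revolution):

```python
# Check where SpaceMan's suggested offsets land on the thimble.
# Assumes a standard inch micrometer: 40 TPI thread, .025" per revolution.
PITCH = 0.025
points = [0.000, 0.195, 0.390, 0.585, 0.780, 1.000]  # offsets from range start X

for p in points:
    revs = p / PITCH
    angle = (revs % 1.0) * 360.0  # rotational position of the thimble
    print(f"{p:.3f} in -> {revs:5.1f} revs, thimble at {angle:5.1f} deg")
# The points fall at 0, 288, 216, 144, 72, and 0 degrees -- stepping evenly
# around the spindle while also sweeping the full barrel range.
```

So one six-point run exercises the thimble graduations all the way around while still spanning the barrel, rather than needing separate point sets for each.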

I noticed you put the GGG-C-105C table in your procedure, but there is no standard for verifying parallelism or flatness in your procedure. These are checked using optical parallels and a monochromatic light, or a combination of an optical flat and a ball tester (no longer available for sale, but one can be manufactured). The majority of micrometers I have rejected have been due to parallelism/flatness issues.

Just my two cents.
 