Calibration of Tape Measures used on our production floor - What are 'Best Practices'



We recently received a minor nonconformance during a surveillance audit. The finding concerned the calibration (or lack thereof) of tape measures used on our production floor. These tapes are used to confirm lengths after a cutting operation. I am looking for a logical, simple method for calibrating tape measures. Does anyone have any "best practices" to share? I appreciate the assistance.



I think that you may need to provide more information on the subject.

Are the tapes used to "determine acceptance of the product" for movement to the next function area or for shipment?

If not, label them as "reference tools" (assuming that you have some other method of measuring/testing the product for conformance to specification).

If they are, I believe that you may be in a pinch. Tape measures are subject to so much regular abuse, deformation, and other conditions that affect their accuracy (at some tolerances) that you might find yourself hard pressed to keep them calibrated (or spend a ton of time rechecking them).

Apologies for only being semi-helpful. Throw some more details out there so that those more knowledgeable than I on the subject can give you some better advice.



dscumaci: I must agree with ALM, more info is needed to give a helpful reply.
What type of verification is used at your facility to determine calibration of tape measures as specified under your calibration procedures?

Quality Mgr.
QS9000 Coordinator


Thank you for your response, and I apologize for the vagueness of the question. To clarify the matter, the tape measures are used to make accept/reject decisions after steel bar has been cut. Tolerances for length of bar and plate range from 1/4" to 1/16". However, my question is: what may be a logical means of calibrating the tape measures? Does anyone have experience implementing a process for calibrating tape measures? What are some options we may have?

Thank you for your help;


Roger Eastin

You can't calibrate a tape measure (because you can't adjust it to bring it back into conformance), but you can verify aspects of it. You can verify that it is still readable and you can compare it to a standard (if you suspect warpage, for instance). One option is to verify it at incoming inspection and then establish reasonable intervals for verification. Also, come up with rules of action, such as what to do if the tape measure has been dropped or been exposed to unusual wear conditions.
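Roger's verify-against-a-standard approach can be sketched in a few lines. This is only an illustration: the check points and the acceptance limit below are assumptions you would set from your own tolerances and procedure, not values from this thread.

```python
# Sketch of a tape-measure verification check against a calibrated steel rule.
# CHECK_POINTS_IN and ACCEPT_LIMIT_IN are illustrative assumptions -- pick
# values that make sense for your tolerances and the tape's usable range.

CHECK_POINTS_IN = [1.0, 6.0, 12.0, 24.0]   # nominal lengths compared to the rule
ACCEPT_LIMIT_IN = 1 / 32                   # example limit: half of a 1/16" tolerance

def verify_tape(readings):
    """readings: {nominal length: tape reading at that point on the rule}.
    Returns (passed, list of (nominal, error) pairs that exceeded the limit)."""
    failures = [(nom, r - nom) for nom, r in readings.items()
                if abs(r - nom) > ACCEPT_LIMIT_IN]
    return (len(failures) == 0, failures)

# Example: a tape reading slightly long at the 24" point
ok, fails = verify_tape({1.0: 1.0, 6.0: 6.01, 12.0: 12.02, 24.0: 24.05})
print("PASS" if ok else f"FAIL at {fails}")
```

Recording the result each time gives you the history needed to set (and later lengthen or shorten) the verification interval.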

Kevin Mader

One of THE Original Covers!
Remember that you are required to calibrate equipment through its usable range, so purchase the right tape measure for the job. To add to Roger's post, consider the things that break or become loose (e.g. the end-hook on the tape measure). Is there any play?


Do you need to consider the "readability" factor, where the measurement device must read one more decimal than the stated tolerance? 1/16 is 0.0625. Should your device be readable to 0.006 (roughly 1/160)? This may be splitting hairs, but could this be an audit target, and more importantly an issue for you? I guess it depends upon the criticality of the measurement.
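The arithmetic behind that readability question can be made explicit. The 10:1 and 4:1 ratios below are common rules of thumb for device resolution versus tolerance (my framing, not something stated in the thread); "one more decimal" is a looser version of the same idea.

```python
# Resolution needed for a 1/16" tolerance under common rules of thumb.
# The 10:1 and 4:1 ratios are assumptions for illustration, not requirements
# from this thread -- check your own calibration procedure.

tolerance = 1 / 16             # 0.0625"
ten_to_one = tolerance / 10    # 0.00625" -- finer than a typical tape can read
four_to_one = tolerance / 4    # 0.015625" -- a looser, often-accepted minimum

print(f"tolerance={tolerance:.4f}\"  10:1 -> {ten_to_one:.5f}\"  4:1 -> {four_to_one:.6f}\"")
```

Either way, a tape graduated in 1/16" increments is at the ragged edge for a 1/16" tolerance, which is the practical point behind the question above.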

barb butrym

Quite Involved in Discussions
Keep a steel rule for verification when the tape gets worn/abused... discard (or tag) and replace with a new one when wear reaches an unacceptable level. The steel rule comes in calibrated/verified or whatever, and if not heavily used has an effectively infinite cal cycle... or put it on one of 15 years. You set the rules; do what makes sense for you.

Kevin Mader

One of THE Original Covers!
I agree with Barb's recommendation. Currently, our program to calibrate a tape measure includes verification that the end-hook does not have too much play (dependent on the nature of the tolerance and item being measured) by ensuring that when compared to a steel rule, the allowable slop is within your established limits. The rest of the tape should remain consistent, barring any kinks or bends.


Regarding the tape measure, if I have a tolerance of +/- 1/8 inch, does that mean that I must be able to read the measurement to +/- 0.0125" if the measurement appears on my control plan but is not a customer-designated key (special) characteristic? Also, what about conversions from metric to English? As an example, say the spec for the thickness of material is 1 +/- 0.1 mm; this would convert to (approx.) 0.03937 +/- 0.003937". How would this best be handled?
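The metric-to-inch conversion in that question is straightforward to check using the exact definition 1 in = 25.4 mm; this is just the arithmetic, not a recommendation on which units to inspect in.

```python
# Converting the example spec 1 +/- 0.1 mm to inches.
# 25.4 mm per inch is exact by definition.

MM_PER_IN = 25.4

nominal_mm, tol_mm = 1.0, 0.1
nominal_in = nominal_mm / MM_PER_IN   # ~0.03937"
tol_in = tol_mm / MM_PER_IN           # ~0.00394"

print(f"{nominal_in:.5f} +/- {tol_in:.5f} in")
```

One common way to avoid the rounding question entirely is to inspect in the units the drawing is dimensioned in and convert only for reporting.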