How to specify a calibration accuracy?


BishDamo

#1
Hi
I was just going through some old threads on calibration accuracies. Some very good info, but I'm still confused as to how one should spec a calibration accuracy requirement.

Suppose you need to measure a spec of say +/-0.002" (total tolerance = 0.004"), then my understanding from prior threads is that:
1) I should use a piece of eqpt that has resolution of at least 0.0004" (10:1 rule of thumb)
2) I should calibrate using a standard that is at least 4 times more accurate than the accuracy I want (4:1 rule of thumb)
But how do I decide on the accuracy needed?

From looking at previous threads, a common answer seems to be that I need to calibrate to an accuracy of 1/4 the tolerance (i.e. 0.001"). I find it hard to believe that the 4:1 rule is intended to be used like this - 0.001" just seems way too crude. As I understand it, the 4:1 rule refers to the accuracy of the standard relative to that of the equipment, not the equipment relative to the spec?

Anyway, my question is: how would you go about specifying a calibration accuracy for the equipment in this case?

Thanks
 

Jen Kirley

Quality and Auditing Expert
Staff member
Admin
#2
I'm going to make myself look ignorant I suppose, but I never understood where the 4:1 rule comes from. However, other people do, and the thread "Please clarify the Rule of 10 to 1 - AND - What is the ndc number?" is a good discussion of the 4:1 and 10:1 rules.

I have historically operated under the 10:1 rule, but that is not about the specification tolerance window; it is about the increments (in your case, thousandths) in which the instrument measures.

That means the instrument should be calibrated using a standard accurate to ten times the resolution. That is, calipers that measure in .001" should be calibrated using gage blocks/whatever that are certified accurate to at least .0001".

And that is just what I would say in a calibration procedure: use a standard accurate to at least 10 times the resolution of the item it is being used to validate.

This Wikipedia page on gauge blocks lists the different gage block grades and their tolerances.

I hope this helps!
 

CalRich

Involved In Discussions
#3
Suppose you need to measure a spec of say +/-0.002" (total tolerance = 0.004"), then my understanding from prior threads is that:
1) I should use a piece of eqpt that has resolution of at least 0.0004" (10:1 rule of thumb)
2) I should calibrate using a standard that is at least 4 times more accurate than the accuracy I want (4:1 rule of thumb)
But how do I decide on the accuracy needed?

From looking at previous threads, a common answer seems to be that I need to calibrate to an accuracy of 1/4 the tolerance (i.e. 0.001"). I find it hard to believe that the 4:1 rule is intended to be used like this - 0.001" just seems way too crude.
If your part spec is +/-0.002" (total tolerance = 0.004"), then you should use a gage with an accuracy (not just a resolution) of .0004". That gage should in turn be calibrated with a standard accurate to 1/4 of the gage's tolerance, as you stated. But 1/4 of .0004" is .0001" ("a tenth of a thousandth"), not .001" (a thousandth) as you have noted. In practice one would probably be using gage blocks as masters at this point, which commonly have tolerances no greater than 50 millionths (.000050").
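The arithmetic above can be sketched in a few lines (a minimal illustration; `ratio_chain` is a made-up helper name, and the 10:1 / 4:1 defaults are the rules of thumb under discussion, not requirements from any standard):

```python
# Illustrative sketch only: chain the 10:1 part-to-gage and 4:1
# gage-to-standard rules of thumb. Values are in inches.
def ratio_chain(total_tolerance, gage_ratio=10, standard_ratio=4):
    gage_accuracy = total_tolerance / gage_ratio        # accuracy, not just resolution
    standard_accuracy = gage_accuracy / standard_ratio  # what the master must hold
    return gage_accuracy, standard_accuracy

# Part spec +/-0.002" (total tolerance 0.004"):
gage_acc, std_acc = ratio_chain(0.004)
# gage_acc works out to .0004" and std_acc to .0001" ("a tenth"),
# matching the numbers above.
```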
 

Jerry Eldred

Forum Moderator
Super Moderator
#4
I'm not quite sure I understand the added spec requirement you mentioned. In my understanding of calibration (in basic terms), there is the spec of the device being calibrated, and the spec of the measurement standard you use to calibrate it.

THE T.U.R. (Test Uncertainty Ratio) RULE

The spec (tolerance) for the device under cal in your example was +/-0.002. In the calibration world we would not rename that tolerance as 0.004; to keep things simple, keep terms in a common format (tolerance, resolution, etc.). I recommend expressing everything in this rule in "+/-" format (the normal way to express such details in calibration). So for a 4:1 T.U.R., the UNCERTAINTY of the measurement standard needs to be +/-0.0005 (1/4 of +/-0.002).

So the tolerance is +/-0.002, and the measurement standard you use to calibrate the device should be at least four times more accurate, to provide guardbanding. Making the standard four times more accurate than the item being calibrated provides a safety factor: even if the standard drifts out of its own spec over time, the risk of mis-calibrating the item due to an out-of-spec standard is minimized.

THE RESOLUTION RULE

The rule of 10:1 resolution is a separate (but equally important) detail. Don't confuse it with the items discussed above. Let's call the 10:1 resolution issue a new topic. Keep it as a separate thought from the T.U.R. rule. It applies differently.

When you make a measurement to a given accuracy, your resolution should be significantly better than the accuracy needed. If you had an accuracy of +/-0.002 and a resolution of 0.001, each digit of measurement resolution would be 50% of the tolerance. If you wanted to do statistical analysis of that measurement (stability, etc.), you could not get very meaningful data (the SPC gurus on this site can explain this part better than I). But by requiring resolution ten times better than the accuracy requirement (tolerance), you get much more meaningful statistical information. If you're evaluating SPC of a process, you need good resolution to help you evaluate and adjust process parameters.
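The two rules as separate calculations might look like this (a sketch with hypothetical function names, keeping everything in the "+/-" format recommended above):

```python
# Sketch only; function names are illustrative, not from any standard.
def tur_standard_uncertainty(plus_minus_tolerance, tur=4):
    # 4:1 T.U.R.: the standard's uncertainty should be tolerance / 4 or better.
    return plus_minus_tolerance / tur

def max_resolution(plus_minus_tolerance, ratio=10):
    # Resolution rule: each displayed increment should be tolerance / 10 or finer.
    return plus_minus_tolerance / ratio

# Device under cal spec'd at +/-0.002":
std_unc = tur_standard_uncertainty(0.002)  # 1/4 of +/-0.002
res = max_resolution(0.002)                # 1/10 of +/-0.002
```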

That's my shot.
 

BishDamo

#5
Jennifer,
Thanks for the pointer to the 10:1 thread.

Jerry,
Thanks for your input. I get the resolution/uncertainty difference and everything you said makes perfect sense to me. But the +/-0.002" is not the spec on the device being calibrated, but the tolerance on the product which is being measured by that device. My question is what should the device's required calibration accuracy be set at given that it'll be used to measure something that has a tolerance of +/-0.002"? Once I've decided that, I completely understand that I then need to look for a standard with at least 4x that accuracy. CalRich has suggested a 10:1 ratio.

CalRich,
Everything I've read on the 10:1 rule of thumb suggests it should be used for deciding the minimum resolution of the measurement device, but you've suggested using it for something completely different. That makes good sense to me, but I just want to make sure you're not confusing resolution and accuracy. Are you saying that good practice would be for both the equipment resolution AND the required calibration accuracy to be 10:1 or better?
 

Jerry Eldred

Forum Moderator
Super Moderator
#6
Okay. I think I've got it now. Sorry. The variety of terms can get confusing.

I can mostly speak only to the relationship between the calibrated device and what you use to calibrate it. Keep that 4:1 relationship pretty fixed.

Let's see if I can get it this time.

There should be a 10X relationship between your final product spec and the instrument/gage/etc. used to test it.

The instrument/gage/etc. you use to test the product must be calibrated by a standard that is 4X more accurate than the instrument/gage/etc. that you use.

And for any of the items involved (instrument, gage, or product test), the resolution of any readout used in the measurement should be 10X better than the tolerance or spec it is being read against.
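Under that reading, the three ratios chain together like this for the +/-0.002" example (an illustrative sketch; the function name and the choice of which tolerance the resolution rule keys off are my own interpretation):

```python
def calibration_chain(product_plus_minus):
    # 10X between the product spec and the gage used to test it:
    gage_accuracy = product_plus_minus / 10
    # 4X between the gage and the standard that calibrates it:
    standard_accuracy = gage_accuracy / 4
    # Readout resolution 10X better than the spec it is read against:
    gage_resolution = product_plus_minus / 10
    return gage_accuracy, standard_accuracy, gage_resolution

gage_acc, std_acc, gage_res = calibration_chain(0.002)
# +/-0.0002" gage accuracy, +/-0.00005" standard accuracy, 0.0002" resolution
```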

Don't know if that just muddied the waters more, or helped.
 

Gordon Clarke

#7
At the risk of invoking the wrath of the truly knowledgeable, I'll give my contribution. Unless I was given "Rules and Regulations" chiselled in granite, I'd use a combination of experience and common sense. I'd also settle for a national or international standard that gave me the information I needed. Re the "national", it'd have to be from my own country of course.

As I’ve never seen the existence of a standard that could be used in all cases I’ll go with experience and common sense.

For calibration I’d use an instrument I knew to be preferably at least 10 times as accurate as that of the “instrument” which I intended to calibrate. Without getting into things like temperature and humidity, I’d find an environment I considered reliable and suitable.

With solid gauges, it'd be 10 times the gauge's manufacturing tolerance. When all's said and done, it amounts to being able to give a trustworthy result that can be documented or explained rationally and, of course, repeated. There are auditors that are oblivious to common sense, but I'd insist they tell me what they would have done if they'd had to do the calibration, or tell me where I can find rules that tell me what to do.

Slightly off-topic but don’t you just hate the auditors that start a question with, “What if …….”
 

BradM

Staff member
Admin
#8
It's a good question.

Basically it's about confidence. You want confidence in the device you are measuring with.

Keep in mind two things... One, all your standards should be traceable to a known national standard. So you are going to pick up error as you move down the chain. You want to know what that error is (uncertainties), and have more confidence in the equipment as you move up the chain.

Second, you want to avoid the "blind leading the blind". If you compare two instruments of the same make/model (and the same published accuracy) and they disagree, you have only a 50/50 chance of guessing which one is off. That's no good!! You want better odds than that.

So the more accurate standard that you have, the better probability you have that it will yield confident readings. Also, the larger the ratio, the more error can occur in the standard without having adverse effects on all the instruments calibrated by it.

My rule of thumb is this: always buy the best standards you can afford. By "best" I don't always mean the one with the best stated accuracy. Make sure the instrument can be calibrated by competent labs, and try to find out whether units like it fail every other time they're calibrated. If they fail all the time, there's not much confidence to be had in them.
 

CalRich

Involved In Discussions
#9
CalRich,
Everything I've read on the 10:1 rule of thumb suggests it should be used for deciding the minimum resolution of the measurement device, but you've suggested using it for something completely different. That makes good sense to me, but I just want to make sure you're not confusing resolution and accuracy. Are you saying that good practice would be for both the equipment resolution AND the required calibration accuracy to be 10:1 or better?
I wouldn't necessarily say I meant to focus on accuracy versus resolution as such. But when looking at the 10:1 ratio for part/measurement device and the 4:1 ratio for measurement device/calibration master, accuracy is what is key, not resolution. An instrument's accuracy is not always equal to its resolution.
There are times when a gage (e.g. an OD micrometer) has a resolution of, say, .0001" but is only accurate to .0002" - check the specs or the gage's last calibration. Here are manufacturer specs on OD mics where you can see that the accuracy is not always equal to the resolution:

[Attachment: manufacturer OD micrometer specifications]
Be wary of this not only from the perspective of the manufacturer's new-gage accuracy spec, but also the calibrated accuracy. I've found that many people stick with the manufacturer's accuracy spec for the life of the gage, but the actual accuracy can differ based on calibration results.
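A check along those lines might be sketched as follows (a hypothetical helper; the point is that the ratio is computed against the gage's calibrated accuracy, not its resolution):

```python
def meets_ratio(part_total_tolerance, gage_accuracy, ratio=10):
    # True if the gage's *accuracy* (from its latest calibration) is at
    # least `ratio` times finer than the part tolerance it will check.
    return part_total_tolerance / gage_accuracy >= ratio

# A mic reading to .0001" but calibrated accurate to only .0002",
# checking a 0.004" total tolerance:
print(meets_ratio(0.004, 0.0002))  # True  (20:1 on accuracy)
print(meets_ratio(0.004, 0.0005))  # False (only 8:1)
```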
 

BishDamo

#10
Thanks all for your help, this was very useful for me and I've a much better appreciation of the topic now.
 