How to Set Acceptable Calibration Limits for Measurement Instruments?

sowmya

During verification of calibration certificates, we find deviations. How do we fix acceptable limits for the instruments? Are there any standards available that say, for example, that the acceptable limits for a vernier caliper are such-and-such?

Some people say that acceptable limits can be based on the process tolerance, but we are contract manufacturers and the process tolerance varies for different products. Any help?
 

Al Rosen

Leader
Super Moderator
During verification of calibration certificates, we find deviations. How do we fix acceptable limits for the instruments? Are there any standards available that say, for example, that the acceptable limits for a vernier caliper are such-and-such?

Some people say that acceptable limits can be based on the process tolerance, but we are contract manufacturers and the process tolerance varies for different products. Any help?
The instrument's manufacturer determines the limits. Look at the accuracy stated in the specifications.
 

BradM

Leader
Admin
During verification of calibration certificates, we find deviations. How do we fix acceptable limits for the instruments? Are there any standards available that say, for example, that the acceptable limits for a vernier caliper are such-and-such?

This is a bit of a problem. When you say you are finding deviations, do you mean out of tolerance, or just deviations within tolerance? There will always be deviations, but the instrument should be adjusted to minimize them.

Are you calibrating your own instruments, or having someone calibrate them for you? If someone is doing them for you, are they competent to do your work? What tolerance do they cite on the certificates?


Some people say that acceptable limits can be based on the process tolerance, but we are contract manufacturers and the process tolerance varies for different products. Any help?

Yes... to an extent. For your situation, though, Al's suggestion is the correct one. You would create a complex situation trying to keep up with everything. However, you might be able to have different standards for different processes (which have different tolerances).
 

Jerry Eldred

Forum Moderator
Super Moderator
I don't at all disagree with any previous replies. As has already been covered, tolerances are set per model. A given model of caliper, for example, has a given tolerance. The tolerance does not change based on what is being measured. The overall measurement uncertainty may change based on a variety of factors, but the tolerance for a model stays the same.
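
To make that distinction concrete, here is a rough sketch (Python; all of the component values are invented for illustration) of how the expanded measurement uncertainty is built up from usage-dependent contributions while the model's tolerance stays fixed:

    import math

    # Fixed tolerance for this caliper model, from the OEM spec sheet (mm).
    OEM_TOLERANCE_MM = 0.02

    # Uncertainty contributions depend on how and where the instrument is used.
    # These standard uncertainties (mm) are invented for illustration.
    u_resolution = 0.01 / math.sqrt(12)  # rectangular distribution of a 0.01 mm readout
    u_repeatability = 0.004              # scatter of repeated readings on the part
    u_reference = 0.002                  # from the reference standard's certificate

    # Combine by root-sum-of-squares (GUM-style) and expand with k = 2 (~95 %).
    u_combined = math.sqrt(u_resolution**2 + u_repeatability**2 + u_reference**2)
    U_expanded = 2 * u_combined

    print(f"Tolerance (fixed by model):   +/- {OEM_TOLERANCE_MM} mm")
    print(f"Expanded uncertainty (k = 2): +/- {U_expanded:.4f} mm")

Change the repeatability term and the expanded uncertainty moves; the OEM tolerance does not.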

However, one consideration in establishing tolerances for your own instruments is that there are legitimate methods for defining your own. I say this very carefully, so please read the details below closely if you consider doing this.

-------------

The normal industry practice is to certify that an instrument meets its specified tolerances. However, if you use an instrument to tolerances different from the manufacturer's (OEM = Original Equipment Manufacturer), you may use a process-based, local tolerance rather than the OEM tolerance. Doing this, however, requires documentation to ensure it is fully defined.

-Your quality system must specifically allow local tolerances.
-You must document this for each instrument, so that there is no confusion over whether OEM tolerances or a locally established tolerance is used (a rough sketch of such a record follows this list).
-You must be able to ensure there can be no confusion as to which tolerance applies.
-Users must have documented training to assure only the correct tolerance is used at the correct process.
-If you use multiple differing tolerances for the same instrument on different processes, each instrument must be CLEARLY labeled for what tolerance and which process it may be used on.
-Calibration Procedures, process documents, and calibration labeling must account for the differences.
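
For instance, the per-instrument record might look something like this rough sketch (Python; the instrument IDs, processes, and values are all invented), where every instrument explicitly states whether the OEM or a local tolerance applies and, for local tolerances, which process it is restricted to:

    # Hypothetical tolerance register: each instrument states which tolerance
    # basis applies; local tolerances are tied to one named process.
    TOLERANCE_REGISTER = {
        "CAL-0012": {"basis": "OEM",   "tolerance_mm": 0.02, "process": None},
        "CAL-0047": {"basis": "LOCAL", "tolerance_mm": 0.05, "process": "bracket deburr"},
    }

    def tolerance_for(instrument_id, process=None):
        """Return the applicable tolerance; refuse ambiguous use of a local tolerance."""
        entry = TOLERANCE_REGISTER[instrument_id]
        if entry["basis"] == "LOCAL" and entry["process"] != process:
            raise ValueError(
                f"{instrument_id} carries a local tolerance valid only for "
                f"process {entry['process']!r}, not {process!r}"
            )
        return entry["tolerance_mm"]

    print(tolerance_for("CAL-0012"))                    # OEM tolerance, any process
    print(tolerance_for("CAL-0047", "bracket deburr"))  # local tolerance, named process

The point of the hard failure is exactly the confusion warned about above: a local tolerance should be unusable outside the process it was defined for.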

This is a risky method and should only be done if it is worthwhile for your application.

I have said previously that you can calibrate anything to any tolerance you want. **HOWEVER** (and it is a BIG HOWEVER), whenever you deviate from calibrating to OEM tolerances, there is risk (users will misuse instruments and make erroneous measurements; another long topic). So if you wish to calibrate to multiple local tolerances, carefully consider the risks, and be sure they are all accounted for in your quality system.

P.S. - Hi J. S.
 

BradM

Leader
Admin
Nice job, Jerry. You typed what I was thinking about mentioning.

P.S. - Good to have you checking back in.
 

sowmya

Thanks for the great input, Jerry. Still some confusion...

We calibrate our soldering stations in house (accuracy is defined as ±5°C; I do not know how that figure was derived, but the process is not critical for our application and it has worked so far; during verification we give the go-ahead for production if the deviation is within ±5°C). Other instruments such as vernier calipers and micrometers go to outside calibration agencies (NABL accredited).

For vernier caliper model 500-196, the accuracy given is ±0.001 in. Does that mean that if it measures 19.97 mm at a nominal value of 20 mm, we need to reject the instrument or apply a correction factor? Confused...

The calibration agency has given a statement at the top of the results: "instrumental allowable error of measuring length ±0.03 mm". Which error does it refer to?

Another thing: for our load cell we could not find an NABL-accredited lab, so we had the calibration done without NABL accreditation. Last time the TS auditor questioned this and we gave the same explanation. What should we do in that case?

Thanks,

Sowmya
 
pinpin

Dear Jerry,

How can we tell from the calibration report whether our instrument is still fit to be used?

I have heard some people say we have to look at the measurement uncertainty value and compare it with the tolerance of the part we are to measure using this instrument, and then we can determine whether it can be used or not...

Please teach me.

Thank you...
 

BradM

Leader
Admin
I'm not Jerry :D, but I will add a few thoughts for you!

We calibrate our soldering stations in house (accuracy is defined as ±5°C; I do not know how that figure was derived, but the process is not critical for our application and it has worked so far; during verification we give the go-ahead for production if the deviation is within ±5°C). Other instruments such as vernier calipers and micrometers go to outside calibration agencies (NABL accredited).

Sowmya

You did not list the process requirement, so I am not sure how to address that. You state that accuracy is defined as ±5°C. Are the stations meeting that criterion when you verify them? If so, you should be fine. If not, then you need to address what the problem is: either the stations cannot meet that accuracy, there are significant errors in your measurement system, or a combination of both. I assume you have an appropriate standard to measure these.
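
As a rough illustration (Python; the setpoint and readings are invented), the go/no-go check you describe amounts to comparing each deviation from the setpoint against the ±5°C limit:

    # Hypothetical verification of a soldering station set to 350 degC.
    SETPOINT_C = 350.0
    LIMIT_C = 5.0                        # acceptance limit: +/- 5 degC
    readings_c = [348.7, 351.2, 355.8]   # invented measured values

    for measured in readings_c:
        deviation = measured - SETPOINT_C
        verdict = "PASS" if abs(deviation) <= LIMIT_C else "FAIL"
        print(f"measured {measured:.1f} degC, deviation {deviation:+.1f} degC -> {verdict}")

Note this ignores the uncertainty of whatever thermometer you verify with; if that uncertainty is a large fraction of 5°C, a reading near the limit tells you very little.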

For vernier caliper model 500-196, the accuracy given is ±0.001 in. Does that mean that if it measures 19.97 mm at a nominal value of 20 mm, we need to reject the instrument or apply a correction factor? Confused...

I'm confused also! :D You have a nominal value of 20 mm listed, but the accuracy is listed as ±0.001 inch. I would keep to one set of units (mm or inch). The caliper documentation should list a tolerance in both mm and inch. If this is a four-place caliper, 0.001 inch should be obtainable; if it is a three-place unit, that will be difficult. Also, there is some error accumulated through measurement technique and the like that should be accounted for.
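
For what it's worth, converting the spec makes the question concrete. A quick sketch (Python; this treats ±0.001 in as the applicable tolerance, which you should confirm against the caliper's data sheet):

    IN_TO_MM = 25.4

    tolerance_mm = 0.001 * IN_TO_MM          # +/- 0.001 in = +/- 0.0254 mm
    nominal_mm = 20.00
    measured_mm = 19.97
    deviation_mm = measured_mm - nominal_mm  # -0.03 mm

    print(f"tolerance: +/- {tolerance_mm:.4f} mm")
    print(f"deviation: {deviation_mm:+.4f} mm")
    print("within tolerance" if abs(deviation_mm) <= tolerance_mm else "out of tolerance")

On those numbers, a 0.03 mm deviation falls outside a ±0.0254 mm tolerance, which is why pinning down the applicable tolerance and its units matters before deciding whether to reject or correct.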


The calibration agency has given a statement at the top of the results: "instrumental allowable error of measuring length ±0.03 mm". Which error does it refer to?

Sowmya

Without seeing it, it's hard to determine. It is either the acceptable tolerance they are stating for the instrument they are verifying, or the uncertainty of their standard measurement system. At 0.03 mm, I would think it is the allowable error of the instrument they are verifying for you.


Just something for you to think about... I sense that you are getting frustrated dealing with tolerances on the certificates, in/out assessments, etc. A good calibration house is well worth what you pay them, and they are there for you. However, there is no magical book or list of perfect tolerances for equipment/instrumentation. There are simply too many variables. You need to establish good communication with them.

I encourage you to determine what your requirements are, and whether the measurement system is working for those requirements. You should make sure that all your instruments are providing the confidence you need in your measurements. Talk with your calibration source about the tolerances they report; they should be more than happy to assist you with that. In the end, you should know that you are providing the proper system for your customers, and that your calibration vendor is providing information you understand and can use.

I hope something here helps.
 

BradM

Leader
Admin
How can we tell from the calibration report whether our instrument is still fit to be used?

I have heard some people say we have to look at the measurement uncertainty value and compare it with the tolerance of the part we are to measure using this instrument, and then we can determine whether it can be used or not...

Well, I'm not Jerry :D, but I hope you don't mind if I throw a thought or two your way.

A calibration report from a competent lab is a snapshot of the equipment. It states that on this date, using this measurement system with this uncertainty, using these standards and procedures, here are the values we found. Without any other communication from you, they will generally report against the manufacturer's specifications for that equipment.

What you do with that information is up to you. Does the error they found fall within your requirements? Is it suitable for your use?

Hopefully you (or your customer), your specification, something, states the acceptable tolerance required for that process. If the measurement system is within that tolerance, then you can have confidence in it. If it exceeds that tolerance, then it should be adjusted, replaced, have its interval shortened, or whatever remedial action is appropriate.

The reason for all of this is: if the instrument is within the required tolerance (your requirement), then whatever error was noted is insignificant to your results. If the instrument is outside of that tolerance, the noted error may have impacted your results, and some level of investigation should ensue to assure that no poor quality resulted from measurements made with that instrument.
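
The comparison pinpin mentioned is often expressed as a test uncertainty ratio (TUR). A rough sketch (Python; the numbers are invented, and the 4:1 target is a common rule of thumb rather than a universal requirement):

    # Hypothetical values from a part drawing and a calibration report.
    part_tolerance_mm = 0.10         # +/- tolerance on the feature being measured
    expanded_uncertainty_mm = 0.02   # lab's reported expanded uncertainty U (k = 2)
    reported_error_mm = 0.015        # as-found error from the report

    tur = part_tolerance_mm / expanded_uncertainty_mm
    print(f"TUR = {tur:.1f}:1 (4:1 or better is a common rule of thumb)")

    # Simple guardbanded acceptance: shrink the acceptance zone by U so a
    # borderline reading is not passed on the strength of the uncertainty alone.
    guardbanded_limit_mm = part_tolerance_mm - expanded_uncertainty_mm
    fit_for_use = abs(reported_error_mm) <= guardbanded_limit_mm
    print("fit for use" if fit_for_use else "investigate before use")

Guardbanding is only one possible decision rule; the important thing, as above, is that whatever rule you use is written down and compared against your requirement, not just the OEM specification.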

Sorry for the long-windedness. I hope I helped in some fashion.
 