Digital Multi-Meter Calibration Procedure for Various Meter Brands and Functions

wesatwork

Learning what I can.
Task: create an internal calibration procedure for multiple DMMs, for internal use in an ISO 9001 mechanical development lab.

Problem: little time, no budget to send them out, and about 8 to 10 different models.

OK, I am hoping everyone can chime in and tell me just how bad an idea this is, with suggestions. We have DMMs ranging from old Fluke 37s to Agilent/HP 34401A and 3468B, to Keithley 2000s. We used to use our meters (qty. 73) routinely for all sorts of measurements. Now we use only about four meters, and 99% of the time they are only measuring 0 to 10 V DC. I need to create a calibration procedure using a Fluke 5500A as the reference standard. Does it make sense to create one generic procedure that only verifies partial DMM capability (a limited calibration)? We would only ever need to measure DC volts, DC amps, ohms, and frequency at 1 V AC. Instead of doing an in-depth calibration, we would check only the few measurement ranges we might use and apply a blanket tolerance (error in percent of reading).

In short:
Multiple Meter Models
One Procedure
Limited Calibration
Blanket tolerance (% of reading)
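To make the idea concrete, the limited scope above could be captured in the procedure as a single table of test points. A minimal sketch (the specific points and the 1 % blanket tolerance are placeholders, not from any datasheet or procedure):

```python
# Illustrative test-point table for a limited DMM calibration.
# Functions match the ones named above; the nominal values and the
# blanket tolerance are placeholders, not datasheet-derived numbers.
TEST_POINTS = [
    # (function, nominal, unit, blanket tolerance in % of reading)
    ("DCV",  1.0,    "V",   1.0),
    ("DCV",  10.0,   "V",   1.0),
    ("DCA",  0.1,    "A",   1.0),
    ("OHMS", 1000.0, "Ohm", 1.0),
    ("FREQ", 1000.0, "Hz",  1.0),  # measured at a 1 V AC amplitude
]

for func, nominal, unit, tol_pct in TEST_POINTS:
    print(f"{func}: apply {nominal} {unit}, accept +/- {tol_pct} % of reading")
```

One table like this, driven by one generic procedure, is what keeps the paperwork identical across all the meter models.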

Thanks for your time.
 

Mikishots

Trusted Information Resource
I am not aware of anything in 9001, 17025 or Z540-1 that states that you cannot perform a limited calibration on a device and subsequently put it into active use. We do this at my workplace, our external lab (whose system is certified to all three aforementioned standards) can do this for us, and best of all, it's cheaper to do.

All that you'd want to make sure you do is use a clear method of identifying the limited scope on the device itself, so the operator knows what function(s) the device has been approved for.

BTW, you can also perform limited-range calibrations (pressure gauges, temperature monitoring devices) in addition to limited functions.

Blanket tolerance? I'd think that you'll still need to adhere to the inherent stated accuracy tolerances of each model.
 

wesatwork

Learning what I can.
Mikishots, thanks for the comments!

Regarding the "clear method of identifying the limited scope on the device itself," I have been applying a label:
LIMITED CALIBRATION
see calibration form for restrictions

Regarding the blanket tolerance: we have so many different types of meters, and some define accuracy differently. Each measurement type/range has different tolerances too. Is there a way to keep the procedure and calibration form/report simple (KISS)?

All constructive replies appreciated!!
 

Hershal

Metrologist-Auditor
Trusted Information Resource
I do not envy you your task.

First, consider a 5500A against an HP/Agilent 34401A. The uncertainties mean you can truly calibrate only a few settings at most; even a 5520A cannot cover a 34401A at a 4:1 TUR. Some of your meters are fine with a 5500A; bluntly put, some are better than the standard itself.
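The 4:1 test uncertainty ratio (TUR) concern above can be sketched numerically. The spec figures below are illustrative placeholders, not datasheet values; substitute the real 34401A and 5500A specs at each test point:

```python
# Test Uncertainty Ratio: the DUT's tolerance at a test point divided by
# the standard's uncertainty at the same point. A common rule is TUR >= 4.
def tur(dut_tolerance, std_uncertainty):
    return dut_tolerance / std_uncertainty

# Hypothetical example at 10 V DC (placeholder numbers, check real specs):
dut_tol = 10.0 * 35e-6 + 10.0 * 5e-6   # e.g. 35 ppm of reading + 5 ppm of range
std_unc = 10.0 * 120e-6                # e.g. 120 ppm calibrator uncertainty
print(f"TUR = {tur(dut_tol, std_unc):.2f}:1")  # far below 4:1 with these numbers
```

When the ratio comes out below 4:1 at a point, that point either gets a wider acceptance limit, a guard band, or gets dropped from the limited scope.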

For me, the question comes to this:

Does management want calibration done RIGHT? Or done quick and cheap? The two, given the situation as you outlined it, are NOT, and never will be, the same.

Hope this helps.
 

wesatwork

Learning what I can.
Hershal,

I can only do what I can with what I have available. I have communicated multiple issues, but I try to pick my battles, especially since I see so many more to come.

The general thought is:
1. Get procedures created for how we are doing things now, even if they need improvement. (We currently have next to zero calibration procedures.)

2. Improve calibration (both method and standards, as needed)

3. Revise procedures to include improvements.

We are in step one right now, which will take most of this year to complete. Step two will be a major focus in the following three years.

Thanks,
Ten pounds in a five pound bag.
 

Pezikon

This sounds like a great idea, if only for the fact that it simplifies everything. In the world of commercial calibration, "customer requirements" are king. You have every right to produce your own calibration procedure, which will become official once it has been scrutinized by a 9001 auditor. There is no point in taking the time to verify functions and ranges that you never use.
In your situation, all you need to do is determine the needs of your process. That is to say, determine what functions, ranges, and tolerances are essential to whatever it is you are using your multimeters for. You already pointed out that you know what functions and ranges are essential. All you need to do is produce a calibration procedure that tests those functions and ranges.
If you are unsure where to start, collect all the service manuals for your multimeters and look at the verification procedure portion of each. Extract all the test points that are applicable to your needed functions/ranges, then put them into a single document, removing any test points that are redundant.
Your final concern is what tolerance to use for each test point. As long as the blanket tolerance is no tighter than the accuracy specification of the worst of your multimeters, you can use it. For example, if the manufacturer accuracies of your multimeters are 0.1%, 0.3%, and 1%, then you could use a blanket tolerance of 1%: all of the multimeters are capable of that accuracy. On the other hand, you could not use a blanket tolerance tighter than 1%, because one of the multimeters is not capable of it. What this ultimately means is that, by using a blanket tolerance, your best multimeter (the 34401A) will be verified only to the accuracy of your worst multimeter.
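The pass/fail logic described above is simple enough to sketch in a few lines (function name and example values are mine, not from any procedure):

```python
def within_blanket(nominal, reading, blanket_pct):
    """Pass/fail against a blanket tolerance stated in % of reading,
    applied here to the calibrator's nominal output."""
    return abs(reading - nominal) <= abs(nominal) * blanket_pct / 100.0

# 1 % blanket tolerance, as in the 0.1 % / 0.3 % / 1 % example above:
print(within_blanket(10.0, 10.05, 1.0))  # True  (50 mV error, 100 mV limit)
print(within_blanket(10.0, 10.20, 1.0))  # False (200 mV error exceeds limit)
```

One function, one tolerance column on the form: every meter model runs through the same check.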
I assure you that, in this case, a Fluke 5500A will be a perfectly suitable standard. Lastly, the LIMITED calibration sticker is a good idea; be sure to note the use of the sticker at the end of your procedure.
 

Ruebenn

Mr.H,

Greetings and how do you do?
Hope you are doing great.

I am having some issues understanding the basics of applying a correction to a measurement, say, for example, in voltage or current measurements.
I have been looking at the calibration of a multimeter, the HP 34401A, and I have identified four factors in determining the uncertainty of the measurement.

The first is Type A (the repeated measurements); the others are the drift or instability of the calibrator (we are using a Fluke 5720A calibrator), the calibrator specs, and the resolution of the multimeter itself, used to compute the expanded uncertainty for the measurement.
I know the Fluke calibrator has a spec stating the accuracy as ppm of reading + floor. How do we quantify this value? Should we take the ppm of the reading obtained on the DMM when the calibrator value is applied, or the ppm of the nominal value? And say we send this calibrator to an external lab for annual calibration: do we take the CORRECTED reading? How do we apply the correction to the reading? Do we calculate the accuracy as ppm of the measured value or of the corrected value?
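One common way to read a "± (ppm of output + floor)" spec, sketched with placeholder numbers (not 5720A datasheet values): the ppm term applies to the value the calibrator is set to output (the nominal, or the report-corrected output if your lab applies corrections), and the floor adds on top. Treating the spec as rectangular is a frequent, though not universal, assumption:

```python
import math

def calibrator_spec(output, ppm_of_output, floor):
    """Tolerance from a '+/- (ppm of output + floor)' spec.
    The ppm applies to the calibrator's set output, not the DUT reading."""
    return abs(output) * ppm_of_output * 1e-6 + floor

# Placeholder numbers at a 10 V setting:
a = calibrator_spec(10.0, 3.5, 2.5e-6)  # 10 V * 3.5 ppm + 2.5 uV floor = 37.5 uV
u_std = a / math.sqrt(3)                # standard uncertainty, rectangular assumption
print(f"spec = {a*1e6:.1f} uV, u = {u_std*1e6:.1f} uV")
```

The standard uncertainty from this term is then combined in quadrature with the Type A, drift, and resolution contributions before applying the coverage factor.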

I am a bit lost here; I have seen and read some journals, but I think I am still unclear about this. Can someone help or point me in the right direction?

Thank you and have a nice day.
B.R
Ruben
 