Best Method for Calibrating Multimeters


cmwilliams48186

What is the best way to calibrate a multimeter? Is it more cost effective to just buy a new one yearly? This is for a small IVD manufacturer.
 

BradM

Leader
Admin
Hello, and welcome to the Cove!:bigwave:

To your question, the "best" way and the least expensive way may be two different things. :D

My opinion is to keep one multimeter and have it calibrated on a frequent basis. If it passes calibration OK a number of times, then you can extend the calibration frequency.

If it fails calibration, then you need to shorten the frequency. If the instrument then has a history of not passing calibration, then dispose of it, and get one that will pass.
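BradM's extend-on-pass / shorten-on-fail policy can be sketched in code. This is a hypothetical rule of thumb, not anything from the thread — the thresholds, factors, and interval limits below are all assumptions you would set from your own quality requirements:

```python
# Hypothetical sketch of an extend-on-pass / shorten-on-fail
# calibration interval rule. All thresholds and factors are assumptions.

def next_interval(months, history, passes_to_extend=3,
                  extend_factor=1.5, shorten_factor=0.5,
                  max_months=24, min_months=3):
    """Return the next calibration interval in months.

    history is a list of booleans, most recent result last
    (True = passed calibration, False = failed).
    """
    if not history:
        return months
    if not history[-1]:                      # last calibration failed: shorten
        return max(min_months, round(months * shorten_factor))
    recent = history[-passes_to_extend:]
    if len(recent) == passes_to_extend and all(recent):
        return min(max_months, round(months * extend_factor))
    return months                            # keep the current interval

print(next_interval(12, [True, True, True]))   # 18 (extended)
print(next_interval(12, [True, False]))        # 6 (shortened)
```

A meter with "a history of not passing calibration" would keep hitting the shorten branch until it bottoms out at the minimum interval — a good signal to dispose of it, as suggested above.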
 

Marc

Fully vaccinated are you?
Leader
<snip> Is it more cost effective to just buy a new one yearly? <snip>
I know some companies that do that. I also know some companies that do that with a lot of their glassware, mercury thermometers, etc. By the time you have it calibrated the cost is nearly the same as, and sometimes more than, just buying a new one that comes with a calibration certificate.
 

tomvehoski

The first question is, what is it being used for? Second, how accurate does it need to be?

If it is in the maintenance guy's toolbox to troubleshoot light switches and electrical outlets, there is no need to calibrate it. If it is being used to assess the quality of medical diagnostic equipment, you probably need a higher-accuracy (expensive) device with certificates of calibration to traceable standards. It's been a while since I've worried about calibrated DMMs, but as I recall you still had to buy the calibration cert even if you were buying a new meter.
 

BradM

Leader
Admin
I know some companies that do that. I also know some companies that do that with a lot of their glassware, mercury thermometers, etc. By the time you have it calibrated the cost is nearly the same as, and sometimes more than, just buying a new one that comes with a calibration certificate.

I know a lot that do that also. :agree1: The 'kicker' is that if someone was using a "calibrated" device and there is no second calibration/verification after the initial one, they do not know whether they were using it with a problem. :)
 

dv8shane

What is the best way to calibrate a multimeter? Is it more cost effective to just buy a new one yearly? This is for a small IVD manufacturer.
I do not suggest buying new yearly, because if the unit has gone out of specification during the year it was used, you will not know, and could have released non-conforming product. With a multimeter calibration costing <$100, that is far less than a year's worth of work being recalled when a non-conformance caused by this kind of problem is discovered by the customer. I suggest using top-quality multimeters such as Fluke, as they have proven reliable over long periods, maintaining them, and extending the interval as BradM suggested.

Shane
 

Marc

Fully vaccinated are you?
Leader
I know a lot that do that also. :agree1: The 'kicker' is that since someone was using a "calibrated" device and there is no second calibration/verification (from the initial calibration), one does not know if they were using it with a problem. :)
Good point.
 

Graeme

What is the best way to calibrate a multimeter? Is it more cost effective to just buy a new one yearly?

There are two reasons for calibrating any measuring instrument. The most commonly cited and important is to verify measurement traceability to the International System of Units (the SI). Most will agree that this is essential.

The other reason, as said by dv8shane, is to determine the status of the instrument at the end of the calibration interval, in order to detect any potential problems. That is of equal importance for any industry, and especially where health and/or safety are a prime consideration.

When an instrument of any type is received by a calibration laboratory, they typically (should) follow a multistep process.
  • Go through the entire calibration procedure, recording the "as found" values
  • Determine if the instrument is within acceptable limits or not
    • If the instrument is at or outside acceptable limits and is adjustable -
      • notify the customer of the out-of-tolerance condition
      • adjust or repair as necessary (if approved by customer)
      • repeat the entire calibration procedure, recording the "as left" values
    • If the instrument is at or outside acceptable limits and is NOT adjustable -
      • notify the customer of the out-of-tolerance condition
      • If approved by the customer, calculate correction values
  • complete the report and return the instrument to the customer
Something like a digital multimeter may or may not be adjustable. A bench style meter may be, a handheld model may not. If something like this is out of acceptable limits, you will need to evaluate the effect on your processes and initiate corrective action. If the meter is not adjustable, then you will also have to consider buying a new one.
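The as-found / as-left flow in the list above can be sketched roughly for a single measurement point. Everything here — the tolerance check, the report fields, and the assumption that an approved adjustment reduces the error to zero — is an illustrative simplification, not any laboratory's actual procedure:

```python
# Rough sketch of an as-found / as-left calibration decision for one
# measurement point. Field names and the zero-error-after-adjustment
# assumption are illustrative placeholders.

def calibrate_point(as_found_error, tolerance, adjustable, customer_approves):
    """Return a small report dict for one calibration point."""
    report = {"as_found": as_found_error,
              "in_tolerance": abs(as_found_error) <= tolerance}
    if report["in_tolerance"]:
        return report
    report["customer_notified"] = True       # out of tolerance either way
    if adjustable and customer_approves:
        report["as_left"] = 0.0              # assume adjustment removes the error
    elif customer_approves:                  # not adjustable: report a correction
        report["correction"] = -as_found_error
    return report

print(calibrate_point(0.5, 1.0, True, True))    # in tolerance, done
print(calibrate_point(2.0, 1.0, False, True))   # correction reported instead
```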

Something like a liquid-in-glass thermometer obviously is not adjustable. If it is outside acceptable limits, the calibration laboratory can report one or two correction values that are added to the measurements you make. One correction is the offset at the ice point (0.0 °C) that applies to all readings. Seen less often, the other may be an offset to account for non-linearity that may only affect readings in a certain range.
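Applying those reported correction values is simple arithmetic. A minimal sketch — the ice-point offset and the range-limited offset below are invented figures purely for illustration:

```python
# Applying thermometer corrections from a calibration report: a fixed
# ice-point offset on every reading, plus an optional extra offset that
# only applies over a limited range. All values here are made up.

ICE_POINT_CORRECTION = 0.2              # °C, added to every reading
RANGE_CORRECTION = (50.0, 100.0, -0.1)  # extra °C applied from 50 to 100 °C

def corrected(reading_c):
    low, high, extra = RANGE_CORRECTION
    value = reading_c + ICE_POINT_CORRECTION
    if low <= reading_c <= high:
        value += extra
    return round(value, 2)

print(corrected(25.0))   # 25.2 (ice-point offset only)
print(corrected(75.0))   # 75.1 (both corrections apply)
```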

If you dispose of a measuring instrument instead of calibrating and continuing to use it, you are also throwing out one other thing - the history of the measurement processes using that instrument. Periodic calibration assures the continued accuracy, uncertainty and measurement traceability of the measurement. By doing that, it also indirectly indicates the performance of your measurement system. It is not a direct relationship, since your environment is different from the calibration laboratory's, but it is still valuable information, and over time can help improve the characterization of your processes. It can also help you predict future performance - compare the values on the current report of calibration with all previous ones (an individuals process control chart is good for this) to detect any trends. Another possible benefit of analysing instrument history is the ability to reduce measurement uncertainty based on documented performance of the instrument.
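The individuals control chart mentioned above is straightforward to compute from the as-found values on successive calibration reports. A minimal sketch — the 2.66 constant is the standard individuals-chart factor for converting the average moving range into 3-sigma limits, and the error values are invented:

```python
# Individuals (I) chart limits from a meter's as-found errors across
# calibrations. The 2.66 factor is the standard constant that converts
# the average moving range into 3-sigma limits. Data are invented.

def i_chart_limits(values):
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

errors = [0.02, 0.01, 0.03, 0.02, 0.05]   # hypothetical as-found errors, volts
lcl, mean, ucl = i_chart_limits(errors)
out_of_control = [e for e in errors if not lcl <= e <= ucl]
print(f"mean={mean:.4f}, limits=({lcl:.4f}, {ucl:.4f}), flagged={out_of_control}")
```

A point outside the limits, or a steady drift toward one of them, is exactly the kind of trend that would prompt shortening the calibration interval before the meter actually fails.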

Cost is certainly a factor. When evaluating costs, though, look at ALL actual and potential costs. The cost of calibration is the most obvious and the one most people focus on, as it is cash out of your pocket. So is the cost of buying new calibrated instruments with certificates and data. The other things to consider are potential costs that can be avoided with a calibration program - the potential costs of product recalls, the potential liability costs, and so on.
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
Nicely put, Graeme. The "reason" for calibration I usually cite is that it answers the question "How do you know the instrument you are using is still measuring accurately?" "How do you know..." questions are the basis of audits, and they are for a good reason. Calibration to a national standard at a frequency that ensures the device does not get out of calibration - causing its data to be suspect - is how you answer the question.
 