Frequency of Measurement Systems Analysis to Ensure Suitability


Montserrat P

Frequency of MSA

Last week we had a QS 9000 audit. The finding was: "There is no objective evidence that MSA studies are updated in order to ensure the suitability of the measurement system (last record: 2000)." Minor.

I checked the QS 9000 manual and the MSA manual, and neither mentions the required frequency of MSA studies for the equipment. I would like to know your comments. Even so, we will start with the corrective action.
 

Montserrat P

Thanks!

I will greatly appreciate your support.

Also, I would like to know your comments regarding this...

Which concept is more important in the technical information of a scale: the readability or the tolerance?
I am asking because I was trying to find information about the tolerance of a scale, and all the technical information (from manufacturers and sales representatives) talks about readability but not about tolerance.

Thanks in advance. :)
 

Marc

I'm assuming that with respect to readability, you're talking about needle gages / instruments and the parallax effect. You could use an R&R to assess that and calculate % of tolerance. It should be very small with respect to tolerance.
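Something along these lines, with made-up numbers, is the usual way the % of tolerance comes out of an R&R (6 times the combined R&R sigma over the total tolerance band; older editions of the MSA manual used 5.15 instead of 6):

```python
# Rough sketch of % of tolerance from a gage R&R result.
# sigma_grr and tolerance below are made-up placeholder values.
sigma_grr = 0.0003        # combined R&R standard deviation, in grams
tolerance = 0.004         # total tolerance band, in grams (e.g. +/-0.002 g)

pct_tolerance = (6 * sigma_grr / tolerance) * 100
print(f"% of tolerance = {pct_tolerance:.1f}%")
# ~45% here; the usual rule of thumb is under 10% is good, over 30% is unacceptable.
```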
 

Montserrat P

Scales

Well, this is the issue:

I have a scale whose tolerance, according to our documents, is +/- 0.002 g (2 mg) for a checked weight of 500 g. However, reviewing the literature, I found information saying the tolerance is really 70 mg.

I asked one of the suppliers for that information, and he sent me data giving the precision (standard deviation) as +/- 0.001 g and the maximum readability as 0.001 g.

Therefore, my doubt is: is it possible to take the precision as the tolerance?

I am confused. On the other hand, I am still learning about everything related to calibration and MSA, so sorry for all my doubts, but I am trying to learn.

Thanks in advance

:confused: :confused:
 

Atul Khandekar

I think when you say 500 g +/- 2 mg, you are talking about the accuracy of the scale.

The precision of the scale / instrument is the smallest readable value, or the least count, of the scale. The smaller the least count, the more precise the instrument will be.

When you have a number of data points (say, measuring the same thing 20-30 times), you can use standard deviation as a statistical measure of precision.
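For example, with made-up repeat readings of a 500 g weight, something like this:

```python
# Illustration with made-up readings: standard deviation of repeated
# measurements as precision, and the offset from nominal as bias (accuracy).
import statistics

readings_g = [500.001, 500.000, 500.002, 500.001, 499.999,
              500.001, 500.000, 500.002, 500.001, 500.000]

precision_sd = statistics.stdev(readings_g)    # precision (repeatability)
bias = statistics.mean(readings_g) - 500.000   # accuracy (bias) vs. nominal

print(f"precision (std dev): {precision_sd:.4f} g")
print(f"bias vs nominal:     {bias:+.4f} g")
```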

Precision and accuracy are not the same.

For calibration purposes you may want to specify some tolerance as the acceptance criterion. The criterion can change according to the intended purpose of the measurement. For example, at the 100 g level, you may accept the scale if it measures within +/- 2 mg, even though it is capable of reading to a precision of 1 mg.
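As a rough sketch of that kind of acceptance check (the reading below is just a placeholder):

```python
# Pass/fail check at one calibration point: accept the scale at the 100 g
# level if its error is within +/- 2 mg. The reading is a placeholder value.
reference_g = 100.000    # certified value of the check standard
reading_g   = 100.001    # what the scale displayed
tolerance_g = 0.002      # acceptance criterion, +/- 2 mg

error_g = reading_g - reference_g
result = "PASS" if abs(error_g) <= tolerance_g else "FAIL"
print(f"{result}: error = {error_g * 1000:+.1f} mg, limit = +/-{tolerance_g * 1000:.0f} mg")
```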

An R&R study will help you quantify the various errors in the measurement system - errors due to operators, the instrument, methods, etc.

I hope I haven't added to the confusion.

-Atul
 

Ryan Wilde

Montserrat P,

(I've got your back on this one Atul)

Okay, putting all the little parts together on this one tells me (and I may be very wrong, but being wrong is one of my strongest traits):

You are checking a weight that has a tolerance of ±0.002 g. You are using a scale with a:

- Resolution of 0.001 g
- Precision of ± 1 count (±0.001 g)
- Absolute accuracy of ± 70 mg

Okay, from what I see, your scale could do ± 0.004 g, as long as you have a master weight that is ± 0.001 g and you use a relative measurement rather than an absolute measurement.
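Roughly, a worst-case sum along these lines is how I get there (treating the rounding as half a count on each of the two readings; the absolute accuracy drops out because it hits the master and the unknown the same way in a comparison):

```python
# Rough worst-case error budget for a relative (comparison) measurement.
# Back-of-the-envelope numbers, not a formal uncertainty analysis.
master_unc = 0.001       # uncertainty of the master weight, g
precision  = 0.001       # scale precision, +/-1 count, g, per reading
resolution = 0.001       # scale resolution, g (worst-case rounding = half a count)
readings   = 2           # one reading of the master, one of the unknown

worst_case = master_unc + readings * precision + readings * (resolution / 2)
print(f"worst-case error ~ +/-{worst_case:.3f} g")   # +/-0.004 g
```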

Let me know if I am way off-base, or close, or whatever.

Ryan
 