Error in MSA Manual - Discrimination - Bias Linearity Worksheet

A

AllenLee

Error in MSA Manual

Can anyone confirm whether the formula in the MSA manual, page 19, is correct or not: (Cp)²obs = (Cp)²actual + (Cp)²msa.

Thanks!
 
A

Atul Khandekar

AllenLee said:
Can anyone confirm whether the formula in the MSA manual, page 19, is correct or not: (Cp)²obs = (Cp)²actual + (Cp)²msa.

Thanks!

Here is the Errata published by AIAG.
 

Attachments

  • MSA3p19.pdf
    64 KB · Views: 829
A

AllenLee

Another point of confusion is about gauge selection. The manual says, "The general rule of thumb is that the measuring instrument discrimination ought to be at least one-tenth of the range (process variation) to be measured." However, it also says, "The ndc is truncated to the integer and ought to be greater than or equal to 5."

How should I understand those two sentences? Are they talking about the same thing? If not, what's the difference?
 
A

Atul Khandekar

AllenLee said:
Thanks very much! However, I don't have the authority to open the attached file. Could you please send me a copy?

My email is: [email protected]

OK, I'll email the file to you.

Here are a few links for you:

AIAG MSA Page:
https://www.aiag.org/publications/quality/msa3.asp

Changes to MSA-3 from MSA-2
https://www.aiag.org/forms/MSA_Changes.pdf

ALL the errata files can be downloaded from:
Errata for MSA Third Edition:
https://www.aiag.org/publications/quality/msa3_errata.asp

Frequently Asked Questions (FAQ)
https://www.aiag.org/publications/quality/msa_faq.asp

The new equation is:
(see the attached Math_Equation.gif)
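The attached image is not reproduced here, but the corrected relation can be derived rather than copied: measurement-system variance adds to actual process variance, and Cp is inversely proportional to the standard deviation, so it is the squared *reciprocals* of the Cp values that add, not the squares themselves. A derivation sketch (not a verbatim copy of the errata):

```latex
\sigma^{2}_{\mathrm{obs}} = \sigma^{2}_{\mathrm{actual}} + \sigma^{2}_{\mathrm{msa}},
\qquad
C_p = \frac{\mathrm{tolerance}}{6\sigma}
\;\Longrightarrow\;
\left(C_{p,\mathrm{obs}}\right)^{-2}
= \left(C_{p,\mathrm{actual}}\right)^{-2}
+ \left(C_{p,\mathrm{msa}}\right)^{-2}
```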


Hope this helps..
-Atul.
 

Attachments

  • Math_Equation.gif
    515 bytes · Views: 352
A

AllenLee

Thanks, Atul. I got it.

By the way, can anybody help me understand the following point of confusion? For gauge selection, the manual says, "The general rule of thumb is that the measuring instrument discrimination ought to be at least one-tenth of the range (process variation) to be measured." However, it also says, "The ndc is truncated to the integer and ought to be greater than or equal to 5."

How should I understand those two sentences? Are they talking about the same thing? If not, what's the difference?
 
A

Atul Khandekar

AllenLee said:
By the way, can anybody help me understand the following point of confusion? For gauge selection, the manual says, "The general rule of thumb is that the measuring instrument discrimination ought to be at least one-tenth of the range (process variation) to be measured." However, it also says, "The ndc is truncated to the integer and ought to be greater than or equal to 5."

How should I understand those two sentences? Are they talking about the same thing? If not, what's the difference?
They are not the same thing.

There is a general rule of thumb that says your gage should have a least count of at least 1/10th of the expected tolerance range (or sometimes the expected process variation). That means your gage should be able to discriminate the data into at least 10 'baskets' within the tolerance zone.

NDC (Number of Distinct data Categories) is calculated after you know the errors (typically GRR) in the measurement system (not just the gage). Because of the GRR error, the number of classes into which the data can be classified is reduced, which reduces the discriminating power. A good measurement system should be able to classify the data into at least 5 categories if you want to use it to center the process within the central 20% of the tolerance zone (or process spread).
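The ndc calculation described above can be sketched in a few lines. The formula (ndc = 1.41 × PV/GRR, truncated to the integer) is the standard one from the MSA manual; the PV and GRR values below are made-up numbers for illustration only.

```python
# Sketch: number of distinct data categories (ndc) from a GRR study.
# MSA formula: ndc = 1.41 * (PV / GRR), truncated to the integer.

def ndc(pv, grr):
    """Number of distinct categories, truncated to the integer."""
    return int(1.41 * pv / grr)

# Hypothetical study results, for illustration:
pv = 0.0025   # part-to-part variation
grr = 0.0006  # combined gage repeatability & reproducibility

categories = ndc(pv, grr)
print(categories)        # -> 5
print(categories >= 5)   # acceptance check: ndc should be >= 5
```

Note that ndc depends on the measurement system's error relative to part variation, not on the gage's least count, which is why it is a separate check from the 10-to-1 rule.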

Also: https://elsmar.com/elsmarqualityforum/threads/5922/
 
E

Emil Signes

discrimination in MSA 3rd edition

This is the first time I'm posting to this group, so I'm not sure I'm doing this right. At any rate, I'm still not clear on the "rule of thumb" that the measurement instrument discrimination ought to be at least 1/10 of the range to be measured. Our tensile testing machine reads to the nearest 0.1 pound, so, by the most simplistic reasoning, we should be able to hold a tolerance of 1 pound. On the other hand, there is the phrase "amount of change from a reference value that an instrument can detect AND FAITHFULLY indicate."

When calibrated by an accredited lab (it passed), the greatest verification error was 0.69%: the verification reading for a 50 pound machine reading was 50.1; at 495.8 pounds, it was 493.7 pounds; etc. How do I put this information to use in demonstrating that our system can discriminate to the level we need it to? (At low values of strength, our specified tolerance is about 12 pounds (70 to 82 pounds); at higher levels, our tolerance is about 90 pounds (340 to 430 pounds).)
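The verification errors quoted above can be reproduced with a quick percent-error calculation (the readings are taken from the post; the helper function is just for illustration):

```python
# Percent error of a calibration reading relative to its reference value.

def pct_error(reading, reference):
    return abs(reading - reference) / reference * 100.0

# Readings quoted in the post above:
print(round(pct_error(50.1, 50.0), 2))    # -> 0.2  (percent)
print(round(pct_error(493.7, 495.8), 2))  # -> 0.42 (percent)
```

Both values sit below the 0.69% maximum the lab reported, and well below the roughly 14% that a 12-pound tolerance on a 70–82 pound reading represents.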

Based on the errors reported above, I'm sure the equipment discriminates, but just what kind of summary report do I use to show it?
 
A

Al Dyer

10-1 rule of thumb example:

If a gage is used to measure the width of an item that is designated:

5.3 inches ----- gage discrimination would need to be .01
5.03 inches ---- gage discrimination would need to be .001
5.003 inches --- gage discrimination would need to be .0001
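The table above is the 10-to-1 rule applied to the last decimal place of the spec: the gage's least count should divide the implied increment into at least ten parts. A minimal sketch (the function name is mine, not from the MSA manual):

```python
# 10-to-1 rule of thumb: the gage's least count should be no coarser than
# one-tenth of the tolerance (or last-digit increment) being measured.

def required_discrimination(tolerance):
    """Least count needed so the tolerance spans at least 10 increments."""
    return tolerance / 10.0

# A spec written as 5.3 implies a last-digit increment of 0.1,
# so the gage should read to 0.01:
print(round(required_discrimination(0.1), 4))    # -> 0.01
print(round(required_discrimination(0.01), 5))   # -> 0.001
```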

Documentation of your situation could be placed on some type of linearity report.

Al...
 

Attachments

  • Bias Linearity Worksheet.xls
    27 KB · Views: 1,349