Help needed in choosing the method of calculating the minimum sample size

Steve Prevette

Deming Disciple
Leader
Super Moderator
"For this AQL criteria like 11-17 ppm it can be longer for example 1 month or 1 year." - True, that is the point. Is that an acceptable risk? You have to answer that question. If it is NOT acceptable, you need to look at alternatives like the ones I listed, and I am sure there are plenty more options. That is life: you need to go in with your eyes open and either accept the risk, or do what it takes (redesign, better testing, more testing, other mitigations) to get the risk to an acceptable level.

scooon

Registered
The main question (at least for a go/no-go inspection plan) is how long you can go without determining there is a problem. You can look at the operating characteristic (OC) curve for the tests in terms of balancing the failure-to-detect and false-alarm rates. There was an effort in the MIL-STD-105 series (which has since been superseded by an ISO standard) to do that balancing based upon the stated AQL and the seriousness of the defect being tested for. There are also double (two-stage) sampling plans and other nuances available.
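The balancing described above can be made concrete by computing a plan's OC curve directly. A minimal sketch in Python, assuming a purely hypothetical single-sample attribute plan of n = 125 pieces with acceptance number c = 0 (not a plan from the MIL-STD-105 tables):

```python
# Operating characteristic (OC) curve for a single-sample attribute plan:
# P(accept lot) as a function of the true defect rate p, for a hypothetical
# plan of n = 125 samples with acceptance number c = 0.
from math import comb

def accept_prob(p, n=125, c=0):
    """Binomial probability of seeing c or fewer defects in n samples."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

for p in (0.001, 0.01, 0.02, 0.05):
    print(f"true defect rate {p:.3f}: P(accept) = {accept_prob(p):.3f}")
```

Reading the curve at the AQL gives the producer's risk (good lots rejected); reading it at a worse quality level gives the consumer's risk (bad lots accepted), which is the balance the standard's tables encode.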

You might want to consider automation in the testing routine. The ability to automatically sort fruit is amazing.

One thing that CAN significantly cut down on sample size is to shift from go/no-go to an actual measurement, such as a dimension, weight, or resistance. With that you can do SPC on the data and detect a shift much faster, before it hits the rejection criteria. But the reality of life (and statistics) is that if you are testing for a very low failure rate, it will take a LOT of samples. If it is continuous sampling, you can stretch out the sampling, but that means a longer time to detect a problem.
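As a rough illustration of why measurement data detects a shift sooner than go/no-go counts, here is a minimal individuals (X/mR) control chart sketch in Python; all data values are invented for illustration:

```python
# Individuals (X) control chart on a measured characteristic, with sigma
# estimated from the average moving range (classic Shewhart approach).
# All numbers below are invented example data.
baseline = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02, 9.98, 10.00]

mean = sum(baseline) / len(baseline)
mr = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
mr_bar = sum(mr) / len(mr)
sigma_est = mr_bar / 1.128          # d2 constant for moving ranges of size 2
ucl = mean + 3 * sigma_est
lcl = mean - 3 * sigma_est

# A small upward drift trips the 3-sigma limit long before any individual
# part would fail a (much wider) go/no-go specification.
new_points = [10.04, 10.06, 10.09]
for x in new_points:
    flag = "OUT" if (x > ucl or x < lcl) else "ok"
    print(f"{x:.2f}  [{flag}]  limits = ({lcl:.3f}, {ucl:.3f})")
```

The third point signals even though it may still be well inside the specification, which is exactly the "detect the shift before it hits rejection criteria" advantage of variables data.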
I need to detect, for example, more than 10 ppm in a yearly statement. For other defect significance classes it is not as exact, because the limit is much higher.

Steve Prevette

Deming Disciple
Leader
Super Moderator
I need to detect, for example, more than 10 ppm in a yearly statement. For other defect significance classes it is not as exact, because the limit is much higher.

So that is a failure rate of 1 in 100,000. That means at the very least you need to go/no-go sample 100,000 items in a year. For 90% confidence that the failure rate is below 10 ppm, you would need to sample about 230,000 over the course of the year and find zero failures.
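The 230,000 figure follows from the standard zero-failure demonstration bound, (1 - p)^n <= 1 - C: the chance of seeing no failures in n samples must fall below 1 minus the desired confidence. A quick check in Python:

```python
# Zero-failure demonstration sample size: the number of go/no-go samples
# needed so that, if ALL of them pass, you are C-confident the true
# failure rate is below p. Solves (1 - p)^n <= 1 - C for n.
from math import ceil, log

def zero_failure_n(p, confidence):
    return ceil(log(1 - confidence) / log(1 - p))

# 10 ppm at 90% confidence -> roughly 230,000 samples
print(zero_failure_n(10e-6, 0.90))
```

For small p this is close to the "rule of three"-style approximation n ≈ -ln(1 - C)/p, i.e. about 2.3/p at 90% confidence, which is where the factor of 2.3 over the bare 100,000 comes from.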

This is the issue with such a very low failure rate, in the ppm range. It points out that unless you have a very cheap way to do the go/no-go test, it is very impractical. You are pretty well stuck at 100% inspection (unless you make millions of these items), and even then the inspection itself may have its own roughly 10 ppm rate of failing to detect a defect. Unless, again, there is a way to automate the testing.

I do suggest the only practical way to demonstrate that failure rate is to build quality into the entire production process, with monitoring of many potential failure points from raw materials through final assembly. And use measurement analysis (SPC / six sigma) rather than go no-go. This is the issue the military faces - how do you build a weapons system with very low failure rates? Especially if we are talking the construction of one aircraft carrier or maybe a dozen nuclear submarines.