Tim Folkerts
Trusted Information Resource
Antiquated Sampling Standards?
I have been learning a bit about sampling plans, and the more I look, the poorer the statistics involved seem to be. Worse, most people involved with sampling have little idea what the plans say about the actual quality associated with them. In this day and age of computers on every desktop, why not scrap paper tables like Z1.4 or MIL-STD-105E and simply go to a software-based system where the sampling plan is generated based on your needs?
Why not replace the paper standards with an electronic standard - a set of equations implemented in a simple computer program that tells you just what you need, based on values you input (alpha, beta, AQL, RQL, lot size)?
Currently you say something like "Normal, Level II, AQL = 1". Then you have to look up in a table the sample size & acceptance number for your lot size, but most people still have little idea how this translates into actual statistics. (And I can tell you there is little consistency in the statistics when you start plugging through the numbers.)
Instead, you could say something like "AQL = 1, RQL = 5 with 5% risk" (i.e. there is no more than a 5% chance of rejecting a lot with 1% defective and no more than a 5% risk of accepting a lot with 5% defective).
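To make the idea concrete, here is a rough sketch of what such an "electronic standard" could compute. The function names and the brute-force search are my own illustration, not from any published standard; it just finds the smallest single-sampling plan (n, c) whose binomial acceptance probabilities meet the stated producer's and consumer's risks:

```python
from math import comb

def binom_cdf(c, n, p):
    """P(X <= c) for X ~ Binomial(n, p): probability of accepting a lot
    with fraction defective p under plan (sample n, accept up to c)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def find_plan(aql, rql, alpha=0.05, beta=0.05, max_n=2000):
    """Smallest single-sampling plan (n, c) such that
       P(accept | p = aql) >= 1 - alpha   (producer's risk)
       P(accept | p = rql) <= beta        (consumer's risk).
    Returns (n, c), or None if no plan exists up to max_n."""
    for n in range(1, max_n + 1):
        for c in range(n + 1):
            if binom_cdf(c, n, aql) >= 1 - alpha:
                # c is the smallest acceptance number meeting the
                # producer's side for this n; check the consumer's side.
                if binom_cdf(c, n, rql) <= beta:
                    return n, c
                break  # a larger c only raises P(accept) at rql
    return None
```

Feed it "AQL = 1, RQL = 5 with 5% risk" as find_plan(0.01, 0.05) and it spits out the plan directly - no table lookup, and the risks you asked for are the risks you get.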
For those who like names, you could call alpha = beta = 5% "normal". "Tightened" could be alpha = 20%, beta = 1%; "reduced" could be alpha = 1%, beta = 20%.
For those who like tables, it would be simple to generate a set of "standard" tables.
This still doesn't give the whole OC curve, but it comes a lot closer to letting you know what your sampling plan is really doing for you.
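And even the full OC curve is only a few lines of code once the plan is in hand. A minimal sketch, assuming the usual single-sampling binomial model (the plan n = 125, c = 3 below is just a made-up example, not a recommendation):

```python
from math import comb

def accept_prob(n, c, p):
    """One point on the OC curve: P(accept lot) = P(X <= c),
    X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Trace the OC curve for a hypothetical plan n = 125, c = 3:
for p in (0.005, 0.01, 0.02, 0.03, 0.05):
    print(f"p = {p:.3f}  P(accept) = {accept_prob(125, 3, p):.3f}")
```

So the software could just as easily show you the whole curve as the two points you specified.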
Tim
