DPMO calculation for test stations - ideal way? High-volume PCB manufacture

  • Thread starter: Sushil

Sushil
Electronic high-volume PCB assembly environment: I am looking for the best methods to transform test data into DPMO (Defects Per Million Opportunities) values. The test stations detect whether a board passes or fails, and can also output a symptom code indicating why the board failed (about 50 symptom codes, more for some boards). Test times vary from 1 to 2 hours. The intention is to drive improvement in assembly based on the defects detected. There are terminologies like FPY (first-pass yield), and there are cases where a board is re-tested a number of times, provided the rework/debugging doesn't involve a circuit improvement.
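As an illustration of computing FPY from raw test logs, here is a minimal Python sketch; the record layout (serial number, attempt number, pass flag, symptom code) is invented for the example:

```python
from collections import Counter

def first_pass_yield(records):
    """Compute FPY from test records.

    Each record is a (serial, attempt, passed, symptom_code) tuple;
    attempt 1 is the first time the board hits the test station.
    """
    first_attempts = [r for r in records if r[1] == 1]
    passed = sum(1 for r in first_attempts if r[2])
    return passed / len(first_attempts)

def symptom_pareto(records):
    """Count symptom codes across all failing attempts."""
    return Counter(r[3] for r in records if not r[2])

records = [
    ("SN001", 1, True,  None),
    ("SN002", 1, False, "POST_110"),
    ("SN002", 2, True,  None),       # re-test after rework
    ("SN003", 1, False, "POST_105"),
]
print(first_pass_yield(records))     # 1 of 3 first attempts passed
print(symptom_pareto(records).most_common())
```

The key point is that only attempt 1 counts toward FPY; re-tests after rework affect throughput but not the first-pass number.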
 

Maybe I'm reading more into this than necessary but here goes.

I wouldn't necessarily use DPMO for this application, or for many applications for that matter. Regardless of whether there is one defect on the board or three, if the board doesn't pass test, it is rejected. So your "First-Pass Yield" would be based on simple Pass/Fail criteria.

One thing to look at would be to see where you can combine the "Symptom" codes into groups. For example, a component can be shorted or it can be open, but in both circumstances it is still faulty. A solder joint can be cold, or maybe even missing, hence a group dubbed "Soldering". You might end up with groups like this:

Faulty Component - Open, Shorted, Wrong Value
Soldering - Cold, Missing, Insufficient Coverage
Board Defect - Cracked, Broken
Etc.

A problem with having 50 different categories is that results can be so spread out that nothing stands out. Create a Pareto chart from the main headings. Once things are categorized into this high-level chart, create a secondary Pareto chart for the categories with the highest defect rates using the sub-categories (e.g., Open, Shorted, Wrong Value).
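The grouping and two-level drill-down described above can be sketched like this (the symptom-to-group mapping is illustrative, not from any standard):

```python
from collections import Counter

# Illustrative mapping from detailed symptom codes to high-level groups
GROUPS = {
    "open": "Faulty Component", "shorted": "Faulty Component",
    "wrong_value": "Faulty Component",
    "cold_joint": "Soldering", "missing_joint": "Soldering",
    "insufficient_coverage": "Soldering",
    "cracked": "Board Defect", "broken": "Board Defect",
}

def two_level_pareto(symptoms):
    """Return (group counts, per-group sub-counts), sorted for Pareto charts."""
    top = Counter(GROUPS.get(s, "Other") for s in symptoms)
    sub = {}
    for s in symptoms:
        g = GROUPS.get(s, "Other")
        sub.setdefault(g, Counter())[s] += 1
    return top.most_common(), sub

symptoms = ["cold_joint", "open", "cold_joint", "cracked", "shorted", "missing_joint"]
top, sub = two_level_pareto(symptoms)
print(top)   # Soldering leads with 3, then Faulty Component with 2
```

Codes that don't map to a group fall into "Other", which keeps the top-level chart honest while the mapping is being built out.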

A simple histogram charting the number of defects found on a board and/or the number of items replaced on a board (which could show the effectiveness of the technicians' troubleshooting) would be very applicable.

Another item to look at is how accurately the symptom codes generated match what is actually done to the board to return it to a usable condition.

It would be great to have some more information or examples to go by.

Wayne
 
Concur with wmarhel; be careful using DPMO.

A couple of past experiences.

I was mentoring a Black Belt on a project where they were improving a process called detail check. There, the CAD system produced a drawing and the "detail check" team looked for errors in the design implementation prior to passing over to manufacturing to wire up the circuitry.

Before I was involved, the improvement team decided that they would obtain a baseline DPMO. So they said each drawing had so many wires on it (one opportunity either to be there or not); each wire had a start point (opp), a termination (opp), a wire coding number (opp), and a color code (opp), so essentially each wire on the drawing ended up with 5 or 6 opportunities for error. OK, that meant each drawing had several hundred opportunities. They calculated a few hundred DPMO from their sample.
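With made-up numbers of the same shape as that project, the two counting choices can be contrasted:

```python
def dpmo(defects, units, opps_per_unit):
    """DPMO = 1e6 * defects / (units * opportunities per unit)."""
    return 1_000_000 * defects / (units * opps_per_unit)

# Hypothetical sample: 50 drawings, 60 wires each, 5 "opportunities" per wire
drawings, wires, opps_per_wire = 50, 60, 5
errors = 6  # errors found in the sample

per_opp = dpmo(errors, drawings, wires * opps_per_wire)  # team's counting
per_drawing = dpmo(errors, drawings, 1)                  # customer's view
print(per_opp)      # 400.0  -> "a few hundred DPMO", looks world-class
print(per_drawing)  # 120000.0 -> 12% of drawings unusable
```

Same data, three orders of magnitude apart: the denominator, not the process, is doing the work.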

When the team took that result to manufacturing, they almost got thrown out of the building. "No WAY are you that good!" they said. The point is that you must define the opportunities in view of the customer. In this case the wiring guys did not give a tinker's dang if there were 10,000 opportunities for error that were done correctly. ONE error made the drawing not suitable for their use.

Another story on defect counts: I did a project looking at a packaging line. The inspection of the package had a quality plan with, as I remember, 77 defects called out, e.g., "PRINT MISSING CHARACTERS", "PRINT SMUDGED", and a bunch of others were all separate defect categories. Please imagine an operator trying to inspect to that list, and the daily arguments of the "IT IS A DEFECT!" / "IT IS NOT!" variety. All of this is NVA (non-value-added) work.

When our team finished we had combined them into only a total of 17 categories e.g. all the Print ones were now "Print Unreadable", with some decent operational definitions around them. Still a lot, but much better.

The bottom line is: if you will not do something with the minute symptom codes, look at combining them to simplify and get more resolution in your first-order Pareto. If you are using the lower-level data to drive improvements, then I applaud you, but in such cases it has been my experience that the data is just recorded and nobody ever looks at it again, because individually the codes are not seen as problems worthy of effort.

My advice is to stick with PASS/FAIL, as the previous poster also indicated, to drive your initial improvement efforts.

A legitimate use of DPMO is to compare two products of dissimilar complexity, like a circuit board with 100 components versus one with 1000. This can be the first cut at deciding what to work on as a priority.
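A quick sketch of that comparison, with invented defect and component counts:

```python
def dpmo(defects, units, opps_per_unit):
    """DPMO = 1e6 * defects / (units * opportunities per unit)."""
    return 1_000_000 * defects / (units * opps_per_unit)

# Two hypothetical boards: same raw defect count, very different complexity
simple_board  = dpmo(defects=30, units=1000, opps_per_unit=100)   # 100 components
complex_board = dpmo(defects=30, units=1000, opps_per_unit=1000)  # 1000 components
print(simple_board)   # 300.0
print(complex_board)  # 30.0
```

Identical raw defect counts, but per opportunity the complex board's process is running ten times cleaner, which is exactly the prioritization signal complexity-normalized DPMO is good for.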

But to improve this type of process, I'd be very careful of DPMO where the "O" is anything except 1, that 1 being what the customer wants. Otherwise, you may delude yourself that you are doing better than you actually are.
 
wmarhel said:
I wouldn't necessarily use DPMO for this application... So your "First-Pass Yield" would be based on simple Pass/Fail criteria.

Wayne, thanks. As I understand from you, DPMO tracking is not advisable at testing stations. I was recently referring to two IPC standards, IPC-7912A and IPC-9216A, which both reference DPMO calculations in a PCB assembly environment; they categorize defect opportunities as termination, component, assembly, placement, and overall-index opportunities. Do you advise tracking at this level, which IPC also strongly recommends? In layman's terms, on the shop floor this wouldn't be a good indicator for driving improvements, especially from test stations. Further, this would require debug data also to be fed in to have a real-time DPMO displayed. Is this method adoptable in assembly environments?

As an example, I have a test named "POST test" which has several defect codes under it: POST_110, POST_90, POST_105, each with its own reasons to fail within the circuit. Even though the diagnostic routine calls for the POST test, the test script writes the next-level defect code in the log, i.e., board SN 999 fails for POST_110, while board SN 1000 could fail for POST_105. The defect Pareto charts and the test engineers in the line also use the POST_XXX defect codes to find the top hitters and identify process/component-related issues. The issue is that you have too many defect codes to track, even though we restrict them to the top 10 from the Pareto.

Going by a PASS/FAIL category could be too high-level. We are in a fix now about which method to adopt, and whether the method adopted will suit the test environment for the TEs to drive improvements.
 
Sushil said:
Wayne, thanks. As I understand from you, DPMO tracking is not advisable at testing stations... Going by a PASS/FAIL category could be too high-level.


Hi,

I can't speak to the IPC standards as I'm not familiar with them. You are correct, though: I would not use DPMO at the testing station; a simple pass/fail criterion would suffice here.

I have used real time data reporting on a manufacturing floor for electromechanical components. I had it tied into my production monitors so I knew how many pieces were produced (to takt time) and how many defective units had failed first pass. If the defects reached a certain point, we would print out the detailed defect data and respond accordingly.

If you have a line of code in your test software that writes the specific defect found, would it be possible to insert a few lines to also write a "group defect" into the report? Again, the trick is to try to identify the biggest area of opportunity, and not to try to solve all the problems at once, which won't get you very far in the long run. Here's a simple way to look at it:

At the end of the day you ran 1000 parts, of which 100 failed for various defects. The defect breakdown is as follows: 20 missing solder joint, 15 cold solder joint, 10 excessive solder, 20 defective component, 20 damaged board, 10 missing component, 5 bad trace.

From here we can see that the biggest area of opportunity is soldering-related defects, which are responsible for 45% of the total. If we were to take each of these defect codes individually, the breakdown would look like this:

Missing Solder Joint - 20%
Defective Component - 20%
Damaged Board - 20%
Cold Solder Joint - 15%
Excessive Solder - 10%
Missing Component - 10%

The PASS/FAIL criterion is there to help you identify whether or not the process yield is acceptable. If not, then you begin the drill-down process. Essentially it is asking yourself "Why?" five times, but with the use of data. In the above example you'd have three areas tied for the top spot. Which one do you attack? By grouping defects which have some sort of common thread, you may find that the solution for "Cold Solder Joints" also helps to address "Missing Solder Joints" as well.
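Using the example numbers above, the grouping step might look like this in Python (the group assignments follow the example's logic):

```python
from collections import Counter

# Defect counts from the worked example: 1000 parts run, 100 failures
defects = Counter({
    "Missing Solder Joint": 20, "Cold Solder Joint": 15, "Excessive Solder": 10,
    "Defective Component": 20, "Damaged Board": 20, "Missing Component": 10,
    "Bad Trace": 5,
})
groups = {
    "Missing Solder Joint": "Soldering", "Cold Solder Joint": "Soldering",
    "Excessive Solder": "Soldering",
    "Defective Component": "Component", "Missing Component": "Component",
    "Damaged Board": "Board", "Bad Trace": "Board",
}

grouped = Counter()
for code, n in defects.items():
    grouped[groups[code]] += n

total = sum(defects.values())
for g, n in grouped.most_common():
    print(g, f"{100 * n / total:.0f}%")
# Soldering 45%, Component 30%, Board 25% -- soldering is the clear first target
```

Individually, three codes tie at 20%; grouped, soldering stands out at 45%, which is the whole point of rolling codes up before drawing the Pareto.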

Long term, DPMO may be appropriate when designing new processes but I don't feel it will help identify and fix problems in the production area as they occur.

I'm sure there will be others out there who have a different take, and that's fine too. At the end of the day we all need to find what works best for us and the customer. One of my biggest peeves is that there is often a tendency to jump into the higher-level tools because they appear "sexier" than sticking to the basics. But again, use what works and what will give you the best results.

Regards,

Wayne
 
Dave Strouse said:
A couple of past experiences.

When the team took that result to manufacturing, they almost got thrown out of the building. "No WAY are you that good!" they said. The point is that you must define the opportunities in view of the customer. In this case the wiring guys did not give a tinker's dang if there were 10,000 opportunities for error that were done correctly. ONE error made the drawing not suitable for their use.

... But to improve this type of process, I'd be very careful of DPMO where the "O" is anything except 1, that 1 being what the customer wants. Otherwise, you may delude yourself that you are doing better than you actually are.

My files include some definitions of "opportunities" for DPMO calculations. The article was titled Some Guidelines for Counting Opportunities for Defects. I cannot find the original source but it may be from ASQ's Six Sigma Forum. Here are the definitions and a few other comments from the article.

"According to the ASQ Six Sigma Forum’s own glossary:

Opportunity: Any area of a product, process, or service that must be right to achieve customer satisfaction. Only operations that add value and have a direct connection to customer CTQs contain opportunities.

Opportunity for error or defect: An observable, measurable opening for a defect to occur, either in a final outcome or within processes leading toward the final outcome. Anything undesirable from a customer point-of-view."

The focus in counting an opportunity is whether it is critical to quality (CTQ) for the customer. Overstating the opportunities actually increases the possibility of wasting resources fixing problems that are not important.

The author gave some examples, such as counting a supplied part as one opportunity (correct or incorrect), one opportunity for each entry on a form, and a single tool making five cuts counting as five opportunities.

It sounds like Dave's descriptions were consistent with these definitions.

Bill Pflanz
 
Go for variable data to maximize information instead of DPMO

DPMO is said to be a beginner's choice in Six Sigma. DPMO calculated from attribute data can vary a lot from day to day and often makes you wonder what is happening.

To understand what is really happening, select a variable metric, confirm a satisfactory measurement system, and focus on control charts to give you more insight into the process.
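As one sketch of that approach, here are individuals-chart (I-chart) control limits for a variable test metric such as a measured voltage (the data is invented):

```python
def i_chart_limits(values):
    """Individuals-chart limits: mean +/- 2.66 * average moving range.

    The 2.66 constant is the standard I-chart factor (3 / d2, d2 = 1.128).
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

# Hypothetical measured rail voltages from consecutive boards at test
voltages = [3.31, 3.29, 3.33, 3.30, 3.28, 3.32, 3.30]
lcl, center, ucl = i_chart_limits(voltages)
print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}")
```

Charting the measured value instead of the pass/fail outcome shows drift long before boards start failing the spec limits.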
 
Please clarify

Arvind,
Thanks.

Can you please elaborate on this? E.g., in a PCBA environment, solder quality is either good or bad, a component is missing or damaged, etc.; these are some of the defect codes at the process level, so this is attribute-type data. How can I convert this to a variable data type, and which alternative do you suggest instead of DPMO?

Sushil
 
Thanks for your opinions, guys.
Now I have an example and want to get an answer.

I have a production line that finishes 1000 units; final inspection finds 40 parts with a total of 80 defects. There are 3 assembly processes on this line: the 1st process has 2 kinds of defects, the 2nd process has 1 kind of defect, and the 3rd process has 3 kinds of defects.
So the DPU is 80/1000 = 0.08,
but is the DPO 0.08/6 = 0.0133 or 0.08/3 = 0.0267?

Which one is right? Thanks.

Michael
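For reference, a sketch of the arithmetic under the usual Six Sigma definitions, treating the 2 + 1 + 3 = 6 distinct defect types as the opportunities per unit:

```python
defects, units = 80, 1000
dpu = defects / units                  # defects per unit
print(dpu)                             # 0.08

# Opportunities per unit: 2 + 1 + 3 = 6 distinct defect types across
# the three assembly processes (the conventional counting)
opps_per_unit = 6
dpo = defects / (units * opps_per_unit)
print(dpo)                             # ~0.0133
print(dpo * 1_000_000)                 # DPMO of roughly 13,333
```

Note that, per the cautions earlier in the thread, whether 6 is the right opportunity count depends on what the customer actually cares about.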
 