Sushil said:
Wayne, thanks. As I understand from you, DPMO tracking is not advisable at testing stations. I was recently referring to two IPC standards, i.e. 7912A and 9216A, which both reference DPMO calculations in a PCB assembly environment. They categorise defect opportunities as termination, component, assembly, placement, and overall-index opportunities. Do you advise tracking at this level, which IPC also strongly recommends? In layman's terms, though, on the shop floor this wouldn't be a good indicator for driving improvements, especially from test stations. Further, this would require debug data also to be fed in to have a real-time DPMO displayed. Is this method adoptable in assembly environments?
As an example, I have a test named "POST test" which has several defect codes under it (POST_110, POST_90, POST_105), each with its own reasons to fail within the circuit. Even though the diagnostic test routine calls for the POST test, the test script writes the next-level defect code in the log.
I.e., board Sl# 999 fails for POST_110, while board Sl# 1000 could fail for POST_105. The defect Pareto charts/test engineers in the line also use the POST_XXX defect codes to find the top hitter and identify the process/component-related issues. But the issue is that you have too many defect codes to track, even though we restrict them to the top 10 from the Pareto.
Going by a PASS/FAIL category could be too high-level. We are in a fix now over which method to adopt, and whether the adopted method will prove suitable in the test environment for the TEs to drive improvements.
Hi,
I can't speak to the IPC standards as I'm not familiar with them. You are correct, though; I would not use DPMO at the testing station. A simple Pass/Fail criterion would suffice here.
I have used real time data reporting on a manufacturing floor for electromechanical components. I had it tied into my production monitors so I knew how many pieces were produced (to takt time) and how many defective units had failed first pass. If the defects reached a certain point, we would print out the detailed defect data and respond accordingly.
If you have a line of code in your test software that writes the specific defect found, would it be possible to insert a few lines to write a "group defect" into the report? Again, the trick is to try to identify the biggest area of opportunity, and not to try to solve all the problems at once, which won't get you very far in the long run. Here's a simple way to look at it:
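As a rough sketch of that idea (using the POST_XXX codes from the question; the group labels and the mapping itself are my own illustrative assumptions, not your actual test software), the logging step might append a group defect alongside the specific code:

```python
# Hypothetical mapping from specific defect codes to "group defects".
# Which POST_XXX code belongs to which group is assumed here purely
# for illustration.
DEFECT_GROUPS = {
    "POST_90": "SOLDER",
    "POST_105": "SOLDER",
    "POST_110": "COMPONENT",
}

def log_failure(serial, defect_code):
    """Build the log entry: serial, specific code, and its group defect."""
    group = DEFECT_GROUPS.get(defect_code, "OTHER")
    return f"{serial},{defect_code},{group}"

# Board Sl# 999 failing for POST_110, as in the example above:
print(log_failure(999, "POST_110"))
```

The Pareto at the group level then comes for free from the third column, while the specific code is still in the log when the test engineer needs to drill down.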
At the end of the day you ran 1,000 parts, of which 100 failed for various defects. The defect breakdown is as follows:
20 - missing solder joint
15 - cold solder joint
10 - excessive solder
20 - defective component
20 - damaged board
10 - missing component
5 - bad trace
From here we can see that the biggest area of opportunity is with soldering-related defects, which are responsible for 45% of the defects (20 + 15 + 10 = 45 out of 100). If we were to take each of these defect codes individually, the breakdown of the top six would look like this:
Missing Solder Joint - 20%
Defective Component - 20%
Damaged Board - 20%
Cold Solder Joint - 15%
Excessive Solder - 10%
Missing Component - 10%
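To make the grouping point concrete, here is a small sketch that tallies the example counts above by common thread. The group labels ("soldering", "component", and so on) are my own assumptions about which defects share a root cause; your own grouping would depend on the process.

```python
from collections import Counter

# Defect counts from the example: 100 failures out of 1,000 parts run.
counts = {
    "missing solder joint": 20,
    "cold solder joint": 15,
    "excessive solder": 10,
    "defective component": 20,
    "damaged board": 20,
    "missing component": 10,
    "bad trace": 5,
}

# Illustrative grouping of defects that share a common thread.
groups = {
    "missing solder joint": "soldering",
    "cold solder joint": "soldering",
    "excessive solder": "soldering",
    "defective component": "component",
    "missing component": "component",
    "damaged board": "handling",
    "bad trace": "fabrication",
}

total = sum(counts.values())
group_totals = Counter()
for defect, n in counts.items():
    group_totals[groups[defect]] += n

# Print the group-level Pareto, biggest hitter first.
for group, n in group_totals.most_common():
    print(f"{group}: {n} ({100 * n / total:.0f}%)")
```

Run against these numbers, soldering comes out on top at 45%, even though no single defect code exceeds 20% on its own, which is exactly why the individual-code Pareto ends in a three-way tie.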
The PASS/FAIL criterion is there to help you identify whether or not the process yield is acceptable. If not, then you begin the drill-down process. Essentially it is asking yourself "Why?" five times, but with the use of data. In the above example you'd have three areas tied for the top spot. Which one do you attack? By grouping defects which have some sort of common thread, you may find that the solution for "Cold Solder Joints" also helps to address "Missing Solder Joints" as well.
Long term, DPMO may be appropriate when designing new processes but I don't feel it will help identify and fix problems in the production area as they occur.
I'm sure there will be others out there who have a different take, and that's fine too. At the end of the day we all need to find out what works best for us and the customer. One of my biggest peeves is that there is often a tendency to jump into some of the higher-level tools because they appear "sexier" than sticking to the basics. But again, use what works and what will give you the best results.
Regards,
Wayne