-----Begin Snip-------
Date: Fri, 7 Apr 2000 21:03:35 +1000
From: "John McConnell"
Subject: Simple Stuff - Reduce Variation
This six sigma stuff seems to have got more than a little out of control.
A long time ago, just after the earth's crust had formed, I attended a seminar in Sydney where a bunch of Motorola folks told us about six sigma.
The essence of the approach we saw was this: in a complex system, even if every step in the process, as well as every component and item of raw material, performs at three sigma (just meeting specifications), first pass yield falls to nearly nil after only a few hundred events (an event is a process step or an item of raw material). A simple factorial calculation was used to demonstrate this. Another showed that after 10,000 events a six sigma system (variation halved for every event) produced 99.99% first pass yield.
This model, it was explained, assumed perfect stability for every event. To approximate a world where stuff went wrong occasionally, the factorials were repeated where the average of each event was allowed to drift by plus or minus 1.5 sigma.
The 3 sigma system collapsed very quickly. The 6 sigma system produced 96 or 97% first pass yield after 10,000 events.
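The arithmetic behind that demonstration can be sketched in a few lines. This is my own reconstruction, not Motorola's original calculation: it assumes each "event" passes when a normally distributed result lands inside symmetric spec limits, and multiplies the per-event probabilities together to get rolled first pass yield.

```python
# Sketch (my own, illustrative) of the rolled-yield arithmetic described
# above: per-event pass probability from the normal distribution, raised
# to the power of the number of events.
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def event_yield(capability_sigma, mean_shift=0.0):
    """Probability one event lands inside +/- capability_sigma spec limits
    when the process mean has drifted by mean_shift sigma."""
    return phi(capability_sigma - mean_shift) - phi(-capability_sigma - mean_shift)

def first_pass_yield(capability_sigma, events, mean_shift=0.0):
    """Rolled first pass yield over a chain of identical independent events."""
    return event_yield(capability_sigma, mean_shift) ** events

# A perfectly stable 3 sigma system fades after a few hundred events...
print(first_pass_yield(3, 500))          # well under 30%
# ...while a 6 sigma system, even with a 1.5 sigma drift, survives 10,000:
print(first_pass_yield(6, 10_000, 1.5))  # roughly 96-97%
```

Running these numbers reproduces the figures quoted in the seminar: the drifting six sigma system lands between 96% and 97% after 10,000 events, and the three sigma system collapses long before that.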
That's all it started out as: a demonstration of how effective an approach that reduced variation could be. It was simple yet powerful.
At the time, the approach was based on using statistical methods to find and reduce variability, with particular emphasis on zeroing in on those events that gave the greatest leverage. It had a strong feel of Taguchi's methods, as well as some very sexy response surfaces.
Some years later I attended another briefing on six sigma. It had become much more complex, more difficult to understand and more expensive to purchase the training. And black belts etc had emerged.
It is not my purpose to can six sigma. However, whilst I applaud any approach that has at its aim the reduction in variability, the later version seemed to miss the notion that reducing variation ought to be the job of everyone. My experience is that where leaders are determined to reduce variation, they will. As Deming was so fond of saying: "When you know why, you will find a way".
Some of you are familiar with LITTLE'S LAW. If you are not, it is recommended for study. In essence, it is:
Throughput volume = Work In Progress divided by Cycle Time
Study that carefully for a while. Then add the notion that as variation is reduced throughout the process, Cycle Time reduces. Then you have the option of increasing output, reducing WIP, or a little of both.
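The trade-off described above can be checked with toy numbers (my own, purely illustrative, not from the original post):

```python
# Little's Law as stated above: throughput = WIP / cycle time.
wip = 200            # units sitting in the process
cycle_time = 10.0    # days for one unit to flow through
throughput = wip / cycle_time
print(throughput)                 # 20.0 units/day

# Reduce variation so cycle time halves; at the same WIP, output doubles...
print(wip / (cycle_time / 2))     # 40.0 units/day
# ...or hold output at 20/day and carry only half the WIP:
print(throughput * (cycle_time / 2))  # WIP of 100.0
```

Either way the benefit shows up on the books: more output from the same inventory, or the same output from less.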
This law addresses volume rather than quality. But you don't need to be a black belt to figure out that as variation in quality reduces, so too does variation in volume.
I will close this long-winded rambling with a reminder for us all:
Mr. Murray Mansfield of Melbourne has what I believe to be the only completely up to date version of Dr. Deming's famous Obligations for Top Management. After a long discussion with Murray, Dr. Deming agreed that there ought to be a fifteenth point. He took Murray's notes, turned to the page containing the fourteen points, and at the foot of the page wrote:
15. Have a good time!
John McConnell
-------End Snip-------
Another View
-------Begin Snip-------
Date: Fri, 7 Apr 2000 04:46:31 EDT
To: den.list
Subject: Re: comments on six sigma
My 2 cents on the "Six Sigma debate" going on. Before we slam something or question its motives, maybe we should first try to understand it from the inside, rather than from the outside, throwing stones, as so many did with Dr. Deming . . . Anything can be abused - or used to our advantage. As always, our motivation and values drive the effort and results.
Charles Hannabarger
Process Solutions International
619-443-3165
-------End Snip-------
Regards,
Don