Six Sigma - A debate on the validity of Six Sigma

Marc

I know... I know... Six Sigma is 'overrated'. But then again, maybe not. Here's something to think about from the Forbes web site:

------snippo------

For GE’s Jack Welch, cost-cutting isn’t an event—it’s a process.
Revealed at last: the secret of Jack Welch’s success

By Michelle Conlin

THERE ARE CERTAIN management mantras that will forever be associated with General Electric’s Jack Welch: being only number one or number two in your field and preaching the “boundaryless” sharing of ideas, a process that breaks down traditional corporate hierarchies to make sure that information flows up as well as down.

To that add a defect-reduction program called Six Sigma. Don’t let the jargon fool you: This is serious stuff, and Welch has embraced it with his usual zeal. Six Sigma contributes mightily to GE’s earnings growth, which was 13% in 1996 and should reach 14% in 1997.

Think of a sigma as a mark on a bell curve that measures standard deviation. Most companies have between 35,000 and 50,000 defects per million operations, or about 3 sigma. For GE, a defect could be anything from the misbilling of an NBC advertiser to faulty wiring in locomotives. Three years ago GE engineers determined that the company was averaging 35,000 defects per million operations—or about 3.5 sigma. (The higher the sigma, the fewer the errors.) That was a better-than-average showing, but not enough for Welch’s restless mind.

He’s now maniacal about hitting his goal of reducing defects to the point where errors would be almost nonexistent: 3.4 defects per million, or 6 sigma.

“This is not about sloganeering or bureaucracy or filling out forms,” Welch says. “It finally gives us a route to get to the control function, the hardest thing to do in a corporation.”

In implementing Six Sigma, Welch borrowed a page from Motorola—whose engineers first embraced the concept in the early 1990s—and from AlliedSignal, which followed Motorola’s lead.

It took Motorola eight years to get to 6 sigma from about 3 sigma. Welch said he wanted to get there faster and, like Motorola, apply the Six Sigma program to all the company’s businesses. Five years, he said, not eight.

That was in 1995. Already GE has reduced its defect rate to more than 3.5 sigma. Welch has demanded that the defect reduction program apply not just to the goods rolling off all the manufacturing lines but to performance during the product’s lifetime as well.

Customers of GE’s Milwaukee-based medical division were frustrated by the short life span of the tubes in GE’s CAT scanners. The tubes lasted for about 50,000 to 100,000 X rays and took about four hours to replace. Lots of angry patients and lost dollars.

GE assigned a team of Six Sigma “Black Belts”—as those trained to manage the program are called—to the problem. Their job was to measure and analyze each phase of the tube manufacturing process to see what improvements could be made.

Engineers found they could reduce by nine months the time needed to perfect new models of the X-ray tubes. GE is producing tubes that have up to five times the life span of the old tubes. The new tubes provide sharper, more complete pictures, allowing physicians to examine images of the entire brain of a stroke victim, rather than just slices at a time.

The cost savings flow over to other areas: Defect-free tubes coming off the assembly line help free up production capacity. What GE Medical Systems learned in making this product better is now being extended to other kinds of X-ray tubes the company makes.

The Six Sigma program has produced unexpected benefits. The conventional wisdom was that GE Medical needed to use expensive titanium and tungsten to make X-ray tube bolts. The Black Belts came up with proof that cheaper materials like aluminum and steel would do just fine.

Before they get credit for any savings, Welch requires Six Sigma Black Belts to prove that the problems are fixed permanently. Forty percent of each bonus given to all top managers is now tied to Six Sigma goals.

The money is well spent. Savings at the Medical Systems division alone hit $40 million last year. Also in 1997 GE raised its companywide savings estimates for the defect program twice, from between $400 million to $500 million, then to $600 million and $650 million.

By the year 2000, Morgan Stanley Dean Witter analyst Jennifer Pokrzywinski figures gross annual benefit from the program could be as much as $6.6 billion, 5.5% of sales.

Now you have the secret of Jack Welch’s success. Not a series of brilliant insights or bold gambles, but a fanatical attention to detail.
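
------end snippo------

A numerical footnote to the sigma figures quoted above: the conventional Six Sigma tables convert a "sigma level" into defects per million opportunities (DPMO) by taking the one-sided normal tail beyond (sigma level minus 1.5), i.e. the famous 1.5 sigma shift is already baked into the published numbers. Here is a minimal sketch of that convention (mine, not the article's; the article's rounded figures do not match the table exactly):

from math import erfc, sqrt

def dpmo(sigma_level, shift=1.5):
    # Defects per million opportunities under the usual shifted-normal convention.
    z = sigma_level - shift              # distance from the shifted mean to the nearer spec limit
    tail = 0.5 * erfc(z / sqrt(2.0))     # P(Z > z) for a standard normal
    return 1e6 * tail

for s in (3.0, 3.5, 4.0, 5.0, 6.0):
    print(s, round(dpmo(s), 1))
# 3.0 -> 66,807   3.5 -> 22,750   4.0 -> 6,210   5.0 -> 233   6.0 -> 3.4

The 3.4 defects per million mentioned in the article is exactly the 6 sigma row of this table.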
 

Don Winton

Marc,

Good post. I would humbly suggest, however, that six sigma is not 'overrated.' Rather, I think it is, perhaps, 'oversold.'

As with other quality management programs, applied correctly, it will work. Applied incorrectly, it is just another collection of paper.

Regards,
Don
 

Marc

I say 'overrated' because I've been through the arguments before, including the 'theoretical' 1.5 sigma shift.
 

Don Winton

Marc,

I also believe that six sigma is significant. The 1.5 sigma shift you mentioned brings up an interesting point. Even though the opportunities-for-failure values may not change by a significant amount, there is an argument (which may or may not be justified) that, due to the shift, there is a loss function involved. I remember reading that somewhere, and mathematically there may be some justification.

Comments by interested parties appreciated.

Batman,

Roger Burns (Harris Corp.) said, “I think if it is going to be successful, a lot of it has to do with support from a pretty high level within the organization.” He was referring to another subject, but it pretty much applies to any QMS implemented.

Regards,
Don
 

Don Winton

What follows is a discussion I found regarding the Six Sigma Concept. Whaddya Think?

Regards,
Don

Re: Six Sigma and Deming
----------------------------------------------------------
* Subject: Re: Six Sigma and Deming
* From: "William J. Latzko"
* Date: Mon, 17 Aug 1998 09:47:52 -0400
----------------------------------------------------------

Six Sigma does the right things for the wrong reason. See my paper at http://deming.eng.clemson.edu/pub/den/six_sig.pdf . The theory on which six sigma is based is flawed. However, since it contributes to continual, never-ending improvement, it has a psychological value. Why not adopt a policy of continual improvement and be done with it?

The drawback to six sigma is that it costs a lot and may mislead the user into a sense of comfort that is not deserved. I am much reminded of the reliance people used to put on MIL-STD-105 when they thought that the AQL was the buyer's protection.

Bill Latzko

---------------------------------------------------
* Subject: Motorola +/- 1.5 sigma Wavering/Shifts
* From: "H Södersved"
* Date: Mon, 28 Sep 1998 10:43:17 +0200
---------------------------------------------------

1. According to Dr. D. J. Wheeler, the 1.5 sigma shift comes "out of the blue", from nowhere. It is an assumption that builds Special Causes into the process from the start, at very high cost. Think of the cost of design when an extra, unspecified capability of 3 sigma is forced into the product design.

2. I have tried to find an explanatory origin in the Motorola papers on this shift and have found none.

3. An attempt to understand: Electronic systems, which have been my living for 27 years, have very complex interactions between many types of material processes. When product design times have become extremely short (3 to 6 months) and are in practice the most competitive strategic tool (Deming's 1st Quality Prong: Innovation), there is simply not enough time to find all the special causes in the upstream processes before it is time to launch the next model. This is how I "might" understand the practical way of dealing with this excessive noise in the product/manufacturing stream.

4. At the same time that Motorola developed the Six Sigma approach, in 1985, I at Ericsson Radio Systems and some other surface mount specialists here in Scandinavia (Tandberg Data etc.) started measuring attribute data in PPM (parts per million), or DPM (defects per million) as Motorola says. I did not know Deming or conventional capability analysis, due to the lack of training, education and interest from the Technical University and my earlier managers. We were inspired by the ppm-measurement practice in Japanese industry. In three years we could improve the manufacturing processes from 45,000 ppm to 50 ppm. In California some company reported the same result in one year. We did not know anything of the +/- 1.5 sigma shift at Motorola. This was essential to Ericsson's success in Mobile Telephone Systems worldwide.

5. When reaching the 50 ppm level we started emphasizing the Six Sigma approach for the capability of test variables of the process, without knowing better. But the electronic designers were very resistant to our efforts; maybe they understood the analytic craziness better. Donald Wheeler was the first person who really gave me the strength to see through the fallacy of Six Sigma. Six Sigma should be called the 4.5 Sigma Approach and nothing else. Still, the problem is that it is a Specification Method, not a Deming nor an SPC method!

Expira AB, Process & Quality Management
Bjornidegrand 3, S-162 46 VALLINGBY


-------------------------------------------------------------
* Subject: RE: More on 6 Sigma
* From: "Murphy, Kevin P (GEAE)"
* Date: Tue, 29 Sep 1998 13:54:51 -0400
-------------------------------------------------------------

Mikel Harry, the President of the Six Sigma Academy, also refers to the following articles to justify the 1.5 sigma shift:

Bender, A. (1975) "Statistical Tolerancing as it Relates to Quality Control and the Designer." Automotive Division Newsletter of ASQC

Evans, David H. (1975) "Statistical Tolerancing: The State of the Art, Part III: Shifts and Drifts." Journal of Quality Technology; 7 (2), pp. 72-76

Gilson, J. (1951) A New Approach to Engineering Tolerances. London, England; Machinery Publishing Company Ltd.

Basically, his idea is this:

From the research mentioned above, he quotes Evans (1975):

"....shifts and drifts in the mean of the distribution of a component occur for a number of reasons...for example, tool wear in one source of a gradual (nonrandom) drift...which can cause (nonrandom) shifts in the distribution. Except in special cases, it is almost impossible to predict quantitatively the changes in the distribution of a component value which will occur, but the knowledge that they will occur enables us to cope with the difficulty. A solution proposed by Bender...allows for (nonrandom) shifts and drifts. Bender suggests that one should use: V = 1.5*SQRT(VAR X) as the standard deviation of the response to relate the component tolerances and the response tolerance."

From here, Harry suggests that a generalization can be made, namely:

(st)^2 = c^2 (sw)^2, or c = st/sw

He calls c "the magnitude of inflation imposed on the instantaneous reproducibility," or says, "It may be said that c is a compensatory constant used to correct the sustained reproducibility for the effect of nonrandom manufacturing errors which perturbs the process center." He claims that the general range of c proposed by the above three articles is between 1.4 and 1.8.

By "assuming a rational sampling strategy", he shows that:

average quadratic mean deviation = (sw)^2*(c^2 (ng-1)-g(n-1))/ng

where n = subgroup size and g = number of subgroups

Next, he does a couple of algebraic manipulations and then standardizes the equation, yielding

Zshift = SQRT {(c^2(ng-1) - g(n-1))/ng}

He then says that, in the general range of sampling conventions, n is typically between 4 and 6, and g between 10 and 100. When you use c = 1.8 and plug in the "typical" values of n = 5 and g = 50, voila! Zshift = 1.49, which is just about 1.5, what he calls the standard mean shift correction. (I got 1.55 when I plugged in the numbers.)
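
For anyone who wants to repeat the check, here is a minimal sketch (just my arithmetic, not Harry's) that evaluates the quoted expression directly:

from math import sqrt

def z_shift(c, n, g):
    # Zshift = sqrt((c^2*(n*g - 1) - g*(n - 1)) / (n*g)), exactly as quoted above.
    return sqrt((c**2 * (n * g - 1) - g * (n - 1)) / (n * g))

print(z_shift(c=1.8, n=5, g=50))    # ~1.56, i.e. the 1.55 I got, not the advertised 1.49
print(z_shift(c=1.75, n=5, g=50))   # ~1.50, for what it's worth: c of about 1.75 gives the round 1.5

Whether the "right" value of c is 1.4, 1.5 or 1.8 is, of course, exactly the question.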

Basically, since he is trying to formulate a cookbook approach, he wanted a standard value. It would seem that deciding when to change your tooling, or how much drift to allow, would depend on the loss function for that process, which he does not cover at all, since he promotes a specification-oriented, project-centered, cookbook viewpoint (which does have some advantages, well, maybe only two, namely quicker buy-in and easier training, since you don't have to think as much).

Obviously, views represented by me (hey, I always called them facts) are not necessarily those of my employer.

> Kevin P. Murphy GEAE Six Sigma Quality (BB)
> General Electric Company
> Aircraft Engines
> 1 Neumann Way M/D J30
> Cincinnati, Ohio 45215
 

chen

Hello everyone:
Would anyone like to point me to a web site on Six Sigma, or any other information, not just the concept?
I think some practical examples would be best.
Thank you.
Have a nice day. Mar. 22, 1999
 

Don Winton

-------Snip--------

Subject: Re: Six Sigma On the Rise?
Date: Fri, 16 Apr 1999 12:46:27 EDT
From: GrantBlair
To: den.list

In a message dated 4/15/99 1:40:18 AM Eastern Daylight Time,
latzko writes:

“Six Sigma theory is based on a misunderstanding of Dr. Shewhart's ideas. See my paper on the DEN at (Dead link was: http://deming.eng.clemson.edu/pub/den/six_sig.pdf) for more. Bill Latzko”

Six Sigma is based on an EXTENSION of Dr. Shewhart's ideas developed by statisticians at Motorola, and it is a sound application both theoretically and practically. If you're a theoretical type (by this, I mean you went to Bill's reference and understood what he was talking about };-), consider the following:

1. Bill states in the paper that Six Sigma theory requires the assumption of a normal distribution.

This is incorrect. You can demonstrate that Six Sigma applies to non-normal distributions using a modification of Chebyshev's inequality.
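
For reference, the unmodified inequality says that for ANY distribution with finite variance, P(|X - mu| >= k*sigma) <= 1/k^2. A quick sketch of that distribution-free baseline (the modification itself is not reproduced here):

def chebyshev_bound(k):
    # Unmodified Chebyshev: the two-sided tail beyond k standard deviations
    # is at most 1/k^2 for any distribution with finite variance.
    return 1.0 / k**2

for k in (3, 4.5, 6):
    print(k, chebyshev_bound(k))
# k=3 -> 11.1%, k=4.5 -> 4.9%, k=6 -> 2.8%; the baseline alone is nowhere near 3.4 ppm,
# so the extra assumptions in the modification are what carry the result.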

2. Bill also states in the paper that the 1.5 mean shift assumes an unstable process and that "unstable distributions are unpredictable." This is also incorrect. ALL PROCESSES are inherently unstable.

This is a law of physics known as entropy. However, this doesn't mean that a process producing toothpicks will one day randomly produce a telephone pole. (If you believe this, then I'll be glad to sell you an encyclopedia typed by a room full of monkeys.) This is another reason Six Sigma theory works.

Now, if you're a practical type, here are some more arguments for Six Sigma:

1. Parts-per-thousand defect levels were great in Shewhart and Deming's time, but won't work in today's world: parts-per-million levels are required. (A good example is airplanes... would you fly an airline that accepted a defect level of 3 parts per thousand and ran a thousand flights per month? That is three bad flights every month. Think about it.)

2. Jack Welch is not an advocate of Six Sigma because it's theoretically correct... he endorses it because it is providing DEMONSTRATED SAVINGS to GE. One requirement for becoming a Six Sigma black belt is completing a project demonstrating $100k in cost benefits. There are a LOT of black belts at GE.....

Ninety Six SC

-------End Snip-------

And Another

-------Snip--------

Subject: Re: Jim McKinley's Post "Six Sigma on the rise"
Date: Thu, 15 Apr 1999 16:54:35 -0500
From: Eugene Taurman

Six sigma as started by Motorola, as a vision statement, was a great idea. Six sigma as taught by Motorola University is a pretty good approach to change.

As preached by Welch, I have no idea what he is saying. Motorola's vision was concise and to the point: every process will have a 100% safety factor between what we think we need, or what the customer needs, and our capability.

Motorola tried to be clever and worked on a two-humped distribution idea to allow for the fact that short-term capability studies never include all the variables that impact output. People often focus on this, but it has little to do with the original mission. The original mission, to make all processes capable, had a very positive impact on people, attitudes and priorities as they tried to bring processes into control and make them capable.

As I see it, what Welch and others are doing is making it into a program, trying to copy Motorola's activities.

Good luck; this is just another in the series of people trying to make Deming's ideas into something they can sell.

et

-------End Snip-------

Regards,
Don
 

Don Winton

-------Begin Snip-------

From: "Bill Scherkenbach"
Date: Mon, 19 Apr 1999 13:19:45 -0400
Subject: Fw: Six Sigma On the Rise?

When the statistical community was up in arms over Taguchi, Dr. Deming was not as perturbed as most everyone else was. I asked why. He smiled and recalled the battles of Pearson and Fisher becoming public and confusing unknowing but all-powerful management. Best to have the battles, but keep them private.

I have seen the comings of other "pretenders," "hacks," etc. The Deming community (not Deming) has ridiculed most everyone else claiming to offer a method for improvement: Crosby, Juran, Taguchi, Shainin, Six Sigma, Reengineering, et al. I certainly did my share.

The arguments were almost exclusively in the Logical world, though. I think our assumption (a wrong one) was that if we could convince the others of the Logical pitfalls, deficiencies, and errors of their thinking, they would change. It works for some, but not everyone.

We have got to respect the perspectives of the Physical and Emotional world views. Logical fallacies have been identified by logical people who don't want logic contaminated by Physical or Emotional tricks. Get a life. Physical, Logical, and Emotional is us.

Yes, Six Sigma is logically flawed. It could be done better by any one of us. But arguing on that level won't stop it. You have to join a different world to make an impact here. Start receiving and broadcasting on all three frequencies.

Bill

From: "Novick, David T"
Date: Wed, 21 Apr 1999 09:24:35 -0700
Subject: FW: Jim McKinley's Post "Six Sigma on the rise"

The following item was included in the den-list digest of April 21. While I have reduced my activity to that of a "reader," this is a topic I cannot pass by without comment.

> ----------
>From: Jerry Mairani
>Sent: Saturday, April 17, 1999 8:09 AM
>Subject: RE: Jim McKinley's Post "Six Sigma on the rise"

>I see that six sigma has once again made its way into the discussion. If I
>recall, we had quite a discussion about a year ago. I quite agree with Jim
>McKinley's statement: "As I see it, what Welch and others are doing is making it
>into a program trying to copy Motorola activities." As we know from WED,
>copying just doesn't work!!!

>Taking the time to gain an understanding of the principles of six sigma (let
>me say I am no expert but have some knowledge) reveals it to be consistent
>with WED teaching. Let me share the Aerojet six sigma objectives here in
>Sacramento.

>1. Identify and Remove Defects in our processes
>2. Increase capacity through better yields
>3. Identify and Remove variation in our processes
>4. Reduce complexity in our processes

In my humble opinion, this is consistent with the teachings of Deming. It is nothing more, and nothing less, than an excellent outline of a continuous process improvement and variability reduction program.

>At its heart is a simple mathematical statement: Six Sigma Capabilities
>will yield 3.4 defects in every one million opportunities to make a mistake.
>It is an approach to pinpointing and reducing process variation and other
>sources of defects (quality loss function).

And this is where, in my opinion, our thinking begins to go astray. Motorola 6-sigma has taken an excellent concept and reduced it to a simple but flawed statistical analysis. First, it assumes that the collected data is normally distributed, independent, randomly collected and rational. I will also suggest the assumption is that the data is taken from a process that is statistically in control, i.e., is predictable. Second, it assumes that process means can drift in time by +/- 1.5-sigma. At the very best, again in my opinion, both assumptions are weak and highly questionable. Let me explain why.

Consider the first set of assumptions.

Normality: When we collect data, usually in the form of SPC charts, we know from the initial work of Shewhart that normality is not required. While it is a desired ideal condition, the process is established on a rational economic basis that does not require normally distributed data. In fact, I often find the data in question does not appear to be normally distributed.

For the uninitiated, I suggest any of Wheeler's books on SPC as an excellent resource for the argument that normality is not needed, even though statisticians argue that, in the case of x-bar/R charts, the averaging of the individual samples pushes the subgroup means toward a normal distribution (Central Limit Theorem). Wheeler clearly demonstrates that non-normal distributions fall reasonably well within the limits obtained using Shewhart's approach.

Independent, randomly collected and rational: For most of the applications we work with, I find that process sampling is not random, nor are the measurements within a sample always independent. Samples are most often taken from the same portion of the "test bed" and frequently without considering their relationship to each other. Once again, if one considers that SPC looks at variation caused by common cause sources and tries to identify, for removal, special cause sources, it is a process which allows for these "mistakes."

Process is predictable: Nothing I have read in Motorola 6-sigma documents specifically mentions "in control" processes. However, the tone of the commentary does indicate they may have had this issue in consideration.

But, I say this with tongue in cheek because of the issue I have to take with the second big assumption.

Drift of the process mean: The several documents I have from Motorola do not treat this issue in detail. At best, the comment is that there is data to support the assumption. I have never been able to find anyone at Motorola who could provide me with this evidence. And the few references provided in at least two documents are obscure; at least to me, they have been unavailable on interlibrary loan, as they date back to the 1940s and 1950s. Furthermore, they are related to tolerancing studies, not process studies. Without these references, one cannot judge how well the data supports the contention, or whether they are in fact valid or simply observations/assumptions used to drive the tolerancing studies. Comments I have received indicate the latter is probably close to the truth.

Now, if one looks at what SPC says, a stable, in-control process should have a steady process mean and stable process limits. Within those limits, common cause sources of variation will produce samples that indicate the process average varies within +/- 3-sigma. If this is what Motorola's experts were describing as process drift, then one can immediately see there is a disagreement.

Others argue, no, that is short-term drift, and what Motorola is proposing is long-term drift. Well, I contend that if a process drifts from its mean, long term, then there must be an internal trend that will be picked up as a signal indicating something unusual has happened. There is a process change due to a special cause source of variation, and it has to be identified and removed. In that case, if the process is producing a drift of the magnitude +/- 1.5-sigma, then the process is not in control.

Now, I suppose, you readers are asking: what does all this mean? To me it means the underlying basis from which the 3.4 ppm "goal" has been calculated is a weak set of assumptions. Thus the goal itself is meaningless. To pursue this further, the entire concept behind the calculation of this "fraction nonconforming" is exactly the same as that driving the calculation of capability indices. These require the same assumptions of normally distributed, independent, random, rational data taken from processes that are demonstrated to be in control. In fact, if the conditions hold true, one can impute "fraction nonconforming" values directly from the indices.

In my opinion, the Motorola approach is similar to using a Cpk index for the calculation. Thus one can pursue the argument in the same manner used for discussing problems with that capability index. To calculate the index, one needs to estimate the process average and standard deviation. In my understanding, this puts a bias on the index, and it places an even more stringent requirement that the underlying data be normally distributed, as the Central Limit Theorem does not hold. Since the indices are variables, and not constants, a single value has no meaning without confidence limits. Unless the underlying data is precisely normal, etc., there is no way to determine these confidence limits. Thus, any "fraction nonconforming" value determined is meaningless.
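
To make that point estimate concrete, here is a minimal sketch under exactly the assumptions being questioned (normal, in-control data, no confidence limits): the "fraction nonconforming" implied by a Cpk value is just a normal tail area.

from math import erfc, sqrt

def norm_sf(z):
    # P(Z > z) for a standard normal.
    return 0.5 * erfc(z / sqrt(2.0))

def ppm_from_cpk(cpk):
    # Nonconforming ppm beyond the *nearer* specification limit implied by Cpk.
    # A point estimate only; it carries no confidence limits.
    return 1e6 * norm_sf(3.0 * cpk)

print(ppm_from_cpk(1.0))   # ~1,350 ppm  (a "3-sigma" process, one tail only)
print(ppm_from_cpk(1.5))   # ~3.4 ppm    (the 6-sigma figure after the 1.5-sigma shift)
print(ppm_from_cpk(2.0))   # ~0.001 ppm  (6 sigma with no shift: about 1 ppb per tail)

If the normality and stability assumptions fail, these numbers can move by orders of magnitude, as the Burr distribution comparison below illustrates.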

Because the Motorola calculations are based on the same input, unless the underlying data can be shown to comply with the assumptions, calculation of a "fraction nonconforming" value has no meaning. If one now adds the question of the reliability of the allowance for "drift" of the process mean, the entire exercise becomes open to challenge.

As an example, if one assumes the underlying data is not normally distributed but can be described by the closely related Burr distribution (both are bell shaped curves but the Burr distribution has a slight skewness to the right at the top of the distribution) and both are perfectly centered, then the "fraction nonconforming" for a normal distribution is 2 parts per billion. The equivalent value calculated for the Burr distribution is 0.31 parts per million. The difference is slightly more than two orders of magnitude.

Most 6-sigma programs focus on the 3.4 ppm goal, measuring yields and looking to reduce defects to that target level. When the measure does not meet expectations, and the program looks only at the ppm defects, the output being measured, the program fails. If we would instead stick with the four considerations Aerojet focuses on,

> 1. Identify and Remove Defects in our processes
> 2. Increase capacity through better yields
> 3. Identify and Remove variation in our processes
> 4. Reduce complexity in our processes

as outlined by Mairani, identify the key processes producing defects, control the key process parameter for each of those processes, and forget the 3.4 ppm metric, we would all be better off, and defect rates would probably end up considerably less than that figure of merit.

David T. Novick

From: "Murphy, Kevin P (GEAE)"
Date: Wed, 21 Apr 1999 10:44:41 -0400
Subject: RE: Jim McKinley's Post "Six Sigma on the rise"

"I believe until dissuaded the use of WED with six sigma can be a powerful tool."

What is gained by adding Six Sigma to the work of WED?

In my opinion, statistically nothing. The mathematical approach to calculating a "sigma level" is somewhat controversial and is based on specification limits, an older view of quality. Maybe a "six sigma" defect level is right for your process, maybe not. Why not more than six sigma? If a loss function so indicates, why not less? That is, if you can properly define a "sigma level" for a process at all. I don't think too many statisticians would say that Six Sigma has added anything significant to their toolbox.

The success of Six Sigma, in my opinion, comes from the allocation of resources and support given to it, which is really more like a change in the organizational structure, so that we have companies that actually pay a lot of attention to quality. Maybe it has added something to the management toolbox. And that is a part of what we are after, isn't it?

Those interested in more should refer to the DEN archives, as a lengthy discussion on Six Sigma was held just a few months ago.

-------End Snip-------

Regards,
Don
 