Theoretical Analysis of Different Data Sets - Quality Concerns at Toyota


Dear All:

A recent Wall Street Journal article (August 4, 2004) discusses some of the quality concerns that have surfaced at Toyota, even as it has doubled its revenues in the past decade and is now closing in on GM to become the world's largest automaker. This intrigued me, and I compiled the following data from the annual J. D. Power quality surveys. There are two types of surveys: the Initial Quality Study (IQS, after 90 days of ownership) and the Vehicle Dependability Study (VDS, after three years of ownership).

Further details about how the studies are conducted (they are somewhat like opinion polls) may be found at http://www.jdpower.com/. The website has two parts; the IQS and VDS data are found on the Corporate Site. To obtain data for prior years, click on Press Releases for each year. The noteworthy quality figure is the number of problems per 100 vehicles (PP100), taken here from the VDS, since that study looks at customer satisfaction after three years of ownership. In the table below, each row gives the year, the PP100 for the Lexus nameplate, the PP100 for the Toyota nameplate, and the overall figure for Toyota Motor Company (available only for 2003 and 2004).

Year | Lexus | Toyota | Toyota Motor Company
-----|-------|--------|---------------------
2004 |  162  |  216   | 207 (Rank 1)
2003 |  163  |  201   | 196 (Rank 2)
2002 |  159  |  276   | not reported
2001 |  173  |  278   | not reported
2000 |  216  |  299   | not reported

The overall corporate figure and rank were not reported for earlier years. Although Lexus has consistently ranked as the top-rated nameplate (rank 1 in 2004 and in earlier years), notice the erratic variation in its problems per 100 vehicles: the figure was as high as 216 in 2000, dropped to 159 in 2002, and then went up again. Also, Toyota Motor Company captured the top corporate rank in 2004 even though its PP100 was higher than in 2003 (207 versus 196; notice that even a difference of 1 is resolvable in these surveys, such as the 162 versus 163 for Lexus in two consecutive years).

The 2004 VDS is based on responses from more than 48,000 original owners of 2001 model-year vehicles. However, we do not know the exact number of vehicles of each nameplate surveyed. For example, many Lexus owners must have participated to arrive at the figure of 162 problems per 100 vehicles, or an average of 1.62 problems per vehicle. This means some vehicles had only 1 problem, some had 2, some had 3, and it is conceivable that some had more than 3 while others had 0. The survey thus gives us an "average," and even this average has been varying erratically.
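To see what such an average can hide, here is a minimal sketch (my own illustration, not J. D. Power's methodology) that assumes, purely for argument's sake, that problems per vehicle follow a Poisson distribution with the Lexus mean of 1.62; the true distribution is unknown.

[code]
import numpy as np

rng = np.random.default_rng(seed=42)

# Assumption for illustration only: problems per vehicle are
# Poisson-distributed with the Lexus 2004 mean of 1.62.
mean_problems = 1.62
n_vehicles = 10_000
problems = rng.poisson(mean_problems, size=n_vehicles)

# How the same average of 1.62 splits into 0, 1, 2, 3, ... problems.
for k in range(4):
    print(f"{k} problems: {np.mean(problems == k):.1%}")
print(f">3 problems: {np.mean(problems > 3):.1%}")

# The survey statistic: problems per 100 vehicles (PP100).
print(f"PP100 = {100 * problems.mean():.0f}")
[/code]

Under this assumed distribution, roughly one vehicle in five would report no problems at all, even though the average is 1.62 per vehicle.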

Let me simply add that I have performed computer "simulations" that show how "problems" or "defects" might accumulate as we add more and more vehicles to the survey. The distribution of "problems" or "defects" can vary, and the situation is actually similar to the problem of determining the "average" energy or the "average" entropy of N particles that Max Planck describes in his December 1900 paper, the paper that led to the birth of modern quantum physics. In Planck's problem, the N particles are located within a heated body; an electrically heated solid block of tungsten with a small hole drilled in it was used in those studies. This then led Einstein to his idea of a work function W when he extended Planck's ideas to light (or electromagnetic radiation): now the particles, or photons, are located outside the body that produces the light or radiation.
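For the curious, the sketch below shows the kind of simulation I mean, under the same (assumed) Poisson model: the total number of problems grows as vehicles are added, while the running PP100 fluctuates before settling near its long-run value.

[code]
import numpy as np

rng = np.random.default_rng(seed=1)

# Assumed mean problems per vehicle (illustration only).
mean_problems = 1.62

# Watch the totals evolve as more vehicles enter the survey.
for n in (100, 1_000, 10_000, 50_000):
    problems = rng.poisson(mean_problems, size=n)
    print(f"{n:>6} vehicles: {problems.sum():>6} problems, "
          f"PP100 = {100 * problems.mean():.1f}")
[/code]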

As noted in an earlier post (please see the response to J. Oliphant's questions on the work function), Planck showed that the temperature T = E/S, where E is energy and S is entropy. It appears to me that we can generalize and extend these ideas beyond physics to discuss quality problems. The mathematical formula for Planck's radiation curve, which was simplified by Einstein, can be written as follows.

y = A x^n exp(-ax)

Here A, n, and a are three constants that uniquely specify the relation between x and y. In the Planck and Einstein problem, the constant a = h/kT, where h and k are universal constants known as the Planck and Boltzmann constants, respectively, and T is the temperature (of the heated body, or the temperature that Einstein associates with light, or radiation). More generally, this equation can be used to describe the relation between a control variable x and the number of defects or problems y.

What is interesting is that if a > 0, however small, there is a maximum point on the graph of y = f(x). Such a maximum point can be shown to exist, for example, in the historical data on traffic-related fatalities for the U.S. going back to 1913. That data is available; I have analyzed it and find that the constant "a" has varied within very narrow limits, with A and n held constant. The temperature T of the "system" described by the (x, y) data can be related to the critical value of x that leads to the maximum point. Using elementary calculus, it is readily shown that the maximum occurs at x = x*, given by:

x = x* = n/a
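The calculus is one line: dy/dx = A x^(n-1) exp(-ax) (n - ax), which vanishes when ax = n. A short symbolic check (a sketch using SymPy) confirms it:

[code]
import sympy as sp

x, A, n, a = sp.symbols('x A n a', positive=True)
y = A * x**n * sp.exp(-a * x)

# Setting dy/dx = 0 recovers the maximum point x* = n/a.
dy = sp.diff(y, x)
print(sp.solve(sp.Eq(dy, 0), x))   # -> [n/a]
[/code]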

If we have good "observations" on the system, n and a can be deduced and the maximum point predicted in advance. Alternatively, such an analysis can be used to determine where a "complex" system is operating: either to the left or to the right of the maximum point.
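To make this concrete, here is a minimal fitting sketch; the (x, y) values below are invented for illustration (they are not the fatality data or any real defect data), and SciPy's curve_fit is just one way to deduce the constants:

[code]
import numpy as np
from scipy.optimize import curve_fit

def planck_like(x, A, n, a):
    # The generalized Planck/Einstein curve y = A x^n exp(-ax).
    return A * x**n * np.exp(-a * x)

# Hypothetical observations of a control variable x and defect counts y.
x_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y_obs = np.array([8.0, 22.0, 35.0, 41.0, 40.0, 34.0, 26.0, 19.0])

(A_fit, n_fit, a_fit), _ = curve_fit(planck_like, x_obs, y_obs,
                                     p0=(10.0, 1.0, 0.5))
print(f"A = {A_fit:.2f}, n = {n_fit:.2f}, a = {a_fit:.3f}")
print(f"Predicted maximum at x* = n/a = {n_fit / a_fit:.2f}")
[/code]

Comparing the fitted x* with the latest observed x then tells us on which side of the maximum the system is operating.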

Will we find a similar maximum point if we study the evolution of defects in a manufacturing process (as more and more parts or vehicles are produced and tested), or in a service industry (as more transactions are made and more customers pass through a drive-through, as at McDonald's or Wendy's)?

I don't know. The data seem to be sparse or unavailable at this time. A careful analysis of the data for the healthcare industry (see the thread initiated by Wes) is also extremely important in this context. Mathematically speaking, there is no difference between a "defect" produced when customers drive through at McDonald's (a wrong hamburger, an incorrectly filled order) and one produced when patients pass through a clinic or hospital. The customer or patient enters, certain "processes" are carried out, the customer exits the system, and the outcome is either a happy or an unhappy one.

In most cases, however, the data follow the simple linear law y = hx + c = hx - W, which is a generalization of what Einstein deduced from Planck's law. For n = 1 and a = 0, the curve reduces to y = Ax, but Einstein said we must allow for a nonzero intercept c, or the work function W. Also, the constant A, which applies when n differs from 1, is not the same as the constant h, which applies in the special case n = 1. A small sketch of this linear fit follows below.
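Fitting the linear law to hypothetical (x, y) observations (again, the numbers are invented for illustration) recovers the slope h and the work function W as the negative of the intercept:

[code]
import numpy as np

# Hypothetical observations assumed to follow y = hx - W.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([14.0, 35.0, 54.0, 76.0, 95.0])

# Least-squares line y = hx + c; the work function is W = -c.
h, c = np.polyfit(x, y, deg=1)
print(f"h = {h:.3f}, W = {-c:.2f}")
[/code]

I look forward to your comments. With my warmest regards.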

Charmed :) :thanx:
 