DPPM? Scrap? Both? How to report Returns

Rambo

Starting to get Involved
My colleagues,

I am a manager at a manufacturer. We make and ship several hundred part numbers daily. We are having discussions about how to report returns, and I would like some objective input.

Suppose 100 units are returned for defects. We sort through the 100 and find that only 25 are actually defective. Our options for reporting (a quick sketch of how each scores follows the list):

1) Scrap the 25 and report them against the day we performed the sort. Even though that doesn't reflect that day's manufacturing, the reporting on the original date of manufacture was "understated" because defects were shipped. Besides, the other 75 will be shipped and sold that day too, so "credit" would also be given.

2) Forget about scrap reporting and just count the 25 defects in our DPPM reporting.

3) Do #1 above, but also count the 25 in the DPPM metric.
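For illustration only -- the daily volumes in this sketch are invented, not our real figures -- here is roughly where the 25 confirmed defects land under each option (Python):

# Rough illustration: where do the 25 confirmed defects land under each option?
# All volumes below are made up; "produced" is taken as shipped + scrap for the day.
DAILY_SHIPPED = 50_000   # hypothetical units shipped on the sort day
DAILY_SCRAP = 500        # hypothetical normal scrap on the sort day
CONFIRMED_BAD = 25       # defects confirmed while sorting the return

def scrap_pct(scrap_units, produced_units):
    return 100.0 * scrap_units / produced_units

def dppm(defect_units, shipped_units):
    return 1_000_000 * defect_units / shipped_units

produced = DAILY_SHIPPED + DAILY_SCRAP

# Option 1: the 25 hit the sort day's scrap number only; DPPM is untouched.
opt1 = (scrap_pct(DAILY_SCRAP + CONFIRMED_BAD, produced), dppm(0, DAILY_SHIPPED))
# Option 2: the 25 go into DPPM only; scrap reporting is untouched.
opt2 = (scrap_pct(DAILY_SCRAP, produced), dppm(CONFIRMED_BAD, DAILY_SHIPPED))
# Option 3: the 25 are counted in both metrics.
opt3 = (scrap_pct(DAILY_SCRAP + CONFIRMED_BAD, produced), dppm(CONFIRMED_BAD, DAILY_SHIPPED))

for name, (s, d) in [("Option 1", opt1), ("Option 2", opt2), ("Option 3", opt3)]:
    print(f"{name}: scrap {s:.2f}%  |  DPPM {d:.0f}")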

How is this handled in your facility?
 

normzone

Trusted Information Resource
Re: DPPM? Scrap? Both? How to report returns

Given your situation, by my standards your outfit is both high mix and high volume. Your outlook may differ. It would be interesting to know the nature of your product line. Is the example purely theoretical or is it possible to produce and ship 75% conforming product?

Does the organization do anything constructive with the DPPM data, or is it simply dashboarding?

1) is unrealistic - does the process output conformance vary greatly by day? I'd be upset if my number for the day got tweaked by the timing of a return.

2) makes a little more sense, but if you're shipping that much nonconforming product then the process is out of control.

3) makes even less sense to me.

Perhaps I'm not the guy to be responding to you - I work in a high mix/low volume environment. I'd be more concerned about getting at defect causes in a given part number group, rather than looking at a broad PPM picture.
 
hogheavenfarm

Re: DPPM? Scrap? Both? How to report returns

I track total returns (good/bad mixed), but in my quarterly report I back out product that was actually good or was customer damaged. We have several customers who will quickly reject an entire lot if any defect at all is discovered; in those cases we sort the product (as you do) but report as I described. The good product is reshipped with any make-up product for the delivery.
 

normzone

Trusted Information Resource
In my environment I track component failures within one year (dashboarding for the curious) and also do corrective action for any RMA that the organization is at fault for (real root cause analysis and appropriate response).

Most else is off my radar - routine customer service traffic. Yeah, [hogheavenfarm], if the customer was just being a &!<% or tinkered with it and broke it, that data gets trimmed before reporting.
 

Rambo

Starting to get Involved
Thanks for the responses. Yes, we are a high mix, high volume environment. It's actually more of a bulk material we produce, so indeed, it is likely that the customer would find instances of a defect but the balance would be salvageable.

The daily volume does vary, but not in enormous swings. Normzone, you hit the nail on the head-- the Operations people do not like taking the hit (an average 1% scrap day becomes an eyebrow-raising 3% scrap day because a returned product was processed). Of course, had the sorting been perfect on the day the product was originally made/packaged, that would have been a 3% scrap day then; in my mind the hit was just avoided that day and addressed later. I don't want to drive the wrong behavior of shipping questionable product-- if it makes it out the door, it becomes someone else's issue (DPPM, not scrap).
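To make the swing concrete (invented numbers again, just to show the arithmetic):

# Back-of-the-envelope: on a modest-volume day, even a small returned lot
# dominates the daily scrap percentage. Numbers are hypothetical.
daily_production = 1_250    # hypothetical units made that day
normal_scrap = 13           # ~1% of daily production
returned_defects = 25       # the sorted-out defects from the return

baseline = 100.0 * normal_scrap / daily_production
with_rma = 100.0 * (normal_scrap + returned_defects) / daily_production
print(f"without return: {baseline:.1f}%   with return: {with_rma:.1f}%")
# -> without return: 1.0%   with return: 3.0%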

Definitely, the DPPM data is scrutinized.

My manager's view is that if we do both (#3 on my option list), we are actually "double-counting" the event.

I think not: yes, it should have been counted as scrap, and it should be now that it has been returned-- but our failure to prevent/detect should be scored as well. It shouldn't be an either/or between scrap and DPPM. Again, I'm just curious how others see it.
 

Al2010x

Registered
I track the number of units returned against the number of units shipped in any given month for the product you are tracking. This is then compared against first pass yield, estimated with a random sampling method. A returns Pareto is created to identify the cause of each return, such as transit damage, wrong product, etc.

DPMO or DPPM may be a good measure, and you may be able to prepare a weekly or monthly report. At the end of the day, your goal is to make data-driven decisions and build an action plan that is results-oriented and customer-focused. How you slice and dice the data depends on what you want to achieve.
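A minimal sketch of that kind of tracking, assuming a simple list of RMA lines with a quantity and a reason code (the reason codes, volumes, and the rule for backing out customer-damaged product are all hypothetical):

# Monthly return rate, DPPM, and a returns Pareto from a list of RMA lines.
# Data and reason codes below are hypothetical.
from collections import Counter

shipped_this_month = 120_000          # hypothetical units shipped this month
returns = [                           # (quantity, reason) per RMA line
    (40, "transit damage"),
    (25, "confirmed defect"),
    (20, "wrong product"),
    (15, "confirmed defect"),
    (10, "customer damaged"),         # backed out before reporting
]

reportable = [(qty, reason) for qty, reason in returns if reason != "customer damaged"]
returned_units = sum(qty for qty, _ in reportable)

return_rate_pct = 100.0 * returned_units / shipped_this_month
return_dppm = 1_000_000 * returned_units / shipped_this_month

pareto = Counter()
for qty, reason in reportable:
    pareto[reason] += qty

print(f"return rate: {return_rate_pct:.2f}%  ({return_dppm:.0f} DPPM)")
for reason, qty in pareto.most_common():
    print(f"  {reason}: {qty}")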
 

Bev D

Heretical Statistician
Leader
Super Moderator
It's not how you score that matters; it's what you need to do to improve.

Which report will initiate improvement action? How will it facilitate improvement?

I have several groups that scrutinize their field data forever. I actually think they look at each pixel in the chart. They constantly 'redo the math' to make sure the report is correct; they debate definitions; they question the statistical control limits; they ask for 'more data'; they ask for different displays of the data; then another month rolls around and they start the whole process over again. They never improve anything.

I have other groups who glance at the trends and the Paretos. They initiate problem solving on emerging problems and ensure that the teams working on the top three items on the Pareto are actively making progress. They are continually improving the quality of their products.

I cannot help the first group as they don't want it. I focus on the second group. They haven't changed the reports in the last 12 years...
 