
Received a minor for not having good measurables/goals. Need help with KPIs.

RoxaneB

Super Moderator
#31
Corrective & Preventive action KPIs: Time to closure (ideally < 14 days), Number of (trending down), Recurrence (%)
Internal Auditing: Percentage completion, Frequency, Time to closure (days) for mitigations
Management review:
Frequency, Objectives (i.e. what is discussed in management review? Answer: All these KPIs)
Maintenance KPIs:
Uptime (% or hours), Unscheduled maintenance (hours), Percentage completion of scheduled maintenance, Critical parts inventory
Calibration KPIs: Some sort of calibration tracking method i.e. spreadsheet or a software. Evidence of MSA during PPAP and periodically.
How do these particular indicators add value to the QMS, Patrick?

Time to close Corrective/Preventive actions at less than 14 days is, in my experience, naive. A large, systemic issue may require significantly more time, money, and people than can be allocated in 14 days (all while the organization continues to operate and meet customer requirements). In fact, in my current organization, it can take up to 6 months to close these items, including the verification period.

I also disagree with the "number of". It's a double-edged sword, this metric. Sure, no one LIKES to be on the receiving end of a Corrective/Preventive action, but it's also important to identify and address them. Some companies set "near misses = 0" as a safety metric, but this ends up making people afraid to report them. The same mentality can occur with a "number of" indicator in quality.

Recurrence isn't too bad; however, you need to be very clear about what constitutes recurrence. Is it the same clause? Same root cause? Same description? Same process? What if Process A has a document control finding and fixes it...and then a few months later Process B has a document control finding? Is this a recurrence?
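The Process A / Process B scenario can be made concrete with a small counting sketch. The findings and field names below are invented purely for illustration; only the document control example comes from the post. Depending on which field you group by, the same three findings either do or do not contain a recurrence.

```python
from collections import Counter

# Hypothetical findings; fields and values are invented for illustration.
findings = [
    {"clause": "7.5", "process": "Process A", "root_cause": "Document Control"},
    {"clause": "7.5", "process": "Process B", "root_cause": "Document Control"},
    {"clause": "8.7", "process": "Process C", "root_cause": "Containment"},
]

def recurrences(findings, key):
    """Return the values of `key` shared by more than one finding."""
    counts = Counter(f[key] for f in findings)
    return {value: n for value, n in counts.items() if n > 1}

# Grouping by clause or by root cause flags the two document control findings
# as a recurrence; grouping by process flags nothing.
by_clause = recurrences(findings, "clause")          # {"7.5": 2}
by_process = recurrences(findings, "process")        # {}
by_root_cause = recurrences(findings, "root_cause")  # {"Document Control": 2}
```

Whichever definition you pick, write it down before you start counting - otherwise the metric can be argued either way after the fact.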

Your list of internal audit KPIs reads more like check boxes - Did we do the audit? Did we do the right number of audits? Did we close all audit issues? How do these speak to the overall health and value of the internal audit system? You could consider "on-time completion of audits" if you're set on having a completion metric. You could also track auditee experience results (similar to satisfaction). Rather than a metric about resolving issues, find a way to capture whether results were disputed (i.e., did auditees agree to and understand the value of the finding(s)?).

Management review - Does it really need a metric? In my experience, the majority of your metrics should be addressed, resolved, and corrected (where needed) by process owners PRIOR to management review. I'm not saying that management review is a pointless exercise - it can, in fact, add a lot to an organization if done properly and in such a way that it presents a picture of the organization instead of flashing a bunch of charts, graphs, and tables that no one quite understands.

Maintenance - These are more operational than traditional quality, but they do at least speak to what's going on down on the floor. I would offer that, from a quality perspective, uptime (% or hours) does not drill down deep enough. It may be good for Maintenance, but that's not the story we're interested in. While I applaud the positive slant of the indicator, I'd suggest we flip it over to downtime. The line could go down 30 times in a month for 2 minutes each, and then 1 time for 60 minutes. Both patterns equal 1 hour, giving us a total downtime of 2 hours. But so what? Is 2 hours good or bad? Which is more of a concern - the 30 hiccups of 2 minutes or the 1 full-hour session?

In a previous life, we acknowledged that downtime was downtime, but we looked at our data and decided to focus on getting the larger downtime sessions under control (e.g., any unscheduled downtime > 1.5 hours required a corrective action). After a year of this, we re-analyzed our data, saw that our larger downtime sessions had decreased, and lowered our trigger for a corrective action (e.g., any unscheduled downtime > 1 hour). We repeated this year over year, tightening our processes and gaining better control.
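The trigger-tightening approach can be sketched in a few lines of Python. The session durations are invented; only the 1.5-hour and 1-hour corrective-action triggers come from the example above.

```python
# A sketch of the escalating downtime trigger described above. Session data is
# hypothetical: thirty 2-minute hiccups, one 72-minute stop, one 108-minute stop.
sessions_h = [2 / 60] * 30 + [1.2, 1.8]

def needs_corrective_action(sessions, trigger_h):
    """Return the unscheduled downtime sessions long enough to trigger a CA."""
    return [s for s in sessions if s > trigger_h]

total_h = sum(sessions_h)                          # 4.0 hours of downtime in total
year_1 = needs_corrective_action(sessions_h, 1.5)  # [1.8] - only the worst session
year_2 = needs_corrective_action(sessions_h, 1.0)  # [1.2, 1.8] - tightened trigger
```

The point of the sketch: the total (4 hours) tells you almost nothing, while the per-session view plus a deliberately tightening trigger gives you a concrete, year-over-year improvement mechanism.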

Calibration - A tracking sheet does not equate to a KPI. What would be more interesting would be the number of times a quality defect was identified and attributed to a piece of equipment being out of calibration.
 

qualprod

Trusted Information Resource
#32
RoxaneB said:
How do these particular indicators add value to the QMS, Patrick? ...
Excellent advice, Roxane, thanks!
 

TrivialSublime

Involved In Discussions
#34
^^^ Why track any metric that won't teach you anything? If it's tracked just to have an easy metric to meet, it's just wasted money and doesn't follow the intent of the standard.
Bingo - in the old days that would be called the $64,000 question.

The standard uses words like "relevant," "should," etc. for a reason. It is up to an organization to decide (and track) what they need to run their business, as long as they can argue that it's relevant to them and meets the requirement.

This is where an auditee has the right to challenge an auditor, who will at times make statements like "I think this should be..." or something similar - and we all know that happens.
 
P

patricks

#36
RoxaneB said:
How do these particular indicators add value to the QMS, Patrick? ...

Roxane,

Thanks for the discussion. Let's note that the purpose of my post was to offer the OP guidelines on KPIs for departments/processes, not to create an exhaustive set of metrics (that can't be done without being at the OP's company for a few days).

  • Time to close Corrective/Preventive actions: Assuming you are in the automotive business, this is not trivial. Most OEMs give you a day to implement containment and 15 days to submit a full 8D. You can take more time to perfect your root cause investigation and/or find an alternate CAPA. To your point, what if a company needs to engage in substantive R&D that takes half a year or more? The answer is: go ahead, and be sure to share it with the customer in your Quality Improvement Plan (QIP) - they will be thrilled that you are taking their concerns seriously. Does that mean we can dispense with a KPI for time to closure of CAPA? No. My guess is most automotive (or other regulated-industry) folks would disagree with dropping this KPI altogether.
    • FYI, when I was an SQE, one of my Tier 1 suppliers had a long supply chain, and it took about 20 days of transportation just for the defective part to reach the supplier's plant for analysis. How would you then meet the OEM's requirement to submit an 8D (with CAPA) within 15 days?

  • "Number of": If we are talking about "number of" in terms of customer concerns, you can slice and dice it a million ways. You can track an absolute number (of customer concerns, defective parts, or defects found at the customer's plant vs. in the field). You can further classify the defects as high, medium, or low risk. You can classify them as warranty or non-warranty. Instead of absolute numbers you could use PPM or percentages. We could go on and on here on The Cove. What makes sense for the OP's company is something only the OP can determine. In terms of "human mentality" playing a role (say, in reporting safety incidents), I couldn't agree more with you. Identifying KPIs and communicating those KPIs to your team are both important.

  • Recurrence: At our place, we would determine recurrence in terms of the process failure mode (linked to an individual PFMEA item). That's not an easy thing to do (especially if your PFMEAs need substantial work). For companies just starting to assign a root-cause bucket to their customer concerns, I would suggest starting with a broad list of categories rather than diving into the deep end of the pool and mapping customer defects to a PFMEA item.

  • Internal Auditing: Percentage completion, Frequency, Time to closure (days) for mitigations: Are we saying here that these KPIs shouldn't be used? If so, then I respectfully disagree. These are great indicators of the overall health of the internal audit system (internal audits here meant to include internal IATF-focused audits + manufacturing audits + Layered Process Audits). Are we saying we could add more KPIs to the mix to improve the effectiveness of internal audits? By all means.

  • Management review: Frequency, Objectives (i.e., what is discussed in management review? Answer: all these KPIs): If management doesn't do management reviews (or does one just before the external audit), then everything else in this section is pretty much a moot point, isn't it?

  • Maintenance KPIs: Uptime (% or hours), Unscheduled maintenance (hours), Percentage completion of scheduled maintenance, Critical parts inventory: Quoting Roxane: "These are more operational than traditional quality." Yes, yes, and yes again - these are operational! And how is that bad? I think we are being a little trivial if we are debating whether "uptime" is better than "downtime," so I am just going to pass.
For people interested in looking at this holistically: the parent metric here is OEE (Overall Equipment Effectiveness). Uptime is one of the three components of OEE (the other two being Performance and Quality). The calculation of OEE is not a subjective matter...OEE is defined by a mathematical formula. Let's say the OEE metric throws a red flag; then you drill down to see whether Performance, Uptime, or Quality is the culprit (there may be more than one). Let's say the culprit is uptime; then you drill down further to understand what within "uptime" is the culprit (sounds like 5 Whys).
There could be many layers within uptime. As some of the metrics I listed indicate, it could be that scheduled maintenance was not done, that the machine is down more for unscheduled maintenance, that the machine is starved of raw material, or that the associate is out smoking while the machine alarm is ON. In a real manufacturing plant there is no shortage of "stories". That's why we have KPIs. Because numbers don't lie; people do.
Also, FYI: if there is a "QMS" metric that doesn't actually serve "Operations," then we have taken things too far and been a bit overzealous by inventing unnecessary metrics. By "operations," I don't narrowly mean the folks who get their paycheck from the "production" or "operations" department of a company. Operations means factory operations - it encompasses quality, safety, production, engineering, R&D, and accounting.

  • Calibration KPIs: Some sort of calibration tracking method, i.e., a spreadsheet or software. Evidence of MSA during PPAP and periodically: No, a calibration sheet does not equate to a KPI...most people would get that. A calibration procedure would definitely help you achieve your calibration KPIs, though...I thought that was self-evident, so I didn't elaborate earlier. I do like your suggested KPI of "the number of times a quality defect was identified and attributed to a piece of equipment being out of calibration." That could also be done if you are doing the recurrence analysis mentioned above, where one of your broad root-cause categories is "Calibration".
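The OEE drill-down described in the maintenance bullet above boils down to standard arithmetic (OEE = Availability × Performance × Quality). The factor values below are made up purely for illustration:

```python
# OEE = Availability x Performance x Quality, each expressed as a fraction of 1.
def oee(availability, performance, quality):
    return availability * performance * quality

# Invented example: the machine ran 90% of planned time, at 95% of the ideal
# rate, producing 99% good parts.
factors = {"availability": 0.90, "performance": 0.95, "quality": 0.99}
score = oee(**factors)  # 0.84645

# Drill-down: if the OEE metric "throws a red flag," find the weakest factor
# and keep asking why (the 5-Whys pattern described above).
worst = min(factors, key=factors.get)  # "availability" -> dig into downtime causes
```

Note how a respectable-looking set of individual factors still multiplies out to roughly 85% OEE, which is why the composite metric flags problems that each factor viewed alone might not.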
I appreciate the discussion...I think I got a chance to add more clarity.
 

Jim Wynne

Staff member
Admin
#37
This is a subject that's dazzlingly attractive to those among us who overcomplicate things for a living. It's really pretty simple:
  • We have all identified the processes of the QMS.
  • By definition, a "process" has three components that should be carefully defined: Input, Processing, and Output.
  • In determining process effectiveness, the output is compared to the (carefully defined) requirements.
  • The mostly useless concept of "KPI" may be safely done away with if we know how the process is supposed to perform.
  • The goals/objectives of the process should be periodically reviewed and corrections should be made when necessary.
  • The output requirements are the "key indicators."
 

RoxaneB

Super Moderator
#39
^^^ Why track any metric that won't teach you anything? If it's tracked just to have an easy metric to meet, it's just wasted money and doesn't follow the intent of the standard.
Exactly. When I look at a proposed KPI, I ask "What if it goes red?" or "What if it doesn't hit target?". If my response is "So what?" or "Who cares?", then this is obviously not the right KPI for the process. If a result for a KPI does not meet target and there is an apathetic response to such an outcome, I'd really be questioning the value and relevancy of that KPI.

patricks said:
I appreciate the discussion...I think I got a chance to add more clarity.
I - and hopefully other folks reading your additional post - appreciate the clarity. Perhaps it was just the initial wording that left me questioning the value of your suggested metrics - with no context or examples, it can be difficult to see a different perspective. I stand by my response that some of them do offer questionable value to a QMS, but I will also be the first to say that if it works for your organization, then that's all that really matters.
 

Golfman25

Trusted Information Resource
#40
RoxaneB said:
Exactly. When I look at a proposed KPI, I ask "What if it goes red?" ...
That's just it. How many metrics have a reaction point? If it's just measurement for measurement's sake, then there isn't much value. We measure shipment performance. What's the difference between 97% and 98%? Not much. If it drops to 85%, I guarantee I'll know about it before the data is collected -- customer lines will shut down, and they know where I live. :)
 