How do these particular indicators add value to the QMS, Patrick?
Time to close Corrective/Preventive actions in under 14 days is, in my experience, naive. A large, systemic issue may require significantly more time, money, and people than can be allocated in 14 days (all while the organization continues to operate and meet customer requirements). In fact, at my current organization it can take up to six months to close these items, including the verification period.
I also disagree with the "number of" metric. It's a double-edged sword. Sure, no one LIKES to be on the receiving end of a Corrective/Preventive action, but it's also important to identify and address them. Some companies set "near misses = 0" as a safety metric, but this ends up making people afraid to report them. The same mentality can take hold with a "number of" indicator in quality.
Recurrence isn't too bad; however, you need to be very clear about what constitutes a recurrence. Is it the same clause? The same root cause? The same description? The same process? What if Process A has a document control finding and fixes it...and then a few months later Process B has a document control finding? Is this a recurrence?
Your list of internal audit KPIs reads more like a checklist - Did we do the audit? Did we do the right number of audits? Did we close all audit issues? How do these speak to the overall health and value of the internal audit system? You could consider "on-time completion of audits" if you're set on having a completion metric. You could also survey auditee experience (similar to a satisfaction score). Rather than a metric about resolving issues, find a way to capture whether results were disputed (i.e., did auditees agree with and understand the value of the finding(s)?)
Management review - Does it really need a metric? In my experience, the majority of your metrics should be addressed, resolved, and corrected (where needed) by process owners PRIOR to management review. I'm not saying that management review is a pointless exercise - done properly, it can add a lot to an organization by presenting a coherent picture of the organization rather than flashing a bunch of charts, graphs, and tables that no one quite understands.
Maintenance - These are more operational than traditional quality metrics, but they do at least speak to what's going on down on the floor. I would offer that, from a quality perspective, uptime (% or hours) does not drill down deep enough. It may be good for Maintenance, but that's not the story we're interested in. While I applaud the positive slant of the indicator, I'd suggest we flip it over to downtime. The line could go down 30 times in a month for 2 minutes each, and then once for 60 minutes. Both equal 1 hour, giving us a total downtime of 2 hours. But so what? Is 2 hours good or bad? Which is more of a concern - the 30 hiccups of 2 minutes or the single full-hour session? In a previous life, we acknowledged that downtime was downtime, but we looked at our data and decided to focus on getting the larger downtime sessions under control (e.g., any unscheduled downtime > 1.5 hours required a corrective action). After a year, we re-analyzed our data, saw that the larger downtime sessions had decreased, and tightened our trigger for a corrective action (e.g., any unscheduled downtime > 1 hour). We repeated this year over year, tightening our processes and gaining better control.
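If anyone wants to play with this idea against their own downtime logs, here is a minimal sketch of the analysis described above. The event durations and the corrective-action trigger are hypothetical illustrations, not data from any real line:

```python
# Hypothetical month of unscheduled downtime events, in minutes:
# 30 short hiccups of 2 minutes, one 60-minute stop, and one 2-hour outage.
downtime_events_min = [2] * 30 + [60, 120]

# The total alone hides the distribution of events.
total_min = sum(downtime_events_min)
print(f"Total downtime: {total_min / 60:.1f} h across {len(downtime_events_min)} events")

# Year-1 trigger: any unscheduled downtime > 1.5 hours requires a corrective action.
threshold_min = 90
flagged = [d for d in downtime_events_min if d > threshold_min]
print(f"Events over {threshold_min} min requiring corrective action: {len(flagged)}")

# As the large sessions come under control, tighten the trigger year over year
# (e.g., 1.5 h -> 1 h), keeping the corrective-action effort focused where it matters.
```

The point of the sketch is that the 30 hiccups and the long outage contribute similar amounts to the total, but only the long sessions trip the trigger - which is exactly the prioritization described above.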
Calibration - A tracking sheet does not equate to a KPI. What would be more interesting is the number of times a quality defect was identified and attributed to a piece of equipment being out of calibration.
Hi Roxane, I'm relatively new to Quality. Please can you advise where in the QMS the Quality Objectives should be located?