MSA for Employee Skill (Training) example wanted

Jen Kirley

Quality and Auditing Expert
Leader
Admin
It sounds interesting but it seems to me the thing that's measured is the competency of assessors, not the people being assessed. Did I understand that correctly?

And it's absolutely worthwhile to use evaluations of trainers/assessors to improve a train-the-trainer program. I think this is too often overlooked and the result of this variation can be damaging to readiness and morale. There are books out there on the subject, such as Aligning Training for Results: A Process and Tools That Link Training to Business.

When you say CMMI Level 4, are you referring to the SEI CMMI Maturity Levels? The statistical models in MSA don't apply well to this exercise, though. I agree that a side-by-side comparison of evaluations can highlight variation, but I'm not sure this is the way to go either.

Training groups such as the Academy for Professional Excellence apply Kirkpatrick's four-level evaluation model for training program effectiveness. This Academy has put out a (broken link removed) with a lot of tools that illustrate the four levels - the group has added two more for their own use - see the links.

We in industry don't - or shouldn't - rely on managers to evaluate employee skills, especially when the employee is more skilled in a particular type of work or task than his or her manager. Competency is evaluated and measured in various ways, and has been discussed at length in the Cove.

But if it's evaluation of skills in managers you are interested in, I wholeheartedly agree with the principle but I don't have my reference materials in organizational development with me right now.
 

Jim Wynne

Leader
Admin

I think there's some confusion over what the OP is looking for, perhaps caused by the thread title. He wants to have managers do some form of assessment of employees, and then compare the results for consistency. It's not about training, per se, but rather the efficacy of the assessment process.
 

Atul Khandekar

Are you looking for attribute inter-rater agreement? If so, look up Cohen's kappa or Kendall's coefficient of concordance.
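
As a minimal sketch of what Cohen's kappa measures, assuming two assessors give the same set of employees a simple pass/fail judgment (the data below are hypothetical, and scikit-learn's cohen_kappa_score gives the same result ready-made):

```python
# Minimal Cohen's kappa sketch for two raters rating the same items.
# Assumes categorical (attribute) ratings; data are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must rate the same items")
    n = len(rater_a)
    # Observed agreement: share of items where the two raters match.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: chance of matching given each rater's
    # marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (freq_a[label] / n) * (freq_b[label] / n)
        for label in set(rater_a) | set(rater_b)
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical pass/fail calls from two assessors on ten employees.
assessor_1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
assessor_2 = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
print(round(cohens_kappa(assessor_1, assessor_2), 3))  # ~0.474 for this data
```

A kappa near 1 means the raters agree far more than chance alone would predict; a value near 0 means their agreement is no better than chance.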
 

Jen Kirley

Quality and Auditing Expert
Leader
Admin
I think I understand Jim's point, but at the least it's reinventing training evaluation, and it only scratches the surface of a very big subject called Knowledge Management.

If we want to establish some consistency in how people are assessed, I suggest the effort start with the assessment process itself and with setting out assessment expectations. We haven't talked about that much yet, but I can vouch that it's an area of critical need. A short paper called Assessing Employee Evaluation Report Writing shows some examples of evaluation skills. Like you said, unless we set out objective criteria the exercise is of little use.
 

class89

Thanks Jennifer, Jim, Atul and others. As Jim says, we want to have managers assess employees in areas like requirements engineering, design engineering, and code review, each area on a scale of 1 to 5 with objective criteria for what constitutes a 1 and what constitutes a 5. Based on the ratings given by two managers, we will compare the results for consistency. It's not about training, per se, but rather the efficacy of the evaluation process.

Atul: yes. We will be using attribute inter-rater agreement - Cohen's kappa or Kendall's coefficient of concordance.
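
A sketch of that comparison, assuming two managers rate the same ten employees on the 1-to-5 scale (the ratings below are made up). Quadratically weighted kappa is a common choice for an ordinal scale, since a 4-versus-5 disagreement should count less than a 1-versus-5 disagreement:

```python
# Hedged sketch: agreement between two managers' 1-5 skill ratings.
# Uses scikit-learn's cohen_kappa_score; the ratings are hypothetical.
from sklearn.metrics import cohen_kappa_score

manager_1 = [3, 4, 2, 5, 4, 3, 1, 4, 5, 2]  # ratings for ten employees
manager_2 = [3, 5, 2, 4, 4, 3, 2, 4, 5, 3]

print("Unweighted kappa:", round(cohen_kappa_score(manager_1, manager_2), 3))
print("Quadratic-weighted kappa:",
      round(cohen_kappa_score(manager_1, manager_2, weights="quadratic"), 3))
```

Kendall's coefficient of concordance (W) becomes the natural choice if more than two managers rate the same set of employees.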
 

ddddavidddd

Registered
Hi,

This thread has been very useful and informative, and this post in particular helped me out a lot. As a new member I am looking forward to more discussions like this.

Thanks
 