MSA is a fairly complex subject, and reports could take many forms. Audits of them should be done only by those who have a firm understanding of the requirements and technical details. If you could be more specific about what you're looking for, someone might be able to help.
I agree with Jim. I can point out a few of the more important items, but this is not all inclusive.
Is the gage calibrated?
Is the gage used over a wide range of measurements? If so, was a linearity study performed?
Is the gage sensitive to ambient conditions or time in use? If so, was a stability study performed?
How were the sample parts selected? If the study is for a gage used for SPC, the parts must represent actual product variation, not selected from the extremes of the tolerance.
How will the gage be used? For inspection, for SPC, or both? Use the R&R metric appropriate to the gage use:
Inspection only: use % Tolerance.
SPC only: use % Study Variation, or preferably % Process Variation. ndc may be used as well, but it is redundant with %SV.
Inspection and SPC: use both % Tolerance and % Study Variation (same comments as above).
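To make the metrics above concrete, here is a minimal sketch of the standard AIAG-style calculations. The standard deviations and tolerance limits below are made-up illustrative numbers, not data from any real study:

```python
import math

# Hypothetical results from a gauge R&R study (illustrative values only)
grr_sd = 0.012   # combined repeatability & reproducibility std dev
pv_sd  = 0.045   # part-to-part variation std dev
tv_sd  = math.sqrt(grr_sd**2 + pv_sd**2)  # total study variation

usl, lsl = 10.5, 10.0  # tolerance limits (hypothetical)

# % Tolerance: gauge variation as a fraction of the tolerance band
pct_tolerance = 100 * (6 * grr_sd) / (usl - lsl)

# % Study Variation: gauge variation as a fraction of total study variation
pct_sv = 100 * grr_sd / tv_sd

# ndc: number of distinct categories the gauge can resolve in the process
ndc = int(1.41 * pv_sd / grr_sd)

print(f"%Tolerance = {pct_tolerance:.1f}%")
print(f"%SV        = {pct_sv:.1f}%")
print(f"ndc        = {ndc}")
```

Note how ndc is computed from the same two standard deviations as %SV, which is why it adds no independent information.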
Do the operators used for the R&R study actually use the gages in the use environment?
Is the study performed blind? That is, operators should not be able to see or remember prior measurements taken by themselves or by other operators.
Does the gage have sufficient resolution? Poor resolution will give invalid results. Assess resolution from the R&R range chart: you must be able to resolve at least 5 distinct levels within the UCL of the R chart, and fewer than 25% of the ranges should equal zero.
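The resolution checks above can be sketched in a few lines. The ranges below are hypothetical values from a study with 2 trials per operator/part combination, so the control-chart constant D4 = 3.267 applies:

```python
# Hypothetical ranges from an R&R study (2 trials, so D4 = 3.267)
ranges = [0.002, 0.004, 0.0, 0.002, 0.006, 0.004, 0.002, 0.0, 0.004, 0.006]

r_bar = sum(ranges) / len(ranges)
ucl_r = 3.267 * r_bar  # upper control limit of the R chart

# Distinct range values falling below the UCL
levels = {r for r in ranges if r <= ucl_r}

# Fraction of ranges that are exactly zero
pct_zero = 100 * ranges.count(0.0) / len(ranges)

print(f"distinct levels below UCL_R: {len(levels)}")
print(f"% of ranges equal to zero: {pct_zero:.0f}%")

# Both criteria must hold for the gage to show adequate resolution
ok = len(levels) >= 5 and pct_zero < 25
```

With these particular numbers only 4 distinct levels appear, so the gage would fail the first criterion even though only 20% of the ranges are zero, which is exactly the kind of marginal case an auditor should look for.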
Assuming you have access to the above, I urge you to refer to pages 37-39. This provides a general framework to be used whilst constructing a measurement development checklist up front; many of these topics can be used later in the audit process as well.
So, here's my thing. I don't believe that anyone should audit something that they do not have a deep understanding of. This is especially important when auditing technical things such as gauge R&R (or special processes or corrective actions or risk assessment...). If you don't truly and deeply understand a topic, how can the audit be anything more than a rote comparison to a standard? Where do knowledge, reason and logic enter the picture? Miner's list of excellent questions requires deep understanding to answer. Isn't the rote approach the reason for so many of the incorrect NCs that we discuss, and rail against, so frequently here at the Cove?
So, here's my thing. I don't believe that anyone should audit something that they do not have deep understanding of. This is especially important when auditing technical things such as gauge R&R (or special processes or corrective actions or risk assessment...)
Amen, and amen. The automotive industry has created an ugly monster in its technical requirements, issued without regard for whether suppliers have the necessary technical acumen to fulfill them. Add to this the fact that they require use of voodoo math, in which hardly anyone is an expert, and you have a formula for vast amounts of wasted time and money.