Recording Monitoring and Measuring Readings - Record inspection gages we use?

Gary Foreman

Hi there, first time posting.

We are a fabrication and machining job shop: small lot sizes with a high mix. We will soon be transitioning to ISO 9001:2000. The problem I have is getting an accurate interpretation of clause 7.6: "In addition, the organization shall assess and record the validity of the previous measuring results when the (measuring) equipment is found not to conform to requirements." Does this mean that we have to record which inspection gages we use for every inspection when the operation is complete? For example, at the press brake operation we may use 12" calipers, a protractor, 24" calipers, and a 6" square. Once this operation is complete, do we record each of these inspection gages on a router or in a database? An operator may run 6 or more jobs a day. If so, this will be time consuming and costly. I would appreciate any help. Thank you in advance.

Gary :confused:
 
db

Calibration

First of all Gary, welcome to the Cove! :bigwave:

What is being required is that if a measuring/monitoring device is found to be out of calibration, you determine whether measurements made by that device were accurate. Let's say one of your mics is found to be off by .003". You will need to determine whether product accepted by that mic, before it was discovered to be off, was in fact acceptable. Likewise, product that was rejected by the mic would have to be rechecked to see if it was in fact nonconforming.

Now, this is an oversimplification, but I think it gets the point across.
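To make that concrete, here is a minimal Python sketch of the re-assessment, assuming the out-of-cal error is a simple fixed offset and that the readings were actually recorded somewhere. The offset, limits, and readings are all made up for illustration.

```python
# Hypothetical sketch: re-evaluating recorded readings after a mic is
# found to read 0.003" high. All names and numbers are illustrative.

MIC_OFFSET = 0.003           # error found at calibration (mic reads high)
LOWER, UPPER = 1.000, 1.010  # part tolerance limits, inches

recorded_readings = [1.002, 1.009, 1.0105, 0.999]  # as written on the router

for reading in recorded_readings:
    corrected = reading - MIC_OFFSET  # best estimate of the true size
    in_spec_as_recorded = LOWER <= reading <= UPPER
    in_spec_corrected = LOWER <= corrected <= UPPER
    if in_spec_as_recorded and not in_spec_corrected:
        print(f"{reading:.4f}: ACCEPTED but likely nonconforming - contain and recheck")
    elif not in_spec_as_recorded and in_spec_corrected:
        print(f"{reading:.4f}: REJECTED but likely good - recheck before scrapping")
    else:
        print(f"{reading:.4f}: disposition unchanged")
```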
 

CarolX

Trusted Information Resource
another way to do it

Hi Gary and Welcome,

I, too, work for a small fabricating shop, and we have been certified to ISO 9001:1994 (haven't done the transition thing yet). I don't know enough about how your system works, but let me tell you what we do here, and it may work for you.

We run small lots with a large mix of product. Our "long runs" never last more than a few hours. Sounds similar to what you are doing. We perform a "First Piece" and a "Last Piece" inspection, along with in-process inspections. If the calipers used by the operator are out of adjustment, AND defective parts were formed, this will be detected at the "Last Piece" inspection. No need to record any info on the router as to what gage was used.

Hope this helps a bit!
CarolX
 
gburns

The requirement is to record the results of the assessment. In other words, you have to determine whether the inaccuracy discovered during calibration would likely have had a significant effect upon ANY of the product verifications performed since the device was last known to be accurate (i.e., the last calibration date).

If there IS a likelihood of significant effect, you need to notify all the customers whose product was or may have been verified using that device. If you don't record which parts a specific device was used on, then for all practical purposes you have to tell every customer whose parts were running in your shop over the period of time the device may have been inaccurate.

If there is NOT a likelihood of any significant effect, you don't need to tell anyone.

The key to using this rationale is the acceptability of the words "likely" and "significant". They aren't specifically mentioned or used in the standard, but their use seems reasonable. For example, do you think there's any harm done to your products if a micrometer that may normally be required to be accurate within .001" is found to be inaccurate by a total of .002"? I doubt it. However, if it's found to be out .005", there may be more concern. The required "appropriate action" on the "product affected" may simply be to have Engineering/Quality management sign off on a document that says the amount of inaccuracy is negligible, and stop right there. Whatever you do, you'd need to document the criteria used to determine when an inaccuracy is insignificant in terms of product recall or customer advisement.

Or maybe not.
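If it helps, here is a rough Python sketch of the kind of documented "significance" screen described above. The 10% rule in it is an invented example criterion, not anything from the standard; the point is only that whatever rule you pick gets written down and the assessment gets recorded.

```python
# Hypothetical "significant effect" screen. The 10% threshold below is an
# invented example criterion - document whatever rule you actually use.

def oot_is_significant(oot_error: float, part_tolerance_band: float,
                       max_fraction: float = 0.10) -> bool:
    """Treat the out-of-tolerance error as significant if it consumes more
    than max_fraction of the total tolerance band it was used against."""
    return abs(oot_error) > max_fraction * part_tolerance_band

# Mic spec'd to +/-.001" found out by .002", used on a +/-.005" feature:
print(oot_is_significant(0.002, 0.010))  # True - 20% of the band, assess further
# Same error against a +/-.030" feature:
print(oot_is_significant(0.002, 0.060))  # False - record the rationale and stop
```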
 
Graeme

Re: Control of monitoring and measuring devices

Gary, welcome to the Cove!

First, I come from the calibration world, where recording the tools (standards) you use has been common practice for decades. So what seems like a heavy task to one is a routine cost of doing business to another. Here are a couple of solutions in addition to what the others have said. These are things that I have seen while benchmarking or auditing. They may or may not work for you, depending on how your company is set up.

One way I have seen would work if your people use a toolroom to get tools for each job. If this is the case, I have seen a couple of places that have a tool steel artifact made so that it has a number of different inside and outside dimensions and angles, along with a drawing of the known dimensions. (What they are depends on what you do.) The last one I saw was about the size of a coffee mug and was kept on the toolroom counter. It was measured weekly by the company cal lab. When a person gets a tool, he/she makes the appropriate measurements on the "thing" and logs the result. The measurements are repeated when the tool is turned in. If the before and after measurements are within limits, then the tool was probably OK during the time it was used. The log also records who used what tools when, and you can link that somehow to who was working on what job. Note that in this case the condition of the tool is probably known while the items that were worked on are still in the building.
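A minimal Python sketch of what that check-out/check-in log might look like, assuming a couple of artifact dimensions with known limits. The tool IDs, feature names, and file format are all invented for illustration.

```python
# Hypothetical toolroom "artifact" check. Dimensions, limits, and the log
# format are illustrative only.

import csv
import datetime

ARTIFACT = {"OD-1": (1.5000, 0.0010),  # feature: (nominal, +/- limit), inches
            "ID-1": (0.7500, 0.0010)}

def artifact_check(tool_id, operator, job, readings, event, log="tool_log.csv"):
    """Log a check-out ("OUT") or check-in ("IN") measurement set; return
    True if every reading on the artifact falls within the known limits."""
    ok = all(abs(readings[f] - nom) <= lim for f, (nom, lim) in ARTIFACT.items())
    with open(log, "a", newline="") as fh:
        csv.writer(fh).writerow(
            [datetime.datetime.now().isoformat(), event, tool_id,
             operator, job, "PASS" if ok else "FAIL"])
    return ok

# Before and after the job - if both pass, the tool was probably OK in between:
artifact_check("CAL-0123", "GF", "JOB-4567", {"OD-1": 1.5004, "ID-1": 0.7497}, "OUT")
artifact_check("CAL-0123", "GF", "JOB-4567", {"OD-1": 1.5006, "ID-1": 0.7494}, "IN")
```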

Another way may work if your operation is highly computerized. Barcode each tool -- on some systems you can add a barcode to the calibration label. Have a handheld scanner at or near the workstations. At the start of a job, the operator would just scan the barcode on each tool, the job number and his/her identifier. It all zips into the computer and the magic of software does all the drudgery for you! Here, the main risk is if a tool is found to be out of tolerance some time after the job is complete.
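Here is a hypothetical sketch of that barcode flow using SQLite, just to show how little drudgery is left once the scans are captured. The table layout, tool IDs, and job numbers are invented.

```python
# Hypothetical barcode-scan log: each scan ties a tool to a job, so an
# out-of-tolerance finding later becomes a single query. All names invented.

import sqlite3

db = sqlite3.connect("tool_usage.db")
db.execute("""CREATE TABLE IF NOT EXISTS scans
              (ts TEXT DEFAULT CURRENT_TIMESTAMP,
               tool_id TEXT, job_number TEXT, operator TEXT)""")

def record_scan(tool_id, job_number, operator):
    """Called when the operator scans a tool at the start of a job."""
    db.execute("INSERT INTO scans (tool_id, job_number, operator) VALUES (?,?,?)",
               (tool_id, job_number, operator))
    db.commit()

def suspect_jobs(tool_id, last_good_cal_date):
    """Jobs touched by this tool since it was last known to be in cal."""
    rows = db.execute("SELECT DISTINCT job_number FROM scans "
                      "WHERE tool_id=? AND ts>=?", (tool_id, last_good_cal_date))
    return [r[0] for r in rows]

record_scan("CALIPER-12", "JOB-4567", "GF")
print(suspect_jobs("CALIPER-12", "2023-01-01"))
```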

Both of the above are ways of recording which tools are used for each job. The other key thing is the significance of the out-of-tolerance event. How does the magnitude of the OOT value compare with the uncertainty of the measurement system the tool is used in, and with the tolerance of the parts being measured? You may need gage R&R studies for the first, and your drawings will give you the other. There will sometimes be cases where a tool was "out of tolerance" with respect to its own specifications, but was used on jobs where the allowable measurement tolerance was 50 or 100 or more times greater. A case like that is clearly not significant. Each case has to be checked, though. Sorry, but I don't know a way around that. (I just finished evaluating an OOT from one of my multifunction electronic calibrators. It took 5 minutes to get the list of tools touched by the calibrator over the past year, and then the whole rest of the day to check the tolerances on everything. The computer did the first part, but I had to do the rest myself ... and in the end nothing was "significant".)
 
dsudduth

Can anyone weigh in on whether the barcode reader/scanner should be under the calibration program? It is used to verify inner and outer labels, and to check against inventory. :thanx:
 
andygr

Last post first
If a device is used to establish compliance to requirements, then it must be calibrated. If the bar code being read is the identification required by design, then you are verifying compliance with the reader and it should be calibrated. For the bar code reader, it is really nothing more than a functional verification that it can truly read the bars (or patterns).
As with all calibration, the frequency is a risk-based decision based on the stability of the equipment in the environment where it is being used.
Beyond compliance to design, it would be good shop practice to verify function on some basis. How many misreads can you tolerate and still stay in business?

As far as recording tool numbers against production inspection operations, this once again becomes a cost-risk analysis. I would always record tool numbers for FAI operations, no exceptions.
For production operations, I would simplify it by assigning various tools to a bench or operation. I would not log them individually as each inspection occurs, but be aware that any of them assigned to that area could be used, based on operator preference. If during calibration a specific tool is found to be out of tolerance, then all the parts inspected at that bench since the last calibration would be suspect and have to be reviewed. Judge how likely this is and determine the potential cost. Balance this cost against the cost of logging each tool against each inspection, and you will have your answer. (A rough sketch of the idea follows below.)
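A short Python sketch of that bench-assignment approach, assuming you track which tools live at which bench and which jobs ran there (e.g., from the routers). All tool IDs, job numbers, and dates are invented.

```python
# Hypothetical bench-assignment lookup: tools are logged per bench, not per
# inspection, so one OOT finding flags every job run at that bench since
# the last good calibration. All data invented for illustration.

from datetime import date

bench_tools = {"PRESS-BRAKE-1": ["CALIPER-12", "PROTRACTOR-3", "SQUARE-6"]}
bench_jobs = {  # (job, date completed) per bench, e.g. pulled from the routers
    "PRESS-BRAKE-1": [("JOB-4501", date(2023, 3, 2)),
                      ("JOB-4567", date(2023, 3, 9))],
}

def jobs_at_risk(oot_tool, last_good_cal):
    """Every job run at any bench the OOT tool was assigned to,
    completed since the tool was last known good."""
    return [job
            for bench, tools in bench_tools.items() if oot_tool in tools
            for job, done in bench_jobs.get(bench, []) if done >= last_good_cal]

print(jobs_at_risk("CALIPER-12", date(2023, 3, 5)))  # ['JOB-4567']
```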
 

Jim Wynne

Leader
Admin
andygr said:
Last post first
If a device is used to establish compliance to requirements, then it must be calibrated. If the bar code being read is the identification required by design, then you are verifying compliance with the reader and it should be calibrated. For the bar code reader, it is really nothing more than a functional verification that it can truly read the bars (or patterns).
As with all calibration, the frequency is a risk-based decision based on the stability of the equipment in the environment where it is being used.
Beyond compliance to design, it would be good shop practice to verify function on some basis. How many misreads can you tolerate and still stay in business?

The question, it seems to me, is whether or not misreads can be ascribed to the scanner itself. I don't think so. The thing will either work or it won't, and verifying that it works doesn't mean that it has to be entered into the calibration system. The lights in the plant must be turned on in order for work to proceed, but the lights don't get "calibrated". They either work or they don't.
 