Hi all,
To perform the temperature calibration of a furnace, we always take the sensors out, put them in a liquid bath,
and calibrate the whole loop as one instrument.
This means downtime for the furnace.
To reduce that downtime, we have been asked whether we can perform a partial calibration on parts of the loop.
As we will be adding extra uncertainty to the calibration (by using a multifunction DMM), this will only be applied to furnaces with low criticality.
So we are evaluating the use of a second set of temperature sensors with 4-20 mA transmitters:
calibrate those in our liquid bath while a multifunction DMM reads out the milliamps,
then use the DMM to simulate those milliamps into the DCS card and check the temperature on the control panel.
Afterwards, replace the first set with the second set, and then calibrate the first set in the liquid bath, again read out by the DMM.
But: how to set tolerances?
Do we split the tolerance equally between the sensor & transmitter and the DCS?
Or do we take the actual milliamps and only check that the readout on the control panel is in specification?
In more detail:
The furnace is a type that cycles through heating and active cooling.
Calibration of the entire loop is performed over the range 0 - 100 °C with 6 data points (0; 20; 40; 60; 80; 100).
The specification for the entire loop, seen as one instrument, is plus minus 1 °C.
0 - 100 °C corresponds to 4 - 20 mA, so the spec in milliamps can be taken as plus minus 0,16 mA.
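The scaling arithmetic above can be sketched in a few lines of Python (function names are mine, just for illustration): 16 mA of span over 100 °C gives 0,16 mA per degree, which is where the ±0,16 mA figure comes from.

```python
# Linear 0-100 degC <-> 4-20 mA scaling used for the loop spec.
# Span: 16 mA over 100 degC -> 0.16 mA per degC, so +/-1 degC ~ +/-0.16 mA.

def temp_to_ma(temp_c: float) -> float:
    """Ideal 4-20 mA signal for a temperature in the 0-100 degC range."""
    return 4.0 + (temp_c / 100.0) * 16.0

def ma_to_temp(ma: float) -> float:
    """Temperature in degC corresponding to a 4-20 mA signal."""
    return (ma - 4.0) / 16.0 * 100.0

# Theoretical mA for the six calibration points:
for t in (0, 20, 40, 60, 80, 100):
    print(t, "degC ->", temp_to_ma(t), "mA")
# 0 -> 4.0, 20 -> 7.2, 40 -> 10.4, 60 -> 13.6, 80 -> 16.8, 100 -> 20.0
```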
Option 1:
Measure the sensor in the liquid bath with the DMM: use a tolerance of 0,08 mA compared to the theoretical milliamps.
Simulate the theoretical milliamps with the DMM to the PLC card: use a tolerance of 0,5 °C on the control readout.
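A minimal sketch of the Option 1 split-tolerance check (function names are hypothetical). Note that 0,08 mA corresponds to 0,5 °C, so this option splits the ±1 °C budget exactly in half between the field part and the DCS part.

```python
# Option 1: each half of the loop is judged against its own tolerance.

def field_part_ok(measured_ma: float, theoretical_ma: float,
                  tol_ma: float = 0.08) -> bool:
    """Sensor + transmitter in the bath: measured mA vs theoretical mA."""
    return abs(measured_ma - theoretical_ma) <= tol_ma

def dcs_part_ok(panel_temp_c: float, simulated_temp_c: float,
                tol_c: float = 0.5) -> bool:
    """DCS/PLC card: panel readout vs the simulated theoretical point."""
    return abs(panel_temp_c - simulated_temp_c) <= tol_c

# Example at the 40 degC point (theoretical 10.4 mA):
print(field_part_ok(10.46, 10.4))  # 0.06 mA deviation, within 0.08 mA
print(dcs_part_ok(40.3, 40.0))     # 0.3 degC deviation, within 0.5 degC
```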
Option 2:
Measure the sensor in the liquid bath with the DMM and note down the milliamps.
Simulate the real values recorded with the DMM into the PLC card.
Tolerance on the readout: 1 °C.
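The Option 2 end-to-end check could look like this (again a sketch, names are mine): the mA recorded in the bath is replayed into the card, and only the panel readout is judged, against the bath reference, to the full ±1 °C.

```python
# Option 2: one tolerance for the whole loop, no budget for individual parts.

def loop_ok(panel_temp_c: float, bath_ref_temp_c: float,
            tol_c: float = 1.0) -> bool:
    """Panel readout (recorded mA replayed into the card) vs bath reference."""
    return abs(panel_temp_c - bath_ref_temp_c) <= tol_c

# e.g. bath at 60.0 degC, sensor + transmitter output 13.55 mA (slightly low);
# 13.55 mA is simulated into the card and the panel reads 59.8 degC:
print(loop_ok(59.8, 60.0))  # True: the loop as a whole is within 1 degC
```

This is where drift in one part can cancel drift in another: the card's own offset partly compensated the transmitter's low output in the example.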
Any thoughts on these two options?
I prefer option 2, as it does not stipulate how much drift each individual part of the loop is allowed,
and in the best case drift in one part is cancelled out by another part of the loop.
But it means that after we have calibrated the first set (once it has been replaced by the second set), we will have to do one more simulation of those recorded values to the PLC card.
And what if we need to adjust settings on the PLC card during the change from sensor set one to sensor set two?
Perform an extra simulation of the theoretical milliamps so we can investigate the impact once we have the results of sensor set one?