I have a question about the allowed correction factors for AMS 2750E heat treat furnace System Accuracy Tests (SATs). The pertinent section, 3.4.5.3, is still confusing to me as to how to handle all the corrections.
There are 5 examples of how to handle offsets and corrections, but I don't feel any of them adequately address what exactly gets subtracted/added when filling out the SAT report.
According to Figure 6, you are supposed to subtract any:
- manual correction factor from calibration of the instrument (Binst)
- manual thermocouple calibration correction factor (Btc)
- correction factor from the TUS offset of the control or recording instrument (Btus)
- test thermocouple correction factor (E)
- test instrument correction factor (F)
We already actively subtract the test instrument and resident SAT thermocouple correction factors from the test instrument reading, but I have questions about the controller calibration correction factor and the control thermocouple calibration correction factor. We are using SSI 9205 and older Dual Pro controllers on our furnaces, and I don't believe these are capable of having these two factors programmed into the instruments. I also don't believe it is feasible to ask the heat treaters to add a manual adjustment to the setpoint, which would be different for every setpoint. So I want to put forth another example SAT and hope for an answer:
Example: The furnace has an offset of -5 DegF (meaning a correction factor of +5 DegF; we use the same offset for both TUS and SAT). Say an SAT is run at 500 DegF (as displayed on the controller), where the type K control T/C calibration correction factor is +3.3 DegF, the SSI 9205 controller calibration correction factor is +1 DegF, the type N resident SAT test T/C correction factor is -2.5 DegF, the Fluke multimeter used to perform the test has an error of +0.2 DegF, and this test instrument gives a reading of 505 DegF. Should the math be:
Instrument Reading (A): 500 DegF
Furnace Offset (Btus): +5 DegF
Corrected Control Instrument Reading (C = A + Btus): 505 DegF
Indicated Test Instrument Reading (D): 505 DegF
Test T/C Correction Factor (E): -2.5 DegF
Test Instrument Correction Factor (F): +0.2 DegF
True Test Temperature (G = D + E + F): 502.7 DegF
SAT Difference (C - G): 2.3 DegF = PASS
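To make the two interpretations easy to compare, here is a minimal Python sketch of this first version of the math (the variable names mirror the SAT report labels but are my own, not from AMS 2750E):

```python
# Interpretation 1: only the TUS offset (Btus) is applied on the controller side.
# Values are from the example above.
A = 500.0       # instrument reading, DegF
B_tus = 5.0     # furnace offset of -5 DegF -> correction factor of +5 DegF
C = A + B_tus   # corrected control instrument reading -> 505.0

D = 505.0       # indicated test instrument reading
E = -2.5        # test T/C correction factor
F = 0.2         # test instrument correction factor
G = D + E + F   # true test temperature -> 502.7

print(round(C - G, 1))  # SAT difference -> 2.3 DegF
```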
Or should the math be:
Instrument Reading (A): 500 DegF
Furnace Offset (Btus): +5 DegF
Correction Factor from Instrument Cal (Binst): +1 DegF
Correction Factor from Control K T/C (Btc): +3.3 DegF
Corrected Control Instrument Reading (C = A + Btus + Binst + Btc): 509.3 DegF
Indicated Test Instrument Reading (D): 505 DegF
Test T/C Correction Factor (E): -2.5 DegF
Test Instrument Correction Factor (F): +0.2 DegF
True Test Temperature (G = D + E + F): 502.7 DegF
SAT Difference (C - G): 6.6 DegF = FAIL
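And the same sketch for the second interpretation, where Binst and Btc are also rolled into the corrected controller reading (again, the names and structure are my own illustration, not the standard's):

```python
# Interpretation 2: Binst and Btc are also applied on the controller side.
A = 500.0        # instrument reading, DegF
B_tus = 5.0      # furnace offset correction factor
B_inst = 1.0     # controller calibration correction factor
B_tc = 3.3       # control T/C calibration correction factor
C = A + B_tus + B_inst + B_tc   # corrected control instrument reading -> 509.3

D = 505.0        # indicated test instrument reading
E = -2.5         # test T/C correction factor
F = 0.2          # test instrument correction factor
G = D + E + F    # true test temperature -> 502.7

print(round(C - G, 1))  # SAT difference -> 6.6 DegF
```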
As you can see, this is the difference between passing and failing the SAT on a Class 5 furnace. So are you supposed to include Binst and Btc in the calculations if they aren't programmed into the controller and you didn't manually adjust the setpoint to run at 504.3 DegF?