Hello!
I'm trying to calibrate a thermometer (Fluke 52 II) against the simulated output of a Fluke 5520A.
The problem is that there is about 2-2.5 °C of error!
If I set 0 °C, it starts out reading around -1 °C, and after a couple of minutes it drifts to about -2.5 °C (it seems the 5520A's compensation keeps changing the output millivoltage).
I get the same error with my Fluke 741B.
If I put a thermocouple in a thermal bath, it measures perfectly.
So where is my mistake? I'm doing something wrong, but I don't understand what.
I don't want to measure with a thermocouple, because I need to calibrate the instrument, not the thermocouple.
Thanks to anyone who can help me.
