DaveC said:
I know this subject of "cheap" and "calibrate" used in the same sentence may
well be anathema to some of you but I need to verify that either my IR temp
gun is accurate or my DMM/thermocouple is, or neither. Accuracy to 2 or 3
degrees F is fine.
I'm looking for suggestions for a simple way to provide some kind of common
temperature "standard" (I use the term loosely, here) I can compare these
against.
Thanks,
Having suggested in an earlier post that water and ice might not be "friendly" sources for IR pyrometer calibration, I'd recommend reading the following thermometer calibration guide:
http://www.oznet.ksu.edu/library/fntr2/mf2440.pdf
While the guide provides detailed methods for using water in the calibration of bi-metal, thermocouple, thermistor, and other thermometers, it says the following about IR thermometer calibration:
IR thermometers are calibrated using a “blackbody,” which emits a given amount of energy at a given temperature. A blackbody calibration instrument is expensive. However, most manufacturers of NIST-traceable IR thermometers provide a calibration service for a nominal fee, with yearly calibration and certification.
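To see why the blackbody matters, here's a rough sketch (my own, not from the guide) of how an emissivity mismatch alone skews an IR reading. It uses the simplified total-radiation (Stefan-Boltzmann T^4) model and ignores reflected ambient radiation; the emissivity values are assumed for illustration:

```python
def apparent_temp_k(true_temp_k, target_emissivity, instrument_emissivity):
    """Temperature (K) an IR thermometer reports when the target's actual
    emissivity differs from the emissivity the instrument assumes.
    Simplified total-radiation model: reported T scales as
    (e_target / e_instrument) ** (1/4)."""
    return true_temp_k * (target_emissivity / instrument_emissivity) ** 0.25

# Ice bath at 0 C (273.15 K); ice emissivity ~0.97 (assumed value),
# instrument fixed at a common default setting of 0.95.
t_read = apparent_temp_k(273.15, 0.97, 0.95)
print(f"reported: {t_read - 273.15:+.2f} C")
```

Even that ~2% emissivity mismatch eats a degree or more of DaveC's 2-3 degree F budget, before surface reflections and evaporative cooling are counted.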
On the other hand, AEMC (an instrument manufacturer) offers the following water/ice calibration technique:
http://www.aemc.com/techinfo/appnotes/EnvironmentalTesters/CA870_872_876_CalProcedure.pdf
Please note AEMC's concept of acceptable errors using these standards! That should explain why serious calibration requires a cavity-type blackbody source.
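One error source worth quantifying on the thermocouple/DMM side: the ice point is solid, but the boiling point moves with barometric pressure and altitude. A hedged sketch (mine, not AEMC's) of that correction, using the Antoine equation for water with the commonly tabulated constants:

```python
import math

def boiling_point_c(pressure_kpa):
    """Boiling point of water (deg C) at a given ambient pressure, via the
    Antoine equation. Constants are the commonly published values for water
    over roughly 1-100 C; pressure is converted to mmHg to match them."""
    a, b, c = 8.07131, 1730.63, 233.426
    p_mmhg = pressure_kpa * 7.50062  # kPa -> mmHg
    return b / (a - math.log10(p_mmhg)) - c

print(f"sea level (101.325 kPa): {boiling_point_c(101.325):.1f} C")
print(f"~1600 m altitude (83.4 kPa): {boiling_point_c(83.4):.1f} C")
```

At roughly 1600 m of altitude the boiling point is already about 5 C (9 F) low, so using "212 F" uncorrected would swamp the accuracy DaveC is after.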
One can usually calibrate an instrument using a variety of standards, but it is prudent to understand the errors each introduces.
It's yet another instance of the "good, cheap, or fast - choose any two" constraint.
Chuck