
Measuring temps with IR cameras is flawed!

I thought this post might be worth making because I've seen a lot of hardware reviewers using thermal (i.e. IR) cameras to check external temperatures (most recently in the "Backplates cool your videocard" LTT video, but I've seen a lot of other websites do it as well). I want to make it clear that I'm not writing this post to attack LTT or anybody else, but instead I'm doing it to offer a bit of knowledge to the handful of people out there who might actually be interested. So rather than doing real work, I'm going to give a mini IR camera physics lesson on an internet forum instead.

 

IR cameras don't have a way of directly measuring the temperature of objects in the way that a thermometer does. Instead, they measure the amount of thermal radiation coming from a surface (IR wavelengths in this case), and then use that intensity to calculate a temperature using what's known as the Stefan-Boltzmann Law. The equation looks like this:

 

T = (I/(e*A*s))^(1/4)

 

where T is the temperature, I is the intensity of the IR radiation entering the camera, e is the emissivity of the surface you're pointing the camera at, A is the area of that surface, and s is the Stefan-Boltzmann constant, which we can ignore here. The important thing to note is that the calculated temperature depends on other variables besides the measured intensity, namely e and A.
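
To make that concrete, here's a minimal Python sketch of the calculation the camera performs (the function and variable names are my own, and real cameras work from calibrated per-pixel radiance rather than a single intensity and area, but the idea is the same):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant s, in W / (m^2 * K^4)

def temperature_from_intensity(intensity, emissivity, area):
    """Invert I = e * A * s * T^4 for T (in kelvin)."""
    return (intensity / (emissivity * area * SIGMA)) ** 0.25
```

Feed the same measured intensity in with e = 0.9 versus e = 0.05 and you get wildly different temperatures, which is the whole problem.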

 

Now, the most important one, and the one I'm going to focus on, is the emissivity e. Emissivity is a property which varies from one material to another, and is essentially a measure of how closely a given material behaves like an ideal black body (i.e., how good the material is at radiating IR). It can range from 0 to 1, where 1 is equivalent to an ideal black body, and 0 means the object doesn't radiate IR at all. Most thermal cameras (including the FLIR ones that I think most tech reviewers use) assume that the emissivity is 0.9 or so. This means that if the object you're pointing the camera at actually has an emissivity of 0.9, then the temperature that the camera shows on-screen will be accurate. However, there are plenty of materials which do NOT have an emissivity of around 0.9. Metals are the most relevant example.

 

Metals tend to have very low emissivities (often less than 0.1). So what happens if we try to measure the temperature of, say, a copper surface with our camera? The emissivity of copper is around 0.05 (it varies a bit depending on the smoothness of the surface), but our camera assumes an emissivity of 0.9, which is WAY too high. This means that the temperature the camera calculates will be considerably lower than the true temperature (see the equation above).
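
Ignoring reflections for a moment, the size of the error is easy to work out: the measured intensity scales as e*T^4, so inverting with the wrong emissivity scales the reported temperature (in kelvin) by (e_true/e_assumed)^(1/4). A quick sketch, with made-up numbers:

```python
def apparent_temperature(true_temp_k, e_true, e_assumed=0.9):
    # The camera receives I proportional to e_true * T^4, but inverts
    # the Stefan-Boltzmann law assuming e_assumed instead.
    return true_temp_k * (e_true / e_assumed) ** 0.25

# Bare copper (e ~ 0.05) at 60 C (333.15 K):
print(apparent_temperature(333.15, 0.05))  # ~162 K, i.e. about -111 C
```

A real reading won't be that absurd, because reflected room radiation (see the next paragraph) fills in most of the missing signal.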

 

Metals have another problem: they are also good at reflecting IR wavelengths. This means that when you point your camera at that piece of copper, some of the IR radiation it measures is actually coming from somewhere else in the room and simply reflecting off of the copper surface into the camera.

 

These two different phenomena combine to make it look like the piece of copper is colder than its surroundings, when in reality everything is the same temperature. This is exactly what happened in the IR images shown in LTT’s “Backplates cool your videocard” video. In the images it looks like the copper region is much colder than the surrounding backplate, when in reality it was probably as hot or hotter.
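
Here's a toy model of the two effects combined: the surface emits e*s*T^4, reflects (1 - e)*s*T_room^4, and the camera compensates for the reflected component using its assumed emissivity (FLIR cameras do something like this via their "reflected temperature" setting). The numbers are illustrative, not taken from the video:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def camera_readout(true_k, room_k, e_true, e_assumed=0.9):
    # What the surface actually sends toward the camera (per unit area):
    radiance = e_true * SIGMA * true_k**4 + (1 - e_true) * SIGMA * room_k**4
    # The camera subtracts its estimate of the reflected part, then
    # inverts the Stefan-Boltzmann law with its assumed emissivity:
    emitted = radiance - (1 - e_assumed) * SIGMA * room_k**4
    return (emitted / (e_assumed * SIGMA)) ** 0.25

# Two surfaces, both at 60 C, in a 25 C room:
print(camera_readout(333.15, 298.15, 0.05) - 273.15)  # copper: reads ~27 C
print(camera_readout(333.15, 298.15, 0.90) - 273.15)  # painted plate: ~60 C
```

The copper reads barely above room temperature even though it's exactly as hot as the plate next to it.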

 

So there are a few conclusions we can draw here. First, exact temperatures measured with a thermal camera should be taken with a grain of salt. If the emissivity the camera assumes is different from the emissivity of the object you're measuring, then the temperature given by the camera will be incorrect.

 

Second, we CAN compare relative temperatures between different regions, if we do it carefully. This can be done by putting a piece of non-shiny tape on each of the surfaces you're trying to compare. The tape will equalize to the temperature of whatever it's attached to, and because you have the same tape on both surfaces (e.g., a piece of copper and a piece of plastic), the emissivities will also be the same (because now it's the tape radiating in both cases) and the temperatures of the two objects will be directly comparable.

You can even take this a step further if you want to get accurate temperatures. If your camera allows you to manually set the emissivity of what you're looking at (many FLIR cameras do), you can determine the emissivity of the tape you're using by putting some tape on a surface, measuring its temperature with a normal thermometer, and then adjusting the emissivity in the camera's settings until the on-screen temperature matches the true temperature. Once you know the emissivity of your tape, you can use that emissivity in your camera's settings to accurately measure the temperature of anything you put the tape on in the future.
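
For the curious, the "adjust the emissivity until the readout matches the thermometer" step has a closed-form equivalent under the same toy model as above (the radiance input is hypothetical; consumer cameras don't usually expose raw radiance, which is exactly why the knob-turning procedure exists):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def tape_emissivity(radiance, t_contact_k, t_room_k):
    # From radiance = e*s*T^4 + (1 - e)*s*T_room^4, solve for e, given the
    # contact-thermometer reading T and the room (reflected) temperature.
    return (radiance / SIGMA - t_room_k**4) / (t_contact_k**4 - t_room_k**4)

# Sanity check: tape with e = 0.95 at 50 C in a 25 C room round-trips:
r = 0.95 * SIGMA * 323.15**4 + 0.05 * SIGMA * 298.15**4
print(tape_emissivity(r, 323.15, 298.15))  # ~0.95
```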


You learn something new every day. Thanks Sebastian, very informative!


Some YouTubers are well aware of this and use IR imaging just to get an idea of the temperature zones, not to take actual quantitative measurements.



This was a good read, and after watching the video again, it's cool to know that we can't be sure that the copper backplate was actually a "cold spot" - it may have just had a low emissivity. In fact, the copper could have even been hotter than everything else yet appear colder in this manner - I love these counterintuitive physical phenomena.

 

That said, I'm pretty sure that when they got their temp readings they used the card's built-in sensors and software to read the temperature, which doesn't rely on the thermal camera; the cameras were just there for a visual (if maybe misleading) representation. The final result of the video is still valid.



8 hours ago, Peepnbrick said:

I'm pretty sure that when they got their temp readings they used built in sensors on the card

I'm sure you're right, and I know LTT also uses similar sensors to measure case temps (e.g., in the Workshop episode where Luke tried filling the computer case with random junk to see how it would affect temps). I just mentioned it here because I've seen other tech sites showing IR photos of hardware where they've labeled specific temperatures at specific points and then said something like, "the IR camera says it's only 60 C on the backside of the PCB." That may be what the camera says, but that doesn't necessarily mean it's true. #alternativefacts

 

Anyway, glad you enjoyed the post! I wasn't sure if ANYBODY would care haha.

