Sebastian

Member
  • Posts

    38
  • Joined

  • Last visited

Reputation Activity

  1. Informative
    Sebastian got a reaction from Rambo in Measuring temps with IR cameras is flawed!   
    I thought this post might be worth making because I've seen a lot of hardware reviewers using thermal (i.e. IR) cameras to check external temperatures (most recently in the "Backplates cool your videocard" LTT video, but I've seen a lot of other websites do it as well). I want to make it clear that I'm not writing this post to attack LTT or anybody else, but instead I'm doing it to offer a bit of knowledge to the handful of people out there who might actually be interested. So rather than doing real work, I'm going to give a mini IR camera physics lesson on an internet forum instead.
     
    IR cameras don't have a way of directly measuring the temperature of objects in the way that a thermometer does. Instead, they measure the amount of thermal radiation coming from a surface (IR wavelengths in this case), and then use that intensity to calculate a temperature using what's known as the Stefan-Boltzmann Law. The equation looks like this:
     
    T = (I / (e*A*s))^(1/4)
     
    where T is the temperature, I is the intensity of the IR radiation entering the camera, e is the emissivity of the surface you're pointing the camera at, A is the area of that surface, and s is just a constant which we can ignore here. The important thing to note here is that there are other variables which go into this calculation, namely e and A.
     
    Now, the most important one, and the one I'm going to focus on is the emissivity e. Emissivity is a property which varies from one material to another, and is essentially a measure of how well a given material behaves like an ideal black body (i.e., how good the material is at radiating IR). It can range from 0 to 1, where 1 is the equivalent of an ideal black body, and 0 means that the object doesn't radiate any IR radiation at all. Most thermal cameras (including the FLIR ones that I think most tech reviewers use) assume that the emissivity is 0.9 or so. This means that if the object you're pointing the camera at actually has an emissivity of 0.9, then the temperature that the camera shows on-screen will be accurate. However, there are plenty of materials which do NOT have an emissivity of around 0.9. Metals would be the most relevant example.
     
    Metals tend to have very low emissivities (often less than 0.1). So what happens if we try to measure the temperature of, say, a copper surface with our camera? The emissivity of copper is around 0.05 (varies a bit depending on smoothness of the surface), but our camera is assuming that the emissivity is 0.9, which is WAY too high. This means that the temperature that the camera calculates will be considerably lower than the true temperature (see the equation above). 
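     
    To put rough numbers on this, here's a small Python sketch of the effect. The values are illustrative assumptions (a 70 °C plate, emissivity figures as quoted above), and it ignores reflected background radiation, which is covered in the next paragraph:
     
    # Illustrative sketch: what a camera reports when its assumed emissivity is wrong.
    # Reflected ambient IR is ignored here, so a real reading would be less extreme,
    # but the direction of the error is the same.

    def apparent_temperature(t_true_k, e_actual, e_assumed=0.9):
        # The camera sees radiance proportional to e_actual * T_true**4, but inverts
        # it assuming e_assumed, so T_apparent = T_true * (e_actual / e_assumed)**0.25.
        return t_true_k * (e_actual / e_assumed) ** 0.25

    t_true = 273.15 + 70.0                                   # a 70 degC surface
    t_paint = apparent_temperature(t_true, e_actual=0.90)    # matte paint, e ~ 0.9
    t_copper = apparent_temperature(t_true, e_actual=0.05)   # bare copper, e ~ 0.05

    print(f"Painted surface reads {t_paint - 273.15:.0f} C (correct)")
    print(f"Bare copper reads {t_copper - 273.15:.0f} C (wildly low)")
     
    In practice the camera would show something closer to room temperature rather than such an absurdly low value, because reflected IR from the surroundings fills in part of the "missing" signal, which brings us to the next point.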
     
    Metals have another problem, and that is that they are also good at reflecting IR wavelengths. This means that when you point your camera at that piece of copper, some of the IR radiation that it's measuring is actually coming from somewhere else in the room and simply reflecting off of the copper surface and into the camera. 
     
    These two different phenomena combine to make it look like the piece of copper is colder than its surroundings, when in reality everything is the same temperature. This is exactly what happened in the IR images shown in LTT’s “Backplates cool your videocard” video. In the images it looks like the copper region is much colder than the surrounding backplate, when in reality it was probably as hot or hotter.


     
    So there are a few conclusions we can make here. First, take any exact temperatures measured with a thermal camera with a grain of salt. If the emissivity that the camera assumes is different from the emissivity of the object you're measuring, then the temperature given by the camera will be incorrect.


     
    Second, we CAN compare relative temperatures between different regions, if we do it carefully. This can be done by putting a piece of non-shiny tape on each of the surfaces that you're trying to compare. The tape will equalize to the same temperature as whatever it's attached to, and because you have the same tape on both surfaces (e.g., a piece of copper and a piece of plastic), the emissivities will also be the same (because now it's the tape radiating in both cases), so the temperatures of the two regions are directly comparable. You can even take this a step further if you want accurate temperatures. If your camera allows you to manually set the emissivity of what you're looking at (many FLIR cameras do), you can determine the emissivity of the tape you're using by putting some tape on a surface, measuring its temperature with a normal thermometer, and then adjusting the emissivity in the camera's settings until the on-screen temperature matches the true temperature. Once you know the emissivity of your tape, you can use that value in your camera's settings to accurately measure the temperature of anything you put the tape on in the future.
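     
    If you'd rather not hunt for the right emissivity by trial and error, the same one-point calibration can be approximated in a single step from the Stefan-Boltzmann relation. A minimal sketch with made-up numbers, again neglecting reflections:
     
    def estimate_emissivity(t_reported_k, t_reference_k, e_assumed=0.9):
        # The camera reading satisfies e_assumed * T_reported**4 ~ e_tape * T_reference**4,
        # so the tape's emissivity is roughly e_assumed * (T_reported / T_reference)**4.
        return e_assumed * (t_reported_k / t_reference_k) ** 4

    t_reported = 273.15 + 58.0    # what the camera shows for the tape at e = 0.9
    t_reference = 273.15 + 60.0   # what a contact thermometer says the tape really is

    print(f"Estimated tape emissivity: {estimate_emissivity(t_reported, t_reference):.2f}")
     
    Whichever way you get the number, dial it into the camera and spot-check it against the thermometer once more before trusting it.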

  2. Informative
    Sebastian got a reaction from Peepnbrick in Measuring temps with IR cameras is flawed!   

  3. Informative
    Sebastian got a reaction from ElZamo92 in Measuring temps with IR cameras is flawed!   

  4. Informative
    Sebastian got a reaction from WkdPaul in Measuring temps with IR cameras is flawed!   

  5. Informative
    Sebastian got a reaction from Jetfighter808 in Measuring temps with IR cameras is flawed!   

  6. Informative
    Sebastian got a reaction from Organized in Measuring temps with IR cameras is flawed!   

  7. Like
    Sebastian got a reaction from Fattedd in 33% more FPS in BF1 on SSD vs HDD!?   
    Well now I've finally had the time to do a bit of testing, and I can say that I saw a negligible difference in framerates for both Overwatch and Battlefield 1. I'm using a Core i5 4670K @ 4.2 GHz (sad, I know haha) and a GTX 1080 Founders Edition, with 8 GB of RAM. I tested both games at max settings at 1440p.
     
    Luckily, however, there WERE two good things to come out of this. First of all, since I tested the games on HDD first, and SSD second, both games are now gonna stay on the SSD, since I DID obviously notice a big improvement in loading times. Second, during my testing I noticed that ticking the "epic" preset in Overwatch caused the game to automatically set my resolution scale to 122%, which corresponds to around 42% more pixels. When I manually changed this to 100%, my average fps went from the 90-100 fps range up to the 135-145 fps range. Hooray!
     
    I recommend for all you Overwatch players out there that you take a quick look in your graphical settings to see if the game automatically applied a >100% resolution scale.
  8. Agree
    Sebastian got a reaction from Snadzies in The Division - Ubisoft's Fastest-Selling Game   
    I don't think that people who are interested in this game are expecting it to be some sort of simulator-type game. It's an online RPG, so of course you have bullet-spongey enemies. Do people shy away from WoW because a guy surviving five fireballs to the face is unrealistic and "kills the immersion"? No, of course not! The thing that makes these games enticing (for some people, not everybody) is the progression: getting loot, making yourself more powerful, fortifying your base, etc. If you want immersion, go play a military sim.
  9. Agree
    Sebastian got a reaction from Luraguse in The Division - Ubisoft's Fastest-Selling Game   
    I don't really understand why so many people get their panties in a bunch over this stuff. I don't care if you don't like a particular game, it's not going to affect my enjoyment in the slightest. So many people seem to believe that a game's enjoyability is an objective value, when it is (by definition) subjective. You don't like Destiny or The Division or any other game? Fine, don't play them. But don't be an ass hole and go out telling other people that they can't enjoy something that you don't, especially when the player base statistics are staring you in the face, telling you that there are, in fact, a huge number of people who do have fun playing these games.
     
    And the whole point of video games is to have fun, right? Or did I miss something?
  10. Funny
    Sebastian got a reaction from suxen in Stream PS4 games to your PC or Mac with next system update   
    Obviously it's a clipped version of the present progressive tense of the verb "to elf." I think he missed an apostrophe though, so it should really be " elfin' ".
  11. Like
    Sebastian got a reaction from Arty in deceptive reviews and shoddy advertisement practices being cracked down on in youtube re Machinima/xbox 1   
    Honestly I don't really see why people would have a problem with LMG over this stuff. Whether you trust Linus or not, there IS still useful information to be gained from the reviews. Even if he were secretly being paid to give a positive review on a product, it wouldn't change the numbers in the benchmarks. If you're looking at reviews and are seriously looking into buying a GPU, CPU, etc., you should:
     
    (a) Look at the benchmark data that's being presented. This is where the product in question as well as competing products are being most objectively compared.
     
    (b) Look through the rest of the review article/video for other objective information (e.g., GPU X has a certain additional feature, but GPU Y doesn't).
     
    (c) Look at multiple reviews (both professional and consumer reviews), in order to check that the OBJECTIVE information (i.e., the stuff from (a) and (b) above) roughly matches up from different sources.
     
    (d) Take any subjective information (i.e., conclusions and opinions of the reviewer) with a grain of salt. The reviewer is just a normal person like you or me, and they're entitled to their opinion. Just because it's Linus or Jay or someone at Anandtech or Tom's Hardware doesn't mean that they should be put on any sort of pedestal. You should never trust somebody's opinion because they are in a position of power or because they seem really cool. You should take the objective information and form YOUR OWN opinion, regardless of whether you agree with the conclusions of any particular reviewer.
     
     
    I think that I've emphasized that the important thing in the reviews (especially from a potential buyer's perspective) is the objective information that is presented. And that's the real value of the LMG review videos: they are able to provide extensive and objective apples-to-apples comparisons of a lot of products. Only big reviewers have the resources to be able to compare thousands of dollars' worth of hardware in a series of benchmarks. And part of the reason they have those resources is because companies are willing to send their products to the reviewers.
     
    So to conclude, I honestly wouldn't really care if Linus or anyone else WERE being paid to give glowing reviews of products, because the numbers in the benchmarks don't lie. And anyway, I don't think anything like that is going on, because as Linus has already pointed out in this thread, behaving that way would be an incredibly stupid business decision.
  12. Like
    Sebastian got a reaction from iRiley in PCIe 3.0 Lanes with 2xGPU + Intel 750 PCIe Z97 vs X99   
    I think you're getting hung up on the x4 vs x8. To summarize what other people in this thread have already said:
     
    For a single card, performance will be identical whether it's running at PCIe 3.0 x16, x8, or x4, so don't worry about that.
    For SLI to be enabled, each card needs to run at either x8 or x16. If one or more cards is at x4, you won't be able to turn on SLI.
    Z97 has 16 total lanes, which means that if you want to run dual cards in SLI, the only option is for both of them to run at x8.
    This means that if you run SLI on Z97, you will not be able to put anything else in a PCIe 3.0 slot, period.
    The only possible way to use two or more cards in SLI AND a PCIe 3.0 SSD is to use X99.
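     
    Here's a toy Python sketch of that lane arithmetic, just to make the Z97 vs X99 point concrete. The lane counts are assumptions for illustration (the 16 lanes for Z97 come from the summary above; X99 CPUs provide 28 or 40 CPU lanes depending on the model):
     
    def fits(cpu_lanes, sli_gpus, other_device_lanes):
        # Each GPU in SLI needs at least x8; other PCIe 3.0 devices (e.g. an x4 NVMe
        # SSD) take their own lanes. Returns True if everything fits the CPU's budget.
        needed = 8 * sli_gpus + sum(other_device_lanes)
        return needed <= cpu_lanes

    print(fits(16, sli_gpus=2, other_device_lanes=[4]))   # Z97: 8 + 8 + 4 = 20 > 16 -> False
    print(fits(40, sli_gpus=2, other_device_lanes=[4]))   # 40-lane X99 CPU -> True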
  13. Like
    Sebastian got a reaction from Dobbsjr in CPU performance doesn't seem to follow Moore's Law... why not?   
    I agree with you, that's why I said "Moore's Law is the observation that...". I also agree that we'll get to a point where currently-used technology won't be able to offer any more improvements, because we will have reached a physical limit. We're getting fairly close to the point where individual (traditional) transistors can't be shrunk any further. That being said, we're still not at that point. Intel themselves have stated that they expect Moore's Law to continue until at least 2018 with the 7 nm process:
     
    http://www.pcworld.com/article/2887275/intel-moores-law-will-continue-through-7nm-chips.html
     
    Plus, the question I posed was about the past, where Moore's prediction HAS pretty much held strong. The fact is that transistor counts HAVE been roughly doubling every two years, but CPU performance growth has been far from that.
     
     
    This seems like a more reasonable explanation to me. Especially since it also fits in with the performance trends we see in the GPU market. And if THAT'S the true explanation, then it's even more frustrating than the idea that on board graphics are stealing away our sweet sweet CPU core power...
  14. Like
    Sebastian got a reaction from ThE_MarD in Laptop tips for a non-gamer girlfriend?   
    At the moment, she has an HP 630: http://www.cnet.com/products/hp-630-15-6-p-p6200-windows-7-home-premium-64-bit-4-gb-ram-320-gb-hdd-series/specs/
     
    It would probably help considerably to do a fresh install of Windows or to put in an SSD, but to be honest, I'd rather just replace it altogether. Oh, and thanks for the tip to follow my own topic.
     
     
    I hadn't looked into their Flex series yet, but that looks quite promising. She seemed to light up a bit when I mentioned that they had touch screens, so that one's definitely going on the list of possibilities, given the other specs. I'd still prefer that it come with a hybrid drive or SSD, since buying an SSD separately will just add to the cost on its own. Still, if that turns out to be cheaper than buying one of the more expensive Flex models, then I will (and just throw the extra 128 GB SSD in my desktop for extra game storage). Either way, I'll probably just keep an eye out for Flex models when local sellers are having sales.
     
     
    I've been looking a lot at the Aspire series as well, actually. I just wish there were an option to drop the 840M and put in a faster storage alternative, since, as you say, it doesn't really serve its intended purpose very well anyway. At any rate, this is also on my list, and I might just consider buying an SSD separately and throwing that in.
     
    My strategy for buying generally involves picking a few models that I like, and then waiting until I see one of them on a decent sale. If I'm lucky, I can get a hold of an $800 Flex or Aspire model for under my $700 limit.
     
    Is anyone else aware of any other models that are similar in price and specs to the aforementioned Flex and Aspire models?