
Sebastian

Member
  • Posts

    38
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Male
  • Location
    Sweden
  • Interests
    Computers and such, of course!
  • Occupation
    Materials Physicist

System

  • CPU
    Core i5 4670k
  • Motherboard
    Gigabyte GA-Z87-HD3
  • RAM
    Gskill Ripjaws 2x4GB 1600 MHz
  • GPU
    EVGA GTX 1080 FE
  • Case
    Antec VSK4000E
  • Storage
    Samsung 840 Evo 120 GB, Seagate Barracuda 1 TB
  • PSU
    Corsair CX 600M
  • Display(s)
    BenQ GL2450
  • Cooling
    Cooler Master Hyper 212 Evo
  • Keyboard
    Microsoft Sidewinder X4
  • Mouse
    Perixx MX-3000B
  • Sound
    Sennheiser HD595
  • Operating System
    Windows 8.1

Recent Profile Visitors

785 profile views
  1. Regarding the video "I have some things to say - Core i9 & X299": I was wondering about all this after seeing the Kaby Lake X specs, and the video did a great job of filling in the details that I hadn't yet bothered to research. I agree with Linus here that the X299 launch (plus the LGA2066 CPUs) appears pretty rushed and haphazardly thrown together, driven by AMD's Threadripper announcements. That being said, I think there are a couple of silver linings here. First, for the extreme enthusiasts, I think this launch will be like any other. You'll still be able to buy a badass 18C/36T CPU and an equally badass mobo with all the possible bells and whistles. It's only the lower-tier CPUs (particularly Kaby Lake X) that start to become questionable. The second bit of possible good news is that Intel may already be in the process of taking Linus' free business advice. If we're lucky, then this particular launch will be a bit of a mess, but Coffee Lake will end up being a big step up. They've already hastily announced that it will give a 30% performance boost, though we'll have to wait and see what metric they're actually referring to. Intel may have been caught off guard this time around, but now that they've realized that AMD is finally putting out some strong competition, they may start pushing themselves harder. Ultimately, that's what everybody's been wanting for years anyway. We've had great improvements in GPU performance for years because there's been real competition in that market. Hopefully we'll finally see the same in the CPU market in the years to come (once we get past X299).
  2. I'm sure you're right, and I know LTT also uses similar sensors to measure case temps (e.g., in the Workshop episode where Luke tried filling the computer case with random junk to see how it would affect temps). I just mentioned it here because I've seen other tech sites showing IR photos of hardware where they've labeled specific temperatures at specific points and then said something like, "the IR camera says it's only 60 C on the backside of the PCB." That may be what the camera says, but that doesn't necessarily mean it's true. #alternativefacts Anyway, glad you enjoyed the post! I wasn't sure if ANYBODY would care haha.
  3. I thought this post might be worth making because I've seen a lot of hardware reviewers using thermal (i.e., IR) cameras to check external temperatures (most recently in the "Backplates cool your videocard" LTT video, but I've seen a lot of other websites do it as well). I want to make it clear that I'm not writing this post to attack LTT or anybody else; I'm doing it to offer a bit of knowledge to the handful of people out there who might actually be interested. So rather than doing real work, I'm going to give a mini IR camera physics lesson on an internet forum instead. IR cameras don't have a way of directly measuring the temperature of objects the way a thermometer does. Instead, they measure the amount of thermal radiation coming from a surface (IR wavelengths in this case), and then use that intensity to calculate a temperature using what's known as the Stefan-Boltzmann law. The equation looks like this: T = (I/(e*A*s))^(1/4), where T is the temperature, I is the intensity of the IR radiation entering the camera, e is the emissivity of the surface you're pointing the camera at, A is the area of that surface, and s is the Stefan-Boltzmann constant, which we can treat as just a number here. The important thing to note is that other variables go into this calculation, namely e and A. The most important one, and the one I'm going to focus on, is the emissivity e. Emissivity is a property that varies from one material to another, and is essentially a measure of how well a given material behaves like an ideal black body (i.e., how good the material is at radiating IR). It can range from 0 to 1, where 1 is the equivalent of an ideal black body, and 0 means that the object doesn't radiate any IR at all. Most thermal cameras (including the FLIR ones that I think most tech reviewers use) assume that the emissivity is 0.9 or so. This means that if the object you're pointing the camera at actually has an emissivity of 0.9, then the temperature that the camera shows on-screen will be accurate. However, there are plenty of materials which do NOT have an emissivity of around 0.9. Metals are the most relevant example: they tend to have very low emissivities (often less than 0.1). So what happens if we try to measure the temperature of, say, a copper surface with our camera? The emissivity of copper is around 0.05 (it varies a bit depending on the smoothness of the surface), but our camera is assuming that the emissivity is 0.9, which is WAY too high. This means that the temperature the camera calculates will be considerably lower than the true temperature (see the equation above). Metals have another problem: they are also good at reflecting IR wavelengths. This means that when you point your camera at that piece of copper, some of the IR radiation it's measuring is actually coming from somewhere else in the room and simply reflecting off of the copper surface and into the camera. These two phenomena combine to make it look like the piece of copper is colder than its surroundings, when in reality everything is the same temperature. This is exactly what happened in the IR images shown in LTT's "Backplates cool your videocard" video. In the images it looks like the copper region is much colder than the surrounding backplate, when in reality it was probably as hot or hotter. So there are a few conclusions we can draw here. First, take any exact temperature measured with a thermal camera with a grain of salt.
If the emissivity that the camera assumes is different from the emissivity of the object you're measuring, then the temperature given by the camera will be incorrect. Second, we CAN compare relative temperatures between different regions, if we do it carefully. This can be done by putting a piece of non-shiny tape on each of the surfaces you're trying to compare. The tape will equalize to the temperature of whatever it's attached to, and because you have the same tape on both surfaces (e.g., a piece of copper and a piece of plastic), the emissivities will also be the same (because now it's the tape doing the radiating in both cases), so the temperatures of the two objects will be directly comparable. You can even take this a step further if you want to get accurate temperatures. If your camera allows you to manually set the emissivity of what you're looking at (many FLIR cameras do), you can determine the emissivity of the tape you're using by putting some tape on a surface, measuring its temperature with a normal thermometer, and then adjusting the emissivity in the camera's settings until the on-screen temperature matches the true temperature. Once you know the emissivity of your tape, you can then use that emissivity in your camera's settings to accurately measure the temperature of anything you put the tape on in the future. (For a rough sense of how big the emissivity error can get, see the little worked example after this list of posts.)
  4. Well, now I've finally had the time to do a bit of testing, and I can say that I saw a negligible difference in framerates for both Overwatch and Battlefield 1. I'm using a Core i5 4670K @ 4.2 GHz (sad, I know haha) and a GTX 1080 Founders Edition, with 8 GB of RAM. I tested both games at max settings at 1440p. Luckily, however, there WERE two good things to come out of this. First of all, since I tested the games on the HDD first and the SSD second, both games are now gonna stay on the SSD, since I DID obviously notice a big improvement in loading times. Second, during my testing I noticed that ticking the "epic" preset in Overwatch caused the game to automatically set my resolution scale to 122%, which corresponds to roughly 49% more pixels (1.22^2 ≈ 1.49). When I manually changed this to 100%, my average fps went from the 90-100 range up to the 135-145 range. Hooray! I recommend that all you Overwatch players out there take a quick look at your graphics settings to see if the game automatically applied a >100% resolution scale.
  5. If the graphics card doesn't have enough VRAM, wouldn't the system memory come next (before system storage)? This is why it seems so bizarre to me that storage should have any impact on FPS whatsoever. At any rate, I currently have BF1 on an HDD, so if nothing else comes out of this testing, at least I'll definitely have faster loading times!
  6. That's my initial reaction as well. Which is why I want to test it for myself, just in case I can get a bunch of free FPS :P.
  7. Hey everyone! This article recently popped up on my Facebook feed: http://www.tweaktown.com/articles/7911/upgrade-test-gtx-760-vs-1060-ssd-hdd-system/index.html In it, the author benchmarks a few games using either a GTX 760 or a GTX 1060 combined with either an HDD or an SSD (i.e., 4 combinations in total). According to his results at 1080p, 3 out of 4 games (Deus Ex: Mankind Divided, Overwatch, and BF1) had 20-30% framerate increases ONLY by switching from an HDD to an SSD. I have to say that this was pretty unexpected for me, since I assumed that, at least for online shooters like Overwatch and BF1, everything would be loaded into system memory and VRAM at the start of the match, making your type of storage irrelevant (except when it comes to loading times). I also feel the need to point out that they claim to be using an i5 760 (LGA 1156) on an H97 (LGA 1150) motherboard, and it's not the first time I've seen some unusual things on Tweaktown. That being said, if moving some games from an HDD to an SSD can really improve performance this dramatically, I figured people here might want to know about it. I guess the only way to know for sure is to test it myself, so that's probably what I'll do. Out of the four games tested in the article, I only have Overwatch and BF1, so I suppose I'll go test those out, then come back and post here again later, assuming anyone is interested.
  8. Well, my original goal was maximizing performance per dollar, but I double-checked prices and found an MSI GTX 980 Ti Gaming for less than double the price of the cheapest GTX 970 I'd found previously. The extra 2 GB of memory will definitely be useful for us as well. A perk of having special purchasing arrangements between the university and retailers, I guess. Time to go convince my boss that four 980 Tis are worth it! The software is called Mumax3. It's designed for simulating magnetism in micro- and nano-structured materials, and we've already been using it for a while. We've got a computer with a 780 and a 970, plus another computer with two Titan Blacks, but with the number of people using the software, we still have people pretty much constantly waiting in line. So basically this build is more because we know the setup works well and we want to add more GPUs to the cluster to take some load off of the ones we already have.
  9. I'm building a computer for work to run simulations on, and since the software we use is designed to use CUDA, we want to cram in as many GPUs as possible. As the resident computer nerd, I volunteered to design and build it. Since we're more interested in bang for buck, I'm opting for four GTX 970s rather than the highest-end cards (or Teslas, for that matter). When it comes to choosing the case, however, I really want to be sure that it will actually have space for four GPUs. I previously built a 3-GPU system for the same purpose and discovered too late that it had one expansion slot too few, forcing me to do some creative modding. I'd rather not repeat that. I'm looking at Corsair's 750D Airflow Edition. Corsair doesn't advertise it as being capable of holding 4 GPUs, but it DOES have 9 expansion slots, which should be enough as far as I can tell. I haven't managed to find any blogs or forums online where somebody has used this specific case with 4 GPUs, but I found one where they used the Graphite 760T, which appears to be very similar. What do you guys think? Also, what are your opinions on reference 970s vs. something like an MSI Gaming or Gigabyte G1 Gaming?
  10. Alright, well that seems like a more reasonable criticism, then. I'm hoping I'll disagree with it once I get around to playing the game, though I haven't gotten the chance yet. I dunno, I haven't gotten into any RPGs in a while, so I'm hoping that I'll like this one.
  11. I don't think that people who are interested in this game are expecting it to be some sort of simulator-type game. It's an online RPG game, so of course you have bullet sponge-y enemies. Do people shy away from WoW because a guy surviving five fireballs to the face is unrealistic and "kills the immersion"? No, of course not! The thing that makes these games enticing (for some people, not everybody) is the progression. Getting loot, making yourself more powerful, fortifying your base, etc. If you want immersion go play a military sim.
  12. I don't really understand why so many people get their panties in a bunch over this stuff. I don't care if you don't like a particular game; it's not going to affect my enjoyment in the slightest. So many people seem to believe that a game's enjoyability is an objective value, when it is (by definition) subjective. You don't like Destiny or The Division or any other game? Fine, don't play them. But don't be an asshole and go around telling other people that they can't enjoy something that you don't, especially when the player base statistics are staring you in the face, telling you that there are, in fact, a huge number of people who do have fun playing these games. And the whole point of video games is to have fun, right? Or did I miss something?
  13. Obviously it's a clipped version of the present progressive tense of the verb "to elf." I think he missed an apostrophe though, so it should really be "elfin'".
  14. Does this mean we'll be able to play PS4 games with a mouse and keyboard? Seems like it could really screw up the balance in first person shooters...
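
For anyone curious how large the error described in post 3 can actually get, here's a minimal Python sketch of the idea. It is not FLIR's (or any reviewer's) actual processing pipeline; it just applies the Stefan-Boltzmann relation from the post to an opaque, gray surface, with reflected room radiation included, and works per unit area so A drops out. The emissivity values (0.05 for bare copper, 0.9 for a painted backplate), the 60 C surface temperature, and the 25 C room are made-up illustrative numbers, not measurements from any video.

# Minimal sketch (assumed numbers, not any reviewer's actual data): how an
# emissivity mismatch skews the temperature an IR camera reports.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def apparent_temperature(t_true_c, e_true, e_assumed=0.9, t_ambient_c=25.0):
    """Temperature (Celsius) a camera would report for an opaque, gray surface
    at t_true_c with real emissivity e_true, if the camera assumes e_assumed
    and compensates for reflections using the room temperature t_ambient_c."""
    t_true = t_true_c + 273.15
    t_amb = t_ambient_c + 273.15
    # Radiation leaving the surface (per unit area): emitted + reflected ambient.
    radiosity = e_true * SIGMA * t_true**4 + (1 - e_true) * SIGMA * t_amb**4
    # The camera inverts the same relation, but with its assumed emissivity.
    t_apparent = ((radiosity - (1 - e_assumed) * SIGMA * t_amb**4)
                  / (e_assumed * SIGMA)) ** 0.25
    return t_apparent - 273.15

# Bare copper vs. a painted backplate, both actually sitting at 60 C in a 25 C room:
for label, e in [("bare copper", 0.05), ("painted backplate", 0.90)]:
    print(f"{label:18s} true 60.0 C -> camera shows {apparent_temperature(60.0, e):.1f} C")

With these numbers the painted backplate reads back correctly at 60 C, while the bare copper at the same 60 C reads as roughly 27 C, barely above room temperature, which is the same "copper looks colder than the backplate" effect described in post 3.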