
infamous_last

Member
  • Posts

    375
  • Joined

  • Last visited

Reputation Activity

  1. Like
    infamous_last got a reaction from Forge in 980ti compatibility with my motherboard   
    980ti (SLI no less) works fine on my much older x58 board, Z77 should be a cake walk!
  2. Like
    infamous_last reacted to Brodholm in WING X99 | A CNC-milled Scratch Build! (Benchmarks, temps and wallpapers posted!)   
    Hello again! I just did some benchmark, temperature, fan speed and power usage tests. I also selected a more reasonable number of pictures to fix up for wallpapers and redid some of the edits on them to adjust the color balance/hue/contrast and exposure! I also removed the big watermark on each picture since it was a bit distracting. The "album" should now have much more of a flow to it and a common thread, so you can easily navigate through the pictures. That was some of the critique I received, and I agree with it and have made adjustments!

    Some people have made requests for wallpaper versions, so here is a Google Drive 4K PNG Wallpaper Link. I also included some updates to the "Battlestation" and on small things like how the fan filter looks after 6 months.

    Computer performance and temps after 5 months of constant daily use (6+ hours each day). All in all, the system runs perfectly with a low noise level, with the fans doing around 700-800 RPM at idle and ramping up to about 850 RPM when gaming, rendering or doing other heavy tasks. While I could run the system at higher fan speeds to gain a few degrees, it seems hardly worth it. But it is good to know that I could if I wanted to. The Corsair Commander has been quite good and really easy to set up and use, both hardware and software. Being able to set your own curves for the fans is a really nice feature that I value a lot. It is a nice unit all in all; the only thing I wish it had was onboard memory, so that if the computer hangs or the software crashes it would still be able to control the fans. But that is about the only bad thing I can say about it. And the Link software has come a long way for those who haven't used it in a while.
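    (Side note for anyone curious how custom fan curves like the ones described above actually behave: they are just piecewise-linear interpolation between user-set (temperature, RPM) points. A minimal sketch, with hypothetical points loosely matching the 700-1800 RPM range mentioned in this build, not Corsair Link's actual internals:)

    ```python
    def fan_rpm(temp_c, curve):
        """Piecewise-linear fan curve. curve: sorted list of (temp_C, rpm) points."""
        if temp_c <= curve[0][0]:
            return curve[0][1]          # clamp below the first point
        if temp_c >= curve[-1][0]:
            return curve[-1][1]         # clamp above the last point
        for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
            if t0 <= temp_c <= t1:
                # linear interpolation between the two surrounding points
                return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

    # hypothetical curve: quiet at idle, ramping hard past 40C
    curve = [(30, 700), (40, 850), (60, 1800)]
    print(fan_rpm(35, curve))  # 775.0
    ```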

    3DMark Time Spy with G-Sync off got the following score. CPU @ 4.0 GHz and GPUs at stock settings.
     
    Time to do some benchmarking now (about 5 months later). Everything has been running smoothly and I have been using the computer about 8 hours a day or so. This is the result in Cinebench running the CPU test with the CPU @ 4.0 GHz @ 1.20 V. Fans at adaptive speed (~850 RPM).
     
    Temps during the 3DMark test and power consumption. We can see that with 2x D5 pumps, 7x 140mm ML fans, Strix 1080s in SLI, 2x 800GB PCIe SSDs from Intel and a 10-core 6950X, the system draws about 571 watts from the wall. Notice that the fan on the Corsair AX1200i is still at 0 RPM. Pushing the air through the radiator to the PSU is enough cooling for it to stay passive. I have never seen the PSU fan spin, even under these types of loads for longer periods. Quite impressive, I must say!
     
    This is the computer at idle with the fans (4x Corsair ML140 in push and 3x in pull) all on a dynamic curve that I have set up, going from 500 to 1800 RPM. We can see core temps around 35-39 C with the water @ 34 C and air in @ 27 C, drawing about 240 watts from the wall socket.
     
    This is the computer at idle with the fans (4x Corsair ML140 in push and 3x in pull) all on maximum RPM. We can see core temps around 32-34 C with the water @ 32 C and air in @ 27 C, drawing about 250 watts from the wall socket.
     
    Doing ray tracing in Autodesk Inventor using 100% of the CPU. It ran for 873 seconds with the fans at a dynamic speed that settled around 850 RPM, with CPU temps around 58C and power consumption of about 360 watts at the wall.
     
    Doing ray tracing in Autodesk Inventor using 100% of the CPU. It ran for 757 seconds with the fans at max speed (1800 RPM), with CPU temps around 53C and power consumption of about 360 watts at the wall.
     
    Running PlayerUnknown's Battlegrounds @ 120-140 FPS (2K res, Ultra on AA, textures and view distance, rest on low or very low). You can see here how much PUBG utilizes SLI... one card is basically not used at all.
     
    I upgraded my setup with some more stuff. I won a Corsair T1 RACE with WING X99, and after having used it for about 3 months I thought I would give my thoughts on it, since there is not that much info out there. All in all I am very pleased with it. One of the best things is how nice the "roller blade" type wheels are. It is like gliding on a cloud compared to hard plastic wheels, it is much softer on your floor, and it does not get stuck on cables and stuff. The material feels soft and very high quality. Also, the armrests were surprisingly comfortable. They have this "carbon" texture that is tightly wrapped or molded over a springy foam that feels really nice. The ability to change the width, height, in/out and back-and-forth position of the armrests is something I did not think I would use much, but I have ended up using it a lot. The only thing I wished was that it was just a tiny bit wider over the back of the chair. For me it is just right, but it could have been a bit bigger. If I were to compare it to my previous chair (DXRacer Formula), this is much nicer in quality overall. The T1 RACE has a material that feels nicer and softer padding in the base. It is basically a higher quality chair overall. It is also quite a bit easier to assemble alone compared to the Formula chair. The biggest thing for me was that the Formula chair was WAY too narrow in the base, so my thighs hurt while sitting in it for longer than 30 minutes; I actually had to swap back to my IKEA chair. But as always when it comes to chairs, it comes down to personal preference. I would suggest trying out chairs before buying. But I can highly recommend the T1 RACE if you are looking for a high-quality chair. For reference, I am 182 cm tall and weigh about 80-85 kg, which is about 6 feet and 185 lbs.
     
    This is the texture I was talking about earlier. It looks really hard, but it is actually really nice! And it does not collect dust/skin particles. I was a bit worried about that the first time I felt it. But this is how it looks after almost 3 months of usage!
     
    I also swapped out my Corsair K70 LUX with MX Browns for a Corsair K95 RGB Platinum with Speed switches, which have a lower actuation point. Switches aside, compared to the K70 the K95 has nicer media keys and a really nice wrist rest that you can swap between a rough and a smooth surface. All in all a really nice upgrade, and if you like macro keys it has 6 extra ones on the left that can be quite useful. My thoughts on the Speed switches are that you sacrifice a bit of typing comfort compared to my overall favorite Brown switches. But I ended up using the Speeds anyway; they still feel nice when typing, but much faster and more rapid when gaming. I went back to try Browns after a few days of using the Speeds, but they felt sluggish in games compared to the Speed ones. I definitely think it is worth it if you play a lot of games, especially FPS. On the subject of RGB, I really like the ability to assign specific colors to each key and to tie layout profiles to different applications. I am not really an RGB person per se, but the biggest thing is that you have options. On the desktop I still use a solid red color, and when I play games I have specially assigned key colors in my most played games. I would say it is basically always worth getting the RGB version over the solid-color one.
     
    I also swapped to the Corsair Harpoon mouse since the M65 mouse did not really fit my grip. It's a really cheap and simple mouse that does its job well!
     
    This is how the dust filter looked after 6 months of use! I should probably clean that more often.

    Also, the GPU bubble is gone! But as I did some refilling and cleaned out some dust on the cards, this bubble came out from one of the blocks and got stuck there... It will be gone in a few days or so, but it is still annoying when you are taking a picture. That is all I have to update on the WING X99 project. So far everything has been working perfectly and I am happy with how it looks and how it performs! I am sorry that I did not get this update up earlier; I have been promising people performance benchmarks and temps for a long time. Sorry about the delay!

    Cheers
  3. Like
    infamous_last reacted to ionbasa in Kodi is getting a UWP version and is coming to Xbox One soon   
    Yes, the UWP app has been out for a while and I have it running on PC, but I don't think it was available on Xbox. (Someone correct me if I'm wrong). From my understanding, it'll soon be available on the Xbox since most of the bugs have been squashed out by now.
  4. Like
    infamous_last reacted to AlTech in Kodi is getting a UWP version and is coming to Xbox One soon   
    The UWP version is a converted app using Project Centennial.
     
    It is not a native UWP app. It is being ported to be a native UWP app and thus will be getting Xbox Support in the process.
     
    That has been removed to my knowledge. It's just Cortana on the Xbox One now.
  5. Agree
    infamous_last reacted to Omon_Ra in Least annoying FREE AntiVirus ?   
    Microsoft Security Essentials. Lightweight, doesn't annoy you with popups, scheduled scans. Works well enough for me.
     
    https://www.microsoft.com/en-us/download/details.aspx?id=5201
  6. Like
    infamous_last reacted to jasonwj322a in 2017 Dell XPS 15 9560 Details   
    https://youtu.be/5IGNIYbrm-g - Detailed look at the website.
    http://www.dell.com/en-us/shop/productdetails/xps-15-9560-laptop - Actual website (Features section is now removed) 
    https://archive.fo/GF2eN#selection-7573.1-7582.0 - Archive of the website.
     
    The long-awaited details of the Dell XPS 15 refresh are finally here. I myself had been waiting for this refresh and am really happy with the new hardware. I know many people had been waiting for this too. With the release of the late-2016 Razer Blade, even more people are eager to know what we will be getting on the highly anticipated Dell XPS 15. For me, I would pick the Dell XPS 15; sure, I am sacrificing graphics card performance, but I feel like I am getting more, like a better screen. The fingerprint scanner is a really nice addition; I'll definitely have that. The only thing I am worried about is the price. The GTX 1050 is "optional". For how much? $100-300?
    What do you guys think? I am just hoping I will be able to pick up the Core i7, 16GB RAM, 512GB SSD, GTX 1050, 4K screen, and fingerprint scanner for about $2000.
     
  7. Informative
    infamous_last reacted to creatip123 in The Impedance Ballad (image heavy)   
    There have been lots of people with the mindset of 'the higher the impedance of a headphone, the harder it is to drive, or the harder it is to push it louder'. Consequently, this leads to the mindset of 'the higher the impedance of a headphone, the more it needs an amp'.
     
    It's been said a lot of times in a lot of topics here. The regulars here always go against this mindset, but of course with words alone, it's kinda like a 3rd grader's argument. Something like, 'afraid not....' 'afraid so....'
     
    Well, I thought of a simple and rough experiment for this matter. For the base theory, let's just go with that mindset for a while.
     
     
    That's the base theory, so let's see if it applies to the reality.
     
    For this experiment, I'd need some sort of loudness measuring device. This is what I'm using:
     

     
    It's just a regular sound level meter. It will 'listen' to ambient sound and calculate the average (A-weighted) loudness.
     
    These are the specs:
     
     
    It has a 'MAX' function, which holds the maximum value it measures.
     

     
    It will 'listen' through this microphone. 
     
    Now, this is an ambient sound level meter, right? That means it's quite sensitive and will pick up lots of ambient noise. So I need a way to give it some kind of sound insulation.
     

     
    This is the insulation. It's just plain handicraft foam sheet, 12x12 cm, 3 layers stacked, with a hole cut in the middle for the microphone to sit in.
     

     
    This is how it looks.
     

     
    This is how it's implemented. I covered the pads of the headphones under test with the foam insulation and took the measurements. The foam minimizes sound leaking in and out of the headphone.
     
     
    Yeah, so I got the measurement device ready. Next I'd need the test subjects, right?
     

     
    These are all the headphones I got with me, sorted from the smallest impedance on the left, to highest impedance on the right.
     
    1. ATH-AD700: 32 ohms (http://eu.audio-technica.com/en/products/product.asp?catID=5&subID=37&prodID=156)
    2. Hifiman HE400: 35 ohms (http://www.head-direct.com/Products/?act=detail&id=115)
    3. AKG Q701: 62 ohms (http://eu.akg.com/akg-product-detail_eu/q-701.html => click the 'specsheet' link)
    4. Krezt DJ-9200 (local brand, China generic): 64 ohms (http://krezt.co.id/v3/?product=krezt-dj-9200)
     
    So... it's already sorted from the smallest (32 ohms) to the biggest/highest (64 ohms) impedance. According to the base theory, with the same source device and same volume setting, it will sound the loudest on the 32-ohm one and least loud on the 64-ohm one, right? Well, let's see...
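    (Before looking at the results, it's worth putting numbers on what the base theory actually predicts. At a fixed source voltage, power delivered into a load is P = V²/Z, so doubling the impedance only halves the power, about 3 dB. A quick sketch; the 0.1 V output level is an assumed, illustrative figure, not a measured spec of the phone:)

    ```python
    import math

    V = 0.1  # assumed source output voltage (volts); same for every headphone

    def power_mw(impedance_ohms, volts=V):
        """Power delivered into a load at a fixed voltage: P = V^2 / Z (in mW)."""
        return 1000 * volts**2 / impedance_ohms

    print(power_mw(32))  # 0.3125 mW into the 32-ohm load
    print(power_mw(64))  # 0.15625 mW into the 64-ohm load
    # Doubling the impedance halves the power: only 10*log10(2) dB quieter
    print(round(10 * math.log10(2), 1))  # 3.0
    ```

    So even if the base theory held, going from 32 to 64 ohms should only cost ~3 dB, far less than the differences measured below.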
     
    Measurement device ready, test subjects ready, next I'd need a source device. I'm not gonna use my PC, ipad, or some random DAP or amps. Instead, I'm gonna use this:
     

     
    Good ol' Nokia 2730 classic. It's discontinued. It's just a plain basic phone: no touch screen, no iOS or Android, hell, it doesn't even have Wi-Fi and can't play any videos. I bought it for $35 or so; the latest price I found on Google is $50 or something. At least it has an MP3 player function and a 3.5mm audio jack. I'll be plugging the test headphones directly into this phone, volume set to max, with no amp in between.
     
    This is the test track I'm using:
     

     
    Obviously, the MP3 version, not the video, because of the phone's limitation. 
     
    Why do I use a trance song? Why not a sine tone, something like a 1kHz sine tone? Well, it's because the headphones have different frequency responses, so I thought it'd be better to get an average reading across various frequencies rather than one particular tone.
     
    So it's all set, played the song for 30 seconds from the beginning. With the sound meter's 'MAX' function, it will lock and display the maximum loudness it measured.
     
     
    Moment of truth, the results, sorted from number 1 to 4. Remember, same source, same volume, same test track, same sound meter, same method of testing:
     

     
    AD700: 91.8 dB
     

     
    HE400: 83.1 dB. So far so good; it went according to the base theory. The HE400 is 35 ohms, so it's less loud than the 32-ohm AD700, right?
     

     
    Q701: 83.2 dB. Hmmm, that's strange. 35 ohms to 62 ohms is quite a leap in impedance, so the Q701 should be considerably less loud than the HE400, right? Then why the hell did it measure basically the same as the HE400 (let's waive the 0.1 dB and chalk it up to differences in frequency response)??
     

     
    Krezt: 103.4 dB. Hot damn, I've broken a law of physics here. The highest impedance of all 4 test subjects is actually the loudest!! Yay, the 64-ohm headphone is 11.6 dB louder than the headphone with half its impedance, at 32 ohms...
     
    What gives, dude?
     
    Seems like the base theory of 'the higher the impedance of a headphone, the harder it is to drive it louder, the more it needs an amp' is on the brink of being debunked. Oh man, the world's coming to an end....
     
    Sooooo... if impedance is not the determining factor, then there has to be another determining factor, right? After all, the universe is bound by orderly laws of physics.
     
    So let's look at one other rating of the headphones, which is the sensitivity/efficiency. Something that looks like 'X dB', or 'Y dB/mW', or 'Z dB/V'.
     
    1. AD700: 98 dB/mW
    2. HE400: 93.5 dB/mW
    3. Q701: 105 dB/V => this is not the same as dB/mW. To compare it in the same context, it has to be converted to dB/mW, which gives ~93 dB/mW
    4. Krezt DJ-9200: 107 dB/mW
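    (The dB/V to dB/mW conversion above is straightforward: 1 mW into a Z-ohm load corresponds to sqrt(0.001 × Z) volts, so relative to the 1 V reference the level shifts by 10·log10(Z/1000) dB. A minimal sketch:)

    ```python
    import math

    def db_v_to_db_mw(sens_db_v, impedance_ohms):
        """Convert a dB/V sensitivity rating to dB/mW.

        1 mW into Z ohms is sqrt(0.001 * Z) volts, so the level relative
        to the 1 V reference changes by 10*log10(Z / 1000) dB.
        """
        return sens_db_v + 10 * math.log10(impedance_ohms / 1000)

    print(round(db_v_to_db_mw(105, 62)))  # Q701: 93 dB/mW
    ```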
     
    Oh wow, now it all makes sense!! Using the order of the sensitivity ratings, the order from least loud to loudest is: 2 or 3 (about the same ratings and measurements), then 1, then 4. This corresponds with the test results in the pictures.
     
    So what is this efficiency/sensitivity? In a nutshell:
     
     
    Just from the definition of efficiency/sensitivity, it's already quite obvious that when we're talking about loudness, or how loud a headphone will go, efficiency/sensitivity is the more important, and indeed the determining, factor.
     
    So at this point, I think we can all agree that the impedance rating is NOT the determining factor in how easy or hard it is to drive a headphone to the desired loudness. Here's a fun trivia fact: the hardest-to-drive headphone on the market, the Hifiman HE-6, is only 50 ohms. Even less impedance than the Krezt DJ-9200.
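    (Putting the two ratings together, sensitivity plus impedance predicts the measured ranking. A rough sketch, assuming an illustrative 0.1 V source output, not the phone's actual measured level: SPL ≈ sensitivity(dB/mW) + 10·log10(P in mW), with P = V²/Z.)

    ```python
    import math

    V = 0.1  # assumed phone output voltage; illustrative, not measured

    def predicted_spl(sens_db_mw, impedance_ohms, volts=V):
        """SPL = sensitivity + 10*log10(power in mW), with P = V^2 / Z."""
        p_mw = 1000 * volts**2 / impedance_ohms
        return sens_db_mw + 10 * math.log10(p_mw)

    headphones = {
        "AD700": (98, 32),
        "HE400": (93.5, 35),
        "Q701": (93, 62),    # converted from 105 dB/V
        "Krezt": (107, 64),
    }
    for name, (sens, z) in headphones.items():
        print(name, round(predicted_spl(sens, z), 1))
    # Ranking comes out Krezt > AD700 > HE400 ~= Q701, matching the
    # measured order (with the HE400 and Q701 essentially tied).
    ```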
     
    So the next time you see someone posting something like, 'My headphone is 250 ohms, so it DEFINITELY needs an amp. If only my headphone is 32 ohms, I won't need an amp at all', hope you'd remember this topic and this simple experiment...
  8. Informative
    infamous_last reacted to ApolloX75 in System won't boot after GPU Waterblock install   
    Have you tried swapping that 780 into the system to test? And when you tried the cards on their own they did or didn't work at all? That wasn't very clear in your original post, just want to clarify it. 
     
    Seems odd that the simple addition of water blocks would have this effect on a healthy system. What else (if anything) did you change?
  9. Informative
    infamous_last reacted to Dietrichw in 2016 MacBook Pro drops digital audio out on 3.5mm jack   
    I forgot that Apple even included a 3.5mm SPDIF combo. I remember looking it up once because I was confused why I could see a red glow from the 3.5mm jack of a mac mini.
     

     
  10. Like
    infamous_last reacted to limetechjon in 2 Gamers, 1 CPU - Virtualized Gaming Build Log   
    Sorry, must have missed that part.  That said, it's a pretty big miss considering NVIDIA has the majority share of the gaming GPU market.  
     
     
     
    I was simply calling out a design difference in how you proposed your system as opposed to how unRAID works natively. I agree, you could have chosen differently, but you didn't, so I simply highlighted that. In addition, a hardware-based RAID like this is far less portable. With unRAID, you could literally buy all new hardware except your storage devices, move them over, start it up, and everything picks up right where it left off. Your hardware-based RAID solution doesn't offer that kind of portability.
     
    Also, the reason we call it a cache pool is because its primary function is to cache write operations from the parity protected array, then move them over to the array at a later time so as to improve write performance. That is its primary function and why we refer to it as such. The ability to force data to live on the cache pool is just a feature, but I agree, IT terminology can be confusing in general, so I don't hold you at fault for this misunderstanding.
     
     
     
    Have you tried unRAID?  It's not the same setup at all.  You plug a USB stick into a system, copy the files to it (or you can buy one pre-configured from us), and then you boot it up.  There is no "install" to another device at that point.  The flash device is already the installed version of unRAID.
     
    As far as needing a separate device from the hypervisor, you can buy the USB pre-configured flash from us to avoid even needing an x86 computer at all for configuration (just do it over a tablet/smartphone using a browser).  You also could do the entire thing from a Mac OS X device which you can't do with VMWare.  Even VMWare fusion doesn't have the same management options as the Windows-based client.
     
    As far as someone wanting to do this already having a separate Windows-based system, our aim is to remove the need for that second system altogether.  One master system with proper resource partitioning.
     
     
     
    Well, I'm comparing to someone who is not familiar with setting up the things you've done here. I don't think many would know how to use all the VMware tools nor how to set up WHS2011. It also required the extra software you mentioned. It just seems like lots of layered technology, whereas with unRAID we deliver all that functionality out of the core OS itself.
     
     
     
    The main reason is to call out the extra stuff unRAID can do over just ESXi by itself.  And while you may not have Apple in your computing family, there are many that do and would love to utilize their NAS for Time Machine functionality.  And while yes, you could install another NAS OS as a VM, again, that's another product / solution you have to install and master.  There are lots of layers of technology involved from multiple vendors.  If you want something that acts as a NAS out of the box and offers a simple way to pool/manage storage, but also want to consolidate your desktop PC into it as well, ESXi is a much more complicated animal given that you have to master it and another system for storage management.  It's not as simple as "just install a NAS OS and presto, I have SMB shares."
     
     
     
    It will be significantly different.  First, ESXi requires that you have an emulated graphics device in addition to the passed through device, which makes the emulated graphics the primary graphics.  It also means performance overhead for ESXi and 3D graphics compared to native pass through with QEMU/KVM that we're doing on unRAID.  With unRAID, we can specify an option to the hypervisor on unRAID to not create any emulated graphics adapter, and then let the GPU naturally take that over itself.  This means you install your Windows VM itself on the monitor, not through a remote VNC session.
     
    Also, unRAID isn't doing much in the background and you can completely isolate CPU cores for NAS services from VMs, which allows for things like what Linus did to be possible (eliminating context-switching).
     
    There are many who talk about poor performance with PCI pass through for gaming on VMware and other solutions.  The methods built into QEMU/KVM are definitely cutting edge, support a wider array of GPUs, offer near bare-metal performance, and with unRAID, take much less time to configure.
     
     
     
    I do know, but you're missing the point.  Your system is doing all those things just the same, so isn't it still a jack of all trades?  Your argument is that the layer that controls creating VMs should be isolated from the layer to manage storage.  My argument is that it's not necessary and virtualizing just for the sake of virtualizing doesn't make sense to me.  Look up the trends in IT convergence and you'll see even the big boys are moving towards a model of converging hardware appliances into less and less physical equipment.  We are simply offering that same capability at a consumer/prosumer scale.
  11. Agree
    infamous_last reacted to GreezyJeezy in Hyper Evo 212 Broke Computer?   
    is it plugged into the cpu fan header? 
  12. Agree
    infamous_last reacted to SonoDanshi in Project: Node way will it fit! - A hardline watercooled Node202   
    Final update time!
     
    So carrying on from the last update, as I had everything connected I figured it was time to leak test. Time for lots of paper towels and a light color tint to the water to highlight any problems.
     
    As the res was so small and the connection to the pump is right near the top, I had to add a temporary tube to the fill port that gave me enough liquid to cycle the pump on and off without it running dry for too long. The tube also helped with bleeding, as it allowed air to bubble up and escape with the water level higher than the res.

     
    To my complete and utter surprise absolutely nothing leaked at all and after adding a bit more dye to the loop it looked like this:

     

     

     

     
    Here's a better look at the res when fully topped up.

     

     
    And finally... here she is with everything in place and ready to roll.
     

     
    I haven't done a great deal of benching and haven't done any overclocking yet, but the CPU idles at 24C and peaked at 67C on the highest core when benchmarking. I definitely plan to push things beyond stock, but with such a small thermal capacity I imagine temps will spike very quickly beyond a certain point. I also have plans to cut out the top of the Node 202 and replace it with some beveled tempered glass so I can actually see all the hard work that has gone into it.
  13. Funny
    infamous_last reacted to Notional in Doom (4?) performance on Radeon and GeForce   
    Not a huge surprise. OpenGL is one obsolete redundant clusterfrack of an API.
     
    The results are also all over the place. Vulkan is the only thing useful outside of DirectX. And since id Software hates DX, because they like to cater to the 5 people gaming on Apple, and the 50 people typing into the Linux console to make a driver work to start the intro screen on a game before their computer catches on fire, they'd better include the Vulkan version soon. What a joke. A AAA title using OpenGL.
     
  14. Agree
    infamous_last reacted to shadowbyte in Goodbye 980Ti   
    well time to buy a 980ti for cheap
  15. Agree
    infamous_last reacted to Project37 in Is there room in the industry for non-degreed, self-taught people?   
    If you want to get into programming, you could build your own apps for Android or ios. There are plenty of free tutorials out there to get you started. This will get your feet wet in code, and maybe make some money if an app becomes popular. 
     
    Coding is not the only IT avenue. There is also the infrastructure side: setting up networks and servers, solving business problems with technology. You can get started in this fairly easily by getting your CompTIA A+ certification and finding a user support job, like the Geek Squad, or better yet, in-house support for a company or a call center. Then you will start gaining experience.
     
    There is plenty of room for people with certifications, though not as much in programming. But with certs, and by getting successful projects under your belt, you can go a long way. Interviewers want to hear what the problem was, what you did to resolve it and what the outcome was. Quantifying your improvements into a positive outcome is huge.
     
    The business world is all about boasting your accomplishments in a professional manner in my opinion. 
  16. Agree
    infamous_last got a reaction from don_svetlio in 980 or a fury x?   
    Prices are in AUD, OP is in Australia.
    Quoted RRP is USD
     
    Still more expensive than it should be (IMO), but you're not taking into account exchange rates.
     
     
     
  17. Agree
    infamous_last reacted to HKZeroFive in 980 or a fury x?   
    Why not just go for a cheaper GTX 980Ti? That variant is ASUS' most premium offering. Wouldn't hurt to go for a STRIX.
     
    Is it possible to buy from a different store rather than Umart?
  18. Like
    infamous_last reacted to Jidonsu in The Water Cooling Gallery   
    Built my first liquid cooled pc this week. Pretty happy with how it turned out.
     
  19. Agree
    infamous_last reacted to GoodBytes in Microsoft's Conspiracy within windows 10   
    No. The only thing Microsoft does is move updates for older OSes to older servers, so downloading updates is a bit slower.
    This is absolutely nothing new; they have been doing this since Windows 95.
     
    As for the Check for Update that takes ages, see @Hawx post.
     
    Microsoft is NOT slowing down older Windows. If anything, you see the new and improved Windows as being faster and more responsive, and you think your current Windows was purposely slowed down; that isn't the case. It must also be noted that Windows 8 improved the Windows Update system, and Windows 10 has a much improved Windows Update system which checks for updates WAY faster than previous versions of Windows, similarly to how Windows 7 is faster than Windows XP.
     
    All versions of Windows Update pretty much share a similar problem: the more updates there are, the slower it is at checking for updates. It is something that Microsoft has been improving over the years with different versions of Windows.
  20. Agree
    infamous_last got a reaction from paddy-stone in What is the best VPN provider?   
    Been using PIA for years and have been very happy.
    Had no issues with the desktop client or using OpenVPN.
  21. Agree
    infamous_last reacted to scottyseng in Help with a 3D (CAD) Rendering/Occasional Gaming Powerhouse   
    Yeah, I would also recommend going to X99. The 5820K (do note it has fewer PCIe lanes) or 5930K are really good CPUs. If you're working in Revit, you're definitely going to want a faster CPU; Revit uses the CPU for pretty much everything, and all geometry is rendered by the CPU. I use a FirePro V7900 and the graphics aren't my bottleneck, it's my 2500K (OC'd to 4.6GHz). I have a 14-core Xeon (E5-2695 v3) in my NAS with the same FirePro V7900, and the speed difference is quite drastic. However, I just got done building and overclocking a 5820K-based computer for my cousin; I got it clocked to 4.5GHz and it's not too far behind my 14-core Xeon (which is stuck at 2.2GHz). The 5820K pulls more power though (1.21 vCore vs 0.89 on the Xeon).
     
    I think you should be fine with the Nvidia consumer GPUs unless you're dealing with massive models (Though with Revit it matters less since it's more CPU biased). I personally bought a FirePro because I was getting weird graphical glitches with my Radeon GPU (HD 4850) in Revit (like walls that didn't exist, artifacts, roofs that didn't exist where they were supposed to). However, that was back in Revit 2008...These days Revit is pretty good. If you have Nvidia GPUs, you can utilize the Nvidia Raytracer renderer that they added in Revit 2016 (I didn't notice until I did a practice render a few weeks ago). The AMD FirePros have a better performance per dollar compared to the Quadros (Though Quadros have better drivers), but neither can touch consumer GPUs (I think you should only get workstation GPUs if you're using software that can use it correctly (3DS Max, AutoCAD, etc)).
     
    Also, I see you have four WD Blue Drives there. Are you planning on having a RAID array? If so, you should change to using WD Red drives.
  22. Informative
    infamous_last reacted to nicklmg in Dell XPS 15 9550 Review   
    Amazon: http://geni.us/yJw
    NCIX: http://bit.ly/1WST0jS
     
    dbrand: https://dbrand.com/xpslinus
     
    Does the Dell XPS 15 deliver a solid all-around experience for a fair price?
     
     
     
  23. Like
    infamous_last reacted to LinusTech in Dell XPS 15 9550 Review   
     
    I don't have an external GPU box yet.. So stay tuned.
  24. Agree
    infamous_last reacted to ImBleu in Running 4 70inch TVs   
    Any modern GPU capable of 4 outputs can run 4x 1440p TVs, but for gaming you'll need quite a lot of horsepower; think SLI 980 Ti or CrossFire R9 390X.
     
    The physical size makes no difference to the GPU: whether your 1440p screen is 1 inch or 1000 inches, it still has to send signals to the same number of pixels.
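    (Putting numbers on that: four 1440p panels is a fixed pixel count per frame regardless of panel size, and it works out to a bit under double a single 4K display.)

    ```python
    # Four 1440p TVs: total pixels per frame, independent of physical size
    w, h, n = 2560, 1440, 4
    total = w * h * n
    print(total)                            # 14745600 pixels per frame
    print(round(total / (3840 * 2160), 2))  # 1.78x the pixels of one 4K display
    ```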
  25. Agree
    infamous_last reacted to Arty in Baking Graphic card is not a FIX !! please stop making videos abut it.   
    if your GPU is already dead, it can't get worse lmao