
ThE_MarD

Member
  • Posts

    964
  • Joined

Awards

This user doesn't have any awards

3 Followers

About ThE_MarD

  • Birthday Jun 06, 1985

Profile Information

  • Gender
    Male
  • Location
    Alberta, Canada eh?
  • Occupation
    Wellhead Isolation (Oilfield)

System

  • CPU
    Intel i7-3770k @ 4.2GHz
  • Motherboard
    Gigabyte Z68-UD7-B3 with BIOS F11a
  • RAM
    16.0GB (4x4GB) Kingston Dual-Channel DDR3 @ 2133MHz (11-12-11-30 @ 1.65V)
  • GPU
    EVGA GeForce GTX 680 2GB Two-Way SLI
  • Case
    NZXT Phantom 530 (Red) ATX Full Tower
  • Storage
    Kingston SSDNow V+200 120GB 2.5" SSD (Windows) // Seagate 4TB 3.5" 5900RPM Hybrid (Data) // Crucial V4 64GB 2.5" SSD (World of Tanks hehe) // Seagate Barracuda 500GB 3.5" 7200RPM (Linux)
  • PSU
    Corsair 850W ATX12V / EPS12V
  • Display(s)
    Asus VE278Q 27.0" 60Hz via DisplayPort (Using RGB Full)
  • Cooling
    Corsair H80 92.0 CFM Liquid
  • Keyboard
    Logitech G15
  • Mouse
    Logitech G400s Wired Optical
  • Sound
    Creative Labs Soundblaster Recon3D PCI-E
  • Operating System
    Windows 10 Technical Preview
  • PCPartPicker URL

Recent Profile Visitors

1,774 profile views
  1. Heyyo, Yeah your best bet might be to just use a virtual machine for running Mac OS X within Linux. As for running Windows programs in Linux? You have the choice of WINE (PlayOnLinux if you want an easier way to do it with automatic install scripts for certain games/apps) or a VM for that too. Of course, with a VM or a compatibility layer like WINE? It won't run as fast as it does natively in its OS of choice.
  2. Heyyo, Yeah sadly it was dead... but hey, I'm sure it'll cool just as well as any zip-tied CPU cooler on a GPU. Yeah no doubt, it'd be a monstrous little unit. Maybe it'll get the R9 Nano treatment too. Indeed. The hardest part would probably be setting up the watercooling loop. The first batch of OCZ SSDs back when it was $200 for a 60GB SSD? Yeah, they weren't so great and suffered stability issues when used in RAID... but my bud got his RMA'ed for a newer revision that worked fine and sold me one a few years ago. It's still going and working great. OCZ does have a good warranty setup on their SSDs at least.
  3. Heyyo, For me? Warcraft 2 had so so many damn good songs! You could even pop the CD into a CD player and listen to the whole soundtrack which was awesome... BUT, I think my favorite video game song is still Command & Conquer Red Alert's Hell March. That would be my RTS pick... for FPS? Probably Unreal Tournament '99 GOTY Soundtrack - Organic... that or Go Down... tough choice!
  4. Heyyo, Yep. This proves that the AMD drivers indeed have a lot more overhead than NVIDIA's in DirectX 11. An AMD R9 280 is a faster GPU than the GTX 750 Ti... unless the CPU is a bottleneck. In that case? The more efficient NVIDIA drivers and Maxwell are just as good for a lot lower cost... well, at least in GTA V. As for DirectX 12 Ashes of the Singularity? Meh, this time it looks like NVIDIA put out some poop drivers instead of AMD for a change. If you go back to that Anandtech article with the Star Swarm DirectX 12 preview? NVIDIA scaled properly in that and you could see the AMD driver bottleneck in DirectX 11... where for some reason even a GTX 750 Ti got a higher framerate. It could also be a serious AMD DirectX 11 driver bug in Star Swarm. http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm So I dunno, I say let's wait and see what NVIDIA does working with Oxide Games to fix their driver's performance issues with Ashes of the Singularity. This kind of reminds me of id Software's id Tech 5 engine game... RAGE. ATi put out the wrong driver and everyone with an ATi GPU was raging (pun not intended... but left in heh) at id Software until John Carmack said "they put out the wrong driver." Afterwards? Proper drivers released and all was well again. History repeats itself meow with Ashes of the Singularity, this time as NVIDIA's fault. Oxide Games also said it nicely here: http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/ Also... dammit! I knew that DirectX 12 is only single-GPU right meow... couldn't get my SLI setup to work with the leaked Unreal Engine 4.9 DirectX 12 Elemental Demo... I didn't bother spending $50 on the Ashes of the Singularity founder's pack and I'm glad I didn't meow. I'm not that big into RTS games anymore anyways... plus, I can't even test SLI in it? Meh. I'll just wait for future benchmarks and reviews once it goes into beta I guess.
  5. Heyyo, http://www.pcper.com/reviews/Graphics-Cards/DX12-GPU-and-CPU-Performance-Tested-Ashes-Singularity-Benchmark So... This benchmark has a bum run for NVIDIA drivers right meow. I guess for a change NVIDIA put out some bad drivers instead of AMD. Lol. Dat search function though! It's hard to use.
  6. Heyyo, Such is the way with computers. Here's a handy tool called Ninite that I saw in an old LTT or NCIX video; it'll auto-install some of the basic programs for a fresh OS install. I like it and I use it. https://ninite.com/
  7. Heyyo, You can't in that sense. The only way you could attempt something like that is to first back up your data to another HDD and delete enough extra data to fit within the size of the SSD, then clone your HDD to the SSD and then put your data back on your HDD... Ultimately? Your best bet is to just reinstall your OS from scratch on the SSD. It'll also ensure that you get proper performance. It's the safest and cleanest bet.
  8. Heyyo, Memory usage is up to the game engine. Unless DirectX 12 has more optimized texture compression (and so far there hasn't been a single mention of that), it will probably have the same memory usage as now. I wouldn't worry too much about 2GB on a GTX 960 though, you'll still run out of GPU power before you hit a VRAM limitation. If you ran SLI on GTX 960 2GB? There is a chance you'd run into a VRAM wall... but of course that depends on whether the game engine supports VRAM doubling or not. From everything I've gathered so far? VRAM doubling on multi-GPU setups will vary depending on game engine support, as it is part of the multi-adapter setup and NOT handled natively by DirectX (there's a rough sketch of that after this list). It does make sense though, as I have not heard about AMD Mantle doing VRAM sharing between GPUs on Battlefield 4; it only happens in Civilization: Beyond Earth... which probably doesn't even touch 4GB since it's an isometric strategy game and not GTA V lol. I bet GTA V would benefit greatly from AMD Mantle / DirectX 12 / Vulkan... since even the boost in draw calls per second on a detailed open-world game would make the game run better... especially the multi-adapter VRAM sharing in the case of my 2GB GTX 680 SLI setup.
  9. Heyyo, SSDs only affect load times, not in-game performance. The only time an SSD affects performance vs an HDD? Stuff like photo and video editing that uses a scratch disk. With that said? Loading times are shortened A LOT if you get a proper SSD like the Samsung 850 Evo series.
  10. Heyyo, Hmm, apparently the target audience is people from Dubai that are addicted to sex... good to know lol.
  11. Heyyo, Tbh I've always found that adaptive vsync doesn't help with screen tearing; all I've noticed is that it tends to fix stuttering when the framerate drops hard. I always got screen tearing with it. Definitely try turning off adaptive vsync, just use regular vsync, and cap the framerate at either 28fps or 30fps.
  12. Heyyo, good answer amigo. Oops on the second part! I didn't notice DSfix is in use... hmm... Maybe use NVIDIA Inspector? Check that the profile is running on the right exe (launch the game, alt-tab and use Task Manager, right-click Dark Souls and select "show process" and it'll give you a .exe, then right-click that .exe and "open file location"). Next? Try forcing VSYNC on in there and then set the framerate target to 28fps. See if that helps. If that doesn't work? I can always try downloading the game and tinkering with it too.
  13. Heyyo, @Majestic Use DSfix and try tinkering with the settings please: http://steamcommunity.com/sharedfiles/filedetails/?id=357056859 Maybe if you're pushing your graphics so hard that your PC can't maintain 60fps, sure... but I've been playing with Vsync on for the longest time and I've never had massive stuttering issues.
  14. Heyyo, I mainly wanna discuss what @Senzelian wrote as I do like his answers but wanted to say a few things on them. It should be noted that with the GTX 980 there is a driver bug related to MSAA. Of course, NVIDIA blamed the game devs and the game devs say it's NVIDIA's fault. Lots of that going around, blaming others these days... AMD blames people, NVIDIA blames people... blame blame blame... it's getting fucking lame lol. You can Google it and see for yourself. So for DirectX 12 benchmarks? I'd still hold off... Ashes of the Singularity is a pre-beta benchmark... this was just the first true taste, as the leaked/hacked Unreal Engine 4.9 DirectX 12 Elemental Demo isn't optimized properly... My GTX 680 SLI benchmarks in DirectX 11 were fine even in single-GPU mode... but DirectX 12 is single-GPU only and was scoring about 15% slower than DirectX 11. Too true. It sucks that some games tend to prefer one GPU make over the other, but luckily for 2015? That trend only shows up in a handful of games, whereas most others scale very nicely. Exactly. I wouldn't worry about the cost difference to run them unless it was a dedicated server running 24/7 and you planned on running it for a decade. JayzTwoCents on YouTube even did a thing about it for AMD vs Intel CPUs... four hours a day, every day for a year worked out to about $9, but that $9 fluctuates depending on your provider's cost per kWh... so at most, if it was double his cost? $18 a year? (There's a quick cost sketch after this list.) The R9 390 is a very decently overclocked card on a refreshed Hawaii GPU. The cooling provided on the MSI DirectCU II is really damn good so I wouldn't worry so much about thermals as long as your case has great ventilation and airflow. I used to own two stock-cooler EVGA GTX 480s in SLI... THAT... was some serious heat hahaha... so I wouldn't even worry about it on the AMD R9 390 unless you wanted to do Crossfire. Even then? A little thermal throttling is all that might happen if it's a crazy-hot summer day. With all that in mind? I do agree that for non-reference coolers? The AMD R9 390 is a better purchase, especially if you plan on doing a multi-GPU upgrade in the future. 4GB of VRAM is a very healthy place to be at 2560x1440, and all that extra VRAM on the AMD R9 390 (8GB) will help it out. From what I've seen and read so far for DirectX 12? That whole VRAM stacking will depend on developers implementing it, same with split-frame rendering... so... that could suck if you buy a game and find out it only does alternate-frame rendering with no stacked VRAM. An SLI GTX 970 setup could feel the pain of hitting that 3.5GB threshold and a potential performance drop. I say that's not worth the risk since both cards perform about the same and cost about the same.
  15. Heyyo, Yeah, AMD really does need to step up their mobile GPU game. I remember back in the day? The AMD Mobility Radeon 5650 was awesome; my $700 CAD Gateway NV5905H notebook could play Mass Effect 2 very nicely... but ever since then? I haven't seen a lot of good performers unless you go for an APU... but nowadays for an $800 notebook? APUs get outclassed by the GTX 950M. I bought my mom a Lenovo notebook with an A6 APU for around $500 CAD and it's badass... but that's the thing, it was $500. Spend a couple hundred more bucks like the OP wants to? A discrete GPU is the way to go. If you don't need that 1TB HDD? The i7 version will game better, but not by a crapton. One thing I dislike about how manufacturers do model numbers on notebooks? They're nowhere near the same product as their desktop counterparts. The i5-5200U here on Intel's spec website: http://ark.intel.com/products/85212/Intel-Core-i5-5200U-Processor-3M-Cache-up-to-2_70-GHz The i7-5500U here on Intel's spec website: http://ark.intel.com/products/85214/Intel-Core-i7-5500U-Processor-4M-Cache-up-to-3_00-GHz As you can see, both are dual cores with hyper-threading. The main differences are the cache size and clock speeds. 300MHz will help a bit, sure, but that bigger SSD might benefit you more with a more responsive notebook. It'll also have better power savings than an HDD and less heat due to no moving parts. Another prime example of misleading models between notebooks and desktops? AMD's R9 290... the AMD R9 M290X... it's slower than a desktop AMD R9 270X. http://hothardware.com/reviews/alienware-17-amds-r9-m290x-goes-mobile?page=6 So yeah, it isn't a lot more expensive to get the higher-end Asus notebook with a bigger SSD and slightly faster CPU... so if you've got the spare cash? Heck yeah, I say go for it.
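A rough sketch of the VRAM point from post 8 above, under the stated assumption that alternate-frame rendering (SLI/CrossFire) mirrors resources on every GPU while only engine-side DirectX 12 multi-adapter work could split them; the function and numbers are illustrative, not any real API.

    # Illustrative Python sketch of the "VRAM doubling" reasoning in post 8.
    # Assumption: AFR mirrors resources on each GPU, so the usable pool equals
    # one card's VRAM; only an engine that splits resources across adapters
    # (DirectX 12 explicit multi-adapter) could approach the combined total.

    def effective_vram_gb(per_card_gb, num_gpus, engine_splits_resources):
        """Roughly usable VRAM pool in GB under the assumptions above."""
        if engine_splits_resources:
            return per_card_gb * num_gpus  # best case: resources partitioned across GPUs
        return per_card_gb                 # AFR mirroring: each GPU holds a full copy

    # Example: 2x GTX 680 2GB in SLI
    print(effective_vram_gb(2, 2, False))  # 2.0 GB usable today (AFR)
    print(effective_vram_gb(2, 2, True))   # 4.0 GB only if the engine splits resources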
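And a quick sketch of the electricity-cost math from post 14; the 50W draw difference and $0.12/kWh rate are assumed example numbers (not figures from the JayzTwoCents video), picked only to show how the ~$9/year ballpark falls out.

    # Hypothetical numbers: ~50W extra draw, 4 hours/day, $0.12 per kWh.
    # 50W * 4h * 365 days = 73 kWh/year; at $0.12/kWh that's about $8.76,
    # i.e. the ~$9/year ballpark, or roughly $18/year if your rate is double.

    def yearly_cost(extra_watts, hours_per_day, rate_per_kwh):
        """Added yearly electricity cost in dollars."""
        kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        return kwh_per_year * rate_per_kwh

    print(round(yearly_cost(50, 4, 0.12), 2))   # 8.76
    print(round(yearly_cost(50, 4, 0.24), 2))   # 17.52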