Everything posted by ThE_MarD

  1. Heyyo, Yeah, your best bet might be to just use a virtual machine for running Mac OS X within Linux. As for running Windows programs in Linux? You have the choice of WINE (PlayOnLinux if you want an easier way to do it, with automatic install scripts for certain games/apps) or a VM for that too. Of course, with a VM or a compatibility layer like WINE? It won't run as fast as it would natively in its OS of choice. There's a quick sketch of launching something through WINE below.
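     For illustration, a minimal Python sketch of launching a Windows program through WINE (assumes the `wine` command is installed and on your PATH; "app.exe" is a hypothetical program in the current directory):

     ```python
     # Minimal sketch: run a Windows .exe through WINE from a script.
     # Assumes `wine` is installed and on PATH; "app.exe" is hypothetical.
     import subprocess

     subprocess.run(["wine", "app.exe"], check=True)
     ```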
  2. Heyyo, Yeah, sadly it was dead... but hey, I'm sure it'll cool just as well as any zip-tied CPU cooler on a GPU. Yeah, no doubt. It'd be a monstrous little unit. Maybe it'll get the R9 Nano treatment too. Indeed. The hardest part would probably be to set up the watercooling loop. The first batch of OCZ SSDs, back when it was $200 for a 60GB SSD? Yeah, they weren't so great and suffered stability issues when used in RAID... but my bud got his RMA'ed for a newer revision that worked fine and sold me one a few years ago. It's still going and working great. OCZ does have a good warranty setup on their SSDs, at least.
  3. Heyyo, For me? Warcraft 2 had so, so many damn good songs! You could even pop the CD into a CD player and listen to the whole soundtrack, which was awesome... BUT, I think my favorite video game song is still Command & Conquer Red Alert's Hell March. That would be my RTS pick... for FPS? Probably Organic from the Unreal Tournament '99 GOTY soundtrack... that or Go Down... tough choice!
  4. Heyyo, Yep. This proves that the AMD drivers indeed have a lot more overhead than NVIDIA's in DirectX 11. An AMD R9 280 is a faster GPU than the GTX 750 Ti... unless the CPU is a bottleneck. In that case? The more efficient NVIDIA drivers and Maxwell are just as good for a lot lower cost... well, at least in GTA V.
     As for DirectX 12 Ashes of the Singularity? Meh, this time it looks like NVIDIA put out some poop drivers instead of AMD for a change. If you go back to that AnandTech article on the Star Swarm DirectX 12 preview? NVIDIA scaled properly in that, and you could see the AMD driver bottleneck in DirectX 11... where for some reason even a GTX 750 Ti got a higher framerate. It could also be a serious AMD DirectX 11 driver bug in Star Swarm. http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm
     So I dunno, I say let's wait and see what NVIDIA does working with Oxide Games to fix their driver's performance issues in Ashes of the Singularity. This kind of reminds me of id Software's id Tech 5 engine game... RAGE. ATi put out the wrong driver and everyone with an ATi GPU was raging (pun not intended... but left in, heh) at id Software until John Carmack said "they put out the wrong driver." Afterwards? Proper drivers were released and all was well again. History repeats itself meow with Ashes of the Singularity, this time with NVIDIA at fault. Oxide Games also said it nicely here: http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/
     Also... dammit! I knew that DirectX 12 is only single-GPU right meow... couldn't get my SLI setup to work with the leaked Unreal Engine 4.9 DirectX 12 Elemental demo... I didn't bother spending the $50 on the Ashes of the Singularity founders pack, and I'm glad I didn't meow. I'm not that big into RTS games anymore anyways... plus, I can't even test SLI in it? Meh. I'll just wait for future benchmarks and reviews once it goes into beta, I guess.
  5. Heyyo, http://www.pcper.com/reviews/Graphics-Cards/DX12-GPU-and-CPU-Performance-Tested-Ashes-Singularity-Benchmark So... This benchmark has a bum run for NVIDIA drivers right meow. I guess for a change NVIDIA put out some bad drivers instead of AMD. Lol. Dat search function though! It's hard to use.
  6. Heyyo, Such is the way with computers. Here's a handy tool that I saw on an old LTT or NCIX video called Ninite where it'll auto-install some of the basic programs for a fresh OS install. I like it and I use it. https://ninite.com/
  7. Heyyo, You can't, in that sense. The only way you could attempt something like that is to first back up your data to another HDD, delete enough extra data that what's left fits on the SSD, clone the HDD to the SSD, and then put your data back on the HDD... Ultimately? Your best bet is to just reinstall your OS from scratch on the SSD. It'll also ensure you get proper performance. It's the safest and cleanest bet. (If you do try the shrink-and-clone route, there's a quick size sanity check below.)
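     For illustration, a minimal Python sketch of that first sanity check: does the data actually used on the old drive even fit on the SSD? The drive path and SSD size below are assumptions; adjust for your own setup.

     ```python
     # Minimal sketch: check whether the used data on the old drive
     # would fit on the new SSD before attempting a clone.
     # "C:\\" and the 240GB capacity are assumptions for illustration.
     import shutil

     hdd = shutil.disk_usage("C:\\")
     ssd_capacity = 240 * 1000**3  # hypothetical 240GB SSD, in bytes

     print(f"Used on HDD: {hdd.used / 1000**3:.1f} GB")
     print("Fits on SSD" if hdd.used < ssd_capacity else "Delete more data first")
     ```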
  8. Heyyo, Memory usage is up to the game engine. Unless DirectX 12 brings more optimized texture compression (and so far I haven't seen a single mention of that)? It will probably have the same memory usage as now. I wouldn't worry too much about 2GB on a GTX 960 though; you'll still run out of GPU power before you hit a VRAM limitation. If you ran SLI on GTX 960 2GB cards? There is a chance you'd run into a VRAM wall... but of course that depends on whether the game engine supports VRAM doubling or not.
     From everything I've gathered so far? VRAM doubling on multi-GPU setups will vary depending on game engine support, as it is part of the explicit multi-adapter setup and NOT native to DirectX. It does make sense though, as I have not heard about AMD Mantle doing VRAM sharing between GPUs in Battlefield 4; it only happens in Civilization: Beyond Earth... which probably doesn't even touch 4GB since it's an isometric strategy game and not GTA V lol. I bet GTA V would benefit greatly from AMD Mantle / DirectX 12 / Vulkan... since even the boost in draw calls per second on a detailed open-world game would make the game run better... especially the multi-adapter VRAM sharing, in my case of 2GB GTX 680 SLI.
  9. Heyyo, SSDs only affect load times, not in-game framerate. The only time one affects performance vs an HDD? Stuff like photo and video editing that uses a scratch disk. With that said? Loading times are shortened A LOT if you get a proper SSD like the Samsung 850 Evo series. (If you want to see the gap for yourself, there's a rough read-speed sketch below.)
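     For illustration, a rough Python sketch that times a big sequential read from each drive. The file paths are hypothetical, and you'd want a file of a few GB plus a reboot between runs so the OS page cache doesn't fake the second result:

     ```python
     # Rough sketch: compare sequential read speed of two drives by
     # timing a large file read. Paths are hypothetical examples.
     import time

     def read_mb_per_s(path, chunk_size=1 << 20):
         total = 0
         start = time.perf_counter()
         with open(path, "rb") as f:
             while True:
                 data = f.read(chunk_size)
                 if not data:
                     break
                 total += len(data)
         return (total / (1 << 20)) / (time.perf_counter() - start)

     print("HDD:", read_mb_per_s("D:/big_file.bin"), "MB/s")
     print("SSD:", read_mb_per_s("C:/big_file.bin"), "MB/s")
     ```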
  10. Heyyo, Hmm, apparently the target audience is people from Dubai that are addicted to sex... good to know lol.
  11. Heyyo, Tbh, I've always found that adaptive vsync doesn't fix screen tearing; all I've noticed is that it tends to reduce stuttering when the framerate drops hard. I always got screen tearing with it. Definitely try turning off adaptive vsync, use regular vsync instead, and cap the framerate at either 28fps or 30fps.
  12. Heyyo, Good answer, amigo. Oops to the second part! I didn't notice DSfix is in use... hmm... Maybe use NVIDIA Inspector? Check that the profile is running on the right exe (launch the game, alt-tab and use Task Manager, right-click Dark Souls and select "show process"; it'll give you a .exe, then right-click that .exe and "open file location"). Next? Try forcing VSYNC on in there and then set the framerate target to 28fps. See if that helps. If that doesn't work? I can always try downloading the game and tinkering with it too.
  13. Heyyo, @Majestic Use DSfix and try tinkering with the settings please: http://steamcommunity.com/sharedfiles/filedetails/?id=357056859 Maybe if you're pushing your graphics so hard that your PC can't maintain 60fps, sure... but I've been playing with vsync on for the longest time and I've never had massive stuttering issues. The framerate keys in dsfix.ini are shown below.
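     For reference, the framerate-related settings live in dsfix.ini. A hedged excerpt from memory (key names may differ by DSfix version, so check the comments in your own dsfix.ini):

     ```ini
     ; dsfix.ini (excerpt) -- values are illustrative, not a recommendation
     unlockFPS 1    ; remove the 30fps cap (known to cause physics quirks in DS1)
     FPSlimit 60    ; cap the unlocked framerate
     ```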
  14. Heyyo, I mainly wanna discuss what @Senzelian wrote, as I do like his answers but wanted to say a few things about them. It should be noted that with the GTX 980 there is a driver bug due to MSAA. Of course, NVIDIA blamed the game devs and the game devs say it's NVIDIA's fault. Lots of that going around, blaming others these days... AMD blames people, NVIDIA blames people... blame blame blame... it's getting fucking lame lol. You can Google it and see for yourself. So for DirectX 12 benchmarks? I'd still hold off... Ashes of the Singularity is a pre-beta benchmark... this was just the first true taste, as the leaked/hacked Unreal Engine 4.9 DirectX 12 Elemental demo isn't optimized properly... My GTX 680 SLI benchmarks in DirectX 11 were fine even in single-GPU mode... but DirectX 12 is single-GPU only and was scoring about 15% slower than DirectX 11.
     Too true. It sucks that some games tend to prefer one GPU make over the other, but luckily for 2015? That trend is only in a handful of games, whereas most others scale very nicely.
     Exactly. I wouldn't worry about cost differences to run it unless it was a dedicated server running 24/7 and you planned on running it for a decade. JayzTwoCents on YouTube even did a thing about it for AMD vs Intel CPUs... four hours a day, every day for a year worked out to about $9, but that $9 fluctuates depending on your provider's cost per kWh... so at most, if it was double his cost? $18 a year? (Quick sanity-check math below.)
     The R9 390 is a very decently overclocked card on a refreshed Hawaii GPU. The DirectCU II cooling is really damn good, so I wouldn't worry so much about thermals as long as your case has great ventilation and airflow. I used to own two stock-cooler EVGA GTX 480s in SLI... THAT... was some serious heat hahaha... so I wouldn't even worry about it on the AMD R9 390 unless you wanted to do Crossfire. Even then? A little thermal throttling is all that might happen if it's a crazy-hot summer day.
     With all that in mind? I do agree that for non-reference coolers? The AMD R9 390 is a better purchase, especially if you plan on doing a multi-GPU upgrade in the future. 4GB of VRAM is a very healthy place to be at 2560x1440, and all that extra VRAM on the AMD R9 390 will help it out. From what I've seen and read so far for DirectX 12? That whole VRAM stacking will depend on developers implementing it, same with split-frame rendering... so... that could suck if you buy a game and find out it only does alternate-frame rendering with no VRAM stacking. An SLI GTX 970 setup could feel the pain of that 3.5GB threshold before a potential performance drop. I say that's not worth the risk, since both cards perform about the same and cost about the same.
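     For the electricity claim, the math is easy to sanity-check. A minimal Python sketch with assumed numbers (the 100W delta and $0.06/kWh rate are illustrative guesses, not Jay's exact figures):

     ```python
     # Quick sanity check on the "~$9/year extra" claim.
     # extra_watts and rate_per_kwh are assumptions for illustration.
     extra_watts = 100      # assumed extra draw of the hungrier part
     hours_per_day = 4
     rate_per_kwh = 0.06    # $/kWh; varies a lot by provider

     kwh_per_year = extra_watts / 1000 * hours_per_day * 365
     print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * rate_per_kwh:.2f}/year")
     ```

     That lands at about 146 kWh and $8.76 a year, right in line with the ~$9 figure; double the rate and you get the ~$18 worst case.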
  15. Heyyo, Yeah, AMD really does need to step up their mobile GPU game. I remember back in the day? The AMD Mobility Radeon 5650 was awesome in my Gateway NV5905H notebook; for $700 CAD it could play Mass Effect 2 very nicely... but ever since then? I haven't seen a lot of good performers unless you go for an APU... and nowadays, for an $800 notebook? APUs get outclassed by the GTX 950M. I bought my mom a Lenovo notebook with an A6 APU for around $500 CAD and it's badass... but that's the thing: for $500. Spend a couple hundred more bucks like the OP wants to? A discrete GPU is the way to go.
     If you don't need that 1TB HDD? The i7 version will game a bit better, but not by a crapton. One thing I dislike about how manufacturers do model numbers on notebooks? They're nowhere near the same product as their desktop counterparts. The i5-5200U here on Intel's spec website: http://ark.intel.com/products/85212/Intel-Core-i5-5200U-Processor-3M-Cache-up-to-2_70-GHz The i7-5500U here on Intel's spec website: http://ark.intel.com/products/85214/Intel-Core-i7-5500U-Processor-4M-Cache-up-to-3_00-GHz As you can see, both are dual cores with Hyper-Threading. The main differences are the cache (3MB vs 4MB) and clock speeds. 300MHz will help a bit, sure, but that bigger SSD might benefit you more with a more responsive notebook. It'll also have better power savings than an HDD and less heat due to no moving parts.
     Another prime example of misleading models between notebooks and desktops? AMD's R9 290... versus the AMD R9 M290X... which is slower than a desktop's AMD R9 270X. http://hothardware.com/reviews/alienware-17-amds-r9-m290x-goes-mobile?page=6 So yeah, the higher-end Asus notebook isn't a lot more expensive for a bigger SSD and a slightly faster CPU... so if you've got the spare cash? Heck yeah, I say go for it.
  16. Heyyo, NVIDIA Optimus. What it does? It uses the Intel integrated graphics for power savings, and when it detects a program that is set up with an NVIDIA Control Panel profile to use the NVIDIA GPU? It switches to the NVIDIA GPU. Battery savings on the go and gaming grunt when ya need it! http://www.nvidia.ca/object/optimus_technology.html My ol' Alienware M14x used the Intel HD 4000 iGPU when I was browsing the net and stuff, and would use the GTX 650M when gaming. Pretty darn neat. Here's a write-up that shows how to set it up even for programs that don't have an NVIDIA Control Panel profile: http://www.studio1productions.com/Articles/Nvidia.htm
     Before you maybe ask? YES! NVIDIA did in fact have a setup like that on certain motherboards and chipsets long ago for desktop PCs... but apparently there were some issues there that oddly weren't an issue on notebooks, so hardware vendors stopped working together... I forget what it was called on desktop PCs... just tried Googling, and I think it was "Hybrid SLI", with its "HybridPower" switching mode. I think Lucid Virtu MVP does it as well with the Intel iGPU and the dGPU on desktops... but I dunno, I've never tried it... I run an SLI setup, and Lucid Virtu MVP is not compatible with SLI or AMD's Crossfire.
  17. Heyyo, On Newegg I see an Asus notebook with an i5 and a GTX 950M for under $800 USD. http://www.newegg.com/Product/Product.aspx?Item=N82E16834232576
  18. Heyyo, The Q6600 came out in early 2007. It's more than enough CPU for both games and it'll work fine with that 7850. 8GB of RAM is plenty too. NES emulation? You essentially need a PC from the year 1998 to do that... so you'll have absolutely no problem emulating older consoles. N64, Dreamcast and PSX might struggle a little depending on the title.
  19. Heyyo, Eh, I wouldn't use Linux Mint with Cinnamon on that... the Cinnamon desktop environment is pretty heavy. I'd use something based on LXDE or Xfce: Lubuntu or Xubuntu. They barely use any resources and will do everything that @trainergames asked for. http://xubuntu.org/ or http://lubuntu.net/ Ultimately? My favorite version based on Ubuntu is Ubuntu MATE, though it does use more resources than Xfce or LXDE. https://ubuntu-mate.org/ Linux Mint also has a MATE version here: http://www.linuxmint.com/download.php But yeah, I definitely recommend sticking to desktop environments that use as few resources as possible. I wouldn't bother with Windows on your PC unless you have an old copy of XP kicking around... but then again, with no security updates it could be vulnerable to certain exploits... and Windows 7 and up would probably hog so many resources on that older unit that it'd be quite aggravating, with constant delays and slow loading.
  20. Heyyo, Right meow? My favorite ROM is Exodus. It's a tweaked spin-off of CyanogenMod with improvements to the kernel, similar to franco kernel. It has really good battery life and stuff like a SYSTEM-WIDE built-in ad blocker, which is epic since Chrome for Android does not have extensions, plus built-in support for SuperSU. I'd recommend the builds for 5.1.1, since it runs so, SO much better than 5.0.2. Of course, the 5.1.1 builds are still nightlies, since they're based on cm12.1 nightly builds. http://forum.xda-developers.com/lg-g3/orig-development/rom-team-exodus-t3133568 or if you're on Verizon: http://forum.xda-developers.com/verizon-lg-g3/orig-development/rom-team-exodus-t3133089 You could ultimately go for the more stable 5.0.2 builds... but like I said, since 5.0.2 in general has a bunch of memory leaks? You won't get nearly as good battery life.
     Yeah, you can use your stock Android LG UX if you want. Not that I know of, unless it's included in an updated firmware... sorry mang. You can, via your recovery: either adb sideload it, or copy it to an easy-to-find folder like your Downloads (a quick sketch of the sideload route is below). If you switch from one ROM to another? You'll need to do a system wipe; skipping that is called "dirty flashing," which may lead to stability issues (no need for a data wipe though). I recommend using the TWRP (Team Win Recovery Project) custom recovery. https://twrp.me/Devices/
     For other custom ROMs for the LG G3? You should look at the XDA Developers forum section here: http://forum.xda-developers.com/lg-g3/orig-development There's quite a few to check out. I've also heard that BlissPop is great too, but tbh I haven't tried it yet.
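     For illustration, a minimal Python sketch of the sideload route from your PC (assumes adb is installed and on PATH, and the phone is already booted into TWRP with ADB Sideload started; the ROM filename is hypothetical):

     ```python
     # Minimal sketch: push a ROM zip to a recovery via adb sideload.
     # Assumes adb is on PATH and the phone is in recovery sideload mode.
     import subprocess

     ROM_ZIP = "exodus-5.1.1-nightly.zip"  # hypothetical filename

     subprocess.run(["adb", "devices"], check=True)             # confirm it's visible
     subprocess.run(["adb", "sideload", ROM_ZIP], check=True)   # stream the zip
     ```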
  21. Heyyo, Indeed, this. At most? It will ask you to redo your activation, and if it fails? Just do the telephone activation and explain that you changed the motherboard because the old one died. Tell them it's only activated on one PC and they'll set your new PC up with genuine activation, giving you some numbers to type into the activation window.
  22. Heyyo, Eh, I'd take the Sony Xperia Z4 (Z3+ outside of Japan) out of that list and replace it with either the Z3, or wait for the Z5 and see how it turns out... http://www.androidauthority.com/sony-xperia-z3-plus-review-625850/ I'm unsure if AA had a test unit or an actual commercial release unit... but I dunno, it needs software fixes... even if that means thermal throttling like HTC did with their One M9. I think one of the issues with the Z4 overheating is that they clocked their Snapdragon 810 at 2.0GHz, where it has been known to get really hot to the touch, whereas OnePlus clocked theirs in the OnePlus 2 at 1.8GHz, so you still get the good heterogeneous multi-processing battery savings with performance that's still better than a Snapdragon 801. That, and I have no idea what kind of passive cooling the Xperia Z4/Z3+ offers, but it doesn't seem quite up to snuff. Overall? I still think the Sony Xperia Z3 is your best choice. Sure, it's not a 2015 flagship, but it's an extremely solid device. The only true benefit the Z3+/Z4 has over it is battery life.
  23. Heyyo, https://www.saygus.com/v2-2/ Check it out, amigo. It comes close... The headphone port is on the bottom, the battery is 100 mAh smaller than what you wanted, and the screen is only 1080p, but that's still a very high 445 PPI... But, you know that microSD slot? Well... Dual MicroSDXC Slots* (400GB expandable storage using SanDisk's 200GB memory cards). Oh, and if you're cool with ditching Qi charging? The Sony Xperia Z3 would do the rest; its battery is also 100 mAh smaller than what you wanted, and it's an FHD screen. http://www.sonymobile.com/gb/products/phones/xperia-z3/
  24. Heyyo, @Aefiel You can use FurMark as a GPU thermal stress test... but tbh? It's safer to just leave 3DMark on a loop, or Unigine Valley. 3DMark pushes your PC beyond what any game would, so it'll definitely get things hotter than any game could. Just go into the custom tab, check the loop box, set it to run in windowed mode and you're set. http://www.futuremark.com/support/3dmark
  25. Heyyo, It's just thermal throttling. Same goes for anything GPU-related. Linus did a video about thermal throttling on his HTPC before... lemme find it... That's what all GPUs do. They don't wanna melt.