
Locke290
Everything posted by Locke290

  1. No, it isn't. A 970 is even faster than the original Titan.
  2. The 970 is 15% faster than a 780 at stock.
  3. Go research the memory being used on the 390X; it's Hynix, model number H5GC4H24AJR-2TC, rated for 5000 MHz effective — the same rating as the 290X. The 390X has more voltage going to it and that same memory is overclocked to 6000 MHz. The only difference between the 290X and 390X is the firmware, drivers, and BIOS. Stop with the "AMD says". Go read a review by any worthwhile reviewer and they will tell you this is an overclocked 290X with double the video memory. An overclocked 290X with added voltage will perform the same. Over at HardOCP they did a clock-for-clock comparison: the 290X and 390X perform identically. http://m.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9#.VZCjd-3NiBY
  4. You're the only person I have seen saying this. It's impossible; it's a 290X core with more voltage going to it.
  5. Then learn to read the source material you decide to quote. Straight from the Digital Foundry 390X review, it states it's a reference GTX 980 and they are using the MSI 390X. They also show the 390X uses 77 watts more than a 290X, and almost double the power of the 980. Also, on TechPowerUp, the memory they are using is actually specced at 5000 MHz and AMD added voltage to the card to allow for the 6000 MHz. Hence the increased power consumption; this is literally a 290X with more voltage, overclocking, more heat, etc. The case against is equally clear: "there's little overclocking headroom left, and the 390X's power consumption is quite staggering." From Digital Foundry.
  6. Yeah, so go back and read the analysis then. The 390X uses 77 watts more than a 290X and the 980 still outperforms it. They are also using a non-reference 390X vs. reference cards.
  7. Yeah, an overclocked 290X (i.e. the 390X) uses less power than a stock 290X... Makes sense. In any review I have seen it uses on average 30-50 watts more than a 290X. It's using more voltage to allow higher clocks. Yeah, it will beat a GTX 970 at 1440p/4K. At 1080p they are basically even, but the 970 does it at $100 less and half the power usage. I'd wager an overclocked 970 and an overclocked 390X are pretty well even at 1440p and 4K, with the 970 pulling ahead at 1080p. I'd say a 390X would be 5% faster at 1440p+ and a 970 10% faster at 1080p (with overclocks). At stock they are even at 1080p and the 390X pulls ahead 10-15% at 1440p+. Not worth the extra $100 and power consumption.
  8. No it doesn't; it uses more power than the 290X in every review I've seen. Performance is on par with an overclocked 290X as well. The 390X is $100+ more expensive than the 970, too.
  9. It's around 980 performance, what are you smoking? :L Check out any TechPowerUp review of a 970 and you'll see the 970 performing on par with the 780 Ti. Even in the new Fury X review they have them performing on par. At 1080p both the 970 and 780 Ti perform at 85% of the Fury X.
  10. At 1080p the GTX 970 beats the R9 290, 290X and 390. The 390X performs on par at 1080p. The 390X is an overclocked 290X with 8GB of GDDR5. It uses more power and runs hotter than a 290X, and won't overclock well as it is basically already overclocked; I've seen ~5% performance increases from overclocks. With most GTX 970 overclocks you're looking at around 15% more performance. The 390X is also $100+ more expensive. At 1080p I'd recommend the 970. At 1440p or higher AMD usually takes the lead.
  11. GTX 970 and 780 Ti perform basically the same. I'd get the 970 imo.
  12. I've been using 347.88 and have had 0 issues with any of the new releases.
  13. Hey guys, it's been a while since I've overclocked very far. What is everyone using to test stability now? I've been using the FFXIV Heavensward benchmark and Witcher 3. I figured my OC was stable, but when FFXIV Heavensward launched I had my system completely freeze and needed to hard reboot (I played Witcher 3 for 60+ hours with the same overclock, and FFXIV only uses like 60% GPU). Would that be a sign of an unstable overclock or just a bug with the expansion?
  14. The GTX 970 and R9 390X perform similarly at 1080p in the TPU review (the 390X is a massive power hog from what they reviewed, way above the 290X). At higher resolutions the 390X basically matches the GTX 980. It also overclocks like crap; it was running at 83 degrees with the MSI cooler, which is one of the best. It uses double the power of the 970/980 on average. So at 1080p the GTX 970 performs the same and costs less while using half the power (let's be honest, the majority of people buying in the $300-400 price range are gaming at 1080p). Don't see how it is the death of the 970.
  15. I have the 970 Strix. Runs cool and silent. Overclocks to about 1500 MHz on the core and 2 GHz on the memory clock (8 GHz effective).
  16. Wouldn't see why you would be able to get a refund this far in. Seems like you just want the newer graphics card.
  17. I'm at 1490 MHz and 8 GHz on the memory with an Asus Strix. Could possibly go further, but this was literally the first setting I tried and it's been rock solid so far. Might dive deeper and see what else I can get out of it.
  18. Temps won't be a worry with this card on the normal BIOS.
  19. For 1080p, definitely the 970.
  20. I have the Asus Strix GTX 970. I use MSI Afterburner for monitoring and overclocking. I haven't tested for my max overclock, but I have 1490 MHz on the core clock and 8 GHz on the memory (mainly been playing The Witcher 3 with it). The card runs cool, about 67-68 degrees at 100% load with stock fan settings. Very quiet as well, besides the minor coil whine I unfortunately do have (not very audible with the headset off). I'm on my phone, so I'd have to wait until I'm home to actually tell you the +core etc. settings.
  21. You do understand most console games run at sub-720p resolution. With PCs, most people are pushing 2-4x the pixel count at much higher graphical fidelity. You could still play PC games at console resolution and equivalent graphical settings on an 8800GT and most likely still run them better than the consoles. The Xbox 360 launch vs. PCs at its release date was an anomaly; it was very advanced for its time. This gen of consoles, not so much. You're most likely safe with 2GB of VRAM to stay comparable to the consoles for the generation. Battlefield 4's developers said the console graphics for the game are comparable to medium on PC. The recommended specs list a GTX 660; the minimum was an 8800GT, I believe. I'd go as far as saying console optimization is a myth; the difference is in the quality of the port.
  22. They will still be the same video cards. I wasn't referring to future cards, but current cards with that amount of VRAM. If card X with Y VRAM can't fully utilize that VRAM now because it is not powerful enough, that will not change in the future. Hence, what I stated holds true, which is why I stated "single GPU right now". I am pretty certain I didn't try to predict the future. You've just compared a video card to a hard drive, seriously? Also, a spell check wouldn't hurt. I am assuming a lack of reading comprehension.
  23. Too bad almost every review puts the stock-clocked 760 ahead of a 7950 Boost, let alone the non-Boost. TPU puts a 7950 Boost at around 90% of the performance of a stock GTX 760.
  24. No, it won't, and in Canada the crappy-brand 7950s cost the same as the top-brand GTX 760s. GTX 760 = 7950; they basically perform on par.
  25. Look below the card in the chart; there is a stock 760. The chart I linked is all stock clocks. If you look up further you'll see the 760 OC vs. 7950 OC, by HardOCP. Obviously some cards will OC better than others, so results vary. Personally I run 1280/7400. I can run my memory clock close to 7800 but don't, as there's no need. I hit about 47 FPS on the Unigine Valley Extreme HD preset, and I've seen some GTX 760s go past 50 FPS overclocked. All in all, you see the same performance and will get no real in-game difference between the cards.
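
A few posts above quote memory clocks both ways (2 GHz on the overclock slider vs. "8 GHz", and the 390X's 5000 MHz rated vs. 6000 MHz shipped). The reason both numbers are right is that GDDR5 transfers data four times per command clock, so the marketed "effective" rate is 4x the clock you actually set. A minimal sketch of that arithmetic (the clock figures are the ones quoted in the posts, not independent measurements):

```python
# GDDR5 is quad-pumped: four data transfers per command clock,
# so the advertised "effective" rate is 4x the real clock.
def gddr5_effective_mhz(command_clock_mhz):
    return command_clock_mhz * 4

# 2000 MHz (2 GHz) on the slider -> "8 GHz" effective
print(gddr5_effective_mhz(2000))  # → 8000

# 390X memory as quoted above: 1500 MHz -> 6000 MHz effective,
# vs. the chips' 1250 MHz -> 5000 MHz rated effective speed.
print(gddr5_effective_mhz(1500))  # → 6000
print(gddr5_effective_mhz(1250))  # → 5000
```

So "1490 core / 8 GHz memory" and "1490 core / 2000 memory" describe the same overclock, just in different units.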