
AMD R9 390X Coming With Cooler Master Liquid Cooler + Estimated Performance

GPUXPert

Calm down guys ...

 

No need to get the thread locked.

 

Right, because that's the offensive part of this thread...


Oh lord, we still have clowns arguing over wattage when most of us on this forum overclock our CPUs anyway. My 4770K uses like 75-80 more watts overclocked than stock, and I am not even pushing high voltage. Guess what? I don't care. My R9 290 is also overclocked past 290X speeds. They both run on a 600-watt Corsair budget-line PSU I picked up for 50 bucks... and people want to act as if I need Nikola Tesla's Niagara Falls power plant to run my PC.

 

BTW Faa, what the hell happened in Evolve? Oh yeah, the 290X beat the 980 at 1080p/1440p and demolished it at 4K. The GTX 970? A piece of junk in that game (see the SweClockers link below). Guess that is what happens when you can't hide behind a closed game library of poorly optimized games that run like garbage on everyone's GPU (including Nvidia's). Geez, I wonder why we don't see benchmarks of this game from the big media outlets like Tom's Hardware (owned by a media group, and which did not run a story on the GTX 970 memory issue for a LONG time). Funny thing? It is an Nvidia title, like Shadow of Mordor was. Guess the devs didn't want the game to run like crap for everyone. Bummer, Nvidia!

 

http://www.sweclockers.com/artikel/20031-snabbtest-grafikprestanda-i-evolve/3

 

So you can keep that 3.5-gig card that saves you a quarter a month on power, and you can keep that 980 that was a few hundred more than my Tri-X. GameWorks games? No problem. I just turn tessellation down, crank up the downsampling, get a better picture, and bypass the GameWorks BS.

 

If you want to argue Nvidia is a better choice for Mini-ITX builds? Go right ahead. HDMI cables exist, though, and so do wireless Xbox controller dongles.

 

This new AMD card is about a third faster in TFLOPS than my R9 290, with bandwidth that annihilates my R9 290, which beats a GTX 980 outside GameWorks games at 1440p/4K. Of course the thing is going to use a good amount of wattage.
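If you want to sanity-check that kind of claim, the usual peak-FP32 arithmetic is just shaders x 2 FLOPs x clock. A minimal sketch (the R9 290 numbers are its published stock specs; any 390X figure you plug in is rumor at this point):

def fp32_tflops(shaders, clock_mhz):
    # peak FP32: each shader retires one FMA (2 FLOPs) per clock
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(fp32_tflops(2560, 947))  # R9 290 stock: ~4.85 TFLOPS
# Plug in whichever rumored 390X shader count/clock you believe and compare.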

 

What do you expect? Magic? You should all stop running SLI setups and OC'ing CPUs. More "efficient" that way. Should also lower the bandwidth on the GPUs and not let all the RAM be used well for games. Efficiency is all that matters for PC gaming enthusiasts, right?

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


This line of cards could put AMD in the lead, I think.

Sally: l Intel i5 4690k | | Corsair H55 | | Zalman z12 plus | | Hitachi 2TB HDD | | Samsung 840 Pro 128 GB | | ASRock Fatal1ty z97 Killer | | Corsair Vengeance 16 GB | | EVGA SuperNova 750 watt | | MSI 390 | | Windows 10| 


I rate this thread at 3 cuils. 

The Internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had.


So, since the forum doesn't have a search-posts-by-thread feature, I gotta ask (sorry if this has been asked before):

 

Should I wait for AMD's "miracle" (or re-rebranded?) 380 series, or just go with a GTX 970? I won't be playing at 1440p or 4K anytime soon, but I'd rather future-proof my build now than deal with selling and buying components again because I was too rushed before.

 

Also, I've been googling around, and rumors of a March/early April release show up a lot; that's why I'm asking whether it's better to wait or not.


This line of cards could put AMD in the lead, I think.

 

It certainly should, but the real question is what happens when The Empire (Nvidia) strikes back with GM200. Competition!


Oh lord, we still have clowns arguing over wattage when most of us on this forum overclock our CPUs anyway. [snip] Efficiency is all that matters for PC gaming enthusiasts, right?

I'm not going to argue about which cards are faster; only 12-year-old kids would argue about cards that perform 5-10% apart. Feel free to be part of such a group. OK, I'll make you happy: the 290X is A LOT faster than the 980, and the money you spend on the extra power is $5 every 30 years. There you go. Any other pep talk you would like? And you're still spreading the rumor that GameWorks cripples AMD's performance when you have a pile of evidence right in front of you showing the same degradation in AMD's own Mantle titles. Spreading rumors with no sensible arguments is pretty much trying to be a genius in a mental hospital. Now admit it has nothing to do with GameWorks, to prove you're not astroturfing for AMD.

If you're coming up with benchmarks that show a 280X doing better than a 780 just to support your argument that a 970 is a shit card, then you're silly.

[Image: Evolve benchmark chart, evolve-bench-1.jpg]

Before you start trolling about the 970's 3.5GB VRAM, well, here you go:

The GPU was at its full potential, as GPU usage was pretty much at 99% the whole time, indicating that the bandwidth of the last 0.5GB wasn't limiting anything. I needed 2x or 4x MSAA or something to get to 4GB. Anything else you want to add other than hearsay?

For some people, power consumption means more than just the bills; it's an indicator of many other factors, such as cooling, noise, heating your room up, the PSU fan kicking in if your PSU has a noisy fan, whether SLI/CF would be acceptable, etc. Have fun going with 4x 290Xs for a 4K setup (you kind of really need that) when you can do it on air at half the power draw with a cheap 650W PSU for a little less performance.

Let's have a look at how quiet the quietest 290X is, aka the Sapphire 290X Vapor-X: http://nl.hardware.info/productinfo/224307/sapphire-radeon-r9-290x-vapor-x-8gb#tab:testresultaten

57 dB(A) from 4" away, measured with a meter that costs 6,000 EUR. Here's a 970 Strix sitting at 38 dB(A): http://nl.hardware.info/productinfo/253092/asus-geforce-gtx-970-strix-4gb#tab:testresultaten

A 20 dB(A) difference, meaning the Sapphire is 4 times as loud. There you have my reasoning for why I don't buy AMD cards anymore, and why I bought AMD back in the days when Nvidia cards were consuming a shitload more.


Let's have a look at how quiet the quietest 290X is, aka the Sapphire 290X Vapor-X: http://nl.hardware.info/productinfo/224307/sapphire-radeon-r9-290x-vapor-x-8gb#tab:testresultaten

57 dB(A) from 4" away, measured with a meter that costs 6,000 EUR. Here's a 970 Strix sitting at 38 dB(A): http://nl.hardware.info/productinfo/253092/asus-geforce-gtx-970-strix-4gb#tab:testresultaten

A 20 dB(A) difference, meaning the Sapphire is 4 times as loud. There you have my reasoning for why I don't buy AMD cards anymore, and why I bought AMD back in the days when Nvidia cards were consuming a shitload more.

 

Shitty review site getting their test wrong. Here's a proper review, where the Sapphire R9 290X Vapor-X is much quieter despite being tested from a shorter distance. And it beats the GTX 970.

 

Also, a 20 dB(A) difference is 100 times as loud, not 4 times as loud. It's a base-10 logarithm, not a base-2 logarithm: 20 dB(A) = 2 B(A), and 10^2 = 100.
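A quick sketch of that arithmetic, for anyone who wants to check it (this is the ratio in sound intensity; sound pressure, being an amplitude, uses 20*log10 instead):

delta_db = 20.0
intensity_ratio = 10 ** (delta_db / 10)  # dB = 10*log10(intensity ratio) -> 100x
pressure_ratio = 10 ** (delta_db / 20)   # SPL uses 20*log10(pressure ratio) -> 10x
print(intensity_ratio, pressure_ratio)   # 100.0 10.0

Whether either ratio matches "how loud it sounds" is a separate psychoacoustic question, which comes up further down the thread.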


Shitty review site getting their test wrong. Here's a proper review, where the Sapphire R9 290X Vapor-X is much quieter despite being tested from a shorter distance. And it beats the GTX 970.

 

Also, a 20 dB(A) difference is 100 times as loud, not 4 times as loud. It's a base-10 logarithm, not a base-2 logarithm: 20 dB(A) = 2 B(A), and 10^2 = 100.

The EVGA 970 isn't a quiet card, though... why didn't they use a quiet 970 for the review?

You can't be serious.  Hyperthreading is a market joke?

 

 


Shitty review site getting their test wrong. Here's a proper review, where the Sapphire R9 290X Vapor-X is much quieter despite being tested from a shorter distance. And it beats the GTX 970.

Haha, a 290X only consuming 30W more than a 970? My 970 only consumes like 120W:

Pretty sure my 780 was consuming nearly twice as much, and a 290X is definitely going to consume more than a 780. Hardware.info uses a meter worth 6K EUR; not a lot of people take Tom's Hardware seriously these days.

[Image: ASUS GeForce GTX 970 4GB Strix noise chart]

53 dB(A), and the 290X Vapor-X:

[Image: Sapphire Radeon R9 290X 4GB Vapor-X noise chart]

A 20 dB(A) difference as well, so Hardware.info was right.

 

Also, a 20 dB(A) difference is 100 times as loud, not 4 times as loud. It's a base-10 logarithm, not a base-2 logarithm: 20 dB(A) = 2 B(A), and 10^2 = 100.

 

It's 4 times: http://martinsliquidlab.org/2013/03/12/swiftech-h220-vs-corsair-h100i-noise-testing/


Oh lord, we still have clowns arguing over wattage when most of us on this forum overclock our CPUs anyway. [snip] Geez, I wonder why we don't see benchmarks of this game from the big media outlets like Tom's Hardware (owned by a media group, and which did not run a story on the GTX 970 memory issue for a LONG time). [snip] Efficiency is all that matters for PC gaming enthusiasts, right?

I did not know that background on Tom's Hardware.  Is that true? 

 

 

Throughout this thread, I have noticed not a single reference to drivers. I do not know if the things people say on numerous sites are biased or not, but there is a fair amount of mud thrown at AMD and their third-party manufacturers about the drivers for those cards. I would hate for that to be true. Perhaps someone here can say with certainty that, by this point in time, AMD software has no problems (anymore)?


I did not know that background on Tom's Hardware. Is that true?

Throughout this thread, I have noticed not a single reference to drivers. I do not know if the things people say on numerous sites are biased or not, but there is a fair amount of mud thrown at AMD and their third-party manufacturers about the drivers for those cards. I would hate for that to be true. Perhaps someone here can say with certainty that, by this point in time, AMD software has no problems (anymore)?

No software is perfect; both AMD and Nvidia drivers have their problems.

However, from my experience, AMD drivers are nowhere near as bad as people make them out to be. I used Nvidia cards from 2007-2012, and in the past 3 years I haven't noticed any more problems than when I was with Nvidia.

Apart from that one driver way back in the day that made me think my 9800 GTX+ was dead (I think it was the same driver that killed some Nvidia cards), I've only had a few small issues from both vendors, all with easy workarounds.

However, my sample size of AMD cards is smaller than my sample of Nvidia cards, so I can't just outright say "AMD drivers are better".

I think the drivers are close enough that they shouldn't be a factor in deciding which GPU manufacturer you choose.

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


That photo is not real. I can see marks of Photoshop. It's a 295X2.

That's real. I watched a video on it from either JayzTwoCents or Tek Syndicate.


 

Haha, a 290X only consuming 30W more than a 970? My 970 only consumes like 120W:

 

120W? Get real.


120W? Get real.

 

My whole system draws about 500W while benchmarking. That's two GPUs and an overclocked Sandy Bridge-E CPU, so that number per GPU wouldn't surprise me, tbh.


Reviews show the GTX 970 drawing around 170-175W when gaming. Here's one example. That's only the graphics card; total system power consumption is going to be higher, in the 300W range in this review. Note how the GTX 970 draws significantly more power than the GTX 670 and 760, even though Nvidia claims a significantly lower TDP for the GTX 970. Along with the "specifications controversy", it seems like Nvidia decided to get creative with their marketing of the 900 series. The efficiency gap is real, but not as big as Nvidia would have you believe, and the new GCN version should reduce it.
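As a rough sanity check on how card-only numbers relate to at-the-wall numbers, a minimal sketch; the platform figure and PSU efficiency here are illustrative assumptions, not measurements from that review:

gpu_w = 170       # card-only draw reported for the GTX 970 under gaming load
platform_w = 110  # assumed CPU + motherboard + drives + fans at the same load
psu_eff = 0.90    # assumed PSU efficiency at this load

wall_w = (gpu_w + platform_w) / psu_eff
print(round(wall_w))  # ~311 W at the wall, consistent with a ~300 W system figure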


Shitty review site getting their test wrong. Here's a proper review, where the Sapphire R9 290X Vapor-X is much quieter despite being tested from a shorter distance. And it beats the GTX 970.

 

Also, a 20 dB(A) difference is 100 times as loud, not 4 times as loud. It's a base-10 logarithm, not a base-2 logarithm: 20 dB(A) = 2 B(A), and 10^2 = 100.

That's not how sound measurement works.

 

Every 10 dB results in a perceived doubling of loudness; ergo 10 dB = 2x as loud. Then another 10 dB = 2x again, so 20 dB is a doubling and then another doubling, or 4x as loud. Not 100x.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


That's not how sound measurement works.

 

Every 10 dB results in a perceived doubling of loudness; ergo 10 dB = 2x as loud. Then another 10 dB = 2x again, so 20 dB is a doubling and then another doubling, or 4x as loud. Not 100x.

 

That's not really true, because there is no objective measure of loudness.


That's not really true, because there is no objective measure of loudness.

That is 100% true and measured. Look up SPL and safe listening levels for further explanation.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


That is 100% true and measured. Look up SPL and safe listening levels for further explanation.

 

SPL is what dB is used to measure, but that's where 10 dB means 10 times as loud. You want to convert that to subjective loudness, and there is no objective way to do that. Phons and sones are units attempting to quantify perceived loudness, but they are not objective measures.
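For what it's worth, the usual rule-of-thumb mapping is Stevens' sone scale, where 40 phons = 1 sone and every +10 phons doubles perceived loudness; that is where the "10 dB = twice as loud" figure comes from. A minimal sketch (treating a dB(A) reading as a rough stand-in for phons, which is itself an approximation):

def sones(phons):
    # Stevens' loudness scale: 40 phons = 1 sone, +10 phons doubles loudness
    return 2 ** ((phons - 40) / 10)

print(sones(58) / sones(38))  # 4.0 -> a 20 dB(A) gap is ~4x as loud by this rule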


I don't like AIOs, and I can't be bothered to do a custom loop, so this is disappointing. I hope this doesn't become a regular thing for high-end GPUs :/

