
R9 390X Coming in 2015. Featuring 20nm, Liquid Cooling and High Bandwidth Memory "HBM"

Whenever Nvidia's efficiency claims are debated (and rightly so), suddenly everyone becomes an expert in electrical engineering and thermodynamics.

When Nvidia says "power" for the 980 is 165 W and that Maxwell has "2x perf/watt vs. Kepler", and then countless reviews refute it, nobody really cares.

Suddenly that 165 W of "power" is "TDP", then suddenly "TDP" isn't power but heat, and then suddenly it's 2x perf/heat, not perf/watt. Just hopelessly pathetic.

 

Regardless of exactly how the marketing stacks up, the fact remains that the thermal headroom on the 900 series seems to be excellent, and as long as Nvidia allows it, that is directly related to potential overclocks, especially if you're not looking to get more exotic with your cooling.

 

That being said, if AMD wants to go exotic, I would like to see a single-slot card with a thin radiator. Think of the ITX CrossFire builds it could enable... I would love a little box of CrossFire-enabled gaming goodness.

System CPU : Ryzen 9 5950 doing whatever PBO lets it. Motherboard : Asus B550 Wifi II RAM 80GB 3600 CL 18 2x 32GB 2x 8GB GPUs Vega 56 & Tesla M40 Corsair 4000D Storage: many and varied small (512GB-1TB) SSD + 5TB WD Green PSU 1000W EVGA GOLD

 

You can trust me, I'm from the Internet.

 


 

Were you the guy who won Logan's car?

[Image: GTX-980-123-49.jpg]

Let me write the numbers down here in case you're too high: 680 pulling 338 W, 980 pulling 327 W. That's about a 3% difference, which is within the margin of error, so let's just say they both consume the same amount of power.

[Image: GTX-980-123-36.jpg]

980: 19.08 / 330 = 0.0578

680: 9.32 / 330 = 0.0282

(0.0578 - 0.0282) / 0.0282 * 100 ≈ 105% difference in performance per watt. Did Nvidia lie? No. Did they guarantee in any way that you're always getting 2x performance per watt? No.

[Image: perfwatt_3840.gif]

Before you go out and euheuheuheuh "it's only 44% more": no. The 980 in this chart is the 100% baseline, so 1.00; a 680 at 56% comes down to 0.56. Do the math: 0.56 × 2 = 1.12, so the 980 falls only 12% short of their claim.
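The arithmetic above can be checked in a few lines. Note the wattages (338 W / 327 W) and the performance scores (19.08 / 9.32) are the post's own chart readings, not independently verified figures:

```python
# Quick sanity check of the perf/watt arithmetic in this post.
# Inputs are taken from the quoted review charts as-is.

def pct_diff(a, b):
    """Percentage difference of a relative to b."""
    return (a - b) / b * 100

# System power draw: GTX 680 vs GTX 980 -- within a few percent.
power_680, power_980 = 338.0, 327.0
print(f"power delta: {pct_diff(power_680, power_980):.1f}%")   # ~3.4%

# Performance per watt (score / 330 W, as computed in the post).
pw_980 = 19.08 / 330   # ~0.0578
pw_680 = 9.32 / 330    # ~0.0282
print(f"perf/watt gain: {pct_diff(pw_980, pw_680):.0f}%")      # ~105%

# TechPowerUp-style relative efficiency: 980 = 100%, 680 = 56%.
# Doubling the 680's figure lands at 1.12, i.e. 12% past the 980's 1.00,
# which is how the "12% off their claim" number falls out.
print(f"2x the 680's efficiency: {0.56 * 2:.2f}")
```

Dividing both scores by the same 330 W cancels out in the percentage difference, so the 104-105% figure holds regardless of which common wattage is used.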

 

 

Oh, speaking of lies:

[Image: Ya4xBgE.png]

 

You, GPUlolpert, bookerthing, emmawhatever, milkysyndicate, katnesswhocan'taim, etc. are all grouping up spreading bullshit: working at AMD's offices as mistresses getting paid in 8320s, carpooling together in a gangsta-ride Fiat Punto, praying while watching Logan's fabricated videos, accusing each other of fabricating the IQ results because you all scored the same, claiming the fifth Teletubby is called AMD, believing in conspiracy theories that maybe five people actually buy into, etc.

Seriously.. 

 

Alright, seriously, let's cut out the personal attacks. And that's not just to you, but to everyone. I'm sick and tired of this "attack people who have opposite opinions" bullshit. Stick to the facts, and your argument makes itself. Start calling them "AMD mistresses" and you've already lost the argument.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


What about the 7990? From what I recall the 690 doesn't come close to it.

 

Are you nuts?! Without a liquid cooler, no flipping way.

7990 and 690 are close to each other in terms of performance with the winner slightly skewing more towards the 7990. This is with both of them running as is. No water cooling on either of them, no excuses. 

 

Sources: 

http://www.techoftomorrow.com/2013/pc/shootout-amd-hd-7990-vs-nvidia-gtx-690-performance-results/

http://www.anandtech.com/bench/product/1184?vs=1074

THE BEAST Motherboard: MSI B350 Tomahawk   CPU: AMD Ryzen 7 1700   GPU: Sapphire R9 290 Tri-X OC  RAM: 16GB G.Skill FlareX DDR4   

 

PSU: Corsair CX650M     Case: Corsair 200R    SSD: Kingston 240GB SSD Plus   HDD: 1TB WD Green Drive and Seagate Barracuda 2TB Media Drive

 

 

 

 


7990 and 690 are close to each other in terms of performance with the winner slightly skewing more towards the 7990. This is with both of them running as is. No water cooling on either of them, no excuses. 

 

Sources: 

http://www.techoftomorrow.com/2013/pc/shootout-amd-hd-7990-vs-nvidia-gtx-690-performance-results/

http://www.anandtech.com/bench/product/1184?vs=1074

Yeah, but when that card was new, AMD was in the middle of its frame-pacing denial. So yes, you got XXX FPS, but two of them displayed at the same time :/


 


Yeah, but when that card was new, AMD was in the middle of its frame-pacing denial. So yes, you got XXX FPS, but two of them displayed at the same time :/

That's a very good point. Haha, I was really just trying to head off the eventual argument between the other two: put out some good ol' benchmarks and a source, hope they accept it, and leave it at that.


 

 

 

 


I have little faith in a company that needs a water cooler to do the job that the other side can do with air coolers. 

 

How is that acceptable in any way shape or form? 

1.) It is only a rumour.

2.) Even if it is true, I bet that you can get a GPU from ASUS, MSI, Sapphire, Gigabyte, ... that won't have a water cooler.


1.) It is only a rumour.

2.) Even if it is true, I bet that you can get a GPU from ASUS, MSI, Sapphire, Gigabyte, ... that won't have a water cooler.

Gigabyte WF 3 cooler on basically any GPU... sooo good. Love that cooler.


 


The naming still doesn't make sense.

How can the 285 be slower than a 280X, and how can the 295X2 be a dual 290X card?

It should be 280 < 280X < 285, and it should be 290X2, not 295X2.

AMD really messed up the naming scheme; maybe they can do better with the 300 series.

 

 

Gotcha. There is a difference between having a fundamental flaw in the naming system and not executing the naming scheme in a logical way. The 290X2 would definitely have been a better name. The 285 is trickier: 281? 280 2? The new 280? A letter to designate a sequel, like 280S? I get the point people make, but I don't think anything is wrong with the STRUCTURE itself, just the execution.

 CPU:  Intel i7-4790K      Cooler:  Noctua NH-D14     GPU: ZOTAC GTX 1070 TI MINI     Motherboard:  ASUS Z97 Gryphon     RAM:  32GB G Skill Trident X     

Storage: 2x 512GB Samsung 850 EVO (RAID 0) / 2TB Seagate Barracuda     PSU: 850W EVGA SuperNova G2     Case: Fractal Design Node 804


And @Mister Snow

 

I think you'll find what Victorious means is: if the product has to go to the next level of cooling in order to maintain the same level of performance (note he said nothing about cost), then there is something inherently inferior about the design. Like it or not, there are very few other conclusions to come to.

Well, yeah, I agree that it was a messy solution and I personally do not like it, but there was an aftermarket air cooler that did the same job, if I'm not mistaken. There is still the problem of power consumption, and in that category they really are inferior to Nvidia at the moment. But messy as it is, it still did a better job at gaming than the Titan Z, so in the end you sacrifice something to gain something else. It's just a question of what you want more.


I hope AMD comes in with a new card and stomps the forums again; they are definitely keeping Nvidia on its toes. If only they did that to Intel as well.


I have little faith in a company that needs a water cooler to do the job that the other side can do with air coolers.

How is that acceptable in any way shape or form?

Well, in all honesty, with the way technology moves you only have two ways of keeping it cool: either trim the chip so it takes less power while staying just as powerful, or look into the cooling and see if that can be improved.

I'm not defending or bashing Nvidia or AMD; I'm just speaking in general.


When he said "acceptable" I assumed he meant from a consumer standpoint, and a consumer will be looking at cost. Also, until we actually see the new cards, it is a fool's errand to attempt to draw conclusions at all.

 

I will wait for him to clarify his position, but it sounded like he was talking about the trend of AMD's cards and the performance of their design.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I will wait for him to clarify his position, but it sounded like he was talking about the trend of AMD's cards and the performance of their design.

I meant exactly what you had stated earlier.

If it takes a water cooler from AMD to be able to provide sufficient cooling for their chip yet it takes only an air cooler for Nvidia to cool theirs, AMD is lacking on an efficient architecture.

And you can't argue that away no matter how hard you try. As of today, AMD doesn't know what efficiency is.


I'll bet the power requirement will be at least a 750 W PSU.

Add the exaggeration that card manufacturers build in so they don't get sued... and you get 1200 W.

LTT's unofficial Windows activation expert.
 


Why the hell does this have a 980 Ti tag on it?

Someone told Luke and Linus at CES 2017 to "Unban the legend known as Jerakl" and that's about all I've got going for me. (It didn't work)

 


When he said "acceptable" I assumed he meant from a consumer standpoint, and a consumer will be looking at cost. Also, until we actually see the new cards, it is a fool's errand to attempt to draw conclusions at all.

Well, that's not entirely true, because if it were based entirely on price, how is Nvidia still around? Personally I would pay extra (up to a point) for the more efficient and more elegant solution: more efficient meaning it doesn't chuck out so much heat that you're forced to throw a waterblock on it, and more elegant meaning air.

 

 

And surely something isn't adding up if they're going to 20 nm and still need water? WTF are they doing with that GPU if it's still throwing out that much heat?! I find it more believable that it's 28 nm, given we are fairly sure about the water-cooling bit; either that, or the energy-saving element of the shrink is smaller than I think.


Alright seriously lets cut the personal attacks out. And that's not just to you, but to everyone. Sick and tired of this "attack people who have opposite opinions" bullshit. Stick to the facts, and your argument has made itself. Start calling them "AMD mistresses" and you've already lost your argument.

Just ignore him; it's just @Faa. From my experience with him, personal attacks and unwavering convictions to the point of bigotry are sort of his thing. I don't like him, but if someone needs help with an Intel or Nvidia product he can be constructive. In AMD discussions, though, he's a volatile and degenerative presence who cannot be compromised with.

Why do you always die right after I fix you?

 


The R9 390X (Fiji) will be the first graphics card in the world to feature TSMC's next generation 20nm manufacturing process.

 

It's very hard not to be very sceptical about this. The same rumour-mongers claimed that Nvidia's Maxwell was supposed to be 20 nm; that should probably be enough for most people to discount this. Besides, making big, powerful GPUs on TSMC 20 nm planar is very close to breaking the laws of physics. It would be amazing if it were true.

 

I recommend the easy read by Josh Walrath, written last year:

http://www.pcper.com/reviews/Editorial/Next-Gen-Graphics-and-Process-Migration-20-nm-and-Beyond

 

Relevant quote:

 

22/20 nm processes can pack the transistors in.  Such a process utilizing planar transistors will have some issues right off the bat.  This is very general, but essentially the power curve increases very dramatically with clockspeed.  For example, if we were to compare transistor performance from 28 nm HKMG to a 20 nm HKMG product, the 20 nm might in fact be less power efficient per clock per transistor.  So while the designer can certainly pack more transistors into the same area, there could be some very negative effects from implementing that into a design.  For example, if a designer wants to create a chip with the same functionality as the old, but increase the number of die per wafer, then they can do that with the smaller process.  This may not be performance optimized though.  If the designer then specifies that the chips have to run as fast as the older, larger versions, then they run a pretty hefty risk of the chip pulling just as much power (if not more) and producing more heat per mm squared than the previous model.

 

Intel got around this particular issue by utilizing Tri-Gates.  This technology allowed the scaling of performance and power that we are accustomed to with process shrinks.  This technology has worked out very well for Intel, but it is not perfect.  As we have seen with Ivy Bridge and Haswell, these products do not scale in speed as well as the older, larger 32 nm Sandy Bridge processors.  Both of the 22 nm architectures start pulling in more power than the previous generation when clockspeeds go past 4.0 GHz.  Having said that, the Intel 22 nm Tri-Gate process is exceptionally power efficient at lower clockspeeds.  The slower the transistors switch, the more efficient they are.  These characteristics are very favorable to Intel when approaching the mobile sector.  This is certainly an area that Intel hopes to clean up in.  This is the area that is finally scaring all the other 3rd party SOC designers (Qualcomm, Samsung, NVIDIA, etc.) and potentially putting more pressure on the pure-play foundries to get it together.

 

There has been nothing that even hints that the problems with high-performance parts on TSMC 20 nm planar are close to being solved. AMD was pretty clear last year when they released Hawaii on 28 nm: 20 nm planar does not work for high performance. If anyone has any solid info on this, I'm all ears. Otherwise it seems pretty clear that both AMD and Nvidia are looking at the ~16 nm FinFET processes that TSMC, Samsung et al. are working on.

Neither well, nor normal.


What I find most interesting about the rumor is AMD using High Bandwidth Memory. If AMD actually does release it in early 2015 with HBM, that would be a huge advantage over Nvidia. To my knowledge, Nvidia isn't going to be using an HBM-like interface until Pascal, and that won't be out until most likely 2016. I'm hoping AMD does release this card with everything that is speculated, because that will force Nvidia to respond (maybe moving up the timeline of Pascal). No matter if you're an AMD or Nvidia fanboy, everybody wins with competition.


And this definitively proves that you talk out of your ass instead of actually reading reviews.

I have read the reviews. Without a liquid cooler, no way.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


What about the 7990? From what I recall the 690 doesn't come close to it.

Looking at a few reviews, the 7990 is only situationally better - usually at higher resolutions (1440p+), and only by a small margin (results are closer for 1080p). Nvidia gets better minimum framerates, but lower averages. Nvidia also seems to have better frame time variance (frame times are more consistent, which means frame rates are also consistent). 

 

And that's only considering performance, but I like to consider power usage, noise, and temperature as well, and Nvidia is the clear winner in all three.

Interested in Linux, SteamOS and Open-source applications? Go here

Gaming Rig - CPU: i5 3570k @ Stock | GPU: EVGA Geforce 560Ti 448 Core Classified Ultra | RAM: Mushkin Enhanced Blackline 8GB DDR3 1600 | SSD: Crucial M4 128GB | HDD: 3TB Seagate Barracuda, 1TB WD Caviar Black, 1TB Seagate Barracuda | Case: Antec Lanboy Air | KB: Corsair Vengeance K70 Cherry MX Blue | Mouse: Corsair Vengeance M95 | Headset: Steelseries Siberia V2

 

 


I have read the reviews. Without a liquid cooler, no way.

It's not as far ahead as claimed; they're pretty much equal. The way you said it made it sound as if the 690 destroys it.


Just ignore him, it's just @Faa, from my experience with him personal attacks and unwavering convictions to the point of bigotry are sort of his thing. I don't like him but If someone needs help with an intel or Nvidia products he can be constructive but he's a volatile and degenerative part of AMD discussions and cannot be compromised with.

Stop lying about me. You have no idea how much bad advice they have been giving people, many of whom came back and asked for an upgrade to Intel. Recommending 8320s/8350s over the 4670K for single-threaded games, giving you a crap ton of futureproofing nonsense: that's what they do. It never bothered them what performs better; any benchmark except from Teksyndicate is bullshit; all they're interested in is supporting AMD. It didn't matter a damn what people were buying, as long as AMD could sell a CPU. Lying to people who seek advice, and being in denial, is the worst thing you can do. When I joined this forum, in every 8350 vs. i5 thread literally everyone recommended the 8350 over the i5 when they both cost exactly the same; now I can't even remember an OP going with their advice anymore. And after you show them how terrible the FX CPUs can perform, they quote you with childish nonsense insulting your intelligence, and you expect me to respect them? In their world AMD is always the best; it can't be any different.

This isn't the first time we've heard them moaning about Nvidia's 2x perf/watt and TDP claims:

1st one: http://linustechtips.com/main/topic/217732-nvidia-lying-about-maxwell-gtx-980-specs-to-press/?view=findpost&p=2985898

2nd one: http://linustechtips.com/main/topic/218016-nvidias-attempt-to-manipulate-the-gtx-980-figures-to-deceive-the-press-worked/?view=findpost&p=2990029

After their first thread got locked because of all the BS, the second thread was the same exact thing, and despite being proven wrong in both threads he starts talking about it again in this thread, which had nothing to do with Nvidia at all. Give him a week or two and he'll post it again. Also, I'm actually never active in AMD vs. Nvidia threads; what am I supposed to do there when there's hardly a performance difference between them?

