
Regarding the RX 480 AOTS benchmark at Computex

Fulgrim
8 minutes ago, GlassBomb said:

Part of me is asking why they didn't try to use a 1070 and tell everyone "our $400 solution is better than their $400 solution".

 

But I guess it sounds better when you compare it to the 1080.

 

Good question, but do two 4GB RX 480s beat a 1070? Otherwise AMD's $460 solution beats Nvidia's $400 solution.

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


3 minutes ago, GlassBomb said:

Part of me is asking why they didn't try to use a 1070 and tell everyone "our $400 solution is better than their $400 solution".

 

But I guess it sounds better when you compare it to the 1080.

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

I am conducting some polls regarding your opinion of large technology companies. I would appreciate your response. 

Microsoft Apple Valve Google Facebook Oculus HTC AMD Intel Nvidia

I'm using this data to judge this site's biases so people can post in a more objective way.


3 minutes ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

 

Do we know if that's the 8GB version of the RX 480?  It pretty much has to be, right?

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


1 minute ago, TidaLWaveZ said:

 

Do we know if that's the 8GB version of the RX 480?

No, but from my personal experience in AotS, it's relatively low on VRAM usage, so it shouldn't matter anyway.

I am conducting some polls regarding your opinion of large technology companies. I would appreciate your response. 

Microsoft Apple Valve Google Facebook Oculus HTC AMD Intel Nvidia

I'm using this data to judge this site's biases so people can post in a more objective way.


4 minutes ago, TidaLWaveZ said:

Good question, but do two 4GB RX 480s beat a 1070? Otherwise AMD's $460 solution beats a $400 solution.

Only time will tell. If you're looking at the minimum specs required for the Vive, which would be a GTX 970/R9 290, then I can only assume that a single RX 480 would perform at least similarly to that. And they did say they were pushing it as a 'VR-ready' card, if that says anything.

 

But at this point, games that utilize DX12 are still not plentiful. Not to mention that you need Win10 for it, and all of that.


1 minute ago, CommandMan7 said:

No, but from my personal experience in AotS, it's relatively low on VRAM usage, so it shouldn't matter anyway.

That code number, or whatever it is, is tied to the higher Polaris 3DMark result, if that gives us a clue.

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


40 minutes ago, Daiyus said:

OK, so they aren't the EXACT ones from the show, but they're both at Crazy 1440p settings in a standardised benchmark showing the RX 480 CF matching up to the GTX 1080. It's all there in black and white.

 

Now, that's a single game that we know AMD generally performs well in compared to Nvidia, but the fact is these RX 480s are no slouch at their price point. I'm waiting for more benchmarks in different games, but even so I doubt I'll be disappointed. These cards were never meant to take on the GTX 1080, they're entry-level cards for crying out loud, but at least the scalability is there to seriously push some pixels if you want it after you've put together your budget build. What I find amazing is that for ~£200 I can pick up a card that will destroy any game at 1080p Ultra and run VR. That'll suit me until I can afford to go 4K with a flagship card and a new screen.

It takes two 480s to get somewhat playable framerates, so I wouldn't call that 'destroying'.


27 minutes ago, zMeul said:

watch your language buddy

Oh, you special little snowflake. ( ͡° ͜ʖ ͡°)

Shot through the heart and you're to blame, 30fps and i'll pirate your game - Bon Jovi

Take me down to the console city where the games are blurry and the frames are thirty - Guns N' Roses

Arguing with religious people is like explaining to your mother that online games can't be paused...


2 hours ago, zMeul said:

the compression applies to the video - the left side is compressed the same amount as the right side

blaming the compression for the differences in the same video is utter nonsense 

Well, given the way compression works, it's actually legitimate to say that the compression could affect it.

Given the differences in the rendering on the two cards, the 1080 stream ended up with more high frequency (high detail) components... depending upon the encoder, it is possible that more bits in the bitrate "budget" were used up in this general area of the frame, because of the extra high frequency components, therefore leaving the AMD side comparatively starved of bits to encode its own region of the frame... Better encoders do do this (admittedly not usually real-time encoders) thereby leaving the AMD part looking slightly worse.

This is all ignoring the fact that the two gameplay footages were captured and therefore encoded separately to begin with.

Not saying that's what happened or that it'd always make a noticeable impact... just saying it's not necessarily nonsense. :)
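
If you want to see the idea for yourself, here's a rough sketch (plain NumPy, nothing to do with whatever encoder the capture actually used, and the frame below is just a random stand-in) of measuring how much high-frequency energy each half of a frame carries; a rate-controlled encoder tends to spend more of its shared bit budget on the busier half:

import numpy as np

# Toy illustration of the shared bit-budget argument, NOT a real encoder.
# The "frame" below is random data standing in for a captured frame.
def high_freq_energy_fraction(region):
    # Fraction of spectral energy in the outer (high-frequency) band.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(region))) ** 2
    h, w = spectrum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)       # distance from the DC term
    return spectrum[r > min(h, w) / 4].sum() / spectrum.sum()

frame = np.random.rand(720, 2560)              # stand-in for a side-by-side capture
left, right = frame[:, :1280], frame[:, 1280:] # one card's output per half

for name, half in (("left", left), ("right", right)):
    print(name, round(high_freq_energy_fraction(half), 3))
# Whichever half reports more high-frequency energy will generally soak up
# more bits under a single rate-controlled encode, starving the other half.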


46 minutes ago, kurahk7 said:

It takes two 480s to get somewhat playable framerates, so I wouldn't call that 'destroying'.

What do you classify as "playable framerates"? Everything we've seen so far puts the RX 480 in the R9 390x to Air Fury range. Last I checked both those cards would play any game at 1080p60 Ultra. 60FPS isn't playable?


Yeah... Well if the shaders working improperly means the game looks better and is less taxing on a GPU, then please sign me up for the broken shading...


1 hour ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

The CPUs in this benchmark are different.

1070 - i7 6700k 4.0GHz

2x 480 - i7 5930k 3.5GHz


43 minutes ago, Daiyus said:

What do you classify as "playable framerates"? Everything we've seen so far puts the RX 480 in the R9 390x to Air Fury range. Last I checked both those cards would play any game at 1080p60 Ultra. 60FPS isn't playable?

Well, AMD reports a 62.5 fps average at 1080p maxed settings. I consider 60 fps the minimum for playable, with anything above that as better.


1 hour ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

I don't really agree with saying that it's AMD-biased. It's just Nvidia's inferior implementation of asynchronous compute that's lacking.

Shot through the heart and you're to blame, 30fps and i'll pirate your game - Bon Jovi

Take me down to the console city where the games are blurry and the frames are thirty - Guns N' Roses

Arguing with religious people is like explaining to your mother that online games can't be paused...


2 hours ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

 

On top of this, isn't Ashes in somewhat of a minority of games that can make a ton of use of async compute? Other big strategy/simulation games will make use of it too, but ultimately this is one of the "best case" scenarios for performance? For the hottest games (shooters, MOBAs, action, RPGs), does it matter so much?

ExMachina (2016-Present) i7-6700k/GTX970/32GB RAM/250GB SSD

Picard II (2015-Present) Surface Pro 4 i5-6300U/8GB RAM/256GB SSD

LlamaBox (2014-Present) i7-4790k/GTX 980Ti/16GB RAM/500GB SSD/Asus ROG Swift

Kronos (2009-2014) i7-920/GTX680/12GB RAM/120GB SSD


2 hours ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

It's nowhere near as "AMD" biased as it is the fault of Maxwell and Pascal (modified Maxwell) not properly supporting an official feature of DX12, purely because it would cost Nvidia time and money to redesign the front end of their architecture in order to make it properly optimized for these workloads.

 

If Nvidia had bothered to leave their hardware schedulers in, like they had in Fermi, and not cut them out in order to save power, then they would have beaten the shit out of AMD. But by taking the hardware schedulers out and relying on much slower context switching and preemption, they shot themselves in the "performance foot" in order to save power.
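
Just to put rough numbers on that trade-off, here's a back-of-the-envelope model (the millisecond figures are invented, this isn't measuring any real GPU): with working async scheduling the graphics and compute work overlap, so a frame costs roughly the slower of the two, while serializing them with context switches adds the two together plus a switch penalty:

# Toy model of overlapped (async) vs serialized graphics + compute work.
# All numbers are made up purely for illustration.
graphics_ms = 12.0   # per-frame graphics workload
compute_ms = 5.0     # per-frame async-compute workload
switch_ms = 1.0      # assumed context-switch / preemption cost

overlapped = max(graphics_ms, compute_ms)            # work runs concurrently
serialized = graphics_ms + compute_ms + switch_ms    # one after the other

print(f"overlapped: {overlapped:.1f} ms/frame -> {1000 / overlapped:.1f} FPS")
print(f"serialized: {serialized:.1f} ms/frame -> {1000 / serialized:.1f} FPS")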


15 hours ago, ANewFace said:

The real question is when are they coming out with the M versions for the people who can't spend $3000+ on a gaming laptop. I'm holding out on the Razer Blade till it gets upgraded with the 1070M

So you can have a mobile oven in your backpack? Razer's idea of cooling is no better than Apple's idea of cooling.

3 hours ago, ivan134 said:

Async drivers for Paxwell coming soon TM? 

lol, "300 SERIES IS A REBRAND!!!!!!! RIOT!!!!!!!!!!!!!!!!!!!!!!!!!!!!" and Nvidia "REVOLUTIONARY PRODUCTS, OMG CIRCLEJERK FAP FAP FAP"

3 hours ago, Trixanity said:

As for picking AotS and utilizing 2 x 480: it's quite simple. AotS is usually a favorable outcome for AMD cards, hence the results look good. Picking 2 x 480 means that it beats the 1080, so the results look good. Picking a single card only would mean it lost and then people would say "it's slower" "but it's $500 less" "yes but it's slower" "it's much cheaper" "Fast. Me like fast. Fast is gooder" etc.

Point being: it's bad to show the competitor performing better in your marketing even though it makes perfect sense and even if your product is overall superior (generally speaking).

 

So here AMD wants to showcase that they can beat a 1080 with their new product by putting two of them together for an overall still much cheaper solution that probably clocks in at or below $500 for 2 x 480 8 GB with aftermarket coolers. All this just to avoid detractors saying AMD isn't beating Nvidia's latest and greatest. Obviously all marketing smoke and mirrors but necessary smoke and mirrors it is.

 

2 hours ago, CommandMan7 said:

It's a bit of a tougher sell against the 1070:

 

(2x) 480 - 60 FPS - Extreme 1440p

1070 - 60 FPS - Extreme 1440p

 

The 1070 has slightly better framerates in medium and heavy batches, lower power consumption, none of the problems of CrossFire, and is a single-card solution. Keep in mind this is in Ashes of the Singularity, a game that's heavily AMD-biased, so in (2x) 480 vs 1070, the (2x) 480 will lose in most other titles.

AotS is not biased toward AMD. I think another forum member mentioned other games coming out that support DX12 along with (the post wasn't quite clear) async compute. It is not AMD-biased; it is simply a game that utilizes a part of AMD's GPUs that Nvidia fails to implement correctly in their lineup. That is Nvidia's fault, not AMD's. Instead they want to slap some duct tape on the "Paxwell" (thanks ivan) cards via drivers and hope it gives a performance boost.

 

 

i7-6700k  Cooling: Deepcool Captain 240EX White  GPU: GTX 1080Ti EVGA FTW3  Mobo: ASRock Z170 Extreme4  Case: Phanteks P400S TG Special Black/White  PSU: EVGA 850W GQ  RAM: 64GB (3200MHz, 4x16GB Corsair Vengeance RGB)  Storage: 1x 1TB Seagate Barracuda, 240GB SanDisk SSD Plus, 480GB OCZ Trion 150, 1TB Crucial NVMe
(Rest of Specs on Profile)


At the end of the day, no one is going to be playing AotS, and as has been the case in the past, buy the card that performs best in the games that you will actually be playing.

 

Edit: Oh, and don't buy two of the same card, because you're an idiot if you do.


1 hour ago, WereCat said:

The CPUs in this benchmark are different.

1070 - i7 6700k 4.0GHz

2x 480 - i7 5930k 3.5GHz

Yeah, but in both benchmarks the CPU framerate far exceeds the actual framerate, so it would only affect performance slightly, by less than a single FPS I'm guessing.
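
In other words, the frame rate you see is roughly capped by whichever side is slower, so a faster CPU with headroom to spare barely moves the result. Quick sketch with made-up per-rig numbers:

# Crude bottleneck model; the CPU-side figures are invented for illustration.
def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)   # the slower side sets the pace

print(effective_fps(cpu_fps=110, gpu_fps=60))  # 6700K rig -> 60
print(effective_fps(cpu_fps=100, gpu_fps=60))  # 5930K rig -> 60, same ballpark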

36 minutes ago, Fulgrim said:

I don't really agree with saying that it's AMD-biased. It's just Nvidia's inferior implementation of asynchronous compute that's lacking.

 

14 minutes ago, Commander Llama said:

 

On top of this, isn't Ashes in somewhat of a minority of games that can make a ton of use of async compute? Other big strategy/simulation games will make use of it too, but ultimately this is one of the "best case" scenarios for performance? For the hottest games (shooters, MOBAs, action, RPGs), does it matter so much?

 

8 minutes ago, Prysin said:

It's nowhere near as "AMD" biased as it is the fault of Maxwell and Pascal (modified Maxwell) not properly supporting an official feature of DX12, purely because it would cost Nvidia time and money to redesign the front end of their architecture in order to make it properly optimized for these workloads.

 

If Nvidia had bothered to leave their hardware schedulers in, like they had in Fermi, and not cut them out in order to save power, then they would have beaten the shit out of AMD. But by taking the hardware schedulers out and relying on much slower context switching and preemption, they shot themselves in the "performance foot" in order to save power.

 

4 minutes ago, DarkBlade2117 said:

AotS is not biased toward AMD. I think another forum member mentioned other games coming out that support DX12 along with (the post wasn't quite clear) async compute. It is not AMD-biased; it is simply a game that utilizes a part of AMD's GPUs that Nvidia fails to implement correctly in their lineup. That is Nvidia's fault, not AMD's. Instead they want to slap some duct tape on the "Paxwell" (thanks ivan) cards via drivers and hope it gives a performance boost.

 

Yes, yes and yes. It's not AMD biased, Nvidia is just stupid for their lack of foresight with Async. My bad on the exact phrasing.

 

And @Commander Llama, yes, AotS is probably the most Async friendly scenario possible. 

I am conducting some polls regarding your opinion of large technology companies. I would appreciate your response. 

Microsoft Apple Valve Google Facebook Oculus HTC AMD Intel Nvidia

I'm using this data to judge this site's biases so people can post in a more objective way.


2 minutes ago, CommandMan7 said:

Yeah, but in both benchmarks the CPU framerate far exceeds the actual framerate, so it would only affect performance slightly, by less than a single FPS I'm guessing.

 

 

 

 

Yes, yes and yes. It's not AMD biased, Nvidia is just stupid for their lack of foresight with Async. My bad on the exact phrasing.

 

And @Commander Llama, yes, AotS is probably the most Async friendly scenario possible. 

AMD "pushed" for Async compute features back in 2011-2012. Their roadmap for this feature has been blatantly obvious to Nvidia.

Mantle (Low level API)
Consoles (Low Level API with Async Compute usage in games here and there ever since the PS4 and XBONE launched)

Graphics Core Next architecture is heavily built around it.

 

Nvidia has KNOWN very well that AMD would push for async, and back with Fermi, Nvidia had hardware schedulers. However, starting with Kepler in 2012-2013, they removed them to reduce power, because hardware schedulers are more power-hungry than software scheduling in the driver. With Maxwell, they made the architecture even more reliant on drivers, and with Pascal, they made the majority of their new features ENTIRELY BASED ON DRIVERS; as for the hardware, they mostly just cleared up bottlenecks in the Maxwell architecture to speed it up, then shrunk it.

 

 


3 hours ago, TidaLWaveZ said:

 

Good question, but do two 4GB RX 480s beat a 1070? Otherwise AMD's $460 solution beats Nvidia's $400 solution.

Except Nvidia's solution is a $450 solution, not $400.


6 minutes ago, RagnarokDel said:

Except Nvidia's solution is a $450 solution, not $400.

Actually...

 

The reference card is $450

The base price is $380

 

So I'll edit my original comment to make it $380 instead of $400 if that's more accurate for you.
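
For what it's worth, the arithmetic behind the numbers being thrown around in this thread (the per-card 480 price is just what the $460 figure implies, so treat it as approximate):

# Pricing as discussed above; the RX 480 per-card figure is inferred from
# the "$460 solution" wording, so it's approximate.
rx480_each = 230      # implied by two cards ~= $460
gtx1070_base = 380    # the "base price" figure above
gtx1070_ref = 450     # the "reference card" figure above

print("2x RX 480:", 2 * rx480_each)                          # 460
print("GTX 1070:", gtx1070_base, "(base) /", gtx1070_ref, "(reference)")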

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


AMD is putting a lot of emphasis on making VR affordable for the masses. Are VR titles optimized for multi-GPU?

 

 

Ryzen 7 2700x | MSI B450 Tomahawk | GTX 780 Windforce | 16GB 3200
Dell 3007WFP | 2xDell 2001FP | Logitech G710 | Logitech G710 | Team Wolf Void Ray | Strafe RGB MX Silent
iPhone 8 Plus ZTE Axon 7 | iPad Air 2 | Nvidia Shield Tablet 32gig LTE | Lenovo W700DS


2 minutes ago, dtaflorida said:

AMD is putting a lot of emphasis on making VR affordable for the masses. Are VR titles optimized for multi-GPU?

I believe AMD said before that VR works excellently with mGPU, being able to assign an eye to each GPU. So I would imagine so.


12 minutes ago, shdowhunt60 said:

I believe AMD said before that VR works excellently with mGPU, being able to assign an eye to each GPU. So I would imagine so.

Googling around doesn't make me confident in that. The most recent links seem to be from the Steam Community about a week ago, complaining about CrossFire/SLI.

Time to wait for Vega... Better safe than sorry, been burned by that before. 

 

 

Ryzen 7 2700x | MSI B450 Tomahawk | GTX 780 Windforce | 16GB 3200
Dell 3007WFP | 2xDell 2001FP | Logitech G710 | Logitech G710 | Team Wolf Void Ray | Strafe RGB MX Silent
iPhone 8 Plus ZTE Axon 7 | iPad Air 2 | Nvidia Shield Tablet 32gig LTE | Lenovo W700DS

