Fulgrim

Regarding the RX 480 AOTS benchmark at Computex

Recommended Posts

3 hours ago, Trixanity said:

I suppose, but the point is: the cards do more work. The benchmark is CPU-bottlenecked with the "normal" batches. This is an extension of my comment about how the CPU does more work to feed two cards than a single card.

Normal batches mean fewer draw calls, which means less work for the CPU to feed the GPUs, so I don't understand why you think normal batches become a CPU bottleneck instead of a GPU bottleneck.

 

 

Anyway, this thread got derailed pretty fast, since it's clearly the 1080 that's at fault here. xD


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Please, everyone, just stop and wait for the cards. Nvidia has lied plenty about the GTX 1080 (running cool, being faster than 2x 980, etc., maybe in a cherry-picked game or something), so let's wait for the AMD cards too. I personally don't think 2x 480s are faster than 1x 1080 in every single game with 50% headroom left to spare; that would make 1x 480 equal to 1x 1080, which makes no sense. AotS is an AMD partnership title, and I believe it's cherry-picked too, but somewhat accurate. Let's just wait and see the reality.

22 minutes ago, deviant88 said:

Please, everyone, just stop and wait for the cards. Nvidia has lied plenty about the GTX 1080 (running cool, being faster than 2x 980, etc., maybe in a cherry-picked game or something), so let's wait for the AMD cards too. I personally don't think 2x 480s are faster than 1x 1080 in every single game with 50% headroom left to spare; that would make 1x 480 equal to 1x 1080, which makes no sense. AotS is an AMD partnership title, and I believe it's cherry-picked too, but somewhat accurate. Let's just wait and see the reality.

It's only 50% headroom in scenes with extreme amounts of CPU work.

33 minutes ago, xAcid9 said:

Normal batches mean fewer draw calls, which means less work for the CPU to feed the GPUs, so I don't understand why you think normal batches become a CPU bottleneck instead of a GPU bottleneck.

 

 

Anyway, this thread got derailed pretty fast, since it's clearly the 1080 that's at fault here. xD

Source: AMD. They say the normal batch is CPU-bound; I suppose it's because GPU utilization sits at 51%, so the GPU isn't doing much work. Going by their numbers, an increase in draw calls apparently lessens the CPU-bound scenario here, which would indicate that more draw calls reduce the CPU bottleneck. I guess this also depends on what one perceives as a bottleneck. Is the GPU being choked by a lack of draw calls, and does that mean the GPU is the bottleneck? Or is it the CPU that isn't providing the performance (or draw calls) necessary to keep the GPU occupied? I would personally consider the CPU the bottleneck in this scenario. Or perhaps I'm wrong, and the GPU is unable to distribute the work properly to achieve maximum utilization when it doesn't receive enough draw calls. If so, perhaps a driver update could fix that.
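A toy model can make the distinction concrete. This is purely illustrative (the function and every number in it are mine, invented for the sketch, not AotS or AMD measurements): frame time is gated by whichever of CPU submission or GPU rendering takes longer per frame, so ~51% GPU utilization is exactly what you'd expect when the CPU takes roughly twice as long per frame as the GPU does.

```python
# Toy model: per-frame time is gated by the slower of CPU submission and GPU
# rendering. All numbers are invented for illustration, not AotS measurements.

def frame_stats(cpu_ms_per_frame, gpu_ms_per_frame):
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    fps = 1000.0 / frame_ms
    gpu_util = gpu_ms_per_frame / frame_ms  # fraction of the frame the GPU is busy
    limiter = "CPU" if cpu_ms_per_frame > gpu_ms_per_frame else "GPU"
    return fps, gpu_util, limiter

# "Normal" batch: light GPU work, so the CPU's submission rate dominates.
fps, util, limiter = frame_stats(cpu_ms_per_frame=12.0, gpu_ms_per_frame=6.1)
print(f"{limiter}-bound: {fps:.0f} fps, GPU ~{util:.0%} busy")

# "Heavy" batch: GPU work grows faster than CPU work, so the limiter flips.
fps, util, limiter = frame_stats(cpu_ms_per_frame=14.0, gpu_ms_per_frame=22.0)
print(f"{limiter}-bound: {fps:.0f} fps, GPU ~{util:.0%} busy")
```

On this reading, "51% GPU utilization" and "CPU bottleneck" are the same observation from two sides: the GPU sits idle for the other 49% of the frame waiting for work.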

 

In either case: you'll have to take this one up with AMD. It's their statement.

47 minutes ago, Trixanity said:

In either case: you'll have to take this one up with AMD. It's their statement.

Even if that doesn't really make sense?  xD

 

I read @Thracks' post on Reddit. Yeah, lower GPU utilization means it could be bottlenecked by the CPU, but why does the CPU become the bottleneck in the less demanding scenario in the first place?



32 minutes ago, xAcid9 said:

Even if that doesn't really make sense?  xD

 

I read @Thracks' post on Reddit. Yeah, lower GPU utilization means it could be bottlenecked by the CPU, but why does the CPU become the bottleneck in the less demanding scenario in the first place?

Driver overhead.

15 minutes ago, xAcid9 said:

Even if that doesn't really make sense?  xD

 

I read @Thracks' post on Reddit. Yeah, lower GPU utilization means it could be bottlenecked by the CPU, but why does the CPU become the bottleneck in the less demanding scenario in the first place?

I'll assume the experts know what they're saying, although blind trust is never a good thing :)

 

Well, hard to say. All I can do is speculate since I'm not exactly a software engineer or anything of the sort. It is a conundrum. So I'll try to come up with something: 

 

The game is still doing lots of CPU work (physics, AI, etc.) despite the graphical fidelity going down. So the GPU is doing less, while the CPU can still feed it adequately because the GPU tasks are smaller. However, even if we assume the CPU tasks are also reduced (the normal batch being less intensive on both CPU and GPU versus the heavy batch), the CPU is still constantly busy. So we're in a scenario where more CPU power (e.g. overclocking, more cores) could potentially increase the FPS and reduce the bottleneck. Whether we call that a bottleneck is debatable; let's just say the performance/FPS in this scenario is limited by how fast the CPU does things, not the GPU. The GPU is doing its job without breaking a sweat, and more GPU power would not increase the FPS, because what the system lacks is CPU power, not GPU power.

 

Since AotS is an RTS, things like many units, complex AI, and dynamic weather (I don't know if AotS has all of these, but I'm guessing it's probable) are very demanding on the CPU. Even a less intensive benchmark with, for example, fewer units on screen (or off screen) could still be CPU-limited with modern CPUs. However, with modern APIs and modern programming techniques, the bottleneck might be lessened or removed by proper CPU scaling (clock speed and cores).
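The CPU-scaling idea can be sketched the same way. Under an API that lets the engine record and submit command lists from several threads, the per-draw-call CPU cost divides across cores, while the fixed per-frame work (AI, physics, simulation) does not, so adding cores helps but hits diminishing returns. Again, every number and the formula itself are invented for illustration:

```python
# Crude sketch: CPU frame time = fixed per-frame work (AI, physics, sim) plus
# draw-call submission cost, with only the submission part splitting across
# cores. Numbers are made up; this is not profiled from any real engine.

def cpu_frame_ms(draw_calls, cores, per_call_us=5.0, fixed_ms=8.0):
    submit_ms = draw_calls * per_call_us / 1000.0 / cores
    return fixed_ms + submit_ms

for cores in (1, 2, 4, 8):
    ms = cpu_frame_ms(draw_calls=20000, cores=cores)
    print(f"{cores} core(s): {ms:.1f} ms CPU frame time")
```

The returns diminish because the fixed portion never shrinks (the usual Amdahl's law shape), which would fit the observation that even "less demanding" RTS scenes stay CPU-limited on current CPUs.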

 

It could be that or just driver overhead as suggested :)

 

Hopefully my post is somewhat coherent and not entirely wrong.


I'm still gonna put a 480 in my system. It's cheap and good from what I've seen. 


CPU: Intel Core i5 6600k - RAM: 16GB DDR4 - Case: Phanteks Enthoo Luxe-  PSU: RM750 Modular Cooler: Corsair H55 - Storage: 128gb SSD + 1tb HDD -  OS: Windows 7 - GFX Card: ASUS R7 360

 

On 6/2/2016 at 9:58 AM, AmbarChakrabarti said:

 

$400 GPU setup beats a $600 GPU! Holy Balls!

And you only have to worry about Crossfire issues. I went for two 970s instead of a 980. SLI, never again. One 1080 is still going to be objectively better than two of any cheaper card in SLI for that reason alone. 

 

It's impressive that they're selling it at such a low price, but one card for me. 

 

 

Disclaimer: I am aware of the difference between SLI and Crossfire. Both have undeniably annoying issues in many titles. 

On 02/06/2016 at 11:56 PM, Fulgrim said:

GTX 1080 is incorrectly executing the terrain shaders

I liked the incorrectly executed terrain shaders tho :/

The image on 480s looked kinda bland


hello!

is it me you're looking for?

ᴾC SᴾeCS ᴰoWᴺ ᴮEᴸoW

Spoiler

Desktop: X99-PC

CPU: i7 5820k

Mobo: X99 Deluxe

Cooler: Dark Rock Pro 3

RAM: 32GB DDR4
GPU: GTX 1080

Storage: 1TB 850 Evo, 1TB HDD, bunch of external hard drives
PSU: EVGA G2 750w

Peripherals: Logitech G502, Ducky One 711

Audio: Xonar U7, O2 amplifier (RIP), HD6XX

Monitors: 4k 24" Dell monitor, 1080p 24" Asus monitor

 

Laptop:

-Overkill Dell XPS

Fully maxed out early 2017 Dell XPS 15, GTX 1050 4GB, 7700HQ, 1TB NVMe SSD, 32GB RAM, 4K display. 97 Wh battery :x 
Dell was having a $600 off sale for the fully specced out model, so I decided to get it :P

 

-Crapbook

Fully specced out early 2013 MacBook "Pro" with a GT 650M and a constant 105°C CPU temperature (GPU is 80-90°C) when doing anything intensive...

A 2013 laptop with a regular-sized battery still has better battery life than a 2017 laptop with a massive battery! I think this is a testament to Apple's ability at making laptops, or maybe to how little CPU technology has improved even 4+ years later (at least until the recent introduction of 15W 4-core CPUs). Anyway, I'm never going to get a 35W-CPU laptop again unless battery technology becomes ~5x better than it is in 2018.

Apple knows how to make proper consumer-grade laptops (they don't know how to make pro laptops, though). I guess this is mostly related to software power efficiency, but getting a Mac makes perfect sense if you want a portable, powerful laptop that can do anything you want with great battery life.

 

 

On 3-6-2016 at 11:00 PM, Entropy said:

And you only have to worry about Crossfire issues. I went for two 970s instead of a 980. SLI, never again. One 1080 is still going to be objectively better than two of any cheaper card in SLI for that reason alone. 

 

It's impressive that they're selling it at such a low price, but one card for me. 

 

 

Disclaimer: I am aware of the difference between SLI and Crossfire. Both have undeniably annoying issues in many titles. 

 

Maybe for you, but I've never experienced issues with SLI or Crossfire, and I have a lot of games.


EOC folding stats - Folding stats - My web folding page stats

 

Summer Glau quote: "The future is worth fighting for." - Serenity

 

My linux setup: CPU: I7 2600K @4.5Ghz, MM: Corsair 16GB vengeance @1600Mhz, GPU: 2 Way Radeon his iceq x2 7970, MB: Asus sabertooth Z77, PSU: Corsair 750 plus Gold modular

 

My gaming setup: CPU: I7 3770K @4.7Ghz, MM: Corsair 32GB vengeance @1600Mhz, GPU: 2 Way Gigabyte RX580 8GB, MB: Asus sabertooth Z77, PSU: Corsair 860i Platinum modular

On 6/2/2016 at 10:11 AM, No Nrg said:

I still think it's a bit ridiculous that people continue to use this game that no one even plays as a performance benchmark and call it definitive proof of one card being better than the other.

 

Show me benchmarks from a whole list of games with one beating the other consistently across a majority of the list and maybe I won't call bullshit. ;)

Well, AotS is the only ground-up DX12 title to date, which gives it street cred. What other DX12 title can pair an Nvidia and an AMD GPU and actually benchmark it?

On 6/3/2016 at 2:29 PM, TidaLWaveZ said:

 

I'm sure it is. I'm not out to bash the RX 480; I don't see anything wrong with it. I honestly prefer a single card after a few lackluster experiences with SLI/Crossfire. For my situation I think the extra $140 for the GTX 1080 is justifiable, and I also honestly think that the two-RX 480 solution will not end up being comparable to the 1080 (though I could be wrong).

Yes, I'm sorry I didn't make it clear that I meant the 480 is a sweet-spot single-GPU solution if it performs around the same as a 980. I didn't mention, or mean to mention, CF. I just meant that a single 480 might be good enough for a lot of people, and that people wouldn't need to consider buying anything more expensive unless they planned a 4K build, or unless vastly more demanding games came out, which I doubt, since my 980 is still more than capable of maxing every single game I play at 1080p at ~60+ fps. I even DSR to 4K in a lot of my games just for fun (the difference is minimal at 1080p since MSAA etc. are adequate). 

 

So I'm super excited about how amazing the performance might be in the $200-$300 range, and I'd expect a lot of people who were planning on buying a high-end solution for 1080p to fall back to this option if it delivers on its promised performance.


Spoiler

CPU: R5 1600 @ 4.2 GHz; GPU: Asus STRIX & Gigabyte G1 GTX 1070 SLI; RAM: 16 GB Corsair Vengeance 3200 MHz; Mobo: ASRock Taichi X470; SSD: 512 GB Samsung 950 Pro; Storage: 5x Seagate 2TB drives, 1x 2TB WD Purple; PSU: 700 Watt Huntkey; Peripherals: Acer S277HK 4K monitor, Logitech G502 gaming mouse, Corsair K95 mechanical keyboard, Logitech X530 5.1 sound system

 01000010 01101001 01101110 01100001 01110010 01111001 00100000 01100100 01101111 01100101 01110011 01101110 00100111 01110100 00100000 01101101 01100001 01101011 01100101 00100000 01111001 01101111 01110101 00100000 01110000 01110010 01101111 00101110

 

 

 


Officially confirmed that NVidia fracked up their shitty drivers yet again:

 

 


Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

1 hour ago, Notional said:

Officially confirmed that NVidia fracked up their shitty drivers yet again:

 

 

This seems to be a recurring theme this year: nVidia rushing out drivers of dubious quality.

3 minutes ago, shdowhunt60 said:

This seems to be a recurring theme this year: nVidia rushing out drivers of dubious quality.

And people blaming AMD for the result xD



21 minutes ago, Notional said:

And people blaming AMD for the result xD

Just LTT drama things 


One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

2 minutes ago, suicidalfranco said:

Just LTT drama things 

Sadly it was all over many prominent tech forums and Reddit, with so many people claiming AMD was "obviously" running at lower graphical settings and lying.


5820K 4.0GHz | NH D15S | 32 GB RAM | GTX 580 | ASUS PG348Q+MG278Q

 

2 minutes ago, Valentyn said:

Sadly it was all over on many prominent tech forums and reddit; so many people claiming AMD were "obviously" running at lower graphical settings and lying.

LTT is the only forum i visit, don't even know what's reddit ¯\_(ツ)_/¯



Posted · Original Poster (OP)
13 minutes ago, suicidalfranco said:

LTT is the only forum i visit, don't even know what's reddit ¯\_(ツ)_/¯

Pure cancer, please avoid it.


Shot through the heart and you're to blame, 30fps and i'll pirate your game - Bon Jovi

Take me down to the console city where the games are blurry and the frames are thirty - Guns N' Roses

Arguing with religious people is like explaining to your mother that online games can't be paused...

1 minute ago, Fulgrim said:

Pure cancer, please avoid it.

Some would say the same about LTT. It all depends on who you are asking.


@Fulgrim Mind updating your post with the twitter update?

 

3 hours ago, Notional said:

Officially confirmed that NVidia fracked up their shitty drivers yet again:

 

 

 



Posted · Original Poster (OP)
2 minutes ago, Notional said:

@Fulgrim Mind updating your post with the twitter update?

 

 

will do fam



1 hour ago, shdowhunt60 said:

This seems to be a recurring theme this year: nVidia rushing out drivers of dubious quality.

what are you talking about?

 

AMD used pre-release drivers instead of the 368.25 WHQL driver, which had been publicly available since May 26th.

