GTX 1060 vs RX 480: The Verdict

PCGuy_5960
1 hour ago, Zangashtu said:

Does Chill actually make a difference to anybody's games?

From what I can see, there's no performance/temp difference, or is it one of those "only works on some supported games" features?

That's right, and you need to enable it first.

 

[Image: Radeon Software Crimson ReLive settings screenshot]

 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


That'll be why then, I only play one or two of them :P
Thanks!

Intel i5 6600k @ 4.5GHz | NZXT Kraken X52 | Asus Maximus VIII Hero | Asus ROG Strix GTX 1070 | 16GB DDR4 Corsair Dominator Platinum @ 3200MHz | CoolerMaster MasterCase 5


4 hours ago, Zangashtu said:

Does Chill actually make a difference to anybody's games? [...]

AFAIK there's a growing list of "supported" titles. If the game isn't on the list, there's no guarantee it will work or do anything at all, but it might.

 

15 hours ago, App4that said:

480 can crossfire, that is why it wins. No need to cherry-pick APIs. /topic

IMO the 480 wins for these reasons:

- Better at 1440p

- FreeSync (the monitors are much cheaper, and a person spending around $220 on a GPU such as the 480/1060 usually can't afford a G-Sync monitor; FreeSync is another story though)

- It's cheaper:

[Images: pcpartpicker price comparison screenshots]

- Crossfire, obviously

- The 4GB version doesn't have a cut-down chip, although it does have slightly slower memory

- The performance keeps getting better with each driver update; AFAIK it should be equal on average to the 1060 at 1080p across many AAA titles atm, and better at any higher resolution

 

14 hours ago, App4that said:

Dat stuttering, and being limited by 4GB of VRAM, which is catching up with Furys.

I'm not sure what stuttering you mean. My Fury isn't CPU-limited at all considering my current CPU, and I haven't experienced any stuttering since November when I switched to a new rig.

 

Also, from the testing I did at 1920x1200, not many games exceed 4GB of VRAM, and those that do don't show any noticeable performance drops or other issues related to VRAM limitations. I even pushed ROTTR textures from High (just under 4GB usage) to Ultra (just under 6GB), and in the benchmark I lost maybe 1-2FPS. Then I played the game for 6 hours (I've already finished it twice, so I didn't play through it entirely again): no stuttering, no FPS drops, and VRAM usage oscillated around 5-6GB ±0.5GB. When I had my 4GB 290X and set the textures to Ultra, the FPS drop was noticeably higher and the game did stutter. Badly. I don't think it was because of my CPU, as I had an i7-3770 at the time and AFAIK it wasn't limiting the 290X.

 

That gave me a direct comparison of 4GB of GDDR5 (on the wide memory bus of the 290X) against 4GB of HBM. The Fury is obviously a faster chip, but I'm not comparing the numbers directly, just the performance loss when exceeding the 4GB VRAM threshold. Testing in Shadow of Mordor with the high-res texture pack yields similar results.

 

I've yet to test more titles, but the initial results suggest that if the game doesn't allocate more than 6GB of VRAM for itself, it shouldn't be an issue on the Fury-series cards.

 

The only game where I've seen VRAM become an issue on the Fury is the latest Resident Evil with the "Shadow Cache" setting enabled. Apparently it boosts performance (especially for AMD cards), but it can max out the VRAM of even the 1060 and the 980Ti. However, benchmarks of the same game with that setting disabled for cards that don't have more than 4GB look like this:

[Image: Guru3D RE7 benchmark chart]

Looks like the Fury doesn't need that setting to perform well in RE7 :) (which is why I went for the Fury over the 480 and 1060: I got a faster card for the same price in Poland)

 

Also, a bonus which might interest you and @Prysin: my FireStrike run with the Fury. Look at the GPU score, which matches a ref. 980Ti and a 1070 FE:

http://www.3dmark.com/3dm/17281882?

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


45 minutes ago, Morgan MLGman said:

[full post quoted above]

It's not hard to use more than 4GB of VRAM even at 1080p these days.

 

Most games, Shadow of Mordor included, use 2K textures, not 4K. The texture pack isn't that demanding. Games these days rarely if ever use more than 2K textures; even at 4K resolution, they basically supersample 2K textures. This is why many games look annoyingly sharp in 4K. Because it is.

 

Please refrain from talking about FreeSync again. The monitors you are showing in your screenshot are so appalling that they should simply never have been made in the first place.

 

The Fury "floats" on its wide bus. The moment you're past that "life buoy", all Fiji GPUs crap themselves.

 

Fiji is bad for crossfire and has more stutter than Hawaii. Polaris is way better than Hawaii in CF due to hardware being tweaked for better frame timing.

 

Weak-ass FireStrike score. Any respectable 980Ti will hit 21-23k.


@Morgan MLGman

 

The Fury isn't a bad card, it's just limited. You know FPS-only benchmarks don't mean shit to me. What's lost by lowering settings? What are the frame times, the frame pacing, any stuttering?

 

If someone has a Fury cool, but advising one is risky because of the memory limitation.

If anyone asks you never saw me.


41 minutes ago, App4that said:

[...] If someone has a Fury, cool, but advising one is risky because of the memory limitation.

Due to the memory compression and massive bandwidth on the bus it acts like 5GB, but that is already at the limit for 1080p ULTRA...


2 minutes ago, Prysin said:

Due to the memory compression and massive bandwidth on the bus it acts like 5GB, but that is already at the limit for 1080p ULTRA...

Some games don't look at speed; they cut features based on capacity.



4 hours ago, Morgan MLGman said:

[Image: Guru3D RE7 benchmark chart]

Looks like the Fury doesn't need that setting to perform well in RE7 :) (Why I went for the Fury over the 480 and 1060, got a faster card for the same price in Poland)
 

 

Just for the record though ^^ that graph is utter bullshit... I played through the entire RE7 game (AMAZING game btw, I highly recommend it) on maximum quality settings and NOT ONCE did the FPS counter dip below 90FPS... it was mostly around 115-130FPS throughout, with some dips into the 95-100FPS range when fighting a crazy boss or some shit like that... 74FPS for a 980Ti ain't happening... even at reference clocks with no overclocking, the average will still be upward of 90FPS, I'm sure, unless you bench with a crap CPU or something... just saying... my machine runs this game at the "TITAN XP" level of this graph, all day long :) bullshit!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


6 minutes ago, i_build_nanosuits said:

Just for the record though ^^ that graph is utter bullshit... 74FPS for a 980Ti ain't happening... [...]

It's because they nerfed the 980Ti to 1000MHz, which is why I hate benchmarks like this one. Biased af.



3 hours ago, Prysin said:

1. It's not hard to use more than 4GB of VRAM even at 1080p these days.

2. Most games, Shadow of Mordor included, use 2K textures, not 4K. The texture pack isn't that demanding. Games these days rarely if ever use more than 2K textures; even at 4K resolution, they basically supersample 2K textures. This is why many games look annoyingly sharp in 4K. Because it is.

3. Please refrain from talking about FreeSync again. The monitors you are showing in your screenshot are so appalling that they should simply never have been made in the first place.

4. The Fury "floats" on its wide bus. The moment you're past that "life buoy", all Fiji GPUs crap themselves.

5. Fiji is bad for crossfire and has more stutter than Hawaii. Polaris is way better than Hawaii in CF due to hardware being tweaked for better frame timing.

6. Weak-ass FireStrike score. Any respectable 980Ti will hit 21-23k.

1. Huh, there aren't that many games that do it, though, so unless you're using custom texture packs it's not that common to exceed 4GB at 1080p IMHO.

 

2. No one said the texture pack for Shadow of Mordor is demanding, but it does eat up the VRAM.

 

3. Not sure what screenshots you're talking about; the pcpartpicker screenshots were taken on my Latitude E6420 ATG at work at something like 1366x768 resolution. And why would I refrain from talking about FreeSync? I think I missed your point here.

 

4. I wasn't talking about going with two Fiji GPUs. It's quite obvious that going crossfire with 4GB of memory, no matter how fast, will hit a limit with GPUs as beefy as Fiji.

 

5. I'm not sure it has more stutter than Hawaii, and I think I should be the one judging that, since my previous card a few months ago was an R9 290X.

 

6. Weak-ass FireStrike score?

Of course an overclocked 980Ti will shoot way ahead of a 17,300 GPU score; this is why I specifically wrote "ref. 980Ti and 1070 FE".

Show me a GTX 1060 which even comes close. At the time I bought the Fury, it was cheaper than a 6GB 1060 and around the price of a reference 8GB 480.

 

Here's a run of a GTX 1060 at a high overclock of 2139MHz core and 2177MHz memory:

http://www.3dmark.com/fs/9563110

14,171 GPU score.

An RX 480 at 1400MHz core and 2250MHz memory:

http://www.3dmark.com/3dm/13746696

14,775 GPU score.

 

See the point? Now, judging from this review: http://www.legitreviews.com/nvidia-geforce-gtx-1070-founders-edition-video-card-review_181760

A stock 1070 FE does a 17,375 GPU score; my Fury does around 17,300. I'm not saying it's a faster card, because we all know it's not, but at the right price it's an insane deal if it's even coming close. I've seen those go for as low as $230 after a MIR.

[Image: GTX 1070 Fire Strike results chart from Legit Reviews]
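For reference, the relative gaps between the GPU scores quoted in this post work out as below; the labels are just shorthand for the linked runs, not official names:

```python
# Relative Fire Strike GPU scores, using the numbers quoted in this post.
# Labels are shorthand for the linked runs, not official product names.
scores = {
    "Fury (this post)": 17300,
    "GTX 1070 FE (stock)": 17375,
    "RX 480 @1400/2250": 14775,
    "GTX 1060 @2139/2177": 14171,
}

baseline = scores["Fury (this post)"]
for card, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {100 * score / baseline:.1f}% of the Fury's GPU score")
```

That puts the heavily overclocked 1060 roughly 18% behind the Fury, and the stock 1070 FE essentially tied with it.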

 

Because of price drops on the Fury, videos like this started to appear some time ago:

1 hour ago, App4that said:

[...] What's lost by lowering settings? What are the frame times, the frame pacing, any stuttering?

I'm wondering if I can do such measurements myself; would FCAT be what I need for that? Frame times shouldn't be an issue with my current CPU though, a 6700K @4.7GHz is almost as good as it gets for gaming...
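FCAT needs capture hardware, but a software-side approximation is possible with any tool that logs per-frame present times to CSV (PresentMon-style output is one example). A minimal sketch, assuming a `MsBetweenPresents` column; the column name and file path are assumptions:

```python
import csv
import statistics

def frame_stats(path, column="MsBetweenPresents"):
    """Summarize per-frame times (ms) from a CSV log into stutter-relevant stats."""
    with open(path, newline="") as f:
        times = sorted(float(row[column]) for row in csv.DictReader(f))
    avg_ms = statistics.mean(times)
    p99_ms = times[int(0.99 * (len(times) - 1))]  # 99th-percentile frame time
    return {
        "avg_fps": 1000.0 / avg_ms,
        "p99_frame_ms": p99_ms,           # big spikes here show up as stutter
        "1pct_low_fps": 1000.0 / p99_ms,  # the "1% low" figure reviewers quote
    }

# Hypothetical usage: stats = frame_stats("rottr_run.csv")
```

Average FPS can look fine while the 99th-percentile frame time is terrible, which is exactly the stutter argument being made in this thread.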

 

30 minutes ago, i_build_nanosuits said:

Just for the record though ^^ that graph is utter bullshit... 74FPS for a 980Ti ain't happening... [...]

Yeah, I know, those "raw" FPS numbers are almost always inaccurate; in most graphs I look at the relative performance of one card versus the other rather than at the raw FPS numbers.

Though they should be accurate, as Guru3D re-did those benchmarks three times because of the Shadow Cache issue, since initially some cards performed way below expectations :P



9 hours ago, Zangashtu said:

Does Chill actually make a difference to anybody's games? [...]

 

7 hours ago, xAcid9 said:

That's right and you need to enable it first. [...]

 

You can make it work with ANY game.

http://www.endoflinemagazine.com/2017/01/enable-amds-radeon-chill-feature-for.html

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


11 minutes ago, Morgan MLGman said:

[...] Though they should be accurate, as Guru3D re-did those benchmarks three times because of the Shadow Cache issue [...]

I used Shadow Cache ON, all max settings except no motion blur... average FPS, I would say, around 110-115FPS.

They patched the game and Nvidia released a driver for it though, so maybe that's why, I don't know, but it does run stupid good... it mostly uses all my VRAM at 1440p though. No stutter... buttery smooth.



21 hours ago, App4that said:

Any time you see "X% faster in X API", it's biased.

Game to game, and in just about every case, you can run both in the API that gives them the strongest performance and they're equal.

480 can crossfire, that is why it wins. No need to cherry-pick APIs. /topic

This is why I didn't say that :D

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


5 hours ago, Morgan MLGman said:

Better at 1440p

Depending on which games you play....



10 minutes ago, Prysin said:

@Morgan MLGman

FYI, my 295x2 hits just north of 25k...

http://www.3dmark.com/fs/7714404

Well, it is a dual-290X card after all... An awesome one too. I wanted one so badly, but they were extremely hard to even find here because the prices are ridiculous compared to the US; over $1400 was not what I'd like to pay for it :P

Though my 290X scored just under a 14,000 GPU score at 1150/1450, so the scaling in FireStrike is near-perfect.
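As a rough sanity check of that scaling claim, using the round numbers from this exchange (the 295x2 runs different clocks than an overclocked 290X, so this is only ballpark):

```python
# Ballpark CrossFire scaling in Fire Strike, from the scores in this exchange.
single_290x = 14000   # "just under 14,000" GPU score at 1150/1450
dual_295x2 = 25000    # "just north of 25k" for the 295x2

scaling = dual_295x2 / (2 * single_290x)
print(f"~{scaling:.0%} of a perfect 2x scale-up")
```

That comes out to roughly 89% of perfect doubling, which is about as good as alternate-frame rendering gets.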



8 minutes ago, Morgan MLGman said:

[...] they were extremely hard to even find here because the prices are ridiculous compared to the US; over $1400 was not what I'd like to pay for it :P [...]

lol, I paid $550 for my 295x2


1 hour ago, Morgan MLGman said:

Show me a GTX 1060 which even comes close.

Just to piss you off:

http://www.3dmark.com/fs/10282767



5 minutes ago, Prysin said:

lol, I paid $550 for my 295x2

what a bust!



  • 4 weeks later...
On 11/02/2017 at 5:00 PM, PCGuy_5960 said:

I decided to make this thread to clear up some misconceptions regarding the RX 480 vs GTX 1060 debate.

GTX 1060 3GB vs RX 480 4GB

Get the RX 480 4GB; it is faster and cheaper. Consider the 1060 ONLY for 2015 and older titles.

GTX 1060 6GB vs RX 480 8GB

Reasons to consider the 1060:

  • G-Sync (Freesync is cheaper though)
  • Nvidia FastSync
  • Nvidia SMP
  • Nvidia Ansel
  • ShadowPlay
  • Around 10% faster in 2015 and older titles
  • Slightly lower power consumption
  • Nvidia DSR
  • Virtual Super Resolution

Reasons to consider the RX 480:

  • AMD Freesync (Especially if you are thinking of purchasing a new monitor)
  • Crossfire (Not recommended IMO, but it will allow you to install a second GPU and get GTX 1080 like performance)
  • 2GB more VRAM
  • Around 10% faster in DX12/Vulkan titles
  • AMD ReLive (Shadowplay is a bit easier to use though)
  • More "futureproof"
  • Radeon Chill
  • AMD Zerocore (For Crossfire only)
  • AMD Eyefinity

I would like to point out that in 2016 games, the RX 480 and the GTX 1060 perform similarly (within 5fps of each other)

Conclusion

The RX 480 4GB is better than the 1060 3GB in most games, and it is also cheaper, so between those two, get the RX 480 4GB.

The RX 480 8GB and the GTX 1060 6GB are really close in terms of performance. The 480 pulls ahead in Vulkan and DX12 titles, while the GTX 1060 pulls ahead in DX11 games.

In 2016 games, they perform identically.

TL;DR

RX 480 8GB>GTX 1060 6GB, in DX12/Vulkan titles

RX 480 8GB=GTX 1060 6GB, in 2016 titles

RX 480 8GB<GTX 1060 6GB, in 2015 and older titles (DX11)

 

RX 480 4GB>GTX 1060 3GB, in most games

(Sources: https://youtu.be/CiYQqNiqQKU, https://youtu.be/gEw3CaNSbUo)

 

Edit 1:

Pricing: https://linustechtips.com/main/topic/736513-rx-480-vs-gtx-1060-price-comparison/

 

Edit 2:

Added AMD Zerocore and Eyefinity. Thanks to @Prysin for pointing that out.

 

"Crossfire (Not recommended IMO, but it will allow you to install a second GPU and get GTX 1080 like performance)"

1080-like performance for £300? Am I misunderstanding this?

Link to comment
Share on other sites

Link to post
Share on other sites

Lol, I had to register JUST to say this: two cards is not a good idea! This is from experience of trying to get budget performance. My setup was even put together by experts (I am 35 and have always built my own PCs since my early teens, so... whatever), and I have since had friends with dual-card setups (I even passed on my old card to a friend recently).

Sure, you look at a performance chart and you see extra frames, but the reality is micro-stuttering, heat (WHY THE F*** DO THE CARDS HAVE TO BE SO CLOSE), and, if you are getting two cards for value, usually low VRAM!

Sure, sure, you get some extra frames, but the devil is literally having a party thanks to the extra heat, punishing you with micro-stuttering and an increase in your power usage.

The smartest tactic I have ever applied to video cards (or hardware in general) is: upgrade when you want/need it for a specific game (or other purpose), not just because you can. And if you can (and spend a lot of time with something), spoil yourself, bcs... YOLO. If you really have to, you can normally find buyers for good products when you sell them on (so invest in good stuff that people want).

</tangent>

That aside, nice write-up, and a good way of dealing with a moody bunch of forum dwellers! Thanks!


Nvidia released new drivers with better DX12 support, so the GTX 1060 is now a little closer to the RX 480 in DX12 performance, but I'm not sure if there are improvements in other DX12 games. At least BF1's DX12 is still a mess.

 

[Image: DX12 benchmark chart]

 

I tried Rise of the Tomb Raider with DX12 enabled and I got around a 15-20% improvement.

Intel Core i5-13600KF @ 5.4/4.3Ghz (P/E) / Noctua NH-D15 chromax.black / ASRock Z790 PG Lightning / 32GB Kingston Fury Beast DDR5 @ 6400Mhz CL32 / PowerColor Radeon RX 6950 XT Red Devil 16GB @ 2700Mhz / Samsung 980 Pro 1TB / WD Blue SN570 2TB / Corsair RM1000x / Lian Li Lancool 216 / Lenovo G27Q-20 (1440p & 165Hz IPS)


1 minute ago, Lare111 said:

So GTX 1060 is now little closer to RX 480 in DX12

It is actually better than the 480 now... The 480 was only 5% faster :P


