After 5 years, I've finally upgraded my graphics card. I decided to run some benchmarks to see how the two cards compare.

 

The games I picked were ones that I either knew would be demanding or actually intend to play. Because these results were for my own reference, I kept the undervolt/OC on my RTX 2060 Super intact - I want to know how much of an improvement I'm actually getting.

 

With that said, I thought some others might be interested in these numbers, since this is probably one of the only head-to-head comparisons between these two cards available at the moment. Just note that, between the factory OC and my own tweaks, the numbers for the 2060 Super are inflated by 5-10% compared to reference.

 

I also tried an overclock on the RX 9060 XT. The numbers for that are +10% power limit, +185MHz frequency offset, and 2900MHz on the memory. That overclock was stable for these tests, but it crashed later while I was putting the results together on the desktop, so I had to pull it back a bit - I didn't want to re-run everything, though. In the end, the OC isn't amazing, and it probably isn't worth the potential for instability. I'll try for an undervolt in the coming weeks - although even that is dubious, as the card runs very cool.

 

The specific models are the Gigabyte RTX 2060 Super Windforce OC 8GB and the Sapphire Pulse RX 9060 XT 16GB. The CPU in the system is a Ryzen 9 5900X with a per-core Curve Optimizer undervolt. The memory is 48GB across 4 sticks, overclocked to DDR4-3333 with primary timings of 16-18-18-38. The 2060S was limited to PCIe Gen 3 with no Resizable BAR - a 20 series limitation. The 9060 XT was running in PCIe Gen 4 mode with SAM enabled.

 

[Chart: BenchmarkResults.png]

 

The weakest results here are in Abzu. This could be a CPU or RAM limitation - GPU utilization was only in the 80-90% range during the test. It could also be that this is a weak game for RDNA 4 - or perhaps a strong one for Turing. It is also the only game without RT, which could imply that, relative to Turing, RDNA 4's RT performance has improved much more than its shader performance. Abzu benchmarking is not common, so I wasn't able to find anyone else who benchmarked the game with the 9060 XT. If you're a random person on the Internet 10 years from now looking for this data - you're welcome.

 

The best results are in Metro Exodus and The Riftbreaker, both of which had RT enabled. The Riftbreaker is unsurprising in some ways, as it's an AMD-sponsored game with limited RT options, but Metro Exodus was a showcase game for the 20 series, so I was surprised to see the 9060 XT completely decimate the 2060S here.

 

On the whole, I'm honestly not thrilled with these results. I was hoping for better. But some of this is almost certainly due to my CPU being a limiting factor: Metro Exodus showed GPU utilization below 100% the entire time - sometimes as low as 80% during the early part of the cutscene that I didn't benchmark, when the card was pushing over 150fps.

 

Benchmark details for those interested:

The numbers here are a 3-run average.

 

The benchmarks for all of these were taken with Afterburner/RTSS - I did not use the numbers from the built-in benchmark for The Riftbreaker.

 

The Abzu benchmark was swimming across the room where the orcas spawn, past the pillar of coral in the center, to the reefs on the far side.

 

The Control benchmark was in Central Executive - I fast traveled there, started recording, then turned around and went up the stairs, around the upper level, and looped around, returning to the control point.

 

The HL2 RTX benchmark was in Nova Prospekt - I just loaded the level, went around the corner, through the fence, through the fog, and stopped at the debris, then turned around and went back to the starting point.

 

The Metro Exodus benchmark was the opening of the level Moscow, including some of the opening cutscene, but not all of it. I started recording when Artyom wipes his mask, then I played through the beginning of the level, turning on the flashlight when prompted, and burning the cobwebs when prompted. I stopped at the second set of cobwebs in the train car.

 

The Riftbreaker benchmark was simply the GPU benchmark.

 

The main potential outlier here was a bad run for the RTX 2060 Super in The Riftbreaker, where the 1% low dipped all the way to 1.6fps in the final result. This was likely due to the card running out of VRAM - it happened on the third and final run. The RX 9060 XT was reporting nearly 13GB of VRAM allocated during its runs.

 

I also attached the spreadsheet itself - the raw numbers are held inside =AVERAGE() functions.

 

BenchmarkResultsSansCrysis.ods

https://linustechtips.com/topic/1614956-rtx-2060-super-vs-rx-9060-xt-benchmarks/

God what a disappointing result imo. 
3 generations to only go up 50%. 

For future spreadsheets, I do recommend using a geometric mean for the averages rather than an arithmetic mean. Your 1% low average is being pulled up really high by Riftbreaker; with a geometric mean, the 1% low average is 40.6%.
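
For illustration, here's a minimal Python sketch of the difference - the uplift ratios below are invented for the example, not taken from the spreadsheet in this thread:

```python
# Hypothetical per-game uplift ratios (new card fps / old card fps).
# The last entry plays the role of an outlier, like the Riftbreaker 1% low.
import math

uplifts = [1.45, 1.50, 1.40, 1.55, 3.10]

arith = sum(uplifts) / len(uplifts)
geo = math.prod(uplifts) ** (1 / len(uplifts))

print(f"arithmetic mean: +{(arith - 1) * 100:.1f}%")  # dragged up by the outlier
print(f"geometric mean:  +{(geo - 1) * 100:.1f}%")    # far less sensitive to it
```

In a spreadsheet, the same change is just swapping =AVERAGE() for =GEOMEAN(), which both LibreOffice Calc and Excel support.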


8 minutes ago, Tetras said:

TPU's GPU database has the card as 55% faster, so coming out at 46% sounds about right. The improvement in 60 series cards has been pretty weak overall since the 2060 launched, despite in theory being a lot newer.

Considering that their numbers are going to be at reference clocks, that makes sense. The OC on my 2060 Super was decent - I regularly got about 8% better results in benchmarks vs. the factory OC it came with, which was itself a few percent over stock. If we take that 8% away from the 2060 Super results, it works out to a 59.3% win for the 9060 XT - even closer to TPU's figures.

13 minutes ago, starsmine said:

God what a disappointing result imo. 
3 generations to only go up 50%. 

When you take inflation into account, the RTX 2060 Super would cost about $500 in today's money (it launched at $399 in 2019), so the 9060 XT comes in a fair bit cheaper - it's sort of a tier-below product in that respect. Historically, that actually makes sense: the GTX 1050 Ti was only 43% faster than the GTX 660 according to TPU's numbers. So this arguably gives more weight to the theory that Nvidia and AMD are actually selling tier-below cards.

13 minutes ago, starsmine said:

For future spreadsheets, I do recommend using a geometric mean for the averages rather than an arithmetic mean. Your 1% low average is being pulled up really high by Riftbreaker; with a geometric mean, the 1% low average is 40.6%.

That's fair. I know that reviewers typically use a geomean for that reason. While I feel it was a valid run, the outlier data from The Riftbreaker is weighing in too heavily.


54 minutes ago, starsmine said:

3 generations to only go up 50%

It depends on the game. Factor in the DLSS, and it's a fair budget update. And how is a graph more representative of actual in-game performance gains than numbers? 
[Graph: Untitled.png]

 

Also, for the past two generations, xx60 series cards were dogshit. The situation is a bit brighter when you step up a tier.
[Graph: Untitled2.png]

"The GB8/12 Liberation Front"

There is approximately a 99% chance I edited my post

Refresh before you reply

 

Link to post
Share on other sites

8 minutes ago, Timme said:

It depends on the game. Factor in the DLSS, and it's a fair budget update. And how is a graph more representative of actual in-game performance gains than numbers? 
[Graph: Untitled.png]

 

Also, for the past two generations, xx60 series cards were dogshit. The situation is a bit brighter when you step up a tier.
[Graph: Untitled2.png]

DLSS should be factored OUT, not in. I don't know what you are talking about with a graph.

Also what kind of cards are you using here?

Also, what kind of website are you using here that uses percentages so wrong? -119% makes zero sense; that is negative frames per second.
Stepping up a tier doesn't make sense as a concept either - or rather, making a statement like that doesn't make sense in terms of engineering.

I also want to point out that a lot of the numbers on those charts are very wrong.


Just now, starsmine said:

DLSS should be factored OUT, not in.

Ermm, what? It's free performance with minimal fidelity loss. This is a godsend for low-tier cards.
The 5060 Ti is the equivalent of the 9060 XT, no? 
And for the upper tier, I used upper-tier cards from the same generations. Not choosing AMD is a personal commitment, but the point stays the same: 50% is the very minimum of performance gains when 1080p, the intended use case, is considered. It is far from terrible. And where does this idea of exponential performance gains come from? From the times when dies shrank by 10+ nanometers with each generation?

"The GB8/12 Liberation Front"

There is approximately a 99% chance I edited my post

Refresh before you reply

 

Link to post
Share on other sites

14 hours ago, Timme said:

Ermm, what? It's free performance with minimal fidelity loss. This is a godsend for low-tier cards.

DLSS doesn't really change anything in this comparison, though, since the 2060 Super supports DLSS, including the most recent transformer model. So it's a wash vs. the 9060 XT's FSR4.


16 hours ago, YoungBlade said:

After 5 years, I've finally upgraded my graphics card. I decided to run some benchmarks to see how the two cards compare. [...]

Quite a bit faster! I'm surprised the overclock didn't do more, though, and I'm also surprised the 9060 XT didn't win by more in Abzu.


17 minutes ago, KidKid said:

Quite a bit faster! I'm surprised the overclock didn't do more, though, and I'm also surprised the 9060 XT didn't win by more in Abzu.

Abzu was likely being held back by the rest of the system. I saw GPU utilization below 90% there, which usually indicates a CPU or RAM bottleneck.

 

I picked that game because I like to return to it and use the Meditate feature to appreciate the atmosphere and art style. It is likely not a good title for GPU benchmarks.

 

If the Abzu results are removed, the average win percentage increases to 52.79% - much more in line with what would be expected from this kind of testing.

 

If Abzu is excluded while using a geomean instead, the figure goes to 50.41% - a big increase from the 41.54% the geomean gives with that result still included.
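
If anyone wants to replicate that exclusion math, here's a rough Python sketch - the per-game ratios are placeholders standing in for the real numbers in the attached .ods, not my actual results:

```python
# Placeholder per-game uplift ratios (9060 XT fps / 2060 Super fps).
# These are NOT the measured results - see the attached spreadsheet for those.
import math

def geomean_uplift(ratios):
    """Geometric mean of per-game ratios, expressed as a percent uplift."""
    g = math.prod(ratios) ** (1 / len(ratios))
    return (g - 1) * 100

results = {
    "Abzu": 1.15,             # the weak, partly CPU-bound result
    "Control": 1.50,
    "HL2 RTX": 1.55,
    "Metro Exodus": 1.60,
    "The Riftbreaker": 1.60,
}

print(f"geomean, all games:    +{geomean_uplift(list(results.values())):.2f}%")

sans_abzu = [r for game, r in results.items() if game != "Abzu"]
print(f"geomean, without Abzu: +{geomean_uplift(sans_abzu):.2f}%")
```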


2 hours ago, Ha-Satan said:

DLSS doesn't really change anything in this comparison, though, since the 2060 Super supports DLSS, including the most recent transformer model. So it's a wash vs. the 9060 XT's FSR4.

The 2060 "can" do the transformer model - not that you should use it.
It doesn't have enough tensor cores for it to be viable. You would get better performance running at native.
But yes, both GPUs in his comparison can run DLSS 3 and 4.


3 minutes ago, starsmine said:

The 2060 "can" do the transformer model - not that you should use it.
It doesn't have enough tensor cores for it to be viable. You would get better performance running at native.

Eh? I was not aware the transformer model had more performance overhead than the CNN one.


6 minutes ago, starsmine said:

The 2060 "can" do the transformer model - not that you should use it.
It doesn't have enough tensor cores for it to be viable. You would get better performance running at native.
But yes, both GPUs in his comparison can run DLSS 3 and 4.

Maybe the regular 2060 has problems, but I was able to use the transformer model with my 2060 Super by swapping out the preset in the Nvidia App, and it worked just fine. It ran slower than the old model, but it still improved performance compared to native while looking great - and I could run it at lower resolutions for better results.

 

As for being able to run DLSS... No, the RX 9060 XT cannot run DLSS.

2 minutes ago, Ha-Satan said:

Eh? I was not aware the transformer model had more performance overhead than the CNN one.

It does - and the overhead is typically larger for 20 series cards - but it still works and, IMO, is usually still worth using: you can run a game at DLSS Balanced and get similar results to DLSS Quality with the CNN model, which means you still get a slight net performance gain.

