AMD RX 6800XT AIB synthetic benchmarks: Pure rasterized faster than RTX 3080, RT on par with RTX 2080 Ti

FaxedForward

3DMark Fire Strike, Time Spy, and Port Royal benchmarks are now appearing for AMD Radeon RX 6800XT Navi 21 AIB models. The 6800XT is the only AIB model at this time; the higher-end 6900 or XTX (final name unknown), which is exclusive to AMD, has not yet been thoroughly benchmarked.

 

Quotes

Quote

Igor’sLAB

Igor Wallossek shared data from two different sources (a percentage comparison and a score comparison). He claims the data was generated from benchmarks run by board partners that offer both AMD and NVIDIA graphics cards (so probably ASUS, MSI, or Gigabyte).

Igor Wallossek:

If you compare the percentage differences, the picture in Ultra-HD is very different: while the Fire Strike Extreme benchmark is supposed to put the Radeon RX 6800XT over 18 percent ahead, in Time Spy Extreme it is only a good 3 percent. If one then adds the ray-tracing workload of Port Royal, the GeForce RTX 3080 FE is proudly 22 percent ahead of the new Radeon. Here, the WQHD resolution of the default settings has been manually set to Ultra-HD, probably for the sake of uniformity. The older GeForce RTX 2080 Ti doesn't even do too badly in the RT benchmark, but loses some ground in the other benchmarks.
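For orientation, the percentage leads Igor quotes are simple relative deltas between raw 3DMark graphics scores. A minimal sketch of that arithmetic (the scores below are made-up placeholders, not the leaked figures):

```python
def percent_lead(score_a: float, score_b: float) -> float:
    """Return how far score_a is ahead of score_b, in percent."""
    return (score_a / score_b - 1.0) * 100.0

# Hypothetical graphics scores, purely for illustration
rx_6800xt_firestrike = 11800
rtx_3080_firestrike = 10000

lead = percent_lead(rx_6800xt_firestrike, rtx_3080_firestrike)
print(f"RX 6800XT leads by {lead:.1f}%")  # RX 6800XT leads by 18.0%
```

A score comparison and a percentage comparison carry the same information; the percentage view just normalizes away the very different score scales of Fire Strike, Time Spy, and Port Royal.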

[Images: Radeon RX 6800XT 3DMark result charts via Igor'sLAB]

Wccftech

It appears that Wccftech received very similar results to Igor's, also divided into two groups: percent difference and score difference:

[Image: AMD Radeon RX 6800 XT 3DMark benchmark chart via Wccftech]

CapFrameX:

The developer of the benchmarking utility posted initial performance figures for Big Navi without confirming which SKU it is. The SKU would allegedly offer 8% better performance than the RTX 3080 in Fire Strike Ultra, a DirectX 11 based 4K benchmark.

@KittyYYuko

A person whom we remember by a different nickname (Kitty Corgi) has provided many leaks that have since been confirmed. Yuko provided results in four different benchmarks of a Navi 21 XT sample, which allegedly features 80 Compute Units. However, based on what we know, the 80 CU version is exclusive to the XTX variant.

Kitty’s results have been put together by @harukaze5719:

[Image: AMD Radeon RX 6800XT scores compiled by @harukaze5719]

 

 

If these numbers are accurate, it seems like AMD will have a truly competitive offering for the first time in many years. The inferior RT performance is not terribly surprising given AMD's new support of the feature, but that is likely not a deal breaker for many PC enthusiasts. Of course, the major question is price. If AMD asks 3080 money for the 6800XT, the reception may not be great. However, if it's $100-200 less, AMD may have a winner on its hands.

 

Sources

https://videocardz.com/newz/amd-radeon-rx-6800xt-alleged-3dmark-scores-hit-the-web

Current build: AMD Ryzen 7 5800X, ASUS PRIME X570-Pro, EVGA RTX 3080 XC3 Ultra, G.Skill 2x16GB 3600C16 DDR4, Samsung 980 Pro 1TB, Sabrent Rocket 1TB, Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


If that is true, I believe it doesn't have to be a dollar cheaper: 10% more base performance for 20% less raytracing performance is a trade I'd happily make. It just sounds too good to be true.

 

Yes, I know those benchmarks can be misleading and it could be 1:1 in games and 30% worse in raytracing, in which case AMD should have a cheaper offering, but I just don't think raytracing is that important yet. DLSS is a bigger thing, if people can be persuaded to compare stock AMD vs NVIDIA on DLSS.


Now they just need something like DLSS 2.0, otherwise NVIDIA will again perform better due to the tensor cores doing their magic. Of course, only in future games with a proper DLSS implementation ;) 

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


4 minutes ago, Loote said:

DLSS is a bigger thing if people can get persuaded to compare stock AMD vs Nvidia on DLSS.

I think it's perfectly fair to compare NVIDIA cards with DLSS enabled to AMD cards. It's literally an ingame-option that enables more performance without visual downsides, so everyone running NVIDIA will basically use it whenever possible. AMD needs to offer something similar imo.


2 minutes ago, Stahlmann said:

I think it's perfectly fair to compare NVIDIA cards with DLSS enabled to AMD cards. It's literally an ingame-option that enables more performance without visual downsides, so everyone running NVIDIA will basically use it whenever possible. AMD needs to offer something similar imo.

I personally disagree. DLSS' biggest limitation is that it's not universally supported and Papa Nvidia has to bless your game with AI training to enable it. It's not like an API where the developers can just choose to incorporate it at will. There are less than 20 games that currently support it and only about 25 more with future support announced.

 

I think it would be fair if it was universally available, but Nvidia gets to cherry pick the titles it supports with AI training. So to compare only DLSS-enabled titles and say the 3080 is the best card because it's faster in ~50 out of thousands of PC titles is kind of silly. Pure raster vs pure raster is an objective standard.

 

This is the same thing we saw with S3 and the MeTaL API 20 years ago: it blew the doors off 3dfx/ATi/Nvidia in the small number of titles where it was supported, but was objectively inferior elsewhere. Times have changed in a lot of ways, but back then consistent performance with no tricks ultimately won out.


5 minutes ago, Stahlmann said:

I think it's perfectly fair to compare NVIDIA cards with DLSS enabled to AMD cards.

You do, some don't, that's where persuasion comes in.
 

Quote

It's literally an ingame-option that enables more performance without visual downsides, so everyone running NVIDIA will basically use it whenever possible.

That is not true. It is very close, but not exactly the same; how much weight one puts on that difference is entirely up to whoever's considering buying a card. The list of games is very short though, and one thing they can do to counter AMD is make that list longer.
I agree that at resolutions like 4K it's much harder to notice DLSS, and that's also where the additional frames are needed, so they're good on that.


12 minutes ago, FaxedForward said:

I personally disagree. DLSS' biggest limitation is that it's not universally supported and Papa Nvidia has to bless your game with AI training to enable it. It's not like an API where the developers can just choose to incorporate it at will. There are less than 20 games that currently support it and only about 25 more with future support announced.

I still think it's fair to include the DLSS-enabled performance in games that support it because it's a no-brainer setting to enable. What would you do if you had the choice to run a game at 60 or 90 fps with everything else staying the same?

12 minutes ago, FaxedForward said:

I think it would be fair if it was universally available, but Nvidia gets to cherry pick the titles it supports with AI training. So to compare only DLSS-enabled titles and say the 3080 is the best card because it's faster in ~50 out of thousands of PC titles is kind of silly. Pure raster vs pure raster is an objective standard.

I'm talking about game-for-game benchmarks. For example, when I shop for a GPU I look at the performance in the games I want to play. Also, it doesn't depend on per-game training anymore; it's a universal algorithm now. But there is the point that the devs will have to work with NVIDIA to get it implemented.

11 minutes ago, Loote said:

That is not true, it is very close, but not exactly the same, how much weight one puts to that difference is entirely up to whoever's considering buying a card. The list of games is very short though and one thing they can do to counter AMD is making that list longer.
I agree that with resolutions like 4k it's much harder to notice DLSS, and that's also where the additional frames are needed, so they're good on that.

Hardware Unboxed revisited the games that currently support DLSS, and the conclusion was basically that it looks about as good as native in the "performance" modes, and sometimes even slightly sharper than native in the "quality" modes, while still unlocking more performance.

 

Think what you will about DLSS, but it will be a huge argument in future titles imo. AAA titles especially will likely include DLSS in future releases. And what end user really cares whether the performance they see is raw core performance or AI-upscaling performance if the result is just better performance either way?


26 minutes ago, Stahlmann said:

I think it's perfectly fair to compare NVIDIA cards with DLSS enabled to AMD cards. It's literally an ingame-option that enables more performance without visual downsides, so everyone running NVIDIA will basically use it whenever possible. AMD needs to offer something similar imo.

No. Until DLSS works in ALL games it's not fair to compare. For per-game comparisons, sure, but you can't then say GeForce is generally faster by X percent based on DLSS scores.


1 minute ago, RejZoR said:

No. Until DLSS works in ALL games it's not fair to compare. With per game comparison, sure, but you can't then generally say GeForce is generally faster by X percent based on DLSS scores.

7 minutes ago, Stahlmann said:

I'm talking about game for game benchmarks. For example when i shop a GPU i look at the performance for the games i want to play.

That's exactly what I mean. I think it is important for benchmarks to at least include the DLSS-enabled numbers in addition to the raw performance moving forward.


56 minutes ago, Stahlmann said:

Now they just need something like DLSS 2.0, otherwise NVIDIA will again be performing better due to the tensor cores doing their magic. Of couse only in future games with proper DLSS implementation ;) 

Also, DLSS isn't perfect. When you're standing still, sure, it looks good, but when you start moving there are artifacts. Also, not everyone has a 4K monitor, nor is it worth it for most people to get 4K.

https://www.techspot.com/article/1113-4k-monitor-see-difference/

I only like 27-inch monitors, because anything bigger and I'd have to move my head to see the corners of the screen, and I wouldn't see a difference between 1440p and 4K on a 27-inch screen.


1 minute ago, spartaman64 said:

also DLSS isnt perfect. when you are standing still sure it looks good but when you start moving theres artifacts. also not everyone has 4k monitors nor is it worth it for most people to get 4k 

Afaik there is no artifacting problem with moving content. Also, DLSS 2.0 works at 1440p and 1080p as well; only DLSS 1.0 was restricted to 4K.


12 minutes ago, Stahlmann said:

That's exaclty what i mean. I think it is important for benchmarks to at least include the DLSS enabled numbers additionally to the raw performance moving forward.

"Additionally" is the key word, and with it I agree wholeheartedly. IMO DLSS is single-digit-percent worse visually and definitely a thing I'd want to have.


RTX, even on a 3080/3090, is usually too big a hit if it were just by itself. What makes it usable is DLSS. 2.0 made a huge difference, but 3.0, if it works even the same, is a game changer if you're able to enable it in any game supporting TAA.

 

I'm curious, though, how it will look on AMD with maybe an 80% render scale, RIS enabled, and ray tracing...
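As a rough sketch of what an 80% render scale means in raw pixel terms (the resolution here is just the usual 4K target, not anything AMD-specific):

```python
def scaled_resolution(width: int, height: int, scale: float) -> tuple:
    """Internal render resolution at a given per-axis render scale."""
    return round(width * scale), round(height * scale)

w, h = scaled_resolution(3840, 2160, 0.80)
print(w, h)  # 3072 1728
print(f"{(w * h) / (3840 * 2160):.0%} of the native pixel count")  # 64%
```

Since the scale applies per axis, an 80% render scale cuts the shaded pixel count to 0.8 squared, i.e. 64% of native, which is where most of the performance headroom would come from.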

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


It's going to be a difficult decision for some.

 

If you want pure rasterization performance but also intend on going 4K, then you have to decide whether you expect to want DLSS for said 4K gaming, which will outperform even the (expected) superior AMD rasterization at 4K, but likely won't be usable in many games.

 

That said, price also comes into play.

AMD HAS to stay the cheaper option, otherwise they won't win this generation.

They can't expect people who have been running Nvidia cards for 10+ years to just jump ship for a few % performance when they have an unfortunate reputation for poor drivers (be it true or not).

 

The full-fat 80 CU Navi 21 XTX, known as the 6900XT, has to be in the $700 area; if they try to market it at $1000+, you can forget it. There is no way 8 CUs and a tiny clock bump make a $400+ price difference over the 6800XT (72 CU) card a worthwhile choice.

 

The reasonable thing to do is to undercut the RTX 3080 with the 6800XT, providing slightly better or comparable raster performance for lower cost.

Then put the 6900XT slightly above the 3080. It's not what people will like, having a top-end AMD card costing more than the top "gaming" GPU from Nvidia, but it's better than AMD trying to price the 6900XT to undercut the 3090, which it won't beat and most certainly won't be worth the price.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


I can't run Time Spy Extreme, but I did a very quick Fire Strike Ultra test with my 95% power target and got within 8 points of the 3080 FE score above.


I know Igor's Lab generally has pretty accurate leaks, but these are so good that I'm going to make a point of being skeptical about them and wait for reviews.

We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:

Spoiler

Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: EVGA GTX 1080 Ti SC2 Hybrid

Attachcorethingy: Gigabyte H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone RL06 "Lookalike"

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Razer Viper Mini + Huion H430P drawing Tablet

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: LG 27UK850-W 4K HDR

 


Ultimately, even if the RTX 3000 series is better in every way, its absolute unavailability makes anything AMD releases a better card, assuming AMD can churn out enough cards to satisfy demand. Depending on how things unfold at the end of October 2020 with AMD's launch, I might be jumping ship and going back to Radeon. I realized DLSS is not of much use for me with a 1080p monitor, as it renders games at something like 500p internally, which allegedly shows the lack of quality more than at 4K, where the internal rendering is around 1080p. Ray tracing is nice, but it takes a similar hit on both, give or take.
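For reference, the "500p or something" impression lines up with the commonly reported DLSS 2.0 per-axis scale factors; a quick sketch (the factors are the widely cited approximate values, not an official NVIDIA table):

```python
# Widely reported DLSS 2.0 per-axis scale factors (approximate)
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.580,
    "Performance": 0.500,
    "Ultra Performance": 0.333,
}

def internal_res(width, height, mode):
    """Internal render resolution DLSS upscales from, for a given output."""
    s = DLSS_MODES[mode]
    return round(width * s), round(height * s)

# At a 1080p output, Performance mode renders internally at only 540p
print(internal_res(1920, 1080, "Performance"))  # (960, 540)
# At a 4K output, the same mode still renders a full 1080p image internally
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
```

Which is roughly the point above: at a 1080p output, Performance mode only has a 540p internal image to upscale from, while a 4K output still gets a full 1080p internal image, so the quality loss is far less visible at 4K.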


I'm still happy I went with the 3080 FE. I sort of doubt that the 6800 XT will be able to beat the 3080, and I do feel like raytracing is the future! I hope pure rasterized it's on par with the 3080 though; that way both Nvidia and AMD continue to push further.

 

Asobo may choose to implement DXR in FS2020 once it switches to DX12, and if they implement DLSS it will be the kind of option you have to turn on, considering how low frame rates in flight sims are to begin with.

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


1 minute ago, Hymenopus_Coronatus said:

I'm still happy I went with the 3080 FE. I sort of doubt that the 6800 XT will be able to beat the 3080, and I do feel like raytracing is the future!

Yeah I'd honestly also be happy with a 3080 purchase if I had made one. I'm not the target for such a high-end graphics card though. Happy with my 1080 Ti I bought 2 months ago.


1 minute ago, Energycore said:

Yeah I'd honestly also be happy with a 3080 purchase if I had made one

Yeah I mean the difference will be small regardless, and I do trust Nvidia more with drivers at the moment (I'd like to see AMD go through a whole generation with nothing but minor issues that get fixed in weeks instead of months)

 

Overall, seems like a great time to be building PCs!


9 minutes ago, Hymenopus_Coronatus said:

I'm still happy I went with the 3080 FE. I sort of doubt that the 6800 XT will be able to beat the 3080, and I do feel like raytracing is the future!

 

Asobo may choose to implement DXR in FS2020 once it switches to DX12, and if they implement DLSS it will be the kind of option you have to turn on considering how low frame rates in flight sim are to begin with

Me too. I would have considered AMD if it was out at the same time, but there's extra waiting time, especially considering they are not releasing AIB models at launch. I sold when I could still get a good price for the 2080, so the upgrade ended up costing something like $270 total. Even with a 2-year upgrade cycle that is about $11/mo.

 

I do feel it's better next time around to not even look at the numbers until a few months after release. Constantly searching nowinstock and dropping by MicroCenter was stressful.


1 minute ago, Hymenopus_Coronatus said:

I do trust Nvidia more with drivers at the moment (I'd like to see AMD go through a whole generation with nothing but minor issues that get fixed in weeks instead of months)

Honestly though, driver issues are going to happen with every card at launch (look at the 3080 issues).

 

The only way to avoid them is to buy the card a couple months after launch, though I do agree that those issues better be solved in weeks, not 6 months like -certain- architectures (looking at you Radeon VII)


2 minutes ago, Energycore said:

though I do agree that those issues better be solved in weeks, not 6 months like -certain- architectures (looking at you Radeon VII)

Exactly.

 

The 3080 issues were more or less resolved in two weeks or so by a driver patch.


Also be aware that the RX 5000 series had more issues because RDNA was a dramatic shift away from the old architecture. Now that RDNA is in its second generation, I feel like it'll be much more mature, and the AMD team knows better what works and what doesn't.
