RX 6800 Ray Tracing Performance Leaked

Random_Person1234
Solved by Random_Person1234

UPDATE: Frank Azor (Chief Gaming Architect at AMD) promised that more details would be released about Big Navi ray tracing and super resolution before the launch of the cards.

https://videocardz.com/newz/amd-frank-azor-promises-more-details-on-radeon-rx-6000-ray-tracing-and-super-sampling-tech-before-launch

https://twitter.com/AzorFrank/status/1321808969283870721 

 

4 hours ago, Ash_Kechummm said:

RIS is a very "dumb" algorithm (as in, it's just a sharpening filter instead of an AI trained on a supercomputer like Nvidia), but as you said

 

If the end result looks the same to human eyes, the one with the least effort wins, whether dumb or not. The only problem is that since Nvidia's approach involves AI, a lot of people who can't actually see a difference with relatively modest upscaling (1440p to 4K, say, rather than 720p to 4K) DO report one, due to the placebo effect ("the work done by an AI must be better, right?").

All in all, I like both approaches, with Radeon's approach being miraculously simple compared to Nvidia's OP solution, both being able to achieve modest upscaling with minimal difference at best (in both image quality and performance). 

EDIT: also, is it just me or does Wendell look like Gabe Newell?

You're praising "AI" everything too much. Also, AMD uses CAS (Contrast Adaptive Sharpening), and while it's "dumb", it creates a super pleasant sharpening effect without any artifacting. It does that so well it basically negates FXAA blurring and still leaves a sharper image in the end. I've been using CAS through ReShade in basically all games and I can't imagine playing another without it.
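For anyone wondering what a "dumb" sharpening filter actually does under the hood, here is a minimal unsharp-mask sketch in Python/NumPy. This is purely illustrative, not AMD's actual CAS shader: the idea is just "subtract a blurred copy, add the difference back".

```python
import numpy as np

def unsharp_mask(img, amount=0.5):
    """Plain sharpening: add back the difference between the image and a
    blurred copy. No AI, no per-game training, just a fixed filter.
    Illustrative sketch only, not AMD's CAS implementation."""
    # 3x3 box blur via edge padding and neighbour averaging
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(
        padded[1 + dy : padded.shape[0] - 1 + dy,
               1 + dx : padded.shape[1] - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    # Boost the high-frequency detail, clamp back into [0, 1]
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)
```

A flat region passes through unchanged (the blur of a constant is the same constant), while edges get their contrast boosted, which is exactly the "negates FXAA blurring" effect described above.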

 

In fact, I can't imagine playing a game without ReShade anymore. SMAA, FXAA, CAS, Tonemapping/FakeHDR and Ambient Light. Playing Rise of the Tomb Raider and switching these off mid-game makes it feel like the game lacks effects that should be there.

 

What's funny is that they can do all this "AI" wizardry, yet they still can't figure out near-free supersampling somehow. On a 1080p monitor, I don't care about all this stuff as it doesn't benefit me all that much. But edge smoothing that costs nothing and is better than SMAA/TAA would be greatly appreciated. Instead, using 4K DSR or 4x SSAA bogs down my GTX 1080 Ti to levels that are unplayable for me. 60fps just doesn't work for me; it has to be above 100fps...


On 10/31/2020 at 2:26 PM, BlackManINC said:

I don't understand what makes 4K so appealing to people.

Personally, even though I game at 3440x1440p, I look at the 4k numbers because I have a secondary display as well, which adds up to about the same number of pixels as a single 4k display.


2 hours ago, Gimli said:

Personally, even though I game at 3440x1440p, I look at the 4k numbers because I have a secondary display as well, which adds up to about the same number of pixels as a single 4k display.

That doesn't really add up to 4K though, since you're not using your secondary monitor for games along with your primary; overall you're closer to 1440p than you are to 4K.

8086k Winner BABY!!

 

Main rig

CPU: R7 5800x3d (-25 all core CO 102 bclk)

Board: Gigabyte B550 AD UC

Cooler: Corsair H150i AIO

Ram: 32gb HP V10 RGB 3200 C14 (3733 C14) tuned subs

GPU: EVGA XC3 RTX 3080 (+120 core +950 mem 90% PL)

Case: Thermaltake H570 TG Snow Edition

PSU: Fractal ION Plus 760w Platinum  

SSD: 1tb Teamgroup MP34 | 2tb Mushkin Pilot-E

Monitors: 32" Samsung Odyssey G7 (1440p 240hz), Some FHD Acer 24" VA

 

GFs System

CPU: E5 1660v3 (4.3ghz 1.2v)

Mobo: Gigabyte x99 UD3P

Cooler: Corsair H100i AIO

Ram: 32gb Crucial Ballistix 3600 C16 (3000 C14)

GPU: EVGA RTX 2060 Super 

Case: Phanteks P400A Mesh

PSU: Seasonic Focus Plus Gold 650w

SSD: Kingston NV1 2tb

Monitors: 27" Viotek GFT27DB (1440p 144hz), Some 24" BENQ 1080p IPS

 

 

 


4K, or 1080p with 4x SSAA, is basically the same thing. And basically equally unplayable. I can run Rise of the Tomb Raider at 4K smoothly on a GTX 1080 Ti; it's not stuttering or pausing or anything. But the way everything moves is just so lazy and weird that there's no way I can play like that. Probably partly because I'm forced to use plain V-Sync, since this is the only game that still tears like mad with Fast V-Sync for some dumb reason. For me 60fps is unplayable, as I'm used to 144fps... Until 4K becomes a standard the way 1080p is these days and can run at high framerates, I'm not going to bother with it.


37 minutes ago, TheDankKoosh said:

That doesn't really add up to 4K though, since you're not using your secondary monitor for games along with your primary; overall you're closer to 1440p than you are to 4K.

The same graphics card still has to pump out all those pixels. My experience is that my FPS numbers are typically closer to 4K than they are to 1440p.

 

3440x1440 is larger than regular 1440p too. It's about 35% more pixels, plus the secondary monitor.
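The arithmetic behind the "ultrawide plus a secondary is about a 4K's worth of pixels" argument checks out, assuming the secondary is a regular 1440p panel (its actual resolution isn't stated in the thread):

```python
# Pixel counts behind the "ultrawide + secondary ~ 4K" argument.
UW  = 3440 * 1440   # 4,953,600 pixels - ultrawide primary
QHD = 2560 * 1440   # 3,686,400 pixels - regular 1440p (assumed secondary)
UHD = 3840 * 2160   # 8,294,400 pixels - 4K

print(UW / QHD)          # ~1.34 -> roughly 35% more pixels than regular 1440p
print((UW + QHD) / UHD)  # ~1.04 -> combined, slightly MORE than one 4K display
```

So the 35% figure is accurate, and with a 1440p secondary the combined pixel count actually slightly exceeds a single 4K display.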


14 minutes ago, RejZoR said:

Staring at grapes? I don't get it...

 

6 minutes ago, leadeater said:

Neither, 😕 🤷‍♂️

Really? I thought it was famous. It's a depiction of the fox from the fable "The Fox and the Grapes". Maybe the picture wasn't that obvious.

 

 

Quote

Driven by hunger, a fox tried to reach some grapes hanging high on the vine but was unable to, although he leaped with all his strength. As he went away, the fox remarked 'Oh, you aren't even ripe yet! I don't need any sour grapes.'

 

Basically, people tend to dismiss and downplay things they themselves do not have access to in order to feel better about themselves. 


1 hour ago, RejZoR said:

4K, or 1080p with 4x SSAA, is basically the same thing. And basically equally unplayable. I can run Rise of the Tomb Raider at 4K smoothly on a GTX 1080 Ti; it's not stuttering or pausing or anything. But the way everything moves is just so lazy and weird that there's no way I can play like that. Probably partly because I'm forced to use plain V-Sync, since this is the only game that still tears like mad with Fast V-Sync for some dumb reason. For me 60fps is unplayable, as I'm used to 144fps... Until 4K becomes a standard the way 1080p is these days and can run at high framerates, I'm not going to bother with it.

FPS-wise it's the same. But 1080p with 4x SSAA (or 200% render scale) is always more desirable, because the game's HUD doesn't get resized. Even in games that have HUD scaling, that doesn't mean the crosshair is protected from shrinking (when using a 3840x2160 monitor, or DSR).
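The "FPS-wise it's the same" claim is exact, not approximate: 4x SSAA at 1080p shades precisely as many samples as native 4K, which is why the framerate cost is comparable while the HUD still renders at 1920x1080.

```python
# 1080p with 4x SSAA shades exactly as many samples as native 4K,
# which is why the framerate cost is comparable - but the HUD is
# still drawn at 1920x1080, so nothing shrinks.
ssaa_samples = 1920 * 1080 * 4   # 4 shaded samples per output pixel
native_4k    = 3840 * 2160
assert ssaa_samples == native_4k  # both are 8,294,400
```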


49 minutes ago, LAwLz said:

Really? I thought it was famous. It's a depiction of the fox from the fable "The Fox and the Grapes". Maybe the picture wasn't that obvious.

I got it; it's a proverb in Italy too. Literally, "when the fox can't reach the grapes, he says they're unripe."

 

(I'm in desperate need of stock lol)

MOTHERBOARD: ASRock H97 Pro4 CPU: Intel Core i5-4460 @3.30 Ghz Intel Xeon E3-1271v3 @4.00 Ghz RAM: 32Gb (4x8Gb) Kingstone HyperX Fury DDR3@1600 Mhz (9-9-9-27)

GPU: MSI 390 8Gb Gaming Edition PSU: XFX TS 650w Bronze Enermax Revolution D.F. 650w 80+ Gold MOUSE: Logitech G502 Proteus Spectrum KEYBOARD: Monokey Standard Suave Blue

STORAGE: SSD Samsung EVO 850 250Gb // HDD WD Green 1Tb // HDD WD Blue 4Tb // HDD WD Blue 160Gb CASE: Fractal Design Define R5 Windowed OS: Windows 11 Pro x64 Bit

MONITORS: Samsung CFG7 C24FG7xFQ @144hz // Samsung SyncMaster TA350 LT23A350 @60hz Samsung Odyssey G7 COOLER: Noctua NH-D15

 


WCCFTech has a loooooong history of bullshit "leaks", especially that one particular guy who authored this article. I wouldn't get my hopes up about those numbers, or anything from that site.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


1 hour ago, LAwLz said:

Really? I thought it was famous. It's a depiction of the fox from the fable "The Fox and the Grapes". Maybe the picture wasn't that obvious.

Basically, people tend to dismiss and downplay things they themselves do not have access to in order to feel better about themselves.

I have a 3080, and I don't care too much about RTX, and I wouldn't put too much priority on DLSS until I see more games support it. If, let's say, 60% of future games do, then I'd take it more seriously. But at this point it could just be PhysX again: at launch, Nvidia was saying eventually all games would come with PhysX, and now look at how many games use it.


14 minutes ago, spartaman64 said:

was saying eventually all games will come with physx and now look at how many games use it

physx-tinct

✨FNIGE✨


On 10/31/2020 at 1:56 PM, Parideboy said:

And most importantly, why RT numbers weren't shown during the presentation.

In any case, we couldn't expect anything better than this.

Likely due to how irrelevant RT is in the grand scheme of things. Until it becomes a more prominent addition and less of a tech demo, I think it's completely fair to say that people shouldn't be buying these cards for that purpose. Though it's nice to see Nvidia no longer has zero competition in that field. :)

S.K.Y.N.E.T. v4.3

AMD Ryzen 7 5800X3D | 64GB DDR4 3200 | 12GB RX 6700XT | Twin 24" Pixio PX248 Prime 1080p 144Hz Displays | 256GB Sabrent NVMe (OS) | 500GB Samsung 840 Pro #1 | 500GB Samsung 840 Pro #2 | 2TB Samsung 860 Evo | 1TB Western Digital NVMe | 2TB Sabrent NVMe | Intel Wireless-AC 9260


14 minutes ago, Imglidinhere said:

Likely due to how irrelevant RT is in the grand scheme of things.

Agreed, but since they allegedly have the numbers, why not show them?

Or at least acknowledge that the RT performance is within reach of the competition. Just to make a statement, so to speak.

 

The same case could be made for DLSS, but they emphasized how "they're working on something" even though, right now, there's nothing concrete.

 

22 minutes ago, Imglidinhere said:

Until it becomes a more prominent addition and less of a tech demo, I think it's completely fair to say that people shouldn't be buying these cards for that purpose. Though it's nice to see Nvidia no longer has zero competition in that field. :)

Couldn't agree more.



14 minutes ago, Parideboy said:

The same case could be made for DLSS, but they emphasized how "they're working on something" even though, right now, there's nothing concrete.

Probably because AMD have had something for a couple of years now that works just fine without any overhead, and without needing die-space-hogging tensor cores and AI training. If my 2080 had anything like Radeon Image Sharpening, I'd play everything at 1440p and upscale to 4K for the added performance. For the RX 6000 series, AMD users will use Radeon Image Sharpening to get more performance with ray tracing on, without the artifacting that exists with DLSS.
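The recipe described here (render at a lower resolution, upscale, then sharpen to recover detail) can be sketched end to end in a few lines. This is a toy nearest-neighbour upscale plus an unsharp mask in Python/NumPy, purely illustrative; it is not AMD's RIS shader or any driver's actual scaling path:

```python
import numpy as np

def upscale_and_sharpen(img, scale=2, amount=0.4):
    """Toy version of the 'render low, upscale, sharpen' pipeline:
    nearest-neighbour upscale followed by an unsharp mask.
    Illustrative only - not AMD's RIS/CAS implementation."""
    # Nearest-neighbour upscale: repeat each pixel scale x scale times
    up = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)
    # 3x3 box blur of the upscaled image
    padded = np.pad(up, 1, mode="edge")
    blurred = sum(
        padded[1 + dy : padded.shape[0] - 1 + dy,
               1 + dx : padded.shape[1] - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    # Sharpen to counteract the softness the upscale introduced
    return np.clip(up + amount * (up - blurred), 0.0, 1.0)
```

The appeal is that every step is resolution-agnostic post-processing on the final frame, which is why this kind of approach works in any game without per-title integration.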

 

My biggest gripe with tech forums is that a lot of comments contain misinformed opinions. Even techtubers lack the knowledge you'd think they would have working in this space, which only serves to create an echo chamber of misinformation. I know my own knowledge base is limited, but the number of times I've seen people asking what AMD's answer to DLSS is makes me wonder where people get their information. How about Nvidia Reflex? AMD have had their own Anti-Lag for over a year now, and it works insanely well. DLSS? AMD already have great upscaling that works with every game and requires no training. Fast Sync? Ditto.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


51 minutes ago, Briggsy said:

Probably because AMD have had something for a couple of years now that works just fine without any overhead, and without needing die-space-hogging tensor cores and AI training. If my 2080 had anything like Radeon Image Sharpening, I'd play everything at 1440p and upscale to 4K for the added performance. For the RX 6000 series, AMD users will use Radeon Image Sharpening to get more performance with ray tracing on, without the artifacting that exists with DLSS.

I know; I have an AMD card, and I've used it. It over-sharpens things: anything over 50% makes everything look bad.
There are scenarios where this works great and others where it doesn't (foliage, for instance).
In games like Far Cry 5 it looks awful, for example, while in Battlefield V it trades blows with (beating, imo) DLSS.

 

DLSS has to be implemented on a per-game basis (and each game will have a different implementation of it). On the flip side, DLSS can upscale based on what needs to be upscaled, not touching what is already 'sharp'. That's the AI's job. It's not perfect, of course; sometimes some textures aren't sharpened properly, especially when there's a lot of motion.

 

RIS, on the other hand, works at the driver level, works on way more games, and even grants better performance compared to DLSS. The only problem is that everything gets over-sharpened, without any 'logic'. This leads to elements in the game looking artificially sharpened compared to other textures. Some people don't mind this; others are bothered by it. If AMD gets this sorted out (and hopefully that's what they're referring to with "in the future"), they're golden.
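The "sharpen with some logic" idea being asked for here can be sketched simply: measure local contrast and back off the sharpening where the image is already sharp, instead of applying one uniform kernel everywhere. This Python/NumPy sketch is illustrative only; it is not AMD's CAS math and not DLSS:

```python
import numpy as np

def adaptive_sharpen(img, amount=0.6):
    """Sharpen selectively: attenuate the boost where local contrast is
    already high, so hard edges don't get over-sharpened. Illustrative
    sketch of the general idea, not AMD's or Nvidia's implementation."""
    padded = np.pad(img, 1, mode="edge")
    # Gather each pixel's 3x3 neighbourhood as a (9, H, W) stack
    neigh = np.stack([
        padded[1 + dy : padded.shape[0] - 1 + dy,
               1 + dx : padded.shape[1] - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ])
    blurred = neigh.mean(axis=0)
    contrast = neigh.max(axis=0) - neigh.min(axis=0)  # local dynamic range
    weight = amount * (1.0 - contrast)                # less boost on hard edges
    return np.clip(img + weight * (img - blurred), 0.0, 1.0)
```

A uniform sharpener applies the same `amount` everywhere; here the per-pixel `weight` shrinks on high-contrast regions, which is the kind of selectivity the post is contrasting with RIS's one-size-fits-all pass.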


