RX 6800 Ray Tracing Performance Leaked

Random_Person1234

It's great news that AMD has GPUs that can compete with or beat Nvidia.

 

I still went for a 3070 FE, but if I couldn't have gotten one, I would have waited for the AMD cards and a possible Nvidia refresh.

 

I think AMD will have similar supply issues with their GPUs at launch, just like Nvidia. Look at the 3300X: I haven't seen it in stock since launch.

 

Overall this is a good thing for "us" consumers, as the next couple of years are going to bring some great cards.

 

Oh, and I hope AMD's drivers are better. (I know not everyone has issues, but two of my friends on AMD 5700 and 5700 XT cards have had random issues.)

Folding Stats

 

SYSTEM SPEC

AMD Ryzen 5 5600X | Motherboard Asus Strix B550i | RAM 32gb 3200 Crucial Ballistix | GPU Nvidia RTX 3070 Founder Edition | Cooling Barrow CPU/PUMP Block, EKWB Vector GPU Block, Corsair 280mm Radiator | Case NZXT H1 | Storage Sabrent Rocket 2tb, Samsung SM951 1tb

PSU NZXT S650 SFX Gold | Display Acer Predator XB271HU | Keyboard Corsair K70 Lux | Mouse Corsair M65 Pro  

Sound Logitech Z560 THX | Operating System Windows 10 Pro


Ray tracing is that thing I'll never enable, because I'd rather have more FPS than nicer shadows. It's also why I stick to 1080p gaming, and I'll be getting a 6800 XT if the reviews are good.

A beastly GPU (hopefully) + a resolution that's still more than good enough + RT off => silent gaming + a high FPS count.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms to some/<>


1 hour ago, RejZoR said:

That's not necessarily true. Just because a compute unit has a multipurpose role doesn't mean one function or the other has to suffer because it serves both. To me, the RA unit acts as a supplemental thing rather than one or the other. Meaning if a game doesn't request ray tracing, the RA units will simply sit there and do nothing (unless they can somehow be utilized to accelerate screen-space reflections and ambient occlusion too). When ray tracing is involved, they will compute the necessary work on top of whatever the rasterization units underneath them have to do. Assuming they're that dedicated, and not a feature that has to share compute resources with the rasterization CUs.

My comment was based on the Xbox Series X and AMD ray tracing patent slides. I don't know if it's different compared to the RX 6000 series, but most probably not.

These Ray Accelerators sit inside the RDNA2 texture mapping units (TMUs), so they share the same data path/bandwidth with texture ops. Peak performance depends heavily on available bandwidth; if there isn't enough bandwidth to reach peak performance because it gets choked by texture operations, you'll get lower/inconsistent RT performance.

 

I guess that's why Infinity Cache takes up a large part of the RDNA2 die: to prevent/alleviate this bandwidth issue.

At least that's what I understood from reading these slides.

Spoiler: [attached slides: Xbox Series X / AMD ray tracing patent diagrams]
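To put rough numbers on the bandwidth worry, here's a quick back-of-the-envelope sketch in Python. All the figures (rays per frame, node fetches per ray, cache hit rate) are illustrative assumptions, not AMD's numbers:

```python
# Back-of-the-envelope: how much bandwidth BVH traversal might eat.
# Every figure here is an illustrative assumption, not an AMD spec.

rays_per_frame = 1.0e6          # ~1 ray per pixel at roughly 1080p
fps = 60
node_fetches_per_ray = 24       # BVH depth x a few boxes per level (assumed)
bytes_per_node = 64             # one cache line per BVH node (assumed)

rt_traffic = rays_per_frame * fps * node_fetches_per_ray * bytes_per_node
print(f"RT BVH traffic: {rt_traffic / 1e9:.1f} GB/s")   # ~92 GB/s

gddr6_bandwidth = 512e9         # 256-bit GDDR6 @ 16 Gbps, in bytes/s
print(f"Share of a 512 GB/s bus: {rt_traffic / gddr6_bandwidth:.0%}")  # ~18%

# With a large on-die cache, only misses go out to DRAM, so the texture
# ops and the RAs fight over far less external bandwidth:
hit_rate = 0.9                  # assumed Infinity-Cache-style hit rate
dram_traffic = rt_traffic * (1 - hit_rate)
print(f"DRAM traffic at {hit_rate:.0%} hit rate: {dram_traffic / 1e9:.1f} GB/s")
```

Even with made-up numbers, the shape of the argument holds: a decent chunk of the bus goes to BVH fetches unless a big cache absorbs most of them.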

 

 

1 hour ago, RejZoR said:

Looking at the Ampere GPU diagram, they use a similar approach, but they have larger RT core clusters on each shader module, as opposed to an RA on every shader unit. Clearly both have quite similar performance; we'll see which approach benefits under certain conditions and which is flawed under others when the cards get into reviewers' hands.

They're not inside the texture units like RDNA2's. 🤔

| Intel i7-3770 @ 4.2GHz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800MHz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
| Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


1 hour ago, xAcid9 said:

They're not inside the texture units like RDNA2's. 🤔

I'm not sure how different they actually are; the current AMD diagrams aren't that clear. Both have their ray tracing units in the SM/CU respectively, and we really only know the details of Nvidia's RT cores currently.

 

 

As to your other point:

Quote

The RT Cores in Turing can process all the BVH traversal and ray-triangle intersection testing, saving the SM from spending the thousands of instruction slots per ray, which could be an enormous amount of instructions for an entire scene. The RT Core includes two specialized units. The first unit does bounding box tests, and the second unit does ray-triangle intersection tests. The SM only has to launch a ray probe, and the RT Core does the BVH traversal and ray-triangle tests, and returns a hit or no hit to the SM. The SM is largely freed up to do other graphics or compute work. See Figure 18 for an illustration of Turing ray tracing with RT Cores.

To be honest, it doesn't sound that different from what AMD is doing. As long as the bolded part is true for both, the same limitations should apply to both: both have return data paths, and those require bandwidth. I just don't actually know how much (based on the data type I assume it's not very much, but the data itself might be small like I'm thinking while there's just so much of it per second that it totals more than I'm thinking?? 🤷‍♂️)
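For intuition, here's a toy Python sketch of the two tests the whitepaper describes: an AABB "slab" test for BVH nodes and a Möller–Trumbore ray-triangle test. It's a software stand-in for what the fixed-function units do per ray, under my own simplified assumptions, not how either vendor actually implements them:

```python
# Toy sketch of the two tests an RT core runs in fixed function:
# an AABB "slab" test for BVH nodes and a ray-triangle intersection.
# Pure-Python stand-in; real hardware does this per ray, massively parallel.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def hit_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray enter the bounding box?"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def hit_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore: returns distance t on a hit, else None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det
    return t if t > eps else None

# "Launch a ray probe": the hardware walks the BVH, testing boxes first,
# then triangles in the leaves, and returns hit/no-hit to the SM.
origin, direction = (0.0, 0.0, -5.0), (0.0, 0.0, 1.0)
inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
if hit_aabb(origin, inv_dir, (-1, -1, -1), (1, 1, 1)):
    t = hit_triangle(origin, direction, (-1, -1, 0), (1, -1, 0), (0, 1, 0))
    print("hit at t =", t)          # hit at t = 5.0
```

The traversal loop itself (which node to visit next, the stack of pending nodes) is the part that returns data to the shader, and that return path is where the bandwidth question above comes in.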


We'll see in testing, but it looks fine. Still, RT is really in its infancy. The base game needs to look good first and foremost; some games have RT and still don't look that good, and RT by itself doesn't make them look better, per se, compared to some awesome-looking non-RT games. But yeah, in time more games will use it, with more RT features and heavier use. Eventually all games will.

Also, HDR needs to evolve faster on PC, monitors themselves especially, but definitely having both of these as standard in new games will be amazing.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


7 minutes ago, Doobeedoo said:

We'll see in testing, but it looks fine. Still, RT is really in its infancy. The base game needs to look good first and foremost; some games have RT and still don't look that good, and RT by itself doesn't make them look better, per se, compared to some awesome-looking non-RT games. But yeah, in time more games will use it, with more RT features and heavier use. Eventually all games will.

Also, HDR needs to evolve faster on PC, monitors themselves especially, but definitely having both of these as standard in new games will be amazing.

The same could be said for pixel shaders when they were first introduced with the GeForce 3. They were basically only used for water initially, but these days pixel shading is an integral part of anything rendered on screen. Ray tracing, given that it's already in usable form now, will slowly become an integral part of every game over the coming years. It'll come to a point where every game is ray traced, just like all games have been pixel shaded for years.


22 hours ago, Random_Person1234 said:

The AMD equivalent of DLSS is not out yet; according to rumors, it's being worked on. So they were using native res.

It's not a rumor; AMD said during the reveal event that they're working on it.

CPU - Ryzen 7 3700X | RAM - 64 GB DDR4 3200MHz | GPU - Nvidia GTX 1660 ti | MOBO -  MSI B550 Gaming Plus


21 hours ago, Caroline said:

Oh no. Has AMD entered the memetracing competition too? Welp, bye-bye to low prices I guess: $400 extra on a card just because it comes with that "feature", just like Nvidia.

 

Bait post, but I'll respond for the sake of education.

 

AMD is providing a card with 2x the performance of an RX 5700 while only raising power consumption by a small margin, all on the same node. Despite this, they are charging much less than double the price of an RX 5700, at $579. Not to mention that the 6800 XT provides an even better deal, with the higher-tier card claimed to perform about 5-10% better than a 3080 at $50 less. Ray tracing was obviously going to be a staple feature on these newer cards, but AMD is not tacking exorbitant prices onto them; in fact, the 6800 XT is priced the same as the 980 Ti, which was regarded as one of the best price/performance top-end cards of recent times (only superseded by the 1080 Ti).
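To put numbers on that, here's a quick sanity check. Prices are launch MSRPs; the 2x performance figure is the claim above (marketing-slide territory, not independent benchmarks):

```python
# Quick perf-per-dollar sanity check of the argument above.
# Prices are launch MSRPs; the 2x performance ratio is the thread's
# claim, not a measured result.

rx5700 = {"price": 349, "perf": 1.00}   # baseline
rx6800 = {"price": 579, "perf": 2.00}   # claimed 2x the RX 5700

print(f"price ratio: {rx6800['price'] / rx5700['price']:.2f}x")  # 1.66x
print(f"perf ratio:  {rx6800['perf'] / rx5700['perf']:.2f}x")    # 2.00x

for name, c in (("RX 5700", rx5700), ("RX 6800", rx6800)):
    print(f"{name}: {1000 * c['perf'] / c['price']:.2f} perf units per $1000")
# RX 5700: 2.87, RX 6800: 3.45 -> more performance per dollar, as argued
```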

8086k Winner BABY!!

 

Main rig

CPU: R7 5800x3d (-25 all core CO 102 bclk)

Board: Gigabyte B550 AD UC

Cooler: Corsair H150i AIO

Ram: 32gb HP V10 RGB 3200 C14 (3733 C14) tuned subs

GPU: EVGA XC3 RTX 3080 (+120 core +950 mem 90% PL)

Case: Thermaltake H570 TG Snow Edition

PSU: Fractal ION Plus 760w Platinum  

SSD: 1tb Teamgroup MP34  2tb Mushkin Pilot-E

Monitors: 32" Samsung Odyssey G7 (1440p 240hz), Some FHD Acer 24" VA

 

GFs System

CPU: E5 1660v3 (4.3ghz 1.2v)

Mobo: Gigabyte x99 UD3P

Cooler: Corsair H100i AIO

Ram: 32gb Crucial Ballistix 3600 C16 (3000 C14)

GPU: EVGA RTX 2060 Super 

Case: Phanteks P400A Mesh

PSU: Seasonic Focus Plus Gold 650w

SSD: Kingston NV1 2tb

Monitors: 27" Viotek GFT27DB (1440p 144hz), Some 24" BENQ 1080p IPS

 

 

 


18 minutes ago, TheDankKoosh said:

Bait post, but I'll respond for the sake of education.

 

AMD is providing a card with 2x the performance of an RX 5700 while only raising power consumption by a small margin, all on the same node. Despite this, they are charging much less than double the price of an RX 5700, at $579. Not to mention that the 6800 XT provides an even better deal, with the higher-tier card claimed to perform about 5-10% better than a 3080 at $50 less. Ray tracing was obviously going to be a staple feature on these newer cards, but AMD is not tacking exorbitant prices onto them; in fact, the 6800 XT is priced the same as the 980 Ti, which was regarded as one of the best price/performance top-end cards of recent times (only superseded by the 1080 Ti).

? My memory says the 5700 was $299. Might be a currency thing, though.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


8 minutes ago, Bombastinator said:

? My memory says the 5700 was $299. Might be a currency thing, though.

The 5700 was $350, but it could typically be found for a bit less before the 5600 XT came out. I'm pretty much expecting the 6800 to be in a similar boat, as I think it's too close to the price of the 6800 XT to be worth considering.

8086k Winner BABY!!


3 hours ago, RejZoR said:

The same could be said for pixel shaders when they were first introduced with the GeForce 3. They were basically only used for water initially, but these days pixel shading is an integral part of anything rendered on screen. Ray tracing, given that it's already in usable form now, will slowly become an integral part of every game over the coming years. It'll come to a point where every game is ray traced, just like all games have been pixel shaded for years.

The wild bit of real-time ray tracing, for me, is when that actually happens. People talk about narrow hallways and elevators being used to hide loading times, but the same sort of thing happens with game set design, in a way. The systems used to render 3D models are still, to one degree or another, chiaroscuro-based (in which I include cangiante, sfumato, and unione). They're designed to be seen from one direction, and thought has to be put into how they're set up. Ray tracing ignores chiaroscuro. Games will be able to do literally different things.

Before chiaroscuro there was cavalier perspective, which to a modern eye looks ridiculously primitive. I don't think the shift will be that big, but I think it could be big. Chiaroscuro is not the only perspective system, and it must be kept in mind that it is merely a system. Chiaroscuro attempts to create the illusion of a 3D space behind the picture plane. There is also iconographic perspective (used in Russian Orthodox icons), which attempts to create a 3D illusion in front of the picture plane. To someone whose brain is schooled in chiaroscuro, a Russian icon painting looks strange and bloated. It took me years to be able to see in iconographic perspective, but when I did, I saw that Russian iconography was every bit as realistic as Angelico or Caravaggio. It has to be looked at in a totally different way, though.

I think all-ray-tracing-all-the-time has the capacity to fundamentally change video games. It could be so much bigger than "oh look! The reflection in the hubcap moves".

Edited by Bombastinator

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Based on everything in this thread, there seems to be ZERO reason to wait for RTX 30s to be in stock unless you need NVENC.

If you're just playing games and the AMD cards are in stock, you might as well get one of those. Raster performance is likely going to be on par with the RTX 30s, and RT is surprisingly within spitting distance when not using DLSS.

I'm really curious how AMD's Super Resolution looks in comparison to DLSS.

IF THIS IS LEGIT... I can't wait to get my hands on a 5000-series Ryzen chip and combine it with a 6800 XT. I'll use my current system to upgrade the i7 3770 I have at work.

Work Rigs - 2015 15" MBP | 2019 15" MBP | 2021 16" M1 Max MBP | Lenovo ThinkPad T490 |

 

AMD Ryzen 9 5900X  |  MSI B550 Gaming Plus  |  64GB G.SKILL 3200 CL16 4x8GB |  AMD Reference RX 6800  |  WD Black SN750 1TB NVMe  |  Corsair RM750  |  Corsair H115i RGB Pro XT  |  Corsair 4000D  |  Dell S2721DGF  |
 

Fun Rig - AMD Ryzen 5 5600X  |  MSI B550 Tomahawk  |  32GB G.SKILL 3600 CL16 4x8GB |  AMD Reference 6800XT  | Creative Sound Blaster Z  |  WD Black SN850 500GB NVMe  |  WD Black SN750 2TB NVMe  |  WD Blue 1TB SATA SSD  |  Corsair RM850x  |  Corsair H100i RGB Pro XT  |  Corsair 4000D  |  LG 27GP850  |


On 10/31/2020 at 11:16 PM, Random_Person1234 said:

I'm still not sure if this justifies the $80 more the 6800 costs compared to the 3070, especially considering the 3070 has DLSS.

But the 6800 has more VRAM, though... even if it just breaks even on performance, I can justify the extra $80 for 8GB more of VRAM.


UPDATE: Frank Azor (Chief Gaming Architect at AMD) promised that more details would be released about Big Navi ray tracing and super resolution before the launch of the cards.

https://videocardz.com/newz/amd-frank-azor-promises-more-details-on-radeon-rx-6000-ray-tracing-and-super-sampling-tech-before-launch

https://twitter.com/AzorFrank/status/1321808969283870721 

 

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


On 10/31/2020 at 1:46 PM, Random_Person1234 said:

I'm still not sure if this justifies the $80 more the 6800 costs compared to the 3070, especially considering the 3070 has DLSS.

 

The problem I see with DLSS right now is that we don't know whether DLSS is going to be a bait and switch or not.

 

Most common scaling algorithms (e.g. bicubic, bilinear) can be thought of as dumb algorithms, figuratively speaking. DLSS in its current form has to be trained through machine learning, using high-resolution reference images supplied to Nvidia by the game developer. It's theoretically possible that at some point an AI can be trained on enough reference images to upscale all games without per-game training and provide better fidelity than dumb upscaling algorithms, but I would argue it will never be as effective as a smart upscaler trained for a specific game the way Nvidia is doing it now, and there are not that many games with DLSS to begin with. I would also argue that the compute power required for a general-purpose AI upscaler might be better spent by removing the tensor cores and adding more float/integer units instead.
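For reference, this is roughly all a "dumb" scaler does: a minimal bilinear upscale in plain Python, with no training and no knowledge of the game. Every output pixel is just a weighted average of the four nearest source pixels:

```python
# Minimal bilinear upscale: the kind of "dumb" (content-agnostic)
# algorithm DLSS is being compared against. No training, no game
# knowledge -- every output pixel is a weighted average of 4 neighbours.

def bilinear_upscale(src, new_w, new_h):
    """src: 2D list of grayscale values. Returns a new_h x new_w image."""
    h, w = len(src), len(src[0])
    out = [[0.0] * new_w for _ in range(new_h)]
    for y in range(new_h):
        for x in range(new_w):
            # Map the destination pixel back into source coordinates.
            sx = x * (w - 1) / (new_w - 1) if new_w > 1 else 0.0
            sy = y * (h - 1) / (new_h - 1) if new_h > 1 else 0.0
            x0, y0 = int(sx), int(sy)
            x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
            fx, fy = sx - x0, sy - y0
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# 2x2 gradient -> 4x4: the new values are pure interpolation, which is
# why fine detail can't be "invented" the way a trained upscaler tries to.
for row in bilinear_upscale([[0, 90], [90, 180]], 4, 4):
    print([round(v) for v in row])
```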

 

I'm not saying DLSS is snake oil; I'm merely saying that unless Nvidia trains the upscaler for a specific game, it's never going to be that much better than dumb upscaling, and even if it is, the horsepower required and the extra die space used make it a wash. I hope to be proven wrong and that Nvidia will eventually release a ubiquitous form of DLSS that is better than a dumb algorithm, without sacrificing die space on the GPU.

 

For now, the extra $80 gets you 16GB of VRAM vs 8GB, and better rasterization performance. Nvidia does manage VRAM usage better than AMD in its drivers, but that only gets you so far.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 



It's time for black screens and other driver-breaking bugs 🤭

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

28 minutes ago, Briggsy said:

The problem I see with DLSS right now is that we don't know whether DLSS is going to be a bait and switch or not.

 

Most common scaling algorithms (e.g. bicubic, bilinear) can be thought of as dumb algorithms, figuratively speaking. DLSS in its current form has to be trained through machine learning, using high-resolution reference images supplied to Nvidia by the game developer. It's theoretically possible that at some point an AI can be trained on enough reference images to upscale all games without per-game training and provide better fidelity than dumb upscaling algorithms, but I would argue it will never be as effective as a smart upscaler trained for a specific game the way Nvidia is doing it now, and there are not that many games with DLSS to begin with. I would also argue that the compute power required for a general-purpose AI upscaler might be better spent by removing the tensor cores and adding more float/integer units instead.

 

I'm not saying DLSS is snake oil; I'm merely saying that unless Nvidia trains the upscaler for a specific game, it's never going to be that much better than dumb upscaling, and even if it is, the horsepower required and the extra die space used make it a wash. I hope to be proven wrong and that Nvidia will eventually release a ubiquitous form of DLSS that is better than a dumb algorithm, without sacrificing die space on the GPU.

 

For now, the extra $80 gets you 16GB of VRAM vs 8GB, and better rasterization performance. Nvidia does manage VRAM usage better than AMD in its drivers, but that only gets you so far.

Yeah, I'm cautious about treating any new feature as a selling point for Nvidia. Remember when they said every game would use PhysX in the future? How many new games have PhysX today? And what about HairWorks?


On 10/31/2020 at 5:45 PM, Random_Person1234 said:

AMD does have "ray tracing accelerators" built into their RX 6000 GPUs. I think there's supposed to be one per CU.

I was referring to their software implementation. It's just generic DX12 ray tracing (DXR), instead of unique programming that requires their specific hardware. Hopefully there will be optimizations for AMD cards, but it won't depend on unique AMD hardware to run.


On 10/31/2020 at 5:29 PM, RejZoR said:

Technically, so is "RTX". It's just NVIDIA-branded crap slammed on top of what's essentially DXR. And since it's "RTX", they can make it NVIDIA-specific. Meaning if they want to be real assholes, they can prevent it from running on Radeon cards as part of the GameWorks proprietary nonsense used by game devs.

I have a feeling there will be more RT support and less RTX support now. Programmers don't like closed features, plus it's easier to port from consoles.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


2 hours ago, ewitte said:

Programmers don't like closed features, plus it's easier to port from consoles.

Unless someone hands them a big sack of cash to make it happen 💰

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


16 hours ago, gabrielcarvfer said:

Are you sure about that? It sounds pretty reasonable, but after seeing a ton of really impressive results I kind of expect it to work more generally. I'm not saying it will work for every single game, but it could work for classes of similar games that share a ton of visual features (e.g. COD/BF, EA Star Wars games, Minecraft and spin-offs, Dota/LoL/HotS/Warcraft 3, etc.).

For moderate upscaling it's totally doable; see the video below, starting at around the 15:20 mark.

 

This is a video from a couple of years ago where Wendell and crew talk about Radeon Image Sharpening, which basically allows you to game at 1440p and upscale to 4K with almost no discernible difference. IIRC it's the same tech the game consoles have been using for years. Right now Nvidia has nothing like this that works across all games, but I assume a more ubiquitous version of DLSS would do something similar to what AMD has. It's not going to do what the true deep-learning DLSS can do, with its 240p-upscaled-to-1080p kind of sorcery, but 1440p to 4K with minimal differences is pretty good considering it's slightly more than a 2x upscale in pixel count. If that's all DLSS ends up being for most games, I'd be fine with it.
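The general recipe is upscale, then sharpen. The sketch below uses a plain unsharp mask as a stand-in for the sharpening step; AMD's actual CAS algorithm behind RIS is adaptive and more careful than this, so treat it as the idea, not the implementation:

```python
# Upscale-then-sharpen, the general idea behind console-style scaling
# and (roughly) RIS. A plain unsharp mask stands in for the sharpener --
# AMD's real CAS is contrast-adaptive and more sophisticated.

def sharpen(img, amount=0.5):
    """Unsharp mask: out = img + amount * (img - blur(img))."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # 3x3 box blur with edge clamping.
            total, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    total += img[yy][xx]
                    n += 1
            blur = total / n
            out[y][x] = min(max(img[y][x] + amount * (img[y][x] - blur), 0), 255)
    return out

# A soft edge (the kind bilinear upscaling produces) regains contrast:
soft_edge = [[50, 50, 100, 150, 150]] * 3
print([round(v) for v in sharpen(soft_edge)[1]])
# [42, 42, 100, 158, 150] -- the under/overshoot makes the edge look crisper
```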

 

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


7 hours ago, Briggsy said:

...Radeon Image Sharpening...

 

RIS is a very "dumb" algorithm (as in, it's just a sharpening filter, not an AI trained on a supercomputer like Nvidia's), but as you said:

 

7 hours ago, Briggsy said:

...almost no discernible difference.

If the end result looks the same to human eyes, the one that takes the least effort wins, dumb or not. The only problem is that since Nvidia's approach involves AI, a lot of people who can't actually see a difference at relatively modest upscaling factors (like 1440p to 4K, not 720p to 4K) DO report one, thanks to the placebo effect ("work done by an AI must be better, right?").

All in all, I like both approaches, with Radeon's being miraculously simple compared to Nvidia's OP solution, and both able to achieve modest upscaling with minimal difference (in both image quality and performance).

EDIT: Also, is it just me or does Wendell look like Gabe Newell?

Edited by Ash_Kechummm
