Do AMD graphics look better?

2 minutes ago, i_build_nanosuits said:

Splitting hairs here...both cards running the same game will look exactly the same...

Not really, because they have different feature sets.

 

But I agree trying to find "differences" in youtube videos with heavy compression artefacts is... futile. 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


3 minutes ago, i_build_nanosuits said:

Splitting hairs here...both cards running the same game will look exactly the same...

 

35 minutes ago, papajo said:

 

 

9f87f639a156a596eea9f784fd98cc2d.png

 

(NVIDIA left <> AMD right)

 

 

Obviously it's better if you go full screen, max out the resolution, and watch the benchmark videos directly, but I screenshotted them just to make a point.

I think most people posting didn't even bother checking out the videos... take a look first and then start being skeptical :P

 

 

 

8 minutes ago, Mark Kaine said:

That is somehow my point, though: it could be they used different settings... These videos are heavily edited too; you can't say "but we saw the settings".

They are benchmark videos, dude; for them to be worth two cents they have to run on the same system... To be honest, I'm not a follower of any of the channels I linked, but they have traffic... it's not like it's some 12-year-old's first attempt at comparing graphics cards with a friend using different PCs or whatnot...

And I don't see any heavy editing... they are just squeezed together, each using 50% of the screen...

 

 


3 minutes ago, papajo said:

They are benchmark videos dude in order for them to be worth 2 cents they have to run on the same system

But not at the same settings... And you can't even prove the same-system thing; there's no reason they couldn't fake that to show whatever they want to "show". Even the overlays don't really prove anything; no one knows what settings they used...

 

And that offer for the bridge is still up! 

 

 

6 minutes ago, papajo said:

And I dont see any heavy edit

Hint: most stuff on YouTube is fake, always has been (the UFO sightings around the year 2000 were great, lol; people believed those too!)

 

 

Also, I said you can't compare this stuff in heavily compressed videos; *plus*, the AMD doesn't even look better, they look 99% the same. 🤷🏼


5 minutes ago, Mark Kaine said:

But not at the same settings..

My dear friend, this is not some tinfoil propaganda I'm trying to push on you; those are channels that upload benchmark videos regularly...

 

System metrics are displayed with MSI Afterburner, and settings are shown before the start of each game (or, in some videos, mentioned in the description). This is how these channels work; you obviously didn't even check the videos, otherwise you would have noticed that they show the settings, etc.

 

So that raises the question: why are you preemptively looking for reasons for something not to be true when you didn't even bother to check with your own eyes whether it is? :P


5 minutes ago, papajo said:

System metrics are displayed with MSI Afterburner, and settings are shown before the start of each game (or, in some videos, mentioned in the description).

And that cannot be faked (easily)?! 

 

5 minutes ago, papajo said:

you obviously didnt even check the videos 

 

32 minutes ago, Mark Kaine said:

Additionally, I watched a bit (as far as my tolerance for terrible animations allowed) and I didn't really see a difference; it goes back and forth as to which is better, which is probably due to the horrible YouTube compression.

Obviously. 


1 minute ago, Mark Kaine said:

And that cannot be faked (easily)?! 

WHY? Why would all these channels conspire to fake the settings? The only thing that would accomplish is shrinking their viewer base, because they couldn't be trusted...

 

On top of that, their results are in line with what we know about these cards' performance. Go check any benchmark site you trust and you will see that the average FPS is about the same as what those channels show. And they don't favor AMD; the NVIDIA cards clearly get better FPS in the videos I linked...

 

 

 


I did notice the small difference in the first video, particularly in the strap, where the dark brown meets with the white.

Again, I guess it depends on how / what they are using to capture the footage.

Interesting to point that out nonetheless.

 

5700XT_vs_RTX3080.png

Intel Z390 Rig ( *NEW* Primary )

Intel X99 Rig (Officially Decommissioned, Dead CPU returned to Intel)

  • i7-8086K @ 5.1 GHz
  • Gigabyte Z390 Aorus Master
  • Sapphire NITRO+ RX 6800 XT S.E + EKwb Quantum Vector Full Cover Waterblock
  • 32GB G.Skill TridentZ DDR4-3000 CL14 @ DDR-3400 custom CL15 timings
  • SanDisk 480 GB SSD + 1TB Samsung 860 EVO +  500GB Samsung 980 + 1TB WD SN750
  • EVGA SuperNOVA 850W P2 + Red/White CableMod Cables
  • Lian-Li O11 Dynamic EVO XL
  • Ekwb Custom loop + 2x EKwb Quantum Surface P360M Radiators
  • Logitech G502 Proteus Spectrum + Corsair K70 (Red LED, anodized black, Cherry MX Browns)

AMD Ryzen Rig

  • AMD R7-5800X
  • Gigabyte B550 Aorus Pro AC
  • 32GB (16GB X 2) Crucial Ballistix RGB DDR4-3600
  • Gigabyte Vision RTX 3060 Ti OC
  • EKwb D-RGB 360mm AIO
  • Intel 660p NVMe 1TB + Crucial MX500 1TB + WD Black 1TB HDD
  • EVGA P2 850W + White CableMod cables
  • Lian-Li LanCool II Mesh - White

Intel Z97 Rig (Decommissioned)

  • Intel i5-4690K 4.8 GHz
  • ASUS ROG Maximus VII Hero Z97
  • Sapphire Vapor-X HD 7950 EVGA GTX 1070 SC Black Edition ACX 3.0
  • 20 GB (8GB X 2 + 4GB X 1) Corsair Vengeance DDR3 1600 MHz
  • Corsair A50 air cooler  NZXT X61
  • Crucial MX500 1TB SSD + SanDisk Ultra II 240GB SSD + WD Caviar Black 1TB HDD + Kingston V300 120GB SSD [non-gimped version]
  • Antec New TruePower 550W EVGA G2 650W + White CableMod cables
  • Cooler Master HAF 912 White NZXT S340 Elite w/ white LED strips

AMD 990FX Rig (Decommissioned)

  • FX-8350 @ 4.8 / 4.9 GHz (given up on the 5.0 / 5.1 GHz attempt)
  • ASUS ROG Crosshair V Formula 990FX
  • 12 GB (4 GB X 3) G.Skill RipJawsX DDR3 @ 1866 MHz
  • Sapphire Vapor-X HD 7970 + Sapphire Dual-X HD 7970 in Crossfire  Sapphire NITRO R9-Fury in Crossfire *NONE*
  • Thermaltake Frio w/ Cooler Master JetFlo's in push-pull
  • Samsung 850 EVO 500GB SSD + Kingston V300 120GB SSD + WD Caviar Black 1TB HDD
  • Corsair TX850 (ver.1)
  • Cooler Master HAF 932

 

<> Electrical Engineer , B.Eng <>

<> Electronics & Computer Engineering Technologist (Diploma + Advanced Diploma) <>

<> Electronics Engineering Technician for the Canadian Department of National Defence <>


18 minutes ago, -rascal- said:

I did notice the small difference in the first video, particularly in the strap, where the dark brown meets with the white.

Again, I guess it depends on how / what they are using to capture the footage.

Interesting to point that out nonetheless.

 

5700XT_vs_RTX3080.png

 

IMG_20201031_012456.jpg.d96e32f2eb7b99768e5fa1b5b180e6db.jpg


1 hour ago, -rascal- said:

I did notice the small difference in the first video, particularly in the strap, where the dark brown meets with the white.

Again, I guess it depends on how / what they are using to capture the footage.

Interesting to point that out nonetheless.

 

5700XT_vs_RTX3080.png

Also look at the belt (or whatever that is near the bottom of the square you made): the scratches and stuff, or the hair "threads". The NVIDIA one is blurry... maybe that's why the 3000 series is so much faster than the 2000 :P


3 hours ago, papajo said:

.

My 4870 back in the day definitely had better out-of-the-box colors than my GTX 580. I still remember when I fired up Titan Quest; when I switched over to the GTX 580 the colors looked like complete ass. I haven't had an AMD card in a while, so all I can say is: it's possible, and for me at least it's believable. Every time this discussion comes up it's always "AMD colors are better than NVIDIA's", not the other way around 🤔

 

I do see more gradient on the RX 5700 over the 3080 there, but I'd have to see it in person to confirm. Hope this carries over to the 6xxx series.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


16 minutes ago, xg32 said:

I do see more gradient on the rx5700 over the 3080 there, but i have to see it in person to confirm,

I'd have to see a Linus video with two identical good-quality, chroma 4:4:4, 4K monitors side by side and identical systems differing only in GPU (AMD vs. NVIDIA, with no driver tweaking on either side, just the latest drivers on both), with people sitting in front of them trying to spot differences :P


8 hours ago, i_build_nanosuits said:

Splitting hairs here...both cards running the same game will look exactly the same...

There's an observable difference shown in this thread, in both the videos and the screenshots.


Looks to me like the AMD card also has better dynamic range, as if it's running in a 10-bit color mode internally and then compressing down to normal color spaces very well.
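The "compressing down" step described there would amount to a requantization from 1024 levels per channel to 256. A minimal Python sketch of that idea (illustrative only, not any driver's actual code):

```python
def to_8bit(v10: int) -> int:
    """Map a 10-bit channel value (0-1023) to 8-bit (0-255) with rounding.
    Four adjacent 10-bit levels collapse into one 8-bit level, which is
    where shadow-detail and banding differences would come from."""
    return (v10 * 255 + 511) // 1023

# Endpoints are preserved; intermediate levels merge roughly 4-to-1.
print(to_8bit(0), to_8bit(512), to_8bit(1023))  # 0 128 255
```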

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


6 hours ago, BTGbullseye said:

Looks to me that the AMD card also has better dynamic range, like it's running in 10-bit color mode internally, then compressing it down to normal color spaces very well.

And you saw that in a YouTube video or screenshots?! 😂

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


4 hours ago, i_build_nanosuits said:

And you saw that in a YouTube video or screenshots?! 😂

Due to the difference in shadow details across every one of the games and videos, yes.


On 10/31/2020 at 9:25 AM, papajo said:

Well, many variables, yes, but all with the same result? I mean, a football match has many variables, but if Team 1 beats Team 2 in all their games, then you can say Team 1 is better.

 

here is another one

 

 

But I agree with you that this is not "proof"; it's a strong indication, though, hence my suggestion that Linus make a video about it :P

 

Watching that video, it looks like the 5700 XT has better color saturation, but the RTX 2080 has better gamma.

It also looks like the 5700 XT is sharper around the edges, giving the illusion of a resolution uplift, while the RTX 2080 has better anti-aliasing, giving more realistic definition.

Overall, though, the dynamic lighting looks more real on the RTX 2080 and more fake on the 5700 XT.

This could be due to calibration, though, and not anything inherently different about the architecture.

It's hard to tell, as some of this could also be a difference in the way they're captured, and the compression artifacting doesn't help (whether from the encoding during capture or from YouTube).

 

Spoiler

Desktop: Ryzen9 5950X | ASUS ROG Crosshair VIII Hero (Wifi) | EVGA RTX 3080Ti FTW3 | 32GB (2x16GB) Corsair Dominator Platinum RGB Pro 3600Mhz | EKWB EK-AIO 360D-RGB | EKWB EK-Vardar RGB Fans | 1TB Samsung 980 Pro, 4TB Samsung 980 Pro | Corsair 5000D Airflow | Corsair HX850 Platinum PSU | Asus ROG 42" OLED PG42UQ + LG 32" 32GK850G Monitor | Roccat Vulcan TKL Pro Keyboard | Logitech G Pro X Superlight  | MicroLab Solo 7C Speakers | Audio-Technica ATH-M50xBT2 LE Headphones | TC-Helicon GoXLR | Audio-Technica AT2035 | LTT Desk Mat | XBOX-X Controller | Windows 11 Pro

 

Spoiler

Server: Fractal Design Define R6 | Ryzen 3950x | ASRock X570 Taichi | EVGA GTX1070 FTW | 64GB (4x16GB) Corsair Vengeance LPX 3000Mhz | Corsair RM850v2 PSU | Fractal S36 Triple AIO + 4 Additional Venturi 120mm Fans | 14 x 20TB Seagate Exos X22 20TB | 500GB Aorus Gen4 NVMe | 2 x 2TB Samsung 970 Evo Plus NVMe | LSI 9211-8i HBA

 


1 hour ago, Jarsky said:

RTX2080 has better gamma

What do you mean, better gamma? Gamma is the relationship between a pixel's encoded value and the luminance it's displayed at, relative to the luminance needed to reproduce the image as you see it with your eyes. Maybe you mean they are brighter due to a smaller dynamic range; either way, it's a matter of correct vs. incorrect, not good vs. bad.

Or to put it another way: you can't see "good" colors or details if the luminance is too high or too low; the fact that you see better colors means the gamma is correct.

 

e.g 

 

Gamma_correction_brabbit.jpg

 

The first kitty has low gamma (too dark), the second kitty has high gamma (too bright), and the third kitty is the middle value (not too bright, not too dark).

I think you can agree that the third kitty has better colors because of that :P
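The kitty comparison is just a power-law curve applied to each pixel. Here's a minimal Python sketch, using the common in-game gamma-slider convention where a higher gamma setting brightens mid-tones (the values are illustrative):

```python
def apply_gamma(value: float, gamma: float) -> float:
    """Map a normalized pixel value (0.0-1.0) through a power-law curve.
    With this convention, gamma < 1 darkens mid-tones ("first kitty"),
    gamma > 1 brightens them ("second kitty"), and gamma = 1 is a no-op."""
    return value ** (1.0 / gamma)

mid_gray = 0.5
print(round(apply_gamma(mid_gray, 0.45), 3))  # 0.214 -> too dark
print(round(apply_gamma(mid_gray, 1.0), 3))   # 0.5   -> unchanged
print(round(apply_gamma(mid_gray, 2.2), 3))   # 0.73  -> too bright
```

Note that blacks (0.0) and whites (1.0) map to themselves under any gamma; only the mid-tones shift, which is why gamma mistakes show up as crushed or washed-out shadows rather than a uniformly brighter image.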

 

 

Here is another video, from yet another channel.

In this part of the video you can see more defined stitches, especially vs. the 2070 Super, and more defined edges near the trigger; that's due to a wider dynamic range (click to enlarge, or watch the video in full screen at high resolution).

 

XxFH7As.png


10 minutes ago, papajo said:

What do you mean better gamma?

 

I mean higher gamma. "Better" in that some things are more visibly obvious, such as what a sign says, because of the illumination.

In some of the scenes the contrast and brightness are very much the same, but some details are lost to shadow on the AMD.


Well, I would argue that in many games high gamma damages the immersion, especially in horror-type games; and gamma is a game setting you can change in the options, so it's not exactly GPU dependent.

 

1 hour ago, Jarsky said:

but some details are lost to shadow on the AMD. 

Can you give an example? Because thus far I have only found loss of detail on the NVIDIA side.


7 minutes ago, papajo said:

Well, I would argue that in many games high gamma damages the immersion, especially in horror-type games; and gamma is a game setting you can change in the options, so it's not exactly GPU dependent.

Can you give an example? Because thus far I have only found loss of detail on the NVIDIA side.

 

Sorry, I'm not going to watch that video through again to find the scenes, but I'm talking about the video I quoted.

I'm not saying high gamma is a good thing; it's often used as a "cheat" for low-light games. I was just replying about what differences I could perceive between the two, and in some scenes some objects were imperceptible on the AMD vs. the NVIDIA.


1 hour ago, Jarsky said:

 

Sorry, I'm not going to watch that video through again to find the scenes, but I'm talking about the video I quoted.

I'm not saying high gamma is a good thing; it's often used as a "cheat" for low-light games. I was just replying about what differences I could perceive between the two, and in some scenes some objects were imperceptible on the AMD vs. the NVIDIA.

Well, couldn't you at least remember in what part of the video (which game) you noticed that?

I already posted screenshots from that particular video, and here is another:

U6BK4mb.png

 

The outpost is clearly fuzzier there, for example, and they're about the same distance from it; if anything, the NVIDIA one is closer (so it's not about blurring far-away objects, at least not from the game engine's perspective).

 

 

BTW, if that turns out to be the case (one of the reasons I would like LTT to make a video about it), this is bigger than just "which card has better graphics quality". It's also about misinformation and unfair competition. Think about it: in this particular scene the 2080 gets 9 more FPS, so it's faster if you just look at the number... but if the AMD GPU had blurred out that part too, maybe the 2080 wouldn't have the same FPS advantage; who knows, maybe it would even have been slower...

 

Other people mentioned that we are talking about minute details, and they are right, but what they don't consider is that minute details can add up to a bigger difference in the grand scheme of things...

 

I mean, if omitting a small thing in each frame saves you, say, 1 ms of frame time, then at 100 FPS (10 ms per frame) you end up around 111 FPS, roughly 11 extra FPS, and you can claim you have the faster product.
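The arithmetic behind that claim, as a quick sketch (the 100 FPS and 1 ms figures are the example numbers above, not measurements):

```python
def fps_after_saving(base_fps: float, saved_ms: float) -> float:
    """FPS after shaving saved_ms off every frame's render time."""
    frame_time_ms = 1000.0 / base_fps          # 100 FPS -> 10 ms per frame
    return 1000.0 / (frame_time_ms - saved_ms)  # 9 ms per frame

print(round(fps_after_saving(100, 1.0), 1))  # 111.1 -> about 11 extra FPS
```

The gain isn't linear: the same 1 ms saved at 30 FPS (33.3 ms per frame) only buys about one extra frame per second, which is why small shortcuts matter most at high framerates.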

 

And that wouldn't surprise me, because NVIDIA has been known to cheat for an advantage by manipulating the graphics load in games; e.g., they buffed up a totally unnoticeable and useless barrier in a game map with heavy tessellation just to make AMD GPUs get lower FPS in that section:

 

https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/


I've recently tested this myself as part of deciding whether to go Ampere or RDNA2 for my next daily gaming driver, so I couldn't care less what someone else thinks vs. my own first-hand experience.

 

First, AMD have Radeon Image Sharpening, which makes what you see on screen look better, like having SweetFX or ReShade with one click; I wouldn't confuse it with NVIDIA's sharpening feature in the control panel. The other thing I noticed right away was the color quality: NVIDIA colors seem washed out in comparison, but you wouldn't notice unless you did a side-by-side.

 

Second, and it's kind of a big one, is how NVIDIA manages memory usage in its drivers. In particular, with ARK in 4K I noticed that my Vega 56 would hit the 8 GB frame-buffer limit all the time and only then swap to system memory (causing brief hitches), while my 2080 seems to dump resources from VRAM pre-emptively once usage passes 7-7.5 GB, but the quality of textures and objects in-game then starts bouncing back and forth between Play-Doh and high quality. The solution in either case was to lower a setting like LODs or texture quality to stay below 8 GB. The gameplay experience is less disruptive on NVIDIA because the swapping to system memory is far more aggressive and pre-emptive, but sometimes one of my tames will turn into an N64 model for a few seconds if I don't make adjustments, which is itself annoying and kills immersion.
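The workaround described there, stepping settings down until the working set fits under the 8 GB buffer, can be sketched like this (all names and per-setting costs are hypothetical, not ARK's real numbers):

```python
VRAM_BUDGET_GB = 8.0        # Vega 56 / RTX 2080 frame-buffer size
LEVELS = ["high", "medium", "low"]

# Illustrative VRAM footprints per setting, in GB (made up for the sketch).
TEXTURE_COST = {"high": 5.5, "medium": 4.0, "low": 2.8}
LOD_COST = {"high": 3.0, "medium": 2.2, "low": 1.5}

def fit_settings(texture: str, lod: str) -> tuple:
    """Drop LOD first, then texture quality, until the estimate fits."""
    ti, li = LEVELS.index(texture), LEVELS.index(lod)
    while TEXTURE_COST[LEVELS[ti]] + LOD_COST[LEVELS[li]] > VRAM_BUDGET_GB:
        if li < len(LEVELS) - 1:
            li += 1            # lower LOD one notch
        elif ti < len(LEVELS) - 1:
            ti += 1            # then lower texture quality
        else:
            break              # already at minimum settings
    return LEVELS[ti], LEVELS[li]

print(fit_settings("high", "high"))  # ('high', 'medium'): 5.5 + 2.2 = 7.7 GB
```

The point of the sketch is just that staying under the hard limit yourself avoids both failure modes described above: the Vega's late swap to system memory and the 2080's pre-emptive texture dumping.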

 

The third thing that stood out to me was the upscaling. To get the same framerate in 4K on the Vega 56 as on my 2080, I had to drop resolution scaling to about 60% on the Vega, whereas I normally play with scaling at about 90% on my 2080. For whatever reason, the Vega 56 image quality with Radeon Sharpening turned on looked better in some cases than the 2080. If I turn resolution scaling down to 60% on the 2080, the aliasing is so bad that objects 20 feet away in-game look like garbage. I suspect the upscaling AMD uses on desktop is what the Xbox One uses to adjust resolution on the fly. NVIDIA's newest upscaling tech, DLSS, is far superior of course, but only if it's been trained. I suspect AMD's upcoming Super Resolution feature will be an improvement on Radeon Sharpening, which is already pretty good. I'm also curious how good NVIDIA's next iteration of DLSS, which is supposed to work without per-game training, will be. My guess is it won't be much different, and only the AI-trained DLSS will be superior.

 

One thing to note is that NVIDIA themselves, back with either Maxwell or Pascal (IIRC), introduced shader optimizations in their drivers. This isn't uncommon; AMD optimizes tessellation, and NVIDIA optimizes anisotropic filtering as well. These optimizations are not supposed to affect image quality. I can confirm that NVIDIA's aggressive memory management does affect image quality, but it prevents frame drops by doing so. AMD going with 16 GB on RDNA2 isn't just a flex on their part, IMO; I think they need it for 4K gaming. It's probably cheaper to slap 16 GB of VRAM on a card than to invest in driver-level memory optimizations.

 

These are all just my own observations and opinions; take them for what they are. I couldn't care less whether anyone believes me; I originally set out to figure this stuff out for myself and to help decide what I'm buying this generation.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


14 minutes ago, Briggsy said:

First, AMD have Radeon Image Sharpening

I personally don't like it, and it's not on by default. Also, I don't see why this would be the case, because the videos are not about AMD picture quality; they're benchmark videos, and Radeon Image Sharpening, even if only by a very small amount, does impact FPS, so why turn it on?

 

14 minutes ago, Briggsy said:

In particular with ARK in 4K, I noticed that my Vega56 would be hitting the 8GB frame buffer limit all the time and swapping to system memory,

Which is a good thing: the more game resources you cache, the less time you need to load them when they have to be rendered. The 2080 would probably do the same, but it cached less than the Vega 56, I guess because it's faster, although I'm not sure the caching system is identical between the two brands. What I am sure of is that the more you cache, the better.

 

14 minutes ago, Briggsy said:

The third thing that stands out to me was the upscaling capabilities. In order to get the same framerate in 4K on the Vega 56 compared to my 2080, I had to drop resolution scaling to about 60% on the Vega

The Vega 56 is a slower GPU than the RTX 2080; I don't know why you think that having to lower your settings or scale down the resolution on it has anything to do with our conversation... if you had an RX 580 you would probably have to scale the resolution down even further, because it's an even slower card than the RTX 2080...

 

 


Dude, you're analyzing two different screenshots; how do you expect them to look exactly identical when the refresh rate is 60 Hz or more... wake up!
