Mostly comes down to weird quirks when I had AMD. Minecraft with shaders had bad frametimes: every couple of seconds my FPS would drop to sub-10 for a second, then recover. The NVIDIA GPU at the exact same settings (16 view distance, Complementary shaders with shadow quality at 16, versions 1.17 and 1.18) runs at a constant 144 FPS. Even the 1080 Ti I used to have didn't have these issues.
7 Days to Die runs like crap regardless of hardware, but in Alpha 19 the AMD GPU would drop to sub-20 FPS whenever I moved my character or the camera, even at 1080p. Coil whine was abnormally loud only in this instance as well. Turning shadows off had the game running as expected. It was fixed in Alpha 20, but that issue was never present in Alpha 19 with the NVIDIA card.
Left 4 Dead 2 at this point should be a CPU-bound game, running at hundreds of frames per second on any modern system. The AMD card would drop to 50 FPS during hordes (90 with Vulkan enabled), while the NVIDIA card only drops to 295 in the same situations.
When the AMD GPU worked as intended, I loved it. Borderlands 3, Metro Exodus Enhanced, and heavily modded Skyrim ran like a dream. Unfortunately, for most of what I played it was unable to deliver consistent framerates. All of that said, the issues I had (minus 7DTD, as Alpha 20 was out at the time) were completely absent when I demoed Manjaro Linux for two months, leading me to believe AMD's Windows drivers are to blame. I also had to DDU AMD's drivers a few times due to failed updates or Windows breaking the Adrenalin software.
GPUs in question are the 6900 XT and 3080 Ti, paired with a 5900X running at 3840 x 1600. Had a chance to side-grade for a few hundred bucks and took it.