GamerDude

Member
  • 4,743 posts

Everything posted by GamerDude

  1. Depends on the OP's preference. If RT is important, go for the RTX 3080; but if rasterization is more important than RT, then the RX 6800 XT, which is pretty fast for regular non-RT games. I went with an RX card because, while I appreciate that RT is the future, current-gen cards from nVidia (let alone AMD) are still not good enough to run games at high framerates at native res (like 4K, or perhaps 1440p) without needing DLSS. Plus, as someone at Guru3D said, DLSS is planned obsolescence; there'll come a time when GPUs are powerful enough that DLSS isn't needed. Also, for RT to
  2. Might as well add my Heaven and Valley scores...CPU at 4.45GHz, undervolted my GPU.
  3. Aw shoot, how did I miss this? The FX 770K is weaker than an FX-6300, so it looks like Apex has a point; this could be a factor in why your RX 480 isn't 'breathing' as it should.
  4. Ryzen loves faster RAM; perhaps a pair of 8GB 3600MHz sticks. I have a Ryzen 9 3900X paired with 3600MHz RAM (see sig), and I'm willing to bet that on CPU performance alone, mine would outpace yours.
  5. I have a Sapphire Nitro+ card, and when I play DX9 games, it generally draws about 50-60W while maintaining a constant 144fps to match my monitor's 144Hz refresh rate. But when I play a more modern game (DX11/DX12), I notice it kicks into gear and draws >250W. Also, ensure Radeon Chill is disabled, as I've heard it saves power by limiting the framerate and such.
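As an aside, the power-saving idea behind a framerate limiter like Radeon Chill can be sketched in a few lines. This is a toy illustration I put together, not AMD's actual implementation: the render loop simply sleeps away whatever time is left in each frame slot, so the hardware idles instead of racing ahead.

```python
import time

def frame_limited_loop(render_frame, fps_cap=144, frames=5):
    """Call render_frame(), then sleep away the rest of each frame slot.

    Capping at the monitor's refresh rate means the GPU idles instead of
    rendering frames the display can't show, which is roughly why limiters
    cut power draw (Chill also drops the cap further when input is idle).
    Toy sketch only; the real driver paces frames at a lower level.
    """
    target = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                              # stand-in for real GPU work
        leftover = target - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                    # hardware idles here = less power
```

With a 144fps cap, a frame that renders in 3ms leaves roughly 3.9ms of idle time per frame, and that idle time is where the watts are saved.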
  6. The OP had swapped a GTX 1080 Ti for the RX 6800 XT, so my question is: did he run DDU first? When I swapped a GTX 1080 for a GB RX VEGA64 Gaming OC (the GTX 1080 went into my HTPC rig), I ran DDU in Safe Mode to get rid of all traces of the nVidia driver, shut down, and swapped out the cards. I'd also disabled the net connection by unplugging my LAN cable before running DDU (Windows has a darn bad habit of installing older drivers should it detect the absence of a GPU driver); disabling WiFi works too. My GB VEGA64 Gaming OC ran like a champ in my 2nd rig......till it died. I've replaced it with a P
  7. I'd agree with you on that; that guy knows diddly squat! To the OP, just a shot in the dark, but have you enabled 'Enhanced Sync'? If yes, try disabling it and see what happens....
  8. He's talking about the AMD RX 6000 series, hence my response, since I'd owned an RX 6800 and now own an RX 6900 XT. One thing to bear in mind is where one buys the card from; in my neck of the woods, both the RX 6800 XT and the RTX 3080 are virtually impossible to find. In my case, an RTX 3080 costs more than an RX 6800 XT, sometimes by a fair bit. So, if the OP has the ideal choice between an RTX 3080 and the RX 6800 XT (and ideally at a very similar price point), I'd say it's entirely up to him (based on his own personal preference and usage scenario) to pick either, as both are great gam
  9. The thing with DLSS, and why I have reservations about it, is that it's up to nVidia to do the DLSS optimization. If more and more RT games are released in the future, nVidia may have to pick and choose which ones get the DLSS treatment. At 1080p, yes, RT can be done natively now, but higher res is where DLSS is needed, because present hardware, both nVidia's and AMD's, isn't powerful enough to do RT natively at, say, QHD or 3440x1440, let alone 4K. Don't get me wrong, I've said RT is the future, and it's a nice feature to have now, but hardware is not powerful enough as of now,
  10. Depends on the res you game at. I had a Nitro+ RX 6800 for a month or so; it ran all the games I threw at it with ease at 3840x1080. The only game I had an issue with was Metro Exodus with RT enabled; it'd CTD. I've since replaced it with a Nitro+ RX 6900 XT (only because the RX 6800 XT is near impossible to find). Comparing performance among the RX 6800, RX 6800 XT and RX 6900 XT (the RX 6800 XT is nearer the RX 6900 XT's level, I believe), the RX 6800 XT represents the 'sweet spot' for AMD's high-end GPUs. The price difference between an RX 6800 XT and an RX 6800 is relatively small, while the RX 6900 XT entails a
  11. As requested: what's your system spec? What kind of games do you play? Where do you plan to buy the card from?
  12. Run the AMD Cleanup Utility (since the previous driver is an AMD one), then reboot and install the driver you want. Make sure you've already downloaded the latest driver (although some have said 20.11.2 is better, iirc) and that you're NOT connected online, as Windows will sometimes try to install some crap old driver. I had an RX 6800 before upgrading to an RX 6900 XT; it was awesome and I had no issues with it running games....only Metro Exodus had a CTD issue, especially with RT enabled (though even without RT, it'd CTD randomly). I'm on 20.12.1 since the fixes in 20.12.2 don't interest me.
  13. LOL, that'd explain a lot! I'd read your thread and was scratching my head as to why an 850W Seasonic PSU couldn't handle an RTX 3090; the PSU actually being a 620W one cleared things up real quick. I'd suggest that instead of getting an 850W PSU, you get a 1000W one; it gives you much more headroom for adding more parts to your rig (like more HDDs, AIOs or a custom loop, etc.)
  14. Try Superposition and Timespy....and Firestrike Extreme. I usually have MSI Afterburner running to check temps and GPU clocks while running these benchmarks. For pure benchmarking, though, I'd disable MSI AB because it may chew up precious CPU cycles, thus lowering the final score.....I believe so, anyway.
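The idea that a monitoring overlay can shave points off a score is easy to demonstrate with a toy benchmark. This is a hypothetical sketch of my own, not how MSI Afterburner actually works (its sampling is native code and far lighter): a background thread that polls 'sensors' steals cycles from the loop being scored.

```python
import threading
import time

def busy_score(duration=0.25):
    """Toy benchmark: count loop iterations in a fixed wall-clock window."""
    n, end = 0, time.perf_counter() + duration
    while time.perf_counter() < end:
        n += 1
    return n

def busy_score_with_monitor(duration=0.25, poll_interval=0.0005):
    """Same benchmark, with a background 'monitoring' thread polling stats."""
    stop, samples = threading.Event(), []

    def poll():
        while not stop.is_set():
            samples.append(time.perf_counter())   # stand-in for reading sensors
            time.sleep(poll_interval)

    t = threading.Thread(target=poll)
    t.start()
    score = busy_score(duration)                  # scored work competes with poller
    stop.set()
    t.join()
    return score, len(samples)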
  15. I tried capping Metro Exodus Sam's Story's framerate at 144 (to match my monitor), made sure I was on Ultra and started the game. The framerate's much better; only one slowdown, at the fire station level up on the tower, but other than that it's good. Now I get anything between 40+fps and 100+fps, the clock doesn't dip below 1100MHz, and it seems smooth now. So weird having the boost clock go all over the place previously; it strengthens my belief that it's a driver issue. I guess AMD needs to polish their drivers a little more for the newer cards.
  16. @Pat-Roner I'd suggest you use the Report Bug tab in Adrenalin to report this to AMD; they need to be aware that this issue exists.
  17. I think it's a driver bug; my RX 6900 XT has this issue too. In games like Serious Sam 4, the clocks go all over the place, but the card can still maintain >100fps at 3840x1080 (at max settings). But when I played Metro Exodus Sam's Story, the framerate drop was quite noticeable. With my RX 6800 previously, I was still able to hit 40+ fps and higher, and gameplay was surprisingly smooth. With my 6900 XT, the clock dipped to 800MHz, which resulted in the framerate dipping to 30fps (I should easily be getting higher fps with this card); I've reported this to AMD. Will reset my card and see if the issue wou
  18. Nope. I've heard certain mobo makers, like MSI, have been releasing beta BIOSes to enable SAM; PCIe 4.0 isn't necessary to have SAM enabled. I've read that even RX 5000 series owners have had SAM enabled on their mobos too (using a Ryzen 2700X, for example). Now, whether PCIe 3.0 has enough bandwidth to support SAM fully, or whether RX 5000 series cards can take advantage of SAM, is.....debatable at best. To refute Stahlmann's claim: my X570 mobo (latest beta BIOS that supports SAM; the Adrenalin driver must be 20.11.2 or later, I think) + R9 3900X + RX 6900 XT has SAM enabled, proof's in the pu
  19. A Ryzen 5 5600X would be a great CPU for many modern GPUs, and that includes flagship cards imo. Even compared to my R9 3900X, it has faster IPC and is better in games, and I have my 3900X paired with one of AMD's best GPUs (for now), a Sapphire Nitro+ RX 6900 XT. I'd probably see higher framerates with a 5600X, but I don't see the need to step up to a Zen 3 CPU yet....I'll probably do it later this year, and it'd probably be a 5900X/5950X.
  20. I've heard that some mobo manufacturers are even extending SAM support to older Ryzen CPUs and 300 series mobos. I've also read that some people with RX 5000 series cards have enabled SAM on various mobos. For my part, I can see 'Large Memory Range' (indicating that SAM is enabled) under my display adapter, and I have an X570 mobo (latest F31q beta BIOS), a 3900X and an RX 6900 XT.
  21. Undervolted my 6900 XT a little, and the score went up, much to my surprise....I'm not in the habit of undervolting. Ran the Valley benchmark; CPU = 4.45GHz, GPU undervolted.
  22. I dunno about the ASRock Challenger, but the AMD cards I've had always seem to have rather anemic default fan settings. Usually I set my own fan curve, and I disable Zero Fan while I'm at it. I don't care about fan noise, as I'm more about cooling the GPU than about the noise the fans generate. Besides, with my soundbar in front of me when I game, I can't hear the fans.
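For anyone curious, a custom fan curve is just a handful of (temperature, fan %) points with linear interpolation between them. Here's a minimal sketch; the points below are made up for illustration, not the Nitro+ or Challenger defaults.

```python
def fan_percent(temp_c, curve=((40, 25), (60, 40), (75, 70), (85, 100))):
    """Piecewise-linear fan curve from (temp in Celsius, fan %) points.

    Below the first point the fan holds its floor instead of stopping,
    which is the practical effect of disabling Zero Fan; above the last
    point it pins at 100%. The curve itself is an invented example.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    # find the segment containing temp_c and interpolate along it
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Tools like Adrenalin or MSI Afterburner let you drag these points around in a GUI; the math underneath is the same interpolation.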
  23. Agree with you; the OP has 16GB of RAM, which should be more than enough for any game. I have three rigs, all with 16GB of RAM each, and I've not encountered any game that needs more than 16GB. My main rig has an X570 mobo + R9 3900X + 2x Patriot Viper RGB 8GB DDR4-3600 CL17 + Sapphire Nitro+ RX 6900 XT and has no issue with any game, so having 'just' 16GB of system RAM isn't the OP's problem.
  24. Ran this benchmark with a Sapphire Nitro+ RX 6800 before yet another upgrade...CPU at 4.4GHz, no OC on the GPU. Since upgraded to a Sapphire Nitro+ RX 6900 XT; CPU at 4.425GHz, no OC on my GPU (other than its own boost clocks).
  25. Wonder when the RX 6000 series cards would be added...