
Where Gaming Begins - AMD RX6000 series Live Event

1 hour ago, Moonzy said:

1) It won't be necessary for at least another generation or two. Look at the 4GB to 8GB VRAM jump: having more than 4GB wasn't necessary until 8GB cards became common.

I'm still running 4GB 290X GPUs @ 2560x1600 perfectly fine, though I haven't tried the very latest games; the ones from 2019 were fine. Really the only thing that's been killing me is the dog-shit Crossfire support now; I just don't have the GPU power required anymore.


5 minutes ago, leadeater said:

 I just don't have the GPU power required anymore.

Time to sell and upgrade.




1 minute ago, leadeater said:

I'm still running 4GB 290X GPUs @ 2560x1600 perfectly fine.

Technically, "necessary" was the wrong word for me to use there.

There should be an asterisk to say "at certain settings", because technically even a GT 710 can run games "just fine", depending on expectations and settings.

 

I would say medium-high settings at 1080p/1440p in recent games is a good bar to set today.

 

4 minutes ago, leadeater said:

Really the only thing that's been killing me is the dog-shit Crossfire support now; I just don't have the GPU power required anymore.

It's always been wiser to sell the older GPU and buy a newer one, rather than buy a second one to run multi-GPU anyway.


17 minutes ago, Moonzy said:

It's always been wiser to sell the older GPU and buy a newer one, rather than buy a second one to run multi-GPU anyway.

I've always had both; I got them at release, before GPU pricing went crazy, and while my Crossfire support was actually very good up until 2019, there wasn't anything worth upgrading to that wasn't very bad value compared to what I already had. Now there are decently priced options for the performance on offer, and the dropping of multi-GPU support is forcing my hand anyway.


3 hours ago, LAwLz said:

I haven't even seen any proof that you need more than 10GB of VRAM for gaming. 

 

At 1080p I would easily reach 8GB while experiencing slowdowns due to swapping.


4 hours ago, LAwLz said:

 

I haven't even seen any proof that you need more than 10GB of VRAM for gaming. 

Watch Dogs: Legion will use more than 8GB on maximum settings at 4K, at least from my initial testing.

 

10GB is fine for now, but I don't see it being so a couple of years down the line, given the consoles' and RDNA 2's higher capacities. 


58 minutes ago, Parideboy said:

At 1080p I would easily reach 8GB while experiencing slowdowns due to swapping.

Interesting; according to other reviews (1, 2) the game doesn't use more than 8GB of VRAM even at 4K. The in-game "Graphics memory" indicator is not an accurate reflection of true VRAM usage.


2 hours ago, Medicate said:

Interesting; according to other reviews (1, 2) the game doesn't use more than 8GB of VRAM even at 4K. The in-game "Graphics memory" indicator is not an accurate reflection of true VRAM usage.

I don't know how long those benchmark runs were; what I experienced occurred over a full playthrough (start to finish, 6+ hours), so it's possible the game just kept stuff cached in VRAM. I was using the best texture pack, and the game reported needing something like 10 or 12 GB of VRAM. I never got close to that, but I was constantly between 7.5 GB and slightly over 8 GB.

 

It is possible that the game didn't drop unused textures until the VRAM was full, but when that happened the game stuttered.
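For what it's worth, one way to see the real peak over a long session, instead of trusting the in-game "Graphics memory" estimate, is to log dedicated VRAM usage while you play. A minimal sketch, assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings; AMD cards would need different tooling (e.g. GPU-Z logging), and the file name and one-second interval are just placeholders:

```python
# Illustrative sketch (not from this thread): log GPU VRAM usage once per
# second to a CSV so the real peak over a long session can be inspected later.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py).
import csv
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "vram_used_mib", "vram_total_mib"])
    try:
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
            writer.writerow([time.time(), mem.used // 2**20, mem.total // 2**20])
            f.flush()
            time.sleep(1.0)
    except KeyboardInterrupt:
        pass  # stop logging with Ctrl+C after the play session

pynvml.nvmlShutdown()
```

Even this only shows what the driver has allocated, not what the game strictly needs, so a long session with aggressive texture caching can sit near the VRAM limit without actually being starved.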


8 hours ago, Moonzy said:

I like how they overhype their own stuff, which only leads to disappointment when it's not up to snuff

just like Nvidia's claim of "2x better than the 2080" or something, which got everyone hyped up and then disappointed, even though it's still a good card

Everyone hates marketing, but how would you rather they present it? AMD's benchmarks and presentations seem legitimately less misleading than Nvidia's (although that's not a high bar to clear). From what I can tell, they've disclosed everything they've done to get their results, even if most of it is in the fine print.


11 minutes ago, thechinchinsong said:

Everyone hates marketing, but how would you rather they present it?

How I would prefer it may not be the best way.

But I'd hope they cover the new features and such, and leave the benchmarks to third parties, preferably the day after the presentation.


6 hours ago, Moonzy said:

How I would prefer it may not be the best way.

But I'd hope they cover the new features and such, and leave the benchmarks to third parties, preferably the day after the presentation.

I mean, that would be ideal for any tech company, but alas, I've given up hope that anybody actually does that.


6 hours ago, Moonzy said:

How I would prefer it may not be the best way.

But I'd hope they cover the new features and such, and leave the benchmarks to third parties, preferably the day after the presentation.

Honestly, I have to disagree with you here. Cards are almost never available to reviewers the next day, and even if we have to take manufacturer benchmarks with a grain of salt, they're still useful, especially for AMD vs AMD comparisons. For example, we know from AMD's presentation that the 6900XT is probably not going to be a major step up over the 6800XT, even if we don't know how it's going to compete with the 3090 in non-handpicked workloads.


25 minutes ago, Grabhanem said:

Cards are almost never available to reviewers the next day

This is not a valid argument, just think about it

 

26 minutes ago, Grabhanem said:

Even if we have to take manufacturer benchmarks with a grain of salt, they're still useful

If we can't trust it then it's not useful


2 hours ago, Moonzy said:

This is not a valid argument, just think about it

How so? AMD is going to have a working card long before it's polished enough to send out to reviewers. The alternative is for them not to say anything until it's ready to go on shelves, which is arguably a good option, but wouldn't get cards in people's hands any sooner. I'd argue sending out review samples before the card is on the shelf is a good thing, because it lets manufacturers get feedback from reviewers if something's wrong (see the 5700 XT Evoke, for example).

 

Quote

If we can't trust it then it's not useful

I already gave an example of useful information that doesn't necessarily have to come from perfect data. It's pretty safe to assume that AMD doesn't cheat against its own same-gen cards.

 

Buildzoid pulls some excellent information and conclusions out of the presentation:

 


13 minutes ago, Grabhanem said:

AMD is going to have a working card long before it's polished enough to send out to a reviewer.

Then what makes their numbers trustworthy, when they're benching with those "unpolished" cards?

 

A derail, but I would argue that Nvidia's 30-series cards should be re-benchmarked after they tuned the clock speeds and such to fix the crashes.

But no one would do it; it's probably only a ~5% drop in performance, so no one would bother.

 

14 minutes ago, Grabhanem said:

wouldn't get cards in people's hands any sooner.

They would just have to move the presentation date later, say to Nov 17th (IIRC the reviews went live on the 18th), and leave the benchmarks to reviewers.

Nothing else would change.

 

23 minutes ago, Grabhanem said:

It's pretty safe to assume that AMD doesn't cheat against its own same-gen cards.

Thing is, from that video, they aren't even benching their own cards with the same settings enabled, so it's still guesswork.

 

Sure, good for them if it's an accurate number, but if the reviews were out one day later, no one would have to guess.


2 hours ago, Moonzy said:

it's probably only a ~5% drop in performance, so no one would bother.

There is no real-world performance drop. They just adjusted how aggressively the short-term boost peaks behave. The average boost clocks are still the same, and thus general performance is still the same. As a 3080 user myself, I get exactly the performance I expected, nothing less. My card pretty much stays between 2080 and 2120 MHz while gaming and even peaks at ~2200 MHz from time to time, without any crashes at all. The only reason reviewers said they wouldn't re-test with the newer drivers after the fix is that they literally fight over single-digit point differences in benchmarks, like in the RIPGN / RIPJAY / RIPPAUL series.
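If anyone wants to check this on their own card, a quick way is to sample the SM clock during a gaming session and compare the average before and after the driver update. A rough sketch, assuming an NVIDIA GPU and the nvidia-ml-py (pynvml) package; the one-second sampling interval is arbitrary:

```python
# Illustrative sketch (not from this thread): sample the SM clock once per
# second during a session and report the average and peak, so boost behaviour
# can be compared across driver versions. Assumes an NVIDIA GPU and the
# nvidia-ml-py package (pip install nvidia-ml-py).
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

samples = []
try:
    while True:
        # Current SM clock in MHz
        samples.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM))
        time.sleep(1.0)
except KeyboardInterrupt:
    pass  # stop sampling with Ctrl+C when the session ends

pynvml.nvmlShutdown()

if samples:
    print(f"samples: {len(samples)}")
    print(f"average SM clock: {sum(samples) / len(samples):.0f} MHz")
    print(f"peak SM clock:    {max(samples)} MHz")
```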


3 weeks later...

🤔 RTX 3080 vs RX 6800 XT... largely equal results in most games, although the 6800 XT edges ahead quite a bit in a few games.

 

 


On 11/19/2020 at 11:38 PM, BlackManINC said:

🤔 RTX 3080 vs RX 6800 XT... largely equal results in most games, although the 6800 XT edges ahead quite a bit in a few games.

 

 

It all comes down to what the user needs, again:

 

At 1080p or 1440p the 6800 XT wins most of the time.

At 4K the 3080 wins most of the time.

As soon as you want to use ray tracing, the 3080 will also be the better choice.

Also, all games that support DLSS get a substantial performance gain on the NVIDIA side.

And NVIDIA has quite the software feature set. For example, many people rely on the NVENC encoder, and the encoder on Radeon GPUs still sucks.

That being said, if you're only after raw performance, both cards are compelling options. It's sad that you can't actually buy either of them even if you wanted to.


AMD's ray tracing is pretty similar to NVIDIA's, but with problems in certain games. Which is not unexpected, given it's a first-generation implementation with drivers that still need to catch up. The only time NVIDIA pulls ahead significantly is when something is obviously wrong, like in Minecraft where it just chokes itself to death, or where DLSS is used.

 

And I still have a problem with ANY such feature that isn't game-agnostic. I absolutely hate how everyone shoves fancy high-framerate numbers in our faces all the time for those same 6 games. Guess what, not everyone plays those 6 games, and I want better performance outside of them too. Until that happens, all these fancy DLSS-like features are nice but essentially useless, and they don't set realistic performance expectations. They improve performance in THAT game and THAT game alone, and cannot and shouldn't be treated as some sort of average performance indicator. If the feature isn't available for the games you play, then for you it just doesn't exist.


7 hours ago, Stahlmann said:

It all comes down to what the user needs, again:

 

At 1080p or 1440p the 6800 XT wins most of the time.

At 4K the 3080 wins most of the time.

As soon as you want to use ray tracing, the 3080 will also be the better choice.

Also, all games that support DLSS get a substantial performance gain on the NVIDIA side.

And NVIDIA has quite the software feature set. For example, many people rely on the NVENC encoder, and the encoder on Radeon GPUs still sucks.

That being said, if you're only after raw performance, both cards are compelling options. It's sad that you can't actually buy either of them even if you wanted to.

🤷‍♂️ Well, we'll see whether Nvidia truly has the advantage in ray tracing performance this generation as time goes along, once we start seeing more games actually optimized for AMD's ray tracing hardware. At launch, sure, Nvidia has a head start of sorts, but that is in no way an indication of how things will go from here on in newer titles. We already see it working well in games like Dirt 5 and Far Cry 6 on the RX 6000 series.


1 hour ago, JediFragger said:

Managed to get a 6800 XT on launch day and it's pretty damn awesome! Have it paired with a 10700K @ 5GHz and an Iiyama 3440x1440 ultrawide, and it's all just buttery smooth :)

:) That gives me comfort. I'm just waiting on the benchmarks for the custom versions, then I'll make my move.


1 hour ago, JediFragger said:

Managed to get a 6800 XT on launch day and it's pretty damn awesome! Have it paired with a 10700K @ 5GHz and an Iiyama 3440x1440 ultrawide, and it's all just buttery smooth :)

Nice, glad you actually got one.


🤔 For those who jumped on the Nvidia bandwagon over ray tracing: it performs better in Dirt 5 on the 6800 XT than it does on a 3080, as discussed at the 14:40 mark of the video below, so this is already a good start for AMD even with ray tracing. It has very little to do with "DLSS", which really has nothing to do with ray tracing, or whatever other sorcery people assume Nvidia is doing to get better performance that AMD can't or won't do. It basically comes down to how well the developers optimize it. I think I'll stick with the cheaper option from AMD. They have the better software features for me anyway. 👍 👍



Which is why I was wondering how much AMD can do with drivers. They can't change the ray accelerators on the GPU itself, but I'm guessing the biggest issue is that, to date, DXR has only really been developed and tested against RTX cards. Even if it's a standard, that doesn't mean it's implemented ideally. Which is why I'm interested to see whether Radeon RT performance will change in the games that are currently most problematic, with massive performance drops. It would be cool if it all turns out to be driver and game issues because of the things mentioned above, but we can't know for sure.

