Where Gaming Begins - AMD RX6000 series Live Event

LAwLz
3 minutes ago, porina said:

On forums like this, people tend to over-analyse every little detail of performance: a few % here, a few % there. It's a bit like the bottleneck questions that keep popping up. To me, those aren't the best questions to ask; more appropriate is whether a combination of components will give me the desired level of performance. Is it good enough? If a game is giving me 100+ fps at 1440p ultra, do I care if one GPU is 10 fps faster? The software ecosystem and other features may be more important.

 

I made my bed in green's camp, I have G-sync displays, and make extensive use of the software. At the same time, I welcome AMD finally having a competitive high end GPU again, as it gives more options to those who are not so attached to one side or other, and it might help me get a 30 series card before 40 series comes out...

I mean, I've owned both Radeon and GeForce cards, a lot of them over the years, and I never cared whether a title favoured one camp or the other. I always got so much performance that it just never mattered. Like you said, if I'm getting 100 fps, is a 10 or 20 fps difference in a few very specific titles even relevant at this point?

19 minutes ago, RejZoR said:

But if it already achieves RTX 3080 and RTX 3090 performance now, then with AMD being more present in the gaming scene it'll only get better. And even if you look at the CURRENT situation, it doesn't look bad. People often complain that this game is for NVIDIA and that one is for AMD, but is there really a big enough difference that you should actually care?

Agreed, but I think that right now it might be slightly below the 3000 series, though surely above Turing. Can't wait for the reviews :(

MOTHERBOARD: ASRock H97 Pro4 CPU: Intel Core i5-4460 @3.30 GHz Intel Xeon E3-1271v3 @4.00 GHz RAM: 32GB (4x8GB) Kingston HyperX Fury DDR3 @1600 MHz (9-9-9-27)

GPU: MSI 390 8GB Gaming Edition PSU: XFX TS 650W Bronze Enermax Revolution D.F. 650W 80+ Gold MOUSE: Logitech G502 Proteus Spectrum KEYBOARD: Monokey Standard Suave Blue

STORAGE: SSD Samsung 850 EVO 250GB // HDD WD Green 1TB // HDD WD Blue 4TB // HDD WD Blue 160GB CASE: Fractal Design Define R5 Windowed OS: Windows 11 Pro x64

MONITORS: Samsung CFG7 C24FG7xFQ @144Hz // Samsung SyncMaster TA350 LT23A350 @60Hz Samsung Odyssey G7 COOLER: Noctua NH-D15

 

12 minutes ago, Parideboy said:

Agreed, but I think that right now it might be slightly below the 3000 series, though surely above Turing. Can't wait for the reviews :(

While official numbers from vendors are often overly optimistic, AMD has recently been giving out very realistic figures. I mean, if you can actually do it, why fake it? The days of the R9 Fury and RX Vega are long gone. Back then they had to present things in the best possible light, bending the truth a bit to make the products look better. I honestly believe the RX 6000 series numbers are as real as it gets. We'll see when the reviews hit the interwebs, but I don't think there's any reason for AMD to bend the truth this time around. Which is why I'm so excited.

I think AMD has to be more honest with its benchmark slides because they aren't the market leader, but until reviews are out I'm skeptical of those performance claims being 100% accurate; who knows whether the 6800 XT only performs better in AMD-optimized games, or whether they used particular settings. Some of the benchmark slides noted Rage Mode and Smart Access Memory being enabled; a Ryzen 5000 CPU is required to use SAM, and it isn't known what kind of increase SAM provides.

And ray tracing performance is probably worse, since AMD didn't exactly hype up their ray tracing. I care more about raster than RT performance though, and as long as ray tracing on the RX 6000 series is decent, it could be a better value for some people.

But I'm excited to see how well the 6800 XT does. Nvidia really screwed up their 30 series launch, and 10 GB of VRAM doesn't seem like enough for a high-end GPU.

1 hour ago, RejZoR said:

While official numbers from vendors are often overly optimistic, AMD has recently been giving out very realistic figures. I mean, if you can actually do it, why fake it? The days of the R9 Fury and RX Vega are long gone. Back then they had to present things in the best possible light, bending the truth a bit to make the products look better. I honestly believe the RX 6000 series numbers are as real as it gets. We'll see when the reviews hit the interwebs, but I don't think there's any reason for AMD to bend the truth this time around. Which is why I'm so excited.

There were no numbers regarding ray tracing, though. The slides focused on rasterization, and I do believe those numbers, even though cherry-picked, are somewhat realistic.

 

If AMD had comparable ray tracing numbers, they would have shown them. I hope I'm wrong, because both the 6800 and the 6800 XT are very compelling.

1 hour ago, RejZoR said:

While official numbers from vendors are often overly optimistic, AMD has recently been giving out very realistic figures. I mean, if you can actually do it, why fake it? The days of the R9 Fury and RX Vega are long gone. Back then they had to present things in the best possible light, bending the truth a bit to make the products look better. I honestly believe the RX 6000 series numbers are as real as it gets. We'll see when the reviews hit the interwebs, but I don't think there's any reason for AMD to bend the truth this time around. Which is why I'm so excited.

Yeah, I'm still kinda surprised they showed benchmarks where their cards lose.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding)

Bios database

39 minutes ago, Blademaster91 said:

it isn't known what kind of increase SAM provides.

They presented the increase when showing SAM. They showed a 13% performance increase at best, IIRC.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |

7 minutes ago, Fatih19 said:

They presented the increase when showing SAM. They showed a 13% performance increase at best, IIRC.

That was with both SAM and RAAAAAGEEE MOOODEEE enabled, and was most likely an outlier.
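If you want to estimate how much of that figure is SAM versus Rage Mode, the arithmetic is simple: percentage uplifts compose multiplicatively, so you can divide one out of the other. A minimal sketch, where the 13% combined figure is from the slide but the 2% Rage Mode share is a purely hypothetical placeholder, not an official number:

```python
# Decomposing a combined performance uplift into its parts.
# "combined" is the best-case figure from AMD's slide; "rage" is an
# assumed placeholder for Rage Mode's contribution, NOT an official number.
combined = 0.13   # SAM + Rage Mode together, best case
rage = 0.02       # hypothetical Rage Mode share

# Uplifts compose multiplicatively, so SAM's isolated share would be:
sam = (1 + combined) / (1 + rage) - 1
print(f"SAM alone would be roughly {sam:.1%}")  # roughly 10.8%
```

Under that (made-up) split, most of the best-case gain would still come from SAM rather than the auto-overclock.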

8 minutes ago, HelpfulTechWizard said:

Yeah, I'm still kinda surprised they showed benchmarks where their cards lose.

I think it was because they were way ahead in other titles, which shows they're trading blows and are genuinely competitive. If they had only shown themselves winning by a bit, everyone would say the results were cherry-picked.

"Meanwhile AMD sprinkles fairy dust on their cards and shits out a demon".....🤣🤣🤣

 

 

System Specs

  • CPU
    AMD Ryzen 7 5800X
  • Motherboard
    Gigabyte AMD X570 Aorus Master
  • RAM
    G.Skill Ripjaws 32 GBs
  • GPU
    Red Devil RX 5700XT
  • Case
    Corsair 570X
  • Storage
    Samsung SSD 860 QVO 2TB - HDD Seagate Barracuda 1TB - External Seagate HDD 8TB
  • PSU
    G.Skill RipJaws 1250 Watts
  • Keyboard
    Corsair Gaming Keyboard K55
  • Mouse
    Razer Naga Trinity
  • Operating System
    Windows 10
I was planning on getting AMD's top offering, the 6900 XT; unfortunately they chose to give the consumer the finger and use Nvidia's 3090 as an excuse to charge $1000. I can afford it, but on principle there's no way I'm paying that.

 

So right now I'm probably going to skip this generation, despite wanting/needing to replace my plasma. I'm aiming for the 48" OLED, and I do need HDMI 2.1 to fully utilize its 4K 120 Hz capability. For now I'm thinking I'll just run 1440p 120 Hz 8-bit on my 1080 Ti until GPU prices normalize.

 

Of course, I'm also thinking this could well be the new 'normal', with this pricing tier here to stay thanks to those who are less disciplined with their money being willing to just bend over and take it.

 

I hope the competition will force prices down next gen. We shall see.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |

@SolarNova Why not get the 6800XT then if you don't like the price?

16 minutes ago, SolarNova said:

I was planning on getting AMD's top offering, the 6900 XT; unfortunately they chose to give the consumer the finger and use Nvidia's 3090 as an excuse to charge $1000. I can afford it, but on principle there's no way I'm paying that.

 

So right now I'm probably going to skip this generation, despite wanting/needing to replace my plasma. I'm aiming for the 48" OLED, and I do need HDMI 2.1 to fully utilize its 4K 120 Hz capability. For now I'm thinking I'll just run 1440p 120 Hz 8-bit on my 1080 Ti until GPU prices normalize.

 

Of course, I'm also thinking this could well be the new 'normal', with this pricing tier here to stay thanks to those who are less disciplined with their money being willing to just bend over and take it.

 

I hope the competition will force prices down next gen. We shall see.

It's literally $500 cheaper than its direct competition. Nobody likes paying more, but realistically, what were you expecting AMD to do? The product (pending third-party benchmarks) matches, and sometimes beats, its direct competition while costing a third less.

 

AMD didn't give you the middle finger. Pricing it at $1399 would have been giving you the middle finger.

 

This is basically a "Titan-class" card - the original Titan was also $999, and so was basically every other variant except for the newest RTX iteration (which is $2000, I might remind you).

 

Anyway, I digress - perhaps next generation will see prices drop further. But I wouldn't necessarily hold my breath.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 

10 minutes ago, dalekphalm said:

It's literally $500 cheaper than its direct competition. Nobody likes paying more, but realistically, what were you expecting AMD to do? The product (pending third-party benchmarks) matches, and sometimes beats, its direct competition while costing a third less.

Which is why I mentioned that Nvidia gave them the excuse.

10 minutes ago, dalekphalm said:

This is basically a "Titan-class" card

That argument is old and long dead. It comes across as if the only feature a card needs to be called 'Titan-class' is the bloody price. Nobody would call the 2080 Ti, 3090, or 6900 XT 'Titan-class' if they held true to 'normal' pricing conventions.

 

The 2080 Ti was called that initially; it wasn't one.

The 3090 has been called that despite lacking the Titan's compute capabilities and driver support.

Now you label the 6900 XT that way despite it being a 6800 XT with 8 more CUs: enough to make a difference in gaming, not enough to make it 'Titan-class'.

 

The whole Titan class was created to be the intermediate between 'professional' cards and 'gaming' cards.

The 3090 has the VRAM of a professional card, but nothing else. That's the only argument you can make for calling it Titan-class.

The 6900 XT doesn't even have that, since it has the same VRAM as the cards below it.

 

So no, it's not a Titan-class card; it's just priced like one.

 

I will never understand why people have a desire to defend companies like this. You're not gaining from it, you're losing out. You're not the company; you're its customer. These prices go against two decades of pricing conventions (which include inflation).

 

 

https://www.amd.com/en/gaming/graphics-gaming-benchmarks

AMD has released new benchmarks. The 6800 XT beats the 3090 in some cases. In some very select cases (Battlefield V at 1440p ultra), even the non-XT 6800 beats the 3090. Keep in mind that all these benchmarks have SAM enabled.

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 

4 minutes ago, Random_Person1234 said:

https://wccftech.com/amd-radeon-rx-6900-xt-rx-6800-xt-rx-6800-gaming-4k-wqhd-benchmarks/

AMD has released new benchmarks. The 6800 XT beats the 3090 in some cases.

There are a couple of things I really like about this. AMD has been very open about the whole setup, including when they get slightly "beaten" by Nvidia. They list their entire test setup and drivers. As long as these numbers are confirmed by reviewers, I'm really impressed.

I'd also like to point out that they used 16 GB of RAM. Proof that no one *needs* 32 GB of RAM just for gaming.

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16gb 5200 MHZ, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2tbCORSAIR Force Series MP510 1920GB NVMe, CORSAIR FORCE Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch,Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR 

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 

18 minutes ago, Random_Person1234 said:

https://www.amd.com/en/gaming/graphics-gaming-benchmarks

AMD has released new benchmarks. The 6800 XT beats the 3090 in some cases. In some very select cases (Battlefield V at 1440p ultra), even the non-XT 6800 beats the 3090. Keep in mind that all these benchmarks have SAM enabled.

It's a really user-friendly way to present the benchmarks, I have to say.

3 hours ago, Random_Person1234 said:

https://www.amd.com/en/gaming/graphics-gaming-benchmarks

AMD has released new benchmarks. The 6800 XT beats the 3090 in some cases. In some very select cases (Battlefield V at 1440p ultra), even the non-XT 6800 beats the 3090. Keep in mind that all these benchmarks have SAM enabled.

Those are some nice graphs AMD is providing. There's one big flaw here though, IMO:

In every single title AMD chose, the 6800 offers worse price/perf than the 6800 XT. That doesn't inspire confidence; usually price/perf should increase as you go down the tiers, not decrease. The $580 price tag seems a bit high.
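That price/perf complaint is easy to sanity-check with basic arithmetic. A minimal sketch using the announced MSRPs ($579 and $649); the fps figures are hypothetical placeholders for illustration, not AMD's published numbers:

```python
# Price/performance sanity check: fps per dollar for two cards.
# Prices are the announced launch MSRPs; the fps values are
# hypothetical placeholders, NOT AMD's benchmark figures.
cards = {
    "RX 6800":    {"price": 579, "fps": 100},  # hypothetical fps
    "RX 6800 XT": {"price": 649, "fps": 120},  # hypothetical fps
}

for name, c in cards.items():
    # Higher fps-per-dollar means better value.
    c["fps_per_dollar"] = c["fps"] / c["price"]
    print(f"{name}: {c['fps_per_dollar']:.4f} fps per dollar")
```

With these placeholder numbers the 6800 XT delivers more fps per dollar than the 6800, which is exactly the inverted value curve the post is complaining about: the cheaper card should normally win this metric.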

5 hours ago, Voluspa said:

There are a couple of things I really like about this. AMD has been very open about the whole setup, including when they get slightly "beaten" by Nvidia. They list their entire test setup and drivers. As long as these numbers are confirmed by reviewers, I'm really impressed.

I'd also like to point out that they used 16 GB of RAM. Proof that no one *needs* 32 GB of RAM just for gaming.

True, you don't need 32 GB... in most cases. None of the games benchmarked are RAM-intensive, but there are plenty of games where overall usage reaches 16 GB and on occasion even exceeds it: Digital Combat Simulator, MS Flight Simulator 2020, and Squad, to name a few.

Don't forget their promise, guys.

 

Remember: never trust first-party benchmarks, even when they're from AMD. They are created by a marketing team for marketing purposes. Believing them is like believing a TV ad that says "this is the best burger in the world."

3 hours ago, Fatih19 said:

Don't forget their promise, guys.

I like how they overhype their own stuff, which will only lead to disappointment when it's not up to snuff.

Just like Nvidia's claim of "2x better than the 2080" or something: it got everyone hyped up and then disappointed, even though it's still a good card.

-sigh- feeling like I'm being too negative lately

 

On 10/30/2020 at 10:58 PM, BlackManINC said:

True, you don't need 32 GB... in most cases. None of the games benchmarked are RAM-intensive, but there are plenty of games where overall usage reaches 16 GB and on occasion even exceeds it: Digital Combat Simulator, MS Flight Simulator 2020, and Squad, to name a few.

I haven't even seen any proof that you need more than 10 GB of VRAM for gaming.

And remember, just because task manager says a game is using X amount of VRAM doesn't mean it NEEDS X amount of VRAM.

 

If task manager says 10 GB of VRAM is in use, that might be 5 GB of actual data and 5 GB of useless junk that is no longer needed but hasn't been cleaned up, because the OS/GPU figures: "why waste time and resources clearing 5 GB of junk out of VRAM when I'm in no danger of running out anyway? I'll clean it if I have to."
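The allocated-versus-needed distinction above can be sketched as a toy allocator that only reclaims "stale" buffers under memory pressure. Everything here is made up for illustration (the class, buffer names, and sizes are hypothetical; real drivers are far more sophisticated), but it shows why reported usage overstates what the workload actually needs:

```python
# Toy sketch of lazy VRAM reclamation: released buffers are kept
# around as "stale" data and only evicted when a new allocation
# would not otherwise fit. Reported usage (live + stale) therefore
# overstates what the workload actually needs (live only).
class LazyVram:
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.live = {}    # buffers the game still needs
        self.stale = {}   # junk kept because eviction wasn't necessary yet

    def alloc(self, name, size_gb):
        # Evict stale buffers only if we'd otherwise run out of room.
        while self.reported() + size_gb > self.capacity and self.stale:
            self.stale.pop(next(iter(self.stale)))
        self.live[name] = size_gb

    def release(self, name):
        # Don't actually free the memory; just mark it as stale.
        self.stale[name] = self.live.pop(name)

    def reported(self):  # what task manager would show
        return sum(self.live.values()) + sum(self.stale.values())

    def needed(self):    # what the game actually requires
        return sum(self.live.values())

vram = LazyVram(10)                 # pretend 10 GB card
vram.alloc("textures_levelA", 5)
vram.release("textures_levelA")     # level unloaded, data left behind
vram.alloc("textures_levelB", 5)
print(vram.reported(), "GB reported vs", vram.needed(), "GB needed")
```

Here the card reports 10 GB in use while only 5 GB is live data; a further allocation would silently evict the stale half with no impact on the game.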

Just now, LAwLz said:

 

I haven't even seen any proof that you need more than 10 GB of VRAM for gaming.

And remember, just because task manager says a game is using X amount of VRAM doesn't mean it NEEDS X amount of VRAM.

 

If task manager says 10 GB of VRAM is in use, that might be 5 GB of actual data and 5 GB of useless junk that is no longer needed but hasn't been cleaned up, because the OS/GPU figures: "why waste time and resources clearing 5 GB of junk out of VRAM when I'm in no danger of running out anyway? I'll clean it if I have to."

I agree, but the guy you were quoting was talking about regular RAM, not VRAM. Just clearing things up.

6 minutes ago, LAwLz said:

I haven't even seen any proof that you need more than 10 GB of VRAM for gaming.

It probably won't be necessary for a long time to come, because why develop for something the majority of the market can't use?

But when higher-VRAM cards become more common and game devs start using them, it'll likely become an issue.

 

I'm not sure how GPU Direct (the GPU-accessing-data-straight-from-the-SSD thing) will affect VRAM usage; only time will tell.

 

Two things, though:

1) It won't be necessary for at least another generation or two. Look at the jump from 4 GB to 8 GB of VRAM: more than 4 GB wasn't necessary until 8 GB cards became common.

2) Can your GPU core even process the amount of data that would utilize that much VRAM? Sure, you can turn textures and other settings up to high, which uses a lot of VRAM, but how many fps are you even going to get?
