
Where Gaming Begins - AMD RX6000 series Live Event


Interesting times!

 

I was potentially interested in a 3070, but the 8GB of VRAM is a bit low for my taste...

 

The RX6800 with 16GB might be a bit more interesting, but 579 USD is pushing it past what I am willing to spend.

 

In my opinion, the pricing for the RX6800 XT is strange: at 649 USD, it is so close to 579 that the 6800 XT seems like it will offer better price/performance than the 6800.


10 hours ago, porina said:

I'm most interested to see how that cache works. Is it "dumb" and only stores the most recently loaded data, or is it more selective?

I'm hoping it's somehow just really amazing at F@H; the 6800 could be the best value card, or it could be worthless and of no use at all lol.

 

Side note: the same could apply to mining, but that would be horrific.

 


38 minutes ago, maartendc said:

I was potentially interested in a 3070, but the 8GB of VRAM is a bit low for my taste...

 

The RX6800 with 16GB might be a bit more interesting, but 579 USD is pushing it past what I am willing to spend.

 

In my opinion, the pricing for the RX6800 XT is strange: at 649 USD, it is so close to 579 that the 6800 XT seems like it will offer better price/performance than the 6800.

It is interesting that AMD is using VRAM quantity as a differentiator against Nvidia, but not against themselves, at least not within this product segment.

 

As a general question: who here was looking to get a 3070 later today? Has the AMD reveal changed your plans?

 

38 minutes ago, leadeater said:

I'm hoping it's somehow just really amazing at F@H; the 6800 could be the best value card, or it could be worthless and of no use at all lol.

On another forum, some people are eagerly awaiting testing of these cards, as their use case would in theory fit inside that cache and performance is currently limited by available RAM bandwidth. Having said that, as a consumer card the FP64 performance is unremarkable, so I wonder how one of these with practically unlimited RAM bandwidth compares with, for example, a Radeon VII, which has a much higher FP64 rate but is bandwidth limited.

 

I'm not up to date on GPU mining; Ethereum was the main one during the boom. Based on my understanding at the time, and assuming it hasn't significantly changed since then, the cache would be of limited value. Ethereum was based on a large data set to prevent it from being implemented cheaply in an ASIC, so if you have to access a large data set, presumably in a random-ish way, a relatively small cache is unlikely to help much.
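
As a rough back-of-the-envelope sketch of that intuition (all numbers here are assumptions for illustration, not official specs): with uniformly random accesses, the expected hit rate is roughly cache size divided by working set size, so a cache in the hundreds-of-MB range barely dents a multi-GB data set, while a workload that fits entirely in cache hits every time.

```python
# Expected cache hit rate under uniformly random access.
# All figures are illustrative assumptions, not measured or official numbers.

CACHE_MB = 128           # assumed on-die cache size
ETH_DAG_MB = 4 * 1024    # Ethereum's DAG was in the multi-GB range (~4 GB assumed)

def expected_hit_rate(working_set_mb: float, cache_mb: float = CACHE_MB) -> float:
    """Fraction of uniformly random accesses expected to hit the cache."""
    return min(1.0, cache_mb / working_set_mb)

print(f"Mining-like (~4 GB, random): {expected_hit_rate(ETH_DAG_MB):.1%}")  # ~3.1%
print(f"Cache-resident (100 MB):     {expected_hit_rate(100):.1%}")         # 100.0%
```

Which is also why a workload whose hot data fits entirely in the cache (as hoped for F@H above) could behave completely differently from a DAG-sized miner.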

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


10 minutes ago, porina said:

Ethereum was based on a large data set to prevent it from being implemented cheaply in an ASIC

Correct, but it may also depend on how big the working set is compared to how much data is loaded into the memory buffer. Hopefully it is useless, but there are other algorithms and coins to mine which might work within that cache.

 

But yeah, hoping it's really useful for F@H; that would be great.


I didn't even bother to watch the live show, but I did watch various videos discussing the event, and the reason is this:

The only people who can get excited about GPU pricing these days are Americans, at times, or the filthy rich. I am excited that AMD has gotten to the point where they are competitive with Nvidia, but reality set in some time ago that AMD, like Nvidia, wants to please shareholders, so I expected the pricing I saw at this launch. Disappointed, but not surprised. Competition is supposed to drive prices down, but with many things these days it doesn't. It is effectively price fixing / anti-consumerism, and the regulatory bodies / watchdogs don't care.


Where my wallet ends.

 

Did anyone make that joke yet? Hope I'm not too late. 

CPU: AMD Athlon 200GE

Mobo: Gigabyte B450MDS3H

RAM: Corsair Vengeance LPX DDR4 3000MHz

GPU: Asus ROG Strix RX570 4GB

1TB HDD, Windows 10 64-bit


33 minutes ago, idontlikephysx said:

I didn't even bother to watch the live show, but I did watch various videos discussing the event, and the reason is this:

The only people who can get excited about GPU pricing these days are Americans, at times, or the filthy rich. I am excited that AMD has gotten to the point where they are competitive with Nvidia, but reality set in some time ago that AMD, like Nvidia, wants to please shareholders, so I expected the pricing I saw at this launch. Disappointed, but not surprised. Competition is supposed to drive prices down, but with many things these days it doesn't. It is effectively price fixing / anti-consumerism, and the regulatory bodies / watchdogs don't care.

Nvidia dropped prices first with the 3000 series, because obviously they knew more or less what AMD had up their sleeve.

 

Nowadays you can get the performance of a 2080 Ti for $600. I don't get it. People get stuck on tiers and forget that the only thing that matters in the end is performance.

MOTHERBOARD: ASRock H97 Pro4 CPU: Intel Core i5-4460 @3.30 GHz Intel Xeon E3-1271v3 @4.00 GHz RAM: 32GB (4x8GB) Kingston HyperX Fury DDR3 @1600 MHz (9-9-9-27)

GPU: MSI 390 8Gb Gaming Edition PSU: XFX TS 650w Bronze Enermax Revolution D.F. 650w 80+ Gold MOUSE: Logitech G502 Proteus Spectrum KEYBOARD: Monokey Standard Suave Blue

STORAGE: SSD Samsung EVO 850 250Gb // HDD WD Green 1Tb // HDD WD Blue 4Tb // HDD WD Blue 160Gb CASE: Fractal Design Define R5 Windowed OS: Windows 11 Pro x64 Bit

MONITORS: Samsung CFG7 C24FG7xFQ @144hz // Samsung SyncMaster TA350 LT23A350 @60hz Samsung Odyssey G7 COOLER: Noctua NH-D15

 


12 hours ago, Random_Person1234 said:

I would think the XTX models will be released after Nvidia releases their Supers and Tis.

The RX 6900XT is the fully enabled Navi 21 XTX.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022),

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023)


6 hours ago, porina said:

On another forum, some people are eagerly awaiting testing of these cards, as their use case would in theory fit inside that cache and performance is currently limited by available RAM bandwidth. Having said that, as a consumer card the FP64 performance is unremarkable, so I wonder how one of these with practically unlimited RAM bandwidth compares with, for example, a Radeon VII, which has a much higher FP64 rate but is bandwidth limited.

The folks over at mersenneforum? I don't stay as up to date as I should with GIMPS, but something that's at least competitive with the RVII in PRPs would be very exciting; even the 3090 can't match it in that workload.

my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

basically everything was bought used off of reddit or here, only new component was the case. absolutely nutty deals for some of these parts, ill have to tally it all up once it's "done" :D 


AMD also acquired Xilinx for $35B.

 

https://www.amd.com/en/press-releases/2020-10-27-amd-to-acquire-xilinx-creating-the-industry-s-high-performance-computing

 

Does anyone know what the plan is here?


Remember, guys: in the JayzTwoCents vid they saw that the 6900XT slide said "+ Rage Mode + Smart Access Memory" vs the 3090. Though it still draws less wattage than the 3090, sooo...


13 minutes ago, RejZoR said:

Sweet mother of god, has anyone seen the back of the ASUS TUF RX6800 GPU? The number of capacitors!


Asus "stole" the pass through design from the 3000 series cards, nice.

 

 


 


5 minutes ago, Parideboy said:

Asus "stole" the pass through design from the 3000 series cards, nice.

 

 

Hopefully Asus didn't just throw the RTX 3000 series cooler on there without making any changes for the GPU die or VRMs; AIBs have a reputation for reusing coolers from their Nvidia versions on AMD cards.


11 hours ago, porina said:

From the given information, the CU count is different with the same stated clocks and power. That leaves binning of the die as the only variable: the 6900XT must have a higher "quality" die than the 6800XT. If they were of similar quality, the clocks would differ more, being lower on the higher-CU part. It is also one thing to see paper specs and another to see how they actually run, so running clocks in practice may still differ.

🤔 The 6900XT also has more texture units, stream processors, and ray accelerators. I'm sure all of that contributes to better performance, but I don't think its lead over the 6800XT will be any greater than the 2080 Ti's was over the 2080 Super. Diminishing returns are what I expect when it comes to gaming. Beyond that, the base architecture being more efficient likely has more to do with both having the same TDP: it doesn't have to work harder to get a meaningful performance boost.

System Specs

  • CPU: AMD Ryzen 7 5800X
  • Motherboard: Gigabyte AMD X570 Aorus Master
  • RAM: G.Skill Ripjaws 32 GB
  • GPU: Red Devil RX 5700XT
  • Case: Corsair 570X
  • Storage: Samsung SSD 860 QVO 2TB - Seagate Barracuda HDD 1TB - Seagate external HDD 8TB
  • PSU: G.Skill RipJaws 1250 Watts
  • Keyboard: Corsair Gaming Keyboard K55
  • Mouse: Razer Naga Trinity
  • Operating System: Windows 10

3 hours ago, Blademaster91 said:

Hopefully Asus didn't just throw the RTX 3000 series cooler on there without making any changes for the GPU die or VRMs; AIBs have a reputation for reusing coolers from their Nvidia versions on AMD cards.

something something strix 5700xt...

 

Worked well for the 3070s though; the temps and noise levels on the TUF/Gaming X/FTW3 3070s are insanely low lmao



4 hours ago, RejZoR said:

Sweet mother of god, has anyone seen the back of the ASUS TUF RX6800 GPU? The number of capacitors!

 

Seems to be the reference layout:

[image: reference PCB layout]

 


22 hours ago, maartendc said:

Interesting times!

 

I was potentially interested in a 3070, but the 8GB of VRAM is a bit low for my taste...

 

The RX6800 with 16GB might be a bit more interesting, but 579 USD is pushing it past what I am willing to spend.

 

In my opinion, the pricing for the RX6800 XT is strange: at 649 USD, it is so close to 579 that the 6800 XT seems like it will offer better price/performance than the 6800.

IMHO, they're being rather smart and evil with the pricing, even though it may seem weird.

Please keep in mind all this is based solely on yesterday's presentation, so truck o' salt and all that with regards to performance; also, I may be insane:

 

They know the 6800 trounces the 3070 in rasterization and offers double the memory, so even if it has worse RT performance, they can easily charge more for it.

 

The 6800XT more or less trades blows with the 3080, while offering more VRAM, albeit slower. RT performance is also likely going to be quite a bit worse (otherwise they would've shown numbers yesterday), so they need to drop the price a little. More VRAM - worse RT performance + slightly better price = competitive.

 

Since the 3070 is $500 smackaroonies and they can charge a bit more than that, and the 3080 is $700 and they want to charge a bit less than that, they used the situation to lay a trap:

  • Why buy the 3070 when, for $79 more, I can get a 6800, which is a better card with literally double the VRAM?
  • And now that I'm at the $579 price point, what's another $70 to go with the 6800 XT and get a truly high-end card?

13 hours ago, RejZoR said:

Sweet mother of god, has anyone seen the back of the ASUS TUF RX6800 GPU? The number of capacitors!


Their all-MLCC design on the RTX 3000 cards has made them by far the go-to AIB partner for people purchasing RTX 3000 cards. It looks like they've decided that the small additional cost of MLCC caps delivers a fantastic return on investment through sales.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


1 hour ago, Rauten said:

IMHO, they're being rather smart and evil with the pricing, even though it may seem weird.

Please keep in mind all this is based solely on yesterday's presentation, so truck o' salt and all that with regards to performance; also, I may be insane:

 

They know the 6800 trounces the 3070 in rasterization and offers double the memory, so even if it has worse RT performance, they can easily charge more for it.

 

The 6800XT more or less trades blows with the 3080, while offering more VRAM, albeit slower. RT performance is also likely going to be quite a bit worse (otherwise they would've shown numbers yesterday), so they need to drop the price a little. More VRAM - worse RT performance + slightly better price = competitive.

 

Since the 3070 is $500 smackaroonies and they can charge a bit more than that, and the 3080 is $700 and they want to charge a bit less than that, they used the situation to lay a trap:

  • Why buy the 3070 when, for $79 more, I can get a 6800, which is a better card with literally double the VRAM?
  • And now that I'm at the $579 price point, what's another $70 to go with the 6800 XT and get a truly high-end card?

AMD has finally gone the route of smart design, as opposed to the brute force they were using in the past. Remember the R9 290? It wasn't a bad card, but it had a 512-bit bus, and that meant a more complex memory interface, higher cost, and a higher price. The R9 Fury had the same issue with exotic HBM, and RX Vega the same with HBM2. The RX 6000 series is a smarter approach: a "simple", cheaper 256-bit bus with a large amount of cheaper GDDR6 instead of GDDR6X, offsetting the bus shortcomings with a cleverly designed on-GPU cache. It's what NVIDIA did with the GTX 900 series: instead of the R9 390's brute-force approach, they introduced framebuffer compression and tiling to squeeze more out of a narrower memory subsystem.
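
To put rough numbers on that trade-off, here is a minimal sketch; the raw bus arithmetic (bus width x per-pin data rate / 8) is standard, but the cache bandwidth and hit rate below are purely hypothetical assumptions:

```python
# Effective memory bandwidth of a narrow bus backed by a fast on-die cache.
# The bus math is exact; the cache bandwidth and hit rate are assumptions.

def bus_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw DRAM bandwidth in GB/s: bus width (bits) * per-pin Gbps / 8."""
    return bus_width_bits * data_rate_gbps / 8

gddr6_256 = bus_bandwidth_gbs(256, 16)    # 512.0 GB/s (16 Gbps GDDR6, 256-bit)
gddr6x_320 = bus_bandwidth_gbs(320, 19)   # 760.0 GB/s (19 Gbps GDDR6X, 320-bit)

def effective_bandwidth(hit_rate: float, cache_gbs: float, dram_gbs: float) -> float:
    """Blend of cache and DRAM bandwidth for a given cache hit rate."""
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

print(f"256-bit GDDR6 raw:  {gddr6_256:.0f} GB/s")
print(f"320-bit GDDR6X raw: {gddr6x_320:.0f} GB/s")
# Hypothetical: 60% of accesses hit a ~2 TB/s on-die cache.
print(f"256-bit + cache:    {effective_bandwidth(0.60, 2000, gddr6_256):.0f} GB/s effective")
```

Under those assumed numbers, the narrow bus plus cache comes out well ahead of the wider GDDR6X setup, which is presumably the bet AMD is making here.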

 

Ray tracing is still under a big question mark, but given that it's used in both consoles, it can't possibly be that bad; otherwise it just wouldn't work there. Especially since ray tracing is less forgiving than rasterization: with rasterization you can use a bunch of tricks to gain performance, whereas tracing a ray is rather straightforward, with very few corners you can cut (before image quality goes really bad as a result).

 

I'm frankly leaning more towards the RX 6800XT now. I had already paid for an RTX 3080, but got a refund after they turned out to be basically unobtainable. Infinity Cache intrigues me, and since I plan on buying a Ryzen 5900X, Smart Access Memory intrigues me a bit as well. Then there's the lower power requirement and 16GB of memory, so I'm actually upgrading from my current 11GB instead of downgrading to 10GB (though at 1080p it won't play such a big role, I'm guessing). Still, why not have more if I can?


2 minutes ago, RejZoR said:

Ray tracing is still under a big question mark, but given that it's used in both consoles, it can't possibly be that bad; otherwise it just wouldn't work there. Especially since ray tracing is less forgiving than rasterization: with rasterization you can use a bunch of tricks to gain performance, whereas tracing a ray is rather straightforward, with very few corners you can cut (before image quality goes really bad as a result).

I think we'll see the true potential of these cards with new releases; games nowadays are implemented with Nvidia cards in mind.


 


6 minutes ago, Parideboy said:

I think we'll see the true potential of these cards with new releases; games nowadays are implemented with Nvidia cards in mind.

But if it already achieves RTX 3080 and RTX 3090 performance now, with AMD becoming more present in the gaming scene, it'll only get better. And again, even if you look at the current situation, it doesn't look bad. People often complain that this game favours NVIDIA and that one favours AMD, but is there really a big enough difference that you should even care?


7 minutes ago, RejZoR said:

People often complain that this game favours NVIDIA and that one favours AMD, but is there really a big enough difference that you should even care?

On forums like this, people tend to over-analyse every little detail of performance: a few % here, a few % there. It's a bit like the bottleneck questions that keep popping up. To me, that's not the best question to ask; the more appropriate one is whether a combination of components will give me the desired level of performance. Is it good enough? If a game is giving me 100+ fps at 1440p ultra, do I care if one GPU is 10 fps faster? The software ecosystem and other features may be more important.

 

I made my bed in green's camp: I have G-Sync displays and make extensive use of the software. At the same time, I welcome AMD finally having a competitive high-end GPU again, as it gives more options to those who are not so attached to one side or the other, and it might help me get a 30 series card before the 40 series comes out...


