AMD Radeon RX 6000 Specifications Revealed / Leaked from 2 Separate Locations: Up to 5120 Cores & 2.5GHz Clock Speeds

31 minutes ago, HenrySalayne said:

It's not too far off. The 3080's base clock is 1.44 GHz, but some cards are easily boosting beyond 1.9 GHz.

https://i.gyazo.com/b3350622e8562e9356df8814c993c800.png (RTX 3080)

 

Nearly 300MHz of difference between the factory base and boost clock is perfectly normal; the gap is bigger on some AIB cards and smaller on others. Anything over the factory boost clock (the guaranteed boost, such as the 1900MHz seen on some 3080s) is the result of GPU Boost 4.0 or whatever it is now.

 

There's no way these Navi cards have a 1000MHz difference between the factory base and factory boost clock. That 2500MHz figure has to be the maximum adjustment range reported by the driver, and the 1500MHz must be from an ES (engineering sample) card.
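To put numbers on that (a rough Python sketch; the 3080 clocks are official, the Navi numbers are from this leak, the 6900 XT base is derived from the 700MHz gap discussed below, and the "suspicious" threshold is just my own rule of thumb):

```python
# Flag leaked base/boost pairs whose gap is way outside the typical
# factory spread (~300 MHz, per the 3080 example above).
cards = {
    "RTX 3080 (official)": (1440, 1710),
    "RX 6700 XT (leaked)": (1500, 2500),
    "RX 6900 XT (leaked)": (1800, 2500),  # leak: 700 MHz gap
}

TYPICAL_GAP = 300  # MHz, rough factory base-to-boost spread

for name, (base, boost) in cards.items():
    gap = boost - base
    verdict = ("plausible" if gap <= 2 * TYPICAL_GAP
               else "likely max driver clock, not a factory boost")
    print(f"{name}: {gap} MHz gap -> {verdict}")
```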

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


58 minutes ago, Energycore said:

256-bit GDDR6 looks a bit scarce compared to Nvidia's 320-bit GDDR6X.

 

I'm gonna call (X) Doubt on that spec.

Also, GDDR6X is power hungry; it's an odd choice for the 3080.

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX 3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung Evo 970 256GB NVMe | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo


1 hour ago, Chris Pratt said:

The big sticking point for me currently is Nvidia's DLSS. AMD thus far has nothing to match it, and it frankly gives Nvidia a huge edge. Granted, not every game supports it, but more and more will over time. Even if AMD somehow came out with GPUs more powerful than Nvidia's, you'd still get better perf out of Nvidia's simply because of DLSS. I'm not sure AMD can even compete with that, since they don't have any AI chops.

When NVIDIA manages to make DLSS game-agnostic, it's really going to hurt AMD, which will have to match them with raw horsepower unless they come up with an answer to DLSS of their own.

 

The other problem I have with ditching NVIDIA is "Fast V-Sync". I don't have a FreeSync or G-Sync monitor, and this mode gives me by far the best tear-free performance. AMD has "Enhanced Sync", but as far as I know it works like NVIDIA's Adaptive V-Sync, and there is some possibility of slight tearing in rare cases where Fast never has it. I generally hate a lot of stupid NVIDIA things, especially the awful NVCP, but Fast V-Sync is one of those that I really do like. I'm not sure I can give it up even if Radeon is much faster.


If the 80cu die can OC to 2500MHz on air/water I'm in

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


1 minute ago, S w a t s o n said:

If the 80cu die can OC to 2500MHz on air/water I'm in

Older chips were made for a wide design with low clocks. RDNA seems to be focused on really high clocks and a narrower approach, sort of like NVIDIA has been doing since Maxwell 2, and it's paying off for AMD. The first RDNA cards went in the right direction and I'm really curious how they'll advance it further with RDNA2. I'm also wondering what kind of approach AMD is using for ray tracing; they clearly have a functional method since it's being advertised for consoles. Why does the end of October have to be so far away...
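To put rough numbers on the wide-and-slow vs. narrow-and-fast tradeoff (a sketch using public specs; the FP32 rate for GCN and RDNA is 64 shaders per CU times 2 ops per clock):

```python
# Theoretical FP32 throughput = CUs * 64 shaders/CU * 2 FLOP/clock * clock (GHz).
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# Wide-and-slow GCN vs narrow-and-fast RDNA:
print(f"Vega 64    (64 CU @ ~1.55 GHz): {tflops(64, 1.55):.1f} TFLOPS")
print(f"RX 5700 XT (40 CU @ ~1.90 GHz): {tflops(40, 1.90):.1f} TFLOPS")
# Despite ~25% fewer theoretical TFLOPS, the 5700 XT matches or beats
# Vega 64 in games - clocks plus per-CU efficiency carry the narrower chip.
```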


4 minutes ago, RejZoR said:

I'm also wondering what kind of approach AMD is using for ray tracing.

The Xbox team basically explained RDNA2 raytracing at Hot Chips this year, watch the presentation


 


1 hour ago, leadeater said:

5700 XT vs 6700 XT doesn't look all that great on paper; hopefully it's a bit better for cheaper, but it's still odd that it drops VRAM and thus bandwidth too.

The Road to PS5 presentation given by Mark Cerny in March is fairly interesting. His remarks about certain features from Sony and AMD's collaboration ending up in the desktop GPUs are intriguing (around the 25:42 mark), as one of the apparent features is a large on-die cache. Whether that makes its way into the desktop cards is just rumour at this point, but it would explain why AMD doesn't need a bus wider than 256-bit on the top card.
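For scale, here's how a big on-die cache could offset the narrower bus (a sketch; 16 Gbps GDDR6 and 19 Gbps GDDR6X are the standard speeds for those parts, but the hit rates are pure guesses for illustration):

```python
# Raw DRAM bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
def raw_bw(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

gddr6_256  = raw_bw(256, 16.0)  # 512 GB/s (rumoured top Navi 2x setup)
gddr6x_320 = raw_bw(320, 19.0)  # 760 GB/s (RTX 3080)

# If a fraction of memory traffic hits an on-die cache, only the misses
# touch DRAM, so effective bandwidth ~ raw / (1 - hit_rate).
for hit in (0.3, 0.5):
    eff = gddr6_256 / (1 - hit)
    print(f"{hit:.0%} cache hit rate: 256-bit GDDR6 acts like ~{eff:.0f} GB/s "
          f"(vs {gddr6x_320:.0f} GB/s raw on the 3080)")
```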

 

 

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


2 hours ago, DildorTheDecent said:

Boost clocks can't be real. There's no way a 6700 XT has a 1500MHz base and 2500MHz boost clock. Guaranteed base and boost clocks are not that far apart. Maybe it's base and max boost assuming power, voltage and temperature are in check but certainly not the numbers they'll put on the box. Or it could be the max boost that the driver or program reports.

 

Even the 6900 XT has a 700MHz difference. No way, Jose.

That could have been the case, except the same value for Navi 21 is 2050 or 2200MHz, which is far too low for a maximum allowed clock. And considering that the PS5 will run at 2.23GHz, it's not that much of a stretch. Still, those are some really high clocks. My guess is that Navi 2x had some sort of bug at high clocks (the PS5 guy talked about it), so they must have fixed it by the time they made Navi 22.

The high clocks also kind of explain Navi 23, which has nowhere near the same CU count and otherwise wouldn't keep up.


42 minutes ago, S w a t s o n said:

The Xbox team basically explained RDNA2 raytracing at Hot Chips this year, watch the presentation

Hot Chips? All I get searching for it is hot chips challenges on YouTube... with literal spicy chips...


Please beat the pants off a 1080ti so I can finally kick Nvidia.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


1 hour ago, GatioH said:

pls no more space-heater wars

Tell Nvidia that xD

 

Though I still don't fully believe these specs. Need more people to confirm them.

We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:

Spoiler

Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: EVGA GTX 1080 Ti SC2 Hybrid

Attachcorethingy: Gigabyte H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone RL06 "Lookalike"

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Razer Viper Mini + Huion H430P drawing Tablet

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: LG 27UK850-W 4K HDR

 


1 minute ago, Trik'Stari said:

Please beat the pants off a 1080ti so I can finally kick Nvidia.

*just bought a used 1080 ti LOL

 

(but it's more than I'll need for a while cuz I don't do AAA games, I just wanted it for 4K Skyrim-level of detail games)



19 minutes ago, Energycore said:

*just bought a used 1080 ti LOL

Don't get me wrong. It's a monster of a card. Especially water cooled.

 

Quote

(but it's more than I'll need for a while cuz I don't do AAA games, I just wanted it for 4K Skyrim-level of detail games)

If you mod Skyrim or Fallout 4, you will not be disappointed. It can handle a lot. Mind the god rays though.



Was bored, so I decided to play with the colors/colours of the supposed design of the upcoming Radeon RX 6000-series card's cooler. Came out pretty well, if I do say so myself:

 

 

purp.jpg.ba8be980124a80edadc69b5c2c6ee106.jpg

 

green.jpg.f2291cfc43a895fe4c043a9beee2cc1d.jpg

 

ogsaphire.jpg.9b23b17053f90c72d500ca1d4d723a21.jpg

 

cyan.jpg.af147b9ef35ff86c83c2cbb48fc277a8.jpg

 

If this actually ended up happening (even with only these colors as alternatives to the original/standard Red/Scarlet option), I really think AMD would give NVIDIA a run for their money on reference cooler design. 

 

Of course this wouldn't be complete without proper related song referencing:

 

 


8 hours ago, DildorTheDecent said:

Boost clocks can't be real. There's no way a 6700 XT has a 1500MHz base and 2500MHz boost clock. Guaranteed base and boost clocks are not that far apart. Maybe it's base and max boost assuming power, voltage and temperature are in check but certainly not the numbers they'll put on the box. Or it could be the max boost that the driver or program reports.

 

Even the 6900 XT has a 700MHz difference. No way, Jose.

I have serious doubts about the 2nd table shown with the base clocks.

 

8 hours ago, Energycore said:

256-bit GDDR6 looks a bit scarce compared to Nvidia's 320-bit GDDR6X.

 

I'm gonna call (X) Doubt on that spec.

I would tend to agree, but MLID has commented on G6 vs G6X.

 

7 hours ago, BiG StroOnZ said:

snip

Well, AMD wasn't lying about the first 50% claim going from GCN to RDNA1, so why wouldn't they also be telling the truth about RDNA2?

 

A 60 CU Vega 2.0 card performs about the same as a 40 CU Navi 1.0 card. If we're right, then that should perform like a 27 CU RDNA2 card. And hey presto, wouldn't you know it, the PS5 has a 36 CU 2.2GHz GPU, giving us a bit more than 2080 performance. Presumably around 10-15% faster being optimistic, though 25% may not be out of the question in one or two titles.

 

What's 20-30% faster than a 2080? A 2080 Ti.

 

What else is gonna target 2080 Ti performance? The RTX 3070. So really, I hope AMD will need only 40-44 CUs to beat it, but if they don't then I'm sure they could brute-force it with 56 or 60.
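Putting that chain of reasoning into numbers (a sketch; the 60-CU-Vega ≈ 40-CU-RDNA1 equivalence is from above, and assuming RDNA2 repeats the same per-CU gain is exactly the leap of faith in question):

```python
# Per-CU gain from Vega to RDNA1: a 60 CU card ~ a 40 CU card, so ~1.5x.
PER_CU_GAIN = 60 / 40

# Big assumption: RDNA1 -> RDNA2 repeats that per-CU gain.
rdna2_cus_for_5700xt = 40 / PER_CU_GAIN  # ~27 CUs of RDNA2 ~ 5700 XT perf

ps5_cus = 36
ratio = ps5_cus / rdna2_cus_for_5700xt
print(f"RDNA2 CUs for 5700 XT-class perf: ~{rdna2_cus_for_5700xt:.0f}")
print(f"PS5's 36 CUs vs that baseline:    ~{ratio:.2f}x")
# Even before the PS5's higher clocks, ~1.33x a 5700 XT lands around
# 2080-and-beyond territory, which is why 40-44 RDNA2 CUs taking on the
# RTX 3070 (a ~2080 Ti-class card) isn't a crazy target.
```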

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022),

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


8 hours ago, leadeater said:

5700 XT vs 6700 XT doesn't look all that great on paper; hopefully it's a bit better for cheaper, but it's still odd that it drops VRAM and thus bandwidth too.

Yeah, I don't get it; why would you downgrade the RAM, especially now? I hope this is going to be a $170-210 card to replace the 5500 XT, because if it's like $300 then I'm kinda mad at AMD...


I have a really bad feeling that Nvidia will destroy AMD by having so many options: 3060, 3070, 3070 Ti, 3080 10GB, 3080 20GB and 3090. I know the Ti cards are rumors, but we know Nvidia doesn't drop prices; they just replace the card or slot in another price point.


1 hour ago, AluminiumTech said:

I would tend to agree, but MLID has commented on G6 vs G6X.

MLID has pretty much never been correct, so why does it matter what he says?

He is about as accurate as WCCFTech in his "leaks" and predictions.

 

You just have to go and look at his Ampere "leak" video to see how many things he got wrong. Even if you buy into the whole conspiracy theory that Nvidia deliberately sent out incorrect info and MLID's insider source accidentally leaked that incorrect info, it still means MLID's sources have a terrible track record, and therefore we can't trust MLID either.


Hope it's not an over-90°C card when running the latest games, because if I wanted a jet engine in my room I'd get a console.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too, some say</>


2.5GHz isn't that far-fetched considering the PS5's GPU reaches 2.23GHz, and that's most likely power and thermally limited. I bet if you give it enough power and a decent cooler, it should hit it. The base clocks look wrong to me, though.
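In raw numbers the gap is small (trivial check):

```python
ps5, leak = 2.23, 2.50  # GHz
print(f"Leaked boost is only {leak / ps5 - 1:.0%} above the PS5's sustained clock")
# ~12% - plausible with desktop power limits and a proper cooler.
```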

