
AMD Radeon RX 6900 XT "Big Navi 21 XT/XL" Ostensibly Confirmed to Run @ 2.2-2.4GHz Clock Speeds + 255W (TGP) & 16GB GDDR6 (Updated)

We are just 9 days away from AMD's Radeon RX 6000 "Big Navi" reveal, and AMD appears to have something potentially groundbreaking up its sleeve: not only to challenge the GeForce RTX 3080, but possibly even the mighty GeForce RTX 3090. Patrick Schur, who has been fairly reliable with past leaks, is suggesting that the Navi 21 XT (a.k.a. the Radeon RX 6900 XT) will feature a staggering game clock of up to 2.4GHz. Details of the Navi 21 XT and XL GPUs have leaked too, including base, game, and boost clocks of both GPUs, plus the TGP (total graphics power) and memory capacity of the Navi 21 XT.

 

Quote

[Leaked spec-sheet images, including a Navi 21 vs Navi 10 comparison and a clock/TGP table]

 

For perspective, the Radeon RX 5700 XT (RDNA) has a base clock of 1.6GHz, a game clock of 1.75GHz, and a boost frequency of 1.9GHz. A 2.4GHz clock would be the highest stock speed we've seen on any consumer graphics card out of the factory. And considering that only the game clock is mentioned, we could be looking at even higher speeds for the maximum boost clock with the new RDNA 2 architecture. Based on previous rumors, the Radeon RX 6900 XT has a total of 80 CUs and 5120 stream processors. With a game clock of 2.4GHz, we're looking at around 24.5 TFLOPS of compute, compared to 9.75 TFLOPS for the Radeon RX 5700 XT. In other words, this is going to be a performance monster.
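
As a quick sanity check on those numbers, peak FP32 throughput is typically estimated as stream processors × 2 FLOPs per cycle (one fused multiply-add) × clock speed. A minimal sketch using the rumored figures (the 5120-SP count and the clocks are leak numbers, not confirmed specs):

```python
def fp32_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per SP per cycle."""
    return stream_processors * 2 * clock_ghz / 1000

# Rumored Navi 21 XT: 80 CUs x 64 SPs = 5120 SPs at a 2.4GHz game clock
print(round(fp32_tflops(5120, 2.4), 1))   # 24.6 TFLOPS (the article rounds this to ~24.5)
# Radeon RX 5700 XT: 2560 SPs at its 1.905GHz boost clock
print(round(fp32_tflops(2560, 1.905), 1)) # 9.8 TFLOPS
```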

 

Apparently, the Navi 21 XL, expected to be the lesser version of the Navi 21 GPU, will come with a base clock between 1350MHz and 1400MHz, a game clock between 1800MHz and 1900MHz, and a boost clock of up to 2100-2200MHz. The Navi 21 XT, on the other hand, is clocked higher, featuring a base clock of 1450-1500MHz, a game clock ranging from 2000MHz to 2100MHz, and a boost clock of up to 2200-2400MHz. The reference cards should sit closer to the bottom of those ranges, while AIB cards are expected to be closer to the top, which also suggests even higher boost frequencies are possible. Keep in mind that with recent AMD cards, the advertised boost clock and the actual sustained gaming frequency are two different things (hence the separate 'game clock' figure), so it's hard to say exactly how these numbers will translate into real-world behaviour.

 

The GPU is also said to feature a TGP (total graphics power) of 255W; note that this figure covers the GPU itself rather than the whole board. For comparison, Nvidia's board-level TDPs are 320W for the GeForce RTX 3080, 220W for the RTX 3070, and 350W for the RTX 3090. Navi 21 XT, which is built on an enhanced 7nm process node, has also been reported with a lower 225W figure in some write-ups, so take the exact number with a grain of salt. Finally, Schur is alleging that the Radeon RX 6900 XT is packing 16GB of GDDR6 memory, compared to 10GB of GDDR6X for the GeForce RTX 3080 and 24GB of GDDR6X for the GeForce RTX 3090.

 

Source 1: https://www.eteknix.com/amd-navi-21-xt-graphics-card-specs-leak/

Source 2: https://hothardware.com/news/amd-radeon-rx-6900-xt-big-navi-24ghz-16gb-of-ram

Source 3: https://www.techpowerup.com/273490/amd-navi-21-xt-seemingly-confirmed-to-run-at-2-3-2-4-ghz-clock-250-w

Source 4: https://www.guru3d.com/news-story/rumor-amds-navi-21-gpu-has-255-watts-tgp-and-boost-up-to-2-4-ghz.html 

Source 5: https://videocardz.com/newz/amd-navi-21-xt-to-feature-2-3-2-4-ghz-game-clock-250w-tgp-and-16-gb-gddr6-memory

Source 6: https://www.kitguru.net/components/graphic-cards/joao-silva/amd-navi-21-xt-xl-leak-suggests-an-over-2-0ghz-boost-clock/

 

 

For now we will have to wait until 10/28/2020 to see what AMD really has in store for us. From what has been seen thus far, though, AMD is looking to shake up the dGPU market with its Radeon RX 6000 family of GPUs. Additionally, besides the Navi 21 XT and XL, rumors also point to an XTX GPU with more CUs and conceivably different clock speeds. Regardless, in the meantime, take all of these leaks/rumors with a grain of salt, as there is currently nothing concrete to back up these specs.

 

Smallish update:

 

A custom AMD Radeon RX 6800XT board partner card allegedly features a 2577MHz boost clock

 

Quote

[Screenshots via Igor'sLAB: a custom Radeon RX 6800XT BIOS showing a 2577MHz boost clock and its maximum limit values]

 

According to Igor'sLAB, a 'special board partner' card based on Navi 21 features a boost clock of 2577MHz. Before we all get too excited, Igor did not mention which SKU this is (XTX, XT or XL). He did say that this is a boost clock, meaning it is not the 'actual' sustained clock speed of the graphics card; this Radeon RX 6000 model will likely stay at 2.3-2.4GHz in practice, with AMD's 'game clock' being more representative of what may be seen during gameplay. The data came from a BIOS that Igor got access to.

 

Source 7: https://videocardz.com/newz/amd-radeon-rx-6800xt-board-partner-card-allegedly-features-a-2577-mhz-boost-clock


here's to hoping they don't screw up as badly as Nvidia did this time around

hope they are more conservative with clock speeds, so people won't crash at stock

and OC headroom is always fun

 

might finally be a uno reverse situation.

-sigh- feeling like I'm being too negative lately


I'm eyeballing my wife's FreeSync / G-Sync compatible monitor and thinking I might just go team red this time around... she can have my Gsync certified monitor in trade ;) 


The wait for the announcement is getting unbearable... For the release date and the price... And for actual stocks to force used prices down.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


Hope those performance graphs vs the RTX 3080 were from one of the shit tier big navi cards.

 

Hope that big boi starts pulling bus lengths the faster it's clocked.

 

AMD pls. 7nm dream again pls.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


22 minutes ago, TetraSky said:

The wait for the announcement is getting unbearable... For the release date and the price... And for actual stocks to force used prices down.

I’m worried used cards may have serious problems depending on how this whole storage thing works out.  Numbers are great, but what about the data latency thing?

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


It'd be interesting if this is the case, but a lot of the leaks for the CPUs were wrong and I don't see this being any different. I wonder how long the cards will actually be able to keep the game clocks up for. 

1 hour ago, Bombastinator said:

I’m worried used cards may have serious problems depending on how this whole storage thing works out.  Numbers are great, but what about the data latency thing?

If you're referring to stuff like RTX I/O, we're years away from seeing that in PC games. 

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


May I ask what is the date when independent reviews of them can be published? Is it before or after they get into shops?

I've been telling my friend to wait for the past two months, and after the RTX 3000 premiere he's thinking of going for the Navi cards on launch day; I for one strongly hope they will have much better availability, and that Nvidia puts out many more of their cards too.


4 minutes ago, Loote said:

May I ask what is the date when independent reviews of them can be published? Is it before or after they get into shops?

I've been telling my friend to wait for the past two months, and after the RTX 3000 premiere he's thinking of going for the Navi cards on launch day; I for one strongly hope they will have much better availability, and that Nvidia puts out many more of their cards too.

Usually on the day they get into shops.

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


21 minutes ago, dizmo said:

It'd be interesting if this is the case, but a lot of the leaks for the CPUs were wrong and I don't see this being any different. I wonder how long the cards will actually be able to keep the game clocks up for. 

If you're referring to stuff like RTX I/O, we're years away from seeing that in PC games. 

Not ray tracing. They've got to do, I suspect, a minimum of two more GPU iterations for that. I mean the high-speed storage thing. There was an article about a Wolfenstein game going for it.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


14 minutes ago, Bombastinator said:

Not ray tracing. They've got to do, I suspect, a minimum of two more GPU iterations for that. I mean the high-speed storage thing. There was an article about a Wolfenstein game going for it.

RTX I/O mentioned by dizmo is Nvidia's implementation of DirectStorage


5ghz confirmed

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


8 minutes ago, Loote said:

RTX I/O mentioned by dizmo is Nvidia's implementation of DirectStorage

Ah. So nothing to do with ray tracing. They just stuck the branding on it.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


The memory I'd believe. But either that top clock speed or the TDP is nonsense. That kind of clock-speed increase from the previous generation requires that the TDP per CU goes up, and there are more CUs than the 5700 XT, yet it supposedly has the same TDP on a node that's only slightly more efficient. That's not happening. This is complete and utter nonsense.

 

The only way I can make sense of this is if the clock is for when the CUs are running in RT mode; depending on how that's done, it might well be that it consumes less power per CU at higher clocks compared to when running in raster mode. But those clocks at that TDP in raster mode are just flat-out impossible IMO; I don't think they could get that TDP even if they had TSMC 5nm to work with.


3 minutes ago, CarlBar said:

The memory I'd believe. But either that top clock speed or the TDP is nonsense. That kind of clock-speed increase from the previous generation requires that the TDP per CU goes up, and there are more CUs than the 5700 XT, yet it supposedly has the same TDP on a node that's only slightly more efficient. That's not happening. This is complete and utter nonsense.

And your point? This could be on a 7nm+ node.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


7 minutes ago, CarlBar said:

The memory I'd believe. But either that top clock speed or the TDP is nonsense. That kind of clock-speed increase from the previous generation requires that the TDP per CU goes up, and there are more CUs than the 5700 XT, yet it supposedly has the same TDP on a node that's only slightly more efficient. That's not happening. This is complete and utter nonsense.

 

The only way I can make sense of this is if the clock is for when the CUs are running in RT mode; depending on how that's done, it might well be that it consumes less power per CU at higher clocks compared to when running in raster mode. But those clocks at that TDP in raster mode are just flat-out impossible IMO; I don't think they could get that TDP even if they had TSMC 5nm to work with.

I'm assuming that TGP stands for Total Graphics Power, so 255W is just for the GPU die. If you add in the GDDR6, we are looking at a 300W+ card. Seems fine.
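
For what it's worth, here's a back-of-envelope sketch of how a chip-level 255W figure could end up as a 300W+ board; the per-component numbers (memory power, VRM efficiency, fans) are purely illustrative assumptions, not part of the leak:

```python
def estimate_board_power(tgp_w: float,
                         gddr6_chips: int = 8,
                         w_per_mem_chip: float = 2.5,
                         vrm_efficiency: float = 0.90,
                         fans_and_misc_w: float = 10.0) -> float:
    """Rough total board power: GPU + memory + misc, scaled up for VRM losses."""
    load_w = tgp_w + gddr6_chips * w_per_mem_chip + fans_and_misc_w
    return load_w / vrm_efficiency

print(round(estimate_board_power(255)))  # ~317W with these made-up component figures
```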


Big Navi is getting so hyped that you might as well put it alongside your Ampere card, pair them with your Zen N PCIe 4.0 DDR5 CPU and play Star Citizen with it. 

 

 

 

 

Spoiler

(don't worry about cross-vendor mGPU support, I'm sure it will be added to the roadmap through feature creep) 

 

 


Is TGP for any clock, or the base clock?

Here they say that XSX peaked at 211W https://www.eurogamer.net/articles/digitalfoundry-2020-xbox-series-x-power-consumption-and-heat-analysis

Due to binning and being a newer design, I can believe that Big Navi consumes at 2GHz what the Xbox does at 1825MHz. If you then remove power supply efficiency and other elements, and extrapolate from 52 CUs to 80*, you could fit in a 250W TGP. Now as for the boost... it's hard to imagine, but not impossible.

 

* I just assumed that of those 211W, 150W were used by the GPU (including memory) and the remaining 61W by the CPU, PSU efficiency, storage, etc., which if anything seems on the low side; then if you multiply by 1.5 it's 225W for 78 CUs (see the quick sketch below).

 

I think someone could come up with those numbers based on what we know so far. I am not advocating for this leak being true, just wondering if we can write it off so easily.
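
To make that extrapolation explicit, here is a minimal sketch of the same math; the 150W GPU share and the linear CU scaling are assumptions taken from the reasoning above, not measured values:

```python
# Series X wall-power peak (Digital Foundry) and the assumed GPU share of it
xsx_wall_power_w  = 211
assumed_gpu_share = 150   # GPU + memory share of the 211W, per the assumption above
xsx_cus           = 52
big_navi_cus      = 80

# Scale GPU power linearly with CU count (ignores clock and voltage differences)
scaled_gpu_power = assumed_gpu_share * (big_navi_cus / xsx_cus)
print(round(scaled_gpu_power))  # ~231W, in the same ballpark as the rumored 250-255W TGP
```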


https://www.igorslab.de/en/amd-radeon-rx-6000-the-actual-power-consumption-of-navi21xt-and-navi21xl-the-memory-and-the-availability-of-board-partner-card-exclusive/

 

Haven't seen above posted yet.

 

tldr:

The difference is AMD quoting chip-level power consumption vs Nvidia quoting board-level; if you account for this, they're in the same ballpark.

Sapphire and Asus get limited cards in mid-November; other companies at the end of November.

PCB design is similar to old Navi, implying 8x 2GB memory chips rather than 16x 1GB, so no major board redesign is needed (quick sanity check below).
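
A minimal sketch of that memory layout, assuming the widely rumored 256-bit bus (the bus width is an assumption, not something from Igor's article):

```python
bus_width_bits     = 256   # rumored Navi 21 memory bus width (assumption)
gddr6_chip_io_bits = 32    # each GDDR6 package exposes a 32-bit interface
chip_capacity_gb   = 2

chips = bus_width_bits // gddr6_chip_io_bits
print(f"{chips} chips x {chip_capacity_gb}GB = {chips * chip_capacity_gb}GB")
# 8 chips x 2GB = 16GB, matching the leaked capacity without a clamshell layout
```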

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


I've seen this trend a lot lately: even if it is as good as the 3080, people will choose that one over the AMD card. Even though the 3080 consumes ~70-100w more power. Last time, when it was the other way around, people were screaming that their electricity bills would be insane, that they would die of heat exposure, and that basically the apocalypse would come.


Meh, it's all rumors until confirmed by AMD. I'm tired of rumors; it's time we start getting some real news.

System Specs

  • CPU
    AMD Ryzen 7 5800X
  • Motherboard
    Gigabyte AMD X570 Aorus Master
  • RAM
    G.Skill Ripjaws 32 GBs
  • GPU
    Red Devil RX 5700XT
  • Case
    Corsair 570X
  • Storage
    Samsung SSD 860 QVO 2TB - HDD Seagate Barracuda 1TB - External Seagate HDD 8TB
  • PSU
    G.Skill RipJaws 1250 Watts
  • Keyboard
    Corsair Gaming Keyboard K55
  • Mouse
    Razer Naga Trinity
  • Operating System
    Windows 10

1 hour ago, VanayadGaming said:

Even though the 3080 consumes ~70-100w more power.

It won't; that misunderstanding comes from differences in how AMD and Nvidia state power figures. See my previous link for a better description.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


So far it's looking very good for Big Navi, though. How much they managed to improve architecture-wise is yet to be seen. We'll know soon enough.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Quote

CONFIRMED

No it isn't confirmed. It's a "leak" that might as well just be a rumor someone came up with.

Don't get suckered onto the hype train. Just sit back, relax, and wait for third-party reviews.

 

The best thing to do is expect it to be shit because then you will have the smug feeling of being right if it turns out to be shit, or be pleasantly surprised if it turns out that it's good.

If you expect it to be great then you have more to lose. Hell, even if it turns out to be decent you might get disappointed if you have set your expectations too high.

