
NVIDIA 'Ada Lovelace' GeForce RTX 4090 rumored to have 2520MHz Boost Clocks and over 2750MHz Max Clocks - more than 90 TFLOPs of single-precision compute

BiG StroOnZ

Summary

According to the leaker "kopite7kimi", base, boost, and maximum clock specifications for the NVIDIA RTX 4090 have now apparently been settled. The RTX 4090 is said to have a 2235MHz base clock and a 2520MHz boost clock, with an actual (in-game) clock of 2750MHz.

 

Quotes

Quote

This is indeed a noticeable upgrade over the Ampere series:

  • 60% Increase In Base Clock (1395MHz RTX 3090 compared to 2235MHz RTX 4090)
  • 49% Increase In Boost Clock (1695MHz RTX 3090 compared to 2520MHz RTX 4090)
  • 31% Increase In Max Clocks (2100MHz RTX 3090 compared to 2750MHz RTX 4090)

Given that the RTX 4090 has a rumored configuration of 16,384 CUDA cores, the said 'actual' clock speed of 2750MHz means the card's compute performance could reach 90 TFLOPS in single-precision workloads. That's almost exactly twice as high as the RTX 3090 non-Ti.

 

Kopite also reiterates the previously shared specs of the RTX 4080 and RTX 4070 SKUs. 

 

NVIDIA is now expected to unveil its RTX 40 series around September or October this year. Initially, only these three models are to be launched, with the mid-range RTX 4060 coming next year around CES 2023.

 

My thoughts

Given these numbers, one should expect a full-fat AD102 GPU (18,432 CUDA cores) with similar clocks to break the 100 TFLOP figure, making previous rumors much more plausible. Also, if expected max clocks reach 2750MHz, it's quite conceivable that AIB partner non-reference designs could reach clocks of 2800MHz and higher; maybe we can even see GPU clocks reach 3.0GHz! Rumors currently have the RTX 4090 launching between September/October, the RTX 4080 between October/November, and the RTX 4070 between November/December. Therefore, we have quite a ways to go before we get confirmed specs, but we should continue to get better leaks as we approach Fall/Autumn 2022.
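
For anyone who wants to sanity-check those TFLOPS claims, here is a minimal back-of-the-envelope sketch in Python. All clocks are rumors or estimates, and the ~2.1GHz figure used for the RTX 3090 comparison is an assumed typical max/in-game clock rather than an official spec; peak FP32 is simply CUDA cores x 2 FMA ops per clock x clock speed.

def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    # peak single-precision throughput: cores * 2 ops per clock * GHz gives GFLOPS, /1000 gives TFLOPS
    return cuda_cores * 2 * clock_ghz / 1000

print(fp32_tflops(16384, 2.75))  # rumored RTX 4090 at ~2750MHz    -> ~90.1 TFLOPS
print(fp32_tflops(18432, 2.75))  # hypothetical full-fat AD102     -> ~101.4 TFLOPS
print(fp32_tflops(10496, 2.10))  # RTX 3090 at an assumed ~2100MHz -> ~44.1 TFLOPS (hence "almost twice")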

 

Sources

https://www.tweaktown.com/news/87210/the-latest-geforce-rtx-4090-rumored-specs-crazy-2-75ghz-gpu-clocks/index.html

https://videocardz.com/newz/nvidia-geforce-rtx-4090-to-feature-2520-mhz-boost-clock-almost-50-higher-than-rtx-3090


Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


I'll skip this generation. I speculate the 4000 series will be the same as the 2000 series.

  • no performance per dollar improvement
  • little performance per watt improvement
  • launched just after a crypto boom

I'm curious about the CUDA design of the 4000 series. The 3000 series had the new split INT/FLOAT execution unit. AMD has been playing around with gigantic cache dies.

Ray tracing is still gimmicky. I like the look of ray-traced reflections and global illumination in Cyberpunk, but the noise gets worse. I would like for a high-end GPU to be able to do good ray tracing without artefacts.


Makes me wonder how good their yields are.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


24 minutes ago, tikker said:

Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

Every time new cards are about to release, all the rumours tend to sound too good to be true.

 

60% higher BASE clocks vs the RTX 3090 only sounds impressive because of how low the base clock on the 3090 is. In reality, most 3090 cards actually run at 1800MHz+ anyway, even the ones with bad coolers.

Actually, since the 1000 series, GPU Boost on NVIDIA cards has done such a great job that the base clock is basically just a meaningless number for the most part.

 

This will only be impressive if we see the same behaviour on the 4000 series and the cards boost themselves close to 3000MHz. But due to the massive power increase, I believe NVIDIA has started pushing the silicon closer to its limit rather than leaving massive headroom like they did for the last three generations.


19 minutes ago, 05032-Mendicant-Bias said:

Ray tracing is still gimmicky. I like the look of ray-traced reflections and global illumination in Cyberpunk, but the noise gets worse. I would like for a high-end GPU to be able to do good ray tracing without artefacts.

You're asking way too much; it's a miracle we even HAVE ray tracing, it wasn't expected this soon. I'd still argue the noise is less distracting than screen-space reflections and glowing areas where it's supposed to be dark.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


Well, a 1200W PSU is going to be a necessity if you want to pair this with a high-end CPU.

hot damn nvidia 

if it was useful give it a like :) btw if you're into Linux pay a visit here

 


*3000 series price immediately tanks before the 4000 series release, buy a 3000 series before the 4000 series launch, another bloody GPU shortage happens, prices skyrocket to the moon even for old GPUs, sell said 3000 series card for profit

 

Let's just hope this doesn't actually happen again ;-;


1 hour ago, tikker said:

Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

There's no way they'll be reasonably priced. My heart says to get ready for moon prices. I bet they'll be: 4070: $899, 4080: $1299, and 4090: $1699.

Desktop: Ryzen 7 5800X3D - Kraken X62 Rev 2 - STRIX X470-I - 3600MHz 32GB Kingston Fury - 250GB 970 Evo boot - 2x 500GB 860 Evo - 1TB P3 - 4TB HDD - RX6800 - RMx 750 W 80+ Gold - Manta - Silent Wings Pro 4's enjoyer

SetupZowie XL2740 27.0" 240hz - Roccat Burt Pro Corsair K70 LUX browns - PC38X - Mackie CR5X's

Current build on PCPartPicker

 


Looking at those memory numbers gives me performance concerns, compared to equivalent tier Ampere. Assuming the info given is correct. 

 

Mem BW GB/s = chip Gbps * bus width / 8

 

4070 vs 3070: 360 vs 448. 20% decrease.

4080 vs 3080: 672 vs 760. 12% decrease.

4090 vs 3090: 1008 vs 936. 8% increase.

 

If I had to guess, they're targeting a certain amount of VRAM for each model, but at the cost of using higher-density chips on a narrower bus for the new 70/80 tier. The faster memory offsets this, but not enough. The 90 tier remains the same maximum configuration, so you do get the benefit of the memory clock boost.

 

We get more core potential but less BW to feed it. I hope they have implemented something which reduces that need, for example AMD's Infinity Cache.
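
As a rough sanity check on those figures, here's a small sketch of that formula in Python. The Ampere chip speeds and bus widths are the official specs; the Ada chip speeds and bus widths below are just my back-calculation from the GB/s numbers in the leak, so treat them as assumptions rather than confirmed configurations.

def mem_bw_gb_s(chip_gbps: float, bus_width_bits: int) -> float:
    # bandwidth in GB/s = per-pin speed (Gbps) * bus width (bits) / 8 bits per byte
    return chip_gbps * bus_width_bits / 8

pairs = {
    "4070 vs 3070": (mem_bw_gb_s(18.0, 160), mem_bw_gb_s(14.0, 256)),   # 360 vs 448
    "4080 vs 3080": (mem_bw_gb_s(21.0, 256), mem_bw_gb_s(19.0, 320)),   # 672 vs 760
    "4090 vs 3090": (mem_bw_gb_s(21.0, 384), mem_bw_gb_s(19.5, 384)),   # 1008 vs 936
}
for name, (ada, ampere) in pairs.items():
    print(f"{name}: {ada:.0f} vs {ampere:.0f} GB/s ({(ada / ampere - 1) * 100:+.0f}%)")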

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 hours ago, JKRsega said:

If they are priced well, I'll eat my left arm.

Don't worry, you'll keep both of your arms.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


3 minutes ago, CTR640 said:

Don't worry, you'll keep both of your arms.

I don't know, I get REAL hungry sometimes...  Just not as starving as Nvidia...


1 hour ago, JKRsega said:

I don't know, I get REAL hungry sometimes...  Just not as starving as Nvidia...

*Eddie Murphy meme*

Well, why are you gonna eat your left arm when you can eat someone else's left arm?

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


10 hours ago, tikker said:

Let's hope for some reasonable pricing good GPU's.

fixed it for ya,

but yea those numbers are doing some things to me.

that 4080 is looking very....attractive.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


15 hours ago, Alex Atkin UK said:

You're asking way too much; it's a miracle we even HAVE ray tracing, it wasn't expected this soon. I'd still argue the noise is less distracting than screen-space reflections and glowing areas where it's supposed to be dark.

Yeah, perhaps I expect too much. I do play Cyberpunk with RT and DLSS on at 1440p, and I like it better than the regular pipeline.

I think the low-end models have no business sporting dedicated ray tracing cores; lower-area dies (GA103 and below) should only have tensor and shader cores, no ray tracing. The 2060 a ray tracing card? Let's be real... And let's not talk about the ray tracing mobile chip! https://news.samsung.com/global/samsung-introduces-game-changing-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture

As far as I understand, the Tensor cores have lots of uses, both at the low end and the high end. The Ray Tracing cores instead accelerate something that is only useful in ray tracing productivity workloads or ray-traced games. I would like it better if lower-end GPUs allocated all their die area to functions that increase FPS per dollar, which I think should be the most important factor in mid-tier cards.


50 minutes ago, 05032-Mendicant-Bias said:

Yeah, perhaps I expect too much. I do play Cyberpunk with RT and DLSS on at 1440p, and I like it better than the regular pipeline.

I think the low-end models have no business sporting dedicated ray tracing cores; lower-area dies (GA103 and below) should only have tensor and shader cores, no ray tracing. The 2060 a ray tracing card? Let's be real... And let's not talk about the ray tracing mobile chip! https://news.samsung.com/global/samsung-introduces-game-changing-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture

As far as I understand, the Tensor cores have lots of uses, both at the low end and the high end. The Ray Tracing cores instead accelerate something that is only useful in ray tracing productivity workloads or ray-traced games. I would like it better if lower-end GPUs allocated all their die area to functions that increase FPS per dollar, which I think should be the most important factor in mid-tier cards.

I can kinda agree on the lower models to some extent, although even there DLSS can make up some leeway, and it's probably harder to remove the RT entirely than to just tweak the design or use chips with defective parts fused off. Plus there are some niche cases, like wanting to use the RT cores to speed up animation work on a lower-end card, where the frame rate being low is less important, so long as it's faster/better quality than what you'd have without it.

 

There's also the problem that RT needs to be a standard feature across ALL cards for it to be interesting to developers, even if it has to be cut down to work on those cards. So it does make sense to include it. It's the same when any new GPU feature is implemented: it needs to be available everywhere and be scalable. For example, if you're playing on something like a Steam Deck or a small laptop, then having to run the game at 720p, maybe even lower, is not necessarily a huge problem.

 

The problem I had with Cyberpunk is that without RT it looks a LOT worse IMO, but Performance DLSS (at least at launch) had huge issues that made it look like garbage. If I recall correctly, only Quality mode was any good, and I hit just the wrong sweet spot on a 2080 there, needing to reduce the resolution, which kinda offset the difference and made it still look a bit rough on a 55" screen.

The Tensor cores are indeed very useful, as I plan to get a 4080, move the 3080 to my AI upscaling box, then use the 2080 in my desktop for the odd AI image upscaling, as it's painfully slow on a GTX 1650. Plus nobody seems to really be looking into using the Tensor cores for actual games beyond DLSS; I'd imagine they might be able to do something with AI, but it's tricky given AMD AFAIK has nothing comparable in theirs, and there is more focus there due to consoles.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.


4 hours ago, Rauten said:

Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.

Only one way to find out: Wait for the official announcement.

Desktop: i9-10850K [Noctua NH-D15 Chromax.Black] | Asus ROG Strix Z490-E | G.Skill Trident Z 2x16GB 3600Mhz 16-16-16-36 | Asus ROG Strix RTX 3080Ti OC | SeaSonic PRIME Ultra Gold 1000W | Samsung 970 Evo Plus 1TB | Samsung 860 Evo 2TB | CoolerMaster MasterCase H500 ARGB | Win 10

Display: Samsung Odyssey G7A (28" 4K 144Hz)

 

Laptop: Lenovo ThinkBook 16p Gen 4 | i7-13700H | 2x8GB 5200Mhz | RTX 4060 | Linux Mint 21.2 Cinnamon


On 7/4/2022 at 12:31 PM, CTR640 said:

Don't worry, you'll keep both of your arms.

I'm sure a 4090 will cost an arm, leg, kidney, and left nut.


On 7/5/2022 at 5:01 AM, porina said:

Looking at those memory numbers gives me performance concerns, compared to equivalent tier Ampere. Assuming the info given is correct. 

 

Mem BW GB/s = chip Gbps * bus width / 8

 

4070 vs 3070: 360 vs 448. 20% decrease.

4080 vs 3080: 672 vs 760. 12% decrease.

4090 vs 3090: 1008 vs 936. 8% increase.

 

If I had to guess, they're targeting a certain amount of VRAM for each model, but at the cost of using higher-density chips on a narrower bus for the new 70/80 tier. The faster memory offsets this, but not enough. The 90 tier remains the same maximum configuration, so you do get the benefit of the memory clock boost.

 

We get more core potential but less BW to feed it. I hope they have implemented something which reduces that need, for example AMD's Infinity Cache.

The Ti models/refreshes were a bit messed up in the RTX 30 series; I suspect things are the way they are to leave room for more sensible and meaningful Ti models across the product lineup.

 


 

This mess, I suspect, will get ironed out in the RTX 40 series.

 

Also, two notes: I believe the caches are getting huge increases, so they should have a similar benefit to Infinity Cache. The other note is that if Nvidia really is looking to reduce its TSMC 5nm allocation, then to me that means the RTX 40 series is not as good as rumored. If the RTX 40 series were as good as rumored, Nvidia would not be looking to reduce allocation, because literally everyone is going to want one, and I doubt a recession would change that much. RTX 30 -> RTX 40 may not be as big a jump as RTX 20 -> RTX 30; a node shrink should mean that it would be, but who knows. It's a very odd move from Nvidia, in my opinion.


39 minutes ago, StDragon said:

I'm sure a 4090 will cost an arm, leg, kidney, and left nut.

Well shit, I need them all and my right nut needs the left nut to stay a complete pair. The 4090 can go to someone else who doesn't mind missing them all. 

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


20 hours ago, Rauten said:

Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.

Because adding CUDA cores doesn't by itself increase power draw; adding CUDA cores at the same frequencies does. This means the 4090 will have a decent bit lower boost clocks in practice, if this information is true of course.
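
To illustrate why that can work, here's a first-order dynamic-power sketch in Python. The clock and voltage scaling numbers are purely illustrative assumptions, not leaked figures; the point is only that a ~60% wider GPU run roughly 12-13% lower on clocks and voltage lands close to the rumored 450W vs 420W ratio.

def relative_power(cores: int, rel_freq: float, rel_volt: float) -> float:
    # first-order dynamic power: P ~ cores * C * V^2 * f (capacitance folded into the baseline)
    return cores * rel_freq * rel_volt ** 2

p_4080 = relative_power(10240, 1.000, 1.000)  # rumored 4080 config, taken as the 1.0 baseline
p_4090 = relative_power(16384, 0.875, 0.875)  # assumed: ~12.5% lower clocks, voltage scaling with clock

print(f"4090 / 4080 power ratio: {p_4090 / p_4080:.2f}")  # ~1.07
print(f"rumored TDP ratio:       {450 / 420:.2f}")         # ~1.07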


1 hour ago, leadeater said:

The Ti models/refreshes were a bit messed up in the RTX 30 series; I suspect things are the way they are to leave room for more sensible and meaningful Ti models across the product lineup.

I was amazed the RTX 3080 came with the GA102. Previous 80-tier models all came with the full 3rd die, instead of the 2nd die shaved down. That's why the 3090 was so close to the 3080 in performance, with the only difference being the VRAM, which was on the small side for a high-end card.

With the 4000 series, the 80 tier goes back to using the 3rd die, giving the 90 tier a significant bump in performance for a huge premium.

