
NVIDIA 'Ada Lovelace' GeForce RTX 4090 to have rumored 2520MHz Boost Clocks and over 2750MHz Max Clocks - more than 90 TFLOPs single-precision compute

BiG StroOnZ

Summary

According to leaker "kopite7kimi", the NVIDIA RTX 4090's base, boost, and maximum clock specifications have now apparently surfaced: a 2235MHz base clock, a 2520MHz boost clock, and a 2750MHz 'actual' (in-game) clock.

 

Quotes

Quote

This is indeed a noticeable upgrade over the Ampere series:

  • 60% Increase In Base Clock (1395MHz RTX 3090 compared to 2235MHz RTX 4090)
  • 49% Increase In Boost Clock (1695MHz RTX 3090 compared to 2520MHz RTX 4090)
  • 31% Increase In Max Clocks (2100MHz RTX 3090 compared to 2750MHz RTX 4090)

Given that the RTX 4090 has a rumored 16,384 CUDA core configuration, this means that at the said ‘actual’ clock speed of 2750MHz the card’s compute performance could reach 90 TFLOPS in single-precision workloads. That’s almost exactly twice as high as the non-Ti RTX 3090.

 

Kopite also reiterates the previously shared specs of the RTX 4080 and RTX 4070 SKUs. 

 

NVIDIA is now expected to unveil its RTX 40 series around September or October this year. Initially only these three models are to be launched, with the mid-range RTX 4060 coming next year around CES 2023.

 

My thoughts

Given these numbers, one should expect a full-fat AD102 GPU (18,432 CUDA cores) with similar clocks to break the 100 TFLOP figure, making previous rumors much more plausible. Also, if expected max clocks reach 2750MHz, it's quite conceivable that AIB partner non-reference designs could reach clocks of 2800MHz and higher; maybe we'll even see GPU clocks reach 3.0GHz! Rumors currently have the RTX 4090 launching between September/October, the RTX 4080 between October/November, and the RTX 4070 between November/December. Therefore, we have quite a ways to go before we get confirmed specs, but we should continue to get better leaks as we approach Fall/Autumn 2022.
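For anyone who wants to sanity-check those figures, here's a quick back-of-the-envelope sketch in Python. It assumes the rumored core counts and clocks above plus the usual 2 FP32 FLOPs per CUDA core per clock (FMA); none of it is confirmed.

```python
# Rough FP32 throughput check based on the rumored specs above.
# FP32 TFLOPS = CUDA cores * 2 FLOPs per clock (FMA) * clock (GHz) / 1000
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    return cuda_cores * 2 * (clock_mhz / 1000) / 1000

# Rumored RTX 4090: 16,384 cores at the ~2750MHz "actual" clock
print(f"RTX 4090 @ 2750MHz:   {fp32_tflops(16384, 2750):.1f} TFLOPS")  # ~90.1

# Hypothetical full-fat AD102: 18,432 cores at similar clocks
print(f"Full AD102 @ 2750MHz: {fp32_tflops(18432, 2750):.1f} TFLOPS")  # ~101.4

# RTX 3090 (10,496 cores) at its typical ~2100MHz max clock, for comparison
print(f"RTX 3090 @ 2100MHz:   {fp32_tflops(10496, 2100):.1f} TFLOPS")  # ~44.1
```

That lines up with the ~90 TFLOPS quoted above, puts a full AD102 just past the 100 TFLOP mark, and works out to roughly double the 3090.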

 

Sources

https://www.tweaktown.com/news/87210/the-latest-geforce-rtx-4090-rumored-specs-crazy-2-75ghz-gpu-clocks/index.html

https://videocardz.com/newz/nvidia-geforce-rtx-4090-to-feature-2520-mhz-boost-clock-almost-50-higher-than-rtx-3090

Spoiler

Mentioned on 09/30/20 in TechLinked @ (1:54)
Mentioned on 10/07/20 in TechLinked @ (1:59)
Mentioned on 10/16/20 in TechLinked @ (4:06)
Mentioned on 10/21/20 in TechLinked @ (1:22)
Mentioned on 12/30/20 in TechLinked @ (0:14)
Mentioned on 12/30/20 in TechLinked @ (2:17)
Mentioned on 05/18/22 in TechLinked @ (4:25)
Mentioned on 05/20/22 in TechLinked @ (0:13)
Mentioned on 05/25/22 in TechLinked @ (5:24)
Mentioned on 06/08/22 in TechLinked @ (1:20)
Mentioned on 06/20/22 in TechLinked @ (3:54)
Mentioned on 06/27/22 in TechLinked @ (3:52)
Mentioned on 08/10/22 in TechLinked @ (3:50)
Mentioned on 08/13/22 in TechLinked @ (1:16)
Mentioned on 08/17/22 in TechLinked @ (4:12)

 

Currently Playing:

Fortnite

Path of Exile

Call of Duty: Warzone

Mirror's Edge Catalyst

 

🌕

Eye of Providence (HP OMEN 17-ck1111nr)

CPU: 12th Gen Intel Core i7-12700H 14c/20t GPU: NVIDIA GeForce RTX 3060 ~ GA106 Memory: 16GB DDR5 @4800MHz SSD: 1TB PCIe Gen4 NVMe M.2 (OS/Programs/Apps/Games) HDD1: WD Elements 4TB External (Backup/Additional Storage) Monitor: 17.3” Full HD (1080p) IPS Micro-Edge Anti-Glare Low Blue Light 144Hz Display Mouse: Arctic White Roccat Kone Pro Mouse Mat: Corsair MM350 Premium Headset: Corsair VØID Stereo Gaming Headset OS: Windows 11 Home

                                                                         


Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


I'll skip this generation. I speculate the 4000 series will be the same as the 2000 series.

  • no performance per dollar improvement
  • little performance per watt improvement
  • launched just after a crypto boom

I'm curious about the CUDA design of the 4000 series. The 3000 series had the new split INT/FLOAT execution unit. AMD has been playing around with gigantic cache dies.

Ray tracing is still gimmicky. I like the look of ray-traced reflections and global illumination in Cyberpunk, but the noise gets worse. I would like a high-end GPU to be able to do good ray tracing without artefacts.

My System: i7-8700 // Noctua NH-U9B SE2 CPU Cooler + Noctua 2X120mm 2X140mm system fans// Gigabyte Z370 HD3// 2x8GB DDR4-3000 Corsair // Asus TUF 3080 10GB // Itek Replay2.0 Modified for airflow // Corsair RM650x Gold modular // 32GB M.2 Optane + 2X120GB Kingston A400 + 2X1TB WD10EZEX 7200 RPM +1X USB3.0 External 6TB 5400RPM // Displays: LG 27GL850 IPS 1440p 144Hz + Philips 273V5LHAB TN 1080p 60Hz + Acer ka220hq TN 1080p 60Hz // https://pcpartpicker.com/list/dL42q3


Makes me wonder how good their yields are.

mY s YsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??
 HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it?
 MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


24 minutes ago, tikker said:

Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

Every time new cards are about to release, all rumours tend to sound too good to be true.

 

60% higher BASE clocks vs the RTX 3090 only sounds impressive because of how low the base clock on the 3090 is. In reality, most 3090 cards actually run at 1800MHz+ anyway, even the ones with bad coolers.

Actually, since the 1000 series, GPU Boost on NVIDIA cards has done such a great job that the base clock is basically just a meaningless number for the most part.

 

This will only be impressive if we see the same behaviour on the 4000 series and the cards boost themselves close to 3000MHz. But given the massive power increase, I believe NVIDIA has started pushing the silicon closer to its limit rather than leaving massive headroom like they did for the last three generations.


19 minutes ago, 05032-Mendicant-Bias said:

Ray tracing is still gimmicky. I like the look of ray-traced reflections and global illumination in Cyberpunk, but the noise gets worse. I would like a high-end GPU to be able to do good ray tracing without artefacts.

You're asking way too much; it's a miracle we even HAVE ray tracing, it wasn't expected this soon. I'd still argue the noise is less distracting than screen-space reflections and glowing areas where it's supposed to be dark.

Router:  Intel Celeron N5105 (pfSense) WiFi: Zyxel NWA210AX (1.44Gbit peak at 160Mhz 2x2 MIMO, ~900Mbit at 80Mhz)

Switches: Netgear MS510TXUP, Netgear MS510TXPP, Netgear GS110EMX
ISPs: Zen Full Fibre 900 (~915Mbit) + Three 5G (~500Mbit average)


Well, a 1200W PSU is going to be a necessity if you want to pair this with a high-end CPU.

hot damn nvidia 

if it was useful give it a like :) btw if you're into linux pay a visit here

 


*3000 series price immediately tanks before the 4000 series release, buy a 3000 series before the 4000 series launch, another bloody GPU shortage happens, prices skyrocket to the moon even for old GPUs, sell said 3000 series card for profit

 

Let's just hope this doesn't actually happen again ;-;


1 hour ago, tikker said:

Damn. These rumours make the 4000 series sound rather attractive. Let's hope for some reasonable pricing.

There's no way they'll be reasonably priced. My heart says to get ready for moon prices. I bet they'll be: 4070 at $899, 4080 at $1299, and 4090 at $1699.

Desktop: Ryzen 7 2700X - Kraken X62 Rev 2 - STRIX X470-I - 3200MHz 16GB Dominator Platinum - 250GB 970 Evo boot - 2x 500GB 860 Evo - 4TB HDD - XFX 5600XT - RMx 750 W 80+ Gold - Manta

Setup: Zowie XL2740 27.0" 240Hz - AOC E2460SH 24.0" - Roccat Burst Pro - Corsair K70 LUX browns - HyperX Cloud - Mackie CR5X's


Looking at those memory numbers gives me performance concerns, compared to equivalent tier Ampere. Assuming the info given is correct. 

 

Mem BW GB/s = chip Gbps * bus width / 8

 

4070 vs 3070: 360 vs 448. 20% decrease.

4080 vs 3080: 672 vs 760. 12% decrease.

4090 vs 3090: 1008 vs 936. 8% increase.

 

If I had to guess, they're targeting a certain amount of VRAM for each model, but at the cost of using higher density chips on narrower bus for the new 70/80 tier. The faster memory offsets this but not enough. The 90 tier remains the same as the maximum configuration, so you do get the clock boost.

 

We get more core potential but less BW to feed it. I hope they have implemented something which reduces that need, for example AMD's Infinity Cache.
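As a rough sketch of where those numbers come from, here's the same formula in Python. The Ampere figures use the known specs; the Ada figures assume the rumored bus widths (160/256/384-bit for the 4070/4080/4090) and the chip speeds that reproduce the numbers above, none of which is confirmed.

```python
# Memory bandwidth: BW (GB/s) = chip speed (Gbps) * bus width (bits) / 8
def mem_bw(chip_gbps: float, bus_width_bits: int) -> float:
    return chip_gbps * bus_width_bits / 8

cards = {
    "RTX 3070": (14.0, 256),  # known GDDR6 spec
    "RTX 4070": (18.0, 160),  # rumored
    "RTX 3080": (19.0, 320),  # known GDDR6X spec
    "RTX 4080": (21.0, 256),  # rumored
    "RTX 3090": (19.5, 384),  # known GDDR6X spec
    "RTX 4090": (21.0, 384),  # rumored
}

for name, (gbps, bus) in cards.items():
    print(f"{name}: {mem_bw(gbps, bus):.0f} GB/s")
```

That reproduces the 360/672/1008 GB/s versus 448/760/936 GB/s comparison above.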

TV Gaming system: Asus B560M-A, i7-11700k, Scythe Fuma 2, Corsair Vengeance Pro RGB 3200@2133 4x16GB, MSI 3070 Gaming Trio X, EVGA Supernova G2L 850W, Anidees Ai Crystal, Samsung 980 Pro 2TB, LG OLED55B9PLA 4k120 G-Sync Compatible
Streaming system: Asus X299 TUF mark 2, i9-7920X, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Gigabyte 2070, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, Crucial BX500 1TB, BenQ XL2411 1080p144 + HP LP2475w 1200p60
Gaming laptop: Lenovo Legion, 5800H, DDR4 3200C22 2x8GB, RTX 3070, SK Hynix 512 GB + Crucial P1 TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 hours ago, JKRsega said:

If they are priced well, I'll eat my left arm.

Don't worry, you'll keep both of your arms.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Beyerdynamic Amiron Home, Amiron Wireless Copper, T5p.2, Tygr 300R, DT880 600ohm Manufaktur, Fidelio X2HR, Meze 99 Classics Walnut/Gold

Earphones: Airpods 2019, Sony WF1000XM3, Sony MDR-E818LP

CPU: Intel 4770, GPU: Gigabyte Aorus GTX1080Ti, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280.


3 minutes ago, CTR640 said:

Don't worry, you'll keep both of your arms.

I don't know, I get REAL hungry sometimes...  Just not as starving as Nvidia...


1 hour ago, JKRsega said:

I don't know, I get REAL hungry sometimes...  Just not as starving as Nvidia...

*Eddie Murphy meme*

Well, why are you gonna eat your left arm when you can eat someone else's left arm?

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Beyerdynamic Amiron Home, Amiron Wireless Copper, T5p.2, Tygr 300R, DT880 600ohm Manufaktur, Fidelio X2HR, Meze 99 Classics Walnut/Gold

Earphones: Airpods 2019, Sony WF1000XM3, Sony MDR-E818LP

CPU: Intel 4770, GPU: Gigabyte Aorus GTX1080Ti, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280.


10 hours ago, tikker said:

Let's hope for some reasonable pricing good GPU's.

fixed it for ya,

but yea those numbers are doing some things to me.

that 4080 is looking very....attractive.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


15 hours ago, Alex Atkin UK said:

You're asking way too much; it's a miracle we even HAVE ray tracing, it wasn't expected this soon. I'd still argue the noise is less distracting than screen-space reflections and glowing areas where it's supposed to be dark.

Yeah, perhaps I expect too much. I do play Cyberpunk with RT and DLSS on at 1440p; I like it better than the regular pipeline.

I think the low-end models have no business sporting dedicated ray tracing cores; lower-area dies (GA103 and below) should only have tensor and shader hardware, no ray tracing. The 2060 a ray tracing card? Let's be real... And let's not talk about the ray tracing mobile chip! https://news.samsung.com/global/samsung-introduces-game-changing-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture

As far as I understand, the Tensor cores have lots of uses, both at the low end and the high end. The Ray Tracing cores instead accelerate something that is only useful in ray tracing productivity workloads or ray-traced games. I would like it better if lower-end GPUs allocated all of their die area to functions that increase FPS per dollar, which I think should be the most important factor in mid-tier cards.

My System: i7-8700 // Noctua NH-U9B SE2 CPU Cooler + Noctua 2X120mm 2X140mm system fans// Gigabyte Z370 HD3// 2x8GB DDR4-3000 Corsair // Asus TUF 3080 10GB // Itek Replay2.0 Modified for airflow // Corsair RM650x Gold modular // 32GB M.2 Optane + 2X120GB Kingston A400 + 2X1TB WD10EZEX 7200 RPM +1X USB3.0 External 6TB 5400RPM // Displays: LG 27GL850 IPS 1440p 144Hz + Philips 273V5LHAB TN 1080p 60Hz + Acer ka220hq TN 1080p 60Hz // https://pcpartpicker.com/list/dL42q3


50 minutes ago, 05032-Mendicant-Bias said:

Yeah, perhaps I expect too much. I do play Cyberpunk with RT and DLSS on at 1440p; I like it better than the regular pipeline.

I think the low-end models have no business sporting dedicated ray tracing cores; lower-area dies (GA103 and below) should only have tensor and shader hardware, no ray tracing. The 2060 a ray tracing card? Let's be real... And let's not talk about the ray tracing mobile chip! https://news.samsung.com/global/samsung-introduces-game-changing-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture

As far as I understand, the Tensor cores have lots of uses, both at the low end and the high end. The Ray Tracing cores instead accelerate something that is only useful in ray tracing productivity workloads or ray-traced games. I would like it better if lower-end GPUs allocated all of their die area to functions that increase FPS per dollar, which I think should be the most important factor in mid-tier cards.

I can kinda agree on the lower models to some extent, although even there DLSS can make up some leeway, and it's probably harder to remove the RT entirely than to just tweak the design or use chips with defective parts fused off. Plus there are some niche cases, like you might want to use the RT cores to speed up animation work on a lower-end card, where the frame rate being low is less important, so long as it's faster/better quality than what you'd have without it.

 

There's also the problem that RT needs to be a standard feature across ALL cards for it to be interesting to developers, even if it has to be cut down to work on those cards, so it does make sense to include it. It's the same when any new GPU feature is implemented: it needs to be available everywhere and be scalable. For example, if you're playing on something like a Steam Deck or a small laptop, then having to run the game at 720p, maybe even lower, is not necessarily a huge problem.

 

The problem I had with Cyberpunk is that without RT it looks a LOT worse IMO, but Performance DLSS (at least at launch) had huge issues that made it look like garbage. If I recall correctly, only Quality mode was any good, and I hit just the wrong sweet spot on a 2080 there, needing to reduce the resolution in a way that kinda offset the difference, making it still look a bit rough on a 55" screen.

The Tensor cores are indeed very useful, as I plan to get a 4080, move the 3080 to my AI upscaling box, then use the 2080 in my desktop for the odd AI image upscale, as it's painfully slow on a GTX 1650. Plus nobody seems to really be looking into using the Tensor cores for actual games beyond DLSS; I'd imagine they might be able to do something with AI, but it's tricky given AMD AFAIK has nothing comparable in their cards and there is more focus there due to consoles.

Router:  Intel Celeron N5105 (pfSense) WiFi: Zyxel NWA210AX (1.44Gbit peak at 160Mhz 2x2 MIMO, ~900Mbit at 80Mhz)

Switches: Netgear MS510TXUP, Netgear MS510TXPP, Netgear GS110EMX
ISPs: Zen Full Fibre 900 (~915Mbit) + Three 5G (~500Mbit average)


Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.


4 hours ago, Rauten said:

Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.

Only one way to find out: Wait for the official announcement.

Desktop: i9-10850K [Noctua NH-D15 Chromax.Black] | Asus ROG Strix Z490-E | G.Skill Trident Z 2x16GB 3600mhz 16-16-16-36 | Asus ROG Strix RTX 3080Ti OC | SeaSonic PRIME Ultra Gold 1000W | Samsung 970 Evo Plus 1TB | Samsung 860 Evo 2TB | CoolerMaster MasterCase H500 ARGB | Win 10

Display: Samsung Odyssey G7A (28" 4K)

 

Laptop: LenovoY70-70 | i7-4820hq | 16GB 1600mhz | GTX 960M 4GB | Linux Ubuntu 18.04


On 7/4/2022 at 12:31 PM, CTR640 said:

Don't worry, you'll keep both of your arms.

I'm sure a 4090 will cost an arm, leg, kidney, and left nut.


On 7/5/2022 at 5:01 AM, porina said:

Looking at those memory numbers gives me performance concerns, compared to equivalent tier Ampere. Assuming the info given is correct. 

 

Mem BW GB/s = chip Gbps * bus width / 8

 

4070 vs 3070: 360 vs 448. 20% decrease.

4080 vs 3080: 672 vs 760. 12% decrease.

4090 vs 3090: 1008 vs 936. 8% increase.

 

If I had to guess, they're targeting a certain amount of VRAM for each model, but at the cost of using higher density chips on narrower bus for the new 70/80 tier. The faster memory offsets this but not enough. The 90 tier remains the same as the maximum configuration, so you do get the clock boost.

 

We get more core potential but less BW to feed it. I hope they have implemented something which reduces that need, for example AMD's Infinity Cache.

The Ti models/refreshes were a bit messed up in the RTX 30 series; I suspect things are the way they are to give room for more sensible and meaningful Ti models across the product lineup.

 


 

This mess, I suspect, will get ironed out in the RTX 40 series.

 

Also, two notes: I believe the caches are getting huge increases, so they should have a similar benefit to Infinity Cache. The other note is that if Nvidia really is looking to reduce its TSMC 5nm allocation, then to me that means the RTX 40 series is not as good as rumored. If the RTX 40 series were as good as rumored, Nvidia would not be looking to reduce allocation, because literally everyone is going to want them, and I doubt a recession would change that much. RTX 30 -> RTX 40 may not be as big a jump as RTX 20 -> RTX 30; a node shrink should mean that it would be, but who knows. It's a very odd move from Nvidia, in my opinion.


39 minutes ago, StDragon said:

I'm sure a 4090 will cost an arm, leg, kidney, and left nut.

Well shit, I need them all and my right nut needs the left nut to stay a complete pair. The 4090 can go to someone else who doesn't mind missing them all. 

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Beyerdynamic Amiron Home, Amiron Wireless Copper, T5p.2, Tygr 300R, DT880 600ohm Manufaktur, Fidelio X2HR, Meze 99 Classics Walnut/Gold

Earphones: Airpods 2019, Sony WF1000XM3, Sony MDR-E818LP

CPU: Intel 4770, GPU: Gigabyte Aorus GTX1080Ti, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280.


20 hours ago, Rauten said:

Uh.

I am confused.

 

So the alleged 4080, with 10240 CUDA cores and 16GB of VRAM, has a TDP of 420W.

And the alleged 4090, with 16384 CUDA cores and 24GB VRAM, has a TDP of 450W.

 

How? What? How?  Only 30W more for 60% more cores and 50% more GDDR6X?

 

I know this guy is a well known and respected leaker but this makes no sense to me.

Because adding CUDA cores doesn't by itself increase power draw; adding CUDA cores at the same frequency does. This means the 4090 will have a decent bit lower boost clocks, if this information is true of course.
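To picture that trade-off, here's a deliberately crude toy model in Python. It assumes core power scales roughly with core count times clock at a fixed voltage and ignores memory, leakage, and voltage scaling entirely, so treat it as an illustration rather than a prediction; the core counts and TDPs are the rumored figures from the thread.

```python
# Toy model: core power ~ cores * clock at fixed voltage. Purely illustrative.
cores_4080, tdp_4080 = 10240, 420  # rumored RTX 4080
cores_4090, tdp_4090 = 16384, 450  # rumored RTX 4090

# Relative clock the 4090 could sustain vs the 4080 under this naive model:
relative_clock = (tdp_4090 / tdp_4080) / (cores_4090 / cores_4080)
print(f"4090 sustained clock vs 4080: ~{relative_clock:.2f}x")  # ~0.67x
```

In reality the gap won't be anywhere near that large, since a lot of board power goes to memory and other components and voltage/frequency curves aren't linear, but it shows why more cores in a similar power budget tends to mean lower clocks.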


1 hour ago, leadeater said:

The Ti models/refreshes were a bit messed up in the RTX 30 series; I suspect things are the way they are to give room for more sensible and meaningful Ti models across the product lineup.

I was amazed the RTX 3080 came with the GA102. Previous 80-class models all came with the third die in the stack, instead of a shaved-down second die. That's why the 3090 was so close to the 3080 in performance, with the only difference being the VRAM, which was on the small side for a high-end card.

With the 4000 series, the 80 tier goes back to using the third die, giving the 90 tier a significant bump in performance for a huge premium.

My System: i7-8700 // Noctua NH-U9B SE2 CPU Cooler + Noctua 2X120mm 2X140mm system fans// Gigabyte Z370 HD3// 2x8GB DDR4-3000 Corsair // Asus TUF 3080 10GB // Itek Replay2.0 Modified for airflow // Corsair RM650x Gold modular // 32GB M.2 Optane + 2X120GB Kingston A400 + 2X1TB WD10EZEX 7200 RPM +1X USB3.0 External 6TB 5400RPM // Displays: LG 27GL850 IPS 1440p 144Hz + Philips 273V5LHAB TN 1080p 60Hz + Acer ka220hq TN 1080p 60Hz // https://pcpartpicker.com/list/dL42q3

