RTX 4090 is a monster (Official Benchmarks)

Fasterthannothing

So Gamers Nexus said the FE competes with the AIB cards. So for watercooling, an FE would do great? Or pay 400 more for a Strix lol?

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


1 minute ago, porina said:

Keep in mind that different workloads may scale a bit differently depending on the mix of execution resources used. Still, that is an interesting chart. Roughly between 30% and 60% power target, it looks like it is power scaling. Only above 60% power target does it go into some mix of limiting other than just power.

Also keep in mind that this is a synthetic benchmark, so it should be basically the worst case in terms of performance loss. What we're seeing is a product that is basically overclocked by default. A 30% power increase for just 10% more performance? That's overclocking territory.

 

It would still have been an impressive card at a 60-70% power target and would still have had a little headroom for overclocking. They could also have just gone with 3x 8-pin (like some AIBs did) and had a ton of headroom.
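
To put rough numbers on the efficiency argument, here's a quick back-of-the-envelope sketch in Python. The wattage and performance index are made-up placeholders rather than measured 4090 figures; the only point is how much perf/W you give up when +30% power buys +10% performance.

# Rough perf-per-watt sketch with made-up placeholder numbers
# (not measured 4090 data): +30% power for +10% performance.
base_power_w = 350.0        # hypothetical "efficient" operating point
base_perf = 100.0           # arbitrary performance index at that point

boosted_power_w = base_power_w * 1.30   # +30% power target
boosted_perf = base_perf * 1.10         # +10% performance

eff_base = base_perf / base_power_w
eff_boost = boosted_perf / boosted_power_w

print(f"perf/W at base power: {eff_base:.3f}")
print(f"perf/W at +30% power: {eff_boost:.3f}")
print(f"efficiency given up:  {(1 - eff_boost / eff_base):.1%}")

With those placeholder numbers you land at roughly 15% worse perf/W at the higher power target, which is exactly the kind of trade-off overclockers used to accept manually.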

Use the quote function when answering! Mark people directly if you want an answer from them!


4 minutes ago, bowrilla said:

Also keep in mind that this is a synthetic benchmark, so it should be basically the worst case in terms of performance loss.

Not sure I follow that, but it does fall into my statement that different workloads can behave differently. If memory serves Time Spy is a DX12 showcase of sorts.

 

4 minutes ago, bowrilla said:

What we're seeing is a product that is basically overclocked by default. A 30% power increase for just 10% more performance? That's overclocking territory.

I'd take a more blunt view on overclocking: it's overclocking if the manufacturer considers it overclocking. Just because it is running well into the less efficient range that historically only overclockers explored doesn't make it overclocking. Especially since AMD pushed Intel with Ryzen, we've seen hardware running closer to its limits as standard than it did 5 or 10 years ago. Big headroom is long gone.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 minutes ago, porina said:

I'd take a more blunt view on overclocking: it's overclocking if the manufacturer considers it overclocking. Just because it is running well into the less efficient range that historically only overclockers explored doesn't make it overclocking. Especially since AMD pushed Intel with Ryzen, we've seen hardware running closer to its limits as standard than it did 5 or 10 years ago. Big headroom is long gone.

While I do understand your point here, I don't fully agree but that's ok.

 

What I do wonder, though: how much sense will RTX 50 make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with massive resource hunger run easily at 4K ultra? The "problem" with this kind of performance leap is that developers can't really aim for that class of hardware unless they intentionally leave out the majority of gamers with even slightly older hardware. It's hard to top it all at this point, and it makes me wonder what's coming next. Personally, I expect the 4090's performance level to stay in the top 10 for at least 3, if not 4, generations of GPUs. This generation might be what the GTX 10 series was years ago, with the 4090 as the Titan equivalent.

Use the quote function when answering! Mark people directly if you want an answer from them!


14 minutes ago, bowrilla said:

While I do understand your point here, I don't fully agree but that's ok.

 

What I do wonder, though: how much sense will RTX 50 make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with massive resource hunger run easily at 4K ultra?

In 2-3 years there will be more demanding graphics. There will likely be a larger % of titles not pushing it further but AAA almost always does.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


11 minutes ago, bowrilla said:

What I do wonder, though: how much sense will RTX 50 make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with massive resource hunger run easily at 4K ultra?

Maybe 8k will start to become enough of a thing by then. I'm not sure it makes much sense at monitor sizes as the pixel density of 4k is pretty decent already. People complaining about DP 2.0 might have displays needing it by then.

 

11 minutes ago, bowrilla said:

The "problem" with this kind of performance leap is that developers can't really aim for that class of hardware unless they intentionally leave out the majority of gamers with even slightly older hardware.

Maybe game devs will start to be more generous in their use of features that would be too expensive to run today? I'd expect scaling options to be present for a while yet. I need to look at the Steam Hardware Survey again and see the latest share of RT-capable GPUs, for example. Maybe not in the near future, but I do wonder when it will start to be mandatory for some games. More people will have better hardware over time.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


People are reporting that even the 5800X3D is bottlenecking the 4090 at anything below 4K. That's crazy.

We need Intel's new 24-core to test with.

Anyone seen any overclocking videos? What does it take to get 3 GHz?

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


5 minutes ago, Shzzit said:

People are reporting that even the 5800X3D is bottlenecking the 4090 at anything below 4K. That's crazy.

We need Intel's new 24-core to test with.

Anyone seen any overclocking videos? What does it take to get 3 GHz?

LN2!

I mean, soon all parts seem to require LN2. The temps are going insane, at least for CPUs.

I edit my posts more often than not


5 minutes ago, ewitte said:

In 2-3 years there will be more demanding graphics. There will likely be a larger % of titles not pushing it further but AAA almost always does.

The major question is: how well do game engines scale? Enough to include, at least at mild settings, higher-end 10-series or 20-series cards? A 40-80% performance jump is massive; after 2 or 3 of those jumps, the range even just across 3 generations becomes huge. And keep in mind: the 4090 is already CPU-limited at 1440p, and a 5070, or at least a 5080, will be on the same level. Do CPUs increase in performance at the same rate as GPUs? I don't think so. So 4K will become more and more normal, and 8K will become more interesting for high-end gaming, I assume.
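
As a quick sanity check on how those jumps compound (pure arithmetic; 40% and 80% are just the per-generation range mentioned above, not a prediction):

# How 40-80% per-generation gains compound over a few generations.
# The percentages are only the range quoted above, not a forecast.
for per_gen_gain in (0.40, 0.80):
    for generations in (2, 3):
        total = (1 + per_gen_gain) ** generations
        print(f"{per_gen_gain:.0%} per gen over {generations} gens -> {total:.2f}x overall")

So even at the low end of that range, three generations span almost a 3x gap between the slowest and fastest hardware a developer might have to target.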

 

7 minutes ago, porina said:

Maybe 8k will start to become enough of a thing by then. I'm not sure it makes much sense at monitor sizes as the pixel density of 4k is pretty decent already. People complaining about DP 2.0 might have displays needing it by then.

Yeah, I'm thinking 8K as well. But that would also require massive displays to make any sense. I mean, even 43" displays at 4K have >100 ppi, and that size is not really ideal on a desk anymore.
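
For reference, the pixel density is easy to check with the standard PPI formula (the sizes below are just common examples):

import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Common examples: 4K at 43" and 32", 8K at 43".
for w, h, d in [(3840, 2160, 43), (3840, 2160, 32), (7680, 4320, 43)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.0f} ppi')

Which backs up the point: a 43" 4K panel already sits just above 100 ppi, and 8K at the same size roughly doubles that.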

 

Interesting times

Use the quote function when answering! Mark people directly if you want an answer from them!


4 minutes ago, Shzzit said:

Anyone seen any overclocking videos? What does it take to get 3 GHz?

Jay mentioned something about 10mins of tweaking settings.

Use the quote function when answering! Mark people directly if you want an answer from them!


53 minutes ago, Stahlmann said:

I can't believe that just one generation later there is already a card that is 90-100% faster than my 3080 (10G), with an even bigger margin looking at RT. This is huge. If I weren't looking at building a house atm, I would probably get one to fully unlock my C2's 4K 120Hz potential.

 

Just started replaying Cyberpunk a week back. Seeing how this card BREEZES through 4K High / RT Ultra / DLSS Quality (especially with DLSS 3.0) looks completely insane to me (I'm sitting here at around 70-80 fps on 4K High / DLSS Quality / RT off). Don't get me wrong, it's still a great experience, but the 4090's performance capabilities in RT games are just mind-blowing imo.

You've got the AIB 3080 that caught my attention. How are the temps and performance? The 4090 is indeed a beast and a true 4K/120 GPU, but it's not an option for me lol. Otherwise it means I'd have to change my case as well as the PSU and all.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


6 minutes ago, bowrilla said:

Jay mentioned something about 10mins of tweaking settings.

Nice, so that's on air? I'm so damn excited lol.

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


4 minutes ago, CTR640 said:

You've got the AIB 3080 that caught my attention. How are the temps and performance? The 4090 is indeed a beast and a true 4K/120 GPU, but it's not an option for me lol. Otherwise it means I'd have to change my case as well as the PSU and all.

I'm thinking an FE + EK block for cheaper than a 4090 Strix lol.

And a 4090 under water is very tiny vs air lol.

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


36 minutes ago, bowrilla said:

While I do understand your point here, I don't fully agree but that's ok.

 

What I do wonder, though: how much sense will RTX 50 make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with massive resource hunger run easily at 4K ultra? The "problem" with this kind of performance leap is that developers can't really aim for that class of hardware unless they intentionally leave out the majority of gamers with even slightly older hardware. It's hard to top it all at this point, and it makes me wonder what's coming next. Personally, I expect the 4090's performance level to stay in the top 10 for at least 3, if not 4, generations of GPUs. This generation might be what the GTX 10 series was years ago, with the 4090 as the Titan equivalent.

The devs would be pissing off the majority big time, and they would also lose money. I can see Nvidia bribing devs to make games even more demanding to force the average Joes into buying their high-end GPUs.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


8 minutes ago, bowrilla said:

The major question is: how well do game engines scale? Enough to include, at least at mild settings, higher-end 10-series or 20-series cards? A 40-80% performance jump is massive; after 2 or 3 of those jumps, the range even just across 3 generations becomes huge. And keep in mind: the 4090 is already CPU-limited at 1440p, and a 5070, or at least a 5080, will be on the same level. Do CPUs increase in performance at the same rate as GPUs? I don't think so. So 4K will become more and more normal, and 8K will become more interesting for high-end gaming, I assume.

 

Yeah, I'm thinking 8K as well. But that would also require massive displays to make any sense. I mean, even 43" displays at 4K have >100 ppi, and that size is not really ideal on a desk anymore.

 

Interesting times

Does, say, upscaling to 8K on a 4K monitor produce a better picture? Or is it snake oil?

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


43 minutes ago, bowrilla said:

While I do understand your point here, I don't fully agree but that's ok.

 

What I do wonder, though: how much sense will RTX 50 make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with massive resource hunger run easily at 4K ultra? The "problem" with this kind of performance leap is that developers can't really aim for that class of hardware unless they intentionally leave out the majority of gamers with even slightly older hardware. It's hard to top it all at this point, and it makes me wonder what's coming next. Personally, I expect the 4090's performance level to stay in the top 10 for at least 3, if not 4, generations of GPUs. This generation might be what the GTX 10 series was years ago, with the 4090 as the Titan equivalent.

have a link on this?


Just now, Shzzit said:

Nice so that’s in air?  I’m so damn excited lol.

The coolers are just bonkers oversized. Apparently all testers reported temps never exceeding 70°C under load, with the fans still staying very quiet. Another German outlet published some sound measurements: under load the 4090 FE apparently stayed at around 3 sone, while 100% fan speed was >10 sone. That should tell you something about how oversized the coolers are. Sure, these are just rough values and we don't know their ambient temperatures, but still. That's extremely quiet.

Use the quote function when answering! Mark people directly if you want an answer from them!


3 minutes ago, Shzzit said:

Does, say, upscaling to 8K on a 4K monitor produce a better picture? Or is it snake oil?

I once tried GTA V at 8K on my FHD monitor and it looks damn nice, but... the difference is basically zero. 4K on 1080p looks very crisp and nice.

I also had a 4K OLED C2 and it looks great too. 8K is... well, pretty much marketing bullshit to crank up sales.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


1 minute ago, bowrilla said:

The coolers are just bonkers oversized. Apparently all testers reported temps never exceeding 70°C under load, with the fans still staying very quiet. Another German outlet published some sound measurements: under load the 4090 FE apparently stayed at around 3 sone, while 100% fan speed was >10 sone. That should tell you something about how oversized the coolers are. Sure, these are just rough values and we don't know their ambient temperatures, but still. That's extremely quiet.

Thanks mate, woot woot. A 4090 plus a 13900K is gonna be so damn awesome. Maybe some 7800 MHz DDR5 for fun lol.

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: GSkill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


3 minutes ago, Shzzit said:

Does, say, upscaling to 8K on a 4K monitor produce a better picture? Or is it snake oil?

It doesn't. Rendering at a higher resolution than your monitor can display has basically no positive effect. There are better ways to avoid jagged lines.

 

4 minutes ago, CTR640 said:

The devs would be pissing off the majority big time, and they would also lose money. I can see Nvidia bribing devs to make games even more demanding to force the average Joes into buying their high-end GPUs.

My thoughts exactly. If the game can't run at least acceptably on mainstream hardware, it's not going to be very successful.

3 minutes ago, pas008 said:

have a link on this?

Link on what? That was purely hypothetical. I mean, we do expect the 4080s to at least match the 3090, right? Probably the 12GB matching it and the 16GB exceeding it a bit. If we keep up this kind of performance jump, game devs will face some issues in terms of scalability. Many visual features will probably need to be optional, with some form of alternative to fall back to. That's a lot of work.

Use the quote function when answering! Mark people directly if you want an answer from them!


5 minutes ago, bowrilla said:

It doesn't. Rendering at a higher resolution than your monitor can display has basically no positive effect. There are better ways to avoid jagged lines.

DLSS seems to be doing a decent job, but the brute-force method is to oversample. It's just insanely computationally expensive to do right and needs bigger multipliers than those offered by DSR, for example. 4x by pixel count is the bare minimum, and you'd ideally want to go far higher than that.
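
To make the "by pixel count" part concrete, here's a tiny sketch of what those multipliers mean in render resolution for a 4K output. 2.25x and 4x exist as DSR factors; 9x is an arbitrary example beyond what DSR offers.

# Supersampling factors by total pixel count vs. the render resolution
# they imply for a 4K (3840x2160) output. Purely illustrative numbers.
out_w, out_h = 3840, 2160

for factor in (2.25, 4.0, 9.0):          # multiplier by total pixel count
    scale = factor ** 0.5                # per-axis scale
    rw, rh = round(out_w * scale), round(out_h * scale)
    print(f"{factor}x pixels -> render at {rw}x{rh} (~{rw * rh / 1e6:.0f} MP)")

4x by pixel count is "only" 2x per axis, which is why it reads as the bare minimum for brute-force oversampling.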

 

5 minutes ago, bowrilla said:

If the game can't run at least acceptably on mainstream hardware, it's not going to be very successful.

Where do you draw the line? People will upgrade over time, and at some point you have to drop support for older hardware to get the best out of newer technologies. I recall Elden Ring caused some fuss when it came out with higher-than-expected requirements, although we're looking at minimums of a 1060/RX 580. I think an interesting milestone will be when we start to see new AAA games come out that hard require RT (not remasters or special editions). I don't think we've reached 50% RT-era hardware on the Steam Hardware Survey yet, but it's getting close.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 minutes ago, porina said:

I don't think we've reached 50% RT-era hardware on the Steam Hardware Survey yet, but it's getting close.

We've barely passed the 20% mark, and the 1060 is still the most-used GPU. We need to remember that this forum doesn't represent the vast majority of gamers, but rather a small, enthusiastic niche with an above-average income.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


3 minutes ago, porina said:

DLSS seems to be doing a decent job

DLSS renders at a lower resolution and upscales. Rendering at a higher resolution would not give you a performance benefit.

 

5 minutes ago, porina said:

Where do you draw the line?

List them in order of computational power and put the number of users on the y-axis. You'll probably end up with a bell curve. You should make sure that those around the peak of the bell can run your game. Steam has the raw data for you.
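
A minimal sketch of that idea, with invented GPU shares and performance indices standing in for the real Steam data: sort by performance, accumulate user share from the top down, and read off the weakest card you can require while still covering, say, 70% of users.

# Min-spec cutoff sketch. The shares and perf indices below are invented
# placeholders, NOT real Steam Hardware Survey numbers.
gpus = [
    # (name, user_share, relative_perf)
    ("GTX 1060", 0.07, 1.0),
    ("GTX 1650", 0.06, 1.1),
    ("RTX 2060", 0.05, 1.9),
    ("RTX 3060", 0.05, 2.4),
    ("RTX 3070", 0.03, 3.4),
    ("RTX 3080", 0.02, 4.3),
]

target_coverage = 0.70  # fraction of (these) users the game should still reach
total_share = sum(share for _, share, _ in gpus)

# Walk from the fastest card down, accumulating covered users, until the
# coverage target is hit: that card becomes the minimum spec.
covered = 0.0
for name, share, perf in sorted(gpus, key=lambda g: g[2], reverse=True):
    covered += share / total_share
    if covered >= target_coverage:
        print(f"Min spec ~ {name} (perf index {perf}), covering {covered:.0%} of users")
        break

Against the real survey data you'd join each listed GPU to a perf index by hand (the tedious part), but the cutoff logic itself stays this simple.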

 

8 minutes ago, porina said:

I think an interesting milestone will be when we start to see new AAA games come out that hard require RT (not remasters or special editions).

That's a long way off. The RTX 20 series was mostly useless for RTX, with the exception of maybe the 2070 Super and above; the 2070 was already a stretch, and the 2060's RTX support was pointless. Compared to the 30 series and now the 40 series, everything but a 2080 or 2080 Ti has pointless RTX support. Not to mention all those users with AMD GPUs. So it needs to be standardized ray tracing, not just NVIDIA ray tracing. And then there's the issue with consoles. Yeah, not going to happen any time soon.

Use the quote function when answering! Mark people directly if you want an answer from them!


What I'm more interested in here is not so much the RTX 4090 data - we know it's fast now, and that's all that matters. I don't care about the price since I'll never buy it.

BUT - the upcoming 13th-gen launch will probably mean reviewers now use the 4090, meaning hopefully Zen 4 chips will get a re-test. We knew that the 6950 XT/3090 Ti were the limiting factors in a LOT of the 1080p data, so it didn't really give us good comparisons between Intel and AMD architectures. Very interested to see this retested.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


20 minutes ago, igormp said:

We've barely passed the 20% mark, and the 1060 is still the most-used GPU. We need to remember that this forum doesn't represent the vast majority of gamers, but rather a small, enthusiastic niche with an above-average income.

Looking only at Nvidia + AMD GPUs, it was already 29% in the February 2022 survey. I know that intentionally excludes Apple and Intel, so it would skew things.

 

12 minutes ago, bowrilla said:

DLSS renders at a lower resolution and upscales. Rendering at a higher resolution would not give you a performance benefit.

The question was about AA, which is part of what DLSS delivers (also in its DLAA form).

 

12 minutes ago, bowrilla said:

List them in order of computational power and put the number of users on the y-axis. You'll probably end up with a bell curve. You should make sure that those around the peak of the bell can run your game. Steam has the raw data for you.

I did manually transcribe the Feb 2022 data into a spreadsheet. It would be interesting to do that, but it would be a LOT of manual processing since I'd also have to assign a perf value to each GPU. Alternatively, I suppose you could sort each listing by usage and take the top n% of users to determine the line. Where would that line be for certain potential markets? I think that would be easier to do, so I might give it a go later.

 

I'd really like to transcribe the latest data, but I couldn't find a sane way to import it without doing it manually; I just couldn't get the webpage formatting to dump into a spreadsheet the way I wanted. If anyone can find a way to automate that, I'd love to play with the data, even historical data to show trends.
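
One possible way to automate it, as a sketch only: it assumes the requests and beautifulsoup4 packages, and the row selector is a placeholder that would need to be checked against the live page markup (parts of which may also be filled in by JavaScript).

# Sketch: dump the Steam Hardware Survey video card page into a CSV.
# Assumptions: requests + beautifulsoup4 installed, table served as plain
# HTML. The selector below is a placeholder - inspect the live page and
# adjust it; some of the page content is script-driven.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://store.steampowered.com/hwsurvey/videocard/"

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for row in soup.select("div.sub_stats_row"):  # placeholder selector
    cells = [c.get_text(strip=True) for c in row.find_all("div", recursive=False)]
    if cells:
        rows.append(cells)

with open("steam_hw_survey_gpus.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

print(f"Wrote {len(rows)} rows")

From there, joining each GPU to a hand-assigned perf value is the part that stays manual.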

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

