
Extra Monitors DO Hurt Your Gaming Performance

Plouffe

We've wondered for years whether adding multiple high-resolution displays to your battlestation could hurt gaming performance, so we asked the lab to take a look, and the results might surprise you.

 

Buy a Sony INZONE M9 4K 27" Monitor: https://geni.us/PwdHT63

Buy an LG Ultragear 27GL83A-B 27" Monitor: https://geni.us/eISB

Buy an ASUS VA24DQ 24" Monitor: https://geni.us/CJRM

Buy a Sceptre E249W-19203R 24" Monitor: https://geni.us/AQyb

Buy a Corsair XENEON FLEX 45" Bendable Monitor: https://geni.us/bYjt

 

Purchases made through some store links may provide some compensation to Linus Media Group.

 

 


I love how LMG links the new monitors as if I had the money to buy one

*Spends another £1,000 on a GPU*

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 | CPU Cooler: Stock

Mobo: ASRock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200MHz CL16

SSD: P5 Plus 500GB | Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

Damn this space can fit a 4090 (just kidding)


Can you please do something about Linus pointing to something on a screen and the camera operator zooming in, just to show a blurry unfocused mess before zooming back out as they try to keep up with Linus?

To be clear, I'm not blaming the camera operators; I understand it isn't easy to focus on a screen or to keep up with Linus's fast pace as he moves around. I'm not sure what the solution would be. Some options: plan the shots out in advance so the camera operators know what they'll be focusing on and can be better prepared; have Linus pause his presentation for a few seconds when he points to things on screen, giving his team the time they need for those shots (the pause can be cut in editing); cut in a separate B-roll or screen-capture shot if the primary footage ends up unusable; cut those sections from the video if they're too blurry to use; or let the camera operator say on the spot, "Sorry Linus, I didn't quite catch that, can we run it back so I can focus?"

 

 

Linus: "And then look what happens"


 

Audience: ?????


CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


Personally, I'm not really interested in the fairly uninformative "average FPS".
What about the 0.1% lows? (Or how the worst frame times get affected.)
 

That there is a performance impact isn't too surprising. Nor is its size.

However, here comes the good old question: what about if one has two GPUs?

Like one GPU for the primary monitor, and a second one for the ancillary ones.

This should logically offload at least the GPU side of the issue from one's games. And as long as one has a suitably good CPU and enough system memory bandwidth, the CPU shouldn't be meaningfully burdened by the extra load either.

However, streaming video while playing online games likely has a measurable impact on network latency. After all, the network won't care which packet contains what data, only that it should arrive at its destination at some point in time. Video and game data are simply equal in priority as far as latency goes; IP networking largely doesn't care that one technically has a much more relaxed latency budget than the other.
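
About the only lever an application has there is DSCP marking, and most consumer gear ignores it anyway, which rather proves the point. A minimal sketch (Python on Linux; the endpoint is a placeholder and the marking is purely illustrative):

```python
import socket

# Request "Expedited Forwarding" (DSCP 46) on a UDP socket -- the class a
# latency-sensitive game *could* ask for. The DSCP field is the top six
# bits of the old TOS byte, hence the shift. QoS-aware hops may honor it;
# most home routers and ISPs simply won't, so video and game packets
# end up riding as equals.
DSCP_EF = 46
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)
sock.sendto(b"game state update", ("192.0.2.10", 27015))  # placeholder endpoint
```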


4 minutes ago, Nystemy said:

Personally, I'm not really interested in the fairly uninformative "average FPS".
What about the 0.1% lows? (Or how the worst frame times get affected.)
 

 

You should look closely at the charts in the video and the labels for the different bars. 


1 minute ago, ToboRobot said:

You should look closely at the charts in the video and the labels for the different bars. 

1% low is still fairly average.
At least they have stopped using the even more meaningless 10% low.

To be fair, even 0.1% or 0.01% is somewhat meaningless as well.
To a degree it is better to just provide the longest frame times.

If you're running at 100-200 FPS on average, a 120 ms spike every 20 seconds won't meaningfully impact even the 0.1% average. However, that latency spike is a very noticeable stutter to everyone.

And having the system do more stuff in the background is a rather surefire way to ensure that the kernel will schedule away parts of the game engine at inopportune moments, effectively guaranteeing a massive spike in frame delivery times. (A wonderfully common problem back when single-core CPUs were effectively the only thing around.)
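
To put numbers on the spike claim, a quick sketch (Python, nearest-rank percentiles; note that reviewers compute their "lows" in slightly different ways): a single 120 ms hitch in a roughly 20-second capture at ~150 FPS slides right past the percentile metrics, while the worst frame time catches it.

```python
import statistics

def percentile(sorted_ms, pct):
    """Nearest-rank percentile of an ascending-sorted list of frame times."""
    idx = min(len(sorted_ms) - 1, int(pct / 100 * len(sorted_ms)))
    return sorted_ms[idx]

# 2999 frames at a steady ~6.7 ms (about 150 FPS) plus one 120 ms hitch,
# i.e. a single spike in a ~20-second capture.
frames = sorted([6.7] * 2999 + [120.0])

print(f"average:     {1000 / statistics.fmean(frames):6.1f} FPS")  # ~148 FPS
print(f"1% low:      {1000 / percentile(frames, 99.0):6.1f} FPS")  # ~149 FPS
print(f"0.1% low:    {1000 / percentile(frames, 99.9):6.1f} FPS")  # ~149 FPS
print(f"worst frame: {max(frames):6.1f} ms")                       # the 120 ms stutter
```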


So... I've noticed lately, while having a YT video open on my 2nd monitor while playing Anno 1800, that YT drops down to 360p playback quality after a while (while still being set to Auto). Makes me wonder if this is because Chrome isn't getting enough oomph from my old RTX 2070 (non-Super), as my in-game frames dip under 40 FPS late game and my PC starts to struggle to keep things smooth.

 

 

To add: normally a video's playback is fine, but it's after a while of not looking at it that I'll notice it's switched over to 360p. Audio quality remains the same, so I don't really hear the switch. Connection is symmetrical gigabit.

 

 

While this was meant to be a more generalized video, it does make me wonder if they have done this testing with, let's say, a mid-tier system, or one that's a generation old hardware-wise.

 

 

Edit: Also, maybe YT just hates me because I don't pay for the crap subscription they're pushing... (LOOKING AT YOU, ENHANCED BITRATE 1080P OPTION. ARE YOU THE ONE DOING THIS TO ME??)


If you have your iGPU enabled and plug your second monitor into it, the performance drop is nonexistent.

 

Also, this is the video description: [screenshot]

Made In Brazil 🇧🇷


Hmm, I wonder how this is working to my PC's detriment. I currently run 2 monitors on my system so that I can game on one and look at info on the other. However, when I click off the game, the frame rate on the gaming monitor drops incredibly. Would anyone be able to help me maximize the performance of this PC?
CPU: Intel i7-11700KF 3.6GHz
Motherboard: Asustek G15CE
Memory: DDR4-3200 16GB
GPU: Nvidia GeForce RTX 3080 10GB
OS: Win 11
Monitors: 2x LG 27" 4K UHD UltraFine™ IPS Monitor with HDR10


37 minutes ago, themrsbusta said:

If you have your iGPU enabled and plug your second monitor into it, the performance drop is nonexistent.

 

Also, this is the video description: [screenshot]

Fixed it. Thank you!


This is the most stupid video I've ever seen. I have 3 monitors and when I game I turn off 2 of them!


17 minutes ago, LMGcommunity said:

Fixed it. Thank you!

The 'Discuss' button seems to be gone, too:

[screenshot: no 'Discuss' button]

If you can read this you're using the wrong theme.  You can change it at the bottom.

#MuricaParrotGang

The meme thread

All of my image memes are made with GIMP.

 

My specs are crap but if you are interested:

Spoiler

 

The meme-making machine - Optiplex 780:

CPU: Intel Core 2 Duo E8400 @ 3.0 GHz

GPU: NVidia Quadro FX 580

RAM: 2 GB

SSD: Non-existent

HDD: 1 TB

OS: Windows 7

 

Laptop: HP 255 G7

CPU: Ryzen 5 3500U

GPU: Radeon Vega 8

RAM: 8 GB

SSD: 500 GB NVMe

OS: Windows 10

 


I have all my secondary monitors plugged into a GTX 750 Ti in a spare PCIe x1 slot, which avoids choppy YouTube performance while gaming at high refresh rates.


Can I request that we also run a test with secondary displays plugged into integrated graphics?

I don't have integrated graphics but I'd like to see if that does anything. I'll likely have an iGPU at SOME point. 

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 


I really wish they had mentioned applications like Wallpaper Engine, where there is essentially always something moving on my other monitors.


Is it the same for AMD? It looked like an Nvidia 4090 Founders Edition on the B-roll bench.

Can't you just use the Nvidia control panel to set browsers and non-games to run on the iGPU, or turn off hardware acceleration in the browser so it uses the overhead available on your CPU? There are likely ways around this to help bring back FPS lost to less optimized setups.
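
For what it's worth, Windows also has a per-app GPU preference (Settings > Display > Graphics) that's meant to cover exactly this. A hedged sketch of setting it programmatically, assuming the registry key that settings page writes to; the browser path is a placeholder:

```python
import winreg

# Placeholder path -- point this at whatever browser/app you want moved.
app = r"C:\Program Files\Google\Chrome\Application\chrome.exe"

# Windows stores per-app graphics preferences under this key. The value
# "GpuPreference=1;" requests the power-saving GPU (typically the iGPU),
# while "GpuPreference=2;" requests the high-performance dGPU.
key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, app, 0, winreg.REG_SZ, "GpuPreference=1;")
winreg.CloseKey(key)
```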

The best gaming PC is the PC you like to game on, how you like to game on it


I actually did have this in a niche scenario on my 2070 laptop: duplicating the screen hurt performance (only with different resolutions).

I was duplicating my laptop's screen (4K) to a portable monitor (1080p) for shared gaming (Champions of Norrath via PCSX2).

And it deeply affected the performance. But Champions of Norrath is infamously difficult to run, and I was duplicating the screen between two drastically different resolutions while emulating a game.

I am wondering if such a scenario was ever tested. I know it's incredibly niche.


Hey, can you give recommendations for monitors available in the UK and India?


@ 9:03 in the video, Linus says the ShortCircuit review of the Sony monitor would be linked "down below", but it's not linked anywhere in the description...

36 minutes ago, crism said:
@ 9:03 in the video, Linus says the ShortCircuit review of the Sony monitor would be linked "down below", but it's not linked anywhere in the description...

Just added it in.


5 hours ago, themrsbusta said:

If you have your iGPU enabled and plug your second monitor into it, the performance drop is nonexistent.

 

Also, this is the video description: [screenshot]

Not necessarily, depending on the setup. When doing this on my own desktop, for example, the iGPU only acts as a passthrough for the dGPU. You get full 3D performance regardless of monitor, but the dGPU is still handling both screens.

Though my desktop is going on 8 years old, so I'm unsure if this has changed on recent platforms.

The additional load for multi-display is most likely due to fill rate and memory bandwidth. Redrawing multiple layers of windows at 4K is probably relatively expensive too, owing to overdraw.

I wonder what the performance difference would be with four 1080p screens vs. one 4K screen split four ways.
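
The raw scanout numbers are actually identical, so any gap would come from per-display compositor overhead and overdraw rather than pixel count. Back-of-the-envelope (Python; 60 Hz assumed for illustration):

```python
def pixels_per_second(width, height, hz, screens=1):
    """Raw pixels scanned out per second for a given display setup."""
    return width * height * hz * screens

four_1080p = pixels_per_second(1920, 1080, 60, screens=4)
one_4k = pixels_per_second(3840, 2160, 60)
print(f"4x 1080p @ 60 Hz: {four_1080p:>11,} px/s")  # 497,664,000
print(f"1x 4K    @ 60 Hz: {one_4k:>11,} px/s")      # 497,664,000 -- identical
```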

My eyes see the past…

My camera lens sees the present…


Not a single mention of refresh rates...

Mismatching your refresh rates can cause bad frame timing.

I had stutters when I still used a 60 Hz 2nd monitor, and they went away when I replaced it with a 144 Hz monitor.


I read somewhere that plugging your 2nd monitor into the mobo will use your processor's integrated graphics instead of the dedicated GPU, which would then not affect the GPU's performance. Is this true? And if so, is there a setting you have to enable? Would the downside be higher temps and a possible CPU slowdown?


4 hours ago, LMGcommunity said:

Just added it in.

thank you


6 hours ago, simpsonfan409 said:

Not a single mention of refresh rates...

Mismatching your refresh rates can cause bad frame timing.

I had stutters when I still used a 60 Hz 2nd monitor, and they went away when I replaced it with a 144 Hz monitor.

Yes, I came here to post about this. Hopefully they can test this in the future.

 

I think it's very common for people to have mismatched refresh rates these days. You have a couple of old 1080p 60 Hz monitors, but then decide to treat yourself to some 144 Hz goodness and use those 60 Hz monitors as side displays. Who on earth can afford to buy similarly high-spec side monitors straight away?

 

Well, I did a few years back. I bought 2x MSI 1440p 144 Hz G-Sync gaming monitors and moved all my old monitors out. It was great, albeit 144 Hz as a side display was gross overkill lol. Unfortunately one monitor broke after 1.5 years, so I switched back to a 1080p/60 Hz display, and since then 144 Hz doesn't feel as smooth when I have the 2nd display running. When I play RL I usually disable this monitor in Windows to get the optimal 144 Hz experience.

 

But lately I've been running it at 120 Hz and that is giving me fewer issues, presumably because 120 Hz is an integer multiple of 60 Hz (quick sketch of the arithmetic below).
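
The arithmetic behind that hunch (Python; just an illustration of how often the two displays' refresh cycles line up, not a claim about how any particular compositor schedules):

```python
from math import gcd

def vblank_alignment(primary_hz, secondary_hz):
    """How often two displays' refresh cycles coincide."""
    g = gcd(primary_hz, secondary_hz)
    integer = max(primary_hz, secondary_hz) % min(primary_hz, secondary_hz) == 0
    print(f"{primary_hz} Hz + {secondary_hz} Hz: cycles align every "
          f"{1000 / g:.1f} ms (integer multiple: {integer})")

vblank_alignment(144, 60)  # align only every 83.3 ms -> irregular beat
vblank_alignment(120, 60)  # align every 16.7 ms -> every other 120 Hz frame
```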

 

Now this does sound increasingly hard to test, though, because I think a main point of interest is to also consider laptops, where you're basically stuck with whatever internal display plus a potential mix of GPU vendors (Intel, AMD, NVIDIA) and of GPU-to-display configurations (e.g. some displays run on the iGPU, some on the dGPU, some via a mux, others via PCIe image transfers). It gets complicated quickly! But I would be really interested to see if there are any differences or gains to be had here.

