
Benchmarking game performance with second-monitor content

Hi,

 

I'm a long-time YouTube viewer, first-time forum poster.

(I dunno if I'm posting in the right place, feel free to correct me if I'm wrong)

 

Lately I've been having a particular issue which made me think about the benchmarks done when reviewing CPUs and GPUs. Nowadays, some of us (or a lot of us) have two monitors, not necessarily fancy ones, but still more than one. And while we game, we're using that second monitor for YouTube, maybe music, maybe Twitch, or anything really. But that second monitor costs precious GPU resources, and sometimes the GPU might reach its limits.

 

I used to have a 1440p main monitor and a 1080p secondary monitor. My 1440p died on me, so I bought a 4K monitor. But now Opera dies quite often on YouTube whenever I play games! "Out of VRAM," it says. Task Manager tells the same story: my VRAM is maxed out all the time. That never happened to me before, and I'd never really thought about it. So I thought, maybe, could we have some actual benchmarks? How much VRAM would we need? Could we plug the second monitor into a good old GPU gathering dust and make sure the browser runs on the old GPU? Does it make a difference? Does it make things worse? Do FPS drop significantly just from having YouTube running, or is it mostly fine? Is 12GB of VRAM on brand new midrange GPUs actually enough for multi-monitor content?

 

There are tons of questions I'm wondering about now that I'm facing these issues, and I think the community would be interested in a deep dive into this sort of thing, because more and more people are adding a second monitor, and before we know it, our precious computing resources fade away into our secondary content viewing. The LTT video about using an Arc GPU as a secondary GPU for streaming also made me wonder about the possibility of having it run video content while the main GPU runs games.

 

What do you guys think? I'm curious whether y'all would be interested in seeing this sort of benchmarking, because let's be real: none of us play games or do anything else with nothing running in the background.


You can fix your issue by disabling hardware acceleration in Opera.

PLEASE QUOTE ME IF YOU ARE REPLYING TO ME

Desktop Build: Ryzen 7 2700X @ 4.0GHz, AsRock Fatal1ty X370 Professional Gaming, 48GB Corsair DDR4 @ 3000MHz, RX5700 XT 8GB Sapphire Nitro+, Benq XL2730 1440p 144Hz FS

Retro Build: Intel Pentium III @ 500 MHz, Dell Optiplex G1 Full AT Tower, 768MB SDRAM @ 133MHz, Integrated Graphics, Generic 1024x768 60Hz Monitor


 


1 hour ago, Ymaldor said:

But that second monitor costs precious GPU resources

No, it doesn't.

Your issue of running out of VRAM doesn't have anything to do with the resources needed to run a second 2D desktop, which only takes a few dozen megabytes of VRAM to display. Driving a second monitor isn't hard for a GPU; they've been doing it for decades.
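To put rough numbers on that, here's a back-of-the-envelope sketch of the desktop framebuffer size per monitor, assuming 32-bit colour and ignoring the extra buffers a compositor keeps (so real usage is a small multiple of these figures, but still nowhere near gigabytes):

```python
# Rough framebuffer size per monitor at 32 bits (4 bytes) per pixel.
# Compositors double/triple buffer and cache window surfaces, so real
# desktop usage is a small multiple of this -- still far below gigabytes.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (width, height) in resolutions.items():
    size_mib = width * height * 4 / (1024 ** 2)
    print(f"{name}: {size_mib:.1f} MiB per buffer")

# 1080p: 7.9 MiB, 1440p: 14.1 MiB, 4K: 31.6 MiB
```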

 

1 hour ago, Ymaldor said:

Task Manager tells the same story: my VRAM is maxed out all the time.

Stop using Task Manager to monitor VRAM usage.
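If you want a number you can trust more than Task Manager's, the NVIDIA driver ships with nvidia-smi, which reports what's actually allocated on the card. A minimal sketch of calling it from Python (the query fields used here are standard nvidia-smi options):

```python
import subprocess

# One-shot VRAM readout via nvidia-smi, which installs with the NVIDIA driver.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,memory.used,memory.free",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
# Prints one line per GPU: <name>, <total MiB>, <used MiB>, <free MiB>
print(result.stdout.strip())
```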

1 hour ago, Ymaldor said:

Opera dies quite often on YouTube whenever I play games

Does it happen with a normal browser?


I read a bunch about this a while back because I was curious, and every article and benchmark I found showed that while having a Twitch/YouTube video open alongside your game does impact your FPS, the impact was typically negligible: about 5 FPS for testers who were running at over 100 FPS.

 

It should also be noted that none of the tests showed the second monitor itself costing any extra resources; it was just the act of multitasking. In other words, the FPS results would be identical even on a single-monitor setup with the same YouTube video playing in the background.

 

My point in all of this is just that I don't think there's a meaningful impact for most graphics cards, so the results wouldn't vary enough to make testing for this regularly very interesting.


34 minutes ago, emosun said:

Does it happen with a normal browser?

Ouch burn... lol

 

Actually, as a long-time Opera user, I'd suggest Firefox. Or Waterfox. Either one is better.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


7 hours ago, Ymaldor said:

My 1440p died on me, so I bought a 4K monitor.

I think I may have solved this conundrum.

 

What GPU do you have?


1 hour ago, Arika S said:

What GPU do you have?

Just about to ask the same thing

 With all the Trolls, Try Hards, Noobs and Weirdos around here you'd think i'd find SOMEWHERE to fit in!


I personally see no reason to have content on the second screen unless you're trying to load the hardware up. Everything being displayed, as well as the processes behind it, operates out of one pool of resources, so anything you're doing aside from the benchmark will contaminate your results. Also, benchmarks are generally designed to fully load the graphics hardware and aren't intended to share it with anything else. Save your second-screen content for when you actually need it.

 

Speaking of which, are there tests that allow testing both screens of a dual-screen setup simultaneously?

 

...though if you wanted to know the correct board for this discussion, it's out under Hardware, in Graphics Cards.


Hi all, sorry for the delayed response. I see the topic isn't as broad as I thought it'd be ^^

 

On 10/15/2022 at 1:40 AM, SimplyChunk said:

Just about to ask the same thing

On 10/14/2022 at 11:53 PM, Arika S said:

I think I may have solved this conundrum.

 

What GPU do you have?

I have an i9-12900K with an RTX 2070 (I do intend to upgrade; I'm waiting on releases from both Nvidia and AMD at the moment).

My card runs everything I want smoothly enough (sure, 4K isn't a perfect 60 FPS, but I'm not picky; if it runs, it runs).

I've been wanting to upgrade for a while, but as y'all know, with graphics card prices being what they were, I started with the CPU first, about 8 months ago.

On 10/14/2022 at 6:11 PM, Sarra said:

Ouch burn... lol

 

Actually, as a long-time Opera user, I'd suggest Firefox. Or Waterfox. Either one is better.

I use the Opera GX thing, which runs perfectly fine; I actually never had any issue before I got my 4K monitor. I've no intention of using Chrome, and as for Firefox, well, why not I guess, but since it worked fine before, I assume it's just a lack of resources to make it all work. And the VRAM seemed to be out of whack.

 

I mostly play Path of Exile, and maybe some of you know how CPU-intensive that game is, and how hard it is on VRAM too when you play heavy maps. I'd like to be able to play the game while checking out YouTube without anything ever slowing down, which was never a problem with the 1440p screen.

 

What I'm mostly worried about is: if I get myself, let's say, an RTX 4070 or something along those lines with, let's assume, 12GB of VRAM, will that be enough to go to town with both the game and second-monitor content with no issues? Someone suggested deactivating hardware acceleration. Sure, but that option is there for a reason, no? I'm pretty sure I enabled it at some point myself, but without it the CPU does that work instead, right? Meaning if my game is CPU-intensive, the browser will just lag out. In other words, hardware acceleration = load off the CPU? Maybe y'all can help me understand the pros and cons of having it on or off, depending on whether the game is CPU- or GPU-heavy while the browser runs alongside.

 

Regards,

Ymaldor


1 hour ago, Ymaldor said:

I use the Opera GX thing, which runs perfectly fine; I actually never had any issue before I got my 4K monitor. I've no intention of using Chrome, and as for Firefox, well, why not I guess, but since it worked fine before, I assume it's just a lack of resources to make it all work. And the VRAM seemed to be out of whack.

Having used the non-GX version of Opera for 5-6 years now, and Firefox on and off for the last 15 years, I will say that overall, Firefox is a better experience.

 

1 hour ago, Ymaldor said:

I mostly play Path of Exile, and maybe some of you know how CPU-intensive that game is, and how hard it is on VRAM too when you play heavy maps. I'd like to be able to play the game while checking out YouTube without anything ever slowing down, which was never a problem with the 1440p screen.

Your game is potentially using your entire VRAM at 4K, whereas at 1440p it might not have been. The best suggestion I can come up with is to drop your game settings a little and watch your VRAM usage while gaming. If you can free up a bit of VRAM, you might be able to keep Opera from crashing.
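One way to actually watch it, since you're on an NVIDIA card, is a small logging loop around nvidia-smi so you can see exactly where VRAM tops out during a play session. A rough sketch (the output file name and sampling interval are arbitrary choices):

```python
import subprocess
import time
from datetime import datetime

# Log total VRAM usage every few seconds while you play; Ctrl+C to stop.
# nvidia-smi ships with the NVIDIA driver; memory.used is reported in MiB.
LOG_FILE = "vram_log.csv"      # arbitrary output path
INTERVAL_SECONDS = 5           # arbitrary sampling interval

with open(LOG_FILE, "w") as log:
    log.write("timestamp,vram_used_mib\n")
    while True:
        used = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        log.write(f"{datetime.now().isoformat()},{used}\n")
        log.flush()
        time.sleep(INTERVAL_SECONDS)
```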

 

1 hour ago, Ymaldor said:

Someone suggested deactivating hardware acceleration.

Hardware acceleration is an absolute nightmare, regardless of browser. It causes issues both in websites and in general usage. When I install a browser, Opera, Firefox, Waterfox, Edge, Internet Explorer, Netscape Navigator 3.01, it doesn't matter which, on Linux, Windows, even Mac OS or iOS, the FIRST thing I do is find that ridiculous tick box for hardware acceleration and make damn sure it's not ticked.

 

Turning hardware acceleration off is essential, IMO, for a good browsing experience. I was shocked at how much better Firefox ran with it turned off in Kubuntu; it blew my mind.

 

I cannot guarantee that it will fix your specific problem, but in general, unless you're actually gaming in a browser, it's entirely unnecessary, so just turn it off. o.x
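If you want to A/B test this without digging through settings each time, Opera GX is Chromium-based, so it should accept the standard Chromium switches; --disable-gpu turns GPU acceleration off for that launch only. A sketch, assuming a placeholder install path (point it at wherever opera.exe actually lives on your machine):

```python
import subprocess

# Launch a Chromium-based browser with GPU hardware acceleration disabled
# for this session only (--disable-gpu is a standard Chromium switch).
# The path below is a placeholder -- replace it with your real Opera GX path.
OPERA_GX = r"C:\path\to\opera.exe"

subprocess.Popen([OPERA_GX, "--disable-gpu"])
```

Compare how the game and a YouTube video behave against a normal launch to see which side, CPU or GPU, the load actually moves to.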

1 hour ago, Ymaldor said:

What I'm mostly worried about is: if I get myself, let's say, an RTX 4070 or something along those lines with, let's assume, 12GB of VRAM, will that be enough to go to town with both the game and second-monitor content with no issues?

I've used single, dual, triple, and quad monitors in the past, and I've gamed on each of those configurations. Today, I would trade a quad-monitor setup with a single PC for a dual-monitor setup with two PCs every time. I currently have 3 monitors and two PCs. For gaming and watching YouTube, I find that some games will die when alt-tabbed, or even lock up the system, so having a separate PC for YT is great: just pause your game, faceroll your second keyboard, then unpause your game and keep on your way. To each their own, I know not everyone can afford two PCs, but personally... I would get a Mini PC just for YT if I had no other option. Which is actually one of my plans. :3

 

If you do look at a card with more VRAM, for your case, two displays and one PC, I would actually suggest getting a 16GB card: a 6800 XT, 6900 XT, or 7000-series GPU. You would have more headroom for better gaming performance while not losing your YT on the side so frequently. But I know a lot of people are opposed to the 6x00 series, so whatever, just... oof. The price for the 4000 series is really painful. You might also be waiting a while for a 4070 to launch, and what if it still only has 8GB of VRAM?

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


43 minutes ago, Sarra said:

If you do look at a card with more VRAM, for your case, two displays and one PC, I would actually suggest getting a 16GB card: a 6800 XT, 6900 XT, or 7000-series GPU. You would have more headroom for better gaming performance while not losing your YT on the side so frequently. But I know a lot of people are opposed to the 6x00 series, so whatever, just... oof. The price for the 4000 series is really painful. You might also be waiting a while for a 4070 to launch, and what if it still only has 8GB of VRAM?

I've only ever had issues with AMD GPUs, so even though I'm very aware it's just a feeling and not objective at all, I'm a bit wary of getting AMD again. Pricing doesn't bother me; I have my budget and it either fits or it doesn't, so $/perf isn't that relevant to me at this sort of price range.

 

If the 4070 is 8GB, well, I dunno. I've wondered about the second-GPU possibility; back in the day some people actually did this sort of thing. Pretty sure even LTT did some videos a long time ago about having a cheaper secondary GPU to run PhysX or something. So I'm wondering: what if I put both the 2070 and the 4070 in my rig, plug the second monitor into the 2070, and set everything up so the browser and such all run on the 2070 instead of the 4070? I've no clue if this sort of thing works or is just a cesspool of potential issues. Honestly, it's not going to cost me anything to try, even just for the fun of having two different GPUs installed.

 

I'm mostly hoping these twats making GPUs don't decide we only need 8GB; it'd be dumb af. If they do, well, I'm going to go for the rebranded 4080 12GB, whatever name they give it when they inevitably bring it back. I wonder how much that'll be, though. It says $900, but then the 4090 says "$1,599" while in Europe it's anywhere between 2,000 and 4,000€, which is nuts. If the 12GB ends up around 1,200€, I'll take it, but I'm still going to wait for AMD to potentially impact prices with their release.

