Gaming on 4k monitor with lower resolution

0x4D6178

Hey guys, I'm using a 4K monitor, but my PC can't handle gaming at 4K (2012 build). How do I play games on it at a lower resolution (say Full HD) without blurry menus and such?


Change the render resolution but leave the display resolution intact. Some games can do this, such as BF1 via its pixel density slider.

 

Or drop to 1440p, or to a resolution that divides evenly into 4K (1080p is exactly half of 2160p in each dimension).
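
For anyone curious how a render-scale / pixel-density option works under the hood, here's a minimal sketch in Python (the names and scale values are just illustrative, not any particular game's implementation): the 3D scene is rendered at the native resolution multiplied by the scale factor, while the HUD and menus stay at native, which is why the text stays sharp.

NATIVE = (3840, 2160)  # 4K display resolution

def render_resolution(native, scale):
    """Internal 3D render resolution for a given resolution-scale factor."""
    w, h = native
    return round(w * scale), round(h * scale)

for scale in (1.0, 0.75, 0.5):
    print(f"{scale:.2f} -> {render_resolution(NATIVE, scale)}")
# 0.50 gives (1920, 1080): the scene renders at 1080p while the UI is still
# drawn at the monitor's native 3840x2160. 1080p is also exactly half of
# 2160p on each axis, so each rendered pixel maps cleanly to a 2x2 block
# of display pixels if you scale the whole output instead.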


It's a 27-inch. Tried 1440p and it looks fine; I guess other than that there isn't much to do?


1 hour ago, 0x4D6178 said:

Hey guys, I'm using a 4K monitor, but my PC can't handle gaming at 4K (2012 build). How do I play games on it at a lower resolution (say Full HD) without blurry menus and such?

You don't. If you play at a lower resolution than your monitor's native resolution, every single frame has to be scaled before it's displayed, and whether the GPU or the monitor does that scaling, you're introducing a ridiculous amount of input lag. This is one of several reasons why no one should be buying 4K monitors for gaming yet.


1 minute ago, aithos said:

You don't. If you play at a lower resolution than your monitor's native resolution, every single frame has to be scaled before it's displayed, and whether the GPU or the monitor does that scaling, you're introducing a ridiculous amount of input lag. This is one of several reasons why no one should be buying 4K monitors for gaming yet.

Define "ridiculous."  Actually, I haven't really seen anyone do a scientific test to see just how much input lag scaling introduces.

 

The times I've done this I haven't really had a problem with input lag either.


9 minutes ago, M.Yurizaki said:

Define "ridiculous."  Actually, I haven't really seen anyone do a scientific test to see just how much input lag scaling introduces.

 

The times I've done this I haven't really had a problem with input lag either.

What you mean is that you didn't know what to look for. "Ridiculous" is more than a frame of lag, but there isn't an exact number because it varies depending on the monitor and how you choose to do it.


2 minutes ago, aithos said:

What you mean is that you didn't know what to look for. "Ridiculous" is more than a frame of lag, but there isn't an exact number because it varies depending on the monitor and how you choose to do it.

Okay. However, I'm going to contend that's a subjective matter. Some people are fine with a frame of lag; others are not. And it also depends on the time scale: one frame of lag at 144 Hz is about 7 ms, while one frame at 60 Hz is about 16 ms.

 

I'm also calling into question whether anyone has done an actual scientific study of scaling (like how TFT Central does their input lag analysis). All I can find is people saying "it generates input lag."
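
(For reference, those frame-time figures are just 1000 ms divided by the refresh rate; a quick sketch to verify them:)

# One frame of lag expressed in milliseconds at common refresh rates.
for hz in (60, 144):
    print(f"{hz} Hz: {1000 / hz:.1f} ms per frame")
# 60 Hz: 16.7 ms per frame, 144 Hz: 6.9 ms per frame -- roughly the
# ~16 ms and ~7 ms figures quoted above.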


12 minutes ago, M.Yurizaki said:

Okay. However, I'm going to contend that's a subjective matter. Some people are fine with a frame of lag; others are not. And it also depends on the time scale: one frame of lag at 144 Hz is about 7 ms, while one frame at 60 Hz is about 16 ms.

 

I'm also calling into question whether anyone has done an actual scientific study of scaling (like how TFT Central does their input lag analysis). All I can find is people saying "it generates input lag."

I'm considering a frame of lag to be 16 ms, which for a faster monitor would be two frames. As for the rest: even TFT Central isn't perfect, and they have changed how they measure lag numerous times over the years. It would be impossible to test for scaling lag because it involves too many variables; it can be monitor- or GPU-based AND it can vary depending on hardware and algorithm (i.e. brand to brand).

 

I've never met someone who played a fast-paced competitive game who was "fine" with more than 16 ms of input lag. It's horrendous, but a lot of people "miss" it because of network latency in games (ping). They blame "lag" when really it's their monitor causing the problem.


2 minutes ago, aithos said:

I'm considering a frame of lag to be 16 ms, which for a faster monitor would be two frames. As for the rest: even TFT Central isn't perfect, and they have changed how they measure lag numerous times over the years. It would be impossible to test for scaling lag because it involves too many variables; it can be monitor- or GPU-based AND it can vary depending on hardware and algorithm (i.e. brand to brand).

Which is why you start with a "gold standard" reference point and work from there.

2 minutes ago, aithos said:

I've never met someone who played a fast-paced competitive game who was "fine" with more than 16 ms of input lag. It's horrendous, but a lot of people "miss" it because of network latency in games (ping). They blame "lag" when really it's their monitor causing the problem.

And I've met people who are amazing at fast-paced games and run on "potato" hardware.

 

Anyone who's skilled enough can compensate for their hardware. Anyone who isn't skilled enough just complains their hardware sucks.


5 hours ago, M.Yurizaki said:

Which is why you start with a "gold standard" reference point and work from there.

That's what I'm saying though... there is NO way to do a gold standard. I don't think you fully grasp the number of potential variables here, so let's list a few:

- GPU - hardware differences, brand differences, chipset differences, driver differences, algorithm differences

- Monitor - panel type, PCB type, scaler algorithm, input type, resolution

 

5 hours ago, M.Yurizaki said:

And I've met people who are amazing at fast-paced games and run on "potato" hardware.

Yeah, and those same people are probably playing at 1024x768 on low settings too. Besides, you're ignoring a really important fact that I'll mention below.

5 hours ago, M.Yurizaki said:

 

Anyone who's skilled enough can compensate for their hardware. Anyone who isn't skilled enough just complains their hardware sucks.

That is completely irrelevant to the point here and not even worth mentioning. If you take two people of exactly equal skill and give one a 16 ms edge, that person will win a significantly higher share of the time (I'd put it somewhere between 70-80%). Just because you can "compensate" for bad hardware doesn't excuse bad hardware, just like not knowing about it doesn't mean it isn't there or isn't important.

 

Say you've got someone who is in the top 1% of all CSGO players in the world with 30 ms of input lag. Should that person care? You're damn right, because with 0 ms of input lag they may very well be in the top 0.01% of all CSGO players and be good enough to get picked up by a pro team and play professionally. Claiming that a flat-out performance difference "doesn't matter" is one of the most ignorant things you can possibly say.

 

You won't see a professional golfer (who isn't sponsored) say: "X brand gives me 10% better fairway accuracy and 20 more yards... but I've always done OK with Y brand so I'll stick with them." You use whatever gives you either a) the best experience or b) the best performance, and when there is a clear winner for BOTH, you use it... period.

 

Besides: there isn't any native 4K content, so what's the point? So you can upscale 1080p textures to 4K? Yeah, go ahead... I'll stick with 1440p.


7 hours ago, CoolJ S.A.S. said:

You could play AAA games maxed out (without super high AA) as far back as the Titan Black. I've never noticed input lag when trying Skyrim at 1440p (the only game I've ever played at 1440p, but I've spent at least 500 hours at that resolution). When I go to use 1440p, I leave the native resolution alone and just set it in game.

Input lag doesn't matter for single-player games; you'd *NEVER* notice. It only matters for online/multiplayer games where you're receiving information from a server and there is latency, because you're seeing things even later than your latency alone would show them. So if you're playing CSGO, the enemy on your screen appears a frame or two later than they should, and if they don't have input lag that's a HUGE advantage for them, not to mention you'll always be shooting at a position slightly off from where they actually are... meaning you're effectively shrinking the hitboxes and ruining your own registration. 15-30 ms doesn't seem like a lot, but it really is; it's a LOT more than the added latency of a wireless mouse back in the day, and those were widely considered garbage too. That's virtually doubling a really good ping, and let me tell you: when I played competitive Counter-Strike, 50 ping was considered unplayable for matches.
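
To put the "virtually doubling a really good ping" point in rough numbers (the values below are illustrative examples, not measurements), the delays simply add up:

# Back-of-the-envelope sketch: network and display delays stack.
# All numbers are hypothetical examples, not measured values.
ping_ms = 30          # a fairly good round trip to the game server
display_lag_ms = 30   # hypothetical scaling/monitor input lag
print(f"effective delay: {ping_ms + display_lag_ms} ms")
# 30 ms of display lag on top of a 30 ms ping doubles the effective delay
# before you see (and can react to) what the server sent.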


This topic is now closed to further replies.
