
Built-in display or (possibly) faster external?

Dr. Dressing
Go to solution Solved by Takumidesh,


Alright, so picture this situation:

 

You have a 144 Hz monitor; say it has a fast response time.

- But you have a laptop, and the built-in display is directly connected to the motherboard.

 

Which is faster?

 

On one hand, we have a 60 Hz built-in monitor, sitting as close to the processing as possible.

On the other hand, we have a fast-response-time monitor, sending and receiving data over an HDMI port.

 

Does the PC "wait" before updating the external monitor?

Or can it update the fast-response monitor faster than the built-in one?
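To put numbers on the two refresh rates in question: each refresh cycle lasts 1000/Hz milliseconds, and on average a freshly rendered frame waits about half a cycle before the panel picks it up. A back-of-the-envelope sketch (idealized figures, not measurements of any specific panel):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between display refreshes, in milliseconds (e.g. 60 Hz ~= 16.7 ms)."""
    return 1000.0 / refresh_hz

def avg_wait_ms(refresh_hz: float) -> float:
    """A freshly rendered frame waits, on average, half a refresh cycle
    before the display picks it up."""
    return frame_interval_ms(refresh_hz) / 2

for hz in (60, 144):
    print(f"{hz:>3} Hz: {frame_interval_ms(hz):5.2f} ms per frame, "
          f"~{avg_wait_ms(hz):.2f} ms average wait")
```

So purely on refresh-rate grounds, the 144 Hz external panel delivers a new frame roughly every 7 ms versus every 17 ms on the 60 Hz built-in display, regardless of which connector it hangs off.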


The processor or graphics card does not determine the refresh rate or pixel response time of the display. The only difference is how fast the GPU can push frames; if it is sending frames to the display faster than the display itself can handle, the extra frames just go to waste.

 

I'm not an expert on display tech though.
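The "frames go to waste" point can be sketched with a little arithmetic: a fixed-refresh panel (no adaptive sync assumed) shows at most one frame per refresh cycle, so anything rendered beyond that is discarded or torn. A minimal illustration, not any vendor's actual pipeline:

```python
def displayed_fraction(gpu_fps: float, refresh_hz: float) -> float:
    """Fraction of rendered frames a fixed-refresh panel can actually show."""
    shown = min(gpu_fps, refresh_hz)  # the panel caps out at its refresh rate
    return shown / gpu_fps

# Example: a GPU rendering 300 fps into a 60 Hz panel -> only 1 in 5
# rendered frames ever reaches the screen.
print(f"{displayed_fraction(300, 60):.0%} of frames shown")
```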

If your question is answered, mark it so.  | It's probably just coil whine, and it is probably just fine |   LTT Movie Club!

Read the docs. If they don't exist, write them. | Professional Thread Derailer

Desktop: i7-8700K, RTX 2080, 16G 3200Mhz, EndeavourOS(host), win10 (VFIO), Fedora(VFIO)

Server: ryzen 9 5900x, GTX 970, 64G 3200Mhz, Unraid.

 


2 minutes ago, Takumidesh said:

The processor or graphics card does not determine the refresh rate or pixel response time of the display. The only difference is how fast the GPU can push frames; if it is sending frames to the display faster than the display itself can handle, the extra frames just go to waste.

 

I'm not an expert on display tech though.

But see, now, that's the thing. The graphics card (a mobile version of the GTX 1050 Ti) can easily push beyond the 60 Hz display. And we both know that a higher frame rate (not to be confused with refresh rate) will give the screen more up-to-date frames.


I've tested something similar, but I am considering buying a new monitor:

 

I hooked a 2008 TV up to my PC and ran the tests there.

I did a frame-by-frame comparison, and as far as I can tell, the TV itself is in fact incredibly slow. Despite being advertised as "75hz", it was 40 ms behind my built-in monitor.

Meaning it gave me no benefit whatsoever. I can't tell whether that was because of the old TV or because of the port.
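The frame-by-frame comparison described above boils down to simple math: film both screens with one camera, note the camera frame on which an event appears on each display, and convert the frame difference to milliseconds. A sketch with made-up frame indices and an assumed 240 fps capture rate:

```python
CAMERA_FPS = 240  # assumed capture rate; a real test would use the camera's spec

def lag_ms(builtin_frame: int, external_frame: int,
           camera_fps: int = CAMERA_FPS) -> float:
    """Latency of the external display relative to the built-in one, in ms."""
    return (external_frame - builtin_frame) * 1000.0 / camera_fps

# Hypothetical example: event visible on the built-in panel at camera frame
# 100 and on the TV at frame 110 -> roughly the ~40 ms gap described above.
print(f"TV lag: {lag_ms(100, 110):.1f} ms")
```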

 

What if we grab a new(-er) monitor?

 

 

[If anyone cares:]

Why is this important? I play competitively online, and the time really adds up after a while.

I fixed my internet, going from 40 ms of ping all the way down to 7 ms.

I fixed my mouse, going from 125 Hz to 1000 Hz polling (so that turning fast doesn't mess up the aim).

And I fixed my sound.
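For the mouse upgrade, the worst-case input delay is one polling period, which is easy to compute (idealized figures, ignoring USB and driver overhead):

```python
# Polling interval = 1000 / polling rate (in ms); this is the longest the
# mouse can sit on a movement or click before reporting it.
for hz in (125, 1000):
    print(f"{hz:>4} Hz polling -> up to {1000 / hz:g} ms between reports")
```

Going from 125 Hz to 1000 Hz cuts the worst-case report delay from 8 ms to 1 ms.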

 

So, say we get about 180 ms from me myself, 20 ms from the mouse, and 16 ms from the 60 Hz screen. That piles up to about 216 ms just to identify an enemy in no time flat.
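Summing the poster's own rough numbers (estimates, not measurements) reproduces the ~216 ms figure:

```python
# Latency budget from the paragraph above; each entry is the poster's own
# rough estimate, not a measured value.
budget_ms = {
    "human reaction": 180,
    "mouse input": 20,
    "60 Hz screen (about one frame)": 16,
}
total = sum(budget_ms.values())
print(f"total: ~{total} ms")  # matches the ~216 ms figure above
```

The takeaway is proportion: the display contributes about 16 ms of a 216 ms chain, so shaving a few milliseconds off the monitor moves the total far less than improving reaction time does.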

 

In a game, that would be incredibly slow, because we don't shoot in no time flat. We react slowly, and [as Vsauce explains] there's an 80 ms "wait" while your brain decides whether two signals, say from your hands and your eyes, happened at the same time - and that's at its fastest.


I think we are talking about two different things. I thought you were asking if the pixel response time changed on a laptop display vs. an external display.

Ultimately, the cable will add negligible latency, imperceptible to a human. It all comes down to pixel response time and refresh rate: the higher the refresh rate, the less time each frame is displayed on the monitor, giving you more information; the faster the pixel response time, the crisper the image will be, with less ghosting and smearing.

Whether the display is hooked up internally or through a DisplayPort cable won't make any difference. Like you said, the biggest bottleneck is yourself, on top of your connection, server tick rate, mouse and keyboard polling, and a myriad of other factors.

You could spend time working on your own reaction time to shave off 20 ms instead of stressing about 5 ms differences between monitors.


 

