
2560x1440p 240Hz IPS 1ms - Eve Spectrum gaming monitors, featuring LG's panels

Doobeedoo
On 2/3/2020 at 6:27 PM, RejZoR said:

How is it useless? This is IPS, something that was basically impossible before. Also, 1440p is fine, and 240Hz means you'll never have to enable V-Sync on systems that might not hit such ridiculous framerates. Or if you play super competitively and those games easily push out 500fps on top-end systems.

No need for V-Sync; these have adaptive sync.

 

< content removed >

Edited by LogicalDrm

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


3 hours ago, Doobeedoo said:

It's bad and very limited, and thus not a good thing for this topic. It compares numbers from one field to another and that's it; it holds no basis. There's much more to it than just number vs number and an "it's like..." comment.

There's still easily quite a difference between 60 and 120 though; that's well known.

 

The topic has already derailed quite a bit. All I was saying is that there's definitely a difference and an improvement with 240Hz, to those who keep saying there's almost none. Yet they themselves don't even play on or use it, but continue saying things that are factually wrong.

Analogies are like that. One could make comparisons between toasters and aspects of politics, for example (I don't know what, but I'm sure one could be found). They're ways to describe concepts. I disagree about the analogy being bad, because in this context "bad" means "not useful" rather than "wrong": an analogy is a model, and all models are by definition wrong. Sometimes they are useful, though.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


< content removed >

 

I am saying (and it has been shown in an LTT video) that the visual difference between 144Hz and 240Hz is not apparent even to professionals, though it is visible using slow motion and magnification, and that for people playing those kinds of games it is noticeable in terms of hits made on moving targets. My point is that not everyone even plays those sorts of games in the first place, for the reasons I outlined.

Edited by LogicalDrm



1 minute ago, Bombastinator said:

Analogies are like that. One could make comparisons between toasters and aspects of politics, for example (I don't know what, but I'm sure one could be found). They're ways to describe concepts. I disagree about the analogy being bad, because in this context "bad" means "not useful" rather than "wrong": an analogy is a model, and all models are by definition wrong. Sometimes they are useful, though.

It simply dilutes the topic and the points in it; this topic is not a good place for an analogy, it brings nothing to the context, it's a moot point. Using an analogy for something like this makes it a smoke screen and closes off many details actually needed to understand it. It's a waste here. You can't really apply one in every conversation.


Link to comment
Share on other sites

Link to post
Share on other sites

https://forums.blurbusters.com/viewtopic.php?t=4914

 

 

The 240Hz player has .0042 seconds between frames.
The 144Hz player has .0069 seconds between frames.
So technically the 240Hz player could see you about 2.7ms sooner. It would take the average player 200-300ms to even respond.
So realistically, no: you both appear at the same time to a human.

Additionally, you have to take into account tick rate and latency on servers. Some games have "peeker's advantage", where the peeker will see you first due to server latency; refresh rate is not the deciding factor there.
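The arithmetic above is easy to verify. A minimal sketch, assuming the idealized case where the display shows a new frame exactly every 1/Hz seconds:

```python
# Idealized frame-interval arithmetic from the post above:
# a display at H hertz presents a new frame every 1000/H milliseconds.

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between frames in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

t240 = frame_interval_ms(240)   # ~4.17 ms between frames
t144 = frame_interval_ms(144)   # ~6.94 ms between frames
advantage = t144 - t240         # ~2.78 ms worst-case head start

# A typical human reaction time dwarfs that head start.
reaction_ms = 250.0             # mid-range of the 200-300 ms figure cited
print(f"240Hz frame: {t240:.2f} ms")
print(f"144Hz frame: {t144:.2f} ms")
print(f"Head start: {advantage:.2f} ms ({advantage / reaction_ms:.1%} of a reaction)")
```

The ~2.8 ms figure is a worst case: it assumes the event lands immediately after a 144Hz frame was presented, so the 144Hz viewer waits a full frame longer.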

Ryzen 3600 4.33ghz . CM Hyper 212 Turbo. MSI X470 Gaming Plus Max. Crucial Ballistix Sport LT 3200 @ 3200 CL 15 (OC). Powercolor RX 5700XT Red Dragon. FD Meshify S2. Crucial P1 M.2 1TB. Corsair Vengeance 650W 80+ Silver.

 

Core 2 Duo 3.4ghz . WenjuFeng cooler . ASUS P5G41C-M LX . Crucial 1066mhz 3GB DDR2 . Gainward Golden Sample HD 4850 . Coolermaster Elite 430 . Seagate 160GB IDE 7200RPM . BeQuiet System Power 9 400w 80+ Bronze


Any refresh rate and frame rate advantage you may have is practically negated by the fact that the server runs on ticks at a much slower polling rate, on top of how often the game itself runs its loop. So it doesn't matter who sees whom first; it's whoever gets their shot in on a particular tick that counts. I'd need multiple hands to count the times I ran a dungeon in FFXIV where I had plenty of time to drop an invulnerability cooldown, only to die because I didn't pop it at the right tick. And no, I wasn't lagging, unless by some miracle all of those instances were from lag.

 

The only reasons I can think of why professionals use high refresh rate monitors are:

  • Shifts blame more on the skill of the player than the equipment
  • Placebo

6 minutes ago, Mira Yurizaki said:

Any refresh rate and frame rate advantage you may have is practically negated by the fact that the server runs on ticks at a much slower polling rate, on top of how often the game itself runs its loop. So it doesn't matter who sees whom first; it's whoever gets their shot in on a particular tick that counts. I'd need multiple hands to count the times I ran a dungeon in FFXIV where I had plenty of time to drop an invulnerability cooldown, only to die because I didn't pop it at the right tick. And no, I wasn't lagging, unless by some miracle all of those instances were from lag.

 

The only reasons I can think of why professionals use high refresh rate monitors are:

  • Shifts blame more on the skill of the player than the equipment
  • Placebo

Regarding tick rate, this is an interesting video:

 

I've never tried 240Hz, so I can't really comment from experience, but it would be nice anyway, at least to increase the range of VRR.


< content removed >

 

On 2/4/2020 at 1:21 AM, cesh me inside b0z said:

https://forums.blurbusters.com/viewtopic.php?t=4914

 

 

The 240Hz player has .0042 seconds between frames.
The 144Hz player has .0069 seconds between frames.
So technically the 240Hz player could see you about 2.7ms sooner. It would take the average player 200-300ms to even respond.
So realistically, no: you both appear at the same time to a human.

Additionally, you have to take into account tick rate and latency on servers. Some games have "peeker's advantage", where the peeker will see you first due to server latency; refresh rate is not the deciding factor there.

I've seen that topic long ago. First, not all panels are made the same; some are better and some worse at the same Hz. Some are not 'proper' 144Hz or 240Hz panels, so there's an issue there too. He states he can't sustain high fps, but that has nothing to do with the 240Hz monitor itself. In a number of competitive fps games you easily can; those games can run higher even on average. So obviously you want to sustain 240fps minimum on a good 240Hz panel with proper frame compliance. With a fluctuating framerate, inconsistent frametimes, and a monitor that doesn't properly reduce blur, yeah, it will feel shit.

Average player... well, a good player is more in line with a 150ms reaction, for example. That also doesn't give the full picture. It's like saying human reaction is much slower than monitor response time, therefore there's no need for it.

 

Tick rate and latency are another matter though. It depends on how the game's netcode is optimized, that's for sure. But that doesn't mean refresh rate is not important.

Edited by LogicalDrm



22 minutes ago, Mira Yurizaki said:

Any refresh rate and frame rate advantage you may have is practically negated by the fact that the server runs on ticks at a much slower polling rate, on top of how often the game itself runs its loop. So it doesn't matter who sees whom first; it's whoever gets their shot in on a particular tick that counts. I'd need multiple hands to count the times I ran a dungeon in FFXIV where I had plenty of time to drop an invulnerability cooldown, only to die because I didn't pop it at the right tick. And no, I wasn't lagging, unless by some miracle all of those instances were from lag.

 

The only reasons I can think of why professionals use high refresh rate monitors are:

  • Shifts blame more on the skill of the player than the equipment
  • Placebo

That's just not true; it doesn't work like that. By that logic there'd be no point going over 60Hz, because many games' tick rates are barely that, and we have higher pings than a monitor's response time.

Comparing MMO tick rates, which are around 20, is nowhere near an fps game. I've encountered such issues in WoW, which is a much more polished game, and with low ping.

 

I don't know how you arrived at that. It definitely eliminates the equipment as a limiting factor, which is great. Saying it's placebo is blatantly false.



23 minutes ago, Doobeedoo said:

That's just not true; it doesn't work like that. By that logic there'd be no point going over 60Hz, because many games' tick rates are barely that, and we have higher pings than a monitor's response time.

And that's exactly what I'm saying. Any extra frames rendered for the purpose of having an advantage are mitigated by the fact that the server tick rate tends to be slower. The only advantage you get is maybe being able to see something sooner, but the practical perceived difference between 60, 120, and 240 FPS becomes increasingly harder to notice. This is on top of the fact that reaction times are at least a couple of frames at 60 FPS anyway, and that something you see sooner may not be enough to positively identify what you're shooting at, except maybe in a 1v1 match.

 

The only thing that matters is who was aiming at the right spot and pulled the trigger in the time frame in which the client is supposed to send its snapshot. It doesn't matter who technically shot first; it's happening at the same time as far as the server is concerned. And even then, the server may do lag compensation and adjust the results as necessary.

 

Quote

Comparing MMO tick rates, which are around 20, is nowhere near an fps game. I've encountered such issues in WoW, which is a much more polished game, and with low ping.

The exact tick rate is beside the point: if you don't submit an action within the time slot of a server tick, your action doesn't matter, because something else got to it first.
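The tick-window argument above can be sketched in a few lines. This is a toy model with hypothetical numbers (a 64-tick server is assumed for illustration; real servers also apply lag compensation, which this ignores): actions are only resolved on tick boundaries, so two shots landing inside the same tick window are processed on the same tick, no matter who clicked a few milliseconds earlier.

```python
# Toy model of server-tick quantization (hypothetical 64-tick server).
# Client actions are resolved at the next tick boundary, so a few
# milliseconds of "seeing first" can vanish inside one tick window.

TICK_RATE_HZ = 64                 # assumed tick rate for illustration
TICK_MS = 1000.0 / TICK_RATE_HZ   # ~15.6 ms per tick

def resolving_tick(action_time_ms: float) -> int:
    """Tick on which the server processes an action sent at action_time_ms."""
    return int(action_time_ms // TICK_MS) + 1  # next tick boundary

shot_a = resolving_tick(100.0)  # player A fires at t = 100.0 ms
shot_b = resolving_tick(102.7)  # player B fires 2.7 ms later (the 240Hz "head start")

print(shot_a == shot_b)  # both shots resolve on the same tick here
```

Whether the two shots land in the same window depends on where they fall relative to the boundary, so the refresh-rate head start sometimes matters and sometimes doesn't; that is the "isolated case" nature of the argument.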

 

Quote

Saying it's placebo is blatantly false.

Placebo is a thing that can affect performance.


That 4k option looks quite nice.

 

Absolute steal too.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


13 minutes ago, Mira Yurizaki said:

And that's exactly what I'm saying. Any extra frames rendered for the purpose of having an advantage are mitigated by the fact that the server tick rate tends to be slower. The only advantage you get is maybe being able to see something sooner, but the practical perceived difference between 60, 120, and 240 FPS becomes increasingly harder to notice. This is on top of the fact that reaction times are at least a couple of frames at 60 FPS anyway, and that something you see sooner may not be enough to positively identify what you're shooting at, except maybe in a 1v1 match.

 

The only thing that matters is who was aiming at the right spot and pulled the trigger in the time frame in which the client is supposed to send its snapshot. It doesn't matter who technically shot first; it's happening at the same time as far as the server is concerned. And even then, the server may do lag compensation and adjust the results as necessary.

The server sends players info on shots and positions, but that has nothing to do with the framerate on the local machine; you still see the action more smoothly. What you're describing would be like a static server lag test with both enemies standing still, like Battlenonsense shows.

13 minutes ago, Mira Yurizaki said:

The exact tick rate is beside the point: if you don't submit an action within the time slot of a server tick, your action doesn't matter, because something else got to it first.

Really, that's isolating the case, like I said above, and it's speaking of hit reg alone.

14 minutes ago, Mira Yurizaki said:

Placebo is a thing that can affect performance.

What does that even mean, though? Because it really isn't placebo.



*** Thread cleaned and locked ***

 

People, as a reminder, you all have different opinions. And they don't change easily; it's even harder if the other person is not willing to look into things and educate themselves from multiple sources. So going around and around with arguments that lead nowhere is a waste of everyone's time.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


This topic is now closed to further replies.
