Giving a reason to use high-end cards at 1080p - BOE announces 500 Hz display

williamcll
20 hours ago, porina said:

I'll wait for 1000 Hz displays so it can run synchronously with gaming mouse polling rate.

 

More seriously, I doubt I can even see it vs 240 Hz. Even if I could, I doubt I can react to it.

I would disagree, as 240 Hz vs 360 Hz was significantly noticeable to me, so I imagine 240 Hz vs 500 Hz is pretty jarring to anyone who sees it in person. And even if you can't make full use of the higher refresh rate, whatever you do see will be more up to date than on a lower-refresh-rate monitor, making it a competitive advantage regardless.

45 minutes ago, Moonzy said:

From the photo, 1 ms liquid crystal, so probably not OLED

Which is a tad disappointing

Yeah I'd guess it's maybe IPS though.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |

12 hours ago, ouroesa said:

As I said 'just a weenie measuring/basis for being a troll.'

You would have to be pretty much superhuman to discern the 4.4 ms difference between 144 Hz (6.9 ms per frame) and 400 Hz (2.5 ms). I think you are referring to the game running fast enough not to miss inputs, and I doubt 4.4 ms makes a difference there; for reference, a quick blink is circa 100 ms, so can you really feel something that fits 22 times into the blink of an eye?
13 ms is often cited as the limit of human perception, which is around 75 fps, but I'll halve that to give you the benefit of the doubt. Still 7.5 ms.
Nah mate. Sorry, but nah.
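The frame-time arithmetic above is easy to verify; a few lines of Python (a quick sketch, the helper name is made up) reproduce the quoted numbers:

```python
def frame_time_ms(hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / hz

print(round(frame_time_ms(144), 1))  # 6.9 ms per frame at 144 Hz
print(round(frame_time_ms(400), 1))  # 2.5 ms per frame at 400 Hz
print(round(frame_time_ms(144) - frame_time_ms(400), 1))  # 4.4 ms difference
```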

I think you are completely missing one point here. Say it takes 7.5 ms to react to something, and someone comes around a corner: you see them sooner than they see you, because your screen refreshed before theirs. If both of you take 7.5 ms to react, guess who gets to shoot first? The one with the 500 Hz monitor. I think people who say these high-refresh-rate monitors don't make a difference haven't played on them in fast-paced competitive FPS games. Honestly, if you play anything other than FPS games you would be hard pressed to see the difference, and you notice it more in faster-paced FPS games than in slower-paced ones.
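That "see it first, shoot first" argument can be made concrete with a toy model (an illustration only; it ignores pixel response, input lag, and netcode): a new event lands at a random point in the refresh cycle, so on average it waits half a frame before being drawn.

```python
def mean_display_delay_ms(hz):
    """Average wait before a new event appears on screen: half a frame."""
    return 0.5 * 1000.0 / hz

for hz in (60, 240, 500):
    print(f"{hz} Hz: {mean_display_delay_ms(hz):.2f} ms average delay")
# 240 Hz -> 2.08 ms, 500 Hz -> 1.00 ms: the 500 Hz player sees the event
# about 1 ms sooner on average, which only matters in a near-tie.
```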

47 minutes ago, Brooksie359 said:

I would disagree, as 240 Hz vs 360 Hz was significantly noticeable to me, so I imagine 240 Hz vs 500 Hz is pretty jarring to anyone who sees it in person. And even if you can't make full use of the higher refresh rate, whatever you do see will be more up to date than on a lower-refresh-rate monitor, making it a competitive advantage regardless.

Likely not. You're already in rarefied air as someone who can benefit from 360 Hz. If you did a side-by-side-by-side of 240 Hz, 360 Hz and 500 Hz in a double-blind test, you likely would not identify the 500 Hz panel any better than chance. If a user already reports diminishing returns (or no discernible difference) from a 50% increase, is a further 39% increase going to make any difference whatsoever?

 

Eventually the same thing will happen with HDR specs and with resolutions, but we are WAY, WAY far from that happening (especially for resolutions). Which again raises the question of why anyone should chase the fool's errand that is 500 Hz.

1 minute ago, IPD said:

Likely not. You're already in rarefied air as someone who can benefit from 360 Hz. If you did a side-by-side-by-side of 240 Hz, 360 Hz and 500 Hz in a double-blind test, you likely would not identify the 500 Hz panel any better than chance. If a user already reports diminishing returns (or no discernible difference) from a 50% increase, is a further 39% increase going to make any difference whatsoever?

 

Eventually the same thing will happen with HDR specs and with resolutions, but we are WAY, WAY far from that happening (especially for resolutions). Which again raises the question of why anyone should chase the fool's errand that is 500 Hz.

This doesn't change the fact that you would see things earlier on the higher-refresh-rate monitor and start reacting sooner, regardless of whether you can consciously tell 240 Hz from 360 Hz. And yeah, I guess I might be one of the few: I play Overwatch at Masters level, so 240 Hz vs 360 Hz was noticeable, especially when playing Tracer, who dashes and moves all over the place.

1 hour ago, Brooksie359 said:

I would disagree, as 240 Hz vs 360 Hz was significantly noticeable to me, so I imagine 240 Hz vs 500 Hz is pretty jarring to anyone who sees it in person. And even if you can't make full use of the higher refresh rate, whatever you do see will be more up to date than on a lower-refresh-rate monitor, making it a competitive advantage regardless.

tl;dr: I'm saying this tech would probably be useless to me. That doesn't mean it won't be useful to someone else.

 

Longer version:

I've never even seen 240 Hz. The fastest display I have is 165 Hz, as it happened to come with my laptop; I didn't choose it. With a big "it depends on the game" caveat, can I see a smoothness difference between, say, 100 fps and 144 fps? Yes. I never got around to pushing the 165 Hz display to its limits, but diminishing returns mean I don't feel a benefit beyond around 90 fps or so, even if I can see the extra smoothness above that. I'm only guessing that, personally, I might not be able to see the difference between 240 and 500. I'd even limit to 72 fps (half of 144 Hz) to keep the GPU a bit cooler and quieter. I just don't play games that need ultra-fast reactions. Whatever the real-world input-to-photon latency of, say, >60 Hz G-Sync is, it's good enough for me and the games I actually play; 60 Hz V-Sync can start to get noticeable in some faster games.

 

Actually, at 500 Hz, ping is probably the bigger factor to optimise if you want the best play conditions possible. Of course, for those who play competitively at a high level, it may be worth considering. The 240 vs 360 example you mention is a frame-time reduction of 1.4 ms; 240 to 500 Hz "improves" that to a 2.2 ms benefit. Suffice to say, I'm, erm, way older than peak e-sports age. Those single-digit milliseconds are not going to meaningfully help me; you can better measure my reaction time on a geological time scale.
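Putting those per-frame savings next to network latency (a rough comparison; the ping range is a typical assumed value, not from the post):

```python
def frame_time_ms(hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / hz

for a, b in [(144, 240), (240, 360), (240, 500)]:
    print(f"{a} -> {b} Hz saves {frame_time_ms(a) - frame_time_ms(b):.1f} ms per frame")
# Compare with a typical online ping of 20-50 ms: at these refresh rates
# the network, not the panel, usually dominates end-to-end latency.
```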

 

 

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

3 hours ago, Moonzy said:

I'm just sick of people saying "lul, 240 Hz doesn't matter over 144 Hz" when they have probably never used one and are just going by reviews or something.

I went from 240 Hz to 144 Hz and I can feel it. It may not matter to 99.9% of people out there, but to outright say it's useless for everyone is just bullshit.

Like you say, one could feel it, and for some it will depend on what they use it for. But I would say that in general 240 Hz doesn't matter, more so if the panel doesn't even deliver good 240 Hz, gives a much worse image at lower refresh rates, and charges a premium for it. Like you mention with osu!.

2 hours ago, Quackers101 said:

Like you say, one could feel it, and for some it will depend on what they use it for. But I would say that in general 240 Hz doesn't matter, more so if the panel doesn't even deliver good 240 Hz, gives a much worse image at lower refresh rates, and charges a premium for it. Like you mention with osu!.

Tbh my 360 Hz monitor has crazy good image quality for a 1080p monitor. Third-party reviews showed its color accuracy was good enough that you could realistically do artwork on it. Anyway, I wouldn't say high-Hz monitors have poor image quality, especially now that most of them are IPS. Another thing to note: if you are going to buy a 1080p monitor for gaming, you might as well get the best. Honestly, I got my 360 Hz monitor because I wanted IPS; before that I had a 240 Hz TN monitor that they don't even sell anymore.

4 hours ago, IPD said:

Likely not. You're already in rarefied air as someone who can benefit from 360 Hz. If you did a side-by-side-by-side of 240 Hz, 360 Hz and 500 Hz in a double-blind test, you likely would not identify the 500 Hz panel any better than chance. If a user already reports diminishing returns (or no discernible difference) from a 50% increase, is a further 39% increase going to make any difference whatsoever?

 

Eventually the same thing will happen with HDR specs and with resolutions, but we are WAY, WAY far from that happening (especially for resolutions). Which again raises the question of why anyone should chase the fool's errand that is 500 Hz.

Because if you are going to play esports titles at 1080p, why not get the fastest monitor you can? It's better regardless of whether you can consciously identify it. Honestly, 360 Hz is super smooth, so I assume 500 Hz would be crazy smooth as well.

On 1/30/2022 at 10:36 AM, Bombastinator said:

I personally find anything over 100 Hz is more or less wasted on me. I'm not everyone.

Same.

I'm sure there's a difference. I doubt I'd appreciate it. I'll give it a shot at some point.

As it stands, we need faster DisplayPort/HDMI standards. Those are legitimate limiters.

My dream is probably something like 5K (5120x2880) 240Hz or 8K (7680x4320) 120Hz (ideally on the same screen with different modes as options). A 65" TV that you can use as 4x virtual 32" 4K monitors should work nicely. We need 2x the bandwidth for that.
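Rough bandwidth arithmetic for those dream modes (my own estimate: 8 bits per channel, no blanking or protocol overhead; the quoted link rates are approximate):

```python
def raw_gbps(w, h, hz, bits_per_px=24):
    """Uncompressed video data rate in Gbit/s."""
    return w * h * hz * bits_per_px / 1e9

print(round(raw_gbps(5120, 2880, 240), 1))  # 5K @ 240 Hz -> ~84.9 Gbit/s
print(round(raw_gbps(7680, 4320, 120), 1))  # 8K @ 120 Hz -> ~95.6 Gbit/s
# DisplayPort 2.0 UHBR20 carries roughly 77.4 Gbit/s of payload and HDMI 2.1
# FRL roughly 42.6 Gbit/s, so both modes need DSC or a faster link.
```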

 

2 hours ago, Brooksie359 said:

Tbh my 360 Hz monitor has crazy good image quality for a 1080p monitor. Third-party reviews showed its color accuracy was good enough that you could realistically do artwork on it. Anyway, I wouldn't say high-Hz monitors have poor image quality, especially now that most of them are IPS. Another thing to note: if you are going to buy a 1080p monitor for gaming, you might as well get the best. Honestly, I got my 360 Hz monitor because I wanted IPS; before that I had a 240 Hz TN monitor that they don't even sell anymore.

 

While I wouldn't call the better IPS monitors POOR in terms of image quality, these days the best LCD panels are arguably Samsung's top VA panels: lower response times (if you're willing to accept some overshoot error), better contrast, slightly better color uniformity. The only downsides are moderately reduced viewing angles and a slightly smaller color gamut.

https://www.rtings.com/monitor/learn/ips-vs-va-vs-tn
https://www.rtings.com/tv/tools/compare/samsung-qn90a-qled-vs-samsung-qn85a-qled/21551/21552?usage=1&threshold=0.10

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 

9 minutes ago, cmndr said:

Same.

I'm sure there's a difference. I doubt I'd appreciate it. I'll give it a shot at some point.

As it stands, we need faster DisplayPort/HDMI standards. Those are legitimate limiters.

My dream is probably something like 5K (5120x2880) 240Hz or 8K (7680x4320) 120Hz (ideally on the same screen with different modes as options). A 65" TV that you can use as 4x virtual 32" 4K monitors should work nicely. We need 2x the bandwidth for that.

 

 

While I wouldn't call the better IPS monitors POOR in terms of image quality, these days the best LCD panels are arguably Samsung's top VA panels: lower response times (if you're willing to accept some overshoot error), better contrast, slightly better color uniformity. The only downsides are moderately reduced viewing angles and a slightly smaller color gamut.

https://www.rtings.com/monitor/learn/ips-vs-va-vs-tn

 

Yeah, I like IPS better than VA, tbh. If I were going for 1440p I would probably get the Samsung 240 Hz monitor, which is VA, but at 1080p I still prefer my current monitor. Very nice picture quality.

12 hours ago, Quackers101 said:

Very high, yes.

But higher than 60? No.

Although it's not only about higher Hz: the panel has to be fast enough and keep good image quality while doing so, and the software side has to work too (GPU, G-Sync/V-Sync, frame timings, etc.). I would say 144-175 Hz is a fine range and gives a better experience. Like others say, not everyone is going to feel it and might need a side-by-side comparison. However, doubling from 60 Hz will still benefit everyone: better animation and more, provided you can sustain enough FPS and aren't locked to 60. So any game with a lot of motion that can run above 60 fps with little delay will look better. Same for the HDR experience. QD-OLED 🙂

I specifically said THESE, meaning 500 Hz.
I also mentioned my use cases where it is more and less noticeable.
I suggested 120 Hz as a good spot.
I also mentioned G-Sync.

Not sure what the point of your comment was. I don't know what the point is of me typing this one.

7 hours ago, Brooksie359 said:

I think you are completely missing one point here. Say it takes 7.5 ms to react to something, and someone comes around a corner: you see them sooner than they see you, because your screen refreshed before theirs. If both of you take 7.5 ms to react, guess who gets to shoot first? The one with the 500 Hz monitor. I think people who say these high-refresh-rate monitors don't make a difference haven't played on them in fast-paced competitive FPS games. Honestly, if you play anything other than FPS games you would be hard pressed to see the difference, and you notice it more in faster-paced FPS games than in slower-paced ones.

It does not take anybody 7.5 ms to react. Maybe 150 ms+? So the difference between 154 ms and 160 ms is <3%. Not going to make a meaningful difference.
https://en.wikipedia.org/wiki/Mental_chronometry#/media/File:Reaction_time_density_plot.svg

 

 

The number of ridiculous hypotheticals people make up to try to prove a point is silly. The basis of an argument needs to be sound, at the least.

7 minutes ago, Brooksie359 said:

Yeah, I like IPS better than VA, tbh. If I were going for 1440p I would probably get the Samsung 240 Hz monitor, which is VA, but at 1080p I still prefer my current monitor. Very nice picture quality.

It really DOES depend. The 35" ultra-wide IPS I have next to my 55" VA generally has poorer image quality. It's also 5 years older and was a much cheaper unit overall.

There's a lot that goes into a display beyond the panel type and ALL of the panel types have really improved in the last decade. A lot of the "IPS IS AMAZING" mantra comes from the situation 15ish years ago.

Don't get me wrong, budget VA displays DEFINITELY have their deficiencies. I've just become more open to VA in general since I've actually gotten some (bought my mother one, gifted one to a step parent that I got for free, got a TV/monitor that's a 120Hz 4K VA panel).

I do half expect that future OLED variants will render the point moot. We're still a few years out though.

Back in the day, didn't several plasma TVs advertise refresh rates that high? But it looks like they are really trying to push boundaries, while Xbox and PlayStation are like, "I'll give you 60 Hz at 1080p. That's the best I can do."

2 hours ago, cmndr said:

Same.

I'm sure there's a difference. I doubt I'd appreciate it. I'll give it a shot at some point.

As it stands, we need faster DisplayPort/HDMI standards. Those are legitimate limiters.

My dream is probably something like 5K (5120x2880) 240Hz or 8K (7680x4320) 120Hz (ideally on the same screen with different modes as options). A 65" TV that you can use as 4x virtual 32" 4K monitors should work nicely. We need 2x the bandwidth for that.

 

 

While I wouldn't call the better IPS monitors POOR in terms of image quality, these days the best LCD panels are arguably Samsung's top VA panels: lower response times (if you're willing to accept some overshoot error), better contrast, slightly better color uniformity. The only downsides are moderately reduced viewing angles and a slightly smaller color gamut.

https://www.rtings.com/monitor/learn/ips-vs-va-vs-tn
https://www.rtings.com/tv/tools/compare/samsung-qn90a-qled-vs-samsung-qn85a-qled/21551/21552?usage=1&threshold=0.10

I'm not sure there's a difference, but not because I can't see one. Apparently when 240 Hz was tested, no one could see it without equipment either, but they proved there was an improvement for hitboxes. I allow that there might be one for pro-athlete gamers, but I don't think any human is going to be able to see a difference. Could be wrong about that. I think it should be independently tested before any claims are made, though.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

7 minutes ago, Bombastinator said:

I'm not sure there's a difference, but not because I can't see one. Apparently when 240 Hz was tested, no one could see it without equipment either, but they proved there was an improvement for hitboxes. I allow that there might be one for pro-athlete gamers, but I don't think any human is going to be able to see a difference. Could be wrong about that. I think it should be independently tested before any claims are made, though.

There's certainly a difference between perceiving a difference and benefiting from it.

Especially in a game where it's a series of winner-takes-all mini races. Though most genres aren't so sensitive (think racing games, RTS, etc.).

If you're playing auto-matched games, at most the difference you'll get is that you're matched against people SLIGHTLY lower ranked. Not exactly a big difference in experience.

My personal evaluation criterion is "can I have fun and enjoy myself?", and the bar for that is relatively low.
 

13 minutes ago, cmndr said:

There's certainly a difference between perceiving a difference and benefiting from it.

Especially in a game where it's a series of winner-takes-all mini races. Though most genres aren't so sensitive (think racing games, RTS, etc.).

If you're playing auto-matched games, at most the difference you'll get is that you're matched against people SLIGHTLY lower ranked. Not exactly a big difference in experience.

My personal evaluation criterion is "can I have fun and enjoy myself?", and the bar for that is relatively low.
 

The placebo effect is real, and pro athletes need all the psychological motivation they can find. I'd want to see a measurable difference beyond placebo between a merely ridiculously fast screen like 240 Hz and this one, though. Since there apparently isn't a visible difference, testing should be pretty easy.

3 minutes ago, Bombastinator said:

The placebo effect is real, and pro athletes need all the psychological motivation they can find. I'd want to see a measurable difference beyond placebo between a merely ridiculously fast screen like 240 Hz and this one, though. Since there apparently isn't a visible difference, testing should be pretty easy.

Part of the issue is that pros are VERY likely to have much better response times than normal people. It's very likely that their physiology is in some ways different from ordinary people's.

Also, when the stakes are higher, I suspect there's greater physiological arousal.

 

I can see an argument for "the stakes are high, use the best gear"

When I play a single player game, the stakes are 0. When I play a multiplayer game... also 0. No one cares about how good or bad I am.

3 hours ago, cmndr said:

Part of the issue is that pros are VERY likely to have much better response times than normal people. It's very likely that their physiology is in some ways different from ordinary people's.

Also, when the stakes are higher, I suspect there's greater physiological arousal.

 

I can see an argument for "the stakes are high, use the best gear"

When I play a single player game, the stakes are 0. When I play a multiplayer game... also 0. No one cares about how good or bad I am.

They might. Probably do. I doubt they're massively better than, say, major-league batters, though, and a lot of research has been done on batters' visual acuity. There is a maximum. It is apparently a good deal higher than originally thought, but even 240 Hz was brushing up against not being useful for anyone but a handful of people. I don't know if there's anything useful above 240 Hz or not; there apparently wasn't at 300, or whatever the last maximum was. Someone will get hold of one of these things and test it. A few will be sold even if they provide no tangible benefit, because the audiophile mindset occurs in a lot of places besides audio gear. Myself, my interest wanes after 100 Hz. 120 is fine because it's more than 100, but I find no difference between it and 144. That's just me though.

2 hours ago, Bombastinator said:

A few will be sold even if they provide no tangible benefit, because the audiophile mindset occurs in a lot of places besides audio gear.

I am actively working to tone this down myself (you can see a partial list of my audiophile gear; I have 4 extra ceiling speakers in my Atmos setup that aren't listed).

"Enough" is a VERY powerful and underrated word. Knowing when enough is enough really goes a long way in life.

 

On 1/30/2022 at 6:40 PM, ouroesa said:

Think I echo everyone when I say these high refresh rates are wasted resources, serve extremely esoteric use cases, and are mostly weenie-measuring / a basis for trolling.

Nobody "needs" a monitor over 60 Hz; many people desire something between 60 and 120 for various reasons, but generally nothing that matters short of VR. The worst use of high-refresh-rate monitors/TVs is the "soap opera effect", which makes 24 fps theatrical films look like they were shot on a handheld camcorder. 24 fps x 5 = 120 fps, so at 120 Hz each film frame can be duplicated 5 times without interpolation and it looks correct; likewise 30 fps content at 60 Hz duplicates each frame twice. That's why 120 fps exists at all.

24 fps pulled down to 60 Hz does not look correct: 3:2 pulldown just isn't what you see in the theatre.
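The cadence mismatch is easy to see in code (a sketch of standard 3:2 pulldown; the function name is made up):

```python
def pulldown_3_2(frames):
    """Map 24 fps film frames onto 60 Hz by alternating 3 and 2 repeats."""
    shown = []
    for i, frame in enumerate(frames):
        shown.extend([frame] * (3 if i % 2 == 0 else 2))
    return shown

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- 4 film frames fill
# 10 refreshes (24 fill 60), with uneven hold times. At 120 Hz every frame
# is simply held for 5 refreshes, so the cadence stays even.
```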

 

NTSC television is 59.94 Hz drop-frame; PAL television is 50 Hz. Neither frame rate suits theatrical film, and 59.94 Hz NTSC isn't good for game synchronization either. Games are literally synced to 30, 60, or some multiple of 30, and when the GPU can't keep up you get sluggish input combined with low frame rates.

 

500 Hz? I can't think of a use case.

On 1/31/2022 at 4:56 AM, Bombastinator said:

My suspicion is the sweet spot is going to be somewhere a bit north of 90 Hz. That was the point where even the flicker in text went away on CRTs. Sony and Microsoft chose 120 Hz; I suspect they did testing.

CRTs use different technology: an electron gun exciting a phosphor layer on the inside of the screen as it scans from top to bottom. Flicker was caused by the phosphor dimming as the gun moved on toward the bottom; by the time the monitor was drawing the last line of pixels, the top lines were already fading.

So they needed at least 75-85 refreshes a second to reduce that variation in brightness; some humans are more sensitive and need 85 or more, others are fine with 75.

 

LCDs don't need to be refreshed in the same way: once the pixels are set they stay that way, and the brightness doesn't vary because the light is produced by CCFL tubes or by LEDs along the edges of the LCD glass.

 

They went with 60 Hz because it made TVs cheaper to manufacture: the first black-and-white sets could use the 60 Hz AC mains frequency to time things (the same reason Europe went with 50 Hz). It also made recording in TV studios easier, because the flicker of incandescent lights affected cameras less that way.

 

It was also convenient for broadcast video: 60 Hz is twice 29.97/30 Hz, so they could send one half-frame (field), then the next, interlace them, and get 30 fps.

 

120 Hz was probably chosen because it was already standardized for 3D/stereoscopic content and the HDMI/DisplayPort standards supported it, and because it's twice 60 Hz, which makes frame duplication trivial: content can run at 24 fps (5 x 24 = 120), 30 fps (4 x 30 = 120), 40 fps (3 x 40 = 120), 60 fps (2 x 60 = 120) and 120 fps. It would also simplify switching refresh rates on a scene-by-scene basis, just as some games render certain scenes at lower resolution to maintain a minimum fps.
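Which frame rates fit a 120 Hz panel with whole-frame repeats can be enumerated directly:

```python
# Frame rates that divide 120 evenly: each frame is held a whole number
# of refreshes, so there is no 3:2-style judder.
rates = [fps for fps in range(20, 121) if 120 % fps == 0]
print(rates)  # [20, 24, 30, 40, 60, 120]
```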

19 hours ago, IPD said:

The fastest reaction time a human can have is about 0.15 s. You MIGHT get that down to 0.10 or 0.09 s if you are a professional athlete, but it's doubtful. 500 Hz is 0.002 s per frame, so 40x faster than even a godly 0.08 s reaction time.

I'm calling shenanigans on this being of use, even for osu! die-hards.

I would also love to add that even though my reaction time is around ~180 ms on average, I can hit objects that are ~80 ms apart from one another.

 

example:

Spoiler

This song is 414 BPM and the circles are mapped at 1/2 spacing, so the effective BPM for the circles is 828, which translates to around 72 ms between notes (you can verify this by using the > key to step the YouTube video frame by frame and counting the frames between hits; 72 ms is roughly 4-5 frames of 60 fps video).

My reaction time can be seen right after the pause before the first note (around 1:38): it took me around 11 frames (11 x 16.7 ms = 183 ms) from when the circle appeared to when my cursor shot towards it.

So humans can act on things spaced below human reaction time, as long as they have time to process what's coming.


and in osu, the time window to get a perfect hit is:

[screenshot: osu! timing windows]

 

±19.5 ms, way lower than 180 ms.
It'll count as a miss if you're off by 100 ms.

 

there's more to human response than just reaction time, is my point
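The note-spacing arithmetic in the spoiler can be verified (the helper below is mine, not from the post):

```python
def ms_between_hits(bpm, divisor=1):
    """Milliseconds between consecutive hits at bpm, subdivided by divisor."""
    return 60000.0 / (bpm * divisor)

gap = ms_between_hits(414, 2)  # circles on 1/2 beats -> 828 hits/minute
print(round(gap, 1))  # ~72.5 ms between circles, well below ~180 ms reaction
```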

-sigh- feeling like I'm being too negative lately

10 hours ago, ouroesa said:

It does not take anybody 7.5 ms to react. Maybe 150 ms+? So the difference between 154 ms and 160 ms is <3%. Not going to make a meaningful difference.
https://en.wikipedia.org/wiki/Mental_chronometry#/media/File:Reaction_time_density_plot.svg

 

 

The number of ridiculous hypotheticals people make up to try to prove a point is silly. The basis of an argument needs to be sound, at the least.

You realize that 3% can be the difference between winning and losing a fight in a fast-paced FPS game. Guess what happens when I click your head 3% faster than you do? I win the duel.

Also, I find it funny when people claim higher-Hz monitors don't give a distinct competitive advantage. I played Overwatch when it first came out on a 60 Hz monitor and peaked at about high Gold / low Platinum. I got a 240 Hz monitor and went from that to high Diamond / low Masters in a couple of months, so I call BS on your stats: statistically, 60 Hz vs 240 Hz isn't that large a difference when your reaction time is 150 ms, yet the improvement to my gameplay was significant. One distinct thing I remember after getting the monitor was that it became way easier to hit headshots during McCree's stun, because you see the stunned player sooner and more clearly. That was a huge game changer, especially against Tracer players: if you land the flashbang stun plus a headshot on a Tracer, they're dead.

Anyway, I think you're seriously underestimating the advantage higher Hz gives; the stats might make it look negligible, but it's not nearly as negligible as you think.
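A crude Monte Carlo duel illustrates the size of the edge being argued about (entirely a toy model: normally distributed 150 ms reactions plus a uniform wait for the next refresh; real matches involve far more variables):

```python
import random

def duel_win_rate(hz_a, hz_b, trials=100_000, seed=1):
    """Fraction of duels won by player A, who has the hz_a display."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        t_a = rng.gauss(150, 20) + rng.uniform(0, 1000 / hz_a)
        t_b = rng.gauss(150, 20) + rng.uniform(0, 1000 / hz_b)
        wins += t_a < t_b
    return wins / trials

print(duel_win_rate(240, 60))  # a bit above 0.5: a small but persistent edge
```

Under these assumptions the faster display shifts the win rate only a few percentage points, which is roughly the "3%" both sides are arguing over.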
