
Giving a reason to use high-end cards at 1080p - BOE announces 500Hz Display

williamcll
2 hours ago, Bombastinator said:

I’m not sure if there’s 5 9s on it, but definitely one or two

99.999% corresponds with 1/100,000 NEEDING high performance.

https://techjury.net/blog/esports-growth/#gref
>Active professional players are currently 10,537 in total and the number of all players is changing every day.

So 99.99985% using that figure. If you assume that there are 10x as many people as cited on that site, then you'd still get to around 99.999%.
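If anyone wants to check the arithmetic, here's the quick back-of-the-envelope version in Python (the ~7 billion world population figure is my own assumption, not something from that article):

# rough sanity check of the percentages above
pros = 10_537                  # active professional players, per the techjury figure
world = 7_000_000_000          # assumed world population
print(100 * (1 - pros / world))        # -> ~99.99985
print(100 * (1 - 10 * pros / world))   # -> ~99.9985, i.e. roughly "99.999%"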


 

Quote

It's crazy how you explained how clearly this monitor isn't targeting everyone when I don't think anyone is denying that. I think people are saying this does have a target audience and for them a 500hz monitor is probably worth it.

I won't speak out against more/better features in the future as display bandwidths improve. At present, though, people are likely better off focusing on resolution and image quality over higher refresh rates. With today's tech the sweet spot for most people is likely one of: 1440p@240Hz, 4K@120Hz, or something intermediate (e.g. 3440x1440@160Hz).
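To put rough numbers on that trade-off, here's a quick sketch of raw pixel throughput for a few modes (it ignores blanking intervals, bit depth and DSC, so take it as an approximation only):

# raw pixel throughput of a few display modes
modes = {
    "1080p @ 500Hz":     (1920, 1080, 500),
    "1440p @ 240Hz":     (2560, 1440, 240),
    "4K @ 120Hz":        (3840, 2160, 120),
    "3440x1440 @ 160Hz": (3440, 1440, 160),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e9:.2f} Gpixels/s")
# 1080p @ 500Hz costs roughly as much raw bandwidth as 4K @ 120Hz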

With that said, this monitor is being marketed to people that DO think they 'need' it. A dream is being sold, not a reality.

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 


On 1/31/2022 at 6:58 AM, RejZoR said:

And you'll only get CSGO anywhere near those framerates. Rather silly and only really useful for CSGO pros who need that last bit of advantage.

tbh you're also able to get Minecraft to those framerates. But then I highly doubt you really need that many frames for that anyway.

✨FNIGE✨


5 hours ago, cmndr said:

99.999% corresponds with 1/100,000 NEEDING high performance.

https://techjury.net/blog/esports-growth/#gref
>Active professional players are currently 10,537 in total and the number of all players is changing every day.

So 99.99985% using that figure. If you assume that there are 10x as many people as cited on that site, then you'd still get to around 99.999%.


 

I won't speak out against more/better features in the future as display bandwidths improve. At present, though, people are likely better off focusing on resolution and image quality over higher refresh rates. With today's tech the sweet spot for most people is likely one of: 1440p@240Hz, 4K@120Hz, or something intermediate (e.g. 3440x1440@160Hz).

With that said, this monitor is being marketed to people that DO think they 'need' it. A dream is being sold, not a reality.

1/100,000 seems high.  Even 1/10,000 seems a bit high.  1/1000 or 1/100 seems more reasonable.  Even that doesn’t change the effect of the percentage though.  It’s possible I guess that the real number is closer to one in a million as you say.  On this site it’s well over 50% though.  I still suspect that it’s possible basically no one can tell the difference between 500 and 240, most can’t tell the difference between 500 and 144, and many can’t tell the difference between 100 and 500.  Testing I think is needed.

 

5 hours ago, cmndr said:

99.999% corresponds with 1/100,000 NEEDING high performance.

https://techjury.net/blog/esports-growth/#gref
>Active professional players are currently 10,537 in total and the number of all players is changing every day.

So 99.99985% using that figure. If you assume that there are 10x as many people as cited on that site, then you'd still get to around 99.999%.


 

I won't speak out against more/better features in the future as display bandwidths improve. At present, though, people are likely better off focusing on resolution and image quality over higher refresh rates. With today's tech the sweet spot for most people is likely one of: 1440p@240Hz, 4K@120Hz, or something intermediate (e.g. 3440x1440@160Hz).

With that said, this monitor is being marketed to people that DO think they 'need' it. A dream is being sold, not a reality.

The issue with 1440p is that it doesn’t cut down to 1080p well, though it does cut down to 720p well. 4k does though.  Usually.  TV manufacturers have been playing fast and loose with the meaning of 4k though and the result is it doesn’t always. A 1440p monitor will still do 1080p of course but there is interpolation and consequent lag increase. 

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, FnigePython said:

tbh you're also able to get Minecraft to those framerates. But then I highly doubt you really need that many frames for that anyway.

It’s doubtful imho that many frames/sec are needed for much of anything.  Could be wrong though.  I’d want to see testing before committing  to anything.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


14 hours ago, Bombastinator said:

1/100,000 seems high.  Even 1/10,000 seems a bit high.  1/1000 or 1/100 seems more reasonable.  Even that doesn’t change the effect of the percentage though.  It’s possible I guess that the real number is closer to one in a million as you say.  On this site it’s well over 50% though.  I still suspect that it’s possible basically no one can tell the difference between 500 and 240, most can’t tell the difference between 500 and 144, and many can’t tell the difference between 100 and 500.  Testing I think is needed.

 

The issue with 1440p is that it doesn’t cut down to 1080p well, though it does cut down to 720p well. 4k does though.  Usually.  TV manufacturers have been playing fast and loose with the meaning of 4k though and the result is it doesn’t always. A 1440p monitor will still do 1080p of course but there is interpolation and consequent lag increase. 

Thus the point of why we should be pushing resolution + speed, not just capping at 1080p and trying to push refresh rates up so high that you are dividing existence by 0. 2160p is a wonderful thing, as it is 3x 720p and 2x 1080p, making it divisible both ways. And if 2160p @ 240hz were the focus of tech improvements, we would probably not even be having this argument.
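The divisibility point in plain numbers (vertical resolutions only):

# how cleanly each panel height divides down to common render resolutions
for panel in (1440, 2160):
    for target in (1080, 720):
        print(f"{panel}p -> {target}p: scale factor {panel / target:g}")
# 1440p -> 1080p: 1.33...  (non-integer, so it needs interpolation)
# 1440p -> 720p:  2        (clean)
# 2160p -> 1080p: 2        (clean)
# 2160p -> 720p:  3        (clean)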

 

p.s.

There's no way 1 in 1,000 users can benefit from 500hz panels. Probably not even 360hz panels. Take the aggregate total of all competitive gamers (actual competitions, actual prize-winning potential, etc.), then multiply that total by 10x... and you are still nowhere near 1/1000th of the population.

 

 


16 hours ago, Bombastinator said:

1/100,000 seems high.  Even 1/10,000 seems a bit high.  1/1000 or 1/100 seems more reasonable.  Even that doesn’t change the effect of the percentage though.  It’s possible I guess that the real number is closer to one in a million as you say.  On this site it’s well over 50% though.  I still suspect that it’s possible basically no one can tell the difference between 500 and 240, most can’t tell the difference between 500 and 144, and many can’t tell the difference between 100 and 500.  Testing I think is needed.

Do you have any data showing that 1/100 people on this planet (70 million people) are professional game players that play twitch FPS titles?

Now, it is arguable that I should shift the denominator from global population to "gamers". Suppose we use Steam's 120 million monthly active users as a base (assume there are 3 other platforms out there, but only 1 in 3 people play a genre that is performance sensitive) and 10,000 people as professional gamers.

You still end up with 99.992% of the "gamer population" not being well served, despite a much more conservative denominator.
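Same back-of-the-envelope arithmetic as before, just with the "gamer" denominator:

# repeat the calculation with a "gamer" base instead of world population
pros   = 10_000         # rounded count of professional players
gamers = 120_000_000    # Steam monthly active users, used as the base
print(100 * (1 - pros / gamers))   # -> ~99.992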

Saying "well on all the gaming forums I go to, people obsess over gaming stuff so that number must be higher" isn't exactly rigorous reasoning, especially when you can approach the question from multiple difference angles while being "generous" to the counter argument and still get ratios that are 10-10000x smaller that you're citing without evidence.

16 hours ago, Bombastinator said:

The issue with 1440p is that it doesn’t cut down to 1080p well, though it does cut down to 720p well. 4k does though.  Usually.  TV manufacturers have been playing fast and loose with the meaning of 4k though and the result is it doesn’t always. A 1440p monitor will still do 1080p of course but there is interpolation and consequent lag increase. 


Is there anything wrong with having the GPU do the interpolation?
As it stands we're getting better and better "interpolation" each year a la FSR and DLSS.

The downsides from pixel mismatching at 1440p or 4K are also, in some sense, 1/4th as bad as they were with lower resolutions (720p and 1080p): more pixels mean that discretization errors aren't as big of a deal.

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 


5 hours ago, cmndr said:

Do you have any data showing that 1/100 people on this planet (70 million people) are professional game players that play twitch FPS titles?

Now, it is arguable that I should shift the denominator from global population to "gamers". Suppose we use Steam's 120 million monthly active users as a base (assume there are 3 other platforms out there, but only 1 in 3 people play a genre that is performance sensitive) and 10,000 people as professional gamers.

You still end up with 99.992% of the "gamer population" not being well served, despite a much more conservative denominator.

Saying "well on all the gaming forums I go to, people obsess over gaming stuff so that number must be higher" isn't exactly rigorous reasoning, especially when you can approach the question from multiple difference angles while being "generous" to the counter argument and still get ratios that are 10-10000x smaller that you're citing without evidence.


Is there anything wrong with having the GPU do the interpolation?
As it stands we're getting better and better "interpolation" each year a la FSR and DLSS.

The downsides from pixel mismatching at 1440p or 4K are also, in some sense, 1/4th as bad as they were with lower resolutions (720p and 1080p): more pixels mean that discretization errors aren't as big of a deal.

It would burn cycles, but if you’ve got cycles to burn I doubt it matters much. Probably depends. As far as the lag goes, it would depend on which does it faster. The monitor is more likely to do it in hardware, so I suspect the lag would be lower. There is always some lag, so it’s a question of whether the difference is meaningful or not. Again, it depends.

Edited by Bombastinator

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


5 hours ago, cmndr said:

Do you have any data showing that 1/100 people on this planet (70 million people) are professional game players that play twitch FPS titles?

Now, it is arguable that I should shift the denominator from global population to "gamers". Suppose we use Steam's 120 million monthly active users as a base (assume there are 3 other platforms out there, but only 1 in 3 people play a genre that is performance sensitive) and 10,000 people as professional gamers.

You still end up with 99.992% of the "gamer population" not being well served, despite a much more conservative denominator.

Saying "well on all the gaming forums I go to, people obsess over gaming stuff so that number must be higher" isn't exactly rigorous reasoning, especially when you can approach the question from multiple difference angles while being "generous" to the counter argument and still get ratios that are 10-10000x smaller that you're citing without evidence.


Is there anything wrong with having the GPU do the interpolation?
As it stands we're getting better and better "interpolation" each year a la FSR and DLSS.

The downsides from pixel mismatching at 1440p or 4K are also, in some sense, 1/4th as bad as they were with lower resolutions (720p and 1080p): more pixels mean that discretization errors aren't as big of a deal.

Of course not. But people besides pro gamers can use high refresh, just almost certainly not 500hz. Even 1/100 is “almost no one” though. I’ve been thinking hard about a high refresh screen. My phone does 120 and it really clearly has use to me even if I don’t play shooters. Many 3D things can use over 100hz refresh, and my understanding is that it is needed even more for VR and AR stuff to avoid nausea etc. There are a lot of use cases where it isn’t needed, though. The lion’s share of productivity stuff for one. Video for another.

Edited by Bombastinator

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


12 hours ago, Bombastinator said:

Of course not. But people besides pro gamers can use high refresh, just almost certainly not 500hz. Even 1/100 is “almost no one” though. I’ve been thinking hard about a high refresh screen. My phone does 120 and it really clearly has use to me even if I don’t play shooters. Many 3D things can use over 100hz refresh, and my understanding is that it is needed even more for VR and AR stuff to avoid nausea etc. There are a lot of use cases where it isn’t needed, though. The lion’s share of productivity stuff for one. Video for another.

No one has ever said that up to 240hz wasn't of benefit. That's 120hz per eye for 3D content, and I fully endorse that being of use to the vast majority of the population.

 

It's when you are > 240hz that I'm going to take engineers to task for not putting the effort into more productive means.


Unless it's an OLED display, it won't even have fast enough response times to actually make use of 500Hz or higher. It's no good when the monitor wants to draw a new frame while the last one hasn't even finished transitioning yet because of slow pixel response times.
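For context, the frame-time budget is just 1000 ms divided by the refresh rate; the grey-to-grey remark in the comments below is a general observation about LCDs, not a spec for this particular panel:

# frame budget at various refresh rates
for hz in (120, 240, 360, 500):
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")
# 500Hz leaves only 2 ms per frame; real-world LCD grey-to-grey transitions
# are often in that same ballpark, which is the point above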

 

120Hz is the point of diminishing returns imo. 240Hz already doesn't really offer a real advantage to the average gamer. Tryhards or professional players (people who make money with their performance) might see it differently though.

 

360Hz already offered no real advantage over 240Hz. Most pros would rather go with a 240Hz monitor that has good BFI (black frame insertion) to further reduce motion blur, compared to a 360Hz one that doesn't feature BFI. So I don't see 500Hz being any different.

 

It will probably bring a very small advantage in terms of input lag, but even a 120Hz OLED using BFI will bring better motion handling.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

