
ASUS Announces First-Ever 500Hz G-Sync Monitor

Haaselh0ff

Summary

 

ASUS continues the trend of making monitors with refresh rates most GPUs can't produce, announcing a fresh new 500Hz G-Sync display straight out of Computex 2022.


Quotes


Nvidia and Asus are announcing the world’s first 500Hz G-Sync display at Computex 2022 today. While we saw a prototype 500Hz gaming monitor earlier this year, Asus is promising a shipping 24-inch 500Hz TN panel that’s designed for esports titles running at 1080p. Asus has not yet announced pricing or a release date for its new ROG Swift 500Hz monitor, though.

 

My thoughts

More Hz for your wallet! Does CSGO even run at 500 fps? I really question whether most esports games today are capable of running this on whatever PCs esports productions are buying for players to play on. Maybe those 4000-series GPUs will provide the FPS boost needed to finally reach new heights... And maybe Linus can do a new video on why 500Hz would be absolutely unnoticeable in the grand scheme of things.

 

Sources

The Verge: https://www.theverge.com/2022/5/24/23139263/asus-500hz-nvidia-g-sync-gaming-monitor-display-computex-2022


It doesn't matter if games can run at these FPS or not. There is a huge chunk of kiddos who only care about refresh rate in a monitor because their role model "professional" gamer has talked about fast refresh rates on his stream, so now they go and beg their parents to buy them 100000 refresh rate monitors. The parents, absolutely clueless, are told that with this monitor the game will be "much better", so they just buy it. I don't blame ASUS at all. They're just listening to demand. As long as kids don't stop being morons, stuff like this won't change. CSGO remains the most played online game by far, with nearly 1m players online at all times. Guess their age.


Gimmick, bad balance of refresh rate/resolution/price.

The sweet spot is 120Hz 1440p, unless you are playing e-sports shooters and seek an edge over your rivals.

A PC Enthusiast since 2011
AMD Ryzen 5 2600@4.1GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R15: 1349cb | Unigine Superposition 1080p Extreme: 3566

I wonder how consistent the response time will be across the refresh range.


12 minutes ago, Vishera said:

The sweet spot is 120Hz 1440p

Do they even make 120Hz 1440p panels? Even mid-range 1440p panels are 160Hz+ nowadays.

But yeah, I don't see the point of anything higher than 200Hz. I really doubt it's noticeable.

Intel Core i5-8400 / ASRock Z370 Pro 4 / Hyper 212 Evo / 16GB DDR4 @3000 / MSI RTX 2070 Armor / Corsair RM550x / SanDisk 250GB / 1TB WD HDD / Fractal Design Define R4


Not surprised TN is in use for it. Can't wait to see how it handles blur.

And no, it's not really GPU-bound; you primarily want a good CPU and RAM to get high frames. Also, sure, CSGO, but I'm sure Valorant runs even better, and Quake Champions too.

In the future I'd really like to see QD-OLED used, though.

Ryzen 7 3800X | X570 Aorus Elite | G.Skill 16GB 3200MHz C16 | Radeon RX 5700 XT | Samsung 850 PRO 256GB |Mousepad: Skypad 3.0 XL | Mouse: Zowie S1-C |Keyboard: Corsair K63 MX red | OS: Windows 11


29 minutes ago, Vishera said:

Gimmick, bad balance of refresh rate/resolution/price.

The sweet spot is 120Hz 1440p, unless you are playing e-sports shooters and seek an edge over your rivals.

Wouldn't call it a gimmick, but yeah, you've explained this monitor's sole purpose.



42 minutes ago, Vishera said:

...unless you are playing e-sports shooters and seek an edge over your rivals.

Which is exactly what this panel is aimed at.

5800x/3090


1 hour ago, Haaselh0ff said:

 

 

My thoughts

More Hz for your wallet! Does CSGO even run at 500 fps? I really question whether most eSports games today are capable of running this with...

Realistically, nobody needs to "play" a game at over 60Hz, because one or more of the following things happen:

a) Your eye perceives motion at 15fps, and reasonably reacts as though it's real at 24fps. When it hits 30fps it starts believing there is something wrong, because it looks wrong. For example, The Hobbit was filmed at 48fps, and various IMAX films are filmed this way, and what happens is the eye/brain gets a kind of whiplash. In some people that's motion sickness. So at 60, 90, 120, etc. there are further thresholds where the brain can't quite figure out what the heck is going on. If you've acclimated to playing a game at 30 and then switch to 60, suddenly 60 looks wrong, but if you play at 60 for a few months, then playing at 30 looks wrong. It was with VR tech that we really discovered where the cross-over point is for motion sickness. It's not straightforward: usually the higher rate induces motion sickness, but it can also prevent it. This is where lenses and FOV matter. Even haptic feedback matters here.

b) The GPU is bottlenecked in video bandwidth, and can't actually do this at a resolution that matters

c) display compression or other tricks like DLSS lower the visual quality and induce artifacts that make the experience worse at high refresh rates.

 

So pardon me if I feel that people are just throwing money away when they buy 60Hz+ monitors and aren't playing competitively with them. A game and GPU may be capable of 500Hz, but I sincerely doubt that anyone can perceive a 500Hz refresh rate, and I also doubt the black-to-black of the monitor is really 500Hz.

 

Hell, the screenshot shows it's only a 1080p TN panel. So enjoy 500Hz of dim 6-bit video.

 


1 hour ago, Kisai said:

Your eye perceives motion at 15fps,

That's about the lowest rate we perceive as continuous/fluid motion, true.

But it seems like arguing about how fast cars can drive. 60 km/h is decent.

... but I'm no expert.


This seems to be aimed at people who think a higher refresh rate makes them better at FPS games.

I'm not surprised it's a TN panel, though I wonder how good the response time and motion blur are.


2 minutes ago, Blademaster91 said:

This seems to be aimed at people who think a higher refresh rate makes them better at FPS games.

 

Pro gamers do actually feel this way, and it's an interesting phenomenon. I follow the pro Apex scene, as I just find the gameplay at high levels exciting, but regardless: recently there was a LAN event, and the amount of uproar on Twitter over the 144Hz monitors for the event was astounding. Some of the things being said by the top players, ones paid to be there, about the lack of 240 or 360Hz panels were wild to see.

 

Personally I'm plenty happy with my giant ultrawide at 144Hz for my old man single player games.



26 minutes ago, Kisai said:

a) Your eye perceives motion at 15fps, and reasonably reacts as though it's real at 24fps. When it hits 30fps it starts believing there is something wrong, because it looks wrong. For example, The Hobbit was filmed at 48fps, and various IMAX films are filmed this way, and what happens is the eye/brain gets a kind of whiplash. In some people that's motion sickness. So at 60, 90, 120, etc. there are further thresholds where the brain can't quite figure out what the heck is going on. If you've acclimated to playing a game at 30 and then switch to 60, suddenly 60 looks wrong, but if you play at 60 for a few months, then playing at 30 looks wrong. It was with VR tech that we really discovered where the cross-over point is for motion sickness. It's not straightforward: usually the higher rate induces motion sickness, but it can also prevent it. This is where lenses and FOV matter. Even haptic feedback matters here.

I am gonna need you to cite some sources on this, because this is plagued with misinformation and an overall poor understanding of how the eye processes motion. Let's take a look at an actual study on this: https://azretina.sites.arizona.edu/node/837

Quote

Though we aren’t completely sure, we may offer a simple explanation as to why a 240Hz monitor may seem smoother than a 144Hz monitor. Even if our individual cells can only perceive images at say 75 FPS, the goal of films and games is to give an illusion of motion. Creating a series of still images into what we perceive as movement and motion is a lot more complicated than putting a bunch of still images together. Game and film makers must not only capture each image of movement but capture moments in between to create a ‘blur’. A study in 2010 by Rufin vanRullen revealed that the minimum refresh rate for us to detect motion is 13Hz, but we still don’t have a concrete answer on what the maximum is. What we do know it that most people cannot tell the difference between 144Hz and 240Hz. In fact, researchers believe that a steep drop off in perception of higher frame rates begins as low as 90Hz. Typically around 200Hz, though, ‘images’ appear simply as real life motion.

 

So ultimately, what FPS can we see at?

The simple answer is: we don’t know yet. Even though it’s thought that our eyes can only see up to 75FPS, there seems to be some difference in high FPS and refresh rates. Some people have trained their eyes to notice the ‘flicker’ of lower refresh rates, like film makes and professional gamers. Ultimately, we may debunk the myth that there is simply no difference in monitors over 60Hz and games over 60 FPS.

Fluidity of motion extends well beyond "15fps", even far beyond 30. If you put a monitor in front of me and switch between 30, 60, and 120fps, I can demonstrate with 100% accuracy which refresh rate is currently running with just a simple mouse cursor against a background. No gaming or video content required. The reason being, we notice when an image isn't refreshing quickly enough. We can see the jitter in the trailing of the cursor, and our brain knows when information isn't being updated quickly enough relative to what it is processing. People who quote nonsense such as "the human eye can't see beyond X framerate" simply don't get it and are regurgitating the same misinformation that has been debunked for years.

 

35 minutes ago, Kisai said:

b) The GPU is bottlenecked in video bandwidth, and can't actually do this at a resolution that matters

This depends entirely on the game, and at 500fps (1080p) it's typically not your GPU that is the bottleneck, it's the number of draw calls your CPU can feed the GPU. 1080p at 500fps is relatively easy for a 3090 in esports titles, but it's far more difficult to find a CPU that can feed 500fps without those jarring dips in framerate for the 1% lows:

 

[Benchmark chart: CS:GO average and 1% low framerates for various CPUs, each paired with an RTX 3090]

Here you can see even a 12600K can dip from a 622 average down to 372. You'd almost need something like a 12900K to maintain consistently high framerates at that refresh rate. This was with an RTX 3090, btw. So each CPU is bottlenecking the RTX 3090, not the other way around.
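To put those numbers in frame-time terms, here is a quick back-of-the-envelope sketch (the 622/372 figures are the ones quoted above):

```python
# Frame-time budgets: a monitor at H hertz needs a fresh frame every
# 1000/H milliseconds, so dips in framerate blow the budget long before
# the average does.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given framerate."""
    return 1000.0 / fps

budget = frame_time_ms(500)   # 2.00 ms per frame to saturate a 500Hz panel
avg = frame_time_ms(622)      # ~1.61 ms: the average framerate keeps up
low = frame_time_ms(372)      # ~2.69 ms: the 1% lows exceed the budget

print(f"500Hz budget: {budget:.2f} ms/frame")
print(f"622 fps avg:  {avg:.2f} ms/frame (within budget: {avg <= budget})")
print(f"372 fps lows: {low:.2f} ms/frame (within budget: {low <= budget})")
```

In other words, even a CPU averaging well over 500fps stops feeding the panel a new frame every refresh whenever its lows cross the 2 ms line.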

 

41 minutes ago, Kisai said:

c) display compression or other tricks like DLSS lower the visual quality and induce artifacts that make the experience worse at high refresh rates.

Not entirely sure what this has to do with anything. You don't need these features to hit 500fps, especially if you are playing competitive titles in the first place. Most competitive gamers run the lowest settings because they value framerate and low input lag over eye candy. There is also a tactical advantage to be had in games where you can lower the amount of grass/bushes in an open field and see the guy with max settings thinking he is hidden, but low-res Larry shows up and snipes him on his Compaq laptop, lol.

 

43 minutes ago, Kisai said:

So pardon me if I feel that people are just throwing money away when they buy 60hz+ monitors and aren't playing competitively with them.

To each their own, I suppose. If my old man eyes can easily tell the difference between 60 and 120, I'll pay that premium for my pleasure, lol. If you want to put yourself at a competitive disadvantage, I certainly won't stop you.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


This is just getting ridiculous at this point...

CPU: AMD Ryzen 3600 / GPU: Radeon HD7970 GHz 3GB with Noctua Fans / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 11 Pro


2 hours ago, Vishera said:

Gimmick, bad balance of refresh rate/resolution/price.

The sweet spot is 120Hz 1440p, unless you are playing e-sports shooters and seek an edge over your rivals.

144Hz 4K, so my GPU actually has something to do.


I don't understand why people are complaining.

You want limits on technology?

FYI, fighter pilots are known to register flashes at 1/1000 of a second and make out images at 1/255 of a second, or something like that, but OK.

 

 


On 5/24/2022 at 3:41 PM, MageTank said:

-snip-

I think a lot of the misinformation regarding this comes from the fact that our eyes do not see "frames" at all, and trying to map the "analog" way our eyes see the world onto digital "frames" is very messy and imperfect.

 

 

 

  

On 11/3/2015 at 1:27 PM, LAwLz said:

Our eyes don't work like a camera. What happens is that photons hit protein in our eyes (called opsin) which starts a chemical process resulting in an electrical signal getting sent to our brain and processed. Once that process is over there is a brief period where the opsin have to "recharge" (reverse back to how it was before it was hit by light).

If we were going to define an "FPS" value to human vision I guess we should go with that recharge time. I don't have any solid numbers so take this with a shovel of salt, but I have heard it can take up to 10 minutes for an opsin to recharge. So going by that we could say humans see the world at 1 frame per 10 minutes.

Luckily for us humans however, we got billions of opsins in our eyes and they are not all recharging at the same time. So assuming that each opsin goes off one after another we could see billions of "frames" every second, except each "frame" would only be an extremely tiny dot.

If you want to know about FPS in games and such then higher is better. 30 to 60 is a massive difference. 60 to 120 is less noticeable than 30 to 60 but it's still there. Over that and the difference is harder to spot. It also depends on what you are doing. In a test done by the US Air Force they put their pilots in a dark room and then had an image appear for 1/220th of a second (so, 220 FPS). The pilots not only noticed the image but they could even see what aircraft was in the image. That's not to say that you will notice the difference between 219 and 220 FPS though.

 

 

It all boils down to how we define the question.

 

There is evidence that pilots can see details in an image they were only exposed to for 1/220 of a second. Does that mean they can see at 220 FPS? Probably not if it was a constant stream of similar images and then one that was slightly different.

 

If we are talking about being able to detect flickering in a display with high frequency spatial edges? In that case humans can detect it at over 500Hz, but only if we move our eyes.

 

Are we talking about gap-detection? In that case the difference varies greatly between different people. The mean for the general population is 45Hz, but 25% of the population could see more than 60Hz, and some people in the same study could detect 500Hz.

 

When NASA and the US Air Force worked together to develop their high fidelity flight simulators (for 2013 standards), they concluded that their pilots were easily able to perceive 120Hz and set that as a requirement.

https://humansystems.arc.nasa.gov/publications/2012_08_aiaa_ig_obva.pdf

 

 

This MIT study found that it was possible for the test subjects to identify the content of an image shown for as little as 13 milliseconds, even when they were being shown 12 different images in a row. 13 ms would be 76 FPS.

 

So the answer all depends on what question we are asking.
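All of these thresholds are the same quantity expressed two ways, so here is the trivial conversion they share as a sketch (the 13 ms and 1/220 s figures are the ones cited above):

```python
# Converting the exposure times cited in this thread into equivalent
# frame rates: fps = 1000 / exposure_in_milliseconds.

def exposure_to_fps(exposure_ms: float) -> float:
    return 1000.0 / exposure_ms

print(exposure_to_fps(13.0))        # MIT study: 13 ms per image, roughly 77 fps
print(exposure_to_fps(1000 / 220))  # USAF test: a 1/220 s exposure is 220 fps
print(exposure_to_fps(2.0))         # a 500Hz panel refreshes every 2 ms
```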

 

 

 

Edit:

If we are after a display that can "outperform" a well trained human eye in any scenario then we need 500Hz or more. 

But such a display will have big drawbacks that are probably way more noticeable than some extreme edge cases. 

 

Personally I prefer 120Hz. I think that's the sweet spot: you will most likely never notice any significant difference by going higher, it's above the threshold where it's still fairly easy to see the difference (like going from 30 to 60), and it's low enough that there aren't any major drawbacks (like resolution, ghosting, brightness, difficulty to drive, etc.) with our current technology.

 

I have no interest in this monitor, other than it being a neat demo.


8 hours ago, MageTank said:

I am gonna need you to cite some sources on this, because this is plagued with misinformation and an overall poor understanding of how the eye processes motion. Lets take a look at an actual study on this: https://azretina.sites.arizona.edu/node/837

 

The entire problem is that the eye is analog, and trying to measure it in "FPS" or "Hz" runs into the same problem as the Nyquist theorem. Yes, most people won't really notice anything over 60Hz, and most people won't hear much above 22kHz, yet for reasons often on the digital side (e.g. mixing) we use 16-bit/44kHz audio or 120Hz refresh rates, because that covers enough of the margin of error on the analog side. Yet you will have audiophiles who insist on 24-bit/96kHz, and that's the same argument being presented with 144Hz, 240Hz, 360Hz, etc. Remember, "60Hz" exists because of the power standard in North America, back with CRT monitors. That's the only reason 60Hz exists at all.
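The Nyquist point can be made concrete with a toy calculation. This is only an illustration of frequency folding in a sampled system; the tone frequencies are arbitrary example values:

```python
# Frequency folding (aliasing): a system sampling at rate fs can only
# represent frequencies up to fs/2; a tone above that reappears at a
# folded, lower apparent frequency.

def alias_freq(f_hz: float, sample_rate_hz: float) -> float:
    """Apparent frequency of a pure tone of f_hz after sampling."""
    return abs(f_hz - round(f_hz / sample_rate_hz) * sample_rate_hz)

print(alias_freq(19_000, 44_100))  # below Nyquist (22.05 kHz): stays 19000
print(alias_freq(25_000, 44_100))  # above Nyquist: folds down to 19100
```

The analog/digital mismatch described above is exactly this: the sampled side has a hard ceiling at half its rate, while the analog side does not.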

 

But here's the kicker: on a CRT I can see the flicker at 60Hz. I can't at 90Hz. I can also hear the full 20Hz-20kHz audio range, but keep having arguments with people who insist you can't. Yet I have been able to hear https://en.wikipedia.org/wiki/The_Mosquito and various kinds of electronic dog whistles. Though "hear" is probably the wrong word; "feel in the ear" is more like it.

 

The entire problem with saying "you can't see X fps" is that it assumes everyone is the same. Someone who has grown up with high refresh rate monitors will not want to go back to a low refresh rate one. Just like someone who listens to 24-bit/96kHz audio won't go back to 16-bit/44kHz, because there is a perceptible difference, even if the difference is largely the sound card DAC behaving differently rather than any additional fidelity.

 

8 hours ago, MageTank said:

To each their own, I suppose. If my old man eyes can easily tell the difference between 60 and 120, I'll pay that premium for my pleasure, lol. If you want to put yourself at a competitive disadvantage, I certainly won't stop you.

You aren't at a competitive disadvantage with a 60Hz monitor. TN panels suck; they're dim and make everything look like there's a grey film over it.


54 minutes ago, Kisai said:

You aren't at a competitive disadvantage with a 60hz monitor.

Can't believe I'm reading this but here we are

-sigh- feeling like I'm being too negative lately


If I wanted to use a TN panel monitor, I'd go dust off my circa-2007 square Acer. The fact that we haven't moved past TN panels is absolutely ridiculous to me.

Highly knowledgeable in all the obscure 2000s hardware & software you'll never need to ask about


1 hour ago, Moonzy said:

Can't believe I'm reading this but here we are

Cause it's not the monitor, it's the game timing loop and netcode, which often run well below 60Hz anyway.


11 hours ago, Kisai said:

Your eye perceives motion at 15fps

15 FPS is stuttery and feels sluggish...

My comfort zone is at 45-75 Hz.

1 hour ago, Kisai said:

You aren't at a competitive disadvantage with a 60hz monitor. TN panels suck, they're dim and make everything look like there is a grey film over it.

More frames + Hz = more information through your eyes for your brain to react faster

Though too much of it can give you a headache.


16 minutes ago, Kisai said:

Cause it's not the monitor, it's the game timing loop and netcode, which is often well below 60hz anyway.

I am a game dev, so....

Neither of those limits the framerate of the game or the refresh rate of your monitor,

So it doesn't matter here....

You can have a high refresh rate monitor and high FPS, so your eyes can perceive information from the monitor as quickly as possible, improving your reaction time.
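One way to put numbers on the "faster perception" point: the panel can only show new information at its next refresh, so the refresh interval bounds the delay the display itself adds. A minimal sketch with example refresh rates:

```python
# Added display delay from refresh rate alone: an input that just missed a
# refresh waits a full interval; a random input waits about half of one.
refresh_rates_hz = [60, 144, 240, 500]

for hz in refresh_rates_hz:
    worst_ms = 1000 / hz   # input arrived right after the previous refresh
    avg_ms = worst_ms / 2  # input arrives at a random point in the interval
    print(f"{hz:3d}Hz: up to {worst_ms:5.2f} ms extra, ~{avg_ms:.2f} ms on average")
```

Going from 60Hz to 500Hz shaves the worst case from about 16.7 ms down to 2 ms; whether that margin matters is exactly what this thread is arguing about.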


wonder how that went, rip TN and 500hurtz. hz'ing back into the future


11 minutes ago, Vishera said:

I am a game dev, so....

Neither of those limits the framerate of the game or the refresh rate of your monitor,

So it doesn't matter here....

You can have a high refresh rate monitor and high FPS, so your eyes can perceive information from the monitor as quickly as possible, improving your reaction time.

I sincerely doubt the command lists are being pushed 500 times per second, even in the best-designed games. In fact, most games built in Unity just burn more GPU and CPU cycles at higher refresh rates, with a negative return on playability and system performance.

 

We're constantly telling people to lock Unity games at 60Hz if they are going to stream them, because otherwise it breaks the stream.

 

And as I said, you're not going to have a "500fps" game experience if the netcode is locked to 200ms. It's a known thing in pretty much every game that relying on the game client to be honest is a mistake. So you're doing something wrong if you're accepting 500 packets per second from a game client when you're accepting 2-15 from everyone else. You are not at a competitive advantage with a higher refresh rate monitor, and if for some reason you are, the game is fundamentally broken if you can just cheat with vsync off.
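The render-rate vs netcode gap is easy to quantify. A toy sketch with hypothetical numbers (server_tick_hz and client_fps are illustrative values, not any specific game's):

```python
# How many rendered frames fall between consecutive server updates when
# the client renders far faster than the server simulates.
server_tick_hz = 64   # assumed server tick rate (illustrative)
client_fps = 500      # what a 500Hz monitor would ask of the client

frames_per_tick = client_fps / server_tick_hz
print(f"~{frames_per_tick:.1f} frames drawn per server update")
```

Most of those frames can only interpolate or extrapolate the same server state; the display refreshes far faster than the authoritative game state does.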

 

