Question on fps vs Hz on monitors

Agent181

Does 60Hz mean a monitor will show a max of 60 fps in a game, or not? I will be getting a Dell UltraSharp U2412M (1920x1200). This will help me decide whether I get a 280X or a 290. I really don't care about any fps higher than 50-60.

 

Thanks for your help.


You can render more than 60 FPS, but you can't display more than 60 FPS, so to answer your question: yes. Though I would still recommend the 290, as it will have a higher MINIMUM FPS, which is more important than FPS spikes.


The monitor will refresh 60 times per second, so it can only show 60 FPS. Going over 60 FPS will cause tearing.


Prometheus (Main Rig)

CPU-Z Verification

Laptop: 


Intel Core i3-5005U, 8GB RAM, Crucial MX 100 128GB, Touch-Screen, Intel 7260 WiFi/Bluetooth card.

 Phone:

 Game Consoles:


Softmodded Fat PS2 w/ 80GB HDD, and a Dreamcast.

 

If you want my attention, quote my post or tag me. If you don't use PCPartPicker I will ignore your build.


Does 60Hz mean a monitor will show a max of 60 fps in a game, or not?

 

Yes. The frequency is how many times it updates per second, so you won't be able to see more than 60. I'd choose a 280X.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Yes. The frequency is how many times it updates per second, so you won't be able to see more than 60. I'd choose a 280X.

That's what I thought too. Thanks, everyone.


You can render more than 60 FPS, but you can't display more than 60 FPS, so to answer your question: yes. Though I would still recommend the 290, as it will have a higher MINIMUM FPS, which is more important than FPS spikes.

I won't be playing many highly demanding games, so is the 280X still fine?


The monitor will refresh 60 times per second, so it can only show 60 FPS. Going over 60 FPS will cause tearing.

You get tearing when your GPU and monitor are out of sync, not just when you go past the Hz of the monitor.
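To make that concrete, here's a toy Python sketch (the function name is made up, and this is a deliberately simplified model of scanout): a tear appears wherever in the refresh cycle the GPU happens to swap buffers, which can happen at any framerate once GPU and monitor are unsynchronized.

```python
def tear_line_fraction(swap_time_s: float, refresh_hz: float = 60.0) -> float:
    """Toy model: if the GPU swaps buffers partway through a refresh,
    the tear shows up at roughly the fraction of the screen the scanout
    had reached at the moment of the swap."""
    period = 1.0 / refresh_hz
    return (swap_time_s % period) / period

# A swap right on the refresh boundary lands at the top of the screen;
# a swap halfway through the scanout tears mid-screen.
print(tear_line_fraction(1.0 / 60))  # top of screen
print(tear_line_fraction(0.5 / 60))  # mid-screen
```

The point of the model is that nothing in it depends on the GPU being faster than 60 fps; any unsynchronized swap can tear.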

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero GPU: Asus GTX 780 Ti DirectCU II SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad Air 128GB, iPod Touch 32GB 3rd Gen and an iPod Nano 4GB 3rd Gen. Both the Touch and Nano are working perfectly as far as I can tell :)

The 280X is still fine for 1080p gaming.

It's 1920x1200.

 

 

You get tearing when your GPU and monitor are out of sync, not just when you go past the Hz of the monitor.

Doesn't that already happen, and wasn't that the reason for FreeSync and G-Sync's creation?


It's 1920x1200.

 

 

Doesn't that already happen, and wasn't that the reason for FreeSync and G-Sync's creation?

Still OK, it's not that much different.


Does 60Hz mean a monitor will show a max of 60 fps in a game, or not? I will be getting a Dell UltraSharp U2412M (1920x1200). This will help me decide whether I get a 280X or a 290. I really don't care about any fps higher than 50-60.

Thanks for your help.

Get 4K in-game on that monitor via GeForce Experience (Nvidia video card) and overclock that monitor to 75Hz if possible.

What 60Hz refresh rate means is this:

Your monitor can literally only refresh the display a maximum of 60 times in one second. Within that one second the display is updated 60 times, so whether you're playing a game at 60fps, 120fps, or 100000fps, it doesn't matter: the fastest the screen can refresh is 60 times in one second, so you'll see 60fps.

 

If your game benchmarks say you're able to play at around 80fps but you're using a 60Hz monitor, it's going to look exactly like 60fps, regardless of the internal framerate your card can pull off.
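The cap being described is literally just a min(). A throwaway Python sketch (names are made up) of what the panel lets through:

```python
REFRESH_HZ = 60

def visible_fps(render_fps: float, refresh_hz: float = REFRESH_HZ) -> float:
    # The panel can only scan out `refresh_hz` distinct frames per second,
    # so anything the GPU renders beyond that never reaches your eyes.
    return min(render_fps, refresh_hz)

for fps in (45, 60, 80, 120):
    print(f"{fps} fps rendered -> {visible_fps(fps)} fps visible")
```

(This only covers what's *visible*; as other posts in the thread note, rendering above the refresh rate can still matter for input response and netcode.)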

 

What you should do is decide on a card based on your budget, then buy a monitor with a refresh rate close to what the card can pull off in your favorite games. If you get 120fps or higher in all your favorite games, get a 120Hz monitor. If you're closer to 60fps in those games, get a 60Hz monitor.

Git Gud.


It's not that simple.

 

Monitors refresh 60 times per second at exact intervals. Graphics cards render frames far less regularly than that. If your card renders the first 50 frames in 0.5 seconds but only 10 in the rest of that second, the second half will feel laggy as hell. That example is extreme, but frame timing is AS important as frames per second, though it's harder to monitor; it's something you feel. This is why you get perceived stuttering and games don't quite feel as smooth. Having an average FPS well over the refresh rate of your monitor helps a lot with that!
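That example can be put in numbers. A quick Python sketch (function name made up) showing two runs with the *same* average fps but very different frame pacing:

```python
import statistics

def pacing(frametimes_ms):
    """Average fps plus frame-to-frame jitter (population std dev, in ms)."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    jitter_ms = statistics.pstdev(frametimes_ms)
    return avg_fps, jitter_ms

# Both sequences deliver 60 frames in one second, i.e. 60 fps average...
smooth = [1000.0 / 60] * 60          # a new frame every ~16.7 ms
bursty = [10.0] * 50 + [50.0] * 10   # 50 fast frames, then 10 slow ones

print(pacing(smooth))  # ~60 fps average, zero jitter
print(pacing(bursty))  # ~60 fps average, but ~15 ms of jitter
```

An fps counter reports both runs as "60 fps"; only the jitter number reveals why one feels stuttery.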

 

The only way to tell if it bothers you is to give it a try. Some people don't notice it. I do.

Asus Maximus VII Gene - Intel i7 4790k @ 4.8GHz - Corsair Vengeance Pro 16GB DDR3 @ 2000MHz - Asus Strix GTX 980 SLI @ 1400MHz/8000MHz - 2x Samsung EVO 840 500GB RAID0 - 2x Seagate Barracuda 3TB - Cooler Master V1000 - Focusrite Scarlett 18i6

Sennheiser Momentum Over Ear - Genelec 8040a Pair - Audio Technica AT4040 - Asus PG278Q ROG Swift - Asus PB278Q - 2x Bad Asus IPS 1080 Screens - Ducky Shine 3 w/Browns - Corsair m65 - Razer Orbweaver Clicky - Razer Sabertooth

Apple rMBP Late 2013 - Apple Retina iPad 2 - Apple iPhone 5 - Apple iPod Classic - XBone - Wii U - Pikachu 3DS XL - Katsukity 3DS XL


Get 4k in-game on that monitor via GeForce Experience(Nvidia video card) and OverClock that monitor to 75hertz if possible

Don't need to, plus I don't like Nvidia, so... yeah.

 

 

It's not that simple.

 

Monitors refresh 60 times per second at exact intervals. Graphics cards render frames far less regularly than that. If your card renders the first 50 frames in 0.5 seconds but only 10 in the rest of that second, the second half will feel laggy as hell. That example is extreme, but frame timing is AS important as frames per second, though it's harder to monitor; it's something you feel. This is why you get perceived stuttering and games don't quite feel as smooth. Having an average FPS well over the refresh rate of your monitor helps a lot with that!

 

The only way to tell if it bothers you is to give it a try. Some people don't notice it. I do.

I do notice it, but I really don't play that many high-usage games. The frame rates I'll be getting will be around 50-70fps. I'm fine with that; I'm currently playing most of my games at 18-25 fps, so a slight stutter at 50+ fps won't matter at all to me.


Don't need to, plus I don't like Nvidia, so... yeah.

Hopefully you have a legit reason, or you're just one of those "fanboy" people, no offense. I never bought an AMD video card because they usually run hot and use lots of power, though they're packed with lots of VRAM. I always bought Nvidia because their cards use less power, but what I hate is the lack of VRAM; doubling the VRAM costs like a fourth of the card's price. Saying you "don't like" something means you could miss out on lots of stuff Nvidia offers, like 4K, but on AMD's side they offer games, so I would probably go for games over features…

Here's my two cents.

As stated before, a monitor displays frames at specific intervals, like a clock ticking. Your GPU is kind of like you tapping along every second: you'll try to match it, but sometimes you'll be a little off. This is called frametime variance, a factor as important as FPS itself, as it determines how "smooth" a game is.

Now, I've noticed a few of you saying that going over your refresh rate isn't good, that it'll cause tearing, that you won't notice the difference, etc. I disagree.

Some games rely on their framerate to drive other mechanics and code, especially in multiplayer. Just ask anyone who plays CS:GO competitively: having 150+ FPS will give you smoother aim than being capped via V-Sync (vertical sync), which limits the game's framerate to your monitor's refresh rate. It can affect aim, netcode, and in some games fire rate and movement speed. This is why I keep V-Sync off. If you don't care about it, I'm not calling your opinion wrong.

That is all.

[witty signature]


Hopefully you have a legit reason, or you're just one of those "fanboy" people, no offense. I never bought an AMD video card because they usually run hot and use lots of power, though they're packed with lots of VRAM. I always bought Nvidia because their cards use less power, but what I hate is the lack of VRAM; doubling the VRAM costs like a fourth of the card's price. Saying you "don't like" something means you could miss out on lots of stuff Nvidia offers, like 4K, but on AMD's side they offer games, so I would probably go for games over features…

For gaming I prefer AMD, and I also water cool, so the heat issue isn't a problem. Also, if I did have to use Nvidia, I would use it for professional work, not video games.

 

 

Here's my two cents.

As stated before, a monitor displays frames at specific intervals, like a clock ticking. Your GPU is kind of like you tapping along every second: you'll try to match it, but sometimes you'll be a little off. This is called frametime variance, a factor as important as FPS itself, as it determines how "smooth" a game is.

Now, I've noticed a few of you saying that going over your refresh rate isn't good, that it'll cause tearing, that you won't notice the difference, etc. I disagree.

Some games rely on their framerate to drive other mechanics and code, especially in multiplayer. Just ask anyone who plays CS:GO competitively: having 150+ FPS will give you smoother aim than being capped via V-Sync (vertical sync), which limits the game's framerate to your monitor's refresh rate. It can affect aim, netcode, and in some games fire rate and movement speed. This is why I keep V-Sync off. If you don't care about it, I'm not calling your opinion wrong.

That is all.

I agree with you; that's why I don't believe in paying for any of that V-Sync or FreeSync crap. If we've already dealt with these problems for years, why do we all have to switch and get capped? It's just like what the consoles are doing, but a version for PC.


Here's my two cents.

As stated before, a monitor displays frames at specific intervals, like a clock ticking. Your GPU is kind of like you tapping along every second: you'll try to match it, but sometimes you'll be a little off. This is called frametime variance, a factor as important as FPS itself, as it determines how "smooth" a game is.

Now, I've noticed a few of you saying that going over your refresh rate isn't good, that it'll cause tearing, that you won't notice the difference, etc. I disagree.

Some games rely on their framerate to drive other mechanics and code, especially in multiplayer. Just ask anyone who plays CS:GO competitively: having 150+ FPS will give you smoother aim than being capped via V-Sync (vertical sync), which limits the game's framerate to your monitor's refresh rate. It can affect aim, netcode, and in some games fire rate and movement speed. This is why I keep V-Sync off. If you don't care about it, I'm not calling your opinion wrong.

That is all.

Nearly all Source games benefit a LOT from running at double your refresh rate. It's not just how frames are displayed; there are also some quirks in the netcode that like you running really fast. At least there were; it's been a while since I played TF2 at a competitive level, though there's some great info on that in the ETF2L forums. I'm not sure if other games benefit in the same way; I guess some do, some don't. Also relevant: your actions can't be registered until the next frame is done, so more frames means a consistently quicker response.
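That last point is easy to put a rough number on. A back-of-the-envelope Python sketch (function name made up; this ignores display and network latency and only counts the render loop):

```python
def next_frame_delay_ms(render_fps: float) -> float:
    # Worst case: your input arrives just after a frame started, so it
    # can't influence anything until the next frame, one frame-time later.
    return 1000.0 / render_fps

print(next_frame_delay_ms(60))   # ~16.7 ms at 60 fps
print(next_frame_delay_ms(150))  # ~6.7 ms at 150 fps
```

So even on a 60Hz panel, rendering at 150 fps cuts the worst-case input-to-frame delay by roughly 10 ms.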

 

While all of this is of some importance, it's only important to you if it bothers you.

 

As far as G-Sync goes, you really have to experience it to understand it. There's no substitute for it. G-Sync/FreeSync is NOT AT ALL the same as V-Sync.


 

Doesn't that already happen, and wasn't that the reason for FreeSync and G-Sync's creation?

Yeah, that's why G-Sync and FreeSync were created: so you don't need to enable V-Sync or work your ass off getting your fps as close to 60 as possible. Not all games tear in the same way, though.


I agree with you; that's why I don't believe in paying for any of that V-Sync or FreeSync crap. If we've already dealt with these problems for years, why do we all have to switch and get capped? It's just like what the consoles are doing, but a version for PC.

V-Sync and FreeSync/G-Sync are quite different; FreeSync/G-Sync are really nice technologies, and if you can get them I would say go for it.


V-Sync and FreeSync/G-Sync are quite different; FreeSync/G-Sync are really nice technologies, and if you can get them I would say go for it.

I still don't see why I need to pay double for a monitor with G-Sync/FreeSync when you already pay so much for the card.


I still don't see why I need to pay double for a monitor with G-Sync/FreeSync when you already pay so much for the card.

Because it's that good.

 

It's hard to put into words, but I promise you: once you play with it (seeing it isn't enough), there is nothing else.


Because it's that good.

 

It's hard to put into words, but I promise you: once you play with it (seeing it isn't enough), there is nothing else.

It's not that great; I've tried it already and it's not. It's just like DSR, with all its problems.


It's not that great; I've tried it already and it's not. It's just like DSR, with all its problems.

DSR has nothing to do with G-Sync. Not at all; in fact, they rarely work together. Your results may vary, obviously.

 

You clearly don't want G-Sync/FreeSync, and there's nothing wrong with that. You don't have to pay for it if you don't want it. :)

