EClinus1999

Can my PC run a 27" G-Sync IPS WQHD 1440p monitor at 144Hz?

8 minutes ago, EClinus1999 said:

I guess the 1060 will be good for now. No choice. I can't afford to upgrade my GPU until I get a job after college... thank you for your insight.

You can always just run games at 1080p on the 1440p monitor; that way you can actually hit 144Hz.

The monitor resolution doesn't dictate the game resolution.
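
Why non-native 1080p tends to look soft on a 1440p panel (a complaint that comes up a few replies down) is simple arithmetic: the scale factor isn't an integer, so the monitor's scaler has to interpolate. A minimal illustrative sketch in Python, using only the resolutions discussed in this thread:

```python
# Why 1080p content looks soft on a 1440p panel: the scale factor
# between the two resolutions is not an integer.
native = (2560, 1440)  # WQHD panel
game = (1920, 1080)    # rendered resolution

scale = native[0] / game[0]
print(f"Scale factor: {scale:.3f}x")  # 1.333x -> each game pixel spans
                                      # ~1.33 panel pixels, so the scaler
                                      # must blend neighbouring pixels

# By contrast, 720p scales cleanly on a 1440p panel:
print(2560 / 1280)  # 2.0 -> exact 2x2 mapping (integer scaling, no blur)
```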


Just now, Peskanova said:

At least offline, yes. Online there's a dip in FPS, so you might have to tweak some settings down to medium.

Yeah, the online mode is weird. I can get 70 FPS at 4K in single player, but when I join online the game is pretty much unplayable due to FPS drops, so I need to drop to 1440p.

1 minute ago, Some Random Member said:

You can always just run games at 1080p on the 1440p monitor; that way you can actually hit 144Hz.

The monitor resolution doesn't dictate the game resolution.

1080p on a 1440p display doesn't look good though... it's very blurry. I would rather lower the settings to medium and get playable FPS. You don't need 144 FPS when you have a variable-refresh display. Like I said, even 70-80 FPS is already a lot smoother than a 60Hz display, and you get no input lag, no screen tearing, and no stuttering... so it's smooth. Real smooth.
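
To put rough numbers on the frame-time side of that argument, here's a minimal sketch (illustrative Python; the framerates are hypothetical examples, not measurements from any system in this thread):

```python
# Frame time shrinks as framerate rises; a VRR panel displays each of
# these frames without tearing or added stutter.
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is on screen at a steady framerate."""
    return 1000.0 / fps

for fps in (60, 70, 80, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# 60 fps -> 16.67 ms, 80 fps -> 12.50 ms: a ~4 ms per-frame improvement
# that G-Sync presents cleanly even when the framerate fluctuates.
```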


| CPU: Core i7-8700K @ 5.0GHz, 1.3V  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI GTX 1080 Ti Gaming X Trio, 2GHz OC  RAM: 16GB T-Force Delta RGB 3000MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Rift S |

 

Read: My opinions on VR in its current state, should YOU buy into it?

54 minutes ago, JoostinOnline said:

I felt like saying it twice was rather rude, but maybe it's just how I read it.

 

It's not just the 1060, though. A 1700 is going to struggle to reach 100fps on max settings in most games. With only half the cores being utilized and moderate single-core performance, it just won't keep up.

Golly gee Batman, look at it "struggle," what an absolute travesty. Why, my stock 3570K probably does better, not to mention my G3258 when it was at 4.6GHz...

 

And to think, this was last year when Ryzen was still the new kid, and it has probably gotten better since... https://www.gamersnexus.net/guides/3009-amd-r7-1700-vs-i7-7700k-144hz-gaming

1 hour ago, tmcclelland455 said:

Golly gee Batman, look at it "struggle," what an absolute travesty. Why, my stock 3570K probably does better, not to mention my G3258 when it was at 4.6GHz...

 

And to think, this was last year when Ryzen was still the new kid, and it has probably gotten better since... https://www.gamersnexus.net/guides/3009-amd-r7-1700-vs-i7-7700k-144hz-gaming

Already addressed.


Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures

Quote

Let’s move to 1440p.

At this resolution, everything levels out to perform within a couple percentage points in average framerate. The 7700K is technically leading, but it's close enough to be within our margins for this particularly long test. The R7 1700 is consistently worse in frametimes, measurably and repeatedly, though not in a manner which is appreciable. Both CPUs are capable of sustaining 144Hz at 1440p.

https://www.gamersnexus.net/guides/3009-amd-r7-1700-vs-i7-7700k-144hz-gaming

If Steve says so, case closed.
The 1700 is OK <3, enjoy!


Case: Corsair 760T | PSU: EVGA 650W P2 | CPU Cooler: Noctua NH-D15 | CPU: 8600K | GPU: Gigabyte 1070 G1 | RAM: 2x8GB G.Skill Trident Z 3000MHz | Mobo: Aorus GA-Z370 Gaming K3 | Storage: OCZ 120GB SATA SSD, SanDisk 480GB SSD, WD 1TB HDD | Keyboard: Corsair K95 RGB Platinum | Mouse: Razer DeathAdder Elite | Monitor: Dell S2417DG (1440p 165Hz G-Sync) & a crappy HP 24" IPS 1080p | Audio: Schiit stack + AKG K712 Pro + Blue Yeti.


I used to run a 1060 6GB on a 1440p 144Hz monitor; the card never reached that refresh rate in high-end games.

I have a GTX 1080 Ti now, and even that struggles in some games to keep the FPS up.

If you're looking at a monitor like this, invest in G-Sync and let it handle the refresh rate for you. I love my G-Sync panel.


What does a Transformer get? Life insurance or car insurance? - Russell Howard - Standup (Made me giggle a bit)

Posted · Original Poster
9 hours ago, Some Random Member said:

You can always just run games at 1080p on the 1440p monitor; that way you can actually hit 144Hz.

The monitor resolution doesn't dictate the game resolution.

That's good news. I guess that will do for now, because I'm preparing to spend big on a display. Wonderful to hear. Like I said, it future-proofs my display...

Thank you.

Powers to all of you!

8 hours ago, JoostinOnline said:

What's your problem?

Just people talking out their ass, such as saying that the 1700 will "struggle to hit 100fps on max settings in most games" and claiming that it isn't fast enough for 1440p.

3 minutes ago, tmcclelland455 said:

Just people talking out their ass, such as saying that the 1700 will "struggle to hit 100fps on max settings in most games" and claiming that it isn't fast enough for 1440p.

  1. It's not enough to steadily hit 100fps across all games at max settings.  Did you even read the other reply?  Those are far from the most CPU-intensive games out there, and it's not getting that far above 100fps.
  2. I never said anything about 1440p being a factor there.  Resolution doesn't affect CPU usage.


20 minutes ago, tmcclelland455 said:

Just people talking out their ass, such as saying that the 1700 will "struggle to hit 100fps on max settings in most games" and claiming that it isn't fast enough for 1440p.

 

14 minutes ago, JoostinOnline said:
  1. It's not enough to steadily hit 100fps across all games at max settings.  Did you even read the other reply?  Those are far from the most CPU-intensive games out there, and it's not getting that far above 100fps.
  2. I never said anything about 1440p being a factor there.  Resolution doesn't affect CPU usage.

Guys, please keep the discussion civil. There's no need to be rude to one another; you can disagree, but do it politely. This applies to everyone in this thread.

While I'm here, I'll throw in my two cents:
For 1440p, an overclocked R7 1700 is more than enough to max out almost any graphics card. And in response to @JoostinOnline's claim: resolution does affect CPU usage, maybe not directly, but the higher the resolution, the more your GPU becomes the limiting factor. That means lower framerates than at lower resolutions, so your CPU has less work to do. At 1080p the difference between an R7 1700 and, say, a 7700K is a lot more significant; at 1440p it's not really noticeable for 99% of users.

In this particular instance, OP's GPU is a mere 1060 6GB, so he'll be GPU-bound pretty much no matter what modern CPU he has. Even with a GTX 1080, it wouldn't make a difference between a 1700 and a 7700K at 1440p, so IMO such a monitor is a good buy if he plans on getting a faster GPU from Nvidia in the near future. ;)
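
One way to see that argument is to model framerate as capped by whichever of the CPU or GPU takes longer per frame. A toy sketch in Python; the per-frame costs below are made-up assumptions for illustration, not benchmarks of the actual 1700 or 1060:

```python
# Toy bottleneck model: the slower pipeline stage sets the framerate.
def fps(cpu_ms: float, gpu_ms_per_mpix: float, width: int, height: int) -> float:
    megapixels = width * height / 1e6
    gpu_ms = gpu_ms_per_mpix * megapixels  # GPU cost grows with resolution
    return 1000.0 / max(cpu_ms, gpu_ms)    # CPU cost does not

CPU_MS = 8.0           # hypothetical CPU frame-prep time (125 fps ceiling)
GPU_MS_PER_MPIX = 3.0  # hypothetical per-megapixel GPU shading cost

print(f"1080p: {fps(CPU_MS, GPU_MS_PER_MPIX, 1920, 1080):.0f} fps")  # CPU-bound (~125)
print(f"1440p: {fps(CPU_MS, GPU_MS_PER_MPIX, 2560, 1440):.0f} fps")  # GPU-bound (~90)
# At 1440p the GPU term dominates, so swapping the CPU barely moves the result.
```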

 


CPU: AMD Ryzen 7 3700X GPU: MSI GTX 1080 Ti GAMING X TRIO 11GB GDDR5X Motherboard: ASUS ROG CROSSHAIR VI EXTREME
CPU Cooler: Corsair H100i V2 RAM: Corsair Vengeance LED 16GB DDR4 3200MHz Case: Lian Li PC-O11 Dynamic PSU: Corsair TX650M Gray Unit
Displays: AORUS AD27QD, DELL UltraSharp U2711 Storage: Samsung 850 EVO 120GB, ADATA SP550 240GB M.2, Kingston UV400 240GB, WD Red 2TB & 1TB
Laptop: Acer Nitro 5 CPU: AMD Ryzen 5 2500U GPU: AMD Radeon RX 560X 4GB RAM: 16GB Storage: 240GB M.2 SSD, 1TB HDD Display: 15.6" IPS

3 minutes ago, Morgan MLGman said:

While I'm here, I'll throw in my two cents:

For 1440p, an overclocked R7 1700 is more than enough to max out almost any graphics card. And in response to @JoostinOnline's claim: resolution does affect CPU usage, maybe not directly, but the higher the resolution, the more your GPU becomes the limiting factor. That means lower framerates than at lower resolutions, so your CPU has less work to do. At 1080p the difference between an R7 1700 and, say, a 7700K is a lot more significant; at 1440p it's not really noticeable for 99% of users.

In this particular instance, OP's GPU is a mere 1060 6GB, so he'll be GPU-bound pretty much no matter what modern CPU he has. Even with a GTX 1080, it wouldn't make a difference between a 1700 and a 7700K at 1440p, so IMO such a monitor is a good buy if he plans on getting a faster GPU from Nvidia in the near future. ;)

 

For the most part, I agree with this. However, the OP hasn't given any indication of overclocking, and there are plenty of more demanding games out there (the latest Ghost Recon, Mankind Divided, PUBG, etc.) where you aren't going to get 100fps even if you overclock and pair it with a 1080 Ti. The games that keep getting referenced aren't that difficult to hit 100fps in.



6 minutes ago, JoostinOnline said:

For the most part, I agree with this. However, the OP hasn't given any indication of overclocking, and there are plenty of more demanding games out there (the latest Ghost Recon, Mankind Divided, PUBG, etc.) where you aren't going to get 100fps even if you overclock and pair it with a 1080 Ti. The games that keep getting referenced aren't that difficult to hit 100fps in.

Yeah, though you need to remember that's because those games are optimized like crap. Example:
[screenshot: R5 2600 vs i5-8400 benchmark at 1080p and 1440p]
Note that at 1440p there's no difference, unlike at 1080p with a stock 2600. Once the CPU and memory were fine-tuned, the R5 2600 actually surpassed the performance of the 8400.

[screenshots: benchmark charts comparing the Ryzen and Intel CPUs across resolutions]
It's actually not that bad and not that far off from Intel's counterparts. Note that the higher the resolution, the smaller the difference between those CPUs, as I mentioned in my previous post.

Though once you overclock the CPU and memory on the Ryzen side, the averages look a lot different:
[screenshot: average FPS with the overclocked R5 2600 vs i5-8400]
Even when you remove the two biggest outliers (Prey, The Witcher 3), the 2600 is still 5% faster once overclocked. It's a bit slower at stock, but it's hard to overlook its overclocking headroom, even on cheap boards with the stock cooler.

Again, at 1440p the results would be even closer between the two, so I wouldn't worry about the CPU in this instance; focus on upgrading the GPU, because you're primarily GPU-bound here.

 

Source for those screenshots: https://www.youtube.com/watch?v=AUyF--fJaaM
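
For anyone wondering how a "still 5% faster with outliers removed" average is typically computed, here's a sketch using a geometric mean of per-game performance ratios. The game names and FPS figures below are placeholders, not the numbers from the video:

```python
# Geometric mean of per-game performance ratios, with and without outliers.
from math import prod

results = {  # game: (cpu_a_fps, cpu_b_fps) -- placeholder values
    "Game A": (112, 105),
    "Game B": (98, 96),
    "Game C": (150, 139),
    "Outlier 1": (170, 120),
    "Outlier 2": (88, 60),
}

def geomean_uplift(pairs):
    ratios = [a / b for a, b in pairs]
    return prod(ratios) ** (1 / len(ratios)) - 1  # e.g. 0.05 = 5% faster

print(f"All games:        {geomean_uplift(results.values()):+.1%}")  # ~+19.5%
trimmed = [v for k, v in results.items() if not k.startswith("Outlier")]
print(f"Outliers removed: {geomean_uplift(trimmed):+.1%}")           # ~+5.5%
# Two lopsided games can inflate the headline average considerably.
```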

 



4 minutes ago, Morgan MLGman said:

Yeah, though you need to remember that's because those games are optimized like crap. Example:

True, but unless you think those are the last games that will ever be optimized like crap, it doesn't really change anything. xD



1 hour ago, JoostinOnline said:
  1. Resolution doesn't affect CPU usage.

That's a solid negative, cap'n. Sure, some games are always CPU hogs (Cities: Skylines or BeamNG, for example), but games like Sniper Elite 4, Insurgency, and Fallout 4 scale rather nicely going up in resolution, in my experience. Hell, with Insurgency I gained FPS going from 1080p to 1440p because the CPU usage dropped enough to "re-bottleneck" at a higher framerate.

