Misanthrope

Talk me into / Out of a 4k Monitor


Posted · Original Poster

ITT I'd like to open up my thought process to all interested. The situation: I've been using dual monitors for several years now and consider them a very real necessity in my day-to-day stuff. I work from home (Master Data Management, so huge Excel sheets and SQL queries are my bread and butter), but I also really like to keep an eye on some videos, Twitch, or just Discord/Messenger on the secondary monitor while working. I still game a bit in my downtime, but truth be told I'm really not keeping up with modern gaming anymore (full rig in profile). I launch Skyrim from time to time when a mod strikes my fancy, maybe GTA Online for a few days when a new DLC is out, sometimes I delve into older CRPGs, and my son likes to use my rig for Fortnite and other light gaming. But it's been years since I played a really intensive, difficult-to-run game for a substantial amount of time (Witcher 3 at launch, in fact, so 4 years).

 

But overall, gaming has become pretty much secondary to using my PC for mostly light use (surfing, media consumption) and occasional heavier stuff (I might make small meme videos if the fancy strikes), but not much heavy use at all.

 

Old setup for screens:

1x 60 Hz TN 1080p screen that had been dying for a while but is finally dead

1x 75 Hz IPS 1080p screen (Freesync, though I haven't used it since I can max any game I play at 75 Hz no problem)

 

Thinking about:

 

1) 32" VA 4K screen. 60 Hz, nothing fancy, but it does have Freesync and picture-in-picture

2) 27" IPS 4K screen. 60 Hz, also Freesync

 

So what I'm thinking is that for the kind of light gaming I do, a 1070 and Freesync (since you can finally do that on Nvidia cards) would be enough for some 4K gaming (remember, the fastest game I'd play is Fortnite, and upscaling 1080p would work fine for that), and if not, I might step up to a 2080 in the near future without much issue.

 

What I want to figure out is whether 32" is too big for desktop use. I know the density is still fine (at least as good as current 1080p @ 24"), but I'm unsure whether 32" is too big for desktop use or 27" is too small for comfortable reading @ 4K.
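
The density question is easy to sanity-check with the standard PPI formula (diagonal pixels over diagonal inches). A quick sketch of the math (the `ppi` helper is just illustrative, numbers rounded):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The options being weighed, plus the old 24" 1080p baseline:
for w, h, d in [(1920, 1080, 24), (3840, 2160, 32), (3840, 2160, 27)]:
    print(f'{d}" {w}x{h}: {ppi(w, h, d):.0f} PPI')
```

That works out to roughly 92 PPI for 24" 1080p, 138 PPI for 32" 4K, and 163 PPI for 27" 4K, so even the 32" option is noticeably denser than the old panels.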

 

Or maybe this isn't worth it at all vs just getting another basic 1080p panel for the secondary.

 

Thoughts?


-------

Current Rig

-------


I think it is going to come down to readability for you. I currently use a 27" 1440p as my primary monitor, but I rarely put anything like Chrome/browser stuff on it because it looks way too big for that. But when I was doing stuff with SQL and programming it was awesome because of the amount of screen space.

If you're going to go 4K, you are probably going to want a 32" for readability's sake.

 


To be fair and simple: in my experience, a native 1080p screen looks way better than 1080p upscaled. As for your general consumption, the distance you'll be sitting at and the usage time matter; to be frank, your own experience with 4K screens will be the deciding factor. For me the difference was negligible, so I would just stick to a 1080p screen with a high refresh rate (~120 Hz), which would suffice for the 1080p gaming experience.

Edit: The screen size is just psychological adaptation. You'll be just as fine with 32" until you glance over at a 27", and then it might feel huge.

Posted · Original Poster
3 minutes ago, ThePD said:


 

 

I'm leaning a bit that way atm, but my follow-up question is: how good is VA vs IPS? IPS definitely helps a lot with readability vs the older TN panel that died on me, but I've never had a VA for an extended period, and 32" IPS goes up in price quite a bit.

 

2 minutes ago, Zorba2.17 said:


 

I understand that's the most common answer, but this is almost a non-gamer/casual-gamer-only case, since as I said my interest just isn't that high, and even when I do game I usually go for fairly slow-paced stuff.

 

Distance is me sitting at a small desk, btw. So about 1 meter or so (that's 3 feet and change in American).



Just now, Misanthrope said:


Personally, I don't think I have ever used a VA panel, so I don't really have an opinion. I currently use an IPS screen, but I don't know what makes it as great as everyone says, since I have not been able to compare it against a VA screen.


As someone with a 27" 4K:

If you use your system secondarily for gaming and primarily for working with files and such, I say go for one or two good-quality 1440p monitors instead. Plus, it'd be a whole lot easier to push 1440p than 4K.


Just a nutty gal that abuse hardware with F@H and BOINC.

F@H & BOINC Installation on Linux Guide

My CPU Army: 4690K Delid, E5-2670V3, 1900X, 1950X, 5960X J Batch

My GPU Army:960 FTW at 1551MHz, 1080Ti FTW3, 1080Ti SC, 1070 Hybrid, 2x Titan XP

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, PS4 Pro, Xbox One S, Xbox One X

My Tablet Squad: iPad 9.7" (2018 model), Samsung Tab S, Nexus 7 (1st gen)

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU

Posted · Original Poster
1 minute ago, Ithanul said:


Hmm

Yes, I could consider 1440p instead. I'm seeing more options than usual at 1440p too, but it still kinda bothers me that they're virtually the exact same price and features as 4K monitors. If the price is the same, I'd want them to have some form of added value vs 4K (better refresh rate, etc.).



1 minute ago, Misanthrope said:


Unsure how the prices are where you are at.

 

Good-quality monitors are not cheap even at 1440p. :I  I was looking at some color-accurate ones (that tends to be a high priority for me), and those easily get into the $500-$1,000 range here in the States.




You would need to upgrade your hardware to get the full 4k experience.

 

When I used a GTX 980 Ti (roughly equal to a 1070), I had to play Witcher 3 at 1440p, and that is an old game now. Games of Mass Effect 1/2 and Fallout 3 vintage are doable.

A 1080 Ti, 2080, or Radeon VII is what you need to avoid dropping to 1440p.

 

For non gaming, 4k is the way to go. I am using a 32" 4k LG VA monitor now. For what we are doing here 1440p is low res as far as I am concerned.

 

4K at 27" is not ideal. I used a 28" TN from 2015 to 2017, and in some older apps the text was not readable, so I set the desktop to 1440p. 32" 4K was a game changer for me.

 

My $355 LG 32" VA 4K monitor stands up very well against my $1200 LG IPS monitor. If it didn't, it would be gone.

 

My LG 32" 1440p 144 Hz VA did not stand up well at all against my cheap LG 4K monitor. It looked bad. I ended up giving it to my son.

 

32" is perfect for desktop use at 4k. 

27" is good for desktop use at 1440p.

 

My 32" 4K sits on a 48" x 24" table, and I would have to get a larger table if I wanted to go bigger.

I use a 38" 3840x1600 ultrawide for gaming, and it is a perfect fit for my Autonomous 53" x 30" table.

 

 


RIG#1 CPU: Intel i7 8086k | Motherboard: ASUS ROG Maximus X Hero | RAM: G.SKILL Ripjaws V Series 16GB DDR4 3200 | GPU: EVGA GeForce RTX 2080 ti FTW3 ULTRA | PSU: Corsair CORSAIR AX860W | Case: Cooler Master HAF 922 | Cooler: Noctua NH-D15 | SSD: Samsung 970 EVO 2TB


RIG#2 CPU: Intel i7 8086k | Motherboard: ASUS ROG Maximus X Hero | RAM: G.SKILL Ripjaws V Series 16GB DDR4 3200 | GPU: EVGA GeForce RTX 2080 ti XC | PSU: Corsair RMx1000W | Case: Cooler Master HAF X | Cooler: Noctua NH-D15 | SSD: Crucial MX300 2.5" 1TB  

3 hours ago, Misanthrope said:


Many games do not feature 4K textures, so if you are into anything but the newest AAA games, you will not enjoy real 4K anyway. Imho, 1440p is very detailed, definitely enough for gaming, where you do not see individual pixels. What matters is refresh rate. Go for a 144 Hz or 240 Hz monitor, especially if you plan to play any faster-moving game. You will get more detail. Why? Because movement is blurry on a 60 Hz monitor: you will have millions of pixels but no detail in motion. Just try it on your 60 Hz monitor: move this very browser window and try to read the text while it is moving. Can't read it? But you have more than enough pixels to read it when it is static! So why add pixels at all? You can add some if you want a bigger monitor (that's why you may want 1440p), but most definitely add refresh rate. Unless you use it for work where you have a static picture and need good detail (graphic design etc.), more Hz will bring you more detail than more pixels.
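
The blur argument above can be put in rough numbers: on a sample-and-hold LCD, a moving object smears across roughly the distance it travels during one refresh. A minimal sketch (the `smear_px` helper and the 960 px/s pan speed are just illustrative assumptions):

```python
def smear_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate sample-and-hold smear: distance moved per refresh."""
    return speed_px_per_s / refresh_hz

# A window or camera pan moving at 960 px/s:
print(smear_px(960, 60))   # 16.0 px of smear per frame at 60 Hz
print(smear_px(960, 144))  # ~6.7 px per frame at 144 Hz
```

At the same resolution, the 144 Hz panel cuts the smear to well under half, which is the "more detail in motion" effect described above.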

The second reason for 1440p 144 Hz over 4K 60 Hz is this: you need an RTX 2080 Ti to run all games at 4K 60 FPS on max details, or a 2080/1080 Ti on medium/high. Running a game at less than 60 FPS personally sucks for me, and running a game at lower than native resolution sucks a lot, so 1440p60 is not an option on a 4K monitor. But if you get a 1440p 144 Hz monitor, you can run less demanding games like CS:GO at 144 Hz and more demanding ones lower, for example at 60 Hz. Both will look very good, as all 144 Hz monitors support 60 Hz mode. Also, if you have an RTX 2080 Ti, you will be able to fully utilize it on a 144 Hz 1440p monitor, as it is actually more demanding on your hardware to run a game at 1440p144 than at 4K60.
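
The 1440p144-vs-4K60 claim checks out on raw pixel throughput, which is a rough first-order proxy for GPU load (the `pixels_per_second` helper is hypothetical; per-frame cost does not scale perfectly linearly with pixel count):

```python
def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixel throughput: a rough proxy for rendering load."""
    return width * height * refresh_hz

qhd_144 = pixels_per_second(2560, 1440, 144)  # 1440p at 144 Hz
uhd_60 = pixels_per_second(3840, 2160, 60)    # 4K at 60 Hz
print(qhd_144, uhd_60)            # 530841600 vs 497664000
print(f"{qhd_144 / uhd_60:.2f}x") # 1440p144 pushes ~1.07x the pixels
```

So as a first-order estimate, 1440p144 is indeed the slightly heavier target.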

On 5/16/2019 at 10:47 PM, chnapo said:


1440p monitors do look crisper, but trust me, after about 5 minutes in a game at 1080p you'd feel just as fine. I would not spend a single penny on anything other than 1080p with a higher refresh rate; I really do not buy the high-res hype. And yeah, at 1 meter distance, reading will sure as hell look a lot better at high res, but only until you compare 1080p with other high-res panels side by side. My opinion would be to stick to 1080p, sir, unless you are upgrading your PC soon down the line; then yeah, 1440p with a higher refresh rate is the sweet spot.

PS

I do not think it is factually correct that the RTX 2080 Ti is the only component that can pull off >60 FPS; a lot of factors come into play in the FPS realm: the CPU, the RAM, cooling efficiency, etc. Having the latest and greatest does not imply high performance. TBH, the GTX-generation cards outperform RTX cards in terms of sheer FPS output for the money. Check out Linus's test benches with these cards. The RTX cards have more of an ML focus rather than just raw gaming output. I think they are way overpriced and unnecessary for gaming purposes. Correct me if I am wrong.

On 5/18/2019 at 12:55 PM, Zorba2.17 said:


I agree with you up to 24". When you go bigger than 24", 1080p is just not enough. I have a 27" 1080p at work; it is only good as a secondary monitor, for anything from work to gaming. Even reading text sucks.

Check out benchmarks: many games can run on GTX cards at 4K 60 FPS, but there are games that only a 2080 Ti can run at 4K60 at max settings. Really. I have seen almost all of Linus's videos, and I have yet to see one where a 1080 Ti beats a 2080 Ti. They may have a better price/performance ratio, but the general formula is that a 2060 beats a 1070 and matches a 1070 Ti when overclocked, a 2070 beats a 1080, a 2080 matches or beats a 1080 Ti, and the 2080 Ti is unrivaled. RTX on or off, the newer generation of cards is just faster, even if we talk strictly about FPS output. Look, Linus put 2070s into his LAN center PCs, not 1080s, because 2070s are already cheap enough to be worthier than 1080s. In 60 FPS gaming, the CPU does not matter so much, as long as you have a decent 4-core or 6-core Core i5 or Ryzen 5 2xxx or higher; any decent CPU will manage 60 FPS. CPUs become tricky when you want to reach or exceed 144 FPS in some games. Many Ryzens may have trouble with that (you may want an overclocked 2700X for that), as well as older Intels that do not have enough single-core processing power.

RAM is not so important: you either do or do not have enough, and more RAM will not help if you already have enough. Check out Linus's video; he found only a few games that ran better with 8 GB than with 4 GB, and only one (I think it was the new Tomb Raider) that saw a difference between 8 GB and 16 GB. So for most games, even 8 GB is still enough, as long as you do not run any other programs alongside.

Of course, cooling efficiency is something you need to have; otherwise you will thermally throttle, your components will not deliver their best performance, and you cannot even think about overclocking.

So my conclusion is: going 60 Hz and high-res will (mostly) only need a good GPU, while going lower-res and 144+ Hz will also require a good CPU, preferably one of the newer generations. Second, if you know you will cap the 144 Hz limit (so you can play games at 144 FPS with vsync), then you are just fine. But if you know you are getting a 144 Hz monitor and will only get 100-110 FPS in your favorite game, you will want a G-Sync or Freesync monitor (depending on whether you run an Nvidia or AMD GPU).
A second conclusion, which we may already agree on, is that high refresh rate beats high resolution in gaming. But you want to keep a certain DPI, so if you buy a big monitor, you want a big resolution. Also, resolution is always good for productivity.

Posted · Original Poster

So, update: I ended up going for the 4K screen because, as I guessed, it is fairly good for productivity. I am really liking the extra real estate to read places like the forums here, and with picture-in-picture I can actually keep most things open plus an image from my work laptop.

 

Gaming has been fairly decent: I backed off a few settings before cranking up the resolution, and my most-played titles (Skyrim SE, and Fortnite when my son wants to play) are no issue at all on a 1070 at medium settings (textures and other non-intensive settings kept at ultra, though).

 

Only negative so far is that 1080p truly looks like shit, so I will probably push my GPU upwards sooner than expected (but again, not too soon; most of my favorites are pretty old). Also, I thought people were kidding about seeing blemishes like acne scars and wrinkles @ 4K, but damn, it is actually true: tech guys are the ones who actually publish in 4K and also don't really do makeup or anything like that, so it's very noticeable 😂



10 hours ago, Misanthrope said:


1080p does look like crap on a 4K monitor, but all my 4K monitors do a decent 1440p.

 

Be warned: Skyrim at 4K is a bit of a drug. It is the reason I have RTX 2080 Tis. I need them for the ENB.



Posted · Original Poster
5 hours ago, jones177 said:


I usually don't install ENBs. I have, but I usually think the weather/lighting mods on standard SE look good enough. Right now I have to do some fiddling (Freesync really likes fullscreen and vsync on, which is the opposite of my normal setup, and performance gets killed with the opposite settings), and I had weird issues with DisplayPort disconnecting when using Freesync (luckily a reboot fixed it, so probably a weird state in the driver). But when I play Witcher 3 I do think, "Well, I do have enough put away for a 2080 Ti," which is indeed dangerous thinking.

 

5 hours ago, GoldenLag said:

1440p exists. i dont see why anyone would want a 4k monitor..........

 

I appreciate people who do 1440p, don't get me wrong. It's just that the selection available to me (in this case it has to allow very specific fast, cheap delivery and financing through my CC) means I'd be paying 80 to 90% of the 4K price tag for 1440p panels with no additional features, so I might as well just do 4K.

In my case I am basically using it as if it were a 2x2 monitor stack, except far more flexible. I do approach a single big 4K as if it were multiple monitors, so it's not really gaming-centric or focused on media consumption, and more centered on productivity.



On 5/20/2019 at 1:54 PM, chnapo said:


I agree with you; my opinion was biased on the basis of price-to-performance ratios, and yeah, the RTX cards do well compared to GTX cards: the sheer difference in FPS output is minuscule considering the price tags. RAM does affect performance output; the capacity component is not an issue (it depends), though the actual speed of the RAM can noticeably affect output. I do understand CPUs won't matter as much in terms of sheer output alone, but these games are basically complex programs of vector graphics, and good output at high res mostly scales with a better GPU. Still, I do not really get the coordination GPUs have with the CPU for actual execution. I suppose the ALU side of games is handled by the CPU, while the I/O response is on the GPU side, but GPUs also help with the arithmetic side of processing, so it's kinda unclear to me. I know we are way off topic for this thread; I am really not getting any clarity on this.

On 5/21/2019 at 1:44 AM, GoldenLag said:


You may not see a reason, but other individuals can have reasons to get a 4K.

 

On 5/21/2019 at 1:41 AM, jones177 said:


I was doing that drug before I even had a 4K monitor. I would force the GPU to render 4K and then fit it down to the 1440p or 1080p screen. Didn't need AA when doing a stunt like that. Of course, that poor Titan OG went belly up (good thing for warranties).

 

The used 2080 Tis I've seen of late sure are tempting me, but I told myself I'd hold off till next-gen cards drop to get the best jump in performance (hopefully). Plus, MHW is one of the few games I can't get high FPS out of at the moment.



1 hour ago, Ithanul said:


Not the same. I have the tech and have run the tests. I have even rendered at 8K and gone down to 4K. Not much difference.

Supersampling is the first type of AA. I even used it in my 3D work, since it takes less time to render at 8K with no AA and do the reduction in Photoshop.
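
The render-big-then-shrink trick described here is supersampling with a box filter. A toy sketch of just the reduction step (NumPy; `box_downsample` is a hypothetical helper, not what any GPU driver or Photoshop actually runs):

```python
import numpy as np

def box_downsample(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average factor x factor pixel blocks: the reduction step of SSAA."""
    h, w = img.shape[:2]
    img = img[:h - h % factor, :w - w % factor]  # trim to a clean multiple
    return img.reshape(img.shape[0] // factor, factor,
                       img.shape[1] // factor, factor, -1).mean(axis=(1, 3))

# A 2x2 checkerboard (hard 0/1 edges) averages to a smooth 0.5 grey,
# which is why supersampling doubles as anti-aliasing:
tile = np.array([[0.0, 1.0], [1.0, 0.0]])
print(box_downsample(tile, 2)[0, 0, 0])  # 0.5
```

Rendering at 2x per axis means 4x the pixels, which is why this hits the GPU so hard.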

1 hour ago, Ithanul said:

 


I am too old to wait for next gen. Waiting is for the young 😄 .



10 hours ago, jones177 said:


I am just in no mood to drop over a grand for a GPU (even though I can afford such a GPU).



