
Can I 4K???

Reteeks

Just built a new computer and I'm looking to upgrade my monitor now. I was just going to get a 2K monitor, but 4K would be great. Just wondering if I could run 4K well on my setup?

 

Ryzen 5 2600x

RTX 2060 Super

16GB Ram

 

If any other specs are needed, let me know.

I can't find any info on 2060 Supers running 4K, but there's a lot on 2070s, and the 2060 Super has 8GB of VRAM like the 2070s.

It seems people running 2070s can run 4K, but I've seen some say they have to run medium settings at only 30fps, which doesn't seem worth it.

I'd like to run at least high settings at 60fps. If that's not possible with my setup at 4K, I'll just get a 2K monitor.


Don't expect 4K from a 2060 Super; the only card that can really run 4K normally is a 2080 Ti.

 

Unless you play really undemanding games, expect 1440p at most.

(◑‿◐)


4K low, yeah. 4K at anything higher, don't get your hopes up.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec | PSU misconceptions, protections explained | group reg is bad


1 minute ago, Valkyrie Lenneth said:

Don't expect 4K from a 2060 Super; the only card that can really run 4K normally is a 2080 Ti.

 

Unless you play really undemanding games, expect 1440p at most.

This is completely wrong and misleading.

 

3 minutes ago, Reteeks said:

Just built a new computer and I'm looking to upgrade my monitor now. I was just going to get a 2K monitor, but 4K would be great. Just wondering if I could run 4K well on my setup?

You can play at 4K 60Hz with this setup, yes; however, that's very likely not the best experience. 2560x1440 is by every measure the true sweet spot for graphical quality and performance.

 

There's not much visible difference between 4K and 1440p on monitors up to about 28 inches; you'd be hard pressed to spot it. But 4K will be significantly harder to run, requiring more fine-tuning of settings. If you have to decrease the image quality too far, it defeats the purpose of a higher resolution.
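To put rough numbers on that "hard pressed to spot it" claim, pixel density (PPI) follows directly from resolution and diagonal size. A quick sketch of the arithmetic, nothing game-specific:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" 1440p panel already sits around 109 PPI; 4K at the same size is ~163 PPI,
# arguably past what most people resolve at typical desktop viewing distance.
for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'{name} @ 27": {ppi(w, h, 27):.0f} PPI')
```

The gap only matters if you sit close enough (or the panel is large enough) for the extra density to be visible.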

 

6 minutes ago, Reteeks said:

I can't find any info on 2060 Supers running 4K, but there's a lot on 2070s, and the 2060 Super has 8GB of VRAM like the 2070s.

It can be hard to find mainstream testing because the card is indeed not targeted at 4K gaming, even though it can do well enough at it for the most part with some graphical fine-tuning.

 

7 minutes ago, Reteeks said:

It seems people running 2070s can run 4K, but I've seen some say they have to run medium settings at only 30fps, which doesn't seem worth it.

Can you be more specific here? This is too vague to be taken seriously.

 

8 minutes ago, Reteeks said:

I'd like to run at least high settings at 60fps. If that's not possible with my setup at 4K, I'll just get a 2K monitor.

It really depends on the game and how well you're willing to fine-tune your graphics settings... small tweaks can boost fps by considerable margins while keeping most of the graphical fidelity and quality, but this isn't an exact science and requires effort to find that sweet spot. These two videos below cover it in depth:

 

So, bottom line: 4K gaming with an RTX 2060 Super is by all means doable and can be a perfectly fine experience when targeting 60fps, but you'll still have to put more effort into adjusting settings accordingly.

 

Given you have the alternative of 2560x1440, which looks about as good but is much easier to run and gives you far more flexibility, that should be your ultimate choice here.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

7 minutes ago, Valkyrie Lenneth said:

Don't expect 4K from a 2060 Super; the only card that can really run 4K normally is a 2080 Ti.

The 1080 Ti, 2080, and Radeon VII are solid 4K cards as well; you just have to drop settings lower than with the 2080 Ti, and even the 2080 Ti can't push 4K60 in newer titles. They're also half the price of the 2080 Ti or less, lol. You can push 4K decently with a 1080 or 2070 if you're running older titles or willing to compromise harder on settings.

I've run Assassin's Creed Odyssey at 4K60 with both a 1080 Ti and a Radeon VII; both did just fine, and the game is gorgeous even at high instead of ultra settings.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


1 minute ago, Zando Bob said:

The 1080 Ti, 2080, and Radeon VII are solid 4K cards as well; you just have to drop settings lower than with the 2080 Ti, and even the 2080 Ti can't push 4K60 in newer titles. They're also half the price of the 2080 Ti or less, lol. You can push 4K decently with a 1080 or 2070 if you're running older titles or willing to compromise harder on settings.

I've run Assassin's Creed Odyssey at 4K60 with both a 1080 Ti and a Radeon VII; both did just fine, and the game is gorgeous even at high instead of ultra settings.

I don't see the point of running 4K just for the pixels if you have to lower settings; it's only worth it at ultra.

 

1 minute ago, Princess Luna said:

This is completely wrong and misleading.

 

 

No it's not, because people expect to be playing at proper settings with 4K: not medium-high, but ultra.


4 minutes ago, Valkyrie Lenneth said:

No it's not, because people expect to be playing at proper settings with 4K: not medium-high, but ultra.

Watch the video above.

 

I have a 1080 Ti hooked up to the 4K TV in my living room for couch gaming. It can do 60fps with V-Sync on literally any game I throw at it, at more than sufficiently high settings, simply by not maxing everything out and applying some good sense when fine-tuning the settings for the best result.

 

I don't know why you think it must be ultra/max settings or nothing, or that everyone playing at 4K ought to target that. If that were the case, even an RTX TITAN couldn't run every game at 4K max settings and keep a steady 60fps; the current Red Dead Redemption 2 PC port is a good example, but not the only one.

 

@Zando Bob also brought up valid information about this.


39 minutes ago, Princess Luna said:

Watch the video above.

 

I have a 1080 Ti hooked up to the 4K TV in my living room for couch gaming. It can do 60fps with V-Sync on literally any game I throw at it, at more than sufficiently high settings, simply by not maxing everything out and applying some good sense when fine-tuning the settings for the best result.

 

I don't know why you think it must be ultra/max settings or nothing, or that everyone playing at 4K ought to target that. If that were the case, even an RTX TITAN couldn't run every game at 4K max settings and keep a steady 60fps; the current Red Dead Redemption 2 PC port is a good example, but not the only one.

 

@Zando Bob also brought up valid information about this.

As someone who develops video games as a hobby, I can confirm the argument about High vs. Ultra.

When I implement a graphics feature, I first make it look the best I can and put that in the Ultra preset.

Then I optimize the feature's performance with small downgrades that bring big performance improvements (it looks similar, or close enough), and put the result in the High preset.

 

For example, in the Ultra preset I use real-time reflections, while in the High preset I use baked reflections.

In this case both look much the same, but the real-time reflections do make the game feel more "alive", since you can see the reflections of passing NPCs, flying objects, and yourself; they just cost a lot of performance.
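That Ultra-vs-High workflow often boils down to a data table mapping preset names to feature levels. A minimal Python sketch of the idea (all names are hypothetical, not from any real engine):

```python
from enum import Enum

class Reflections(Enum):
    BAKED = "baked"        # precomputed offline: static, but nearly free at runtime
    REALTIME = "realtime"  # rendered every frame: reflects NPCs and the player, costly

# Ultra gets the feature as first implemented; High gets the optimized
# variant that looks close enough but costs far less to render.
PRESETS = {
    "ultra": {"reflections": Reflections.REALTIME, "shadow_map_px": 4096},
    "high":  {"reflections": Reflections.BAKED,    "shadow_map_px": 2048},
}

def settings_for(preset: str) -> dict:
    """Look up the feature levels a named preset enables."""
    return PRESETS[preset]
```

This is why High often looks nearly identical to Ultra: it is usually the same features, just swapped for their cheaper implementations.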

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

I had a 4K computer and a 1440p ultrawide computer a few feet from each other for years, and I played most games on the ultrawide. The exceptions were my modded games that use 4K textures, since all that detail doesn't show up at 1440p.

 

I now use 1600p because it does show the detail in 4K textures that 1440p doesn't, and it's easier to run. In my tests on most unmodded games, there is little or no difference between 1600p and 1440p.

 

I recommend going 1440p.

I use an RTX 2080 Ti for 4K, and I still play all my vanilla games on ultrawides.

 

I have one of my old 28" 4K monitors mounted vertically beside my 32" 4K monitor, and I hate it. The text is too small even at 150% scaling. If you must go 4K, get a 32" monitor and save your eyes.

 

 

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


Awesome, I've glanced over all that was said. I'll read it all and watch the posted videos when I get home. But from what I've read, I'll more than likely go with 2K; since I've been gaming on a 24" 1080p TV for the past 12 years, it should be a big improvement. I was only considering 4K because some gaming buddies were telling me I could push it, and I figured I could. I want a steady 60 frames and a crisp image.

Thanks for all the feedback!


3 hours ago, Valkyrie Lenneth said:

I don't see the point of running 4K just for the pixels if you have to lower settings; it's only worth it at ultra.

Games look the same at high or ultra, especially at 4K. AA isn't really noticeable; you need none or very light. It's well established that you can massively dial back the clouds and fog in Assassin's Creed Odyssey with little to no noticeable difference in graphical quality (unless you spend 100% of your time looking at the sky, switching between medium and ultra to hunt for the changes). Textures I ran at ultra; those mostly hit VRAM rather than fps (the same way I could push Very High textures in Rise of the Tomb Raider with an RX 480, because it was the 8GB variant). The difference between ultra and high shadows is very slight: noticeable if you directly compare stills, but not distinguishable in gameplay or on their own. Lighting varies from game to game; the difference between even medium and high is often not massive.

A blend of high/ultra with medium on a few really costly graphics options looks the same as full ultra, but will push 60fps on a $700 card versus a shaky 55 or so at ultra on a $1,200 2080 Ti. My 1080 Ti was $550 used, though, and you can get them for $450 now, making the 2080 Ti look like an even worse deal (I still want one tho lmao).

So yeah, there isn't much difference between tweaked settings and straight ultra at 4K. Also, I ran this on a 45" 4K 60Hz IPS TV; differences are much more noticeable on a panel that size, so on a 27-32" gaming monitor they'd look even more similar.
 

The main reason for wanting a 2080 Ti and a good 4K panel is screenshots (I tend to take quite a few, and it frustrates me that even running DSR to push 4K ultra at 200% render scale, the screenshot itself is still 1080p). There you can notice a difference if you compare them side by side; there the absolute maximum graphical fidelity matters. For actual gameplay, like I said, there's basically no visual difference.

Here's a screenshot of my Destiny 2 Guardian at 1080p ultra in the character selection screen:

[Screenshot attachment: IMG_1923.JPG]

 

and with helmet on and a different gun at 4K ultra 200% render scale then downscaled back to 1080p:

[Screenshot attachment: IMG_1924.JPG]

Noticeably crisper (it would be even better on a native 4K panel, because the capture itself would be higher res), but those are still images; in game I don't really notice the difference much. Also, this is 1080p at 100% render scale vs. upscaling to 4K at 200% render scale and stuffing that back into 1080p (4K is already 4x the pixels of 1080p, and the 200% render scale multiplies that again before it's all downscaled back). The difference between 4K ultra and 4K high/medium is even smaller, and smaller again in motion, because stills are easier to pick differences out of than actual gameplay. Sadly I don't have any screenshots from when I ran it on my actual 4K panel.
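For reference on why render scale is so expensive: in most engines (assuming per-axis scaling, which is how the percentage is commonly applied), pixel cost grows with the square of the scale. A quick sketch of the arithmetic:

```python
def internal_resolution(base_w: int, base_h: int, render_scale_pct: int) -> tuple:
    """Resolution actually rendered, assuming the scale applies to each axis."""
    s = render_scale_pct / 100
    return int(base_w * s), int(base_h * s)

# 1080p at 200% render scale is rendered at 3840x2160 (i.e. 4K): 4x the pixels,
# then downsampled back to the 1080p output for a supersampled image.
w, h = internal_resolution(1920, 1080, 200)
print(f"{w}x{h}, {w * h / (1920 * 1080):.0f}x the pixels of native 1080p")
```

The same math shows why 4K at 200% render scale (7680x4320 internally) is out of reach for any single card of this era.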

Games that do noticeably suffer from lowered settings, when built strangely, are ones like Ghost Recon Wildlands. It has massive views similar to some in Assassin's Creed, but due to differences in how it handles them and how far away it renders things, lowering certain settings below medium (I think below high on one or two specific ones; I'd need to boot it up again to compare) can make trees just not exist in the valley across the way, or remove a lot of detail from far-away towns. If you're not a graphics fiend like me and don't go climbing places just to admire the view, your experience wouldn't be hurt at all, so you could drop settings even lower.

Basically, you can almost always drop settings with very little sacrifice in the vast majority of games, and depending on playstyle you may be able to cut them even further. Before you say 4K ultra is the only way people should experience 4K, take a look at your settings, change them, and see what actually makes a difference. Assuming you have enough VRAM, you can usually push ultra textures with little performance hit, and textures are one of the things that noticeably degrade when lowering settings. Other options have much less graphical impact but can net massive performance gains. IIRC you salvage 20fps or so dropping volumetric clouds from ultra to medium in Assassin's Creed Odyssey, for basically no visual downgrade at all.

10 minutes ago, Reteeks said:

Awesome, I've glanced over all that was said. I'll read it all and watch the posted videos when I get home. But from what I've read, I'll more than likely go with 2K; since I've been gaming on a 24" 1080p TV for the past 12 years, it should be a big improvement. I was only considering 4K because some gaming buddies were telling me I could push it, and I figured I could. I want a steady 60 frames and a crisp image.

Thanks for all the feedback!

Hell yeah, 1440p is probs the best for gaming tbh. On a standard 27" panel, there's basically no difference visually between 1440p and 4K, but 1440p is much, much easier to run. 
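The "much easier to run" part is mostly raw pixel count: every frame at 4K pushes 2.25x the pixels of 1440p. A quick comparison:

```python
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BASE = 2560 * 1440  # use 1440p as the reference point

# Per-frame pixel counts, relative to 1440p. GPU load scales roughly
# (not exactly linearly) with pixels rendered.
for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px:>9,} px  ({px / BASE:.2f}x 1440p)")
```

So stepping up from 1080p to 1440p already nearly doubles the pixel load, while 4K more than doubles it again; fps does not scale perfectly with pixel count, but this is the dominant factor.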

