6 minutes ago, jaslion said:

As far as I am concerned, 4K high refresh isn't even happening in the next 5 years, or at all anymore. Especially not without DLSS/FSR + frame generation crap to pull it through

 

Seems like for most people "true" 4K gaming isn't really any more attainable now than it was 5 years ago.

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | RTX 3080 ti Founders Edition | 4x8GB Crucial Ballistix 3600mt/s CL16


9 minutes ago, Middcore said:

 

Seems like for most people "true" 4K gaming isn't really any more attainable now than it was 5 years ago.

Or 10 years ago, basically. The R9 290X and GTX 780 Ti could do 4K 60 at pretty high settings. The Titan for sure. And that's basically where we've been stuck since 😛


24 minutes ago, Middcore said:

 

Seems like for most people "true" 4K gaming isn't really any more attainable now than it was 5 years ago.

because it was always attainable?

Anyway, considering where Linus puts his PC, a 5090 seems like a fire hazard. 


I don't see the point in 8k gaming. Hell, I don't see the point in 4k gaming, save for some niche scenarios, like where someone has an absurdly large display. Personally, I don't notice a difference between 4k and 1440. It's just diminishing returns.

System Specs: Second-class potato, slightly mouldy


8 minutes ago, YellowJersey said:

I don't see the point in 8k gaming. Hell, I don't see the point in 4k gaming, save for some niche scenarios, like where someone has an absurdly large display. Personally, I don't notice a difference between 4k and 1440. It's just diminishing returns.

People have... TVs....
Most people's TVs are 4K now. 


51 minutes ago, jaslion said:

Or 10 years ago, basically. The R9 290X and GTX 780 Ti could do 4K 60 at pretty high settings. The Titan for sure. And that's basically where we've been stuck since 😛

tf? I get 600 fps at 4K in Valorant. 160 in Final Fantasy VII Rebirth at 4K. 350 in Cyberpunk at 4K with RT?

UserBenchmarks: Game 417%, Desk 121%, Work 464%
CPU: Intel Core i5-13600K - 123.4%
GPU: Nvidia RTX 5080 - 354.6%
SSD: WD Blue SN570 NVMe PCIe M.2 1TB - 298.3%
RAM: Corsair Vengeance LPX DDR4 3200 C16 2x16GB - 109.7%
MBD: MSI PRO Z690-A DDR4
Monitor: X32 4k 480hz OLED


3 hours ago, Chree said:

tf? I get 600 fps at 4K in Valorant. 160 in Final Fantasy VII Rebirth at 4K. 350 in Cyberpunk at 4K with RT?

Valorant is an esports game made to run pretty well on integrated graphics.

 

Rebirth and Cyberpunk, when RT is on, have DLSS enabled. Cyberpunk will also enable frame gen by default with presets.

 

Anyway, I'm talking about native here. At native 4K high(est), a 5090 gets like 110 fps average. You doing 350 is either bullshit, or DLSS + framegen and such is on.

 

Which I clearly stated is not what I include for actual 4k gaming.


1 minute ago, jaslion said:

Valorant is an esports game made to run pretty well on integrated graphics.

 

Rebirth and Cyberpunk, when RT is on, have DLSS enabled. Cyberpunk will also enable frame gen by default with presets.

 

Anyway, I'm talking about native here. At native 4K high(est), a 5090 gets like 110 fps average. You doing 350 is either bullshit, or DLSS + framegen and such is on.

 

Which I clearly stated is not what I include for actual 4k gaming.

In no part of that rambling were u making any sense. Valorant is 4K native getting 600. I can pull 400+ in Overwatch native without FG. Apparently ur stuck at 60 fps 4K? tf

UserBenchmarks: Game 417%, Desk 121%, Work 464%
CPU: Intel Core i5-13600K - 123.4%
GPU: Nvidia RTX 5080 - 354.6%
SSD: WD Blue SN570 NVMe PCIe M.2 1TB - 298.3%
RAM: Corsair Vengeance LPX DDR4 3200 C16 2x16GB - 109.7%
MBD: MSI PRO Z690-A DDR4
Monitor: X32 4k 480hz OLED


6 minutes ago, Chree said:

In no part of that rambling were u making any sense. Valorant is 4K native getting 600. I can pull 400+ in Overwatch native without FG. Apparently ur stuck at 60 fps 4K? tf

Again, Valorant and Overwatch are esports multiplayer games designed to run well on very low-end hardware. As time has shown, those games have always been SUPER easy to run at very high framerates, even at 4K. Hell, the CPU is probably still the bottleneck for at least one of those.

 

The singleplayer "AAA eyecandy" games are the truly hard-to-run ones, and those have basically not advanced much beyond 4K 60fps+ at high(est) settings, native res. The 4090 could do a good amount of very new AAA games at around 100 fps on release, but that very quickly fell off during its lifetime. The 5090, being a much smaller upgrade, is showing a far worse share of games doing 100+.


9 hours ago, starsmine said:

People have... TVs....
Most peoples TVs are 4k now. 

Just because the TV is 4K doesn't mean it's displaying a 4K image. When you factor in viewing distance, you're going to be hard pressed to tell the difference between a 4K image and a 1440p image, and it's probably nearly impossible to tell the difference between a 4K and an 8K image.
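The viewing-distance argument can be sanity-checked with a quick pixels-per-degree estimate. This is just a rough sketch; the screen width, couch distance, and the commonly cited ~60 pixels/degree threshold for 20/20 vision are illustrative assumptions, not figures from this thread:

```python
import math

def pixels_per_degree(horizontal_pixels, screen_width_m, distance_m):
    """Angular pixel density of a display as seen from a given distance."""
    # Horizontal field of view the screen subtends, in degrees
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return horizontal_pixels / fov_deg

# A 65" 16:9 TV is roughly 1.44 m wide; 2.5 m is a typical couch distance
for name, px in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(px, 1.44, 2.5)
    print(f"{name}: {ppd:.0f} pixels/degree")
```

Under those assumptions, both 1440p and 4K already land well above the ~60 pixels/degree rule of thumb at couch distance, which is exactly the poster's point about diminishing returns.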

System Specs: Second-class potato, slightly mouldy


3 minutes ago, YellowJersey said:

Just because the TV is 4k doesn't mean it's displaying a 4k image.

That's literally what that means. 

Running any LCD at non native resolution is asking for issues, be that scaling or input latency. 

You can render or send it different resolutions, but the display will be 4K. If a PC is hooked up to the TV, let the PC do all the scaling so the TV does as little image processing as possible.


[attached image]

 

Linus Sebastian, confirmed team killer. It clearly wasn't the resolution that made him a better gamer.

And now a word from our sponsor: 💩

-.-. --- --- .-.. --..-- / -.-- --- ..- / -.- -. --- .-- / -- --- .-. ... . / -.-. --- -.. .

ᑐᑌᑐᑢ

Spoiler

    ▄██████                                                      ▄██▀

  ▄█▀   ███                                                      ██

▄██     ███                                                      ██

███   ▄████  ▄█▀  ▀██▄    ▄████▄     ▄████▄     ▄████▄     ▄████▄██   ▄████▄

███████████ ███     ███ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀████ ▄██▀ ▀███▄

████▀   ███ ▀██▄   ▄██▀ ███    ███ ███        ███    ███ ███    ███ ███    ███

 ██▄    ███ ▄ ▀██▄██▀    ███▄ ▄██   ███▄ ▄██   ███▄ ▄███  ███▄ ▄███▄ ███▄ ▄██

  ▀█▄    ▀█ ██▄ ▀█▀     ▄ ▀████▀     ▀████▀     ▀████▀▀██▄ ▀████▀▀██▄ ▀████▀

       ▄█ ▄▄      ▄█▄  █▀            █▄                   ▄██  ▄▀

       ▀  ██      ███                ██                    ▄█

          ██      ███   ▄   ▄████▄   ██▄████▄     ▄████▄   ██   ▄

          ██      ███ ▄██ ▄██▀ ▀███▄ ███▀ ▀███▄ ▄██▀ ▀███▄ ██ ▄██

          ██     ███▀  ▄█ ███    ███ ███    ███ ███    ███ ██  ▄█

        █▄██  ▄▄██▀    ██  ███▄ ▄███▄ ███▄ ▄██   ███▄ ▄██  ██  ██

        ▀███████▀    ▄████▄ ▀████▀▀██▄ ▀████▀     ▀████▀ ▄█████████▄

 


7 hours ago, YellowJersey said:

Just because the TV is 4K doesn't mean it's displaying a 4K image. When you factor in viewing distance, you're going to be hard pressed to tell the difference between a 4K image and a 1440p image, and it's probably nearly impossible to tell the difference between a 4K and an 8K image.

That just sounds like you're sitting too far away. Also, try turning off antialiasing and tell me you still can't tell a difference. I appreciate that Linus acknowledged this in his first 8K video.


12 hours ago, Chree said:

I feel like I'm talking to a caveman. 

I can run Unreal Tournament 1999 at 8K on my GTX 1070 Ti, which is obviously not a useful comparison. Sounds like you're stuck playing cave-era games while the rest of us want 4K Cyberpunk, Control, and other next-gen games. The games you mentioned are not next-gen.

 

The expectation was that PC hardware would maintain dominance over consoles, which have been 4K since the Xbox One X. While technically true at the high end, it is no longer the case that a competitive PC (again, compared to a console) can be built for 1.5 to 2x a console's price. Unless you include DLSS and just ignore the myriad issues that consoles don't have to suffer. 

 

It is clear that GPUs are now very specifically targeted at the 1080p, 1440p, and 4K resolutions. Which means that building a PS5-spec rig requires a high-end GPU, again excluding the issues associated with upscaling (even though consoles have dynamic resolution too, their implementation doesn't have the same artifacting).

 

Nvidia is trying to pass off cheaper chips as though they were more expensive models through DLSS, which is well documented if people bother to dig into the actual hardware. Even the halo-tier products are not fully enabled, because those chips are sold as workstation hardware instead. Before CUDA, gamers didn't have to be satisfied with the leftovers.


People once said that 640KB of RAM would be more than anyone ever needed. Time marched on and Tech advanced.

People once said that solid state drives would never be a viable technology. Time marched on and Tech advanced.

People once said that quantum computing was impossible. Time marched on and Tech advanced.

People once said that real time ray tracing would never happen. Time marched on and Tech advanced.

People once said that 4k gaming would never happen. Time marched on and Tech advanced.

 

8k computing and gaming will happen. It's only a matter of time and advances in Tech.


52 minutes ago, Sim2er said:

maintain dominance over consoles which have been 4k since the xbox one x.

Huge asterisk there, tho. They could display in 4K, but the games were never native 4K; it was always a resolution-scaled + checkerboard-rendered kinda deal to get there at 30 fps. Which was easy enough to do on a lot of hardware.

 

54 minutes ago, Sim2er said:

) can be built for 1.5 to 2x a console price.

2x is still plenty to easily surpass a console. 1.5, yeah, that's the cutoff.

 

54 minutes ago, Sim2er said:

Nvidia is trying to pass off cheaper chips as though they are more expensive models through dlss

This, plus publishers cutting optimization budgets, basically killed the big jumps in hardware you'd normally get.


57 minutes ago, Sim2er said:

Nvidia is trying to pass off cheaper chips as though they are more expensive models through dlss

DLSS and FrameGen have nothing to do with it. Nvidia is a business; their goal is to make money. They saw an opportunity to take advantage of a market condition where selling a lower-tier GPU as an overpriced higher-tier model would be favored by the masses. It's crazy and dumb, but it's working because they're flying off the shelves. When the masses stop buying into that business model, Nvidia will learn its lesson.

1 hour ago, Sim2er said:

Even the halo top tier products are not fully enabled

That's "Top Tier". There's no need to re-invent the wheel, or in this case, reinvent a term.


2 hours ago, Avocado Diaboli said:

[attached image]

 

Linus Sebastian, confirmed team killer. It clearly wasn't the resolution that made him a better gamer.

CS2 Deathmatch is everyone versus everyone regardless of team composition, mirroring a lot of community deathmatch servers from CS:GO. So yes, he's a team killer, but it's within the bounds of the rules xD


23 hours ago, Middcore said:

 

Seems like for most people "true" 4K gaming isn't really any more attainable now than it was 5 years ago.

 

Nah, I play everything at 4K because that's the native screen resolution. On a 3090. MOST games are fine, but there are situations where it doesn't feel like 60 fps even though it says it is. Mostly when swinging the camera around in a game, and I immediately feel that as motion sickness when it goes on too long.

 

 

 


4 hours ago, ZedRM said:

People once said that 640KB of RAM would be more than anyone ever needed. Time marched on and Tech advanced.

People once said that solid state drives would never be a viable technology. Time marched on and Tech advanced.

People once said that quantum computing was impossible. Time marched on and Tech advanced.

People once said that real time ray tracing would never happen. Time marched on and Tech advanced.

People once said that 4k gaming would never happen. Time marched on and Tech advanced.

 

8k computing and gaming will happen. It's only a matter of time and advances in Tech.

People also once said that 44.1 kHz is a high enough sampling rate for a quality audio listening experience and we wouldn't need any higher. Time marched on, and, yeah, it turned out they were right, and we still use 44.1 kHz (or 48 kHz) for just about everything in audio consumption.
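The audio analogy works because of the Nyquist–Shannon sampling theorem: a sample rate can represent frequencies up to half that rate, and 44.1 kHz already clears the roughly 20 kHz ceiling of human hearing. A trivial sketch (the 20 kHz ceiling is the usual rule-of-thumb figure, not a value from this thread):

```python
def nyquist_limit_hz(sample_rate_hz):
    """Highest frequency a given sample rate can represent (Nyquist limit)."""
    return sample_rate_hz / 2

HUMAN_HEARING_CEILING_HZ = 20_000  # rough upper bound for young, healthy ears

for rate in (44_100, 48_000):
    limit = nyquist_limit_hz(rate)
    headroom = limit - HUMAN_HEARING_CEILING_HZ
    print(f"{rate} Hz sampling -> up to {limit:.0f} Hz, {headroom:.0f} Hz of headroom")
```

Both consumer rates leave headroom above audibility, which is why "good enough" stuck; the parallel claim is that display resolution hits an analogous perceptual ceiling.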

 

I think both display resolution and refresh rate are approaching the same fate. There is not infinite tangible improvement available in these areas, as the limitations are in what the human eye can perceive and what the human brain can process. Once the "specs" of a display exceed those biological limits, further improvements are unnecessary. I am extremely skeptical that the average person can perceive a difference between refresh rates over 600 Hz at most. Probably 300 Hz is the upper limit for plenty of people.

 

For the typical viewing distance of a living room TV, I bet most people can't tell the difference between 4k and 8k content, all other factors being equal. For a large monitor at your desk, possibly, depending on DPI. But 8k vs 12k? 16k? Highly unlikely. VR is probably the last frontier for display density since it sits so close to your eye. But even then, the limit will also be reached. It may be the case that 8k per eye is completely indistinguishable from real life, all else being equal. Beyond whatever that threshold is, increasing density further is just wasted cost and compute resources.

 

(Being able to achieve pixel-perfect quality at those levels, reducing latency, and reducing the cost will continue to advance, but the "target" won't change once we've reached it.)


On 2/16/2025 at 7:48 PM, smcoakley said:

For the typical viewing distance of a living room TV, I bet most people can't tell the difference between 4k and 8k content

As much as I don't want to admit it, most folks actually can't tell the difference between 480p and 4K under normal living room conditions. I did blind tests with them...

 

On the bright side, they could tell the difference between 320 kbps MP3s and FLAC files.

