
You CAN Game at 8K… but DON’T! (SPONSORED)

SeanLMG

I wonder if 4K is the practical limit at which the human eye can tell a difference on a 27" display at a reasonable viewing distance.

I edit my posts more often than not


Maybe I'm weird, but I personally don't understand the push for 4K or 8K gaming to begin with. It might be because I don't have perfect vision, but even though the difference between 1080p and 1440p is noticeable, it isn't at a level that made me stop using my 1080p monitor.

As far as I'm concerned, it's really been the case since the 7th console generation that resolution matters less than art style.


2 minutes ago, Tan3l6 said:

I wonder if 4K is the practical limit at which the human eye can tell a difference on a 27" display at a reasonable viewing distance.

Every display is "retina" if you're far enough away.

I sold my soul for ProSupport.


2 minutes ago, Ultraforce said:

Maybe I'm weird, but I personally don't understand the push for 4K or 8K gaming to begin with. It might be because I don't have perfect vision, but even though the difference between 1080p and 1440p is noticeable, it isn't at a level that made me stop using my 1080p monitor.

As far as I'm concerned, it's really been the case since the 7th console generation that resolution matters less than art style.

Absolutely agree. 4K seems like the absolute max at 32", at least.

 

 

  

1 minute ago, Needfuldoer said:

Every display is "retina" if you're far enough away.

Yeah, but for a monitor that distance is less than a meter.
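
For reference, a quick back-of-the-envelope check in Python, using the common (and rough) 1-arcminute visual acuity figure:

import math

# Distance beyond which adjacent pixels blur together, assuming ~1 arcminute
# of visual acuity (a rule-of-thumb figure, not a hard limit).
def retina_distance_cm(ppi: float) -> float:
    pixel_pitch_in = 1.0 / ppi                 # spacing between pixel centers
    one_arcmin = math.radians(1.0 / 60.0)
    return pixel_pitch_in / math.tan(one_arcmin) * 2.54

print(round(retina_distance_cm(163)))          # ~54 cm for a 27" 4K panel (163 PPI)

So for a 27" 4K panel the "retina" threshold works out to roughly half a meter, consistent with the point above.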

I edit my posts more often than not


5 minutes ago, Ultraforce said:

Maybe I'm weird, but I personally don't understand the push for 4K or 8K gaming to begin with. It might be because I don't have perfect vision, but even though the difference between 1080p and 1440p is noticeable, it isn't at a level that made me stop using my 1080p monitor.

As far as I'm concerned, it's really been the case since the 7th console generation that resolution matters less than art style.

Pushing for 8K makes 4K more accessible in terms of market adoption and prices. 

It also helps other tech (e.g. VR) become mainstream, because it can't work properly at low resolutions and low refresh rates.


Also, it's not only about detail, as in how well-defined a surface texture is; it's about colour as well.

The more pixels you have per inch, the better the colour reproduction you can perceive at any distance.

E.g. banding in a sky gradient can disappear if you have a high enough resolution, even when the colour depth of the signal is low.

Or, to put it simply, if something looks like this at 720p:

[image: gradient with visible banding]

it will look like this at 4K:

[image: the same gradient, appearing smooth]

with the exact same signal.


Just now, papajo said:

The more pixels you have per inch, the better the colour reproduction you can perceive at any distance.

Resolution has little to do with colour, imo.

Colour depth has more to do with it (8-bit to 12-bit and beyond).

I edit my posts more often than not


17 minutes ago, papajo said:

...

Or, to put it simply, if something looks like this at 720p:

[image: gradient with visible banding]

it will look like this at 4K:

[image: the same gradient, appearing smooth]

with the exact same signal.

You've got something messed up. 

 

6-bit gradient (if even) vs 8-bit.

I edit my posts more often than not


9 minutes ago, Tan3l6 said:

Resolution has little to do with colour, imo.

 

The bigger the pixels are, the easier it is for our eyes to distinguish their individual colour tones; the smaller the pixels are, the more our vision blends the colours of the whole group together.

So we perceive the image on the right as having better colour than the image on the left:

[image: side-by-side gradient comparison]

You can actually check this yourself. Open this grayscale gradient: you can easily see how the tones change, and everything looks blocky because of that. Now close the image, go to the directory you saved it in, set the view to large icons, and check its thumbnail. You will see a uniform gradient instead, because the "boxes" of different tones are now small enough that your vision blends them together.

https://easyupload.io/8isamw

On top of that, if the footage is actually made for a higher resolution, each individual pixel can carry a different colour, so more pixels = more colours.
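
If you'd rather see it in numbers than in a file manager, here is a minimal numpy sketch of the same thumbnail effect; the band and block sizes are arbitrary, picked just for the demo:

import numpy as np

# A coarse 8-tone grayscale "gradient" with visible banding.
BANDS, BAND_W, ROWS = 8, 120, 64
row = np.repeat(np.arange(BANDS) * (255 // (BANDS - 1)), BAND_W)
img = np.tile(row, (ROWS, 1)).astype(np.float64)             # 64 x 960 image

# Shrink 16x in each direction by box-averaging, as a thumbnail would.
B = 16
small = img.reshape(ROWS // B, B, img.shape[1] // B, B).mean(axis=(1, 3))

print("tones in full-size image:", np.unique(img).size)     # 8
print("tones in thumbnail:      ", np.unique(small).size)   # 15, band edges blend

The averaging creates in-between tones at every band edge, which is exactly what your eye does to pixels it can no longer resolve.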

 


2 minutes ago, papajo said:

The bigger the pixels are, the easier it is for our eyes to distinguish their individual colour tones; the smaller the pixels are, the more our vision blends the colours of the whole group together.

So we perceive the image on the right as having better colour than the image on the left:

[image: side-by-side gradient comparison]

You can actually check this yourself. Open this grayscale gradient: you can easily see how the tones change, and everything looks blocky because of that. Now close the image, go to the directory you saved it in, set the view to large icons, and check its thumbnail. You will see a uniform gradient instead, because the "boxes" of different tones are now small enough that your vision blends them together.

https://easyupload.io/8isamw

On top of that, if the footage is actually made for a higher resolution, each individual pixel can carry a different colour, so more pixels = more colours.

 

How about MSAA?

But I guess you're somewhat right. Still, imo, AA does a better job.

I edit my posts more often than not


6 minutes ago, Tan3l6 said:

You've got something messed up. 

 

6-bit gradient (if even) vs 8-bit.

No, you just don't get how vision works. (Also, the actual image is 8-bit vs 24-bit, if you must know, but it is obviously not about that. There is no way to put a 4K and a 1080p monitor in front of your eyes with a magic wand, so I used something that conveys the effect of colour blending, the same way monitor manufacturers show you a blurred screenshot next to the unblurred version of the same screenshot to convey that e.g. 4K looks better than 1080p, despite neither screenshot being 4K and one of them obviously being blurred lol.)


1 minute ago, Tan3l6 said:

How about MSAA?

But I guess you're somewhat right. Still, imo, AA does a better job.

Nope, AA is just a SUBSTITUTE for the lack of higher resolution.

Higher resolution always does a better job than low res + AA.


1 minute ago, papajo said:

Nope, AA is just a SUBSTITUTE for the lack of higher resolution.

Higher resolution always does a better job than low res + AA.

No, just consider DLSS (3.0) and you'll see that you're wrong.

DLSS (3.0) is free performance and quality.

I edit my posts more often than not


Just now, Tan3l6 said:

No, just consider DLSS (3.0) and you'll see that you're wrong.

DLSS 3.0 (besides being stupid console peasant tech brought to the PC master race to introduce impurity 😛 ) has nothing to do with our conversation.


 


2 minutes ago, papajo said:

DLSS 3.0 (besides being stupid console peasant tech brought to the PC master race to introduce impurity 😛 ) has nothing to do with our conversation.


 

You're wrong anyway; just waiting for backup 😄

You were presenting a 6-bit image as 8-bit in the original.

I edit my posts more often than not


1 minute ago, Tan3l6 said:

You're wrong anyway; just waiting for backup 😄

You were presenting a 6-bit image as 8-bit in the original.

Again, what I did is analogous to this:

[image: 1080p vs 4K sharpness comparison, right side blurred]

This image is a single resolution, and the right side has been blurred to showcase that 4K has more clarity than 1080p. That is true, but there is no way to show it to you unless you have both a 4K and a 1080p monitor, so they use an analogous trick.

Which is the same thing I did with the two gradient pictures.

As for the concept itself, I explained why it works that way with our vision, and I also uploaded a file with some simple instructions for you to follow to see the actual effect.


4 minutes ago, papajo said:

Again, what I did is analogous to this:

[image: 1080p vs 4K sharpness comparison, right side blurred]

This image is a single resolution, and the right side has been blurred to showcase that 4K has more clarity than 1080p. That is true, but there is no way to show it to you unless you have both a 4K and a 1080p monitor, so they use an analogous trick.

Which is the same thing I did with the two gradient pictures.

As for the concept itself, I explained why it works that way with our vision, and I also uploaded a file with some simple instructions for you to follow to see the actual effect.

https://www.provideocoalition.com/color_depth/

 

Actually I give up...

I edit my posts more often than not


3 hours ago, papajo said:

  

That's like totally wrong..

If you don't have a 4K screen, you won't be able to tell the difference (or the difference will be smaller) on a 1080p screen when playing a game with 4K textures.

4K textures have more detail lol.

But there is no reason to have 4K textures if you are going to play on a 1080p screen, because 1080p textures will look almost as good. (There is still a reason to have 4K textures on a 1080p screen, but the benefit is not that great; it's the same reason rendering at 4K and downscaling to 1080p looks slightly better on a 1080p monitor than rendering at 1080p.)

There is a huge difference, though, if you have a 4K (or especially an 8K) monitor and the textures are 720p or lower:

[image: low-res vs high-res texture comparison screenshot]

Like, I am currently on a 32" 1440p screen and can still see the difference even in the zoomed-out part lol

That's like totally wrong..

 

A 4K texture means the texture file is 4096x4096 pixels. It has absolutely nothing to do with the screen resolution we call 4K. If you don't believe me run that same test at 1080p, you'll still notice a difference between the 2K (2048x2048, not the "2K"=1080p we're used to from screens) and 4K (4096x4096) textures. On the face it's probably going to be pretty subtle because it's on a small object so the texture will be denser, but it's often very obvious on large objects with stretched textures like big rocks or walls.
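
As a side note on scale, here's a rough Python sketch of what those square texture sizes cost in VRAM uncompressed; real games use block compression (BC1/BC7 etc.), so actual figures are several times smaller:

# Uncompressed VRAM cost of square RGBA8 textures; a full mip chain adds ~33%.
for name, side in (("1K", 1024), ("2K", 2048), ("4K", 4096), ("8K", 8192)):
    base_bytes = side * side * 4         # 4 bytes per texel (RGBA8)
    with_mips = base_bytes * 4 // 3      # 1 + 1/4 + 1/16 + ... = 4/3
    print(f"{name} ({side}x{side}): {with_mips / 2**20:.0f} MiB")

That's about 85 MiB for a single uncompressed 4K texture with mips, which is part of why texture resolution is a quality setting at all.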

 

I absolutely agree with you here:

3 hours ago, papajo said:

4K textures have more detail lol.

 

But you're wrong here:

3 hours ago, papajo said:

there is no reason to have 4K textures if you are going to play on a 1080p screen, because 1080p textures will look almost as good (there is still a reason to have 4K textures on a 1080p screen, but the benefit is not that great; it's the same reason rendering at 4K and downscaling to 1080p looks slightly better on a 1080p monitor than rendering at 1080p)

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi  | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 Fractal Design Meshify C Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


1 hour ago, BobVonBob said:

That's like totally wrong..

 

A 4K texture means the texture file is 4096x4096 pixels. It has absolutely nothing to do with the screen resolution we call 4K. If you don't believe me run that same test at 1080p, you'll still notice a difference between the 2K (2048x2048, not the "2K"=1080p we're used to from screens) and 4K (4096x4096) textures. On the face it's probably going to be pretty subtle because it's on a small object so the texture will be denser, but it's often very obvious on large objects with stretched textures like big rocks or walls.

 

I absolutely agree with you here:

 

But you're wrong here:

I did not get a lot of sleep before my original reply; I totally forgot that when referencing certain stuff I have to be more specific, since people blanket-term things all the time.

Btw Bob, you nailed the reference.

(Likewise, the original reference for ray tracing is audio-related and predates CGI.)

Corridor Digital's VFX react videos go into detail on the topic of assets and such.

To answer the other question: most modern game engines still use a decent amount of legacy code, and you also need to guess right about where things are heading. Framework failed and CryEngine failed because they guessed wrong about where the industry was going. (The modern CryEngine is a fork of the legacy console-version code.)

MSI X399 SLI Plus | AMD Threadripper 2990WX, all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128 GB 3000 MHz | Corsair RM1200i | 150 TB | Asus TUF Gaming mid tower | 10 Gb NIC


4 hours ago, Tan3l6 said:

I wonder if 4K is the practical limit at which the human eye can tell a difference on a 27" display at a reasonable viewing distance.

Well, it depends on viewing distance, and there are calculators out there for that.

 

https://www.calculatorsoup.com/calculators/technology/ppi-calculator.php

 

Based on the ~163 PPI of 2160p at 27", I'd say the limit is a bit further north, though most people will hit diminishing returns around this point. Again, my home-tested conclusion is that the limit (for my viewing distance) is somewhere around 180 PPI.

So with that in mind, we very much need 8K to achieve that same ~163 PPI on a 55" display like the Odyssey Ark.
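
Or, skipping the calculator site, the math is a one-liner; a quick Python sketch:

import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixel density = diagonal resolution divided by diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))   # 163 -> 4K at 27"
print(round(ppi(7680, 4320, 55)))   # 160 -> 8K at 55", nearly the same density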


15 hours ago, BobVonBob said:

...It has absolutely nothing to do with the screen resolution we call 4K.

Nobody said anything about them being the same thing. What I am saying (and what you don't understand, as far as I can tell from your posts) is that if you do not have a monitor with a resolution equal to or higher than the resolution of your textures, then you cannot appreciate the extra detail they give.

And hence, if you increase your monitor's resolution but the texture resolution stays the same and is much lower than the monitor's, you won't see any added detail, since what you see will be the same crap, just with more pixels.

Again, just look at the zoomed-in areas:

[image: low-res vs high-res texture comparison screenshot]

If you have a 1080p monitor, the difference won't be as pronounced, because you don't have the pixels needed to show the added detail the 4K texture brings.

And vice versa: if you have a 4K monitor, you will notice that the 2K textures are shittier than the 4K ones. Or, if you don't have 4K textures, those 2K textures will look the same whether you set the game to 4K or 8K, because it's the same texture; the monitor's resolution won't generate detail that isn't present in the texture!

So if a game has low-res textures and you just increase the monitor resolution, you won't see much difference. That's exactly the initial point.

Whereas (e.g. when they played CS:GO on an 8K monitor), if CS:GO had 8K textures, they would have been able to notice the difference.

But CS:GO does not have 8K textures, so they couldn't tell the difference by changing the monitor's resolution.


16 hours ago, Tan3l6 said:

That is irrelevant; monitors use RGB, so what? I mean, you linked to a fact that has nothing to do with the effect we are talking about.

Just do what I already told you to do:

 

16 hours ago, papajo said:

You can actually check this yourself. Open this grayscale gradient: you can easily see how the tones change, and everything looks blocky because of that. Now close the image, go to the directory you saved it in, set the view to large icons, and check its thumbnail. You will see a uniform gradient instead, because the "boxes" of different tones are now small enough that your vision blends them together.

https://easyupload.io/8isamw



Why does the thumbnail look much better than the actual full-blown image? Your monitor and display settings have the same colour depth in both cases.

It's because colour depth, in terms of bits in your HDMI signal, has nothing to do with it.

It's about how our vision works, and how more pixels in the same area (e.g. pixels per inch, or per square inch) result in a smoother colour gradient transition.
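
Incidentally, this blending is the same trick dithering exploits. A small Pillow sketch (assuming a recent Pillow with the Image.Dither enum) that fakes extra tones at a fixed 16-level depth:

from PIL import Image

# A smooth 8-bit grayscale gradient, then quantized down to 16 levels.
grad = Image.linear_gradient("L").resize((256, 1024)).convert("RGB")

banded   = grad.quantize(16, dither=Image.Dither.NONE)            # visible steps
dithered = grad.quantize(16, dither=Image.Dither.FLOYDSTEINBERG)  # steps hidden by noise

banded.save("banded.png")
dithered.save("dithered.png")   # viewed at normal size, the dither noise blends
                                # into in-between tones, like the thumbnail

Both files use the same 16 tones; the dithered one only looks smoother because dense pixels blend in your eye.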


On 10/30/2022 at 2:37 AM, LeapFrogMasterRace said:

At this rate, 1440p will be king for another 6 years if it takes a 4090 to run games at 4K without DLSS.

 

Nope. It's still 1080p. The Steam survey shows that 67% still play at 1080p.

I mean, if you watch tech channels you would think otherwise, but they are all too far removed from reality. Especially now, with living costs skyrocketing everywhere, 1080p will stay the most-used resolution for a while longer. A lot of people are struggling to pay for gas and electricity right now; upgrading your monitor and GPU is not something most people will be thinking about this Christmas.

Calling 8K gaming a bandwagon is so far removed from reality that it wouldn't even be funny as a joke. It wouldn't even be funny to call 4K gaming a bandwagon.


2 hours ago, papajo said:

Nobody said anything about them being the same thing. What I am saying (and what you don't understand, as far as I can tell from your posts) is that if you do not have a monitor with a resolution equal to or higher than the resolution of your textures, then you cannot appreciate the extra detail they give.

And hence, if you increase your monitor's resolution but the texture resolution stays the same and is much lower than the monitor's, you won't see any added detail, since what you see will be the same crap, just with more pixels.

Again, just look at the zoomed-in areas:

[image: low-res vs high-res texture comparison screenshot]

If you have a 1080p monitor, the difference won't be as pronounced, because you don't have the pixels needed to show the added detail the 4K texture brings.

And vice versa: if you have a 4K monitor, you will notice that the 2K textures are shittier than the 4K ones. Or, if you don't have 4K textures, those 2K textures will look the same whether you set the game to 4K or 8K, because it's the same texture; the monitor's resolution won't generate detail that isn't present in the texture!

So if a game has low-res textures and you just increase the monitor resolution, you won't see much difference. That's exactly the initial point.

Whereas (e.g. when they played CS:GO on an 8K monitor), if CS:GO had 8K textures, they would have been able to notice the difference.

But CS:GO does not have 8K textures, so they couldn't tell the difference by changing the monitor's resolution.

I don't know how to convince you of this, but you are wrong. You absolutely can see differences between 4K, 8K, and even higher texture resolutions on a 1080p or lower screen.

 

I'll give you an example, and if you still don't get it, like you're accusing me of, then I give up.

 

 

You have a wall in front of you in game. It is a big wall, a really big wall, and it has one single 2K texture applied to it. You approach this wall. When you stop you are so close to the wall that you can only see 0.5% of it. Now only 192x108 pixels of the original texture are visible. At 1080p, 100 pixels on your screen are taken up by each pixel of the original texture.

 

Now you switch out the original texture for a 4K texture. There are now 384x216 pixels in your field of view, and wouldn't you know it, that's still lower than 1080p. You can tell the difference, and it's big.

 

You up the texture to 8K, 768x432 pixels visible, still lower than 1080p, you can still see a difference. In fact, you would still be seeing a difference at 480p.

 

16K, 1536x864, still lower than 1080p. You've still got just a tiny bit more room to cram in more pixels.

 

A 32K texture, a gigapixel behemoth. A monolith, gobbling GPU memory like it's going out of style. Finally, absurdly, with a visible resolution of 3072x1728, you won't be able to tell the difference between this and a higher resolution texture on a 1080p screen.

 

 

Is this an exaggeration? Yes, absolutely, but it illustrates that there is no prescribed screen resolution required for different texture resolutions. A 4K screen is not required to see a benefit from 4K textures, an 8K screen isn't needed for 8K textures. It's all relative to how much of the texture you're actually looking at. If instead you back up until the wall covers your screen edge to edge horizontally you wouldn't benefit from anything past 2K at 1080p, where before you could see a difference all the way up to 32K.
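
The arithmetic from the wall example as a sketch you can play with; the 0.5% area figure and the 16:9 crop are the assumptions from the post above, and the numbers land within rounding of the ones quoted:

# Visible texel resolution when ~0.5% of a square NxN wall texture
# fills a 1080p screen (cropped 16:9).
SCREEN_W, SCREEN_H = 1920, 1080
AREA_FRACTION = 0.005

for tex in (2048, 4096, 8192, 16384, 32768):      # 2K .. 32K textures
    visible_texels = tex * tex * AREA_FRACTION
    w = round((visible_texels * 16 / 9) ** 0.5)
    h = round(w * 9 / 16)
    verdict = "texture-limited" if w < SCREEN_W else "screen-limited"
    print(f"{tex:>5}-px texture -> ~{w}x{h} visible texels ({verdict})")

Only at 32K does the visible texel count finally overtake the 1080p screen, matching the conclusion above.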

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi  | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 Fractal Design Meshify C Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


5 hours ago, BobVonBob said:

You absolutely can see differences between 4K, 8K

Dude, have you seen the video of the discussion? There were like 5+ people, including Linus, who said they did not see differences IN THE GAMES they played when switching between 4K and 8K.

And the reason for that is that none of those games had high-res assets, period. (And only one of them, the Asian girl whose name I don't know, realized that this was the reason.)

We are not discussing whether, in general, one can tell the difference between 4K video footage and 8K footage lol


1 hour ago, papajo said:

Dude, have you seen the video of the discussion? There were like 5+ people, including Linus, who said they did not see differences IN THE GAMES they played when switching between 4K and 8K.

And the reason for that is that none of those games had high-res assets, period. (And only one of them, the Asian girl whose name I don't know, realized that this was the reason.)

We are not discussing whether, in general, one can tell the difference between 4K video footage and 8K footage lol

In the line you quoted I'm talking about the texture resolutions, not the screen resolutions.

6 hours ago, BobVonBob said:

You absolutely can see differences between 4K, 8K, and even higher texture resolutions

 

I absolutely agree that 8K screens are nonsense for basically anything, but it's not because the textures in games aren't 8K (the texture size). The textures can be whatever resolution you want and 8K (the screen size) still doesn't make sense. It's not because the textures aren't 8K (the texture size), it's because the angular size of physical pixels in both 4K and 8K (the screen sizes) are too small to be readily distinguishable at typical viewing distances. The point I've been apparently failing to make is that it is absolutely irrelevant that the textures aren't 8K (the texture size), because texture size is independent from screen sizes.
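
To put numbers on "too small to be readily distinguishable", here is a quick sketch; the 55" panel and 2.5 m couch distance are just example assumptions:

import math

# Angular size of a single pixel in arcminutes; ~1' is a common acuity figure.
def pixel_arcmin(w_px: int, h_px: int, diag_in: float, dist_m: float) -> float:
    pitch_m = (diag_in * 0.0254) / math.hypot(w_px, h_px)
    return math.degrees(math.atan2(pitch_m, dist_m)) * 60

print(f"{pixel_arcmin(3840, 2160, 55, 2.5):.2f}'")   # 4K: ~0.44'
print(f"{pixel_arcmin(7680, 4320, 55, 2.5):.2f}'")   # 8K: ~0.22'

Both are already below the ~1 arcminute ballpark, which is the whole problem with 8K at normal viewing distances.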

 

As a sidenote, this is why I hate all the conflicting acronyms in the computer space.

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi  | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 Fractal Design Meshify C Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11

