Is it worth upgrading a 2080 Ti for gaming?

5 hours ago, GamerBlake said:

Didn’t Linus do a video and disprove the “your eyes can only see 60 FPS” belief?

Yes. The whole "human eye can't see above 30 or 60 FPS" claim (the quoted number varies) is a myth. I can definitely tell if my monitor is set to 60 Hz or 120 Hz. Of course asking someone to guess the exact framerate is a pointless exercise, but the smoothness of 120 Hz over 60 Hz is definitely noticeable to me.

 

If I were to guess at how the myth came to be, it's because movies are 24 FPS (roughly the rate at which we start to perceive motion instead of separate slides, and thus at the time the cheapest option to minimise the amount of film) and most displays are limited to 60 Hz, with only now next-gen consoles and the like starting to offer high refresh rates to the masses.

 

Our eyes are also analog, not digital, and do not operate in FPS so the analogy can't really be made in the first place.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


On 7/1/2021 at 6:47 AM, tikker said:

Yes. The whole "human eye can't see above 30 or 60 FPS" claim (the quoted number varies) is a myth. I can definitely tell if my monitor is set to 60 Hz or 120 Hz. Of course asking someone to guess the exact framerate is a pointless exercise, but the smoothness of 120 Hz over 60 Hz is definitely noticeable to me.

 

If I were to guess at how the myth came to be, it's because movies are 24 FPS (roughly the rate at which we start to perceive motion instead of separate slides, and thus at the time the cheapest option to minimise the amount of film) and most displays are limited to 60 Hz, with only now next-gen consoles and the like starting to offer high refresh rates to the masses.

 

Our eyes are also analog, not digital, and do not operate in FPS so the analogy can't really be made in the first place.

maybe not all brains do. same thing with 4k except now add the eyes.

CPU: Ryzen 2600 | GPU: RX 6800 | RAM: DDR4 3000 MHz 4x8 GB | MOBO: MSI B450-A PRO | Display: 4K 120 Hz with FreeSync Premium


8 hours ago, LOST TALE said:

maybe not all brains do. same thing with 4k except now add the eyes.

"now add the eyes" what? We're already talking about eyes. Regardless of framerate or resolution though, the same argument applies. Our eyes do not operate in pixels or "4k". They are more like analog optics.

 

We do have an innate angular resolution due to pupil size, however, which is about 0.5 arcminutes (one arcminute = 1/60 degree). You can translate that to a physical size using trigonometry: at 30 cm away it corresponds to a resolution of ~44 micrometres. If you grab a 43" screen at 3840x2160 resolution, those pixels will be ~247 micrometres in size, which means we can see them. I just tried it and can indeed make out individual pixels at 30 cm away from my 43" 4K TV.
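For the curious, a minimal Python sketch of that arithmetic (the 16:9 aspect ratio and the helper names are my own assumptions; the input numbers are the ones above):

```python
import math

def angular_to_physical_mm(arcmin, distance_mm):
    # Physical size subtended by a small angle at a given viewing distance.
    return distance_mm * math.tan(math.radians(arcmin / 60.0))

def pixel_pitch_mm(diagonal_in, h_px, v_px):
    # Pixel pitch of a flat panel, assuming square pixels; the aspect
    # ratio is taken from the resolution itself (3840:2160 = 16:9).
    aspect = h_px / v_px
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return width_in * 25.4 / h_px  # 25.4 mm per inch

print(angular_to_physical_mm(0.5, 300))  # ~0.044 mm, i.e. ~44 micrometres
print(pixel_pitch_mm(43, 3840, 2160))    # ~0.248 mm, i.e. ~247 micrometres
```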

 

This is why distance matters so much for screen resolution and why film movies can still benefit from being scanned at something like 5K, I believe (the exact resolution may be different, but the point is that film holds high resolution).



4 hours ago, tikker said:

"now add the eyes" what? We're already talking about eyes. Regardless of framerate or resolution though, the same argument applies. Our eyes do not operate in pixels or "4k". They are more like analog optics.

 

We do have an innate angular resolution due to pupil size, however, which is about 0.5 arcminutes (one arcminute = 1/60 degree). You can translate that to a physical size using trigonometry: at 30 cm away it corresponds to a resolution of ~44 micrometres. If you grab a 43" screen at 3840x2160 resolution, those pixels will be ~247 micrometres in size, which means we can see them. I just tried it and can indeed make out individual pixels at 30 cm away from my 43" 4K TV.

 

I could do it on 27" at 40 cm away or however far I normally sit btw (probably not blue light lol). I divided 3840 by an estimate of the horizontal FOV of 60 and the answer was close to 64 pixels per degree.

 

What I meant by the eye is that I'm sure everyone's eyes can see a higher framerate, but their brains might not. Whereas with 4k, both the brain and the eyes can be failure points. Although to be fair, the person deserves a short explanation of what 4k is, so they look at objects that are farther from the character.



15 minutes ago, LOST TALE said:

I could do it on 27" at 40 cm away or however far I normally sit btw (probably not blue light lol). I divided 3840 by an estimate of the horizontal FOV of 60 and the answer was close to 64 pixels per degree.

Horizontal field of view of 60 what? Whether you can distinguish individual pixels depends on the resolution of your eyes and the "human signal processing chain". That has little to do with field of view. The number of pixels per degree will vary with pixel size and hence screen size. A 27" monitor will have smaller pixels than a 40" one of the same resolution.
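To put explicit units on "pixels per degree", here is a rough sketch under the same assumptions as the earlier snippet (square pixels, aspect ratio taken from the resolution); it shows the number changing with panel size even at a fixed resolution and viewing distance:

```python
import math

def pixel_pitch_mm(diagonal_in, h_px, v_px):
    # Pixel pitch of a flat panel, assuming square pixels.
    aspect = h_px / v_px
    return diagonal_in * aspect / math.hypot(aspect, 1) * 25.4 / h_px

def pixels_per_degree(pitch_mm, distance_mm):
    # How many pixels fit into one degree of visual angle at this distance.
    return 1.0 / math.degrees(math.atan(pitch_mm / distance_mm))

# Same 3840x2160 resolution, same 40 cm viewing distance, two panel sizes:
print(pixels_per_degree(pixel_pitch_mm(27, 3840, 2160), 400))  # ~45 ppd
print(pixels_per_degree(pixel_pitch_mm(43, 3840, 2160), 400))  # ~28 ppd
```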

13 minutes ago, LOST TALE said:

What I meant by the eye is that I'm sure everyone's eyes can see a higher framerate, but their brains might not. Whereas with 4k, both the brain and the eyes can be failure points. Although to be fair, the person deserves a short explanation of what 4k is, so they look at objects that are farther from the character.

I don't understand what you are trying to say here with "failure points". Both your brain and your eyes are always part of the equation, whether it's about framerate or resolution, and in the end it's limited by your "signal processing chain". Yes, to some people the difference is minor or hardly noticeable, and yes, you can train yourself to notice things quicker; that's because our brains are flexible and optimise themselves for what is useful to us.



21 minutes ago, tikker said:

Horizontal field of view of 60 what? Whether you can distinguish individual pixels depends on the resolution of your eyes and the "human signal processing chain". That has little to do with field of view. The number of pixels per degree will vary with pixel size and hence screen size. A 27" monitor will have smaller pixels than a 40" one of the same resolution.

The FOV that the screen covers takes into account pixel size and screen size. 

21 minutes ago, tikker said:

failure points.

Parts of them that are responsible for them not recognizing a difference between different refresh rates and resolutions.

 

Listen I'm done communicating with you about this. There's just too much extra writing required to clarify things for what it's worth. I could always clarify more but it's likely like coding in machine code because the interpreter is shit.



2 minutes ago, LOST TALE said:

The FOV that the screen covers takes into account pixel size and screen size. 

You're overcomplicating things. You have X by Y pixels covering a Z" diagonal screen. That is all you need to know to determine pixel size, as it would be a different-sized screen otherwise. The number of pixels in your FoV is also irrelevant, as you'll be able to distinguish 3 pixels just as well as 3000 pixels if the pixels themselves are large enough. Finally, it doesn't help that you don't give any units for what you're doing. For example:

1 hour ago, LOST TALE said:

I divided 3840 by an estimate of the horizontal FOV of 60 and the answer was close to 64 pixels per degree.

3840 what? A FOV of 60 what? I can say 3840/60 = 64, but you give no reasoning for what you're trying to say with it, why that would be a correct calculation, or what you base it on.
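The "3 pixels vs 3000 pixels" point can be illustrated with the same trigonometry (the panel sizes here are made up for the example): what matters for visibility is the angular size of a single pixel, which depends only on pixel pitch and viewing distance, not on how many pixels the screen holds:

```python
import math

def pixel_arcmin(pitch_mm, distance_mm):
    # Angular size of a single pixel, in arcminutes.
    return 60.0 * math.degrees(math.atan(pitch_mm / distance_mm))

# A hypothetical panel with huge 100 mm pixels vs. a 27" 4K monitor's
# ~0.156 mm pixels, both viewed from 40 cm away:
print(pixel_arcmin(100.0, 400))  # ~842 arcmin: trivially visible
print(pixel_arcmin(0.156, 400))  # ~1.3 arcmin: near the ~0.5' limit quoted earlier
```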

9 minutes ago, LOST TALE said:

Parts of them that are responsible for them not recognizing a difference between different refresh rates and resolutions.

Those parts are your eyes and your brain... The way you worded it looks as if you're implying that framerate is set by your eyes and resolution is set by the brain as well, which is just not the case. Furthermore, our brains can be heavily influenced by prior knowledge and expectations. If you expect to see or hear a difference, you will, or to a greater extent. If you don't, you won't, or to a lesser extent.

2 minutes ago, LOST TALE said:

Listen I'm done communicating with you about this. There's just too much extra writing required to clarify things for what it's worth. I could always clarify more but it's likely like coding in machine code because the interpreter is shit.

You started it and have been exceptionally vague and unclear in your wording, so that's on you. It's not at all "like coding in machine code because the interpreter is shit". Garbage in is garbage out just as much for vague statements throwing terms and numbers around as for machine code. How is someone supposed to get anything out of this post:

14 hours ago, LOST TALE said:

maybe not all brains do. same thing with 4k except now add the eyes.

With "same thing" you might refer to not all people noticing much difference between 60 and 120, but then you say "add the eyes" when going to 4k which makes zero sense. As if you aren't using your eyes when looking at high framerate content.

 

 

This confusion is also why people need to stop thinking of our eyes as working at a certain resolution and framerate, because they simply don't. Our optical system isn't some pixel matrix refreshing or being read out X times per second. There are limits to how quickly we can respond to something and how short a signal we can register, but that only tells us we can see events occurring up to a certain "FPS" equivalent; it doesn't mean we see or operate at that FPS.



On 6/29/2021 at 12:20 PM, GamerBlake said:

I end up paying for performance I’m not getting.

Why not? Does the monitor not have G-Sync?

 

 

On 6/29/2021 at 12:20 PM, GamerBlake said:

But I will admit I do like upgrading for the fun of it too. I hate being in a situation where all of my computer parts are outdated and I need to replace everything, and I've found one way to avoid that is by upgrading things one at a time, so that none of the parts are too old or slow.

That actually kinda makes sense, except it's a really bad time for that… DDR5 is coming out *this* year (or has it already?) and GPU prices are at an all-time high…

 

So what you should do is wait until next year, buy the *then*-current Intel "horse creek" or whatever + DDR5 RAM + motherboard.

 

Then 2-3 years later the then newest and bestest GPU… that way you'll have a top system for like 4-5 years for sure.

 

And your current  GPU is just fine for a few more years.

 

 

OR…

 

you actually save up and buy a whole new PC in 2-3 years, no ifs, no buts. That economically makes the most sense, and the system will be the most balanced too.

 

 

tldr: buy a CPU now and it will be completely "outdated" by the end of the year;

buy a GPU now and it will be overpriced af, your CPU will bottleneck it, and you don't need it cuz your current GPU is def not "outdated" yet…


The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer


I have a 2080 Ti as well, and the 3080 isn't a big enough jump to be worth upgrading to. The gap in performance is reduced (although not necessarily overcome) by overclocking.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III

