
Screen tearing education please

Wookiee

Hey all,

 

I recently built what I thought was a decent mid-tier gaming rig that would be capable of streaming.

 

Rig:

Nvidia RTX 2060 Super

2x8GB Corsair DDR4 3200MHz RAM

Ryzen 7 2700X CPU

Corsair RM650 PSU

Crucial P2 1TB NVMe Boot Drive (Only storage so far)

ASRock B450M Motherboard

Arctic Freezer 34 eSports DUO CPU Cooler

 

Monitors:

MAIN - MSI 1440p 32" monitor (DP)

SECONDARY - HP 1080p monitor (HDMI)

 

So, my question: when just gaming I can run Darktide at max settings (no ray tracing) with buttery smooth results, yet when streaming with Streamlabs OBS I have to drop to low graphics and turn off all the AI frame rendering except Nvidia RTX.

 

What is causing the screen tearing / freezing? Can my GPU not handle the same rendering? Is my CPU bottlenecking everything? Is my RAM getting flooded, or am I just a completely inept boomer who's missing something here?

 

Can you also tell me what all the different AI rendering systems do? E.g. RTX, DLSS, I think Darktide has TRXX or something like that too, and an AMD renderer as well.

 

Do these fight each other, or do they work well together?

 

Thanks in advance for any help and/or education on these subjects


Well, usually "RTX on" means real-time ray tracing, and it's very demanding on the GPU; the 2060 Super was in the first generation of cards that had actual hardware for it, so performance will tank a lot if you enable it. It has nothing to do with AI; it's just a rendering technique that is more accurate than the approximations older techniques use. Before real-time ray tracing, developers would "fool" you into thinking, for example, that light rays were real by showing you something that looks good but isn't physically accurate. That also means a lot more FPS, since you don't have to compute how every light ray behaves. My understanding of it is quite superficial, but I hope you get the main point: RTX means lower FPS but a prettier picture. (Side note: some people can't even tell the difference.)

Now, DLSS is AI that tries to mitigate the performance hit from ray tracing. It approximates what the picture should look like while only rendering part of it; the AI guesses what's missing so the GPU doesn't have to do all the computing work. As a trade-off it might look a bit worse or show visual artefacts from time to time. (Side note: some people don't even notice those.) There are several versions of DLSS (1-3) and they vary in quality and execution, but they all have the same goal: more FPS. In theory they are supposed to work in tandem, but you can turn each on individually, ray tracing if your hardware allows it (RTX) or upscaling if you just need more frames (DLSS).

DLSS is interesting since it uses AI in different ways, so again I'd say watch some videos dedicated to the topic or read articles that can explain it much better than I can.

In the end it's a matter of preference, but since your card is mid-range, I don't think RTX is really made for it. You can obviously turn it on and play around with it, see if it looks better than normal, and see if DLSS helps you get playable framerates with RTX on, but at the end of the day, if you want a nice RTX experience you probably need to spend more money on a graphics card. There is nothing wrong with leaving RTX off and enjoying your games at normal framerates, though.
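If it helps to see the numbers, here's a rough back-of-the-envelope sketch in Python (the internal resolution is just an assumed example, roughly what a "Quality"-style upscaling preset might use at 1440p) of why rendering fewer pixels and letting the AI fill in the rest saves so much work:

# Rough illustration (not how DLSS works internally): compare how many pixels
# the GPU has to shade per frame at native 1440p vs. a lower internal
# resolution that the upscaler then reconstructs to full size.
native = (2560, 1440)      # the monitor's output resolution
internal = (1707, 960)     # assumed internal render resolution

native_px = native[0] * native[1]
internal_px = internal[0] * internal[1]

print(f"native:   {native_px:,} pixels per frame")
print(f"internal: {internal_px:,} pixels per frame "
      f"({internal_px / native_px:.0%} of the shading work)")

Less than half the pixels per frame is where the extra FPS comes from; the trade-off is that the reconstruction occasionally guesses wrong, hence the occasional artefacts mentioned above.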

 

Keep in mind that features like these need to be implemented by the game devs, so most games don't even have them. You can find a list of RTX games here, for example: https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

 


You seem to be confused about what RTX is: it's not a performance-improving/upscaling/frame-generation feature, that's DLSS. RTX is real-time ray tracing, i.e. a nicer look at a high performance cost, especially on 20-series cards.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


So, screen tearing happens when multiple frames are shown within a single screen refresh.

 

So to give an example.

 

Say your monitor is 60 Hz and the game renders 120 fps. The screen starts drawing a frame, but halfway through the image a second frame arrives, and thus the top half of your screen shows one image and the bottom half another.

If you were running 240 fps, you would be looking at four different images on your screen at once. This causes tearing that is especially visible when there is a lot of movement, because each of those four images will have, say, a tree in a slightly different place, and thus each part of the tree doesn't line up with the next part from a different image.

 

V-sync fixes this and always shows you one image per refresh. I do believe it causes some input lag, but nothing the average 'casual' gamer will notice.
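If it helps, here's a tiny toy model of that 60 Hz / 120 fps example in Python (deliberately simplified, the numbers are just the ones from above):

# Toy model: a 60 Hz display scans out one refresh (~16.7 ms) from top to
# bottom while the GPU delivers a new frame every ~8.3 ms (120 fps).
# Without v-sync, every frame that lands mid-scanout leaves a tear line.
refresh_ms = 1000 / 60
frame_ms = 1000 / 120      # try 1000 / 240 to see three tear lines (four slices)
rows = 1440                # vertical resolution of the panel

tear_rows = []
t = frame_ms
while t < refresh_ms:
    tear_rows.append(int(rows * t / refresh_ms))
    t += frame_ms

print(f"tears per refresh: {len(tear_rows)} at row(s) {tear_rows}")
# -> one tear, roughly halfway down the screen, exactly as described above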


Just to add from skimming... yeah, it sounds like you have too many "effects" going, like TRXX + RTX and whatnot... that might look good in game (does it?), but I could see it easily overburdening the recording software or resource allocation... so turn the effects down (or off) and see if that improves anything.

 

 

@Neroon that's a good explanation, but the elephant in the room is: if I have rock-solid 60 fps on a 60 Hz screen, I still get screen tearing without v-sync...

 

Basically it looks like even the slightest variation (which you typically can't see in monitoring software) causes screen tearing... it's just weird?

Maybe it depends on the game or rendering engine too, not sure.

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


4 hours ago, Mark Kaine said:

@Neroon that's a good explanation, but the elephant in the room is: if I have rock-solid 60 fps on a 60 Hz screen, I still get screen tearing without v-sync...

Basically it looks like even the slightest variation (which you typically can't see in monitoring software) causes screen tearing... it's just weird?

 

 

That makes sense: unless you get a perfect 60 fps that aligns perfectly with the refresh, it gets out of 'sync'.


3 minutes ago, Neroon said:

That makes sense: unless you get a perfect 60 fps that aligns perfectly with the refresh, it gets out of 'sync'.

Yeah, I guess that makes sense. It's still weird because you'd think the screen and GPU (or OS) would "auto-sync"... I guess that's where G-Sync and other methods come in (the NVIDIA Control Panel has lots of settings that are supposed to reduce screen tearing, for example "Fast Sync", etc.).


 

 

 


2 hours ago, Mark Kaine said:

Yeah, I guess that makes sense. It's still weird because you'd think the screen and GPU (or OS) would "auto-sync"...

It's not weird; the display clock just isn't synced to the rendering clock. Even if you generated an exactly constant 60 fps (which will never happen), it would likely be phase-shifted relative to the display clock.

Unless you turn on v-sync, which does just that. G-Sync etc. adds the ability for both the frame generation rate and the display refresh rate to vary, instead of being predetermined fixed values.
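A tiny toy model in Python of that point (the 5 ms offset is just an assumed example):

# Even at a perfectly constant 60 fps on a 60 Hz panel, if the render clock is
# phase-shifted relative to the display's scanout, each new frame still lands
# mid-scanout and tears at the same spot until v-sync ties the clocks together.
refresh = 1 / 60          # scanout period in seconds
offset = 0.005            # assumed 5 ms phase shift between the two clocks
rows = 1080

tear_row = int(rows * (offset % refresh) / refresh)
print(f"with a {offset * 1000:.0f} ms clock offset, every refresh tears around row {tear_row}")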



  • 2 weeks later...

So just to clarify, when I'm streaming via OBS I should look at minimising my extra effects and locking my framerate?

 

How come OBS uses more GPU anyway? I thought it was mainly a CPU draw.


1 hour ago, Wookiee said:

How come OBS uses more GPU anyway? I thought it was mainly a CPU draw.

Depends on whether you configured it to encode using the CPU or the GPU, and also, if you add overlays, PIP sources and the like, something has to blend them, and that will be the GPU.
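As a very rough illustration (all numbers are assumptions: a 1440p canvas scaled to a 1080p60 stream), here's the extra per-second pixel work the compositing and rescaling add on top of rendering the game itself, before the frame even reaches the encoder (NVENC on the GPU or x264 on the CPU, depending on your output settings):

# Very rough model: on top of rendering the game, the GPU also captures the
# 1440p frame, composites the OBS scene (overlays, webcam, etc.) and rescales
# it to the 1080p output, 60 times per second.
canvas = 2560 * 1440      # OBS canvas matching the game resolution
output = 1920 * 1080      # scaled stream output
fps = 60

extra_pixels_per_second = (canvas + output) * fps
print(f"~{extra_pixels_per_second / 1e6:.0f} million extra pixels touched per second")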



Toggle on v-sync and disable ray tracing.


Also, OBS is unfortunately pretty demanding, especially compared to ShadowPlay. I would say ShadowPlay has a ~1% performance impact and OBS ~10%, so in demanding titles it is noticeable.

 

 

The only thing that will really fix your issue is having enough overhead and using v-sync or similar to lock the framerate / avoid screen tearing.

 

Lastly, I'm not sure the issue is actually screen tearing... if you run with an unlocked framerate it should occur with or without OBS, unless the game happens to be locked at 60 fps and OBS causes it to drop below that.


 

 

 


On 11/19/2022 at 5:25 AM, merco said:

Well, usually "RTX on" means real-time ray tracing, and it's very demanding on the GPU; the 2060 Super was in the first generation of cards that had actual hardware for it, so performance will tank a lot if you enable it.

 

I'll also add that RTX and DLSS can tank FPS if you forget to set a targeted FPS limit. GPUs and CPUs are, hardware-wise, built to push toward their maximum load, so giving them a ceiling they can't exceed can make them a bit more stable and reduce the symptoms of unpleasant performance. The reason an unlimited FPS setting can be visually distracting and unstable on a lot of lower-end cards is that the hardware then tries to reach the maximum possible framerate, and if it can't actually sustain that, it can stutter or, in the worst case, stall out and crash. The way to get the most out of your hardware is to find out what it can handle and then tinker with the settings until you hit a well-balanced equilibrium.
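To illustrate what an FPS cap actually does, here's a minimal Python sketch (in practice you'd use the in-game limiter, RTSS or the driver's Max Frame Rate setting rather than writing anything yourself): the loop simply waits out whatever is left of each frame's time budget instead of rendering flat out, so the GPU never runs at 100% chasing an unlimited target.

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame

def render_frame():
    time.sleep(0.004)                # stand-in for the real render work

for _ in range(5):                   # a few demo iterations
    start = time.perf_counter()
    render_frame()
    left = FRAME_BUDGET - (time.perf_counter() - start)
    if left > 0:
        time.sleep(left)             # wait out the rest of the budget
    print(f"frame time: {time.perf_counter() - start:.4f} s")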

I love PC building and gaming. 
REMEMBER: bottlenecks can happen at any point in a PC. Make sure all parts are in equilibrium, unless you intend to upgrade later. Also QA tested AAA games.

