So I'm confused...issue with multiple gpus

commanderZiltoid

I recently put together a new build that uses an RTX 3070 as the primary GPU and a GT 710 for additional monitors. I've noticed that the two monitors connected to the GT 710 are very stuttery/choppy when interacting with them (just dragging windows around on either monitor), and during video playback (tested YouTube videos via Chrome and local video files via Media Player) the audio is fluid but the video is extremely choppy. The system is running the latest Nvidia drivers (461.92). Not sure what could be going on with this? Thinking maybe it has something to do with the drivers?

 

Something else I noticed that is confusing: in Task Manager, if I have all windows closed on my primary monitors (running off the 3070) and two Chrome tabs open on the two monitors on the GT 710, Task Manager shows my 3070 utilization between 30-40% o_0. How could that be if the load is currently on the GT 710? Screen capture attached. Is there something going on behind the scenes where the 3070 is forwarding to the GT 710?

 

Capture.PNG

did you check to see if they are running at their native refresh rate? monitors don't always run at their native refresh rate by default, and you need to change it in settings

2 minutes ago, FilmMoses said:

did you check to see if they are running at their native refresh rate? monitors don't always run at their native refresh rate by default, and you need to change it in settings

All monitors have been configured to run at their native refresh rate. This ain't that.

Windows is very finicky about dragging windows across monitors driven by different cards, ESPECIALLY when the cards are different models of GPU. IIRC, Windows ends up putting most of the work on the 3070 and then just forwards the finished frames across the PCIe bus to the GT 710, using it as a spare output. Most of the work the GT 710 does is re-encoding and re-scaling for its monitor's resolution and output connector. This also means more work for the 3070, because it can't just route the video out its own DP; instead it has to redirect it through the CPU, then the chipset, then to the GT 710. Basically a lot of very unoptimized cooperation that would be too niche for your average Nvidia or MS engineer to bother fixing. 
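To put rough numbers on that frame forwarding, here's a back-of-the-envelope sketch (all figures are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope estimate of the data that crosses the PCIe bus when
# the primary GPU renders desktop frames and a secondary GPU only displays
# them. All numbers are illustrative assumptions, not measurements.

def frame_traffic_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    """Bandwidth needed to copy uncompressed frames, in gigabits per second."""
    bytes_per_second = width * height * bytes_per_pixel * refresh_hz
    return bytes_per_second * 8 / 1e9

per_monitor = frame_traffic_gbps(1920, 1080, 60)   # ~3.98 Gbit/s
print(f"one 1080p60 monitor:  {per_monitor:.2f} Gbit/s")
print(f"two 1080p60 monitors: {2 * per_monitor:.2f} Gbit/s")

# For scale: a PCIe 2.0 x1 link offers roughly 4 Gbit/s of usable bandwidth,
# about one 1080p60 stream; even a wider link still pays latency for the
# GPU -> CPU -> chipset -> GPU round trip on every frame.
```

If the copies really do take that route, a few Gbit/s per display plus per-frame synchronization overhead would be consistent with both the stutter and the surprisingly high 3070 utilization.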

 

The only programs that can effectively use the GT 710 are those that run in exclusive full screen on the monitor hooked up to the GT 710, or programs with hardware acceleration where you select the GT 710 as the accelerator. In other programs you can have the output displayed on the 710, but all of the actual acceleration is still happening on the 3070. In the full-screen case, aside from things like the progress bar, the two cards wouldn't need to cooperate. 

 

Just a theory, but I know that you can't expect a smooth transition between monitors hooked up to different GPUs. 

Fuck you scalpers, fuck you scammers, fuck all of you jerks that charge way too much to tech-illiterate people. 

Unless I say I am speaking from experience or can confirm my expertise, assume it is an educated guess.

Current setup: Ryzen 5 3600, MSI MPG B550, 2x8GB DDR4-3200, RX 5600 XT (+120 core, +320 Mem), 1TB WD SN550, 1TB Team MP33, 2TB Seagate Barracuda Compute, 500GB Samsung 860 Evo, Corsair 4000D Airflow, 650W 80+ Gold. Razer peripherals. 

Also have an Alienware Alpha R1: i3-4170T, GTX 860M (≈ a 750 Ti), 2x4GB DDR3L-1600, Crucial MX500

My past and current projects: VR Flight Sim: https://pcpartpicker.com/user/nathanpete/saved/#view=dG38Jx (Done!)

A do it all server for educational use: https://pcpartpicker.com/user/nathanpete/saved/#view=vmmNcf (Cancelled)

Replacement of my friend's PC nicknamed Donkey, going from 2nd gen i5 to Zen+ R5: https://pcpartpicker.com/user/nathanpete/saved/#view=WmsW4D (Done!)

your issue for the most part is that you are running a new card and one that is very old.

 

windows has no issues at all having multiple monitors if they are all plugged into one GPU. this is a brand new card paired with a card that is 7 years old; it still works, but it's very unreliable. i had an AMD Radeon 6770 from a prebuilt that was giving me issues in some games; i replaced it with a 2070 Super and those issues went away. the issue here is just that: an old, out-of-date card with 2GB of VRAM vs a brand new card with 8GB

15 minutes ago, Nathanpete said:

Windows is very finicky about dragging windows across monitors driven by different cards, ESPECIALLY when the cards are different models of GPU. IIRC, Windows ends up putting most of the work on the 3070 and then just forwards the finished frames across the PCIe bus to the GT 710, using it as a spare output. Most of the work the GT 710 does is re-encoding and re-scaling for its monitor's resolution and output connector. This also means more work for the 3070, because it can't just route the video out its own DP; instead it has to redirect it through the CPU, then the chipset, then to the GT 710. Basically a lot of very unoptimized cooperation that would be too niche for your average Nvidia or MS engineer to bother fixing. 

 

The only programs that can effectively use the GT 710 are those that run in exclusive full screen on the monitor hooked up to the GT 710, or programs with hardware acceleration where you select the GT 710 as the accelerator. In other programs you can have the output displayed on the 710, but all of the actual acceleration is still happening on the 3070. In the full-screen case, aside from things like the progress bar, the two cards wouldn't need to cooperate. 

 

Just a theory, but I know that you can't expect a smooth transition between monitors hooked up to different GPUs. 

I believe that makes sense! So the actual processing is being done on the 3070, then forwarded to the GT 710 for output via some horribly unoptimized driver path? If that's the case, that's probably where the issue is. Because who else is going to think to run a 30-series card with a GT 710? Or better yet, optimize drivers for communication between the two. Probably why the 3070 utilization is 40-something percent just for streaming video.

5 minutes ago, tdkid said:

your issue for the most part is that you are running a new card and one that is very old.

 

windows has no issues at all having multiple monitors if they are all plugged into one GPU. this is a brand new card paired with a card that is 7 years old; it still works, but it's very unreliable. i had an AMD Radeon 6770 from a prebuilt that was giving me issues in some games; i replaced it with a 2070 Super and those issues went away. the issue here is just that: an old, out-of-date card with 2GB of VRAM vs a brand new card with 8GB

The thing about VRAM doesn't matter whatsoever for his problem. VRAM is only shared in any capacity when SLI is involved. Cards can request and transfer the contents of VRAM, but they can't freely access each other's VRAM without SLI. You can run two different cards simultaneously; they just won't be able to synchronize by themselves and need Windows to handle it. And Windows is doing a very bad job at synchronizing a 7-year-old GPU and a <6-month-old GPU. 

1 minute ago, commanderZiltoid said:

I believe that makes sense! So the actual processing is being done on the 3070, then forwarded to the GT 710 for output via some horribly unoptimized driver path? If that's the case, that's probably where the issue is. Because who else is going to think to run a 30-series card with a GT 710? Or better yet, optimize drivers for communication between the two. Probably why the 3070 utilization is 40-something percent just for streaming video.

I don't even think it is driver code, because I have seen people run both a Radeon and an Nvidia card in the same system and it was just as bad. Also, virtualization programs (but not emulators) can make good use of the GT 710. If you set the virtualized OS to run full screen on a monitor hooked up to the GT 710, with the GT 710 as the renderer, then any hitching should no longer be caused by the poorly optimized communication, but rather by hardware allocation limits such as how much RAM and how many threads have been allocated to the virtualized OS. In which case you would probably also hit those same problems without the 710, unless the issue is that the virtualized OS ran out of VRAM. 

9 minutes ago, tdkid said:

your issue for the most part is that you are running a new card and one that is very old.

 

windows has no issues at all having multiple monitors if they are all plugged into one GPU. this is a brand new card paired with a card that is 7 years old; it still works, but it's very unreliable. i had an AMD Radeon 6770 from a prebuilt that was giving me issues in some games; i replaced it with a 2070 Super and those issues went away. the issue here is just that: an old, out-of-date card with 2GB of VRAM vs a brand new card with 8GB

You're probably right on all counts. The GT 710 was suggested to me by another user on the board as a cheap card that would work alongside the 3070 to increase my video outputs. That said, I'm totally down to try a more up-to-date card if anyone has suggestions. Something that could work better with this 3070 that's not another 30-series card you can only get on eBay from scalpers for $1400 >_<

3 minutes ago, Nathanpete said:

The thing about VRAM doesn't matter whatsoever for his problem. VRAM is only shared in any capacity when SLI is involved. Cards can request and transfer the contents of VRAM, but they can't freely access each other's VRAM without SLI. You can run two different cards simultaneously; they just won't be able to synchronize by themselves and need Windows to handle it. And Windows is doing a very bad job at synchronizing a 7-year-old GPU and a <6-month-old GPU. 

take a look at the cards mentioned. one is a 3070 from 2020 and the other is a 710 from 2014. i am pretty sure the issue is simply that one is way out of date.

 

try plugging both monitors into the 3070.

1 minute ago, tdkid said:

take a look at the cards mentioned. one is a 3070 from 2020 and the other is a 710 from 2014. i am pretty sure the issue is simply that one is way out of date.

Yes, I was just correcting your statement about the differences in VRAM. I have already mentioned the date and tech difference in another post. 

2 minutes ago, commanderZiltoid said:

You're probably right on all counts. The GT 710 was suggested to me by another user on the board as a cheap card that would work alongside the 3070 to increase my video outputs. That said, I'm totally down to try a more up-to-date card if anyone has suggestions. Something that could work better with this 3070 that's not another 30-series card you can only get on eBay from scalpers for $1400 >_<

Sorry, but I can't suggest anything. If you were using an Intel CPU with an iGPU, though, the code to synchronize the two isn't that bad; you'd just plug into the mobo. I think most of the work would still be done by the 3070, so your GPU usage might still be up, but I can almost guarantee there would be less stuttering and screen tearing. 

Have you tried setting your 710 as the default? I wonder if setting it as default would make it the primary GPU, and then whether you could set games to run off the 3070. Or if that would cause more issues.

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16GB 5200MHz, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2TB, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR 

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 

1 minute ago, Nathanpete said:

Sorry, but I can't suggest anything. If you were using an Intel CPU with an iGPU, though, the code to synchronize the two isn't that bad; you'd just plug into the mobo. I think most of the work would still be done by the 3070, so your GPU usage might still be up, but I can almost guarantee there would be less stuttering and screen tearing. 

I'm on a 5900x. 

3 minutes ago, Nathanpete said:

Yes, I was just correcting your statement about the differences in VRAM. I have already mentioned the date and tech difference in another post. 

the 3070 should have multiple HDMI or DP connections; there is no issue at all plugging more than one monitor into that GPU. just go into your settings and detect the second monitor.

Just now, commanderZiltoid said:

I'm on a 5900x. 

Damn

3 minutes ago, tdkid said:

the 3070 should have multiple HDMI or DP connections; there is no issue at all plugging more than one monitor into that GPU. just go into your settings and detect the second monitor.

Oh it does, and they're full up. The three DisplayPorts are connected to an UltraGear 27GL83A and two Acer SB230s, and the HDMI port is connected to a 4K TV. I NEED the extra ports for the two other monitors. My original plan was to use an MST hub, since these three ports are each DP 1.4. You'd think something like this would work to split the signal from one port so I could get two 1080p outputs at 60Hz, but I was informed it would not: https://www.amazon.com/StarTech-com-Port-DisplayPort-MST-Hub/dp/B0839LTNFK
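For what it's worth, the raw bandwidth side of the MST plan does check out; here's a rough sketch (the usable link rate is the DP 1.4 HBR3 figure, while the 20% blanking factor is an assumption, since the exact timings depend on the monitor's EDID):

```python
# Rough check that one DP 1.4 link has room for two extra 1080p60 streams
# through an MST hub. Usable link rate is DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s
# minus 8b/10b coding overhead. The 20% blanking factor is an assumption.

DP14_HBR3_USABLE_GBPS = 25.92

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    """Approximate link bandwidth one uncompressed stream consumes."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

two_1080p60 = 2 * stream_gbps(1920, 1080, 60)      # ~7.2 Gbit/s
print(f"two 1080p60 streams need ~{two_1080p60:.1f} Gbit/s "
      f"of {DP14_HBR3_USABLE_GBPS} Gbit/s available")
```

So an MST hub isn't bandwidth-limited for a pair of 1080p60 monitors. The separate catch is that a GPU also has a fixed number of display heads it can drive simultaneously, independent of connector count, and MST streams count against that limit.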

1 minute ago, commanderZiltoid said:

Oh it does, and they're full up. The three DisplayPorts are connected to an UltraGear 27GL83A and two Acer SB230s, and the HDMI port is connected to a 4K TV. I NEED the extra ports for the two other monitors. My original plan was to use an MST hub, since these three ports are each DP 1.4. You'd think something like this would work to split the signal from one port so I could get two 1080p outputs at 60Hz, but I was informed it would not: https://www.amazon.com/StarTech-com-Port-DisplayPort-MST-Hub/dp/B0839LTNFK

personally i dont have any idea, as i just have the 1 TV i am using as my monitor. the issue i do see with that is that you will have to use a DP on the GPU, and if the monitor doesnt have a DP on it, then i dont see it working. but again, i dont know.

Just purchased the MST hub linked above. Probably should have followed my gut and gone that route from the get-go. One of the reviews was from someone with a 3090 who confirmed it worked flawlessly for them. We'll see how it goes. 

well i hope it works out for you. who makes the 3070 you have? gigabyte, evga, etc. etc..

5 hours ago, tdkid said:

well i hope it works out for you. who makes the 3070 you have? gigabyte, evga, etc. etc..

Nvidia. It's a founders edition card.

Update: the RTX 3070 DOES NOT support more than 4 monitors, even with an MST hub. Windows recognizes all of the connected monitors but disables all but 4. In the Nvidia Control Panel the monitors also show up, but you can only select 4, and a message comes up stating that the GPU only supports 4 displays. Oh well... I had to give it a go. Now I just have a spare MST hub if I need one down the road. 
