Two Different GPUs In A Build - Is There A Problem?

I recently built my video editing workstation from the video you posted in November - ULTIMATE VIDEO EDITING WORKSTATION. It includes a GeForce GTX 780 AND an NVIDIA Quadro K4000. It seems to work, but I am not sure how the two video cards work together, or what settings I should use in the NVIDIA Control Panel (for CUDA acceleration, etc.). I contacted PNY because they built the Quadro card. They wrote back that they neither recommend nor support having the two GPUs installed in the same computer.

 

So, naturally, I need to know why you called for both of those GPUs (and whether you meant for BOTH or JUST ONE OR THE OTHER to be installed). In the video you demonstrated both of them installed.

They cannot work in SLI with each other. You have to switch between the cards depending on what you need: the Quadro for editing and the 780 for gaming.

If things are working, why do you wish to change anything?

It works. But it hangs sometimes during editing and other heavy workloads, and the performance is not impressive - even disappointing for an $8k build. I'm also wondering if I could have made better use of the $800 that each card cost. I suppose I could have picked up a Titan.

I have read that Adobe makes use of multiple GPUs (not SLI) for the CUDA cores, but I don't understand how it works.

Adobe does make use of multiple GPUs for certain tasks. The monitor on which you game should be connected to the GTX card, and the monitors used for editing should be connected to the Quadro card.
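If you want to confirm what the system actually exposes, the CUDA driver simply enumerates every capable card as a separate device, and applications then pick among them. Here is a minimal sketch that lists the devices, assuming the third-party pycuda package (pip install pycuda) is installed - it only shows what the driver reports, not which device Adobe actually uses for the Mercury Playback Engine.

```python
# List every CUDA device the driver exposes (assumes: pip install pycuda).
import pycuda.driver as drv

drv.init()  # initialize the CUDA driver API
for i in range(drv.Device.count()):
    dev = drv.Device(i)
    major, minor = dev.compute_capability()
    print(f"Device {i}: {dev.name()} "
          f"(compute {major}.{minor}, {dev.total_memory() // 2**20} MiB)")
```

On this build both the GTX 780 and the Quadro K4000 should show up as separate devices; SLI never enters into it for this kind of CUDA work.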

The Quadro K4000 is connected to the ASUS PA279 monitor because they both handle 10-bit color, and the other monitor is plugged into the GTX 780. My understanding was that the GTX provides the performance, while the Quadro is supposed to provide stability and compatibility. But I don't see much of either happening. The computer locks up sometimes under load, and I see no real performance boost over having just one of the cards installed.

The GTX provides gaming performance and may also assist in renders, depending on the software. The Quadro provides 10-bit color support and performance in some workstation tasks, again depending on the software.

 

Much is dependent on the software being used and the type of projects worked on.

 

When the system locks up under load, what activities are generating the load?

I use Adobe Premiere and After Effects and the whole CC 2014 suite. I'd still like to know how that works with two GPUs. In the NVIDIA Control Panel, if I leave the global default set, it looks like both should kick in for everything. I have asked in other forums and got different answers. Some think I should set CUDA to the GTX 780 only, but I just don't know.

 

It just doesn't seem like I'm getting the performance I expected. Video does not play back smoothly with effects applied, even if it's just one layer of video.

Have you changed any settings in the NVIDIA Control Panel?

 

Are you talking about previewing a video in an editor?

Yes, previewing in Adobe Premiere is just as choppy as on my 3-year-old Sandy Bridge system. If this is supposed to be an 'ultimate' video workstation, I'm not feeling it. I'm pretty certain I will be going to a Mac next time, or something with dual Xeons.

Could it be a Windows thing? Is the OS introducing a lot of garbage the system has to fight through to do its work? I mean, with no programs running, this machine uses 3 or 4 GB of RAM and shows about 100 processes.

The RAID 0 disk array is reasonably fast, but in benchmark tools like HD Tune the graph looks very spiky, not a flat, sustained transfer rate.

It is quite possibly a driver or settings issue. Have you tried removing the GTX card to see whether that changes how previewing behaves?
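If pulling the card physically is a hassle, one software-only experiment is the CUDA_VISIBLE_DEVICES environment variable, which the CUDA runtime reads at startup to decide which devices to expose. Whether Premiere honors it depends on how it initializes CUDA, so treat this as a test rather than a fix; the install path below is hypothetical, and Python is just used here as a convenient launcher.

```python
# Launch an application with only one CUDA device exposed.
# Assumes the GTX 780 enumerates as CUDA device 0 - verify that first.
# The Premiere path below is hypothetical; point it at your actual install.
import os
import subprocess

env = os.environ.copy()
env["CUDA_VISIBLE_DEVICES"] = "0"  # hide every CUDA device except index 0

premiere = r"C:\Program Files\Adobe\Adobe Premiere Pro CC 2014\Adobe Premiere Pro.exe"
subprocess.Popen([premiere], env=env)
```

Running the enumeration sketch from earlier in the thread with and without the variable set will at least confirm whether your machine respects it.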

 

Don't worry about the amount of memory reported as in use. The OS manages memory quite efficiently, using as much as it can when RAM is free and giving it back when programs need more. With a swap file on an SSD, even going beyond physical memory should not involve a significant performance hit.

 

Just because something is in the process list does not mean that it is consuming CPU or memory resources. Some of the entries are helpers that only become active when they are needed; others wake up fairly infrequently to do particular, usually brief, tasks. There is also a System Idle Process that, roughly speaking, runs whenever the CPU has nothing else to do.
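If you want to see this for yourself, here is a minimal sketch that lists only the processes actually burning CPU over a one-second window. It assumes the third-party psutil package (pip install psutil) and is just an illustration, not anything Premiere-specific.

```python
# Show which processes are actually using CPU right now (assumes: pip install psutil).
import time
import psutil

# Prime the per-process CPU counters, then sample again after one second.
for p in psutil.process_iter():
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(1.0)

busy = []
for p in psutil.process_iter():
    try:
        usage = p.cpu_percent(interval=None)
        if usage > 0.5:
            busy.append((usage, p.name()))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for usage, name in sorted(busy, reverse=True):
    print(f"{usage:5.1f}%  {name}")
```

On an idle machine the list is usually very short, even with ~100 entries in Task Manager.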

 

Without knowing the details of the graph you are referencing, it is difficult to comment, but I will say that at most scales read/write benchmark graphs are rarely even curves or flat lines.
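If you want a second opinion on the array, a plain sequential read of a large media file gives an average throughput number that is easier to reason about than a spiky graph. A minimal sketch follows; the path is hypothetical, and reading a file bigger than your RAM helps keep the Windows file cache from flattering the result.

```python
# Rough sequential-read throughput check for a large file on the RAID 0 array.
import time

PATH = r"D:\media\some_large_clip.mov"  # hypothetical - use any multi-GB file
CHUNK = 8 * 1024 * 1024  # 8 MiB reads

total = 0
start = time.perf_counter()
with open(PATH, "rb") as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.perf_counter() - start

print(f"Read {total / 2**20:.0f} MiB in {elapsed:.1f} s "
      f"-> {total / 2**20 / elapsed:.0f} MiB/s average")
```

If the average comes out healthy, the short dips in the HD Tune graph are unlikely to be what is making previews choppy.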
