
Hello there!

 

I've found myself in a bit of a weird spot, as there isn't much specific information about a setup like this.

 

I have a GTX 1080, which I'd normally use for everything. However, I'm going to switch to a CRT monitor pretty soon, and I'd rather not use digital-to-analog conversion, but output analog straight away. As you might know, though, starting with the 10-series GPUs, Nvidia dropped DVI-I support, meaning they can no longer output analog directly (shame!). The CRT monitor, of course, only accepts VGA.

I also have a GTX 580 lying around, which does have DVI-I. This means that if I were to use only the GTX 580, it could drive the CRT with no input-lag-adding adapter needed.

 

Two main questions:

  • (1) Can I somehow use my GTX 1080 to actually perform the rendering in-game, and then direct that output to my second GPU so it can transmit analog to my CRT without additional delay?
  • (2) If not, can I at least keep the GTX 580 running alongside my GTX 1080, plug the CRT into the GTX 580, and use it as the GPU for only certain games (like CS:GO), while still playing everything else with my GTX 1080 on my main monitor? The idea: the Nvidia Control Panel lets you "select a GPU" for certain applications, so I could set up a manual application profile for CS:GO and select my GTX 580 there. Is that somehow possible?

 

I'm on Windows 10, and for completeness' sake, here is my rig:

i7-6800K, MSI X99A SLI Plus, 24GB 3000MHz DDR4 RAM, MSI GTX 1080, Gainward GTX 580 (not plugged in currently)

 

I'd be interested in *any* insight you guys might have! Maybe this could spark an interesting discussion, or even make it into an LTT video? :)

 

Best regards,

 

Rankhole


Use the adapter on the iGPU.

 

You won't have any issues running it alongside the main card, but I'm not sure you'll be able to use it for passthrough. Also not sure why you'd want the 1080 to do the work and output on the monitor attached to the other card.

Main Rig: Lian Li O11 Mini, i7-9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090 FE, EVGA 1000 G5, Acer Nitro XZ3 2560x1440@240Hz

 

Spare Rig: Lian Li O11 Air Mini, i7-4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080 Ti, 1080 SC, 1070 SC & 1060 SSC, EVGA 850 GA, Acer KG251Q 1920x1080@240Hz

 


1 minute ago, Mick Naughty said:

Use the adapter on the iGPU.

 

You won't have any issues running it alongside the main card, but I'm not sure you'll be able to use it for passthrough. Also not sure why you'd want the 1080 to do the work and output on the monitor attached to the other card.

Unfortunately, the i7 6800k doesn't have an iGPU.

 

Also, the reason I'd rather use the power of the 1080 is obviously framerate. While I'm not planning to play at too high a resolution, the 580 is still going to deliver worse FPS than my GTX 1080.

Plus, this would let me play any game I want on my CRT, even modern titles, without having to worry that the GTX 580 might explode :)


2 minutes ago, Rankhole said:

Unfortunately, the i7 6800k doesn't have an iGPU.

 

Also, the reason I'd rather use the power of the 1080 is obviously framerate. While I'm not planning to play at too high a resolution, the 580 is still going to deliver worse FPS than my GTX 1080.

Plus, this would let me play any game I want on my CRT, even modern titles, without having to worry that the GTX 580 might explode :)

So the whole point of this is that you actually want to use the CRT for gaming?


 


Just now, Mick Naughty said:

So the whole point of this is that you actually want to use the CRT for gaming?

Yeah, well, that would be the ideal scenario.

If that's not possible, I want to know about (2): whether I can at least have both plugged into the system at all times and manually select which GPU to prefer in the Nvidia application profile settings. That's my "main" goal: at least being able to run CS with the GTX 580 while playing other games with my main GPU.


3 minutes ago, Rankhole said:

Unfortunately, the i7 6800k doesn't have an iGPU.

 

Also, the reason I'd rather use the power of the 1080 is obviously framerate. While I'm not planning to play at too high a resolution, the 580 is still going to deliver worse FPS than my GTX 1080.

Plus, this would let me play any game I want on my CRT, even modern titles, without having to worry that the GTX 580 might explode :)

I believe (at least it worked like this on my eGPU laptop) Windows will choose which GPU to use based on which display is primary. So if you leave the display attached to the 1080 as the main one, switch the primary to the one on the 580, start whatever you want, and then switch the main display back to the one on the 1080, the game should keep running off the 580, and anything you start after switching back would use the 1080.

why no dark mode?
Current:

Asus ROG Flow Z13 (GZ301ZE):
CPU: i9-12900H @ Up to 5.0GHz all core
- dGPU: RTX 3050 Ti 4GB

- eGPU: Radeon 6850m XT XGm 16GB
RAM: 16GB (8x2GB) @ 5200MT/s

Storage: 1TB NVMe SSD, 1TB MicroSD
Display: Internal 1200p@120Hz, Asus ROG XG-17 1080p@240Hz (G-Sync), Gigabyte M32U 4K@144Hz (G-Sync), External laptop panel (LTN173HT02) 1080p@120Hz, Asus VG248QE 1080p@144Hz

Watercooled Eluktronics THICC-17 (Clevo X170SM-G):
CPU: i9-10900k @ 4.9GHz all core
GPU: RTX 2080 Super (Max P 200W)
RAM: 32GB (4x8GB) @ 3200MT/s

Storage: 512GB HP EX NVMe SSD, 2TB Silicon Power NVMe SSD
Displays: Internal 1080p@300Hz

Custom Game Server:

CPU: Ryzen 9 9900X

RAM: 128GB (4x32GB) DDR5 @ whatever it'll boot at xD (I think it's 3600MT/s)

Storage: 2x 1TB WD Blue NVMe SSD in RAID 1, 4x 10TB HGST Enterprise HDD in RAID Z1


3 minutes ago, Rankhole said:

Yeah, well, that would be the ideal scenario.

If that's not possible, I want to know about (2): whether I can at least have both plugged into the system at all times and manually select which GPU to prefer in the Nvidia application profile settings. That's my "main" goal: at least being able to run CS with the GTX 580 while playing other games with my main GPU.

Well, I know that in modern games you can pick the display adapter in-game.

 

Or start in windowed mode, move it to that monitor, and fullscreen it again. That worked for me, as the application remembered where it was.
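Windows 10 (since 1803) also has its own per-application GPU preference, stored in the registry. This is only a sketch: the csgo.exe path below is a placeholder for your actual install, and Windows only lets you choose between its "power saving" and "high performance" classifications, which may or may not map onto the 580 and 1080 the way you want with two discrete cards.

```shell
:: Hypothetical example; adjust the path to your actual csgo.exe install.
:: GpuPreference=1 requests the "power saving" GPU, 2 the "high performance" one.
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" /v "C:\Games\csgo.exe" /t REG_SZ /d "GpuPreference=2;" /f
```

The same setting is exposed in Settings > System > Display > Graphics settings, so you can check there whether Windows even offers both cards as options.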

 

 


 


1 hour ago, Rankhole said:

Hello there!

 

I've found myself in a bit of a weird spot, as there isn't much specific information about a setup like this.

 

I have a GTX 1080, which I'd normally use for everything. However, I'm going to switch to a CRT monitor pretty soon, and I'd rather not use digital-to-analog conversion, but output analog straight away. As you might know, though, starting with the 10-series GPUs, Nvidia dropped DVI-I support, meaning they can no longer output analog directly (shame!). The CRT monitor, of course, only accepts VGA.

I also have a GTX 580 lying around, which does have DVI-I. This means that if I were to use only the GTX 580, it could drive the CRT with no input-lag-adding adapter needed.

 

Two main questions:

  • (1) Can I somehow use my GTX 1080 to actually perform the rendering in-game, and then direct that output to my second GPU so it can transmit analog to my CRT without additional delay?
  • (2) If not, can I at least keep the GTX 580 running alongside my GTX 1080, plug the CRT into the GTX 580, and use it as the GPU for only certain games (like CS:GO), while still playing everything else with my GTX 1080 on my main monitor? The idea: the Nvidia Control Panel lets you "select a GPU" for certain applications, so I could set up a manual application profile for CS:GO and select my GTX 580 there. Is that somehow possible?

 

I'm on Windows 10, and for completeness' sake, here is my rig:

i7-6800K, MSI X99A SLI Plus, 24GB 3000MHz DDR4 RAM, MSI GTX 1080, Gainward GTX 580 (not plugged in currently)

 

I'd be interested in *any* insight you guys might have! Maybe this could spark an interesting discussion, or even make it into an LTT video? :)

 

Best regards,

 

Rankhole

There is no "analog-only" output. Any GPU with VGA on board is using a RAMDAC: an integrated digital-to-analog converter, just like the one inside a DP-to-VGA dongle.

 

Running any secondary GPU to get a VGA output is going to add latency: the frames are rendered on the primary GPU and then copied over PCIe to the secondary GPU, which displays them on its output.
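For scale, here's a back-of-the-envelope estimate of that per-frame copy. The numbers are assumptions, not measurements: 1920x1440 at 32-bit color, and roughly 6 GB/s of effective bandwidth for a PCIe 2.0 x16 link like the GTX 580's.

```python
# Rough per-frame cross-adapter copy time (assumed numbers, not measured).
width, height, bytes_per_pixel = 1920, 1440, 4   # 32-bit color at 1920x1440
effective_bw = 6e9                               # assumed effective PCIe 2.0 x16 rate, bytes/s

frame_bytes = width * height * bytes_per_pixel   # ~11 MB per frame
copy_ms = frame_bytes / effective_bw * 1000      # milliseconds per copy

print(f"{frame_bytes / 1e6:.1f} MB per frame, ~{copy_ms:.2f} ms to copy")
# prints: 11.1 MB per frame, ~1.84 ms to copy
```

So the copy itself is on the order of a couple of milliseconds per frame; whether that matters depends on how the OS schedules it relative to scanout.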

 

The last OS that actually rendered graphics on the GPU connected to the secondary display was Windows XP.

Link to post
Share on other sites
