
G-Sync interoperability with SLI question

tribaljet

Hello there.

 

After following G-Sync closely, it looks to be an unquestionably welcome feature for any gamer in regard to overall picture quality (a broad term, but pertinent here).

 

Now, while the benefits for single-GPU systems running anything from 60Hz to 144Hz monitors have been discussed, I wonder if there is any news regarding multi-GPU systems, namely SLI (since G-Sync doesn't support CrossFire): has any work been done to address the whole microstuttering issue, and is there any source with info pertaining to this matter?


Pretty sure G-Sync is going to work no problem with SLI; whatever micro-stutter is present in a normal SLI setup would still be there, but nothing else should be introduced latency-wise.

Rig: i7 13700k - - Asus Z790-P Wifi - - RTX 4080 - - 4x16GB 6000MHz - - Samsung 990 Pro 2TB NVMe Boot + Main Programs - - Assorted SATA SSD's for Photo Work - - Corsair RM850x - - Sound BlasterX EA-5 - - Corsair XC8 JTC Edition - - Corsair GPU Full Cover GPU Block - - XT45 X-Flow 420 + UT60 280 rads - - EK XRES RGB PWM - - Fractal Define S2 - - Acer Predator X34 -- Logitech G502 - - Logitech G710+ - - Logitech Z5500 - - LTT Deskpad

 

Headphones/amp/dac: Schiit Lyr 3 - - Fostex TR-X00 - - Sennheiser HD 6xx

 

Homelab/ Media Server: Proxmox VE host - - 512 NVMe Samsung 980 RAID Z1 for VM's/Proxmox boot - - Xeon e5 2660 V4- - Supermicro X10SRF-i - - 128 GB ECC 2133 - - 10x4 TB WD Red RAID Z2 - - Corsair 750D - - Corsair RM650i - - Dell H310 6Gbps SAS HBA - - Intel RES2SC240 SAS Expander - - TrueNAS + many other VM's

 

iPhone 14 Pro - 2018 MacBook Air


I imagined it would work with any form of SLI; I was just wondering if it could somehow address the inherent microstuttering issue of multi-GPU systems the same way it addresses V-Sync stuttering.


Well, you're running the monitor off the main card so I don't see where any problems would happen?



Then bear with me on this, please. Microstuttering is, for the time being, always present in any multi-GPU implementation, not so dissimilar to what happens with V-Sync on its own. Now, if G-Sync could address multi-GPU stuttering the same way it does V-Sync's, by delaying frame presentation through adaptive refresh rates, it should in theory eliminate the stuttering, should it not?
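To make the V-Sync comparison concrete, here is a toy sketch (made-up numbers, not the actual G-Sync hardware behaviour) of why a fixed refresh quantizes frame presentation into uneven steps while a variable refresh does not:

```python
# Toy model of fixed vs. variable refresh presentation (illustration only).
REFRESH = 1000 / 60  # fixed 60 Hz refresh interval, in ms

def vsync_display_times(frame_done_times):
    """Under V-Sync, each frame waits for the next fixed refresh tick."""
    out = []
    for t in frame_done_times:
        ticks = int(t // REFRESH) + 1  # next refresh boundary after t
        out.append(ticks * REFRESH)
    return out

def gsync_display_times(frame_done_times):
    """Under variable refresh, the monitor refreshes when the frame is ready."""
    return list(frame_done_times)

def intervals(times):
    return [b - a for a, b in zip(times, times[1:])]

# Frames finishing every 20 ms (50 fps) -- a rate 60 Hz cannot divide evenly.
frames = [20 * i for i in range(1, 7)]
print(intervals(vsync_display_times(frames)))  # uneven: ~16.7 ms beats plus a ~33.3 ms hitch
print(intervals(gsync_display_times(frames)))  # steady 20 ms throughout
```

The uneven intervals in the first list are exactly the judder G-Sync removes when frame rate and refresh rate don't line up.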



Ah, gotcha. I get your question now. 

 

Hmm, I don't think it would address this issue, because I'm pretty sure the micro-stutter comes from the intercommunication between the two cards; basically, micro-stutter would still happen even if no monitor was plugged in at all. Granted, with no monitor you'd have no way to know, but I think you get my point. lol
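The point that micro-stutter is independent of the display can be sketched numerically: it's uneven *spacing* between frames, not low average frame rate. The frame times below are invented for illustration:

```python
# Illustrative frame times in ms: same average fps, very different pacing.
frame_times_single = [20, 20, 20, 20, 20, 20]  # one GPU, even pacing
frame_times_afr    = [8, 32, 9, 31, 8, 32]     # two GPUs, frames arriving in bunches

def avg(xs):
    return sum(xs) / len(xs)

def pacing_jitter(xs):
    """Mean absolute deviation of frame times from their average, in ms."""
    m = avg(xs)
    return avg([abs(x - m) for x in xs])

print(avg(frame_times_single), avg(frame_times_afr))  # both 20 ms -> identical average fps
print(pacing_jitter(frame_times_single))              # 0.0 -> smooth
print(pacing_jitter(frame_times_afr))                 # ~11.7 -> perceived micro-stutter
```

Both sequences report the same frame rate on a counter, yet the second one stutters — and it would whether or not anything is attached to the output.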



Nvidia has pretty good hardware and software measures to reduce micro-stutter, and I'm pretty sure you can think of the way they do it as almost an internal G-Sync type of thing. The two cards try to work together in step, rather than just throwing out frames as fast as possible like AMD was doing.

 

That being said, I don't think G-Sync will help with the issue, because I don't think it has anything to do with the monitor's refresh rate or with the sync between the GPU and the monitor.
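The "internal G-Sync" intuition roughly matches what is usually called frame metering: hold a frame that arrives too early so presentation intervals even out. This is a hedged sketch of the general concept, not Nvidia's actual driver logic:

```python
# Sketch of frame metering (concept illustration, not real driver code).
def meter(frame_done_times, target_interval):
    """Present each frame no earlier than target_interval after the previous one."""
    presented = []
    last = None
    for t in frame_done_times:
        show = t if last is None else max(t, last + target_interval)
        presented.append(show)
        last = show
    return presented

done = [8, 32, 40, 64, 72, 96]  # bunched AFR completion times, in ms
paced = meter(done, 16)
print([b - a for a, b in zip(paced, paced[1:])])  # gaps settle onto the 16 ms target
```

The trade-off is a little added latency on the early frames, which is why metering smooths pacing but never quite makes two cards behave like one.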




 

Well, the issue does lie between the cards, but all the data is still being pushed to a single monitor (let's assume this scenario for the purpose of the discussion), so it could prove a higher workload for the G-Sync module, yet in theory it should be capable. And agreed, of the two GPU manufacturers, Nvidia does have better frame presentation.

 

I'm definitely puzzled about this, because while I do get what you mean, I'm not entirely sure G-Sync couldn't also act as a buffer of sorts, in addition to what it's known to do, to reduce or even eliminate perceived stuttering.


It will work; you just have to plug the G-Sync monitor into the master card.

My rig: CPU: Intel Core i5 4670K MoBo: MSI Z87-G45 Gaming RAM: Kingston HyperX Beast 2x4GB 1600MHz CL9 GPU: EVGA GTX780 SC ACX SSD: ADATA Premier Pro SP900 256GB HDD: Western Digital RED 2TB PSU: FSP Aurum CM 750W Case: Cooler Master HAF XM OS: Windows 8 Pro

My Build log, the Snowbird (heavy WIP): http://linustechtips.com/main/topic/188011-snowbird-by-lachy/?hl=snowbird


Hmm, it might be a bit too soon to get definitive answers, given that there are only prototypes floating around for now. When are production models expected to become available? January or February, right?

 

In any case, it will be interesting to see news updates with further details. Also, some monitor manufacturers are slowly releasing monitors with their own proprietary features meant to mimic G-Sync, which will make for interesting comparisons.



Hmm, the "buffer" idea is an interesting one. I don't think it works like that, though; pretty sure all the module does is receive a refresh command from the GPU and then control the monitor's refresh rate. Even if it had a buffer of sorts, I don't think that would help micro-stutter.

 

Maybe... it would be possible to help micro-stutter if they TOTALLY redid SLI.

For instance, I could see micro-stutter not being an issue at all if they made it so every other frame was drawn by a different card, or every third frame in a tri-SLI setup. With that, they might be able to have each GPU push out a frame and send its own refresh command. It would come across as if there were actually only a single card... maybe? If each card draws its own frame and sends its own refresh command, I could see it fixing the issue. Buuuuuuut, good luck with that lol
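For what it's worth, "every other frame on a different card" is essentially alternate-frame rendering (AFR), the dispatch pattern SLI already uses; the micro-stutter comes from the cards not *completing* on that even cadence. A trivial sketch of the dispatch side:

```python
# Round-robin AFR dispatch: frame N goes to GPU (N mod number_of_gpus).
def afr_schedule(num_frames, num_gpus):
    """Return which GPU renders each frame under simple round-robin AFR."""
    return [f % num_gpus for f in range(num_frames)]

print(afr_schedule(6, 2))  # [0, 1, 0, 1, 0, 1] -- every other frame, dual SLI
print(afr_schedule(6, 3))  # [0, 1, 2, 0, 1, 2] -- every third frame, tri-SLI
```

The dispatch is already perfectly even; the open question in the thread is whether per-card refresh commands could keep the *output* equally even.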



The idea of redoing SLI would be nice for sure. It's already on its second iteration as it is (thinking of 3dfx still). With such an SLI redesign, having each card render a single full frame could make rendering less messy, in a manner of speaking. And since G-Sync would dismiss any need for V-Sync, triple buffering would no longer be needed either, at least for its original purpose. But we'd still have the issue of frames jumping around between cards. What about an active Y-splitter cable plugged into both cards (looking at a dual-card setup), receiving each card's frame simultaneously, with G-Sync ordering them through a command from the splitter? That would dismiss the whole master/slave card notion and potentially improve frame delivery.

 

...am I going too far with this? lol



LOL. Yeah, maybe a little too far. I did think about the Y-splitter idea in my previous post, but I don't think that would work either, because I think you'd still want to run the output through a single card, over a single cable, although I have no good reason why.

 

Who knows, I guess we'll wait and see how it goes, although I feel like the micro-stutter issues, at least initially, will remain exactly the same. Hopefully not, because that would help convince me to go SLI, which I still have yet to do :unsure:



It would be instadeath for G-Sync if it didn't support SLI, since most people in that price range are dual-GPU users.


OK, thinking of the currently expected prices of the G-Sync module, and the respective markup on G-Sync-enabled monitors six months after production models are released: imagine said splitter got released after half a year (or even a whole year), hypothetically enough time for the modules to drop in price, with Nvidia then selling bundles of module plus splitter at the original module release price. That would be a nice offset for end users.

 

Now, following your idea of running the output through a single card, I just thought of a reason why that would be more desirable than using two: it frees an additional output port that could drive a separate monitor. The question is, will the G-Sync module have any resource limitations that might stop it from providing G-Sync on monitors connected to every port, optionally including a monitor on a port on the second card? In theory it shouldn't, since the module sits in the monitor rather than on the card, and therefore receives far less data than it would if it sat on the card feeding multiple monitors. You know, the more I talk about this, the more I think it might be too out there, and yet I still have the nagging feeling that running the main output through a port on each card might provide "full" G-Sync performance in SLI, assuming SLI fully embraces it, or that there might be a hypothetical SLI redesign in the near future.

 

Curiously, I'm in exactly the same boat as you: I'm a strong proponent of single-GPU systems. And I'm still waiting for some new revision of Adaptive V-Sync; I'm not exactly convinced by it yet.

 

EDIT: It should be noted that this discussion isn't about whether G-Sync works with SLI, but rather whether the module can go beyond what's currently known and help to partially or entirely mitigate multi-GPU stuttering.

