
Is it possible to use a separate system's GPU to render frames by sending the raw data over an Ethernet cable?

RainOfPain125

Title. Back when I hosted 8 or so servers on consumer hardware, I kept thinking: why would I want to build a whole other machine and buy a whole other GPU when I don't need the processing power of an entirely separate GPU?

 

I wondered if it were possible to set up a system with no GPU at all, where the raw video output is sent over Ethernet to another computer that DOES have a GPU, and the processed/finished frames come out of one of that GPU's video outputs (or back down the Ethernet cable and out through the motherboard's VGA port, though I'd imagine that would be too much data).

 

Even if there was a bit of latency, and even if you could only run a low resolution like 720p, it would be way cheaper than outright buying a new GT 1030 or whatever.

 

If this isn't possible at all, what software or hardware limitations are preventing this?


28 minutes ago, RainOfPain125 said:

where the raw video output is sent over Ethernet to another computer that DOES have a GPU,

 

I don't think you understand how "rendering" in normal apps works: the CPU and GPU work closely together, and you can't just insert something in between.

Sure, there are software packages that allow offloading over a network, but that's not for displaying anything live; it's for rendering CGI that gets combined into a movie later.

 

You could run the whole software on the remote PC and then stream it back (somewhere between screen sharing and Stadia), but that isn't easy to set up, and you might find that just running the software on the iGPU is the less painful solution.


56 minutes ago, RainOfPain125 said:

Title. Back when I hosted 8 or so servers on consumer hardware, I kept thinking: why would I want to build a whole other machine and buy a whole other GPU when I don't need the processing power of an entirely separate GPU?

 

I wondered if it were possible to set up a system with no GPU at all, where the raw video output is sent over Ethernet to another computer that DOES have a GPU, and the processed/finished frames come out of one of that GPU's video outputs (or back down the Ethernet cable and out through the motherboard's VGA port, though I'd imagine that would be too much data).

 

Even if there was a bit of latency, and even if you could only run a low resolution like 720p, it would be way cheaper than outright buying a new GT 1030 or whatever.

 

If this isn't possible at all, what software or hardware limitations are preventing this?

You could mix virtualization with a streaming app like Parsec.

On the system with the powerful GPU you could create a VM and pass the GPU through to it; this gives you a completely separate Windows (or Linux) install that doesn't affect anything else on the machine. Then you stream that VM's screen with Parsec to whatever other device you want to use the software from. It's not a perfect solution, since you'd be working inside a completely different Windows install rather than just rendering through it, but it's definitely possible.
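For anyone curious what a tool like Parsec is roughly doing under the hood, here is a toy sketch (my own illustration, not Parsec's actual protocol): grab the remote machine's screen, compress each frame, and push it over a TCP socket. The HOST, PORT and QUALITY values are made-up placeholders, and real tools use hardware H.264/HEVC encoders rather than per-frame JPEGs.

```python
# Toy illustration only: real tools (Parsec, Moonlight, Steam Remote Play) use
# hardware H.264/HEVC encoding and get far lower latency. HOST, PORT and
# QUALITY below are made-up placeholders for the sketch.
import io
import socket
import struct

import mss             # third-party: pip install mss
from PIL import Image  # third-party: pip install Pillow

HOST, PORT = "192.168.1.50", 9999  # machine receiving the stream (placeholder)
QUALITY = 70                       # JPEG quality; lower = less bandwidth

def stream_screen() -> None:
    """Capture the primary display and push JPEG frames over one TCP socket."""
    with socket.create_connection((HOST, PORT)) as sock, mss.mss() as sct:
        monitor = sct.monitors[1]        # index 1 = primary monitor
        while True:
            shot = sct.grab(monitor)     # raw frame from whatever GPU drives it
            frame = Image.frombytes("RGB", shot.size, shot.rgb)
            buf = io.BytesIO()
            frame.save(buf, format="JPEG", quality=QUALITY)
            data = buf.getvalue()
            # length-prefix each frame so the receiver knows where it ends
            sock.sendall(struct.pack("!I", len(data)) + data)

if __name__ == "__main__":
    stream_screen()
```

The receiving side would just read the 4-byte length, read that many bytes, and display the frame; Parsec handles all of that (plus input forwarding and hardware encoding) for you.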



1 hour ago, RainOfPain125 said:

Title. Back when I hosted 8 or so servers on consumer hardware, I kept thinking: why would I want to build a whole other machine and buy a whole other GPU when I don't need the processing power of an entirely separate GPU?

 

I wondered if it were possible to set up a system with no GPU at all, where the raw video output is sent over Ethernet to another computer that DOES have a GPU, and the processed/finished frames come out of one of that GPU's video outputs (or back down the Ethernet cable and out through the motherboard's VGA port, though I'd imagine that would be too much data).

 

Even if there was a bit of latency, and even if you could only run a low resolution like 720p, it would be way cheaper than outright buying a new GT 1030 or whatever.

 

If this isn't possible at all, what software or hardware limitations are preventing this?

The bandwidth between a graphics card and system memory is about 32GB/sec over PCI-E 4.0 x16. The bandwidth of 1GbE is 0.125GB/sec. That is a factor of 256 difference. Even if you had 10GbE, it would still be a factor of 25.6. That gap is probably why people don't do this.
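As a quick sanity check of those ratios (a rough sketch using nominal per-direction link rates, not measured numbers):

```python
# Nominal per-direction bandwidth figures, for rough comparison only.
pcie4_x16_gbs = 32.0    # GB/s, PCIe 4.0 x16 (approx.)
gbe_gbs       = 0.125   # GB/s, 1 Gb Ethernet
ten_gbe_gbs   = 1.25    # GB/s, 10 Gb Ethernet

print(f"PCIe 4.0 x16 vs 1GbE : {pcie4_x16_gbs / gbe_gbs:.0f}x")      # ~256x
print(f"PCIe 4.0 x16 vs 10GbE: {pcie4_x16_gbs / ten_gbe_gbs:.1f}x")  # ~25.6x
```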

 

However, there are graphics cards that have slower connectivity. There were once PCI graphics cards that were limited to 133MB/sec, which is just slightly above what 1GbE does. For 2D rendering, this is fine. For 3D rendering, this is probably painful, but once things load, it should be okay (although anything 3D that dynamically loads from system memory is going to give you a bad time).

 

You could theoretically write software to make this work over a network (by intercepting library calls and redirecting them over the network, rendering off screen, and sending the frame buffer updates back), but there is nothing off the shelf for it. Keep in mind that sending back raw 24-bit frames at 1080p60 needs roughly 3Gbps, 1440p60 roughly 5.3Gbps, and 4K60 roughly 12Gbps, all well beyond what 1GbE can carry. You would have to do some sort of compression at the server and decompression at the host, which adds latency.
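The arithmetic behind those figures, assuming uncompressed 24-bit RGB frames at 60Hz (a back-of-the-envelope sketch that ignores protocol overhead):

```python
# Raw frame-buffer bandwidth for uncompressed 24-bit RGB at 60 Hz.
GIGABIT_ETHERNET_BPS = 1_000_000_000  # 1GbE line rate in bits per second

def raw_bitrate_bps(width: int, height: int, fps: int = 60, bpp: int = 24) -> int:
    """Bits per second needed to send every frame back uncompressed."""
    return width * height * bpp * fps

resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    bps = raw_bitrate_bps(w, h)
    verdict = "exceeds" if bps > GIGABIT_ETHERNET_BPS else "fits in"
    print(f"{name:>5} @ 60Hz: {bps / 1e9:5.2f} Gbps ({verdict} 1GbE)")
```

Even the 720p case the OP mentions comes out to about 1.3Gbps uncompressed, which is why compression (and its latency) is unavoidable on a gigabit link.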

 

There are alternatives to this approach that work with off-the-shelf equipment. One is to use HDBaseT hardware (possibly with a KVM switch) to get access to a machine located elsewhere. Another is to use game streaming software to play games that run on a machine located elsewhere.


7 hours ago, RainOfPain125 said:

Title. Back when I hosted 8 or so servers on consumer hardware, I kept thinking: why would I want to build a whole other machine and buy a whole other GPU when I don't need the processing power of an entirely separate GPU?

 

I wondered if it were possible to set up a system with no GPU at all, where the raw video output is sent over Ethernet to another computer that DOES have a GPU, and the processed/finished frames come out of one of that GPU's video outputs (or back down the Ethernet cable and out through the motherboard's VGA port, though I'd imagine that would be too much data).

 

Even if there was a bit of latency, and even if you could only run a low resolution like 720p, it would be way cheaper than outright buying a new GT 1030 or whatever.

 

If this isn't possible at all, what software or hardware limitations are preventing this?

 

... okay.

 

Firstly, why are you looking at gaming-grade GPUs for low-end hardware? Most CPUs come with built-in graphics processing (an iGPU). A basic PC build doesn't require a dedicated GPU.

 

Secondly, CPUs without an iGPU probably cost more than CPUs with one ...

 

Thirdly, there is literally zero hardware released in the last 5-10 years that "can only run a low resolution like 720p". Outside of gaming, resolution limits are next to non-existent.

 

What on earth are you actually trying to do?


2 hours ago, Amias said:

Most CPUs come with built-in graphics processing (an iGPU).

None of the CPUs I've used to host servers (R5 1600, R7 5800X) came with integrated graphics. Any APU or iGPU would have terrible performance and wouldn't be up to hosting a lot of servers lol


On 6/28/2021 at 12:44 AM, RainOfPain125 said:

None of the CPUs I've used to host servers (R5 1600, R7 5800X) came with integrated graphics. Any APU or iGPU would have terrible performance and wouldn't be up to hosting a lot of servers lol

Okay. I think I understand your use case better now.

 

There are loads of threads on Reddit discussing this.

 

Maybe that'll help.


On 6/27/2021 at 7:44 PM, RainOfPain125 said:

None of the CPUs I've used to host servers (R5 1600, R7 5800X) came with integrated graphics. Any APU or iGPU would have terrible performance and wouldn't be up to hosting a lot of servers lol

You're aware that most servers don't have GPUs, right? I'm not sure what you mean by "any APU or iGPU would have terrible performance and wouldn't be up to hosting a lot of servers" ... literally none of the servers we have at work have GPUs; they all have iGPUs and there are no performance issues.


