
VM newbie needs some hand-holding - GPU Passthrough

OK, I've gotten some of the steps done, but I'm hitting a wall trying to find information that's current.

Made a Windows 10 VM with Hyper-V on a Windows 10 host, and I'm looking to get my GPU (Vega 56) working in the VM so I can run some stuff there with GPU acceleration. The problem is I'm finding a lot of information referencing features that aren't current anymore, older versions of Hyper-V, and older versions of Windows/Windows 10.

https://forum.cfx.re/t/running-fivem-in-a-hyper-v-vm-with-full-gpu-performance-for-testing-gpu-partitioning/1281205

I found this tutorial and followed it as well as I could. I copied the AMD drivers to the right locations, I think, and now two RX Vega adapters show up in Device Manager with a Code 43; after a reboot there are still two with Code 43. Honestly, I've gotten a lot farther than I ever thought I would, so I'm kind of proud of myself for being able to follow directions that weren't exact to my situation. I tried to run the Radeon driver installer in hopes it would do something; it errors with a message that there's no Radeon hardware to install a driver for. Trying to update the driver manually also doesn't pan out: Windows says I'm already using the best driver.
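For anyone following along, the host-side gist of what that tutorial has you run looks roughly like this. The VM name and MMIO sizes here are placeholders/common tutorial values, not something I'm certain about:

```powershell
# Run in an elevated PowerShell on the host, with the VM shut down.
# "Win10-GPU" is a placeholder for your VM's actual name.
$vm = "Win10-GPU"

# Let the guest control cache types and enlarge the MMIO space
# (sizes are the values GPU-P tutorials commonly suggest, not gospel).
Set-VM -VMName $vm -GuestControlledCacheTypes $true
Set-VM -VMName $vm -LowMemoryMappedIoSpace 1GB -HighMemoryMappedIoSpace 32GB

# Attach a partition of the host GPU to the VM, then verify it took.
Add-VMGpuPartitionAdapter -VMName $vm
Get-VMGpuPartitionAdapter -VMName $vm

# Separately, the host's AMD driver folder (from
# C:\Windows\System32\DriverStore\FileRepository\) has to be copied into the
# guest at C:\Windows\System32\HostDriverStore\FileRepository\ - getting that
# step wrong is a classic cause of Code 43 on the partitioned adapter.
```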

I'm ready for some ELI5 now I think, thanks to anyone who wants to help.


I read that, at one time, Hyper-V supported GPU pass-through, as you found out yourself. All of my own research into this pointed at a feature known as RemoteFX. Whether or not you still have access to RemoteFX I don't know, but it's your best bet.
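If you want to check quickly whether your build still has it, something like this should tell you (just my quick test, not from any guide):

```powershell
# List any RemoteFX-related cmdlets still present in the Hyper-V module.
# RemoteFX vGPU was deprecated and disabled by the mid-2020 updates, so on
# a current Windows 10 build this will likely return nothing.
Get-Command -Module Hyper-V -Name *RemoteFx* -ErrorAction SilentlyContinue
```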

 

Outside of this, I would seriously consider replacing the host OS with something else: VMware vSphere, Proxmox, almost any Linux distro + QEMU/KVM, even Unraid.

 

These are built with the intention of passing hardware (such as PCIe devices) through to VMs.


7 minutes ago, Windows7ge said:


RemoteFX is dead, so that's out the Windows. /dadjoke

This is my 'daily gamer' system, so Windows is kind of a must on it; it's not a dedicated VM machine, so Windows is what I'm working with. Others have gotten this working with Windows 10 guests in Hyper-V on Windows 10 hosts, and I feel like I'm close... I'm just missing something and don't know enough about this to figure out what. I'll come back to it tomorrow or later in the week and see if I can completely break everything in the process of making it work.

Here's what clued me into the rabbit hole that got me this far. I've got like 30 tabs open, each with a different part of the puzzle, trying to string it all together.

[screenshot]


Alright. So the process looks like it's supposed to be simple enough for NVIDIA GPUs (I see that's not what you're working with).

 

One thing you have to consider is that not all GPUs play friendly with GPU pass-through; the RX 5000 series, with its device reset bug, is a prime example. I don't know about the Vega cards.

 

I found another article discussing this, but I don't think it adds any worthwhile information.

 

Article


I'll check that out tonight. I think I need to give up my computer for a few days to ride this Ether high and mine with the Vega 56. I'll revisit this over the weekend and let you know what I figure out.

Figured it was a cakewalk for NVIDIA; most things are. I'm hoping that Vega works out too. My next stab in the dark was going to be to delete the Vegas from Device Manager and reboot to see what happens; I might be missing some driver files. I'm seriously pretty stoked anything even showed up, but I'm confused about having two and wondering if that's the source of the Code 43. I'll stumble into a solution eventually, I'm sure, and I'll be sure to post back my convoluted path to success.
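Before I start ripping things out blindly, something like this from an elevated PowerShell should at least list the stuck adapters and their instance IDs (the /remove-device switch needs a fairly recent Windows 10 build, so this is only a sketch):

```powershell
# List display adapters with their PnP status (Code 43 shows up as "Error").
Get-PnpDevice -Class Display |
    Format-Table Status, FriendlyName, InstanceId -AutoSize

# Remove one of the stuck devices by its instance ID, then rescan.
pnputil /remove-device "<instance id from the list above>"
pnputil /scan-devices
```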

 

I wonder if OpenCL works...


3 hours ago, Bitter said:


You know that with a GPU passed to a VM you can game in the VM, right?

 

I would still seriously consider moving to a hypervisor OS or QEMU/KVM. I even have a tutorial on the process and how you can game in the VM from the host using Looking Glass.

 

 

It needs to be updated for Ubuntu 20.04.2 LTS, as a few things have changed, but 95% of the process is still the same.

 

Just something to think about if you hit a brick wall... :3


1 hour ago, Windows7ge said:


Yeah, I'll consider that. I have other machines running Ubuntu and I'm a little familiar with it. Been breaking it since 6.06, lol!


3 minutes ago, Bitter said:


I lost count of how many times I broke FreeNAS (FreeBSD) over the years.

 

There are some steps where, if you screw them up, you might brick the install. Something to be wary of. 😛

 


1 hour ago, Windows7ge said:


 

The cure for everything is a fresh install. I've yet to brick hardware, but there are always new experiences!


5 minutes ago, Bitter said:


Yeah, it's just the time sink that's the inconvenience.

 

For shits'n'giggles, I don't think you mentioned what other hardware you're working with. You did enable VT-d (Intel) or the IOMMU (AMD), yes?
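If you don't want to wait for a reboot, you can get a rough read from PowerShell; this is just my quick check, and msinfo32's System Summary shows similar info:

```powershell
# Is virtualization enabled in firmware, and is a hypervisor running?
# Caveat: once Hyper-V is active, the firmware flag can read False because
# the hypervisor masks it - HyperVisorPresent being True is the real tell.
Get-CimInstance Win32_Processor |
    Select-Object Name, VirtualizationFirmwareEnabled
Get-ComputerInfo -Property HyperVisorPresent
```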


11 minutes ago, Windows7ge said:


2700X and Vega 56 are the important parts. I also have 32GB of RAM because it was cheap back then; complete overkill for what I do with my PC, but I don't build often, so I overbuild when I do. I'm fairly sure that's enabled, but I'll have to double-check on the next reboot.


1 minute ago, Bitter said:


A lot of people who want to do GPU pass-through do it because they want to run Windows in a VM. Since you already game on the host, do you have any particular workloads that you need in a VM that you don't want on the host?


38 minutes ago, Windows7ge said:


Not at the moment, but it would be nice to have it working for my own understanding, and in case I did need it in the future. Always looking to throw myself off a cliff to learn new things you wouldn't think I'd know.


2 minutes ago, Bitter said:


Relatable. I'm working on a project that's teaching me how to braze aluminum, and I plan to learn how to weld steel.

 

I'm also working on setting up an iPXE server so I can have network clients boot to virtual iSCSI drives on the network instead of relying on local storage.

 

GPU pass-through, along with passing other hardware devices like USB controllers and HBA cards, is something I've investigated thoroughly and have had a lot of fun with. I did investigate GPU pass-through with Hyper-V, but that led me down the RemoteFX rabbit hole. I didn't put much more time into it before just opting to use my Proxmox server.

 

That's another option you have. You could build a dedicated hypervisor server for hardware pass-through.


39 minutes ago, Windows7ge said:


I'm a bit shy on hardware at the moment; I think the fastest CPU not in use is an i5-4590. My 2700X with its 32GB of RAM is the oddball in the house; everything else is Intel 4th gen with 4 cores or fewer.

 

Side note: I know how to braze steel and weld steel, aluminum, and stainless steel. It's been a long time, but I could (and probably still can) TIG, MIG, stick, and gas weld with some competency. I haven't brazed aluminum yet, but it looks pretty straightforward. I also used to have a 'solar forge' of sorts when I was a kid, in the form of a big Fresnel lens, and used to play with alloying different metals together. Aluminum and bronze was a neat one!


Coincidentally, I've been down the rabbit hole on this and just had some success last night. In my case I've been trying to get passthrough working with Windows Sandbox (which is basically a Hyper-V virtual machine).

 

I'm on Intel/NVIDIA, but maybe this can hint toward a fix for you. What worked for me was 1) enabling vGPU with a config file, and 2) enabling Intel virtualization in my BIOS (a.k.a. Intel VT-d). Maybe there's a similar setting on your AMD board?
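For reference, the config file is just a tiny .wsb XML; here's roughly what I used, written out from PowerShell (the file name and location are arbitrary):

```powershell
# Write a minimal Windows Sandbox config that enables the virtual GPU,
# then launch Sandbox by opening the .wsb file.
@"
<Configuration>
  <VGpu>Enable</VGpu>
</Configuration>
"@ | Set-Content "$env:USERPROFILE\Desktop\gpu-sandbox.wsb"

Start-Process "$env:USERPROFILE\Desktop\gpu-sandbox.wsb"
```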

 

Now I can see my GPU in Sandbox and I'm getting about 2/3 of the benchmarking performance compared to native. On one hand it feels like a success, but I'm researching to see if I can get the performance up with Sandbox or another solution. Ultimately I'm trying to set up an isolated environment for mining; looks like you might be trying to do the same. If you make any breakthroughs feel free to let me know :)

 

I made a Reddit post with more information here.


IIRC, wasn't Vega kind of troublesome with VM passthrough? I have been playing with this for the last 6 or 7 years myself: VMware, Hyper-V, and more recently Proxmox. I'd honestly suggest using a Linux-based hypervisor for what you're looking to do. Proxmox is fairly easy to use (even at the CLI level) since it's Debian. I'm fixing to take my newish dual 2690v3 and try this with 1070s and see what the results are. If GPUs weren't so hard to get right now, I'd pick one up just for your sake and see what issues you might be running into. Alas, that isn't an option here.


