
Windows server in a VM?

TH Gamer

Hi, I am building a FreeNAS box and I wondered if I can install Windows Server on the built-in FreeNAS VM software and create a domain that my PCs can connect to. Thanks!

 


Honestly, Windows 10 has been given enough features that it should cover your needs; no need to pay extra for server software.


My friend gave me a licence for it, and I thought of using it.


2 hours ago, TH Gamer said:

Hi, I am building a FreeNAS box and I wondered if I can install Windows Server on the built-in FreeNAS VM software and create a domain that my PCs can connect to. Thanks!

 

You do realize that Windows file/printer sharing allows whole drives to be shared.

How many empty SATA ports do you have in your PC?

How many drives in the NAS?

Is it cheaper to just add drives to the Windows PC as shared drives?


I looked on PCPartPicker and found an LGA1151 mobo with 5 PCIe x1 slots and 6 onboard SATA ports.

Get 5 PCIe-to-SATA cards for 10 more SATA ports.

16 hard drives.

16 × 12 = 192 TB of HDD if 12 TB drives are used.
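Back-of-the-envelope, that capacity math checks out, though pooling eats into it. A quick sketch (the RAIDZ2 overhead figure is just an illustration and assumes one wide vdev, which isn't how you'd normally lay it out):

```python
# Raw capacity of the proposed 16-drive build
drives = 16
tb_per_drive = 12
raw_tb = drives * tb_per_drive
print(raw_tb)  # 192

# If those drives instead went into a ZFS RAIDZ2 pool (FreeNAS-style),
# roughly two drives' worth of space goes to parity:
raidz2_usable_tb = (drives - 2) * tb_per_drive
print(raidz2_usable_tb)  # 168
```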


12 hours ago, TH Gamer said:

Hi, I am building a FreeNAS box and I wondered if I can install Windows Server on the built-in FreeNAS VM software and create a domain that my PCs can connect to. Thanks!

 

In the current version of FreeNAS (v11), you can create virtual machines from the web user interface. FreeNAS is based on FreeBSD, which features the bhyve hypervisor.


12 hours ago, bcguru9384 said:

I looked on PCPartPicker and found an LGA1151 mobo with 5 PCIe x1 slots and 6 onboard SATA ports.

Get 5 PCIe-to-SATA cards for 10 more SATA ports.

16 hard drives.

16 × 12 = 192 TB of HDD if 12 TB drives are used.

Who buys $6500 worth of drives, and can't spend $100 on a decent controller? :x

Spoiler

Desktop: Ryzen9 5950X | ASUS ROG Crosshair VIII Hero (Wifi) | EVGA RTX 3080Ti FTW3 | 32GB (2x16GB) Corsair Dominator Platinum RGB Pro 3600Mhz | EKWB EK-AIO 360D-RGB | EKWB EK-Vardar RGB Fans | 1TB Samsung 980 Pro, 4TB Samsung 980 Pro | Corsair 5000D Airflow | Corsair HX850 Platinum PSU | Asus ROG 42" OLED PG42UQ + LG 32" 32GK850G Monitor | Roccat Vulcan TKL Pro Keyboard | Logitech G Pro X Superlight  | MicroLab Solo 7C Speakers | Audio-Technica ATH-M50xBT2 LE Headphones | TC-Helicon GoXLR | Audio-Technica AT2035 | LTT Desk Mat | XBOX-X Controller | Windows 11 Pro

 

Spoiler

Server: Fractal Design Define R6 | Ryzen 3950x | ASRock X570 Taichi | EVGA GTX1070 FTW | 64GB (4x16GB) Corsair Vengeance LPX 3000Mhz | Corsair RM850v2 PSU | Fractal S36 Triple AIO | 12 x 8TB HGST Ultrastar He10 (WD Whitelabel) | 500GB Aorus Gen4 NVMe | 2 x 2TB Samsung 970 Evo Plus NVMe | LSI 9211-8i HBA

 


On 12/10/2017 at 10:43 AM, TH Gamer said:

Hi, I am building a FreeNAS box and I wondered if I can install Windows Server on the built-in FreeNAS VM software and create a domain that my PCs can connect to. Thanks!

 

Yes, FreeNAS 11 can create and host VMs for any number of operating systems, including Windows 10/Server, etc.

 

I personally use ESXi as my hypervisor (VM host) and run FreeNAS and Windows both as VMs; I just find ESXi much more user-friendly on the VM front.

 

But FreeNAS will totally work if you want to try it out.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Windows VMs on FreeNAS are more hassle than they're worth, imo. First of all, I greatly dislike that it forces UEFI down your throat for Windows VMs; UEFI isn't that common yet on actual server hardware, so getting it to boot properly is already more of a chore than it needs to be. FreeNAS is more Unix-centric and should be used that way.

You're better off running a true hypervisor like ESXi, Xen, Proxmox, or Hyper-V and running FreeNAS and Windows Server as separate VMs, from a manageability standpoint.

PC Specs - AMD Ryzen 7 5800X3D - MSI B550M Mortar - 32GB Corsair Vengeance RGB DDR4-3600 @ CL16 - ASRock RX7800XT - 660p 1TB & Crucial P5 1TB - Fractal Define Mini C - CM V750v2 - Windows 11 Pro

 


On 12/11/2017 at 1:17 AM, Jarsky said:

Who buys $6500 worth of drives, and can't spend $100 on a decent controller? :x

Are you saying the Windows file system sucks?

Or are you saying an onboard PCH controller sucks?

Please specify.


2 hours ago, bcguru9384 said:

Are you saying the Windows file system sucks?

Or are you saying an onboard PCH controller sucks?

Please specify.

I think he's implying that if you are spending $6500 on a bunch of 12 TB HDDs, you likely have needs that won't be satisfied by using the drives individually in Windows.

Furthermore, most motherboards wouldn't even have enough SATA ports for 16 drives.

NTFS certainly has room for improvement.

It all depends on usage and needs, though. If you actually needed 16 drives, what operating system you used would be important. For Windows, using ReFS and Storage Spaces would be an acceptable scenario. But you'd likely still need a controller card of some description (an HBA, in this case, rather than a RAID card).



The best way for shares would be to run Ubuntu Server with ZFS; it simplifies management a lot.


1 hour ago, dalekphalm said:

I think he's implying that if you are spending $6500 on a bunch of 12 TB HDDs, you likely have needs that won't be satisfied by using the drives individually in Windows.

Furthermore, most motherboards wouldn't even have enough SATA ports for 16 drives.

NTFS certainly has room for improvement.

It all depends on usage and needs, though. If you actually needed 16 drives, what operating system you used would be important. For Windows, using ReFS and Storage Spaces would be an acceptable scenario. But you'd likely still need a controller card of some description (an HBA, in this case, rather than a RAID card).

Dalek, read how I said I would get 5 PCIe x1-to-SATA adapter cards, giving 10 extra SATA ports.

Windows handles them absolutely fine.


10 hours ago, bcguru9384 said:

Dalek, read how I said I would get 5 PCIe x1-to-SATA adapter cards, giving 10 extra SATA ports.

Windows handles them absolutely fine.

Yeah... 5x PCIe SATA adapters. I mean, they work, but I would definitely not trust running a bunch of 12 TB HDDs off of them, especially in any sort of pool.

What he's saying is that if you're gonna spend $6500 on HDDs, at least buy a decent LSI (or similar quality) HBA; then you only need one PCIe slot instead of 5, since pretty much any HBA will give you at least 4 channels (1 channel = 1 drive), with most typically having 8.

Windows will handle an individual HDD fine. It will handle multiple individual HDDs fine. It will handle a pool pretty well if you use Storage Spaces and ReFS (though that has its own quirks and limitations).

Storage Spaces parity pools tend to be pretty slow compared to a comparable ZFS pool, let alone an actual hardware RAID card.

You can of course mitigate the performance hit using an SSD cache, but that's added complexity.

 

So Windows is fine. But that doesn't mean it's ideal.



10 hours ago, dalekphalm said:

Yeah... 5x PCIe SATA adapters. I mean, they work, but I would definitely not trust running a bunch of 12 TB HDDs off of them, especially in any sort of pool.

What he's saying is that if you're gonna spend $6500 on HDDs, at least buy a decent LSI (or similar quality) HBA; then you only need one PCIe slot instead of 5, since pretty much any HBA will give you at least 4 channels (1 channel = 1 drive), with most typically having 8.

Windows will handle an individual HDD fine. It will handle multiple individual HDDs fine. It will handle a pool pretty well if you use Storage Spaces and ReFS (though that has its own quirks and limitations).

Storage Spaces parity pools tend to be pretty slow compared to a comparable ZFS pool, let alone an actual hardware RAID card.

You can of course mitigate the performance hit using an SSD cache, but that's added complexity.

So Windows is fine. But that doesn't mean it's ideal.

But each PCIe card allows 1 IDE with 2 SATA.

The SATA ports get RAID 0, with IDE RAID 1 mirroring the SATA RAID.

Excellent recovery abilities.

Note I only said 12 TB because they're the largest I could think of, so forget the whole $6500 of drives; it was only an example.

You're stuck on the money side when the whole point was to demonstrate what can be done with "low-level outdated parts".

Sorry for getting you to think outside the normal box.

(You could have used the fact the mobo only has 1 NIC, so it would bottleneck as a NAS-type system unless 2 PCIe slots are given over to 2 dual-port NIC cards; even then you need a damn good router/switch.)


3 hours ago, bcguru9384 said:

But each PCIe card allows 1 IDE with 2 SATA.

The SATA ports get RAID 0, with IDE RAID 1 mirroring the SATA RAID.

Excellent recovery abilities.

Note I only said 12 TB because they're the largest I could think of, so forget the whole $6500 of drives; it was only an example.

You're stuck on the money side when the whole point was to demonstrate what can be done with "low-level outdated parts".

Sorry for getting you to think outside the normal box.

(You could have used the fact the mobo only has 1 NIC, so it would bottleneck as a NAS-type system unless 2 PCIe slots are given over to 2 dual-port NIC cards; even then you need a damn good router/switch.)

Look, there's a difference between "making do with low-end cheap parts" and "thinking outside the box". I don't think outside the box, because the box doesn't even exist in my world. I've used various crazy setups in the past, including a FlexRAID build that used a dozen mismatched drives, with multiple ones combined virtually into larger virtual drives so that those could be "raided" together with other matching larger drives, etc.

 

And how much does each of those cards cost? Let's say $20 each (optimistic); you're looking at what, 5 of them? For that price, you can buy a kickass used HBA or RAID card off eBay that will smoke them in performance.

 

Yes, lots can be done with low end or outdated tech. But that doesn't mean you should.

 

If you have to use cheapo PCIe SATA cards because that's all you can get or afford? Sure, whatever, that's cool. No one is saying you can't do that. But if you have the ability or money to get a better PCIe HBA or RAID card, in a NAS scenario you should.

 

Also, while in some situations a 1 Gbps NIC can bottleneck a NAS, it highly depends on other variables, including overall network design, clients, etc. But for a home scenario, in most cases, even for a pretty beefy server, a Gigabit NIC is totally fine and won't hinder client usage (even if it technically will bottleneck total drive speeds in a RAID array).
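To put rough numbers on the Gigabit point (assumed sequential throughput figures, not benchmarks):

```python
# Gigabit Ethernet: 1000 Mbit/s over 8 bits per byte = 125 MB/s before overhead
nic_mb_s = 1000 / 8

# Assumed sequential throughput per large modern HDD (~200 MB/s is typical)
hdd_mb_s = 200
array_drives = 8

# Idealized striped-array throughput: far beyond what the NIC can carry,
# yet a single client stream rarely needs more than the NIC provides
array_mb_s = hdd_mb_s * array_drives
print(nic_mb_s, array_mb_s)       # 125.0 1600
print(array_mb_s / nic_mb_s)      # the array outruns the NIC ~12.8x
```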

 

As long as my Plex clients don't start buffering, I'm good to go.

 

Anyway, I'm done arguing about this. I'm not saying you're wrong to use PCIe x1 SATA cards. I'm just saying they're not ideal for a home NAS/home server.

 

But we're techs and enthusiasts. Sometimes we have to deal with situations that aren't ideal. Sometimes that's part of the fun too.



On 12/14/2017 at 6:50 PM, dalekphalm said:

Look, there's a difference between "making do with low-end cheap parts" and "thinking outside the box".

 

This is exactly it. 

 

Those PCIe x1 boards are BTC mining boards, and they're made for attaching graphics cards because there's virtually no data transfer between the CPU/memory and the GPU: it transfers the payload, the card does its crunching, and it sends back a result. If you're creating pools in FreeNAS, though, then you're restricting your speed by only having a 6 Gbps interface for 2 drives, rather than an HBA (SAS2), which is 6 Gbps per channel for the same price if you pick one up on eBay or Craigslist. There's a reason that HBAs have x4 interfaces on them. Also, if you're looking at those cheap cards on eBay, not all chipsets are supported under FreeBSD/FreeNAS, whereas for most HBAs you can find a validation list to ensure compatibility.
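Rough math behind the x1 vs x4 argument (theoretical PCIe 2.0 link rates, ignoring protocol overhead; the per-card port count is an assumption):

```python
# PCIe 2.0 is roughly 500 MB/s per lane each way (5 GT/s with 8b/10b encoding)
pcie2_lane_mb_s = 500

# Cheap x1 SATA card: one lane upstream, shared by (assumed) two ports
x1_card_mb_s = 1 * pcie2_lane_mb_s
per_drive_on_x1 = x1_card_mb_s / 2
print(per_drive_on_x1)   # 250.0 MB/s per drive, worst case with both busy

# SAS2 HBA in an x4 slot: four lanes upstream feeding its 8 channels
hba_x4_mb_s = 4 * pcie2_lane_mb_s
print(hba_x4_mb_s)       # 2000 MB/s
```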

 

 


