
The sickest build EVER. You thought Linus was crazy? Think again.

32 minutes ago, mariushm said:

You wouldn't be able to have 16 PCIe slots, at least not in a simple and cheap way.

 

I don't know off the top of my head the maximum trace length for the frequencies PCIe runs at, but my guess is something like 20-30 cm; 16 PCIe slots one below the other would exceed that length.

I suppose you could make some kind of custom motherboard where you'd use an Intel CPU with more than 32 PCIe 3.0 lanes, arrange them as four x8 PCIe 3.0 links, and then use four of those PCIe switch chips that take in an x8 or x16 PCIe link and create four separate x4 or x8 slots... each of those chips would probably cost around $60-100.

You'd also have to build a custom case with two or more of those PCIe connectors in a single line, or have the cards arranged back to back, basically to reduce the vertical height of the stack of cards.

 

Linus has a server with 10 PCIe slots, and that system used those kinds of switch chips, which take a number of PCIe lanes, create several slots, and push the data packets from all those slots into the smaller number of PCIe lanes going into the switch IC.
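
To put rough numbers on that lane budget, here's a quick sketch (the 32 CPU lanes and the 4-way fan-out per switch are just the example figures from above, not a spec):

# Rough lane-budget sketch for the switch-based layout described above (illustrative only).
CPU_LANES = 32           # PCIe 3.0 lanes taken from the CPU
UPLINK_WIDTH = 8         # each switch chip gets an x8 uplink
SLOTS_PER_SWITCH = 4     # each switch fans out to four slots
SLOT_WIDTH = 4           # x4 electrical per downstream slot

switches = CPU_LANES // UPLINK_WIDTH             # 4 switch chips
total_slots = switches * SLOTS_PER_SWITCH        # 16 slots
downstream_lanes = total_slots * SLOT_WIDTH      # 64 lanes on the card side
oversubscription = downstream_lanes / CPU_LANES  # 2.0: cards share the uplink bandwidth

print(switches, total_slots, downstream_lanes, oversubscription)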

 

You have cards like the GTX 1070, which use only around 150 watts. You could tweak the voltage and power limits to make sure they don't go over 140-150 watts. Sixteen such cards would use up to 2400 watts, which is just about 10 amps at 230 V. In Europe there are 16 A circuits at 230 V, so it shouldn't be a problem to power those cards from one or two of those industrial Delta power supplies, for example, which only output 12 V... or any server power supplies that output only 12 V.

The rest of the system (CPU, motherboard, PCIe switches, etc.) could use less than 400-500 watts and could easily run on a second power circuit (or you could have an electrician build a more powerful 230 V circuit for your room, with thicker cables and bigger fuses, etc.).
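
Just to show the arithmetic behind those numbers in one place (same assumptions as above):

# Back-of-the-envelope power budget for the 16-card idea (figures from the post above).
CARDS = 16
WATTS_PER_CARD = 150        # GTX 1070-class card with the power limit dialed down
REST_OF_SYSTEM_W = 500      # CPU, motherboard, PCIe switches, fans (upper estimate)
MAINS_VOLTAGE = 230         # European mains

gpu_watts = CARDS * WATTS_PER_CARD          # 2400 W for the cards
total_watts = gpu_watts + REST_OF_SYSTEM_W  # roughly 2900 W for the whole box
gpu_amps = gpu_watts / MAINS_VOLTAGE        # about 10.4 A, inside a 16 A circuit

print(gpu_watts, total_watts, round(gpu_amps, 1))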

 

It's doable, but a waste of money. Few applications would actually be able to split their workloads across 16 cards at a time, have them all do work concurrently, collect the job results as they're completed, merge everything, and finish the jobs... you wouldn't use all those 16 cards efficiently.

 

What would make sense, for example, would be a system with 16 GTX 1050s or RX 460s, each using about 50-70 watts, and each would be satisfied with an x4 slot... then have 16 real-time H.264/HEVC encoders receive raw HD or 4K streams from capture cards, load them into the video cards, and do hardware H.264/HEVC encoding on each card... at least the RX cards support HEVC hardware encoding of 4K content; I'm not sure about GTX 1050 cards.
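
As a rough sketch of how that encode farm might be driven (hypothetical file names; I'm assuming ffmpeg's hevc_nvenc encoder and its per-card gpu option here, and AMD cards would use hevc_amf or VAAPI instead):

import subprocess

# Hypothetical sketch: fan one HEVC encode job out to each card with ffmpeg.
# In the real setup the inputs would come from capture cards, not .y4m files.
sources = [f"camera{i}.y4m" for i in range(16)]

jobs = []
for gpu_index, src in enumerate(sources):
    cmd = [
        "ffmpeg", "-y",
        "-i", src,               # raw HD/4K stream
        "-c:v", "hevc_nvenc",    # hardware HEVC encode (NVIDIA; hevc_amf for AMD)
        "-gpu", str(gpu_index),  # pin this encode to one specific card
        f"stream{gpu_index}.mp4",
    ]
    jobs.append(subprocess.Popen(cmd))

for job in jobs:
    job.wait()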

 

Thank you very much. How about 16 GTX 960s? Performance practically identical to the 680, and it supports HEVC.


I've really never seen 16 cards on one board. The most I think I've seen is eight cards using split PCIe cables. Of course, those setups weren't for gaming, but for F@H, BOINC, rendering, etc., since you can't run x16 in such a setup.

 

I remember one guy who used to be on a folding team with me was building a custom server tower case to house eight 980s for folding, hence his nickname Mr. 980. You'll definitely need two PSUs, which can easily be done; I've seen quite a few large builds that use two PSUs.

 

On the issue of Xeons: yes, you can find the CPUs cheap, but darn, finding a good motherboard is a whole other story.

 

On crazy builds, I still consider the Big Budget Boomer Box the top of the downright amazing builds, plus the amount of custom work the guy did on that build.

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army: 3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


Okay, assuming this isn't some troll, here are some details on what you could do.

 

As others have said, get something like 980s or 1080s; it might not be quite the spec you're asking for, but four of those will at least be feasible.

 

Quad-socket motherboards are a thing; you need to make sure the CPUs you're using have the QPI links and support for 4-socket operation. This motherboard also supports 10G network speeds, so it should be able to keep up with tech 10 years from now.

https://www.supermicro.com/products/motherboard/Xeon/C600/X9QR7-TF-JBOD.cfm

 

With that motherboard, you would be able to fit four E5-4600-series Xeons (it's a C602-chipset board, so not the v3/v4 generation) as well as 4-5 video cards without significant work. One of the big catches with buying ES CPUs is that there are generally NDAs attached when they are given out to testers, and buying one is a breach of that NDA that can get you and the seller in hot water.

 

As others have said, you won't get a hackintosh working on something like this; it's too far outside standard support. Your best bet would probably be a Windows Server OS or maybe Linux. It might be annoying, but there should be versions of whatever software you were planning to use.


Boeing, I did this build yesterday. Got 30 fps on Morrowind at 720p. Mission accomplished.

Ultra is stupid. ALWAYS.


Just build yourself a mainframe or a supercomputer. There...done.

I currently play PUBG, help me get chicken dinner xD


5 hours ago, qwertiuyop said:

I would like to know what exactly the "rules" of scaling multiple GPUs are. Why can some apps use 2 or 4 cards, but not 16? Why can the app simply not divide the entire load between all the cards, no matter how many there are? I mean, if one GTX 680 is one quarter as powerful as the Titan Xp, why would four 680s not be as powerful as one Titan Xp?

 

Because they don't scale at 100%. That's just the way it works; there's no deeper "why".
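
As a rough illustration of why the gains fall off (a toy Amdahl's-law model with a made-up 10% serial fraction, not a benchmark):

# Toy model: speedup from N GPUs when part of the job (setup, transfers,
# merging results) can't be parallelized at all.
def speedup(n_gpus, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_gpus)

for n in (1, 2, 4, 8, 16):
    print(n, round(speedup(n), 2))
# With 10% serial work: 2 cards -> ~1.8x, 4 -> ~3.1x, 8 -> ~4.7x, 16 -> only ~6.4x.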

5 hours ago, mariushm said:

You wouldn't be able to have 16 PCIe slots, at least not in a simple and cheap way. [...]

 

 

He could do the same thing for less, and with less effort and trouble by being a normal human being.


10 hours ago, Xreldo said:

Aren't Maxwell Titans older than Pascal ones?

Yes, yes they are....

 


9 hours ago, qwertiuyop said:

Why would Itanium be better than a 12-core ES Xeon v4 for 300 dollars? Also, what do so many folks here have against ES CPUs? The reviews from people who have actually bought and used them were generally positive.

 

Itanic.jpg

(hey, at least it's watercooled)

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


I'd love to win the lottery....

 

That's never going to happen either though.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


If we want to talk about things we waste our money on, I have at least 48 Xeon cores I pay for that are barely used.

-KuJoe


1 minute ago, Nicholatian said:

You’re 99 thousand 999 percent sure? How is that even?

It's more possible than what OP suggested, though...



13 hours ago, qwertiuyop said:

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. You think a dual-Xeon, 36-core workstation with two Maxwell Titans is way over the top? If you do, do not continue reading.

 

Okay, here is the thing. I want to build a beast: a workstation with four 16- or 12-core v4 Intel Xeons (they come incredibly cheap on eBay, a few hundred bucks per CPU; they are engineering samples of course, but that shouldn't be a problem), four to five Pascal Titan Xs' worth of GPU power, four PCIe SSDs (such as the Samsung 950) in RAID 0, and 256 GB of DDR4 ECC RAM.

 

Now comes the interesting part: I am not intending to buy four Pascal Titan Xs. I want to get that amount of graphics performance from more, less powerful cards. For instance, an NVIDIA GTX 680 is around four times less powerful than a Pascal Titan X, so I could use 16 GTX 680s to accomplish my goal. Of course, I realize one cannot use SLI for such a monstrous quantity of GPUs, and I do not want to play games on this computer. But would it be possible to link 16 cards together and use them for photo or video editing? What I mean is that all the load would get evenly divided between all the GPUs and they would all work simultaneously. I will also welcome any recommendations on what cards, and how many of them, I should use to accomplish my goal of at least four Pascal Titan Xs' worth of performance. Also, if you know where I can cheaply buy them in such large quantities, I am all ears (I don't want to hunt for 16 or even more individual GPUs on eBay; I would like to buy them all at once).

 

Now comes the even more interesting part: I would like to hackintosh my build. Would it be possible with all those GPUs and Xeons? Does OS X (yeah, yeah, I know... now macOS :-)) even support quad-processor configurations? And how about 16 GPUs? :-)

 

Which motherboard and chassis should I use? How about cooling? I would like to overclock the CPUs and GPUs, maybe even the RAM, so good cooling is essential.

 

But your workstation is not an aquarium, lol.

 

 


15 hours ago, Bananasplit_00 said:

Because they are not complete, not made for retail, and don't always support all the features of proper CPUs. They can sometimes require very specific boards to even work.

His idea struck an iceberg before it even set off, thus Itanium is the right CPU for him. Itanium, aka the Itanic.

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


18 hours ago, qwertiuyop said:

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. [...]

 

A dumb-as-shit idea...

 

Just go buy two or three 1080s and a high-end CPU. Way easier, costs less money, and it will work better without any weird problems or not working at all.

 

With 16 GPUs and 4 CPUs, the power supplies needed and the power draw will be WAY high. Just the cost of a 4-CPU, 16-GPU board with power supplies will be expensive, let alone the rest of the build. Even if the 680s are 70 bucks each, that's $1,120, plus the many power supplies to run them. 1080s are way more efficient and very fast cards.

 

More people will be impressed by a 3x GTX 1080 build than by 16 680s.


25 minutes ago, michaelocarroll007 said:

A dumb-as-shit idea... Just go buy two or three 1080s and a high-end CPU. [...]

 

I personally would be more impressed by the OP's proposal, for the sheer fact that it's non-existent (a quad-socket motherboard with 16 PCIe x16 slots). Nobody produces a motherboard like this because it's just a bad idea with no demand. No offense to the OP, though; it's good to think outside the box.

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


16 hours ago, Berd said:

Would it be able to run Crysis though?

Lowest settings at 20 fps, with dips down to 10 in the intense areas.


19 hours ago, qwertiuyop said:

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. [...]

 

http://www.newegg.com/Product/Product.aspx?Item=N82E16817101033

 

That's the power supply: 2000 watts of supreme power.

The geek himself.


1 hour ago, michaelocarroll007 said:

A dumb-as-shit idea... Just go buy two or three 1080s and a high-end CPU. [...]

Okay, let's see... Very, very optimistically speaking, I could get two 1080s for the price of 16 680s. Does the scaling of multiple cards really work so badly that two 1080s would be a better choice?


1 hour ago, michaelocarroll007 said:

A dumb-as-shit idea... Just go buy two or three 1080s and a high-end CPU. [...]

For gaming? FUCK YES, 16 680s is useless. There are many games that lose performance going past 2-3 cards, and many that don't use more than 1-2, if it's even possible to use them all. For compute use they scale well as long as the slots have enough bandwidth, but I don't know about video editing personally.

 

A quick Google shows the 680 at 3 TFLOPS and the 1080 at 9 TFLOPS, so three 1080s equal nine 680s. But with the power draw, power supplies, and motherboard needed, there's no way the 680s would be better performance per dollar overall.
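
Rough numbers to back that up (TFLOPS figures from the post; the TDPs are the stock board-power ratings; treat it all as ballpark):

# Ballpark perf-per-watt comparison between the two cards.
gtx_680  = {"tflops": 3.0, "tdp_w": 195}
gtx_1080 = {"tflops": 9.0, "tdp_w": 180}

def gflops_per_watt(card):
    return card["tflops"] * 1000 / card["tdp_w"]

print(round(gflops_per_watt(gtx_680), 1))    # ~15.4 GFLOPS per watt
print(round(gflops_per_watt(gtx_1080), 1))   # ~50.0 GFLOPS per watt

# Matching ~27 TFLOPS: 3x 1080 draws ~540 W, while 9x 680 draws ~1755 W,
# before counting the extra slots, switches, and power supplies the 680s need.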

