The sickest build EVER. You thought Linus was crazy? Think again.

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. You think a dual-Xeon 36-core workstation with two Maxwell Titans is way over the top? If you do, do not continue reading.

 

Okay, here is the thing. I want to build a beast: a workstation with four 16- or 12-core v4 Intel Xeons (they come incredibly cheap on eBay, a few hundred bucks per CPU; they are engineering samples of course, but that shouldn't be a problem), four to five Pascal Titan Xs' worth of GPU power, four PCIe SSDs (such as the Samsung 950) in RAID 0, and 256 GB of DDR4 ECC RAM.

 

Now comes the interesting part: I am not intending to buy 4 Pascal Titan Xs. I want to get that amount of graphics performance from a larger number of less powerful cards. For instance, an NVIDIA GTX 680 is around 4 times less powerful than a Pascal Titan X, so I could use 16 GTX 680s to accomplish my goal. Of course, I realize one cannot use SLI for such a monstrous quantity of GPUs, and I do not want to play games on this computer. But would it be possible to link 16 cards together and use them for photo or video editing? What I mean is that all the load would get evenly divided between all the GPUs and they would all work simultaneously. I will also welcome any recommendations on which cards, and how many of them, I should use to accomplish my goal of at least 4 Pascal Titan Xs' worth of performance. Also, if you know where I can cheaply buy them in such large quantities, I am all ears (I don't want to hunt for 16 or even more individual GPUs on eBay; I would like to buy them all at once).
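
To illustrate what I mean by evenly dividing the load, here is a minimal Python sketch of the scheduling idea only. The round_robin_split helper is made up for illustration; real photo or video editing software would have to do this inside its own GPU engine (CUDA/OpenCL), so take it as a toy model, not a claim about how any particular application works.

```python
# Minimal sketch of splitting a batch of work items (e.g. frame indices)
# evenly across a pool of GPUs. Purely illustrative: this helper is
# hypothetical and performs no GPU work itself.
from typing import Dict, List


def round_robin_split(frames: List[int], gpu_count: int) -> Dict[int, List[int]]:
    """Assign frame indices to GPU ids 0..gpu_count-1 in round-robin order."""
    assignments: Dict[int, List[int]] = {gpu: [] for gpu in range(gpu_count)}
    for i, frame in enumerate(frames):
        assignments[i % gpu_count].append(frame)
    return assignments


if __name__ == "__main__":
    work = round_robin_split(list(range(64)), gpu_count=16)
    print(work[0])  # frames handled by GPU 0: [0, 16, 32, 48]
```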

 

Now comes the even more interesting part: I would like to hackintosh my build. Would it be possible with all those GPUs and Xeons? Does OS X (yeah, yeah, I know... now macOS :-)) even support quad-processor configurations? And how about 16 GPUs? :-)

 

Which motherboard and chassis should I use? How about cooling? I would like to overclock the CPUs and GPUs, maybe even the RAM, so good cooling is essential.

 


Are you sure there is even a power supply that can power all of that? 16 high-wattage GPUs? You're going to need something industrial-grade for that.

i5-6600K @ 4.5 GHz |I| Hyper 212 EVO |I| ASUS STRIX Z270E |I| 8 GB DDR4 HyperX FURY
GTX 1060 Windforce OC 6GB @ 2088 MHz
DEEPCOOL TESSERACT WH |I| EVGA 500 W1 80+
1TB 7200rpm HDD |I| 120GB SSD
GN246HL 144Hz 1080p

Corsair K70 LUX RGB |I| Corsair M65 Pro RGB
PCPP: https://ca.pcpartpicker.com/b/DHzYcf


Aren't Maxwell Titans older than Pascal ones?


2 minutes ago, qwertiuyop said:

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. You think a dual-Xeon 36-core workstation with two Maxwell Titans is way over the top? If you do, do not continue reading.

Okay, here is the thing. I want to build a beast: a workstation with four 16- or 12-core v4 Intel Xeons (they come incredibly cheap on eBay, a few hundred bucks per CPU; they are engineering samples of course, but that shouldn't be a problem), four to five Pascal Titan Xs' worth of GPU power, four PCIe SSDs (such as the Samsung 950) in RAID 0, and 256 GB of DDR4 ECC RAM.

Now comes the interesting part: I am not intending to buy 4 Pascal Titan Xs. I want to get that amount of graphics performance from a larger number of less powerful cards. For instance, an NVIDIA GTX 680 is around 4 times less powerful than a Pascal Titan X, so I could use 16 GTX 680s to accomplish my goal. Of course, I realize one cannot use SLI for such a monstrous quantity of GPUs, and I do not want to play games on this computer. But would it be possible to link 16 cards together and use them for photo or video editing? What I mean is that all the load would get evenly divided between all the GPUs and they would all work simultaneously. I will also welcome any recommendations on which cards, and how many of them, I should use to accomplish my goal of at least 4 Pascal Titan Xs' worth of performance. Also, if you know where I can cheaply buy them in such large quantities, I am all ears (I don't want to hunt for 16 or even more individual GPUs on eBay; I would like to buy them all at once).

Now comes the even more interesting part: I would like to hackintosh my build. Would it be possible with all those GPUs and Xeons? Does OS X (yeah, yeah, I know... now macOS :-)) even support quad-processor configurations? And how about 16 GPUs? :-)

Which motherboard and chassis should I use? How about cooling? I would like to overclock the CPUs and GPUs, maybe even the RAM, so good cooling is essential.

 

I'd say yeah, but good luck finding a dual-socket board with 16 full-length PCIe slots.

i7-7700k @4.8GHz

Asus Maximus IX Hero

EVGA GTX 1080Ti FTW 3

850w EVGA PSU

32GB corsair LPX ddr4 ram 

 


1 minute ago, Xreldo said:

Aren't Maxwell Titans older than Pascal ones?

yeah

i7-7700k @4.8GHz

Asus Maximus IX Hero

EVGA GTX 1080Ti FTW 3

850w EVGA PSU

32GB corsair LPX ddr4 ram 

 


1 minute ago, 8-Bit Ninja said:

Well, this is the biggest amount of bullshit I've read today.

Me too...


It does sound quite impractical, if not outright impossible. Is there even a way to connect 16 graphics cards, and even if there were, wouldn't scaling be terrible?

i5-6600K @ 4.5 GHz |I| Hyper 212 EVO |I| ASUS STRIX Z270E |I| 8 GB DDR4 HyperX FURY
GTX 1060 Windforce OC 6GB @ 2088 MHz
DEEPCOOL TESSERACT WH |I| EVGA 500 W1 80+
1TB 7200rpm HDD |I| 120GB SSD
GN246HL 144Hz 1080p

Corsair K70 LUX RGB |I| Corsair M65 Pro RGB
PCPP: https://ca.pcpartpicker.com/b/DHzYcf


Just now, 8-Bit Ninja said:

Well, this is the biggest amount of bullshit I've read today.

Thank you for your insightful opinion. :-) You never know what might come tomorrow, though...


Also, the old Titans are like $800 while 1080s are like $500-600 and are way more powerful. Just get a few Pascal Titans.


4 minutes ago, qwertiuyop said:

Legal disclaimer: I am crazy. Like, really crazy. The project I am going to describe is sicker than all the HOLY $H!T stuff Linus has EVER shown off. You think a dual-Xeon 36-core workstation with two Maxwell Titans is way over the top? If you do, do not continue reading.

Okay, here is the thing. I want to build a beast: a workstation with four 16- or 12-core v4 Intel Xeons (they come incredibly cheap on eBay, a few hundred bucks per CPU; they are engineering samples of course, but that shouldn't be a problem), four to five Pascal Titan Xs' worth of GPU power, four PCIe SSDs (such as the Samsung 950) in RAID 0, and 256 GB of DDR4 ECC RAM.

Now comes the interesting part: I am not intending to buy 4 Pascal Titan Xs. I want to get that amount of graphics performance from a larger number of less powerful cards. For instance, an NVIDIA GTX 680 is around 4 times less powerful than a Pascal Titan X, so I could use 16 GTX 680s to accomplish my goal. Of course, I realize one cannot use SLI for such a monstrous quantity of GPUs, and I do not want to play games on this computer. But would it be possible to link 16 cards together and use them for photo or video editing? What I mean is that all the load would get evenly divided between all the GPUs and they would all work simultaneously. I will also welcome any recommendations on which cards, and how many of them, I should use to accomplish my goal of at least 4 Pascal Titan Xs' worth of performance. Also, if you know where I can cheaply buy them in such large quantities, I am all ears (I don't want to hunt for 16 or even more individual GPUs on eBay; I would like to buy them all at once).

Now comes the even more interesting part: I would like to hackintosh my build. Would it be possible with all those GPUs and Xeons? Does OS X (yeah, yeah, I know... now macOS :-)) even support quad-processor configurations? And how about 16 GPUs? :-)

Which motherboard and chassis should I use? How about cooling? I would like to overclock the CPUs and GPUs, maybe even the RAM, so good cooling is essential.

 

I am 99.999% sure you would never be able to hackintosh that thing, even if you found a way to build it, which I believe is 20 times more unrealistic.

http://linustechtips.com/main/topic/334934-unofficial-ltt-beginners-guide/ (by Minibois) and a few things that will make our community interaction more pleasant:
1. FOLLOW your own topics
2. Try to QUOTE people so we can read through things more easily
3. Use PCPARTPICKER.COM - easy and, most importantly, approved here
4. Mark your topics SOLVED if they are
Don't change a running system


And why the fuck would you actually need this? You're not running a gaming server for a hotel or something.


Eh, I dunno.

 

 

Gaming - Ryzen 5800X3D | 64GB 3200mhz  MSI 6900 XT Mini-ITX SFF Build

Home Server (Unraid OS) - Ryzen 2700x | 48GB 3200mhz |  EVGA 1060 6GB | 6TB SSD Cache [3x2TB] 66TB HDD [11x6TB]


3 minutes ago, NuclearKing said:

Are you sure there is even a power supply that can power all of that? 16 high-wattage GPUs? You're going to need something industrial-grade for that.

Such as? For instance, what did Linus use in the Compensator build? 


2 minutes ago, Xreldo said:

And why the fuck would you actually need this? You're not running a gaming server for a hotel or something.

I saw that one coming... The thing is, I want a computer that I could ideally keep for 7 to 10 years. Today, I edit multi-stream 4K video. In 7 years, that will easily be something beyond 8K with many more fps. For instance, RED cameras already do 8K at 75 fps. In a few years that will be as normal as 4K is today.


Just now, qwertiuyop said:

Such as? For instance, what did Linus use in the Compensator build? 

You can get 3000+ W PSUs (I've seen 5000 W), and you can run multiple of them together.

 

The problem is most need 240 V (some use 200 V three-phase). And you need 30 A circuits, so that is another $1000 or so for running electrical to that system.
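
To put rough numbers on that, here's a back-of-the-envelope Python sketch. The TDP figures and efficiency are assumptions (GTX 680 around 195 W, a 12-16 core Xeon E5 v4 around 135 W); real draw under load and overclocks would differ, but it shows why 120 V household circuits are out.

```python
# Rough power budget sketch for the proposed parts. All figures below are
# approximate assumptions, not measurements of any real build.
GPU_TDP_W = 195          # assumed per GTX 680
CPU_TDP_W = 135          # assumed per 12-16 core Xeon E5 v4
OTHER_W = 300            # board, drives, fans, pumps (guess)
PSU_EFFICIENCY = 0.90    # assumed at-the-wall efficiency


def wall_draw_watts(gpus: int = 16, cpus: int = 4) -> float:
    """Estimate wall draw for the whole system at full load."""
    dc_load = gpus * GPU_TDP_W + cpus * CPU_TDP_W + OTHER_W
    return dc_load / PSU_EFFICIENCY


if __name__ == "__main__":
    watts = wall_draw_watts()
    print(f"Estimated wall draw: {watts:.0f} W")        # ~4400 W
    print(f"Current at 120 V: {watts / 120:.1f} A")      # far past a 15 A household circuit
    print(f"Current at 240 V: {watts / 240:.1f} A")
```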

 

The problem with this plan is that it will probably be slower for video editing than a normal system with a 1080 and a 6900K. Most programs don't like a ton of CPUs. You also have the problem of the limited amount of VRAM on those GPUs, and they don't support the newest standards.

 


8 minutes ago, Xreldo said:

Also, the old Titans are like $800 while 1080s are like $500-600 and are way more powerful. Just get a few Pascal Titans.

The Pascal Titan is one of the most overpriced cards on the market. The goal of this project is to find components with the best performance/price ratio and use many of them.


2 minutes ago, qwertiuyop said:

I saw that one coming... The thing is, I want a computer that I could ideally keep for 7 to 10 years. Today, I edit multi-stream 4K video. In 7 years, that will easily be something beyond 8K with many more fps. For instance, RED cameras already do 8K at 75 fps. In a few years that will be as normal as 4K is today.

You're much better off buying a few high-end systems.

 

Also, software support becomes an issue. This system probably won't run Windows LTSB 2024 well (drivers), and software by then will require it.


15 minutes ago, qwertiuyop said:

I saw that one coming... The thing is, I want a computer that I could ideally keep for 7 to 10 years. Today, I edit multi-stream 4K video. In 7 years, that will easily be something beyond 8K with many more fps. For instance, RED cameras already do 8K at 75 fps. In a few years that will be as normal as 4K is today.

Did you rob a bank? ;-;


A) There's no case that fits 16 GTX 680s. You'd need at least two server rack cases to fit them into anything (like in this LTT build: https://www.youtube.com/watch?v=DhZJ66l82r8).

 

B) Even if one GTX 680 is around 25% of the performance of one Titan Xp, 4 GTX 680s aren't even close to the performance of one Titan Xp. Your 16 GTX 680s might come out around the performance of one Titan Xp. The scaling of even 2 GPUs is bad (~80%), so the scaling of 16 GPUs would be horrendous (see the rough sketch after this list). Maybe if you separated them across 8 or 16 different PCs you could get a better performance ratio out of it, but in one PC you are lucky to even get it running.

 

C) Hackintosh it? Really? You know it's been 14 years since OS X has even touched server-grade hardware? Also, IIRC, if you can't do it yourself you're going to need a motherboard that is supported by the hackintosh community, and I don't believe any 2-CPU mobo is common enough, never mind 4 CPUs. To get anything out of that build you really want some Linux distro built for servers with very good scaling. Windows could probably run on it, but scaling would be really bad. But OS X? Probably no way; hackintoshes are really unstable if your hardware differs even slightly from a Mac's, and your plan is as far from a Mac as it can be.

 

D) By the time you have every piece of hardware you need, your bill is probably going to be as much as a high-end standard PC.
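
Rough numbers for point B, treating one GTX 680 as ~25% of a Titan Xp (the assumption above). The overall efficiency values are illustrative guesses, not measurements; real multi-GPU scaling depends entirely on the workload. With perfect scaling 16 cards would be 4 Titan Xp equivalents, but at an assumed 25% overall efficiency you land at roughly one, which is the kind of outcome described above.

```python
# Rough numbers behind point B. One GTX 680 is treated as ~25% of a Titan Xp
# (the post's assumption). The "overall efficiency" values are illustrative
# guesses; real multi-GPU scaling is workload-dependent.
PER_CARD = 0.25  # Titan Xp equivalents per GTX 680 (assumed)


def aggregate(cards: int, overall_efficiency: float = 1.0) -> float:
    """Aggregate throughput in Titan Xp equivalents under a flat efficiency."""
    return cards * PER_CARD * overall_efficiency


if __name__ == "__main__":
    for n in (2, 4, 8, 16):
        print(f"{n:>2} cards: ideal {aggregate(n):.1f}, "
              f"at 80% {aggregate(n, 0.80):.1f}, "
              f"at 25% {aggregate(n, 0.25):.1f} Titan Xp equivalents")
```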


39 minutes ago, qwertiuyop said:

a few hundred bucks per CPU; they are engineering samples of course, but that shouldn't be a problem

ES CPUs can have a lot of problems, especially with motherboard compatibility.

 

If this is for professional work, don't cheap out on ES CPUs, and inform yourself about what ES CPUs entail: they are the property of Intel and illegal to sell.

 

 http://www.intel.com/content/www/us/en/support/processors/000005719.html

 

40 minutes ago, qwertiuyop said:

But would it be possible to link 16 cards together and use them for photo or video editing?

No.

 

41 minutes ago, qwertiuyop said:

Now comes the even more interesting part: I would like to hackintosh my build. Would it be possible with all those GPUs and Xeons? Does OS X (yeah, yeah, I know... now macOS :-)) even support quad-processor configurations? And how about 16 GPUs? :-)

Hackintoshes and such are against the Community Standards, if you read them, so you will have to go somewhere else to find compatibility help. Most likely, though, the answer is no.

 

44 minutes ago, qwertiuyop said:

four to five Pascal Titan Xs' worth of GPU power, four PCIe SSDs (such as the Samsung 950) in RAID 0, and 256 GB of DDR4 ECC RAM.

With 16 GPUs at x8 speed and 4 PCIe SSDs at x4 speed, that is a total of 144 PCIe lanes. You will need at least a quad-socket config, which will require E5-4xxx or E7-4xxx/8xxx CPUs, which are really hard to find ES CPUs for, and you are looking at at least $1000 for the board.
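
As a quick sanity check on that lane figure, a tiny Python snippet (the per-device lane widths are just the assumptions stated above, not the specs of any particular board):

```python
# Quick check of the lane math above: 16 GPUs at x8 plus 4 PCIe SSDs at x4.
def total_lanes(gpus: int = 16, gpu_lanes: int = 8,
                ssds: int = 4, ssd_lanes: int = 4) -> int:
    """Total PCIe lanes consumed by the proposed GPU and SSD counts."""
    return gpus * gpu_lanes + ssds * ssd_lanes


if __name__ == "__main__":
    print(total_lanes())  # 16*8 + 4*4 = 144 lanes
```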

 

 •E5-2670 @2.7GHz • Intel DX79SI • EVGA 970 SSC• GSkill Sniper 8Gb ddr3 • Corsair Spec 02 • Corsair RM750 • HyperX 120Gb SSD • Hitachi 2Tb HDD •

