Build Log Data Server 'Baldur'

Ahnzh

I can't wait for the first aluminum parts to arrive. I'll start with building the HDD case, though, since I'm not 100% sure which dimensions I want for the rest of the case ;) mATX, ATX, EEB, dunno, I'll have to see. I mean, I'm doing a lot of work now and want to be able to expand it later without spending an enormous amount of additional time.

My builds:


'Baldur' - Data Server - Build Log


'Hlin' - UTM Gateway Server - Build Log


ohh my gawd this is amazing the storage is sexy :wub:

btw can you talk german 

if so, then hi :)


ohh my gawd this is amazing the storage is sexy :wub:

btw can you talk german 

if so, then hi :)

I certainly can

 

but I refrain from it out of respect for all the people here who cannot.

 

I'm setting up my backup server at the moment, saving data onto it; then I need to get it into a datacenter.

 

It will synchronize with my server later on.
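A rough sketch of how that synchronization could be scripted, assuming rsync over SSH is available on both ends; the host name and paths below are placeholders, not the actual setup:

```python
import subprocess

# Placeholder source/destination; 'backup-host' and the paths are hypothetical.
SRC = "/srv/data/"
DST = "backup-host:/srv/data-mirror/"

def sync() -> None:
    # -a preserves permissions/ownership/times, -z compresses in transit,
    # --delete mirrors deletions so DST stays an exact copy of SRC.
    subprocess.run(["rsync", "-az", "--delete", SRC, DST], check=True)

if __name__ == "__main__":
    sync()
```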


I certainly can

 

but I refrain from it out of respect for all the people here who cannot.

we appreciate it thanks!


Same, gotta practice my English some more ;)


  • 3 weeks later...

So as a small update: I pretty much changed everything...

 

 

Now it's going to be:

3x LSI SAS 9300-8i host bus adapter cards

2x Xeon E5-2630 v2

128GB total RAM (16x 8GB Hynix ECC)

Supermicro X9DAX-iTF dual-processor mainboard with 2x 10GBase-T on board, to finally bring 10Gbit into play
 
The hardware that I already received will be built into a compute node for my private cloud.
 
And I've got a pic to share with you ;D
 

IMG 0152

 

And I've got a question: should I connect all HBAs to a single CPU, or is it better to connect one to CPU1 and two to CPU2? The Intel X540 10GBase-T chip is connected to CPU1. When I think about it, data from HBAs on CPU2 would have to travel over the QPI link to CPU1 before it gets sent out via LAN. That sounds fishy from my point of view. Does anybody know more about that? How does QPI compare to the old FSB for this?
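For anyone wanting to check this on a running Linux box: the kernel exposes each PCIe device's NUMA node in sysfs, so you can see which CPU socket an HBA or the X540 hangs off. A minimal sketch, assuming a Linux host; a reported node of -1 means the firmware didn't say:

```python
import glob
import os

# Every PCI device directory in sysfs exposes a 'numa_node' file on
# NUMA-capable kernels; -1 means no node was reported by the platform.
for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    node_file = os.path.join(dev, "numa_node")
    if os.path.isfile(node_file):
        with open(node_file) as f:
            node = f.read().strip()
        print(f"{os.path.basename(dev)}: NUMA node {node}")
```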


I don't really understand what you just said, but that's a lot of storage sitting there and a lot of processing power too.  I can't wait to see the benchmarks.  Subbed.

 

 

Definitely subbed...

"I have not seen everything and anything is possible" - J. Guru


I don't really understand what you just said, but that's a lot of storage sitting there and a lot of processing power too. -snip-

I'm a bit distracted, so it's understandable if you didn't understand what I wrote: my GF is pregnant and the birth is just a question of 48 hours. She's making my life a living hell. Today we were at a movie. We said 'OK, it will be the last movie for a long time'!

It ended with her first destroying my balls during a uterine contraction, and then, after I shouted out loud, she started insulting me as an impotent fucker in some kind of screaming voice! In the middle of a movie! DEAD SILENCE! GOD, please let this end!

I have suffered so much already!


-snip-

Oh, and I haven't slept for 48 hours already, PRE-birth! fckk


This is the block diagram of the motherboard; maybe it helps in understanding what my problem is:

 

Bildschirmfoto 2014 06 03 Um 22.51.28

 

It shows how the connections are routed.

 

CPU1 has the following connections:

  • The chipset
  • 2x PCI-E x16
  • Intel X540 LAN (2x 10 Gigabit ports)

CPU2 connects to:

  • 4x PCI-E x8

Both CPUs are connected to each other via QPI (a link to exchange data directly between CPUs instead of over the old Front Side Bus), with two 8 GT/s QPI ports.

 

In general, all data transferred to a CPU goes over its interfaces to the rest of the system: the PCI Express lanes, the memory controller, and the chipset link (the chipset handles hard drives, USB and so on).

 

As you can see, the RAM isn't shared between the CPUs; each CPU has its own local memory (NUMA). That means that when shared tasks are executed, the same data needs to be accessible from both CPUs' memory, not necessarily stored identically in the same blocks, but reachable by both.

 

In the beginning, data was shared via the Front Side Bus. That means the same interface over which data was exchanged with all other devices was also used to exchange data between the CPUs. That in turn limited how much data you could transfer to the other devices: you could suddenly find yourself unable to feed your GPU if the data exchange between the CPUs was too high. To change this, Intel introduced QPI, a direct data connection between the CPUs. Still, crossing QPI adds latency compared to a device attached directly to the local CPU. The question now is: what is the real-world performance of this link? How does it handle the data (latency- and throughput-wise) if BOTH CPUs are connected to the same storage? Is it advantageous to connect everything to CPU2, or does the dual-CPU layout profit from spreading the expansion cards across both CPUs?

 

I hope that makes it clear where the problem lies.
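To see that NUMA split on a live system, the kernel also lists which CPUs belong to each memory node. A minimal sketch along the same lines, assuming a Linux host:

```python
import glob
import os

# Each memory node lists its CPUs in /sys/devices/system/node/nodeN/cpulist,
# e.g. node0 -> "0-5,12-17" on a dual six-core board with Hyper-Threading.
for node_dir in sorted(glob.glob("/sys/devices/system/node/node[0-9]*")):
    with open(os.path.join(node_dir, "cpulist")) as f:
        cpus = f.read().strip()
    print(f"{os.path.basename(node_dir)}: CPUs {cpus}")
```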


I'm a bit distracted, so it's understandable if you didn't understand what I wrote: my GF is pregnant and the birth is just a question of 48 hours.

Well then, congratulations? :)

As for your QPI issue, I have no idea; you'd probably need to try it out.

BUILD LOGS: HELIOS - Latest Update: 2015-SEP-06 ::: ZEUS - BOTW 2013-JUN-28 ::: APOLLO - Complete: 2014-MAY-10
OTHER STUFF: Cable Lacing Tutorial ::: What Is ZFS? ::: mincss Primer ::: LSI RAID Card Flashing Tutorial
FORUM INFO: Community Standards ::: The Moderating Team ::: 10TB+ Storage Showoff Topic


Isn't that backplane meant for SSD caching? I thought Sharkoon has one.

#killedmywife #howtomakebombs #vgamasterrace


Isn't that backplane meant for SSD caching? I thought Sharkoon has one.

 

Hmm? I don't understand. The Pactech backplane? It actually does nothing special: it grabs the activity signal from the SATA power cable, amplifies it, and sends it to two pins. Apart from that, it's pretty much as if you had plugged in a cable directly.

 

Oh and my son was born today at 8:25 GMT+2


-snip-

Oh and my son was born today at 8:25 GMT+2

congratulations ;)

build log: diagonalmod (RIP?)


i know i use many of these: ( ) and these: ... (i really do... (sry...) ) edit: and edits


Are those SSDs going to be used for an SSD volume? Exciting :)

 

You have my attention, especially with all that memory.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage , Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking


-snip-

 

Yes, actually, that does clear it up for me, thanks. The little reading on QPI I've done seems to indicate it has much higher throughput for the interface itself vs. the old FSB. However, since that connection is only between the CPUs in this case, I would imagine there would be less latency if all drives are connected to just one CPU. If the drives are distributed between the two CPUs, QPI must be crossed on top of the local links when accessing data, theoretically increasing latency. So if all drives are connected to CPU1, that would limit the use of QPI and therefore reduce latency to your LAN connection. I have no idea how this would work in practice, so, like alpenwasser said, you'll have to try it out. I guess the question is: how much does QPI increase latency in this config?
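One way to try it out, sketched below: run the same direct sequential read pinned to each NUMA node in turn and compare timings. This assumes a Linux host with numactl installed; /dev/sdX and the 4 GiB read size are placeholders:

```python
import subprocess
import time

DEV = "/dev/sdX"  # placeholder: a drive behind one of the HBAs

def timed_read(node: int) -> float:
    # Pin CPU and memory to one node, then do a 4 GiB direct sequential read.
    # numactl's --cpunodebind/--membind and dd's iflag=direct are standard.
    cmd = ["numactl", f"--cpunodebind={node}", f"--membind={node}",
           "dd", f"if={DEV}", "of=/dev/null", "bs=1M", "count=4096",
           "iflag=direct"]
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start

for node in (0, 1):
    print(f"node {node}: {timed_read(node):.2f} s")
```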

 

btw, congrats on the birth of your son!

"I have not seen everything and anything is possible" - J. Guru

Link to comment
Share on other sites

Link to post
Share on other sites

I got new pics for you ;)

 

IMG 0159

 

The RAM came today, so did the fans and the PSU.

 

So what am I doing right now? Swapping the PSU's built-in fan for a new one.


And here are some more pics: the RAM and other stuff.

 

And the PSU with the new fan inside!

IMG 0160 2

IMG 0164

IMG 0163

IMG 0162

IMG 0161


As a 'smaller' update, I wanted to show the PCBs that I'm using for the drive cages. I received them some time ago but never posted the pictures.

 

IMG 0166

IMG 0165


Woah, all those bags of connector thingies. This is to make it hot swappable, right?


Woah, all those bags of connector thingies. This is to make it hot swappable, right?

It's hot swappable 'in some sort of way' from the start: you can unplug a drive's cables at any time. But I want to build drive cages that you can pull out and push back in without having to fiddle with screws and stuff. Right now I would have to move my whole server for that, and at around 40 kg that's something you don't want to do.


Oh, and I figured it's hard to picture what it will look like in the end, so here's some further detail:

 

Bildschirmfoto 2014 06 11 Um 23.44.36

 
It's just a rough design focusing on dimensions. It's 1:1 for the parts that you see; I still have to add a PSU and fans to the design sometime, but it should give you the dimensions of everything.


This looks great. Do you plan on using SAS or SATA hard drives?

Molex to SATA, lose all your data

 

 


This looks great. Do you plan on using SAS or SATA hard drives?

It's going to be both: SATA for bulk storage, and SAS (I've got 12G SAS host bus adapters) for the SSDs.

