
Asus Z170 motherboards (update: round-up)

Odd that the mere "Ranger" boards look much better than the "it-would-be-much-nicer-if-you-could-remove-all-this-shit" Hero version.

-------

Current Rig

-------


Previous generations looked better.

CPU-delidded i5-4670K@4.6GHz 1.42v R.I.P. (2013-2015) MOBO-Asus Maximus VI Gene GPU-Gigabyte GTX 970 G1 Gaming@1582MHz core/3744MHz memory COOLING-Corsair H60 RAM-1x8GB Crucial Ballistix Tactical Tracer@2133MHz 11-12-12-26  DRIVES-Kingston V300 60GB, OCZ Trion 100 120GB, WD Red 1TB
2nd fastest i5-4670K in GPUPI for CPU - 100M
 

Very pretty boards. Love the Gene, and I hope vendors make more mATX boards, since most people don't need more than that anyway. The 110 mm M.2 slot is also very nice. Let's see some 110 mm NVMe M.2 SSDs soon.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel i7 4790K @ 4.6 GHz with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810 / MX Master; OS: Windows 10 Pro


No watercooling for the VRMs? Fail. MSI has buried them in board design this time around.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No watercooling for the VRMs? Fail. MSI has buried them in board design this time around.

 

 

That would be the Maximus Formula. Those don't get released till a month or two later.

[ Cruel Angel ]: Exterior - BENQ XL2420T | SteelSeries MLG Sensei | Corsair K70 RED | Corsair 900D

Interior - i7 4770k | Maximus VI Formula | Corsair Vengeance Pro 16GB | ASUS GTX 980 Strix SLI x2 | 840 Pro 512GB | WD Black 2TB

Cooling - XSPC 120mm x7 Total Radiator Space | XSPC RayStorm | PrimoChill Tubing/Res

Clocks - CPU: 4.7GHz @ 1.425v | RAM: 2400MHz OC @ 1.650v | GPU: 1000MHz @ 1.158v


Sauce please?


What the hell, no PCIe covers and only one M.2?

HTID


What the hell, no PCIe covers and only one M.2?

You mean no gimmicks and only one port for a niche product that very few people will use, even among buyers of the board?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Which is totally irrelevant on a platform that hasn't got the PCI-E lanes to actually do 4-way SLI or Crossfire. That's what X99 is for.

PLX chips.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Lol, I am an engineer, and pretty much everyone uses consumer-socket workstations. For computation-heavy tasks like CFD or FEA, we offload the job to a server. In my company's case we have one server with a pair of Teslas.

 

It is much cheaper for a company to buy many mid-grade workstations and a very powerful server-based machine or two than it is to buy high-end enthusiast-socket workstations for all of engineering. Large companies use workstation boards on the consumer socket; small companies use workstation boards on the enthusiast socket because they don't want to invest in a powerful computational machine.

The cheapest solution (since you're always hosting the server anyway) is actually to host everyone on a VM on that server machine and connect to it with a $200 Dell Optiplex or something similar.
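
For the curious, a minimal sketch of what that thin-client setup looks like in practice, assuming the VMs expose RDP (the host and user names here are hypothetical): the cheap box on the engineer's desk just opens a full-screen remote session into their VM.

# connect the thin client to the engineer's VM over RDP (FreeRDP client)
xfreerdp /v:eng-vm07.corp.example.com /u:jsmith /f

All the actual compute and storage stays on the server; the Optiplex only has to push pixels.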

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nice, I like the new colour scheme!

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


What the hell, no PCIe covers and only one M.2?

Your avatar makes it look like the shark said that. lol

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


PLX chips.

PLX chips are fucking expensive and introduce latency. By the time you've spent an extra $100-150 on a motherboard with a PLX chip, you might as well have gone X99 in the first place.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


The cheapest solution (since you're always hosting the server anyway) is actually to host everyone on a VM on that server machine and connect to it with a $200 Dell Optiplex or something similar.

 

I have never seen any company do that for their engineering department. 

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


PLX chips are fucking expensive and introduce latency. By the time you've spent an extra $100-150 on a motherboard with a PLX chip, you might as well have gone X99 in the first place.

Just having one extra PLX chip gives you 3-way SLI, even with 4 PCIe lanes to spare. And that's only $35 ($50 if the mobo maker is shaking you down).
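
Rough lane math, as a sketch (assuming a PEX 8747-style 48-lane switch; the exact spare count depends on how the board wires it up):

# CPU uplink into the PLX:  16 lanes (all of Skylake's CPU lanes)
# PLX downstream:           32 lanes
# 3-way SLI at x8/x8/x8:    24 lanes used, downstream lanes left over

The switch multiplexes the three cards over the single x16 uplink, which is also where the added latency comes from.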

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I have never seen any company do that for their engineering department. 

IBM and Nielsen Group both do this. That's how I'm currently working (for IBM as an intern).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


IBM and Nielsen Group both do this. That's how I'm currently working (for IBM as an intern).

 

Interesting. Everywhere I have ever worked as an engineer or intern has used medium-powered workstations for engineers.

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Interesting. Everywhere I have ever worked as an engineer or intern has used medium-powered workstations for engineers.

If the company doesn't have the networking infrastructure or the software stack to let the client (engineer) side do the graphics while the big iron handles the calculations and file system, then you won't see it, but once you have that sort of infrastructure it's uber cheap to run that way. I work off an i3 4130T, for instance. Plenty of graphics power for what I need (basically running Emacs in Tramp mode and rendering performance graphs from IBM's profilers).
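
If you haven't seen Tramp before: it just means Emacs opens files straight off the remote box over SSH, so the editing happens locally while the files and heavy lifting stay on the server. A minimal example (user, host, and path are made up):

# open a remote file over SSH via Emacs Tramp
emacs "/ssh:jsmith@bigiron.example.com:/home/jsmith/runs/profile.dat"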

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


If the company doesn't have the networking infrastructure or the software stack to let the client (engineer) side do the graphics while the big iron handles the calculations and file system, then you won't see it, but once you have that sort of infrastructure it's uber cheap to run that way. I work off an i3 4130T, for instance. Plenty of graphics power for what I need (basically running Emacs in Tramp mode and rendering performance graphs from IBM's profilers).

 

Everywhere I have been has run a midrange Quadro and an i5, plus a compute server for larger simulations. I see how it would be cheaper, but I have worked for Case New Holland (a $40 billion company) and even they ran the same type of workstations. *Shrugs*

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Hate the PCIe layout. Though at least they got the top slot right; a full x16 should never be at the top. Other manufacturers need to learn to put the first x16 second from the top as well.

 

But the spacing between the two topmost x16 slots is one slot too wide for me. I didn't install a brutal 180mm Air Penetrator fan in the bottom of my case, blowing directly onto the GPU area, just to end up with a hideous gap between GPUs.

 

Is it DDR4, DDR3, or both? If it's DDR3, I'll let my brother know; he might be interested.

You want the cards spaced closer together, so there's less airflow between them, so you have to put a loud fan in to cool them off?


Z170 Pro looks nice.

Sadly, it's been cancelled.

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 



I gotta know, on the Zeus board:

http://i59.tinypic.com/6nq9z6.jpg

 

Where the HELL is the PCI Express?!?!?!?!

There are two Radeon 7970s under that massive heatsink.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III

