
Don’t ask me the price - HP Z8 Fury G5 (SPONSORED)

Sveeno

Thanks to HP for sponsoring this video! Learn more about the HP Z8 Fury G5 at: https://bit.ly/3KMvOq0

 

HP’s new Z8 Fury G5 is a monster! With the new Intel Xeon w9-3495X workstation CPU, four (FOUR!?) Nvidia RTX A6000 GPUs, and up to 2 TERABYTES of RAM, this will make your puny best-of-class PCPartPicker dream machine look like a Raspberry Pi by comparison.


It's HP. It's automatically overpriced.

 

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e-mail will alert me.


8 minutes ago, Sveeno said:

Thanks to HP for sponsoring this video! Learn more about the HP Z8 Fury G5 at: https://bit.ly/3KMvOq0

 

HP’s new Z8 Fury G5 is a monster! With the new Intel Xeon w9-3495X workstation CPU, four (FOUR!?) Nvidia RTX A6000 GPUs, and up to 2 TERABYTES of RAM, this will make your puny best-of-class PCPartPicker dream machine look like a Raspberry Pi by comparison.

What's the price?


Just now, XsunaTera66 said:

What's the price?

I like to live on the dangerous side of life.


I'm pretty sure that the HP Z1 G9, which only has an i9-12900K and a 3070 Ti, costs like 70% more than a self-built PC.

Specs: Motherboard: ASUS X470-PLUS TUF Gaming (Yes, I know it's poor, but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 9 5900X Case: Antec P8 PSU: Corsair RM850x Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


I absolutely adore the daughterboard design of those M.2s.

Or even just generally, I believe we should have more daughterboards, for ease of access and usage. I wouldn't mind more stuff like the ASUS DIMM.2, or PCIe daughterboards that move the x1-x4 PCIe slots somewhere else. It would even be refreshing to see the old 90's PCI/ISA/AGP daughterboard design come back to life, where all the PCI/ISA/AGP ports sat on their own daughterboard coming off the motherboard and all your extra cards were vertical, sitting on top of the PSU/HDD enclosure. Especially now that GPUs seem to need their own case, or at least we need to bring back the eATX glory to fit one, moving the "not-that-important" PCIe slots somewhere else could actually be useful.

 

Of course it would require new cases and blablabla... As if the probably-6-slot RTX 6090 won't require a new case in itself, or the future 480/600mm radiators needed to cool the Opium Lake and Zen6 CPUs won't need new cases designed just to fit them. It's the same kind of garbage reasoning as the upcoming ATX 3.0 standard, which seems to be carrying on with connectors designed in the 1950's because "they are still good connectors and they can handle the power and they are asswards compatible". The connector is garbage compared to what we can manufacture and could use today. Yes, the connector handles the power, but only as long as no one inserts it wrong, which is possible because the 50's connector design is garbage. And no, the ATX 3.0 standard most likely won't be compatible at all with ATX 2.x cables, so why should connector compatibility matter? So some Chinese sweatshop doesn't need to get new crimps that they should have renewed a decade ago anyway?


I wish I had just one of those GPUs... my A4000 is great, but I wish it had more VRAM.

🌲🌲🌲

◒ ◒


That's a lot of fucking graphics cards..

Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900X w/ PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020MHz core & 2650MHz memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600MHz | Corsair RM1000x | WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 | PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Odyssey G7, ASUS TUF GAMING VG27AQZ 27" & MSI G274F

 

I also drive a Volvo, as one does being Norwegian haha, a Volvo V70 D3 from 2016.

Reliability was a key thing and it's my second car, working pretty well for its 6 years of age xD


Just one of those cards is like 7k in Norway.. Jesus.



Overengineered, overpriced & prob something is undervolted to its knees 


4 hours ago, Gokul_P said:

Overengineered, overpriced & prob something is undervolted to its knees 

9 hours ago, Radium_Angel said:

It's HP. It's automatically overpriced.

 

9 hours ago, XsunaTera66 said:

What's the price?

Considering this is enterprise stuff, it's expensive due to features like management, reliability, and pro-level customer support. Also the crazy engineering to cool, fit, and power four ducking GPUs.

 

As for underclocking, I haven't seen a single piece of enterprise gear overclocked.

 

Not to mention this is a solution-based product (like Motorola radios, Panasonic Toughbooks, basically any enterprise-level hardware), which is hard to stomach money-wise, but when a company needs it, they need it.

"Everyone, Creator" - 初音ミク Hatsune Miku Google commercial.

 

 

Cameras: Main: Canon 70D - Secondary: Panasonic GX85 - Spare: Samsung ST68 - Action cams: GoPro Hero+, Akaso EK7000 Pro

Dead cameras: Nikon S4000, Canon XTi

 

PCs


Dell OptiPlex 5050 (main) - i5-6500 - 20GB RAM - 500GB Samsung 970 EVO - 500GB WD Blue HDD - DVD R/W

HP Compaq 8300 prebuilt - Intel i5-3470 - 8GB RAM - 500GB HDD - Blu-ray drive

Old Windows 7 gaming desktop - Intel i5-2400 - Lenovo CIH61M V:1.0 - 4GB RAM - 1TB HDD - dual DVD R/W

Main laptop: Acer E5 15 - Intel i3 7th gen - 16GB RAM - 1TB HDD - DVD drive

School laptop: Lenovo 300e Chromebook 2nd gen - Intel Celeron - 4GB RAM - 32GB SSD

Audio Mac: 2017 Apple MacBook Air A1466 EMC 3178

Any questions? PM me.

#Muricaparrotgang                                                                                   

 


CaN ThIs GaMe

I try to be a human, but I cannot, because I have returned to monke.


Hehe boi


POV- when it can run crysis-

 ( ͝° ͜ʖ͡°)

 


12 hours ago, Gokul_P said:

Overengineered, overpriced & prob something is undervolted to its knees 

it's pro gear from HP, so:

- overengineered: yes. because every minute this system is down is costing the customer money, so it's built for fast servicing (including the whole daughterboard thing for the SSDs). their customers don't care about standard components, they care about uptime.

- overpriced: definitely more than the sum of its parts, but that's not what you're buying. you're buying a validated platform that your IT guys can plug in and then go fuss with something else. this tier of products is very far into "not for you" territory. it's aimed at businesses where time costs exponentially more than hardware.

- undervolted to its knees: i have NO idea where this keeps coming from around HP's gear. most of their business gear is designed to run balls to the wall 24/7 for a decade. you may need "rack mode" on this one to get the absolute last inch of performance out of it, but you would need the same amount of fan screaming out of a system made from off-the-shelf parts either way.

 

---

 

on the topic of rack mode: that's actually a big selling point for businesses to buy HP workstations. HP includes a sort of "remote desktop on steroids" with their workstations that lets the screamy-fan workstations sit in a rack in the server room while the staff work on slim desktops or even laptops. as long as you have 80-ish Mbps of bandwidth between the client and the workstation, it's almost like working locally, even with tasks where image quality is a big issue.
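for a rough sanity check on why 80-ish Mbps is enough, here's a back-of-envelope sketch. the bits-per-pixel figure is my assumption for a visually-lossless-ish low-latency codec, not an HP spec:

```python
# back-of-envelope: does a high-quality remote-desktop stream fit in ~80 Mbps?
# the 0.10 bits-per-pixel value is an assumed codec efficiency, not an HP spec.

def stream_mbps(width: int, height: int, fps: int, bits_per_pixel: float) -> float:
    """estimated stream bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

for label, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f"{label} @ 60 fps: {stream_mbps(w, h, 60, 0.10):.0f} Mbps")

# 1440p @ 60 fps: 22 Mbps
# 4K @ 60 fps: 50 Mbps   -> both fit inside an 80 Mbps link with headroom
```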

they used to have an exploitable endless 60-day trial (they don't anymore, RIP), so i got quite some good use out of it.

 

and it's nice to see that HP, in their transition to "Z branding", is still putting HUGE focus on the serviceability of these things. a few years back i had a few Z800 series under my watch, and they are probably the most well-designed airflow setups i've ever seen in a prebuilt device. keeping dual xeons and bucketloads of RAM cool in a chassis quiet enough to live under a desk is something even a lot of boutique builders would struggle with.

oh.. and the fan trays come out, so you can step out of the lab to dust those off without having to either take out the entire device or take the entire thing apart. servicing these is faster than servicing my gaming desktop with all off-the-shelf parts. that's what these devices are about.

and of course.. toolless is the norm in enterprise gear.


4 hours ago, manikyath said:

toolless is the norm in enterprise gear.

And should be more common in all areas of IT. 



3 minutes ago, Radium_Angel said:

And should be more common in all areas of IT. 

eh. bolts and screws have their place, especially in stuff that should stay in place either way, like monitor arms.

 

something that should happen more in the IT space but seems to be limited to the woodworking industry is "90° fasteners". it's like a screw, but it just slots into place and "locks" with a 90-degree turn. combine that with a thumb-screw-style grip and we have a product. the woodworking industry has these things optimized down to a mass-produced product at a part cost similar to wood screws, so i'd be amazed if the computer space couldn't come up with something similar that isn't a plastic latch for once.


17 minutes ago, manikyath said:

come up with something similar that isn't a plastic latch for once.

That would be nice; we should 3D something up and start making them.

I'll be the manager, you know, making the difficult decisions and examining the big picture; you be the worker, that's the easy part 😆




21 hours ago, I_AM_ T3X said:

But how fast is it for folding at home? 

I'd love to see LTT try running F@h on all four of those GPUs at once. I've run F@h on a pair of RTX A5000 cards before, and they managed well over 5 million PPD each. 
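As a very rough extrapolation (assuming near-linear scaling across cards and that an A6000 lands in the same PPD ballpark as an A5000, both of which are assumptions rather than benchmarks):

```python
# hypothetical Folding@home back-of-envelope, not a benchmark.
# the per-card PPD comes from the A5000 anecdote above; treating an A6000
# as roughly comparable, and scaling as near-linear, are assumptions.
ppd_per_card = 5_000_000   # points per day per GPU (anecdotal A5000 figure)
cards = 4

print(f"estimated combined output: {ppd_per_card * cards:,} PPD")
# estimated combined output: 20,000,000 PPD
```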

Phobos: AMD Ryzen 7 2700, 16GB 3000MHz DDR4, ASRock B450 Steel Legend, 8GB Nvidia GeForce RTX 2070, 2GB Nvidia GeForce GT 1030, 1TB Samsung SSD 980, 450W Corsair CXM, Corsair Carbide 175R, Windows 10 Pro

 

Polaris: Intel Xeon E5-2697 v2, 32GB 1600MHz DDR3, ASRock X79 Extreme6, 12GB Nvidia GeForce RTX 3080, 6GB Nvidia GeForce GTX 1660 Ti, 1TB Crucial MX500, 750W Corsair RM750, Antec SX635, Windows 10 Pro

 

Pluto: Intel Core i7-2600, 32GB 1600MHz DDR3, ASUS P8Z68-V, 4GB XFX AMD Radeon RX 570, 8GB ASUS AMD Radeon RX 570, 1TB Samsung 860 EVO, 3TB Seagate BarraCuda, 750W EVGA BQ, Fractal Design Focus G, Windows 10 Pro for Workstations

 

York (NAS): Intel Core i5-2400, 16GB 1600MHz DDR3, HP Compaq OEM, 240GB Kingston V300 (boot), 3x2TB Seagate BarraCuda, 320W HP PSU, HP Compaq 6200 Pro, TrueNAS CORE (12.0)


  • 2 weeks later...
On 3/8/2023 at 4:33 PM, manikyath said:

it's pro gear from HP, so:

- overengineered: yes. because every minute this system is down is costing the customer money, so it's built for fast servicing (including the whole daughterboard thing for the SSDs). their customers don't care about standard components, they care about uptime.

 

Over-engineered? Maybe. Engineered to use all kinds of proprietary parts, rather than common standard parts? Definitely. You're going to need that extra reliability when you can't grab any part off the shelf.

 

The reality of the situation is that the value of OEM systems like this is in the support contracts, or more accurately, the ass covering this provides. Now you can show you did due diligence when something breaks down. The enterprise world revolves around ensuring you are not the one who will be blamed.

 

If you are actually serious about uptime you'd look towards cold or hot spares. It's hard to beat the reliability of two systems compared to almost any single one. But unless you're already buying something with a support contract, that doesn't get you a Get out of jail free card.


It's amazing to me that a comparatively "affordable" consumer system gets about two thirds the performance of one of these in Cinebench, and will also handily beat it in single-threaded workloads. The amount of power we can have on tap in recent years is insane. Competition is a wonderful thing.

 


6 hours ago, XNOR said:

Over-engineered? Maybe. Engineered to use all kinds of proprietary parts, rather than common standard parts? Definitely. You're going to need that extra reliability when you can't grab any part off the shelf.

HP's enterprise customers don't care about standard parts, they care about quick maintenance => low downtime.

 

being able to pull an ATX power supply off any store shelf doesn't mean shit if you can get a spare from HP overnight and it takes literally a minute to swap, as opposed to half an hour to sort ATX out.

6 hours ago, XNOR said:

 

The reality of the situation is that the value of OEM systems like this is in the support contracts, or more accurately, the ass covering this provides. Now you can show you did due diligence when something breaks down. The enterprise world revolves around ensuring you are not the one who will be blamed.

the enterprise computer market revolves around making a reliable product, with easy access to whatever wear parts might need replacing in its lifetime.

 

the enterprise market isn't about "blame", no one cares to "blame" someone for a workstation that is down. getting that workstation back up and running fast - even when the properly trained techs aren't on site - is INCREDIBLY valuable however.

 

6 hours ago, XNOR said:

 

If you are actually serious about uptime you'd look towards cold or hot spares. It's hard to beat the reliability of two systems compared to almost any single one. But unless you're already buying something with a support contract, that doesn't get you a Get out of jail free card.

yes. every place that has serious reliability concerns has spares ready to go. now go back to the time required to swap an ATX power supply. also - HP's spare parts are *very* easy to get a hold of, and they come through the same supply chain as their enterprise computers, so there's no weird billing stuff for accounting to deal with.

 

and again - there's no "jail" - the enterprise sector doesn't give a hoot about blame, they give a hoot about reliability and uptime, which includes speed of maintenance. if you're making choices that slow down that process however.. that's where blame comes in.

 

having all that said - it's clear you haven't been in a position where you had to manage a large pool of workstations. you may be in a team of 5 or 6, if you're lucky, managing hundreds of workstations across multiple locations. of those 5 or 6, only 2 may have the appropriate skill to build their own computer properly (read: the skill required to swap out an ATX power supply, or an M.2 drive on the motherboard).

the pool of devices is big enough that the following can be assumed:

- for the 'average' administrative desktops, there's enough in rotation that you have one or two spares for the biggest locations. even if they have generational differences, they can be assumed to be an in-place swap for most systems of that category.

- all servers and network equipment have spares on-site for every potential failure point. (spare hard drives, spare power supplies, spare fan units, maybe even an entire spare network switch.)

- for the high-end workstations it might not be sensible to have spare systems, so you stock the expected wear components: power supplies, fans, storage drives. for example, in the case of the SSD risers.. you could have a few risers sitting around with the "default loadout" for most workstations on-site. in the worst-case scenario you can just call the "pair of hands" from facility and tell *them* to shut down the system, press the release levers, and stick the other module back in. once the system is back up and running, the people with proper qualifications can come after the fact and recertify the part that came out.

 

the absolute primary goal is "getting things back up". every minute you need to get it back up is a minute that:

- you are busy (and costing money)

- the system is down (and not making money)

- the person at that system is waiting (and costing money)

 

and to address this:

6 hours ago, XNOR said:

It's amazing to me that a comparatively "affordable" consumer system gets about two thirds the performance of one of these in Cinebench, and will also handily beat it in single-threaded workloads. The amount of power we can have on tap in recent years is insane. Competition is a wonderful thing.

 

this system is so far into the "diminishing returns" pit that the way to compare the cost of these really comes down to "what does it do for the productivity of the person sitting behind it, and what do *they* cost?". (or potentially.. the person that isn't sitting behind it, because ZCentral.)

 

as a sort of half-related example (lots of details cut out because of NDAs):

- company with head office in the EU

- an application that needs high-bandwidth access to the main data storage (EU). the two dozen very-high-cost employees (take 15K/month as a base level here) on the EU side can run this locally because they are in-office. the 6 they have overseas however.. can't. so they have a pair of desktops to remote into on-site in the EU.

 

in this situation, if those desktops were down, that's costing the corporation 560 bucks per hour of downtime, just in the wages of those employees. the cost of delivering a project late may potentially dwarf that. and due to timezones, when the systems went down for the people abroad, the people in the EU were usually off the clock. (= VERY expensive engineer hours.)
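the wage math behind that figure, as a minimal sketch (the ~160 working hours per month used to convert the 15K/month figure is my assumption):

```python
# rough downtime cost for the example above.
# assumption (not from the example): ~160 working hours per month.
monthly_wage = 15_000        # per overseas employee, from the example
hours_per_month = 160        # assumed
overseas_employees = 6

hourly_cost = monthly_wage / hours_per_month            # 93.75 per hour
print(f"{hourly_cost * overseas_employees:.0f} bucks/hour of downtime")
# 562 bucks/hour of downtime  -> the "560 bucks per hour" above
```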

 

so.. when these boxes were showing reliability concerns, it took management all of 5 minutes to approve new server infrastructure, because just the POTENTIAL of downtime on a system that wasn't repair-friendly at all (ratty old desktops, screwdriver required) was so much of a financial impact that it was "cheap" to buy shiny new servers to replace what was essentially just an office PC.

 

---

 

OR.. to put it very briefly:

you are not their customer, HP doesn't expect you to have any desire for what they have created.

 

it's sort of like criticizing the value of a humvee as a daily commute vehicle.. when it was clearly designed to sit on a battlefield in the desert. (and yet.. people commute with them, which is dumb.)

