
Is server hardware really needed for a workstation desktop? (CAD, Revit, etc.)

I may get a new work PC, and the choice is limited to Dell. I use Autodesk Revit (mostly single-threaded, with some multi-threaded work) and also lighting simulation (which uses all 8 cores of my current Xeon W-2245 at 100%).

 

Dell has the XPS series with the Intel 14900K and a 4070 GPU. They also have the Precision 7960 with the Xeon W5-3425 and an A2000 GPU. I would need 32GB, and I assume the XPS has standard RAM while the Xeon has ECC RAM. So I wouldn't take advantage of the 8 memory channels the Xeon offers.
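To put rough numbers on that channel point, here's a back-of-the-envelope sketch. The speeds are my assumptions, not Dell's published specs: DDR5-4800 on the Xeon W platform, DDR5-5600 on the 14900K; DDR5 moves 8 bytes per transfer per channel.

```python
# Rough peak-bandwidth math behind the memory-channel point. DDR5 moves
# 8 bytes per transfer per channel; the speeds are my assumptions
# (DDR5-4800 for the Xeon W platform, DDR5-5600 for the 14900K).
def peak_gb_per_s(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 8 * channels / 1000  # MT/s * bytes * channels -> GB/s

print(peak_gb_per_s(4800, 8))  # Xeon W, all 8 channels populated: 307.2
print(peak_gb_per_s(4800, 2))  # Xeon W with only 2x16GB fitted:    76.8
print(peak_gb_per_s(5600, 2))  # 14900K, dual channel:              89.6
```

So with only two DIMMs, the Xeon's bandwidth advantage mostly evaporates.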

 

Is there any stability issue going with the XPS and its consumer-grade CPU, RAM, and GPU?

 

One advantage I see with the 14900K system is the lower price and higher single-threaded performance, but it only has 8 P-cores. For the time being we still use Windows 10 at work, and I don't know if my lighting software likes the E-cores. I don't know yet if I can talk IT into giving me Windows 11, whose scheduler handles E-cores properly. I also don't know if a Dell PC is cooled well enough to sustain maximum clock speeds.

 

So I'm wondering: is spending 50-70% more on the Xeon system worth it? Is there really better stability? Before I suggest anything to IT, I'd like to get a better idea of which is actually best for work.

 

AMD Ryzen is not an option from Dell, and using Dell (and which OS) is a non-negotiable decision by our IT department.

AMD Ryzen 9 7900 + Thermalright Peerless Assassin SE

Gigabyte B650m DS3H

2x16GB G.Skill 6000 CL30

Samsung 980 Pro 2TB

Fractal Torrent Compact

Seasonic Focus Plus 550W Platinum

W11 Pro


26 minutes ago, Lurking said:

I also don't know if a Dell PC is cooled well enough to sustain maximum clock speeds.

Sorry, I had to laugh at this. Of course the CPU is adequately cooled. Who do you think they are, Apple?

 

I don’t think there’s a huge stability difference, but the Xeon platform should have more PCIe expandability and memory channels. Either one’s going to be more stable than a whitebox made out of cheap gaming PC components. 

I sold my soul for ProSupport.


6 minutes ago, Needfuldoer said:

Sorry, I had to laugh at this. Of course the CPU is adequately cooled. Who do you think they are, Apple?

 

I don’t think there’s a huge stability difference, but the Xeon platform should have more PCIe expandability and memory channels. Either one’s going to be more stable than a whitebox made out of cheap gaming PC components. 

Since I only need 32GB (2x16) and don't use expansion cards, those features are wasted in my case.

 

Our office recently got upgraded to 10 Gbit Ethernet (though all PCs still have 1 Gbit cards so far). I don't know if the XPS has the option of a 10 Gbit network card (it wasn't listed as one), but the Precision series offers it. I'm not privy to the corporate deal we get from Dell or whether they can add features I can't see on the website, but our IT most likely won't allow a separate 10 Gbit network card unless it comes from Dell. So that may be a deciding factor. I have to save my files to the network drive (every 15 minutes), and it takes a few very noticeable seconds.
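For a rough sense of what 10 GbE would buy me on those saves, here's a sketch. The file sizes are made up, and the efficiency factor is a hedge for SMB/protocol overhead, not a measured number.

```python
# Back-of-the-envelope: wire time for a model save at each link speed.
# File sizes are assumed; 'efficiency' hedges for SMB/protocol overhead.
def transfer_seconds(file_mb: float, link_gbit: float, efficiency: float = 0.7) -> float:
    bits = file_mb * 8e6                      # megabytes -> bits
    return bits / (link_gbit * 1e9 * efficiency)

for size_mb in (140, 500):                    # assumed Revit model sizes
    for link in (1, 10):
        print(f"{size_mb} MB over {link} Gbit/s: ~{transfer_seconds(size_mb, link):.1f} s")
# A 500 MB save drops from ~5.7 s on gigabit to ~0.6 s on 10 GbE --
# assuming the server and its storage can actually keep up.
```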

 

We keep PCs for 4+ years, so whatever I propose, I will be stuck with for a long time.

 



3 hours ago, Lurking said:

I may get a new work PC, and the choice is limited to Dell. I use Autodesk Revit (mostly single-threaded, with some multi-threaded work) and also lighting simulation (which uses all 8 cores of my current Xeon W-2245 at 100%).

 

Dell has the XPS series with the Intel 14900K and a 4070 GPU. They also have the Precision 7960 with the Xeon W5-3425 and an A2000 GPU. I would need 32GB, and I assume the XPS has standard RAM while the Xeon has ECC RAM. So I wouldn't take advantage of the 8 memory channels the Xeon offers.

 

Is there any stability issue going with the XPS and its consumer-grade CPU, RAM, and GPU?

 

One advantage I see with the 14900K system is the lower price and higher single-threaded performance, but it only has 8 P-cores. For the time being we still use Windows 10 at work, and I don't know if my lighting software likes the E-cores. I don't know yet if I can talk IT into giving me Windows 11, whose scheduler handles E-cores properly. I also don't know if a Dell PC is cooled well enough to sustain maximum clock speeds.

 

So I'm wondering: is spending 50-70% more on the Xeon system worth it? Is there really better stability? Before I suggest anything to IT, I'd like to get a better idea of which is actually best for work.

 

AMD Ryzen is not an option from Dell, and using Dell (and which OS) is a non-negotiable decision by our IT department.

Revit's kinda chunky, but clients working on mega-projects have been using i7 laptops with 64GB of DDR4 and haven't had many issues.

 

The reason you want ECC memory is to avoid errors being introduced. If you are designing a 50-story skyscraper, a subway station, etc., avoiding precision errors is VITAL. But if you're designing smaller things for 3D printing, or 3D assets for a video game, then it doesn't matter. ECC is about data integrity, and it doesn't matter on the GPU unless you are actually doing physics calculations.

 

I would probably recommend just sticking with the Xeon if the software doesn't say it can be used on 12th-gen (or newer) Intel. Otherwise you might end up having to disable the E-cores, and what you're left with is less than the equivalent Xeon.

 


On 3/17/2024 at 4:18 PM, Kisai said:

Revit's kinda chunky, but clients working on mega-projects have been using i7 laptops with 64GB of DDR4 and haven't had many issues.

 

The reason you want ECC memory is to avoid errors being introduced. If you are designing a 50-story skyscraper, a subway station, etc., avoiding precision errors is VITAL. But if you're designing smaller things for 3D printing, or 3D assets for a video game, then it doesn't matter. ECC is about data integrity, and it doesn't matter on the GPU unless you are actually doing physics calculations.

 

I would probably recommend just sticking with the Xeon if the software doesn't say it can be used on 12th-gen (or newer) Intel. Otherwise you might end up having to disable the E-cores, and what you're left with is less than the equivalent Xeon.

 

How do the errors manifest themselves? Is a 50-story building then 1/8" shorter than it actually is? Or could it be something seriously problematic? FWIW, my projects don't have more than 4 floors, but they have multiple disciplines (I do MEP and architecture in one model).

 

Current RAM use is under 20GB, so 32GB should be fine. That should give an idea of the size of my projects.



2 hours ago, Lurking said:

How do the errors manifest themselves? Is a 50-story building then 1/8" shorter than it actually is? Or could it be something seriously problematic? FWIW, my projects don't have more than 4 floors, but they have multiple disciplines (I do MEP and architecture in one model).

Theoretically, "cosmic rays" or electrical noise can flip bits. In floating point (which is what 3D and CAD software use for their underlying calculations), a flip can land on any bit, including the sign bit, so it could mean the difference between a number being positive or negative. In an integer, a flipped bit adds or subtracts a power of two, so depending on its position the error ranges from negligible to enormous.
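If you want to see this concretely, here's a toy Python sketch (IEEE-754 doubles only; nothing to do with Revit's actual internals) that flips a single bit of a stored coordinate:

```python
# Toy demonstration only: flip one bit of a stored coordinate and see
# how far the value moves, depending on where the flip lands.
import struct

def flip_bit(value: float, bit: int) -> float:
    """Return 'value' with one bit of its 64-bit representation flipped."""
    (raw,) = struct.unpack("<Q", struct.pack("<d", value))
    (out,) = struct.unpack("<d", struct.pack("<Q", raw ^ (1 << bit)))
    return out

height_mm = 150_000.0              # a 150 m building height, stored in mm
print(flip_bit(height_mm, 63))     # sign bit -> -150000.0
print(flip_bit(height_mm, 62))     # top exponent bit -> ~8.3e-304 (gone)
print(flip_bit(height_mm, 51))     # top mantissa bit -> 215536.0 (65 m off)
print(flip_bit(height_mm, 5))      # low mantissa bit -> off by ~1e-9 mm
```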

 

In the context of a building this could result in things being moved anywhere from fractions of a millimeter to entire meters. So you might not catch those "off by a few millimeters" errors because they will be introduced over multiple saves of the document.

 

In practice, RAM is hardly that unreliable, and there are no well-known cases where skipping ECC memory resulted in a safety problem or an expensive cost overrun. Usually memory errors crash the computer and corrupt the document as a consequence, so you end up losing time redoing work, and THAT is when errors get introduced: you redo the data that was lost and miss things, because you don't know exactly what was lost.

 

2 hours ago, Lurking said:

Current RAM use is under 20GB, so 32GB should be fine. That should give an idea of the size of my projects.

You should always have more RAM than you think you need, but honestly the jump from 32GB to 64GB is cheap, except when the machine is a workstation/server on a Xeon.

 

Anyway, if the software says it will not work on a specific CPU, then short of picking something like an 11th-gen Intel CPU, I wouldn't risk buying a 14th-gen part only to find out you have to disable the E-cores to get things to work. The Xeon is the better idea, but you don't have to get a Xeon W; you could get a Xeon E, which is basically the same as an i3/i5/i7 with only P-cores.
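If you did get stuck on a hybrid chip, one possible middle ground instead of disabling E-cores in the BIOS is pinning the simulation process to the P-cores. A minimal sketch using psutil; it assumes the 14900K's 8 P-cores (with Hyper-Threading) enumerate as logical CPUs 0-15, which is the usual layout for Intel hybrid parts but worth verifying in Task Manager:

```python
# Hypothetical workaround sketch, not vendor guidance: confine a process
# to the P-cores instead of disabling E-cores in the BIOS. Assumes the
# 14900K's 8 P-cores (Hyper-Threaded) enumerate as logical CPUs 0-15.
import psutil

P_CORE_CPUS = list(range(16))  # CPUs 0-15 = 8 P-cores x 2 threads (assumed)

def pin_to_p_cores(pid: int) -> None:
    """Restrict the given process (e.g. the lighting simulator) to P-cores."""
    proc = psutil.Process(pid)
    proc.cpu_affinity(P_CORE_CPUS)
    print(f"{proc.name()} (PID {pid}) now limited to CPUs {P_CORE_CPUS}")
```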

 


6 hours ago, Kisai said:

Theoretically, "cosmic rays" or electrical noise can flip bits. In floating point (which is what 3D and CAD software use for their underlying calculations), a flip can land on any bit, including the sign bit, so it could mean the difference between a number being positive or negative. In an integer, a flipped bit adds or subtracts a power of two, so depending on its position the error ranges from negligible to enormous.

 

In the context of a building this could result in things being moved anywhere from fractions of a millimeter to entire meters. So you might not catch those "off by a few millimeters" errors because they will be introduced over multiple saves of the document.

 

In practice, RAM is hardly that unreliable, and there are no well-known cases where skipping ECC memory resulted in a safety problem or an expensive cost overrun. Usually memory errors crash the computer and corrupt the document as a consequence, so you end up losing time redoing work, and THAT is when errors get introduced: you redo the data that was lost and miss things, because you don't know exactly what was lost.

 

You should always have more RAM than you think you need, but honestly the jump from 32GB to 64GB is cheap, except when the machine is a workstation/server on a Xeon.

 

Anyway, if the software says it will not work on a specific CPU, then short of picking something like an 11th-gen Intel CPU, I wouldn't risk buying a 14th-gen part only to find out you have to disable the E-cores to get things to work. The Xeon is the better idea, but you don't have to get a Xeon W; you could get a Xeon E, which is basically the same as an i3/i5/i7 with only P-cores.

 

Thanks for the explanation. Considering the cost of the software and the person using it, it seems cheaper to get the more expensive hardware than to worry about problems.

 

I talked to the manufacturer of the Revit lighting plugin. They said E-cores won't do much and recommended 10+ P-cores. That pretty much excludes the 14900K and similar cheaper CPUs. Since Dell doesn't offer AMD Ryzen, that leaves the Xeons with 10+ "real" cores; the W5-3425 is probably a good choice. That also answers the ECC memory question. (I know Dell offers AMD Ryzen in gaming PCs, but no one in IT would support that idea.)



I have been a Revit BIM tech for almost 20 years, and I have never once had 'workstation' hardware. My current machine is an 11600K in an ASRock Z590 board, 2x32GB of regular Team Group gaming memory at 3200MT/s, and a GTX 1060 6GB that was my old gaming card. I run project files up to 140MB with no problem, though I don't do much scene rendering these days. Before that I had a 4790K with 2x8GB of 1600MT/s RAM, and that machine only supported SATA SSDs.

Revit really doesn't need a huge amount of system resources until you get into very large files. Puget Systems does a lot of hardware testing for Revit, and outside of very limited circumstances consumer hardware is totally fine, especially since Revit doesn't rely much on GPU acceleration, so any reasonably modern card with at least 4GB of VRAM will do.

My Current Setup:

AMD Ryzen 5900X

Kingston HyperX Fury 3200MHz 2x16GB

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

EVGA RTX 3060 Ti XC

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic Fans 140mm x4 120mm x 1

 

