
3D Modeling & Design – Do you REALLY need a Xeon and Quadro??

11 hours ago, dalekphalm said:

Why didn't they ask AMD to send some professional workstation cards? Dunno (maybe they did and AMD ignored them, replied too late, or denied the request).

 

Because "They had such a low market share, we didn't consider adding it" - Floatplane


 


8 hours ago, JamieSinn said:

Which version of SW was this done with? Did SW Electrical get accounted for?

 

 

Considering the timing of this video, I'm guessing it was a 2018 copy of Solidworks, and I doubt that Electrical would have been accounted for, though I think it would be interesting to see if it changes anything.


2 hours ago, B1gChief said:

I've done 3D "hard surface", "high poly" work up to 5 million polys with a little lag, but nothing that stops you from working at a productive rate on a Surface Pro. It's the rendering that will kill a laptop or low-end PC.

 

But...

With the help of cloud rendering services like Red Rocket and Autodesk cloud rendering (Google even offers a cloud render farm), rendering stops being a problem, and you can't discredit these, since a lot of movie VFX companies use such render farms to do a fair bit of their rendering for big-budget films.

 

So it really doesn't matter what PC you have unless it's below the required specs for the 3D software you are using; then major problems will occur.

You are comparing apples and oranges a bit. CAD is a completely different way of modeling. I can promise you (as someone who killed many laptops working in CAD during my time at university) that a Surface Pro would not be up to the task of CAD unless you were only working on single parts, and even then you would still experience a lot of issues just running the program, and the heat from the Surface Pro would also be a concern.


16 hours ago, bimmerman said:

I disagree-- neither 3d visual arts nor engineering owns those terms.

 

As an engineer, I design solid models, thus am doing both 3d Modeling and Design (and simulation and....). In my industry that is what those two terms refer to, and is how my services are described when billing clients. CAD is the field, modeling and design is the activity. The video content was exactly what I expected based on my usage of the terms in the video title.

 

In the 3d visual arts industry, I would expect modeling and design to refer to the programs you mention, so you're right in that sense; however, the terms 'Modeling' and 'Design' are not exclusive to either visual arts or engineering fields.

I work in the visual effects industry and I entirely agree with this. We don't own those terms, and CAD-type stuff is for sure 3D modeling and design as well.


I use CAD software daily and was trying to figure this out when we were building CAD systems two years ago.

 

Which graphics card works best really depends on the software, mostly on which graphics API its renderer uses.

OpenGL is used by NX, CATIA, Solidworks, Solid Edge, and Creo.

DirectX is used by AutoCAD, Inventor, 3ds Max, and Showcase (mostly Autodesk products).

 

For OpenGL software, Quadro cards work better, as shown in this video.

For DirectX software, Quadros run just as well as their comparable GeForce alternatives, although the GeForce card is a lot cheaper.
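To make that mapping concrete, here's a toy Python lookup (my own sketch, just encoding the claims in this post, not anything from the video):

```python
# Toy lookup encoding the post's rule of thumb: which graphics API each CAD
# package uses, and therefore which card class tends to be the better buy.
RENDER_API = {
    "NX": "OpenGL", "CATIA": "OpenGL", "Solidworks": "OpenGL",
    "Solid Edge": "OpenGL", "Creo": "OpenGL",
    "AutoCAD": "DirectX", "Inventor": "DirectX",
    "3ds Max": "DirectX", "Showcase": "DirectX",
}

def suggested_card(software: str) -> str:
    api = RENDER_API.get(software)
    if api == "OpenGL":
        return "Quadro (driver optimizations pay off)"
    if api == "DirectX":
        return "GeForce (similar speed, much cheaper)"
    return "unknown -- check the vendor's certified hardware list"

print(suggested_card("Inventor"))  # -> GeForce (similar speed, much cheaper)
```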

 

This is the reason we specced GTX 970s and 6700Ks for CAD stations that are used for Inventor and AutoCAD.

The funny thing is that when you try to get support from Autodesk for a GeForce card, they always say it isn't supported and that that is probably causing the problem.

In my experience I haven't seen any difference between Quadro and GeForce cards; even though I have a Lenovo P51 with a Quadro M2200, all the bugs are still there.

 

CPU usage for day-to-day modeling is all single-core, and rendering/simulating is multi-core, but even while simulating, the meshing doesn't tend to be multi-core.

 

My P51 has a Xeon E3-1505M v6 laptop CPU, mostly because the Xeon CPUs have a higher boost clock for single-core loads, in this case 4 GHz, which is not too bad compared to the 7700K in the desktop.


5 hours ago, B1gChief said:

I've done 3D "hard surface", "high poly" work up to 5 million polys with a little lag, but nothing that stops you from working at a productive rate on a Surface Pro. It's the rendering that will kill a laptop or low-end PC. […]

To be fair, that's entirely different from solid modeling. You're not rendering polygons in space, you're building solid models (i.e., not faceted infinitesimally thin surface geometry). Another way of looking at it: solid modeling is building stuff with blocks of clay, which have volume, mass, density, and thickness; polygon modeling is making detailed surfaces with a bunch of...polygons...that are essentially infinitely thin, which can enclose a volume but do not have material properties. It's a fundamentally different way of doing 3D modeling with its own hardware and graphical horsepower requirements.
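To make the volume-versus-material-properties point concrete, here's a minimal Python sketch (my own toy example, not from the video): a closed, consistently outward-wound triangle mesh lets you recover an enclosed volume via signed tetrahedra, but mass only exists once you assign a density yourself, whereas a solid modeler carries those material properties as part of the model.

```python
# Toy example: a polygon mesh is a thin shell. If (and only if) it is closed
# and consistently wound, you can recover an enclosed volume from it; mass
# appears only once a density is assigned by hand.

def mesh_volume(vertices, triangles):
    """Signed enclosed volume of a closed triangle mesh (divergence theorem)."""
    total = 0.0
    for i, j, k in triangles:
        (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = vertices[i], vertices[j], vertices[k]
        # one signed tetrahedron per face: v0 . (v1 x v2) / 6
        total += (x0 * (y1 * z2 - z1 * y2)
                  - y0 * (x1 * z2 - z1 * x2)
                  + z0 * (x1 * y2 - y1 * x2)) / 6.0
    return total

# Unit cube: 8 vertices, 12 outward-wound triangles
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
         (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
tris = [(0, 2, 1), (0, 3, 2), (4, 5, 6), (4, 6, 7),
        (0, 1, 5), (0, 5, 4), (2, 3, 7), (2, 7, 6),
        (0, 4, 7), (0, 7, 3), (1, 2, 6), (1, 6, 5)]

volume = mesh_volume(verts, tris)   # -> 1.0 (cubic meters, say)
mass = 2700 * volume                # only now: assume aluminium at 2700 kg/m^3
print(volume, mass)
```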

 

Rendering for (engineering) CAD is also not usually offloaded to a compute cluster or farm in my experience, mostly because true rendering isn't needed until the make-things-pretty-for-marketing stage. Until then you're working with garishly colored parts to distinguish them from one another. Solidworks' rendering tool is called PhotoWorks, and it's very much an add-on used once the modeling and design phase is done.

 

You can run Solidworks on an integrated GPU and mobile CPU. I have done it, and it is very unpleasant once you get above 5-10 parts in an assembly. Because you aren't offloading to a cluster, your PC needs to have a fairly powerful (and supported) GPU and a CPU with a mix of good single-core and multi-core performance. Using the iGPU is absolutely an impediment to productivity when you're doing anything remotely complex. Doable, yes, but efficiently billable? Nope.


So the one issue that I noticed with this video is that around the 6:30 mark you discuss not using an i9 due to lack of ECC memory support; however, you then recommend an i7-8700K, which from what I can tell also does not support ECC memory. I get that the IPC on the Intel parts is a bit better than AMD's, but it seemed like a funny conclusion to go with an i7 over an i9, or to not even look for an 8700K equivalent in the Xeon line-up. I guess if anything, I wonder why ECC memory support was even brought up, except as a reason to rule out the i9. Other than that, it looked fairly good and shows that workstation cards do exist for a reason.


8 minutes ago, WMGroomAK said:

So the one issue that I noticed with this video is that around the 6:30 mark you discuss not using an i9 due to lack of ECC memory support; however, you then recommend an i7-8700K, which from what I can tell also does not support ECC memory. […]

I think the point is that if you need or want ECC support, then the price jump from an i9 to a Xeon isn't very high. But if you don't need ECC support, dropping down to an 8700K saves a lot more money than dropping down to an i9.



23 minutes ago, WMGroomAK said:

So the one issue that I noticed with this video is that around the 6:30 mark you discuss not using an i9 due to lack of ECC memory support; however, you then recommend an i7-8700K, which from what I can tell also does not support ECC memory. […]

The need for ECC was for critical multithreaded work such as FEA simulations, where memory errors aren't tolerable. Since a lot of companies use cluster computing for FEA (Abaqus, Ansys), an i9 or Xeon W would be of no benefit. Very few people use Solidworks' FEA solver.

 

The 8700 and 8700K have better single-core performance than any of the i9s, which is why Solidworks would run better on those chips than on an i9. Solidworks is horribly optimized for multithreaded work in day-to-day tasks, so why buy a fancy multithreaded chip that won't be used much at all?

 

That's why the 8700 over i9 / Xeon W recommendation was made. If you're doing FEA locally rather than on a node, then the Xeon W would be the better choice for an all-in-one workstation. If memory errors aren't dealbreakers for the workload, the i9 would be sufficient.

 

And realistically, the delta between the $2500 18-core Xeon and the $2000 18-core i9 is ~2-3 hours of billable engineering time ($500 at a typical $170-250/hr rate). That is easily amortized over the 3-5 year lifespan of the computer if ECC memory is necessary to have.

 

Since no one in industry is overclocking these chips, I'd think an 8700 non-K is probably the sweet spot if you don't need crazy multithreaded performance or ECC; if you do need those, buy a Xeon.


5 minutes ago, bimmerman said:

The i9 + ECC combination was for multithreaded work such as FEA simulations, where memory errors aren't tolerable. Since a lot of companies use cluster computing for FEA (Abaqus, Ansys), an i9 or Xeon W would be of no benefit.

Except that the only Intel processors that support ECC are Xeons, not i9s, and that appeared to me to be the throwaway remark made as to why you should not go with an i9 for this build... The remark made in the video is that a "Core i9 is out of the question because Intels HEDT lineup lacks support for ECC memory..." I'm not arguing the single-threaded performance; I just found ruling out an i9 due to ECC memory support and then recommending a processor without ECC memory support to be a little weird.


2 minutes ago, WMGroomAK said:

Except that the only Intel processors that support ECC are Xeons, not i9s, and that appeared to me to be the throwaway remark made as to why you should not go with an i9 for this build... […]

Sorry-- I edited my post. What I meant is that the comment in the video about ECC says it is only necessary for critical simulation work (think medical devices, drug therapy, complex FEA, etc.). If you are doing that work, then you need ECC and cannot use an i9 or an i7-- you must use a Xeon (or Threadripper). If you are not doing that work, you don't need ECC, so you should get the processor that best suits your workflow, which in Solidworks' specific case is an i7, not an i9, due to single-core performance.

 

That leaves the i9 series in a weird place, suited only for multithreaded work that doesn't need ECC. Regular CAD design work is not that use case.


38 minutes ago, bimmerman said:

Sorry-- I edited my post. What I meant is that the comment in the video about ECC says it is only necessary for critical simulation work (think medical devices, drug therapy, complex FEA, etc.). […]

My thinking is that, instead of ruling out the various processors one by one, it would have been best in this instance to just go forward with the fact that a good processor for a CAD workstation should balance high single-threaded performance with multi-threaded capability, and then say that their testing finds the 8700 or 8700K strikes that balance. Bringing up the lack of ECC support in the Intel HEDT product stack really seemed to serve little purpose in the context of the whole video.


Solidworks' full feature set can work with GTX cards. I use GTX cards on large assemblies with the fix below, and it works great!

Here is how:

 

https://grabcad.com/questions/how-i-can-activate-real-view-graphics-in-my-solidworks-2015-x-64-bit-sp2-my-operating-system-is-windows-7-x-64-bit-sp1-and-my-graphics-card-is-amd-radeon-hd-7500m-7600m-series-i-have-already-tried-realhack-3-9-1-but-it-is-not-working

 

Windows 7 64-bit with a GeForce GT 525M; I activated RealView in SW2014 following the steps Leucetius wrote, with the notes Plecostomus Prime added.

1.) Open the registry editor (Start Button -> Run... -> "regedit").
2.) Navigate to "HKEY_CURRENT_USER\Software\SolidWorks\SOLIDWORKS 2015\Performance\Graphics\Hardware\Current".
3.) On the right side, double-click "Renderer".
4.) Copy (Ctrl+C) the value.
5.) Navigate to "HKEY_CURRENT_USER\Software\SolidWorks\SOLIDWORKS 2015\Performance\Graphics\Hardware\Gl2Shaders\NV40".
6.) On the left side (the tree view), right-click "NV40" and choose "New / Key".
7.) Rename the new key with the copied value (Ctrl+V).
8.) Click another key and then back to your newly created one to make sure you are editing the right values. The right side should be empty except for an entry (Default) whose data is "value not set".
9.) Right-click in the right side and create "New / DWORD (32-bit) Value".
10.) Rename the DWORD to "Workarounds" and, after that, double-click it.
11.) Leave the "Base" as "Hexadecimal" and enter 30008 into the "Value data" field.

I opened SW and did nothing; RealView was on by itself.
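For anyone who wants to script those steps instead of clicking through regedit, here's a minimal Python sketch using the standard winreg module (my own translation of the steps above, assuming the SOLIDWORKS 2015 paths as written; back up the key first and adjust the version in the path for your install):

```python
# Sketch of the manual registry steps above (Windows only, run as the SW user).
# Paths assume SOLIDWORKS 2015, as in the post; back up the registry first.
import winreg

BASE = r"Software\SolidWorks\SOLIDWORKS 2015\Performance\Graphics\Hardware"

# Steps 1-4: read the current "Renderer" string
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, BASE + r"\Current") as key:
    renderer, _ = winreg.QueryValueEx(key, "Renderer")

# Steps 5-7: create a key named after that string under Gl2Shaders\NV40
target = BASE + r"\Gl2Shaders\NV40" + "\\" + renderer
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, target) as key:
    # Steps 9-11: add the "Workarounds" DWORD with hexadecimal value 30008
    winreg.SetValueEx(key, "Workarounds", 0, winreg.REG_DWORD, 0x30008)

print("Added Workarounds=0x30008 for renderer", repr(renderer))
```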


2 minutes ago, bassam64i said:

Solidworks' full feature set can work with GTX cards. I use GTX cards on large assemblies with the fix below, and it works great! […]

As far as I know this doesn't work on 10-series cards, and it can also cause stability issues.


3 minutes ago, AlexTheGreatish said:

As far as I know this doesn't work on 10-series cards, and it can also cause stability issues.

I tested this procedure on my 1070; it works and is stable, no issues whatsoever.


I'll also add that the hack above also enables the built-in anti-aliasing function in Solidworks, so the edges look a lot smoother. Again, it works butter smooth with large assemblies.


I am curious, as I enjoy CADing on my multimedia laptop (an Asus N550JK with a GTX 850M and an i7-4700HQ). It does well in Siemens NX, which is the MAE CAD software, but after three years the laptop is about to disintegrate and I kinda want to upgrade to a small desktop. I am curious about the in-game performance of the CAD PC that was shown, as the Quadros might be easier to get hold of than the more popular gaming cards right now.


2 minutes ago, Spirit of 76 said:

Can someone tell me the difference between the GTX and P-series cards?

GTX are the gaming series. Example: GTX 1070.

 

The P series are a sub-series of the Quadro workstation cards. Example: Quadro P1000 - in this case, the P means it's Pascal-based architecture. K1000 would be Kepler, M1000 would be Maxwell, etc.

As for the actual difference? It varies, but usually different drivers that are optimized differently. Also, Quadros often (but not always) include more RAM, ECC RAM, 10-bit color support, etc.



Just now, Spirit of 76 said:

OK, but is the optimisation hardware-based or software-based?

Both.

 

Quadro cards tend to have much higher Floating Point computational power, but there are also software optimizations.



On 2/14/2018 at 2:34 PM, bimmerman said:

I disagree-- neither 3d visual arts nor engineering owns those terms. […]

True! I've done 3D modeling of one character so far, and lots of gadgets as game assets, and I've done some 3D CAD work in Blender (though it's not optimal, it's doable) where I was 3D modeling a custom ATX full tower case.

 

That still needs some refinement though.



6 hours ago, dalekphalm said:

Both.

 

Quadro cards tend to have much higher Floating Point computational power, but there are also software optimizations.

I was pretty sure Quadros matched the GTX cards in raw floating-point (single-precision) capability. In tasks that leverage raw power (e.g., Blender's Cycles renderer), the 1080 Ti outpaces everything but the highest-end Quadros.

 

Also, from what I recall, Pascal isn't particularly good at double precision, Quadros included.



25 minutes ago, Zodiark1593 said:

I was pretty sure Quadros matched the GTX cards in raw floating-point (single-precision) capability. […]

So, I just went hunting through specs and reviews to confirm numbers - and first, FP16 specs are not posted in very many places. But I was able to confirm:

 

GP102 seems to be consistent across GTX vs Quadro for FP32, FP64 (1/32 of the FP32 rate), and FP16 (1/64 of the FP32 rate).

 

It's worth noting that GP100 has insane FP16 performance, at 2x the FP32 rate, though GP100 never made it into a consumer card.

 

So, I'll concede that. In times past, Quadros often had much better FP64 performance than their GTX counterparts, but that trend seems to have mostly died.

 

However, increased RAM is still a huge benefit. The P6000 has 24 GB, compared to the various smaller amounts for the rest of GP102.
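If anyone wants to sanity-check those ratios, here's a quick back-of-envelope in Python (core count and boost clock for the GTX 1080 Ti / GP102 taken from public specs; treat it as a sketch):

```python
# Theoretical throughput for a GP102 card (GTX 1080 Ti numbers from public
# specs: 3584 CUDA cores, ~1582 MHz boost). FP64/FP16 use the ratios above.
cores = 3584
boost_hz = 1.582e9

fp32 = 2 * cores * boost_hz   # 2 FLOPs per core per clock (fused multiply-add)
fp64 = fp32 / 32              # GP102 runs FP64 at 1/32 the FP32 rate
fp16 = fp32 / 64              # ...and FP16 at 1/64 the FP32 rate

for name, flops in (("FP32", fp32), ("FP64", fp64), ("FP16", fp16)):
    print(f"{name}: {flops / 1e12:.2f} TFLOPS")
# FP32: 11.34 TFLOPS / FP64: 0.35 TFLOPS / FP16: 0.18 TFLOPS
```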


