
Vega FE Hybrid Mod - Passing the 1080 but 400W+ Power Draw

Hunter259
5 hours ago, goodtofufriday said:

Not having enough cash flow is the explicit reason they fell behind to begin with, starting with Intel's sabotage.

Intel's sabotage had little to do with it. I keep seeing people hanging on that very point of something that happened 10 years ago. AMD's problem is AMD. Stupid decision after stupid decision did them in: from acquiring ATi, to Piledriver/Bulldozer, to appointing Hector Ruiz as CEO, who believed that APUs would be the future, which led AMD to vastly undervalue the GPU market and put them where they are today. Not to mention stupid decisions like requiring the use of HBM, which provides no benefit over G5X, and Vega being delayed due to HBM shortages. If they were smart, they'd already have a product out using G5X instead. Intel has very little to do with it.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


400W power usage?

How does a single 120mm rad take care of that lol

Intel i7 12700K | Gigabyte Z690 Gaming X DDR4 | Pure Loop 240mm | G.Skill 3200MHz 32GB CL14 | CM V850 G2 | RTX 3070 Phoenix | Lian Li O11 Air mini

Samsung EVO 960 M.2 250GB | Samsung EVO 860 PRO 512GB | 4x Be Quiet! Silent Wings 140mm fans

WD My Cloud 4TB


11 minutes ago, Simon771 said:

400W power usage?

How does a single 120mm rad take care of that lol

It's only cooling the GPU die itself, not the VRM (the FETs) or the VRAM.
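For a rough sense of why that can still just about work, here is a back-of-envelope sketch; the 70% core share, the ~0.1 degC/W radiator figure and the 25 degC ambient are illustrative assumptions, not measurements:

```python
# Back-of-envelope estimate (assumed numbers, not measurements):
# a hybrid kit's 120mm radiator only sees the GPU die's share of board power.
board_power_w = 400        # whole-card draw discussed in this thread
core_share = 0.70          # assumed fraction dissipated by the die itself
rad_load_w = board_power_w * core_share

# Crude radiator model: watts shed ~= (coolant temp - ambient) / thermal resistance.
# ~0.1 degC/W is an assumed figure for a 120mm rad with a fast fan.
thermal_resistance_c_per_w = 0.10
ambient_c = 25
coolant_c = ambient_c + rad_load_w * thermal_resistance_c_per_w

print(f"Radiator load: ~{rad_load_w:.0f} W, coolant around {coolant_c:.0f} degC")
# -> roughly 280 W and ~53 degC coolant: hot but survivable for the die,
#    while the VRM and memory still depend on the baseplate and case airflow.
```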


1 hour ago, VagabondWraith said:

Intel's sabotage had little to do with it. I keep seeing people hanging on that very point of something that happened 10 years ago. AMD's problem is AMD. Stupid decision after stupid decision did them in: from acquiring ATi, to Piledriver/Bulldozer, to appointing Hector Ruiz as CEO, who believed that APUs would be the future, which led AMD to vastly undervalue the GPU market and put them where they are today. Not to mention stupid decisions like requiring the use of HBM, which provides no benefit over G5X, and Vega being delayed due to HBM shortages. If they were smart, they'd already have a product out using G5X instead. Intel has very little to do with it.

 

I wouldn't say little, but it wasn't the be-all and end-all of AMD's woes. It happened just long enough before the shit started to hit the fan to be at the start of AMD's troubles. Adding to that, there were two separate issues (one being the anti-trust case over OEM deals and the other being the compiler debacle), which meant that financially it took AMD a while to recover from the lost sales and reputation damage.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


AMD has never been the most power-efficient option (to my knowledge). There was no reason to expect that to change with Vega. Intel does CPUs, NVIDIA does GPUs, and AMD tries to juggle both. They really surprised us with Ryzen, and expecting two miracles in the same year is silly IMO.

7 hours ago, Clanscorpia said:

And you know, their stuff sucked. 

AMD used to be great, and they're beginning to return to their former glory with Ryzen.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures"2K" does not mean 2560×1440 


4 minutes ago, JoostinOnline said:

We wouldn't even have 64-bit without them.

That's false. AMD didn't introduce 64bit to x86. They just made a good version of it. Prior to that, Intel was working on IA64 to extend the existing IA32, but it sucked and AMD pushed a desktop processor with 64bit capability to the common man before IA64 trickled down from the enterprise.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


1 minute ago, Drak3 said:

That's false. AMD didn't introduce 64bit to x86. They just made a good version of it. Prior to that, Intel was working on IA64 to extend the existing IA32, but it sucked and AMD pushed a desktop processor with 64bit capability to the common man before IA64 trickled down from the enterprise.

Interesting information.  Regardless, my point remains the same.  Bulldozer isn't a reflection of their entire history.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures"2K" does not mean 2560×1440 


Just now, JoostinOnline said:

Interesting information.  Regardless, my point remains the same.  Bulldozer isn't a reflection of their entire history.

It partially is. Bulldozer was a huge gamble, as were dual core and AMD64, and it could have been an amazing option had IPC not been sacrificed (and it was heavily marketed for the use cases Bulldozer was relatively good at). It deviates from the pattern dual core and AMD64 set only in the fact that it failed.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


8 hours ago, Clanscorpia said:

No, I just find it extremely stupid that you buy an inferior product just to support a company. There's one word for that: fanboyism. If these companies didn't hold their patents so stringently and made it so other companies could actually compete, we wouldn't have issues like this. And you know, AMD could make a good higher-end card for once too.

Not really fanboyism... Custom PCs are still a small percentage compared to prebuilts, so if my mom went out and bought a laptop or desktop with an inferior product in it, that doesn't make her a fan of anything. Likewise, in the past when Nvidia had the inferior product to AMD, they still outsold AMD, and that's not fanboyism either; people just bought Nvidia because that's the brand they were familiar with, and they will continue to do so because they can't be bothered to check reviews and such. They just know Nvidia is good and they'll keep buying it. In some countries you can only find Nvidia cards; in my country, for example, only one store out of all the PC shops carries any AMD cards from any generation.

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)


5 hours ago, cj09beira said:

When I say Fiji drivers, I mean the drivers aren't using some huge Vega features like the new rasterizer and the extra geometry output. According to AMD, Vega has 2.75 times the geometry output of Fiji (it might be in relation to something else), and that has not been seen in any test.

I only expect lower power for the same perf (no clue how much) and better perf on geometry-heavy benchmarks.

I don't know how or why it's not working; I just know it's not.

I think in part it might be new instructions, because if it weren't, it would be on as soon as the card was working, unless it's disabled in silicon, but I find that hard to believe.

Then just say the drivers are still immature, and if you think that's the cause, then chances are they will not suddenly mature before the gaming version is out.

Saying that the current Vega card runs on a Fiji driver makes absolutely no sense.

 

 

4 hours ago, leadeater said:

Um what? The software is the same in the test all the way through, Solidworks 2016 SP 0.1. As I said, you are almost asking for the impossible, short of me personally doing it. Puget Systems are also experts in building professional workstations, and you wanted evidence that GTX vs Quadro drivers have a real impact on professional application performance; well, there it is: the software won't even let you run certain features on GTX drivers without hacking around it.

 

If you want to utterly ignore it and write it off for arbitrary and stupid reasons to fit your point go ahead, it shows exactly what you asked for.

I am not writing it off for arbitrary reasons. I am writing it off because:

1) They are not using cards with the same GPU cores. So the hardware (as in, the actual GPU) and the software (as in, the drivers) changed between tests. It is fairly hard to conclude that it is a driver thing when the hardware changed too.

2) They used a workaround to get Solidworks working on GeForce, but they did not do it properly, so the profile was not loaded.

3) A Quadro card performing better in Solidworks does not mean a GeForce card will perform better in games. The consumer cards are usually workstation cards with things disabled, so it's not really a surprise that there are scenarios where workstation cards perform better. The reverse is a lot harder to find examples of (if any even exist).

 

 

4 hours ago, leadeater said:

For cards running the same base architecture and for applications that are specifically using CUDA acceleration yes you can, this isn't gaming. Those figures come from the number of CUDA cores, clock rate, memory bandwidth etc. There is no other semi impartial way of comparing GPUs to each other other than GFLOPs, the proper way is to test the product which funnily enough has been my point.....

A lot of the time, performance does not scale linearly like that. Even the Solidworks benchmarks you posted prove it. The difference in CUDA cores is not the same as the difference in performance between Quadro and Quadro, nor between GeForce and GeForce cards.

So even the benchmarks you yourself picked disagree that FLOPS is a meaningful measurement.
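To put rough numbers on the CUDA-cores-times-clock figure being discussed, here is a quick sketch of theoretical FP32 throughput for two of the cards mentioned in this thread; the core counts and boost clocks are approximate published values, so treat the output as ballpark only:

```python
# Theoretical FP32 throughput is simply 2 FLOPs (one FMA) x shader count x clock.
# Core counts and clocks below are approximate published figures, not measurements.
cards = {
    "Titan X (GM200)":      {"cuda_cores": 3072, "boost_ghz": 1.075},
    "Quadro K2200 (GM107)": {"cuda_cores": 640,  "boost_ghz": 1.124},
}

for name, c in cards.items():
    gflops = 2 * c["cuda_cores"] * c["boost_ghz"]  # GFLOPS, since the clock is in GHz
    print(f"{name}: ~{gflops:,.0f} GFLOPS FP32")

# Roughly 6,600 vs 1,440 GFLOPS, a ~4.6x gap on paper, yet the Solidworks viewport
# results cited above come nowhere near that ratio, which is the point being made:
# theoretical FLOPS alone does not predict application performance.
```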

 

5 hours ago, leadeater said:

The Titan X with the GM200 die should stomp all over the K2200 with the GM107 die and it doesn't, it's not even close.

Because Puget did not inject the Solidworks profile for the GeForce tests.

 

5 hours ago, leadeater said:

Here's another review http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-9.html. Here it shows both that there can be little difference between GTX and Quadro, and it also shows the difference can be huge.

Again, they are comparing different GPUs. I really should not have to explain to you why comparing a Pascal GeForce card to a Maxwell Quadro card cannot be used to conclude that there is some driver magic going on.

Also, like I have said several times now, Quadro cards have things enabled while GeForce cards have things disabled. It is very hard to find any examples of the reverse being true (which would be a GeForce card with the exact same GPU core performing better than the Quadro card on the same driver version).

 

21 hours ago, leadeater said:

There can be as much as 80% difference in performance between the Quadro drivers and GeForce drivers and some things outright will not work and cause errors.

5 hours ago, leadeater said:

So as I said and have shown this is a real thing, denying it is denying the truth.

OK, I will give you that there can be an 80% difference in performance between a Quadro and a GeForce card, but that's only if you change variables such as the GPU core used, or if you don't load the same profiles into the cards. If you do, then things will be far more similar than if you don't.

By the way, if you look at the other tests done by the same person you will see that he wrote this:

Quote

Since you asked nicely, here is an article with results for GeForce cards ranging from a GTX 950 all the way up to a Titan X: https://www.pugetsystems.co...

 

The results were a huge surprise to me - usually there is not much difference between GeForce and Quadro cards. Turns out that GeForce cards have absolutely awful performance when using "Shaded w/ Edges" view mode. Not sure why, but I've found some reports online that "soft modding" a GeForce card to look like a Quadro card (basically lets you install and use Quadro drivers) doesn't result in any better performance so it must be a firmware difference rather than simply driver optimizations.

So even the test you yourself posted has concluded that no, it is not a simple driver optimization which results in that massive difference between the GeForce cards and the Quadro cards. The cards are different in other ways, which contribute to the difference in performance. It's not the driver.

Please note the "usually there is not much difference between GeForce and Quadro cards". So even if that Solidworks benchmark is accurate, it's an exception rather than the norm. So once again the author of the article you linked agrees with me. Just because a statement is false in 1 out of, let's say, 10 programs doesn't mean it is incorrect in general, nor would it be correct to say the opposite is true in general just because of that 1 program.

 

5 hours ago, leadeater said:

There is a lot more to it than application profiles. Drivers have to work in tandem with the hardware to get the best possible performance. What you are in essence saying, that drivers do not affect performance, is counter to reality and something that AMD demonstrates every time, in part due to never having very good drivers at the release of a product.

I did not even imply that drivers won't make a difference to performance. How you could extrapolate anything of that nature from my post is mind-blowing to me.

I am fairly sure you can go through my posts in this thread and see me hint at, or flat-out say, the exact opposite several times.

 

What I have said is that:

1) There is no such thing as a Fiji driver that will work with Vega. If the driver works with a Vega card, then it's a Vega driver. Period. Saying that the current Vega driver is based on the Fiji driver is also irrelevant, because the current Fiji driver is based on the Tahiti one, and the Tahiti driver was based on the Cayman driver, and so on. They all share a lot of code.

2) Don't expect the drivers at launch to be that much different from the current Vega drivers. It's only been a month or two. Chances are they have been working on the drivers for close to a year already. One or two more months won't turn the card from a mediocre one to a great one.

3) There is no such thing as a "gaming driver" vs a "workstation driver" when it comes to performance. There is no such thing as trading gaming performance for better workstation performance, or vice versa. This is something a lot of people seem to think, hence the whole "it's a workstation card therefore it won't perform well in games" arguments flying around.

 

5 hours ago, leadeater said:

Radeon application profiles don't do a hell of a lot at all; the only time they really do is when a game is known to have issues with CrossFire and the profile disables it, making the game playable.

[Citation Needed]

For GeForce/Quadro drivers, they make a hell of a lot of difference.

 

5 hours ago, leadeater said:

Again, those are profiles which get applied, and they are usually fairly generic in terms of architecture. If it is true that Vega is just another GCN card, then the profiles should work just fine for it. There is no need to rewrite a bunch of them to create a "gaming driver".

 

5 hours ago, leadeater said:

So don't test it or talk about it because it's marketing? Faith has nothing to do with it.

Well it has been tested, and it indicated that there is next to no change in IPC. Since people did not like that conclusion they are now resorting to putting faith in future updates to live up to their expectations.

 

 

 

1 hour ago, JoostinOnline said:

AMD has never been the most power-efficient option (to my knowledge). There was no reason to expect that to change with Vega. Intel does CPUs, NVIDIA does GPUs, and AMD tries to juggle both. They really surprised us with Ryzen, and expecting two miracles in the same year is silly IMO.

GeForce 400 vs Radeon 5000.

That was not really AMD being efficient as much as it was Nvidia fucking up, but the end result was that AMD cards used a lot less power at the high end. The GTX 460 was good, though. The GTX 480... not so much.

 

 

 

9 hours ago, goodtofufriday said:

If people bought 0% of a company's product, the company goes under, and then there's no competition, in this case leaving only Nvidia. I guess that's what you want then?

Gonna have to agree with everyone else here. Buying AMD even when they have an inferior product, just because you feel obligated to give them money, is fanboy behavior.

AMD is not a charity, so stop treating it like one.


5 hours ago, Hunter259 said:

"Fury X will be an overclockers dream" 

 

We all know how that turned out...

That quote was taken out of context from an interview. The guy was saying the Fury X's cooler was an overclocker's dream, not the card itself. 

CPU: Amd 7800X3D | GPU: AMD 7900XTX


20 minutes ago, LAwLz said:

1) They are not using cards with the same GPU cores. So the hardware (as in, the actual GPU) and the software (as in, the drivers) changed between tests. It is fairly hard to conclude that it is a driver thing when the hardware changed too.

You can't use GTX drivers on Quadro cards...... not without going unsupported.

 

20 minutes ago, LAwLz said:

3) A Quadro card performing better in Solidworks does not mean a GeForce card will perform better in games. The consumer cards are usually workstation cards with things disabled, so it's not really a surprise that there are scenarios where workstation cards perform better. The reverse is a lot harder to find examples of (if any even exist).

I was never talking about gaming; go back to my original post and read it again, along with the commentary I was making. See, this is my point: you are ignoring the narrative of the post and just arguing for the sake of arguing.

 

It exists, and I've given you two different sources of benchmarks showing it, on more than one application. Then in the case of Tom's Hardware there is a 980 and an M5000, which is the exact same card; both are GM204 2048:128:64. I know there are firmware differences between Quadro and GTX, but that is still exactly in line with my original point, see below.

 

On 7/15/2017 at 0:04 PM, leadeater said:

At the release of the GeForce 1080 how about you get that card and put it through all the tests of a professional workstation card then compare it to an AMD certified workstation card. Then as a result of that testing declare that the Pascal architecture is terrible for workstation/professional usage, which we know due to proper analytical testing on actual hardware is not the case.

 

I can say with pretty good confidence that a few things would have happened: many would say that it's not a Quadro and doesn't have optimized drivers ;), and that many features applicable to those use cases are not available on that card.

Holy crap predicting the future or what....

 


58 minutes ago, leadeater said:

You can't use GTX drivers on Quadro cards...... not without going unsupported.

 

I was never talking about gaming; go back to my original post and read it again, along with the commentary I was making. See, this is my point: you are ignoring the narrative of the post and just arguing for the sake of arguing.

 

It exists, and I've given you two different sources of benchmarks showing it, on more than one application. Then in the case of Tom's Hardware there is a 980 and an M5000, which is the exact same card; both are GM204 2048:128:64. I know there are firmware differences between Quadro and GTX, but that is still exactly in line with my original point, see below.

 

Holy crap predicting the future or what....

 

People tend to forget we have software-assisted hardware, and any software optimization improves the performance of said hardware, be it minuscule or significant.

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)


On 7/14/2017 at 0:46 PM, Hunter259 said:

And Quadros aren't, yet they are within 10% of GeForce cards. Come on now.

19 hours ago, bomerr said:

Go game on a Quadro and it'll provide roughly the same performance as a GeForce of similar specs (cores, hz, etc). 

People keep saying this, yet they compare a $5k pro card against a $1k "gaming" card (yes, I know the Titans aren't technically classified as gaming cards, but that's the market they're aimed at regardless).  For that price, I would hope it would perform well enough.  I guarantee that if you picked a $1k Quadro, it would suck at gaming.

 

4 hours ago, Drak3 said:

That's false. AMD didn't introduce 64bit to x86. They just made a good version of it. Prior to that, Intel was working on IA64 to extend the existing IA32, but it sucked and AMD pushed a desktop processor with 64bit capability to the common man before IA64 trickled down from the enterprise.

That's actually incorrect.  IA64 lost out because it had absolutely no native x86 compatibility.  There was emulation for x86, but it lagged horribly.  Intel was trying to abandon x86 completely.  In part, I imagine it was to distance themselves from the cross-licensing deals with AMD (that's my personal opinion, not a known fact).  If they had succeeded, AMD would have been left behind permanently.  Unfortunately for them, AMD released x86-64 extensions, which worked well and were popular.  The Itanic, on the other hand, just sank.


3 minutes ago, Jito463 said:

The Itanic, on the other hand, just sank.

I approve of your pun, excellent execution xD

 

I do wonder what would have happened if IA64 really had caught on, how Microsoft would have dealt with it, any legal mess as well.


38 minutes ago, Jito463 said:

People keep saying this, yet they compare a $5k pro card against a $1k "gaming" card (yes, I know the Titans aren't technically classified as gaming cards, but that's the market they're aimed at regardless).  For that price, I would hope it would perform well enough.  I guarantee that if you picked a $1k Quadro, it would suck at gaming.

 

That's actually incorrect.  IA64 lost out because it had absolutely no native x86 compatibility.  There was emulation for x86, but it lagged horribly.  Intel was trying to abandon x86 completely.  In part, I imagine it was to distance themselves from the cross-licensing deals with AMD (that's my personal opinion, not a known fact).  If they had succeeded, AMD would have been left behind permanently.  Unfortunately for them, AMD released x86-64 extensions, which worked well and were popular.  The Itanic, on the other hand, just sank.

They were compared due to the fact that the chips are identical besides the extra FP compute being unlocked, the pro drivers, and the extra VRAM. Besides that, they were the same card, which made it very easy. They could have done it with something else, but hell, if I get offered a $5k card to do a review on, fuck yeah I'll take it over a cheaper one. Plus people LOVE seeing stuff they would never buy.

Main Gaming PC - i9 10850k @ 5GHz - EVGA XC Ultra 2080ti with Heatkiller 4 - Asrock Z490 Taichi - Corsair H115i - 32GB GSkill Ripjaws V 3600 CL16 OC'd to 3733 - HX850i - Samsung NVME 256GB SSD - Samsung 3.2TB PCIe 8x Enterprise NVMe - Toshiba 3TB 7200RPM HD - Lian Li Air

 

Proxmox Server - i7 8700k @ 4.5Ghz - 32GB EVGA 3000 CL15 OC'd to 3200 - Asus Strix Z370-E Gaming - Oracle F80 800GB Enterprise SSD, LSI SAS running 3 4TB and 2 6TB (Both Raid Z0), Samsung 840Pro 120GB - Phanteks Enthoo Pro

 

Super Server - i9 7980Xe @ 4.5GHz - 64GB 3200MHz Cl16 - Asrock X299 Professional - Nvidia Telsa K20 -Sandisk 512GB Enterprise SATA SSD, 128GB Seagate SATA SSD, 1.5TB WD Green (Over 9 years of power on time) - Phanteks Enthoo Pro 2

 

Laptop - 2019 Macbook Pro 16" - i7 - 16GB - 512GB - 5500M 8GB - Thermal Pads and Graphite Tape modded

 

Smart Phones - iPhone X - 64GB, AT&T, iOS 13.3 iPhone 6 : 16gb, AT&T, iOS 12 iPhone 4 : 16gb, AT&T Go Phone, iOS 7.1.1 Jailbroken. iPhone 3G : 8gb, AT&T Go Phone, iOS 4.2.1 Jailbroken.

 


58 minutes ago, Jito463 said:

People keep saying this, yet they compare a $5k pro card against a $1k "gaming" card (yes, I know the Titans aren't technically classified as gaming cards, but that's the market they're aimed at regardless).  For that price, I would hope it would perform well enough.  I guarantee that if you picked a $1k Quadro, it would suck at gaming.

I think this is a misunderstanding of the market. Quadro/FirePro cards aren't expensive because they provide more performance than GeForce/Radeon cards, but because Nvidia/AMD are able to segment the market and charge professional users more money for nearly the same performance as consumers get, by withholding a few extra features that businesses really need (e.g. validated drivers). 


2 minutes ago, bomerr said:

I think this is a misunderstanding of the market. Quadros aren't expensive because they provide more performance than GeForce cards, but because Nvidia is able to segment the market and charge professional users more money for nearly the same performance by withholding a few extra features that businesses really need. 

And they come with a guarantee that the card and its drivers adhere more strictly to the set standards and have gone through the required testing to make sure of it. You're not paying for extra performance; you are paying for people's time to do that certification process and to maintain another driver stack. That is also why professional software developers lock features out of GeForce, which can affect performance: they are reducing the certification requirements of the hardware and are choosing to use a more stable driver stack.

 

There are actual features that GeForce cards can lack, like reduced DP or HP throughput, or no ECC, etc., but cost-wise those are cheaper for a GPU maker than the other factors, which require employees and their time. Parts are cheap; people are not.


9 minutes ago, leadeater said:

And they come with a guarantee that the card and its drivers adhere more strictly to the set standards and have gone through the required testing to make sure of it. You're not paying for extra performance; you are paying for people's time to do that certification process and to maintain another driver stack. That is also why professional software developers lock features out of GeForce, which can affect performance: they are reducing the certification requirements of the hardware and are choosing to use a more stable driver stack.

 

There are actual features that GeForce cards can lack, like reduced DP or HP throughput, or no ECC, etc., but cost-wise those are cheaper for a GPU maker than the other factors, which require employees and their time. Parts are cheap; people are not.

Yep. For a residential consumer, if a pixel is incorrectly rendered it might not matter. For a business customer, if a pixel isn't rendered correctly it might mean a bridge collapses, so business customers are willing to pay an arm and a leg to make sure that doesn't happen. @Jito463 So the cost is based more on what people are willing to pay than on raw performance figures. That's why I said that if you benched a GeForce and a Quadro of similar specifications, you would see similar gaming performance. 


2 hours ago, leadeater said:

And they come with a guarantee that the card and its drivers adhere more strictly to the set standards and have gone through the required testing to make sure of it. You're not paying for extra performance; you are paying for people's time to do that certification process and to maintain another driver stack. That is also why professional software developers lock features out of GeForce, which can affect performance: they are reducing the certification requirements of the hardware and are choosing to use a more stable driver stack.

 

There are actual features that GeForce cards can lack, like reduced DP or HP throughput, or no ECC, etc., but cost-wise those are cheaper for a GPU maker than the other factors, which require employees and their time. Parts are cheap; people are not.

 

Not to mention more RAM and higher quality control on/from board partners.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


30 minutes ago, mr moose said:

Not to mention more RAM and higher quality control on/from board partners.

The last point is a little funny, since we just got hit by a bad production run of Nvidia Tesla M10s in our VDI cluster. One of them burned out and almost set off our fire suppression system, and not long after that Nvidia issued a product recall for that production run of GPUs. We had 4 that ended up getting replaced.


Just now, leadeater said:

The last point is a little funny, since we just got hit by a bad production run of Nvidia Tesla M10s in our VDI cluster. One of them burned out and almost set off our fire suppression system, and not long after that Nvidia issued a product recall for that production run of GPUs. We had 4 that ended up getting replaced.

You're more likely to get a bad production run than a single bad card when quality control is more stringent. I know it sounds counterintuitive, but that's essentially the nature of manufacturing.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


9 hours ago, leadeater said:

You can't use GTX drivers on Quadro cards...... not without going unsupported.

Yes... and? You can't run Solidworks on GeForce drivers without doing workarounds either.

I really don't see your point here.

 

9 hours ago, leadeater said:

I was never talking about gaming; go back to my original post and read it again, along with the commentary I was making. See, this is my point: you are ignoring the narrative of the post and just arguing for the sake of arguing.

I am not. The problem is that, in your reply to me, you made a blanket statement that was not really meant for me. As I said, I was not sure whether it was aimed at me or not, but I decided to respond anyway, giving my thoughts on the things you brought up.

 

9 hours ago, leadeater said:

It exists, and I've given you two different sources of benchmarks showing it, on more than one application. Then in the case of Tom's Hardware there is a 980 and an M5000, which is the exact same card; both are GM204 2048:128:64. I know there are firmware differences between Quadro and GTX, but that is still exactly in line with my original point, see below.

Not sure what you want to prove with the Tom's Hardware link.

AutoCAD 3D - They perform about the same.

Maya 2013 - They perform about the same.

Showcase 2013 - They perform about the same when you factor in the higher clock speed and boost of the 980.

 

Creo 2, Catia V6 R2012 and SolidWorks 2013 can easily be explained by the missing profile, the extra memory on the Quadro being used, or this rumored firmware difference (aka artificially crippling the GeForce card in order to sell more Quadros).

Again, this does not prove anything about the difference between Vega FE and Vega RX, other than that Vega RX might perform worse than Vega FE in some programs, assuming that AMD decides to cripple it in firmware too. But as has already been shown, it is not because of drivers. In fact, the Quadro drivers are based on the GeForce ones; it's just that they get verified more intensely and then ship with different application profiles (which can be loaded into other cards).

 

 

There is a mismatch between our arguments.

What I am saying - Don't expect Vega RX to perform better in games than Vega FE, because Vega RX most likely will not have any special sauce that makes games better.

What you're saying - There is a difference between Quadro and GeForce cards, and that difference is that Quadro cards are better in some programs.

 

Both statements can be correct, and to some degree your statement actually fuels my argument, because the links you have posted have never shown a GeForce card performing better than a Quadro one, other than very small differences which can be explained by things like higher clocks.

 

 

5 hours ago, bomerr said:

I think this is a misunderstanding of the market. Quadro/FirePro cards aren't expensive because they provide more performance than GeForce/Radeon cards, but because Nvidia/AMD are able to segment the market and charge professional users more money for nearly the same performance as consumers get, by withholding a few extra features that businesses really need (e.g. validated drivers). 

This is exactly the point I have been arguing for the entire thread.

People think that there is such a thing as reducing workstation performance in order to boost gaming performance, or vice versa. That idea is wrong and gives people false expectations.


13 hours ago, LAwLz said:

Then just say the drivers are still immature, and if you think that's the cause, then chances are they will not suddenly mature before the gaming version is out.

Saying that the current Vega card runs on a Fiji driver makes absolutely no sense.

 

 

Well it has been tested, and it indicated that there is next to no change in IPC. Since people did not like that conclusion they are now resorting to putting faith in future updates to live up to their expectations.

According to GN's tests, there IS a clock-for-clock advantage for Vega compared to Fiji in professional workloads, between 20% and 100% (except a couple of tests which go as high as 7x, but that's probably due to the larger amount of VRAM).
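For what it's worth, a "clock for clock" comparison of this kind boils down to dividing each card's score by its sustained clock; here is a minimal sketch with placeholder numbers (not GN's actual results):

```python
# Normalise each card's benchmark score by its sustained clock, then compare.
# The scores and clocks below are placeholders, not GamersNexus results.
def per_clock(score, clock_mhz):
    return score / clock_mhz

fiji_score, fiji_clock = 100.0, 1050   # e.g. Fury X in some pro benchmark
vega_score, vega_clock = 165.0, 1440   # e.g. Vega FE in the same benchmark

gain = per_clock(vega_score, vega_clock) / per_clock(fiji_score, fiji_clock) - 1
print(f"Clock-for-clock gain: {gain:+.1%}")   # ~ +20% with these placeholder numbers

# A result near 0%, as in the gaming tests mentioned below, means the extra
# performance comes purely from clock speed, not from architectural IPC gains.
```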

However, there is no gain in gaming, which seems strange. The whole "Vega uses Fiji drivers" idea, derived from AMD's own statement at CES, is partly wrong, but it's strengthened by gaming tests, which show no IPC advantage and bad clock scaling. Moreover, features like tile-based rasterization seem to be off.

Since the FE drivers are a different branch, they're probably using a more stable Fiji-derived driver with optimizations for pro workloads but no "new gaming guidelines" for the new GPU, resulting in the same performance for the same number of cores.

Again, 2 weeks and we'll see.

On a mote of dust, suspended in a sunbeam


2 hours ago, LAwLz said:

-snip-

Yes but do you have any idea why I made that statement? Or what I was actually saying? Seems to me you don't.

 

Go back through our entire conversation but now replace everything about Nvidia with AMD. Now what was it you were complaining about?

 

If you are getting annoyed at fanboyism and trying to call it out, don't then be essentially guilty of the very thing you are complaining about in the very next posts you make and throughout an entire conversation chain. Go back and read your replies: the continued dismissal of evidence, the reasoning you are trying to give, all of it is the very thing you are complaining about, and also the very thing I said would happen in my post. How is it you are unable to see that what you are saying sounds exactly like an AMD fan, just about a different topic?

 

My post may not have been directly aimed at you, but you were quoted since I thought you should read it and understand what the point of it was. My post was a mirror: you looked into it but couldn't see the reflection. That is the best way I can think of to describe it.

 

You even then started to try to use what I was pointing out in regard to Quadro and GeForce. Your stance on how to compare products is flip-flopping all over the place.

 

The first part of my original post, and all others after it, had nothing to do with Vega FE at all. The whole point was that you cannot use one product line to draw any conclusive statements about a different line of products, i.e. GeForce and Quadro, a stance which you yourself started to take halfway through. You also cannot use the difference between GeForce and Quadro to mean the same difference will apply to AMD's product lines; that shouldn't be done.

 

So do you agree or not: can you, or can you not, draw conclusive evidence about a product by testing a different product line?

 

Also, it appears that you are bringing problems/issues from other sources on the internet, e.g. Reddit, over here and acting like they are an equivalent problem here. Are you also not aware that the comment you made is a common generator of what you are complaining about? Actions and words have consequences, ones you have to live with, so it is in your own interest not to make comments like that.

 

2 hours ago, LAwLz said:

Both statements can be correct, and to some degree your statement actually fuels on my argument because the links you have posted have never shown a GeForce card performing better than a Quadro one, other than very small differences which can be explained by things like higher clocks.

No, the links I gave show it, more than once. Loading an unsupported profile for Solidworks doesn't prove a hell of a lot; there is no guarantee that works for every card/model, and you may not be able to do that for every application.

 

You don't recommend that a professional studio buy GeForce cards because you might be able to hack around the limitations of that product line. If it doesn't work under stock settings, it doesn't work; anything after that comes under the disclaimer "your results may vary", something you don't have to deal with on the Quadro product line.
