Vice President and General Manager of AMD Radeon Gaming accuses NVIDIA GPP of monopolistic and anti-competitive practices

Master Disaster
11 minutes ago, Razor01 said:

Like? If you are talking about AMD's slides, sorry, those were loaded ;)

 

 

Significantly faster: Cinema 4D, Creo

Parity with P6000: 3DS Max, AutoCAD 2017

Slightly behind P6000: SolidWorks 2015

These are results from Tom's Hardware.

14 minutes ago, KarathKasun said:

Significantly faster: Cinema 4D, Creo

Parity with P6000: 3DS Max, AutoCAD 2017

Slightly behind P6000: SolidWorks 2015

These are results from Tom's Hardware.

LOL, what are we talking about, HPC work or rendering/graphics work?

32 minutes ago, KarathKasun said:

Titan XP2.

There's no such thing as the TitanXP2.

 

There's the Titan X Pascal and the Titan Xp. Two different GPUs. Don't be a noob.

17 minutes ago, DildorTheDecent said:

There's no such thing as the TitanXP2.

 

There's the Titan X Pascal and the Titan Xp. Two different GPUs. Don't be a noob.

Eh, whatever. The NV naming convention for the past 3 Titans is flat-out stupid.

1 hour ago, KarathKasun said:

Actually, during the time period you are talking about, no.

Even now, Vega has some datacenter and workstation workloads where it is ahead of the Titan XP2.

 

36 minutes ago, Razor01 said:

LOL what are we talking about HPC work or rendering/graphics work?

I have yet to find any HPC sources that can truly be considered non-partisan, though you can extrapolate some types of compute workloads from datasets like this one...

https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/17

 

Fluid simulation seems to prefer Vega and the same can be said about ray-tracing.

 

Not that it matters much with the Titan V out and about now.

1 hour ago, Razor01 said:

Like? If you are talking about AMD's slides, sorry, those were loaded ;)

 

 

And so are Nvidia's

Judge a product on its own merits AND the company that made it.

27 minutes ago, KarathKasun said:

 

I have yet to find any HPC sources that can truly be considered non-partisan, though you can extrapolate some types of compute workloads from datasets like this one...

https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/17

 

Fluid simulation seems to prefer Vega and the same can be said about ray-tracing.

 

Not that it much matters with Titan V out and about now.

No you can't, we were talking about CUDA advantages over OpenCL; so far all you have shown is OGL vs OGL, or OpenCL vs OpenCL ;)

Where you still see huge deficits where AMD hardware should have an outright advantage, not just be around the same.

1 minute ago, AluminiumTech said:

And so are Nvidia's

I didn't link any nV slides ;) I linked to 3rd-party whitepapers comparing differences between the APIs doing the same thing (results), by different implementations.

Just now, Razor01 said:

No you can't, we were talking about CUDA advantages over OpenCL; so far all you have shown is OGL vs OGL, or OpenCL vs OpenCL ;)

Where you still see huge deficits where AMD hardware should have an outright advantage, not just be around the same.

Only because of CUDA having more time invested by coders.  OCL can achieve a similar throughput when optimization time is invested.

1 minute ago, KarathKasun said:

Only because of CUDA having more time invested by coders.  OCL can achieve a similar throughput when optimization time is invested.

Yeah, but it's not from a user perspective; it's what nV has done with CUDA and their hardware to take advantage of it, and it has to do with features in the CUDA instruction set. Which I pointed out; please read the papers if you want to find out what those are.

You need the hardware: the ISA of the ASIC must be capable of using those instructions, or similar instructions in OpenCL, and so far it's been many years and we haven't seen them incorporated into OpenCL.
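
To make the instruction-set point concrete, here is a rough sketch of the kind of hardware feature being referred to (my own illustrative example, not one taken from those papers): CUDA's warp shuffle intrinsics let threads swap data inside a warp without touching shared memory, and core OpenCL 1.2 had no direct equivalent (comparable subgroup operations only showed up later as OpenCL 2.x-era extensions).

#include <cstdio>
#include <cuda_runtime.h>

// Warp-level sum reduction using the __shfl_down_sync intrinsic (CUDA 9 syntax).
// Each iteration pulls a value from a lane 'offset' positions higher, halving
// the number of lanes that still carry a partial sum.
__global__ void warpReduceSum(const float* in, float* out)
{
    float v = in[threadIdx.x];
    for (int offset = 16; offset > 0; offset >>= 1)
        v += __shfl_down_sync(0xffffffff, v, offset);
    if (threadIdx.x == 0)
        *out = v;                                  // lane 0 now holds the sum of all 32 lanes
}

int main()
{
    float h_in[32], h_out = 0.0f;
    for (int i = 0; i < 32; ++i) h_in[i] = 1.0f;   // expected sum: 32

    float *d_in = nullptr, *d_out = nullptr;
    cudaMalloc(&d_in, sizeof(h_in));
    cudaMalloc(&d_out, sizeof(float));
    cudaMemcpy(d_in, h_in, sizeof(h_in), cudaMemcpyHostToDevice);

    warpReduceSum<<<1, 32>>>(d_in, d_out);         // launch a single warp

    cudaMemcpy(&h_out, d_out, sizeof(float), cudaMemcpyDeviceToHost);
    printf("warp sum = %f\n", h_out);

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}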

6 minutes ago, Razor01 said:

Yeah, but it's not from a user perspective; it's what nV has done with CUDA and their hardware to take advantage of it, and it has to do with features in the CUDA instruction set. Which I pointed out; please read the papers if you want to find out what those are.

That in itself is arguable. I've actually done the research on this prior to this thread, and that used to be true. The tools for OCL have actually advanced a lot in the last two or three years.

The most hilarious thing is that NV tends to neglect the OCL tools for its own hardware to push CUDA, so if you maintain a platform-agnostic code base you get punished with worse performance.

3 minutes ago, KarathKasun said:

That in itself is arguable. I've actually done the research on this prior to this thread, and that used to be true. The tools for OCL have actually advanced a lot in the last two or three years.

 

 

Please show me those then, because in the past 2 years I have not seen much change in this regard. HSA libraries are still not matching what the CUDA libraries can do across numerous applications.

https://www.phoronix.com/scan.php?page=news_item&px=GTX-1080-OpenCL-vs-CUDA

 

Pascal OpenCL support was only at v1.2 for quite a while, while the R9 290 has supported OpenCL 2.0 since the time of Ubuntu 14.

 

This is why OpenCL performance on NV is crap.  The toolset and drivers are not developed for OpenCL because they would rather force you to use CUDA.

9 minutes ago, KarathKasun said:

https://www.phoronix.com/scan.php?page=news_item&px=GTX-1080-OpenCL-vs-CUDA

 

Pascal OpenCL support was only at v1.2 for quite a while, while the R9 290 has supported OpenCL 2.0 since the time of Ubuntu 14.

 

This is why OpenCL performance on NV is crap.  The toolset and drivers are not developed for OpenCL because they would rather force you to use CUDA.

Dude, I'm not talking about OpenCL vs CUDA on nV hardware; we all know that nV doesn't care about OpenCL.

We need to see OpenCL on AMD vs CUDA on nV doing the same stuff.

Look, most tests we have seen in recent years have been CUDA 8.0 vs a wide range of OpenCL versions, and they heavily favor nV hardware. Now nV is going to CUDA 9 and dropping support for CUDA 7, also deprecating quite a few features as well. When and if OpenCL 2.0 support comes out, in all likelihood they will catch up to CUDA 8, but we are talking what, 4 years of nV dominance because of feature sets and CUDA? And it will continue with CUDA 9.

4 minutes ago, Razor01 said:

Dude, I'm not talking about OpenCL vs CUDA on nV hardware; we all know that nV doesn't care about OpenCL.

We need to see OpenCL on AMD vs CUDA on nV doing the same stuff.

Those are different use cases anyway. OpenCL code can natively run on a whole host of hardware (with different compute focuses on each) where CUDA cannot. They are not 1-to-1 comparable because of the nature of the APIs.

What you are asking is similar to asking if C++ is better than OpenGL, and it's only like that because of NV's implementation of CUDA.
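
To illustrate the portability side of that argument (a minimal host-side sketch of my own, not code from the thread): the same OpenCL host program enumerates every vendor's devices through one API, whereas a CUDA program only ever targets NVIDIA GPUs.

#include <CL/cl.h>
#include <cstdio>

// List every OpenCL device on the system - AMD, NVIDIA or Intel, GPU or CPU -
// through the same API calls. Build against the OpenCL ICD loader (-lOpenCL).
int main()
{
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    if (num_platforms > 16) num_platforms = 16;

    cl_platform_id platforms[16];
    clGetPlatformIDs(num_platforms, platforms, nullptr);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_uint num_devices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 0, nullptr, &num_devices);
        if (num_devices == 0) continue;
        if (num_devices > 16) num_devices = 16;

        cl_device_id devices[16];
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, num_devices, devices, nullptr);

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            printf("platform %u, device %u: %s\n", p, d, name);
        }
    }
    return 0;
}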

4 minutes ago, KarathKasun said:

Those are different use cases anyway. OpenCL code can natively run on a whole host of hardware (with different compute focuses on each) where CUDA cannot. They are not 1-to-1 comparable.

 

 

It doesn't matter if the code is portable; the reason nV has cornered the market is because it just runs better. It's also arguably easier to code for, since they are used to coding in CUDA.

Scientists don't give a shit about the hardware they use; they care about the end results of what they need that hardware for. And if CUDA and nV hardware get them to where they want to be faster and quicker (code-wise), that is all they care about.

6 minutes ago, Razor01 said:

 

 

It doesn't matter if the code is portable; the reason nV has cornered the market is because it just runs better.

Scientists don't give a shit about the hardware they use; they care about the end results of what they need that hardware for. And if CUDA and nV hardware get them to where they want to be faster and quicker (code-wise), that is all they care about.

Except when they have to move that code to a new cluster installation down the road, or when they have to do more kludgy things to get data back and forth from CUDA to a CPU for specific tasks.

CUDA does not "just run better"; it's faster at some specific things because NV has failed to maintain the codebase for OCL on their hardware. That is the only reason it's different at all in 2018. It is the pinnacle of vendor lock-in and one of the primary pains in the ass of many HPC coders.

6 minutes ago, KarathKasun said:

Except when they have to move that code to a new cluster installation down the road, or when they have to do more kludgy things to get data back and forth from CUDA to a CPU for specific tasks.

CUDA does not "just run better"; it's faster at some specific things because NV has failed to maintain the codebase for OCL on their hardware. That is the only reason it's different at all in 2018. It is the pinnacle of vendor lock-in and one of the primary pains in the ass of many HPC coders.

 

 

Dude, it doesn't matter; they gave these guys something they want. AMD wasn't able to do that with OpenCL. End financial results for these guys matter.

If you were a company that needed to get something done, wouldn't you want to spend less time coding (since you are already used to coding in whatever) and get your results faster when running the tests?

Then you have time in your day to do more work or do other things.

It's a win-win situation.

Would your company care if the guy making the hardware only makes stuff for their hardware?

Come on, man, this is business. As an end user, if you save money and time for your business and get the same results, that is great for you!

Just now, Razor01 said:

 

 

Dude, it doesn't matter; they gave these guys something they want. AMD wasn't able to do that with OpenCL. End results for these guys matter.

If you were a company that needed to get something done, wouldn't you want to spend less time coding (since you are already used to coding in whatever) and get your results faster when running the tests?

It's a win-win situation.

Would your company care if the guy making the hardware only makes stuff for their hardware?

Come on, man, this is business. As an end user, if you save money and time for your business and get the same results, that is great for you!

Knowing some people who code in the field, they do not want it anymore, in large part. It's adding dev time at this point rather than reducing it.

4 minutes ago, KarathKasun said:

Knowing some people who code in the field, they do not want it anymore, in large part. It's adding dev time at this point rather than reducing it.

Really? I would like to talk to some of them, because so far I haven't seen that; I can't even find anything on the internet that even points to that.

Nor are companies even looking for that right now. CUDA developers are in higher demand than OpenCL devs, and CUDA devs get paid a lot more.

Late last year:

 

https://www.phoronix.com/forums/forum/hardware/graphics-cards/978493-open-source-opencl-adoption-is-sadly-an-issue-in-2017

 

https://www.phoronix.com/scan.php?page=news_item&px=XDC2017-OpenCL-GPGPU

 

I can only find things that are contradictory to what you just stated.

On 4/21/2018 at 1:45 AM, Master Disaster said:

I think the issue is you, I and almost everyone likely to be reading this post stand a much higher chance of being informed about what's going on than average John or Sue who just want a computer/laptop that works.

 

Nvidia are banking on there being more less informed people (that sounds so much like an oxymoron) than well informed people who will buy their products without knowing what's going on.

 

IMO anyway.

Late quote here, but I'd like to come at this as a secondhand buyer.

One thing that really gets me going is that since there are fewer AMD cards on the market (the same can be said about Ryzen and such), the used market is god-awful for these cards. Not only are there very few deals even in big cities like DFW, but the ones that you find are nowhere near as good a price as the Nvidia cards (and Intel processors, to go after the idea of Ryzen CPUs) plastered all over Craigslist and other local classifieds. For instance, cards that perform somewhere around the level of a 1060 are priced around $50 more than the 1060 deals.

There's a certain market for used cards that is being hit by the AMD shortage, making it harder to buy AMD on a budget and pushing you towards Intel and Nvidia due to sheer market share.

All of my Craigslist searches have been for Nvidia lately, as there are very few deals on a budget in my area.

7 minutes ago, DaJakerBoss said:

Late quote here, but I'd like to come at this as a secondhand buyer.

One thing that really gets me going is that since there are fewer AMD cards on the market (the same can be said about Ryzen and such), the used market is god-awful for these cards. Not only are there very few deals even in big cities like DFW, but the ones that you find are nowhere near as good a price as the Nvidia cards (and Intel processors, to go after the idea of Ryzen CPUs) plastered all over Craigslist and other local classifieds. For instance, cards that perform somewhere around the level of a 1060 are priced around $50 more than the 1060 deals.

There's a certain market for used cards that is being hit by the AMD shortage, making it harder to buy AMD on a budget and pushing you towards Intel and Nvidia due to sheer market share.

All of my Craigslist searches have been for Nvidia lately, as there are very few deals on a budget in my area.

There are very few deals anywhere. Right now the cryptomarket is shot to hell, though it did get better over the last 2 weeks. And still no MSRP for either vendor's cards.

nV cards tend to be lower in price for gaming-equivalent cards as of now, but even they aren't worth the cost.

2 minutes ago, Razor01 said:

Right now the cryptomarket is shot to hell, though it did get better over the last 2 weeks. And still no MSRP for either vendor's cards.

Quick question: weren't the Vega 56 and 64 extremely powerful cards for GPU mining? I thought that was a large contributor to the whole AMD GPU shortage?

9 minutes ago, DaJakerBoss said:

Quick question: weren't the Vega 56 and 64 extremely powerful cards for GPU mining? I thought that was a large contributor to the whole AMD GPU shortage?

 

Vega is great at XMR mining, second to none. But XMR and the coins based on it got hit really hard profitability-wise (much harder than other coins); it's not even worth mining those coins for day-to-day profits right now. Six Vega cards will get you around six bucks a day, but the power used will eat up all that profit and then some. It's only good for investment reasons: hold on to the coins and wait for them to appreciate.

It's actually more profitable to get a 1070 or 1070 Ti to mine with than either a Vega 56 or 64 as the cryptomarket currently stands, when factoring in power.

So I'm 99% sure mining is not what is causing the shortage of Vega cards lol.
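
Rough numbers behind the "power eats the profit" point, with every figure here being an illustrative assumption rather than a measured value (wall draw and electricity rates vary a lot):

#include <cstdio>

// Back-of-the-envelope mining profit check. All constants are assumptions
// for illustration, not measured figures.
int main()
{
    const double cards          = 6.0;
    const double watts_per_card = 280.0;  // assumed wall draw per Vega while mining XMR
    const double kwh_price      = 0.12;   // assumed electricity rate in $/kWh
    const double revenue_day    = 6.0;    // ~$6/day in coins for six cards, per the post above

    const double kwh_per_day = cards * watts_per_card * 24.0 / 1000.0;   // ~40 kWh
    const double power_cost  = kwh_per_day * kwh_price;                  // ~$4.84

    printf("energy: %.1f kWh/day, power cost: $%.2f/day, net: $%.2f/day\n",
           kwh_per_day, power_cost, revenue_day - power_cost);
    // Net is roughly a dollar a day before counting the rest of the system,
    // which is the point being made: power eats most of the revenue.
    return 0;
}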

9 hours ago, Humbug said:

The negative publicity and partial boycott can absolutely work. Unless by 'work' you mean making half of Nvidia's customers turn away and making Nvidia take a huge financial hit from lost sales in the short term... Nope, that's not gonna happen.

But what it can very realistically do is swing 15% of the next purchasing decisions of traditional Nvidia customers. And that's a very big deal: it boosts AMD's profits, it gets a few more people into AMD's ecosystem, which makes them likely to stay and repeat purchase, etc.

Sounds like a very arbitrary number. Unless you are spending the money on a FreeSync/G-Sync monitor there is no ecosystem. And just to confuse the issue, 50% of all gaming monitors sold were FreeSync while something like only 20% of the market has a FreeSync-capable card. Which indicates that even if AMD does absolutely nothing, another 30% of the current market is intending to buy an AMD card in the future, or doesn't understand what they are buying.

https://www.contextworld.com/display-research-update-09-02-2017

 

9 hours ago, Humbug said:

The negative PR against Nvidia  absolutely does benefit AMD. Anyway AMD has no delusions about getting 40-50% discrete GPU market share overnight. They know it's not gonna happen now. But what they do want to do is continue to grow their market share and increase their profits. So every percentage point of market share matters.

The boycotts we are talking about don't even have that kind of detailed planning involved. The boycotts people are referring to here are not about AMD gaining market share but about punishing Nvidia for perceived injustice. AMD gaining market share is just the natural consequence of a successful boycott (which I don't believe actually exists, or at the very least would be barely measurable).

 

9 hours ago, Humbug said:

All this bad publicity for sure turns off a small Chunk of Nvidia buyers. And AMD will happily welcome them. It matters a lot as AMD continues to become a stronger company...

Of course; every business is the same: any market share is good market share.

9 hours ago, Humbug said:

That's the reason AMD tries to use this negative PR against Nvidia. They know it will reach a certain chunk of the buyers at least. And it costs no marketing money to send out a few tweets and expose the story to the press. Absolutely worth doing for them. 

Not too sure how this relates to my post; we are talking about boycotts, not corporate propaganda.

 

9 hours ago, Humbug said:

Another example is all the negative publicity Nvidia has taken for gameworks and not being successful when they work with devs on graphical optimization, or associating their name with the wrong games...  None of this stuff is an absolute disaster for Nvidia. But it would be foolish to think it doesn't matter at all, it does matter and the PR from it does benefit AMD. It's just one more factor in a complicated market. 

Again, not too sure what this has to do with boycotts or what I said, but I will say that GameWorks has earned Nvidia more market share than bad PR. That makes it a success, whether a small portion of people hate it or not.

 

 

For a boycott to have any meaningful impact it must meet 4 major criteria:

1. Customers must care passionately - and enough of them to stick with it.

2. The cost of participation must be low - for most consumers on any kind of budget this will be very hard.

3. The issues must be easy to understand - given the debate over the many Nvidia/AMD topics, I'd say most people don't fully understand the issue.

4. The mass media is still essential - never going to happen with such an isolated product like GPUs; if it were an everyday item that every household buys then sure.

 

 

 

https://hbr.org/2012/08/when-do-company-boycotts-work

 

 
