Raja: AMD cannot compete

NumLock21
2 hours ago, schwellmo92 said:

To be fair, there isn't much of a software and developer ecosystem around AMD GPUs (compared to NVIDIA). But what's that got to do with Intel?

Yep, the Nvidia CUDA developer ecosystem is bigger than what AMD has in that space.

 

But it's hard to tell what specifically Raja was referring to. Or was it just an off-the-cuff marketing statement aimed at pushing Intel, without any substance?

Link to comment
Share on other sites

Link to post
Share on other sites

4 minutes ago, leadeater said:

Because there's so much around Intel GPUs, right?

Well, technically Intel is the largest GPU manufacturer by market share.


3 minutes ago, schwellmo92 said:

Well, technically Intel is the largest GPU manufacturer by market share.

And even people who don't use Intel GPUs (like me) tend to have one. For example, my i7-3770K has onboard graphics which I don't use because I have a discrete R9 290. So even in the PC gaming market, most people have purchased Intel GPUs.


2 hours ago, Drak3 said:

IRL 5% faster.

 

 

Due to a 5% clock speed increase.

And a 25% increase in TDP.


4 hours ago, BuckGup said:

AMD has been ahead on developing awesome OPENSOURCE software.

No they haven't.

Intel is possibly the largest contributor to open source software in the world. AMD's contributions to the open source world are a drop in the bucket compared to Intel's. For example, in 2016 about 13% of the contributions to the Linux kernel were from Intel; AMD made 1.9% of the contributions. Then we have other open source projects such as OpenStack, their encoders including but not limited to SVT-AV1, BlueZ, a ton of machine learning contributions, and the list goes on.

 

 

4 hours ago, BuckGup said:

Remember Mantle? Well that's Vulkan now and is super impressive/promising. Freesync? Yeah even Nvidia is using it now. I'm sorry but what Raja? Intel scaaaared

Mantle was shit. AMD kept it closed source for as long as it was relevant. You could give AMD some credit for Vulkan, but at the same time Intel played a key role in developing Vulkan too, and so did Nvidia. I don't get why AMD gets so much credit for Vulkan when it was all the members of the Khronos Group developing it.

 

And I think it's a stretch to call FreeSync an AMD invention as well. They suggested repurposing an existing VESA standard, developed within VESA, of which Nvidia and Intel are both members. It might even be that the original standard FreeSync is based on was written by someone from Intel or Nvidia (although I don't have the specs to look that up).

 

 

AMD talks a lot more than other companies about how they contribute to open source. If we look at how much they actually contribute, though, it isn't all that much.


4 hours ago, NumLock21 said:

He stated AMD has "no software ecosystem that's meaningful, without Intel".

...which is why the planned 1.5 exaflop Oak Ridge National Laboratory supercomputer will ship with AMD Radeon Instinct accelerators and not nVidia, though there is a 1 exaflop system with Intel chips as well...

 

But hey, what should they do? Take on more debt to make cards that nobody wants to buy, and market them to people who only want AMD to compete so that they can get cheaper GPUs from a different company?

"Hell is full of good meanings, but Heaven is full of good works"


11 minutes ago, LAwLz said:

I don't get why AMD gets so much credit for Vulkan

Because:

a) Developers say the functions of Mantle are similar to Vulkan's, just renamed

b) AMD gave the Khronos Group the plans for Mantle, which they used for Vulkan

c) they made Mantle, which in turn forced the development of Vulkan (and DX12 as well).

 

nVidia doesn't want "lower level" APIs because they want to have the control and be able to be the king with their driver optimization. With lower-level APIs you can optimize less in the driver.

 

11 minutes ago, LAwLz said:

when it was all the members of the Khronos Group developing it.

Oh yeah, so the things AMD has done have to be trivialized...

 

No, without AMD Mantle, there would be no Vulkan. There would be no DX12 even in 2019.

We would still have the same old APIs that were made for GPUs more than 10 years ago or are a total patchwork that originated in a time when even texturing was uncommon.

 

11 minutes ago, LAwLz said:

And I think it's a stretch to call FreeSync an AMD invention as well. They suggested repurposing an existing VESA standard, developed within VESA, of which Nvidia and Intel are both members. It might even be that the original standard FreeSync is based on was written by someone from Intel or Nvidia (although I don't have the specs to look that up).

AMD brought it up for discussion and pushed for it. It may actually be that they used the eDP spec for that.

But that doesn't matter.

 

And what you should be bashing is the one that went out of their way, with an FPGA, to do some proprietary shit because they knew AMD had pushed for that standard, so they could steal all the fame from the other members of the committee.


How about that for a change?

11 minutes ago, LAwLz said:

AMD talks a lot more than other companies about how they contribute to open source. If we look at how much they actually contribute, though, it isn't all that much.

At least they contribute to open source and push for open standards...

Not like other companies that sabotage open standards, try to make the open standard proprietary, and steal the ideas of others with all their might and money, just to claim that they have "something better"...

 

That's something you really should be criticizing, not "a small contribution" from a company that hardly has any R&D budget for those endeavors at all...

 

Why not criticize the other bigger one that has more resources than AMD but doesn't contribute to Open Source?


Quote

the size of the developer ecosystem is tiny. In fact, without our invaluable software contributions they have no software ecosystem that’s meaningful

Last time I checked, AMD has one of the biggest, if not the biggest, open source and even FOSS repo libraries on the market, and the company itself pushes hard for open source alternatives so that developers aren't forced to use only proprietary software.

 

So maybe take his words with an incredibly T H I C C grain of salt, because it sounds like a desperate move by Intel to please shareholders with the "we good they bad" talking points?

Very similar to the "glued together die" remark; it reeks of ignorance.


34 minutes ago, LAwLz said:

AMD talks a lot more than other companies about how they contribute to open source. If we look at how much they actually contribute, though, it isn't all that much.

Some of it is due to what's happening on Linux and because Nvidia is always the company that AMD is compared to.

 

On the Linux side the ecosystem is optimized for open source graphics drivers rolled into the kernel, and AMD relies on that as their official driver for gamers. In the last few years AMD has garnered a lot of goodwill in that community: they have improved their open source Linux graphics drivers to be about as good as Nvidia's proprietary drivers. Their devs also post on forums answering questions, engage with the community, and release adequate documentation on their products, enabling third-party contributions from companies like Red Hat and Valve. This approach works well on Linux because it means you don't need to download and run separate driver packages, you don't have to worry about kernel updates breaking things, etc.

 

Nvidia on Linux, on the other hand, is very closed; even GPU boost clocks do not work with the open source driver due to lack of support from Nvidia. They want to keep all their secret-sauce stuff wrapped up in their proprietary driver.

 

It's absolutely true that Intel has contributed more code, even if they engage less with the gaming side of the community. But the Linux people do not see Intel GPU hardware as powerful enough, so what they compare AMD to is Nvidia, and it makes AMD look very good in comparison.  


4 hours ago, BuckGup said:

Intel has some pretty impressive CPUs coming but AMD has been ahead on developing awesome OPENSOURCE software. Remember Mantle? Well that's Vulkan now and is super impressive/promising. Freesync? Yeah even Nvidia is using it now. I'm sorry but what Raja? Intel scaaaared

AMD didn't create Freesync, it's literally just adaptive sync renamed by AMD to fit their marketing.

 

On topic: wtf is he waffling about? AMD Adrenalin is actually really nice, and imo AMD's drivers are about the best they've ever been. I'm not sure Intel should be mentioning software in any context given how bad some of theirs is. Intel onboard video drivers, anyone?

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


4 minutes ago, Humbug said:

I think some of it is due to what's happening on Linux and because Nvidia is always the company that AMD is compared to.

 

On the Linux side the ecosystem is optimized for open source graphics drivers rolled into the kernel, and AMD relies on that as their official driver for gamers. In the last few years AMD has garnered a lot of goodwill in that community: they have improved their open source Linux graphics drivers to be about as good as Nvidia's proprietary drivers. Their devs also post on forums answering questions, engage with the community, and release adequate documentation on their products, enabling third-party contributions from companies like Red Hat and Valve.

 

Nvidia on Linux, on the other hand, is very closed; even GPU boost clocks do not work with the open source driver due to lack of support from Nvidia. They want to keep all their secret-sauce stuff wrapped up in their proprietary driver.

 

It's absolutely true that Intel has contributed more. But the Linux people do not see Intel GPU hardware as powerful enough, so what they compare AMD to is Nvidia, and it makes AMD look good. 

In my experience there's no need to touch AMD's drivers on Linux unless you want compute support. The open source AMD driver on Linux is more than good enough for gaming, and installing the proprietary drivers actually tends to hurt gaming.


1 minute ago, Stefan Payne said:

a) Developers say the functions of Mantle are similar to Vulkan's, just renamed

Source? Preferably multiple ones, and not just talking about a handful of APIs.

 

1 minute ago, Stefan Payne said:

b) AMD gave the Khronos Group the plans for Mantle, which they used for Vulkan

Yep, true. But how much of that had to be changed for Vulkan and who did that job?

By the way, it wasn't even AMD alone that developed Mantle. DICE played a major role too, and those were pretty much the only ones who used it.

Did you know that AMD did not release their Mantle programming guide until March 2015? Wanna know when AMD discontinued Mantle? March 2015...

 

4 minutes ago, Stefan Payne said:

c) they made Mantle, which in turn forced the development of Vulkan (and DX12 as well).

According to Microsoft, work on DX12 was already underway.

Mantle was announced in late 2013.

Microsoft had DirectX 12 demos ready in early 2014.

 

It doesn't take a rocket scientist to understand that they were in development at the same time.

 

 

 

9 minutes ago, Stefan Payne said:

nVidia doesn't want "lower level" APIs because they want to have the control and be able to be the king with their driver optimization. With lower-level APIs you can optimize less in the driver.

Citation needed on Nvidia not wanting low-level APIs.

 

 

10 minutes ago, Stefan Payne said:

Oh yeah, so the things AMD has done have to be trivialized... 

No they don't, but people seem to love overhyping and overstating AMD's contributions to open source.

 

10 minutes ago, Stefan Payne said:

No, without AMD Mantle, there would be no Vulkan. There would be no DX12 even in 2019.

We would still have the same old APIs that were made for GPUs more than 10 years ago or are a total patchwork that originated in a time when even texturing was uncommon.

[Citation Needed]

 

11 minutes ago, Stefan Payne said:

AMD brought it up for discussion and pushed for it. It may actually be that they used the eDP spec for that.

But that doesn't matter.

 

And what you should be bashing is the one that went out of their way, with an FPGA, to do some proprietary shit because they knew AMD had pushed for that standard, so they could steal all the fame from the other members of the committee.

You make it sound like we only got G-Sync because of AMD. Is that what you're implying?

We had G-Sync capable monitors on the market in 2013.

The variable refresh rate standard from VESA was created in 2014.

FreeSync products hit the market in 2015.

I think it is very clear in this case who pushed for what, and who reacted to another company's product.

 

17 minutes ago, Stefan Payne said:

At least they contribute to open source and push for open standards...

Yep, and credit where credit is due. The thing is that AMD's contributions get overblown so ridiculously much that someone in this thread actually believed AMD contributes more than Intel.

 

18 minutes ago, Stefan Payne said:

Not like other companies that sabotage open standards, try to make the open standard proprietary, and steal the ideas of others with all their might and money, just to claim that they have "something better"...

Such as? Got any company names and concrete examples?

 

19 minutes ago, Stefan Payne said:

That's something you really should be criticizing, not "a small contribution" from a company that hardly has any R&D budget for those endeavors at all...

 

Why not criticize the other bigger one that has more resources than AMD but doesn't contribute to Open Source?

I am capable of criticizing more than one company. And do you really see this as me criticizing AMD? I haven't said anything bad about them here. I am merely pointing out facts, for example that they do not contribute as much as Intel.

I haven't said that their contributions are bad. I think a lot of things they have done are great. What I am saying is that they get way too much credit, and other companies aren't getting enough.

 

Just because I am not fondling AMD's balls every chance I get doesn't mean I am criticizing them.


10 minutes ago, Master Disaster said:

In my experience there's no need to touch AMD's drivers on Linux unless you want compute support. The open source AMD driver on Linux is more than good enough for gaming, and installing the proprietary drivers actually tends to hurt gaming.

Ya exactly. The open source driver is their official driver for gaming now; they contribute to it and recommend it. When I bought my R9 290 years ago it was basically terrible on Linux: the proprietary driver wasn't great, but the open source one was even worse. Since then they have invested a lot into it and really turned it around. Today I can play AAA games smoothly without a problem, and it keeps improving.

 

Based on what their devs posted on the Phoronix forum, they are even now aggressively expanding that driver team with new hires (probably because they have more money now LOL).


13 minutes ago, strajk- said:

Last time I checked, AMD has one of the biggest, if not the biggest, open source and even FOSS repo libraries on the market, and the company itself pushes hard for open source alternatives so that developers aren't forced to use only proprietary software.

You need to check again.

AMD's open source library is minuscule compared to Intel's. AMD is nowhere near being the biggest contributor to open source. I don't even think they break the top 10.

 

15 minutes ago, strajk- said:

Very similar to the "glued together die" remark; it reeks of ignorance.

You're the ignorant one here. "Glue" in that context refers to something very specific, which is actually called glue. Even Intel refers to their own products as glue sometimes. For example, here is a simplified overview of their FSP connections and APIs. Notice that they reference "glue logic"?

[Image: simplified block diagram of Intel's FSP connections and APIs, referencing "glue logic"]

 

 

 


20 minutes ago, LAwLz said:

According to Microsoft, work on DX12 was already underway.

Mantle was announced in late 2013.

Microsoft had DirectX 12 demos ready in early 2014.

 

It doesn't take a rocket scientist to understand that they were in development at the same time.

I think what was being talked about was the change in DX12 development to include low-level hardware access. I remember news articles at the time about that sort of thing and a big change in DX12 development focus. It would take a fair amount of re-looking to figure out what went on back then.


Let's be clear about what Intel is talking about here: it's developer resources (generally speaking), not a consumer software stack like video drivers. And I don't think it takes a genius to realize that AMD has been behind on that front for ages although they seem to be picking up speed again.

 

@LAwLz

In regards to the syncing stuff: VESA has references to vblank signals earlier than that. It's implied in 2010 and explicitly (and officially) talked about in January 2011. This is of course all in reference to eDP and the advantages it gives within that segment. I'm sure Nvidia and AMD both looked at that and used it as inspiration and a jumping off point.


1 minute ago, LAwLz said:

Source? Preferably multiple ones, and not just talking about a handful of APIs.

3DCenter, a ton of people actually working with both APIs.

 

But you can look at the Mantle and Vulkan programming guides yourself. The names of the functions are very similar.

1 minute ago, LAwLz said:

Yep, true. But how much of that had to be changed for Vulkan and who did that job?

Irrelevant.

The relevant part is that AMD gave them Mantle!

That gave them an enormous boost in development and something to base their work on. Without that, it would have taken many more years, or, the way they had been working, it might never have happened at all!

So give AMD the credit they deserve.

 

1 minute ago, LAwLz said:

By the way, it wasn't even AMD alone that developed Mantle. DICE played a major role too, and those were pretty much the only ones who used it.

DICE also said they asked nVidia and Intel before they asked AMD, and only AMD had any interest.

With it being a graphics API, it's safer to assume that AMD did most of the work and not DICE.

1 minute ago, LAwLz said:

Did you know that AMD did not release their Mantle programming guide until March 2015? Wanna know when AMD discontinued Mantle? March 2015...

And? Do you know why that was the case? Do you know if there might have been contractual obligations with DICE and/or Stardock, for example?

Or maybe they just didn't have the personnel for that.

 

1 minute ago, LAwLz said:

According to Microsoft, work on DX12 was already underway.

Mantle was announced in late 2013.

Microsoft had DirectX 12 demos ready in early 2014.

Yes, it was. It was what Microsoft planned for Longhorn or even earlier. They had already worked on it but canned it.

Some developer on the German forum 3DCenter claimed that something like that was already in development, but due to resistance from nVidia or whoever it was canned, and they only continued later.

 

Mantle forced the development of DX12!

 

1 minute ago, LAwLz said:

It doesn't take a rocket scientist to understand that they were in development at the same time.

No, DX12 existed earlier but development stopped.

AMD forced the continuation. M$ just needed to open their drawers and pull out something they had already been working on but never finished.

You can look for old "WDDM Driver" announcements from the mid-2000s. In some of them you might find something that looks like it could be DX12.

 

1 minute ago, LAwLz said:

Citation needed on Nvidia not wanting low-level APIs.

Andersson from DICE.

And also other people who work in the industry.

 

1 minute ago, LAwLz said:

No they don't, but people seem to love overhyping and overstating AMD's contributions to open source.

No, they don't.

But people seem to love overhyping and overstating nVidia's contributions and development.
Why not give AMD the credit and let them have this at least?!

They don't get anything else these days. Why fight it?!

 

1 minute ago, LAwLz said:

You make it sound like we only got G-Sync because of AMD. Is that what you're implying?

We had G-Sync capable monitors on the market in 2013.

The variable refresh rate standard from VESA was created in 2014.

FreeSync products hit the market in 2015.

I think it is very clear in this case who pushed for what, and who reacted to another company's product.

Are you defending nVidia here right now?

While you try to trivialize the things AMD did?



If it was in development and "ready to go" like you said, why does it look like this:

https://www.anandtech.com/show/7582/nvidia-gsync-review

 

That doesn't look like it was in development for very long. It looks more like forced development, where the engineers were made to deliver in a short amount of time at all costs!

 

But hey, it can't be that AMD went to the VESA committee and introduced variable refresh rate, which they found to be a good idea, and nVidia, since they are also a member of VESA, knew about it and Jensen forced his engineers to make something "better" at all cost to have a product out before AMD. That's totally unlikely, isn't it?

1 minute ago, LAwLz said:

Yep, and credit where credit is due. The thing is that AMD's contributions gets overblown so ridiculously much that someone in this thread actually believed that AMD contributes more than Intel, which is ridiculous.

No, the stuff nVidia does gets overblown; see above with the G-Sync stuff. YOU defended them when the facts don't support your claim and it looks more like a rush job to steal the fame from the competition.

nVidia was never seen doing that, were they? They didn't remove OpenCL support once the competition was as strong or better, did they?

 

Jensen is not known to be a choleric who throws a tantrum when he hears something good about the competition, is he?

 

1 minute ago, LAwLz said:

Such as? Got any company names and concrete examples?

There is no need to name the company that doesn't support open source and that violates kernel guidelines, is there?

 

1 minute ago, LAwLz said:

I am merely pointing out facts, for example that they do not contribute as much as Intel.

...when Intel makes $16 billion per quarter versus AMD's $1.27 billion, and AMD pays less than $400 million for R&D while Intel spends more than double AMD's entire quarterly revenue on R&D ($3.3 billion), it should be obvious that Intel could contribute a bit more, because they can afford it.

 

A company that makes more than 10 times as much money and spends almost 10 times as much on R&D should be expected to contribute a bit more, shouldn't it?
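Taking the post's own figures at face value, the ratios work out like this (a quick sketch; the dollar amounts are the numbers quoted above, not independently verified):

```python
# Ratio check on the figures quoted above (Intel vs AMD); amounts are
# the post's own numbers, in billions of USD, not independently verified.
intel_revenue_q, amd_revenue_q = 16.0, 1.27  # revenue per quarter
intel_rnd, amd_rnd = 3.3, 0.4                # R&D spend

revenue_ratio = intel_revenue_q / amd_revenue_q  # ~12.6x
rnd_ratio = intel_rnd / amd_rnd                  # ~8.2x

# Intel's R&D budget alone exceeds double AMD's quarterly revenue.
assert intel_rnd > 2 * amd_revenue_q
print(f"revenue ratio ~{revenue_ratio:.1f}x, R&D ratio ~{rnd_ratio:.1f}x")
```

So on these numbers the revenue gap (~12.6x) is actually larger than the R&D gap (~8.2x), which is the proportionality argument being made here.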

 

1 minute ago, LAwLz said:

I haven't said that their contributions are bad. I think a lot of things they have done are great. What I am saying is that they get way too much credit, and other companies aren't getting enough.

No, you missed the context, which is that Intel has roughly 10 times the R&D budget AMD has. Obviously they can contribute more to any project if they want to, which they do.

You shouldn't criticize AMD for doing a decent job with the resources they have; you should criticize the ones that don't, even though they have a pretty big R&D budget, and instead act rather egomaniacal...

 


13 minutes ago, Trixanity said:

In regards to the syncing stuff: VESA has references to vblank signals earlier than that. It's implied in 2010 and explicitly (and officially) talked about in January 2011. This is of course all in reference to eDP and the advantages it gives within that segment. I'm sure Nvidia and AMD both looked at that and used it as inspiration and a jumping off point. 

Yeah both Nvidia and AMD probably used the vblank signal for the variable refresh rate stuff.

My point was that Nvidia had products on the market before the adaptive sync standard that became FreeSync was even created. I think it's absolutely ludicrous to even suggest that Nvidia somehow had to rush out a product because AMD started talking inside VESA about variable refresh rate for desktops.


20 minutes ago, leadeater said:

I think what was being talked about was the change in DX12 development to include low-level hardware access. I remember news articles at the time about that sort of thing and a big change in DX12 development focus. It would take a fair amount of re-looking to figure out what went on back then.

You might want to look at older (internal) presentations about "WDDM 2.0". I've seen some (though I didn't save them *ARGH*) that talked about that. IIRC they even called it WDDM 2.0, but not in the context of Windows 10. It might have been Windows 7 or even Vista in that context.

 

 


1 minute ago, Stefan Payne said:

You might want to look at older (internal) presentations about "WDDM 2.0". I've seen some (though I didn't save them *ARGH*) that talked about that. IIRC they even called it WDDM 2.0, but not in the context of Windows 10. It might have been Windows 7 or even Vista in that context.

Well, I said it would require looking into, not that I was going to.


6 minutes ago, LAwLz said:

I think it's absolutely ludicrous to even suggest that Nvidia somehow had to rush out a product because AMD started talking inside VESA about variable refresh rate for desktops.

No, it's not, as they worked on Hawaii for a couple of months, and we know Hawaii was the first product, which was released in 2013:

https://www.techpowerup.com/gpu-specs/radeon-r9-290.c2397

 

So no, it's not. It's rather plausible when you think about it. If they didn't have to rush it, why not develop an ASIC? Why use an FPGA? Why put 768 MiB of RAM onto that FPGA module?

And why put it on a card that goes into a slot rather than designing a motherboard with it?


39 minutes ago, Trixanity said:

Let's be clear about what Intel is talking about here: it's developer resources (generally speaking), not a consumer software stack like video drivers. And I don't think it takes a genius to realize that AMD has been behind on that front for ages although they seem to be picking up speed again.

Which developers? Developing what kind/s of software?
I found Raja's comments ambiguous.


2 minutes ago, Stefan Payne said:

3DCenter, a ton of people actually working with both APIs. 

 

But you can look at the Mantle and Vulkan programming guides yourself. The names of the functions are very similar.

Again, source? I want links.

And just because the names of functions are similar does not mean they are implemented the same way.
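For what it's worth, the naming parallel being argued about can be illustrated. The Mantle names below are the commonly cited ones from AMD's 2015 Mantle programming guide; treat the list as a small hand-picked sample, not evidence about how either API is implemented:

```python
# A hand-picked sample of Mantle function names alongside their Vulkan
# counterparts. The Mantle names are assumed from AMD's 2015 Mantle
# programming guide; similar names say nothing about implementation.
name_pairs = {
    "grCreateDevice":   "vkCreateDevice",
    "grQueueSubmit":    "vkQueueSubmit",
    "grCmdDrawIndexed": "vkCmdDrawIndexed",
}

def same_stem(mantle: str, vulkan: str) -> bool:
    """True when the two names differ only in their gr/vk API prefix."""
    return mantle.removeprefix("gr") == vulkan.removeprefix("vk")

# Every pair in this sample differs only by its prefix.
assert all(same_stem(m, v) for m, v in name_pairs.items())
```

Which cuts both ways: the surface similarity is real, but it supports the "just renamed" claim no more than it rules out a substantial rewrite underneath.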

 

3 minutes ago, Stefan Payne said:

Irrelevant.

The relevant part is that AMD gave them Mantle!

That gave them an enormous boost in development and something to base their work on. Without that, it would have taken many more years, or, the way they had been working, it might never have happened at all!

So give AMD the credit they deserve.

I will give AMD credit for open sourcing Mantle. But I will also give the Khronos Group credit for actually creating Vulkan. I do think AMD gets way too much credit for Mantle, though. A lot of people were praising them for how open source friendly they were even before Mantle was open source (which didn't happen until they killed it).

 

5 minutes ago, Stefan Payne said:

DICE also said they asked nVidia and Intel before they asked AMD, and only AMD had any interest.

With it being a graphics API, it's safer to assume that AMD did most of the work and not DICE.

Source?

And I wouldn't be surprised if DICE did a lot of work on it. Maybe not the majority, but I think it's foolish to give AMD all the credit without knowing who did what.

 

7 minutes ago, Stefan Payne said:

And? Do you know why that was the case? Do you know if there might have been contractual obligations with DICE and/or Stardock, for example?

Or maybe they just didn't have the personnel for that.

What do you mean? Are you suggesting that AMD were under contractual obligations to not release documentation for Mantle until it was killed off?

If that's the case then I would say they should not have signed off on that contract if their purpose for Mantle was to drive low level API development forward. And I find such a contract very strange if AMD actually did most of the work like you suggest.

 

If the problem was a lack of personnel then I don't believe they should get credit for being so open source friendly. You only get credit for what you accomplish.

 

 

9 minutes ago, Stefan Payne said:

Yes, it was. It was what Microsoft planned for Longhorn or even earlier. They had already worked on it but canned it.

Some developer on the German forum 3DCenter claimed that something like that was already in development, but due to resistance from nVidia or whoever it was canned, and they only continued later.

 

Mantle forced the development of DX12!

[Citation Needed]

 

 

11 minutes ago, Stefan Payne said:

No, DX12 was earlier but development stopped.

AMD forced the continuation. M$ just needed to open their drawers and pull out something they were already working on but never finished.

You can look for old "WDDM Driver" announcements from the mid 2000s. In some of them you might find something that looks like it could be DX12.

[Citation Needed]

 

 

12 minutes ago, Stefan Payne said:

Are you defending nVidia here right now? 

While you try to trivialize the things AMD did??

 

If it was in development, and "ready to go" like you said, why does it look like that:

https://www.anandtech.com/show/7582/nvidia-gsync-review

 

That doesn't look like it was in development for too long. That looks more like it was forced development, where the Engineers were forced to make it happen in a short amount of time at all cost! 

What are you referring to here exactly? Like I said, Nvidia had products on the market, which worked really well, the year before the adaptive sync standard was even created. It then took AMD until the year after to get a product on the market, and it was not anywhere near as good as the G-Sync products (very limited variable refresh rate window, no ability to compensate with frame doubling at lower refresh rates, no strobing backlight controller built in, etc).

 

14 minutes ago, Stefan Payne said:

But hey, can't be that AMD went to the VESA committee and introduced variable refresh rate, which they found to be a good idea, and nVidia, since they are also a member of VESA, knew about it and Jensen forced his Engineers to make something "better" at all cost to have a product out before AMD. That's totally unlikely, isn't it??

No, that doesn't sound very likely to me.

 

 

15 minutes ago, Stefan Payne said:

No, the stuff nVidia does gets overblown, see above with the G-Sync stuff. YOU defended them, when all the facts don't support your claim and look more like it's a rush job to steal the fame from the Competition.

In what way does it look like a "rush job"?

Also I think it's hilarious that you are for some reason giving AMD credit for variable refresh rate monitors despite them being several years later than Nvidia on that front.

But no, whenever Nvidia does something first it's clearly actually AMD that should get credit. And whenever AMD does something first it's AMD that deserves credit. It's always AMD that should get credit. Never anyone else.

 

 

17 minutes ago, Stefan Payne said:

NVidia was never seen doing that, were they? They didn't remove OpenCL support once the Competition was as strong or better, did they??

What do you mean? Nvidia never removed OpenCL support.

 

20 minutes ago, Stefan Payne said:

Jensen is not known to be a choleric who throws a tantrum when he hears something good about the Competition, is he?

Not that I know of. Got any source on that?

 

21 minutes ago, Stefan Payne said:

There is no need to name the Company that doesn't support Open Source, that violates Kernel Guidelines, is there?

Are you implying Nvidia does? First of all, this thread is about AMD and Intel, not AMD vs Nvidia.

Can you please link to either Intel or Nvidia violating kernel guidelines? And if you do, do you have evidence that AMD doesn't and is better than whichever company you refer to in that regard?

 

22 minutes ago, Stefan Payne said:

...when Intel makes 16 billion per quarter vs AMD's 1.27 billion per quarter, and AMD pays less than 400 million for R&D while Intel pays double what AMD makes in R&D (3.3 billion), it should be obvious that Intel could contribute a bit more because they can afford it.

 

A company that makes 10 times as much money and pays almost 10 times the amount for R&D should be expected to contribute a bit more, shouldn't it?

Which I think is completely irrelevant.

AMD's contributions shouldn't be overhyped because they are smaller. If you contribute one line of code then you get credit for one line of code. That's it. The size of the companies are completely irrelevant to me when I give credit.

If Intel contributes more than 5 times as much to the Linux kernel source then I will give them 5 times as much credit as I give to AMD, when it comes to contributing to the Linux kernel. Simple as that. I don't give AMD 10 times as much credit for everything they do just because they might be 1/10 the size.

By that logic I should worship one of the individual contributors to the Linux kernel as a god. His wealth might be 1/100,000,000 the size of Intel's. Does that mean he deserves 100,000,000 times as much praise for every single thing he does as Intel?

I don't think so.

 

 

26 minutes ago, Stefan Payne said:

You shouldn't criticize AMD for doing a decent job with the Resources they have, you should criticize the ones that don't even though they have a pretty big R&D Budget but act rather egomaniacal...

Again, I am not criticizing AMD. I have never said AMD does a poor job.

You seem to think that anyone who isn't sucking AMD's dick is against them. It's not critique to point out that statements such as "AMD contributes more to open source than Intel" are incorrect.


10 minutes ago, LAwLz said:

If you contribute one line of code then you get credit for one line of code.

All lines of code are not created equal.

