NVIDIA To Devs: Compute/Graphics Toggle Is A Heavyweight Switch

Mr_Troll

I saw someone wrongly post a few pages back that AMD's current hardware wasn't fully DX12 compatible. GCN 1.1 and 1.2 are fully DX12_0 compatible, they couldn't be more compatible. They don't support 12_1 features, however. Only GameWorks titles on PC will ever support 12_1 features, that's almost a guarantee.

The only GPU company to fully support the main feature level of DX12 is AMD. Seems like people just can't get this fact through their skulls.

DX12 doesn't stop at the 12_0 feature level - and because of that, no current hardware has complete DX12 support

and when I say complete, I include all feature levels

for example, AMD's hardware can't do Rasterizer Ordered Views in the HW - so, what happens when a "GameWorks" title requires it? AMD said they will emulate it on the CPU
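(For reference: the two capabilities behind feature level 12_1 being argued about here, Rasterizer Ordered Views and conservative rasterization, are optional caps that an application can query at runtime. Below is a minimal sketch of such a query, assuming the Windows 10 SDK D3D12 headers and an installed D3D12 runtime - an illustration, not something posted in the thread.)

```cpp
// Minimal sketch: query the optional caps behind feature level 12_1
// (ROVs, conservative rasterization) plus the resource binding tier.
// Error handling kept minimal for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // 11_0 is the minimum level D3D12CreateDevice accepts; the created device
    // can still report higher feature levels and optional caps.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("no D3D12 device available\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts)))) {
        std::printf("Rasterizer Ordered Views:        %s\n",
                    opts.ROVsSupported ? "yes" : "no");
        std::printf("Conservative rasterization tier: %d\n",
                    static_cast<int>(opts.ConservativeRasterizationTier));
        std::printf("Resource binding tier:           %d\n",
                    static_cast<int>(opts.ResourceBindingTier));
    }
    return 0;
}
```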


DX12 doesn't stop at the 12_0 feature level - and because of that, no current hardware has complete DX12 support

and when I say complete, I include all feature levels

What you personally consider to be DX12 support isn't gospel. DX12_1 is about as useful as DX11_1. Technically no Nvidia cards ever fully supported 11_1 or 11_2 except maybe Maxwell. So by your definition, Nvidia were never fully DX11 compliant after years and years, until last year. Now Nvidia are not fully DX12_0 compliant, which is the most important level of compliance.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


What you personally consider to be DX12 support isn't gospel. DX12_1 is about as useful as DX11_1. Technically no Nvidia cards even support it except maybe Maxwell. So by your definition, Nvidia were never fully DX11 compliant after years and years, until last year. Now Nvidia are not fully DX12_0 compliant, which is the most important level of compliance.

no, nVidia did not have complete DX11 support - and whoever says that is bending the truth, just as you are right now

[attached image: DX12FeatureLEvels-640x436.png]


yeah, I can throw up pictures too.

 

[attached image: MbNhU9C.png]

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


I saw someone wrongly post a few pages back that AMD's current hardware wasn't fully DX12 compatible. GCN 1.1 and 1.2 are fully DX12_0 compatible, they couldn't be more compatible. They don't support 12_1 features, however. Only GameWorks titles on PC will ever support 12_1 features, that's almost a guarantee.

The only GPU company to fully support the main feature level of DX12 is AMD. Seems like people just can't get this fact through their skulls.

No. Even Microsoft said no one is fully DX12 compliant. Being Tier 3 compliant doesn't mean every feature is included.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


yeah, I can throw up pictures too.

for what? to point out, yet again, that no current HW has complete DX12 support

for what? to point out, yet again, that no current HW has complete DX12 support

 

So we agree that Nvidia is missing a lot of DX12_0 feature support on current hardware, while supporting niche 12_1 features that almost no games will use.

 

you have been arguing in circles for weeks against AMD without proving anything useful. So far, you've only proven how far behind Nvidia is in their current hardware for gaming.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


No. Even Microsoft said no one is fully DX12 compliant. Being Tier 3 compliant doesn't mean every feature is included.

So you're saying Robert Hallock is a liar about GCN 1.2 being fully DX12_0 feature compliant?

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


"existed" yes, finished? never

 

as for nVidia - you confuse Maxwell2's ability to do Async Compute with the scheduling; the scheduling is what Maxwell2 has a problem with

 

from OP:

---

@MageTank

putting another dent in your erroneous assumption:

 

"DirectX® 12 is Microsoft’s own creation, though its development has been steered by input from many different technology partners including AMD."

 

 

So AMD DID influence the development of DirectX 12. Thanks for disproving your own point.

 

As for async compute, the entire point is to do graphics and compute concurrently, in parallel. Maxwell is only able to do context switching, which is switching between graphics and compute in serial, so it can only do one thing at a time - which is NOT async compute as we know it. When Maxwell does something that looks like async compute, it is emulating it via drivers and using the CPU for a lot of it.
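(For context: in D3D12 terms, "async compute" simply means the application submits work on a separate compute queue alongside the graphics queue; whether the GPU actually overlaps the two is up to the hardware and driver. A minimal sketch, assuming an existing ID3D12Device named device and the same d3d12.h / ComPtr setup as the earlier snippet:)

```cpp
// Async compute as the API sees it: one direct (graphics) queue plus one
// compute queue, synchronized with a fence only where a dependency exists.
ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;

D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

D3D12_COMMAND_QUEUE_DESC compDesc = {};
compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

ComPtr<ID3D12Fence> fence;
device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

// computeQueue->ExecuteCommandLists(...);        // submit compute work here
computeQueue->Signal(fence.Get(), 1);             // mark the compute work done
gfxQueue->Wait(fence.Get(), 1);                   // graphics waits only where it
                                                  // consumes the compute result
// gfxQueue->ExecuteCommandLists(...);            // dependent graphics work
```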

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


...

the feature level certification is done by MS, as I already stated numerous times - you should not look at what nVidia or AMD claim, only at what MS certifies

@Briggsy

Hallock claimed that GCN 1.2 is fully DX12 compatible, not 12_0 feature level ready - and that in itself is a bending of the truth


the feature level certification is done by MS, as I already stated numerous times - you should not look at what nVidia or AMD claim, only at what MS certifies

@Briggsy

Hallock claimed that GCN 1.2 is fully DX12 compatible, not 12_0 feature level ready - and that in itself is a bending of the truth

 

No, Robert Hallock said the only two DX12 features AMD is not fully compliant with are the two features found in DX12_1.

 

If you are going to spend this much time arguing details, get them right.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


So you're saying Robert Hallock is a liar about GCN 1.2 being fully DX12_0 feature compliant?

I'm saying Microsoft says he's a liar.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


If you are going to spend this much time arguing details, get them right.

yes, please link to Hallock's comment

---

https://msdn.microsoft.com/en-us/library/mt186615.aspx

 

[attached image: EDuvogy.png - Direct3D feature level table from the MSDN page above]

  • Feature levels 12.0 and 12.1 require the Direct3D 11.3 or Direct3D 12 runtime.
  • Feature level 11.1 requires the Direct3D 11.1 runtime.
  • Feature level 11.0 requires the Direct3D 11.0 runtime.
  • ¹ : higher tiers optional.
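(As an aside: an application can ask the runtime which of these feature levels a given device supports. A minimal sketch, assuming an existing ID3D12Device named device as in the earlier snippets - note this reports the feature level only; optional caps such as ROVs are queried separately:)

```cpp
// Ask the runtime for the highest feature level the device supports
// out of the levels we care about.
D3D_FEATURE_LEVEL requested[] = {
    D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
    D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
};

D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
levels.NumFeatureLevels        = _countof(requested);
levels.pFeatureLevelsRequested = requested;

if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                          &levels, sizeof(levels)))) {
    // e.g. 0xc000 = feature level 12_0, 0xc100 = feature level 12_1
    std::printf("max supported feature level: 0x%x\n",
                levels.MaxSupportedFeatureLevel);
}
```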

yes, please link to Hallock's comment

 

Go look on reddit, you'll find it under his user name. 

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Go look on reddit, you'll find it under his user name.

I don't follow that notorious liar, not even on reddit, I don't know his handle

 

---

 

 

 

“I think gamers are learning an important lesson: there’s no such thing as “full support” for DX12 on the market today,” said Robert, and continued:

 

“There have been many attempts to distract people from this truth through campaigns that deliberately conflate feature levels, individual untiered features and the definition of “support.”

 

This has been confusing, and caused so much unnecessary heartache and rumor-mongering.

 

Here is the unvarnished truth: Every graphics architecture has unique features, and no one architecture has them all. Some of those unique features are more powerful than others.

 

http://wccftech.com/amd-full-support-dx12-today-fury-missing-dx12-features/

 

so ... how about them apples


I don't follow that notorious liar, not even on reddit, I don't know his handle

 

---

 

 

 

 

http://wccftech.com/amd-full-support-dx12-today-fury-missing-dx12-features/

 

so ... how about them apples

 

and then he was asked which features GCN 1.2 does not fully support, to which Robert replied:

Robert Hallock: Raster Ordered Views and Conservative Raster. Thankfully, the techniques that these enable (like global illumination) can already be done in other ways at high framerates (see: DiRT Showdown).

 

Again, if you are going to argue details, please get them right.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


"existed" yes, finished? never

 

as for nVidia - you confuse Maxwell2's ability to do Async Compute with the scheduling; the scheduling is what Maxwell2 has a problem with

 

from OP:

---

@MageTank

putting another dent in your erroneous assumption:

http://support.amd.com/en-us/search/faq/207

http://support.amd.com/en-us/search/faq/208

Thanks for beating yourself for me.

 

 

Its development has been steered by input from many different technology partners, including AMD. We have welcomed the same input on Mantle by sharing the full specification with Microsoft since the early days of our API. As the industry moves to embrace the principles of "closer-to-the-metal" API design, it is evident that our pioneering work with this concept has been highly influential.

 

Good work zMeul. Couldn't have done it better myself. Microsoft was shown Mantle's specifications from its early days, meaning they were fully briefed on every aspect of it and what it was designed to do. They were helped, in their DX12 API, by the very same company that designed Mantle. Thus, it is safe to draw the conclusion that DX12 has some tricks that were originally put in Mantle. All forms of technology are influential on each other. Nothing wrong with being inspired by a competitor to do something. Nothing wrong with taking help from a mutual tech company to achieve something great for all parties involved either. Mantle was a proof of concept. It showed us all what could have been, and influenced a performance boost for not only AMD, but even Nvidia and Intel. Say what you want about AMD and your disdain for them, but they deserve credit here for spearheading this thing.

 

It was a pleasure arguing with you @zMeul. While I doubt you will ever admit you were wrong, at the very least you can take satisfaction in knowing I wouldn't have beaten you without your help.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Good work zMeul. Couldn't have done it better myself.

do the job for you? you're delusional; you're inventing stuff; you're creating fan fiction, and the worst part ... you actually believe in it; you still can't accept that AMD was and still is lying to its customers

just like nVidia, just like every other company out there - their business is making money, not telling the truth

AMD's web page couldn't be more clear when they say DX12 and MANTLE could not be more different

Q: What is the relationship between Mantle and DirectX® 12?

A: There is no direct relationship

Q: What are the similarities between Mantle and DirectX® 12?

A: DirectX® 12 is Microsoft’s own creation

as for AMD's involvement? no shit!? how about nVidia's? Qualcomm's ...

1st you say Async Compute isn't part of DX12, when proven wrong you use that to create the illusion MANTLE is DX12

you still haven't explained how nVidia (a company that had no involvement in XB1 or PS4 development) knew to put Async Compute in their hardware


Thanks for beating yourself for me.

 

 

Good work zMeul. Couldn't have done it better myself. Microsoft was shown Mantle's specifications from its early days, meaning they were fully briefed on every aspect of it and what it was designed to do. They were helped, in their DX12 API, by the very same company that designed Mantle. Thus, it is safe to draw the conclusion that DX12 has some tricks that were originally put in Mantle. All forms of technology are influential on each other. Nothing wrong with being inspired by a competitor to do something. Nothing wrong with taking help from a mutual tech company to achieve something great for all parties involved either. Mantle was a proof of concept. It showed us all what could have been, and influenced a performance boost for not only AMD, but even Nvidia and Intel. Say what you want about AMD and your disdain for them, but they deserve credit here for spearheading this thing.

 

It was a pleasure arguing with you @zMeul. While I doubt you will ever admit you were wrong, at the very least you can take satisfaction in knowing I wouldn't have beaten you without your help.

 

one thing you forget is that Mantle is inspired by the work done on consoles (the old ones, Xbox 360/PS3), meaning AMD didn't actually come up with async compute for DX12

fx-8350 @4,4Ghz/sapphire r9 fury/2x 8gb Kingstone ddr3 @2030Mhz


do the job for you? you're delusional; you're inventing stuff; you're creating fan fiction, and the worst part ... you actually believe in it; you still can't accept that AMD was and still is lying to its customers

just like nVidia, just like every other company out there - their business is making money, not telling the truth

AMD's web page couldn't be more clear when they say DX12 and MANTLE could not be more different

as for AMD's involvement? no shit!? how about nVidia's? Qualcomm's ...

1st you say Async Compute isn't part of DX12, when proven wrong you use that to create the illusion MANTLE is DX12

you still haven't explained how nVidia (a company that had no involvement in XB1 or PS4 development) knew to put Async Compute in their hardware

You really don't know how to read, do you? I said Async Compute is not a mandatory feature of DX12, meaning if you can't do Async Compute, you can still say your card is DX12 ready.

 

 

How many times do I have to repeat myself when I say Async Compute is NOT a DX12 feature?

 

http://www.extremetech.com/extreme/213519-asynchronous-shading-amd-nvidia-and-dx12-what-we-know-so-far/2

 

 

https://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels

 

Notice any "ACE or Async Compute" under the mandatory features? No, because it is not a mandatory feature. DirectX 12 just so happened to unlock a useful feature that was not available on DX11, and it was something AMD used with Mantle (DX12 is said to have dipped into AMD's Mantle bag of tricks).

 
I made the parts bold. Hopefully you will read this time, instead of blatantly ignoring things while cherry-picking other things to twist into your false argument. Also, I didn't invent anything. You are the one that linked the picture that confirmed everything I said, remember? Never have I said DX12 is Mantle. You keep changing my words, but you fail to understand that anyone can go back and read what I said, and clearly see that you are the delusional one. You switch my words in a way that gives you some sort of advantage to attack them, yet even then, it still makes no sense. You've backed yourself into a corner. According to the very two images you linked, we can sum up the following:
 
Is there a relationship between Mantle and DX12?: There is no direct relationship. Google the definition of that word. If there was no relationship at all, they would never use the word direct. They would simply say "there is no relationship at all". Clearly there is some indirect relationship between Mantle and DX12.
 
What are the similarities between Mantle and DX12?: DX12 is Microsoft's own creation, but multiple companies (INCLUDING AMD) chipped in on the development. Why can't you put two and two together, and see that with AMD's help, MS could implement Mantle features into DX12? Also, your Nvidia question makes absolutely no sense. Nvidia DIDN'T put Async Compute in their hardware. It's why this drama is going on in the first place. They are doing it via software. Seriously, go look at @Mahigan's OCN post and educate yourself: http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing

 

one thing you forget is that Mantle is inspired by the work done on consoles (the old ones, Xbox 360/PS3), meaning AMD didn't actually come up with async compute for DX12

You are only partially correct. Mantle was inspired by consoles, that much is true. This is because consoles do not have to deal with the driver overhead that PCs have to deal with. Mantle wanted to alleviate that. It also made porting considerably easier. However, Async Compute did not exist on the Xbox 360/PS3. The hardware simply was not capable of doing it. Hence the whole reason AMD was tasked by MS and Sony to use GCN 1.0 hardware in these new consoles.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Nvidia DIDN'T put Async Compute in their hardware. It's why this drama is going on in the first place. They are doing it via software. Seriously, go look at @Mahigan's OCN post and educate yourself: http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing

 

Well, TECHNICALLY they have async compute, but it's not as efficient as the ACEs in AMD's architectures, and they do not have a hardware context switch.


@Zah

Look at that quote that you quoted from me. I did not say that Nvidia was incapable of Asynchronous Compute, I said they do it via software.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


@Zah

Look at that quote that you quoted from me. I did not say that Nvidia was incapable of Asynchronous Compute, I said they do it via software.

Well I mean they have hardware in there to do it, but it does have software components as well, unlike AMD, which has the more efficient method.  I should have been more thorough in my post.


Well I mean they have hardware in there to do it, but it does have software components as well, unlike AMD, which has the more efficient method. I should have been more thorough in my post.

Sorry, I misunderstood your post. Either way, whether Maxwell can do it or not is irrelevant to the argument I am having with @zMeul. GCN was out long before Maxwell, and did it through hardware. Even if Nvidia was aware of Asynchronous Compute, they clearly did not think it would be used any time soon, or else they would probably have included it. He is just grasping for random straws to throw at me. Everyone can see the effect Mantle had on DX12, and his own evidence against me is a testament to that fact.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Sorry, I misunderstood your post. Either way, whether Maxwell can do it or not is irrelevant to the argument I am having with @zMeul. GCN was out long before Maxwell, and did it through hardware. Even if Nvidia was aware of Asynchronous Compute, they clearly did not think it would be used any time soon, or else they would probably have included it. He is just grasping for random straws to throw at me. Everyone can see the effect Mantle had on DX12, and his own evidence against me is a testament to that fact.

I don't think any amount of information will make him change his mind. He has already stated he has an incredible dislike for AMD, so there is already an inherent bias there. It's going to take a lot to sway something like that.

