
NVIDIA To Devs: Compute/Graphics Toggle Is A Heavyweight Switch

Mr_Troll

There was no forcing at all. It was just a better implementation of the same library functions. It's the same thing in different clothing.

Maxwell is the dominant market architecture, not niche at all. Regardless of AMD's future ability to support it, the fact that they can't now, combined with Nvidia's capitalizing on it, would cause undue harm to AMD and its ability to compete under the letter of U.S. law. Nvidia can't publicly encourage that.

 

Dominant still only means single digits to maybe 15% of the market at most. Hardly anything useful, at least not in a way that would make any dev make a game NVidia exclusive. By that definition GameWorks would be antitrust 15 times over. DirectX 12.1 is still a standard, not an NVidia technology, and not proprietary or exclusive to NVidia either.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


this is a lie - I would like you to source this from AMD's or MS' "mouths" directly and not from rumor mills

something you need to understand: both MS and SONY had specific HW requirements for the APUs AMD would produce - at that point in time, MANTLE wasn't even a thing

MANTLE became a thing when AMD, SONY and EA (DICE) started development of BF4. The low-level API for the PS4, GNM, is even lower level than MANTLE or even DX12 (!!!!!!)

http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4

I thought that by admitting when I was wrong, you would find satisfaction in knowing you proved a point. Sadly, your hatred for AMD has once again led to me calling you wrong. You are well known on this forum for posting news relating to the downfall of AMD in any way you can, and it's really starting to be your greatest downfall. 

 

https://community.amd.com/community/gaming/blog/2015/05/12/on-apis-and-the-future-of-mantle

 

If you read through that link, AMD is claiming that Mantle lives on through Vulkan and DX12. It's also not hard to agree with that, given the features I showed you earlier from Mantle, and how every single one of them is present in DX12. 

 

http://www.extremetech.com/gaming/177407-microsoft-hints-that-directx-12-will-imitate-and-destroy-amds-mantle

 

 

 

We’ve spoken to several sources with additional information on the topic who have told us that Microsoft’s interest in developing a new API is a recent phenomenon, and that the new DirectX (likely DirectX 12) will substantially duplicate the capabilities of AMD’s Mantle. The two APIs won’t be identical — Microsoft is doing its own implementation — but the end result, for consumers, should be the same: lower CPU overhead and better scaling in modern titles.

 

Not too long before you became a member here, back when I was nothing more than a lurker, there was a news post regarding this very same subject. http://linustechtips.com/main/topic/129139-dx12-may-actually-be-mantle-renamed/

 

Many people back then came to the same conclusion: DirectX 12/Vulkan is the spiritual successor to Mantle. You can claim that MS came to the exact same solution as Mantle without even looking at Mantle, but it seems highly unlikely, given AMD's history of trying to push open standards and trying to make everything a "free and fair battleground". 

 

Seeing as you are being hostile, calling @bogus an "AMD fanboy", I think it would be fair to call you an anti-AMD fanboy. Huddy is from AMD, so the link bogus gave you would be the evidence you asked for, no? Or are you just going to dismiss it because it does not fit your narrative? Everything in the computer world is relative to everything else. Mantle is released and shows insane improvement in CPU-overhead-bound scenarios. DirectX 12 comes along and does exactly the same thing. It even lists the same features as Mantle, and then some. Mantle becomes "open source". Mantle dies. Khronos picks it up. Vulkan is born. Microsoft is supporting the Khronos team. 

 

Now that we've come full circle, I'd like to take the time to teach you about "context". In the post that you originally corrected, I said "DX12 is said to have dipped into Mantle's bag of tricks". I already proved that by giving you a direct comparison between Mantle and DX12. I also showed sources where others came to the same conclusion, and even AMD themselves taking credit for how good DX12 has turned out. I also never said DX12 was Mantle. I said MS saw what Mantle could do and emulated it. They absorbed those listed features of Mantle. To say they did not, when every single one of them is in DX12, would be pretty stupid, don't you think? 

 

If you need a direct comparison between the two, let me know. I will gladly oblige and do the work for you. You should have just taken my admission of defeat and moved on. Now you've just asked me to come back for another round.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Still waiting for DX12 games.

Microsoft said that by the end of the year you'll start seeing new games (assuming no delays from game devs).

So 2016 will be when you have a larger collection.


If you need a direct comparison between the two, let me know. I will gladly oblige and do the work for you. You should have just taken my admission of defeat and moved on. Now you've just asked me to come back for another round.

 

Well this image shows your point quite well:

 

[Image: Mantle vs DX12 programming guide comparison]



Well this image shows your point quite well:

 

 

Thank you for that. I like to collect information, and this will come in handy.



Dominant still only means single digits to maybe 15% of the market at most. Hardly anything useful, at least not in a way that would make any dev make a game NVidia exclusive. By that definition GameWorks would be antitrust 15 times over. DirectX 12.1 is still a standard, not an NVidia technology, and not proprietary or exclusive to NVidia either.

For game studios to use GameWorks, they must enter a contract with Nvidia that is put on record with the FTC, just like every other legally made major corporate contract. And Nvidia can't be the one to initiate it; game companies have to go to them and ask to use it. It's not about Nvidia exerting force over the market. You really don't understand what antitrust entails.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


For game studios to use GameWorks, they must enter a contract with Nvidia that is put on record with the FTC, just like every other legally made major corporate contract. And Nvidia can't be the one to initiate it; game companies have to go to them and ask to use it. It's not about Nvidia exerting force over the market. You really don't understand what antitrust entails.

That does not mean Nvidia isn't aggressively marketing it behind the scenes...

Something that neither you nor I would know, because neither of us works in the higher-up positions that would make those decisions at a major game studio.


This kinda reminds me of the whole Xbox One thing, when they said they couldn't just "flip a switch" regarding the Xbone's old DRM, then a week later they flipped the switch hehehe.

 

Obviously this is a much different scenario and I trust Nvidia more since this is a lot more technical than some DRM system. But still funny nonetheless. 


That is your greatest flaw

 

As opposed to the 1000s of other more important things in the world, trusting Nvidia is literally the worst flaw someone can have. Yup.


We have some fine know-it-all specimens in this thread.

Keep going guys. Learn me some more stuff about development that you know nothing about.

Hey man! It seems you know about that topic and I have some questions.

Isn't this stuff just general advice that would work on all DX12 GPUs? I guess that if you follow it, AMD hardware would still work, maybe even better thanks to their compute performance?

I don't understand the hate for this guide; it seems quite handy for developers.

I guess it's also useful for GPUs that support DX12 but, like Nvidia's, aren't capable of asynchronous compute, such as Intel's.

In that case, it would be more useful and smart to optimize code so it works on most devices, rather than use a feature that's available from only one GPU vendor (in this case AMD, with their superb compute performance) and basically screw the rest.

Sorry if these questions are super stupid, but I'm not a coding expert :P


Hey man! It seems you know about that topic and I have some questions.

Isn't this stuff just general advice that would work on all DX12 GPUs? I guess that if you follow it, AMD hardware would still work, maybe even better thanks to their compute performance?

I don't understand the hate for this guide; it seems quite handy for developers.

I guess it's also useful for GPUs that support DX12 but, like Nvidia's, aren't capable of asynchronous compute, such as Intel's.

In that case, it would be more useful and smart to optimize code so it works on most devices, rather than use a feature that's available from only one GPU vendor (in this case AMD, with their superb compute performance) and basically screw the rest.

Sorry if these questions are super stupid, but I'm not a coding expert :P

People have conflicting opinions on this subject because of other issues that came before it. Nvidia was recently shamed for their GPUs not being able to handle ACE (asynchronous compute) on the same level that AMD can. Nvidia states within this guide that relying too heavily on async compute is a bad thing. Thus, given the previous drama, people think that Nvidia wants to save their own skin by suggesting that people use something they are weak at less often. I won't go into deep detail about this, as @Mahigan has already done so on this lovely thread over on OCN: http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing

 

I assume you already know something about the subject, given that you've already said Nvidia is not so great at async compute, so I apologize in advance if you think I was treating you like you are not informed; I just thought I would throw that in, in case others were unaware of the backstory and the significance it brings to this thread in general. That being said, you are correct in that this could be helpful not only to Nvidia, but to all graphics vendors. Swapping back and forth might be too taxing on the hardware itself, regardless of whether it's able to perfectly leverage async compute or not. That said, I fully understand people's skepticism towards this "guide" given the recent circumstances.
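For anyone who wants to see what the compute/graphics split actually looks like in code, here is a minimal D3D12-style sketch (my own illustration, not something taken from NVIDIA's guide): the app creates a dedicated compute queue next to the normal direct/graphics queue, and the driver decides how much of the work actually overlaps. Error handling, command lists and synchronization are omitted.

```cpp
// Minimal sketch of the queue setup behind "async compute" in D3D12.
// Illustration only: no error checking, no command lists, no fences.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The "direct" queue accepts both graphics and compute work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A separate compute-only queue. Work submitted here *may* run
    // concurrently with graphics; on hardware that can't overlap the two,
    // the scheduler effectively serializes them, and every extra ping-pong
    // between the queue types adds synchronization cost - presumably the
    // "heavyweight switch" the thread title refers to.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));
}
```

As I read the guide, the takeaway is simply to batch compute work rather than interleave it finely with graphics, since batching costs far fewer transitions on hardware that can't run both at once.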



For game studios to use GameWorks, they must enter a contract with Nvidia that is put on record with the FTC, just like every other legally made major corporate contract. And Nvidia can't be the one to initiate it; game companies have to go to them and ask to use it. It's not about Nvidia exerting force over the market. You really don't understand what antitrust entails.

The point isn't that GameWorks is antitrust; it isn't, as you can turn it off.

Antitrust is a general term, but the specifics of the law differ from place to place. America is not the world. In fact most GameWorks games are made in Canada by Ubisoft, or in Poland like CD Projekt Red. American law is worth as much there as a condom at a lesbian orgy.

There is no chance that Nvidia won't contact devs about the use of GameWorks. In fact it's probably mandatory in most Nvidia-sponsored games. I doubt the FTC is going to do anything about that, especially in games made outside of the US.



The point isn't that GameWorks is antitrust; it isn't, as you can turn it off.

Antitrust is a general term, but the specifics of the law differ from place to place. America is not the world. In fact most GameWorks games are made in Canada by Ubisoft, or in Poland like CD Projekt Red. American law is worth as much there as a condom at a lesbian orgy.

There is no chance that Nvidia won't contact devs about the use of GameWorks. In fact it's probably mandatory in most Nvidia-sponsored games. I doubt the FTC is going to do anything about that, especially in games made outside of the US.

The mountain of political and bureaucratic paperwork needed to push other bodies similar to the FTC to follow up an FTC investigation is massive.

It takes resources, both money and time.


The point isn't that GameWorks is antitrust; it isn't, as you can turn it off.

Antitrust is a general term, but the specifics of the law differ from place to place. America is not the world. In fact most GameWorks games are made in Canada by Ubisoft, or in Poland like CD Projekt Red. American law is worth as much there as a condom at a lesbian orgy.

There is no chance that Nvidia won't contact devs about the use of GameWorks. In fact it's probably mandatory in most Nvidia-sponsored games. I doubt the FTC is going to do anything about that, especially in games made outside of the US.

It doesn't matter where the games are made if the suit is against Nvidia (in the U.S.) based on cornering sales (in the U.S.).



It doesn't matter where the games are made if the suit is against Nvidia (in the U.S.) based on cornering sales (in the U.S.).

 

But there cannot be a suit if GameWorks isn't mandatory per se. There is no law against writing poorly optimized, shitty code that generally runs badly on everything. The APEX stuff is still not mandatory, but an option, so I cannot see where the antitrust issue would be in this case.



Why don't the developers run a check on startup that enables and disables features based on what the customer's GPU supports? That way AMD gets ACE, and NVidia gets raster, and everyone is happy that their cards are performing to the best of their abilities. ACE might be more advantageous than raster, but that's just how competition works. NVidia can support ACE next generation...

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


Why don't the developers run a check on startup that enables and disables features based on what the customer's GPU supports? That way AMD gets ACE, and NVidia gets raster, and everyone is happy that their cards are performing to the best of their abilities. ACE might be more advantageous than raster, but that's just how competition works. NVidia can support ACE next generation...

Mostly because it just means more work for the developers, who are already under intense stress from the publishers, or are lazy by nature (I live with lazy programmers, trust me, they can be very lazy). Though, this would be a pretty ideal solution. Most games already run a pre-check to determine which graphics settings to default to (think of Skyrim). 
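Purely as a sketch of that pre-check idea (my own hypothetical example, not from any vendor guide; the RenderPath struct and its field names are invented for illustration), a startup check could look roughly like this in D3D12/DXGI. Note that D3D12 exposes no capability bit saying "async compute is a win here", so in practice this ends up being a vendor/architecture heuristic plus whatever real feature bits you care about:

```cpp
// Hypothetical startup check that picks a render path per GPU.
// Sketch only: no error handling, first adapter assumed, names invented.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

struct RenderPath {
    bool useAsyncCompute = false;      // lean on a separate compute queue
    bool keepWorkOnDirectQueue = true; // keep everything on the graphics queue
};

RenderPath PickRenderPath()
{
    RenderPath path;

    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // default adapter, for brevity

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    // PCI vendor IDs: 0x1002 = AMD, 0x10DE = NVIDIA, 0x8086 = Intel.
    if (desc.VendorId == 0x1002) {
        path.useAsyncCompute = true;       // hardware with dedicated ACEs
        path.keepWorkOnDirectQueue = false;
    }

    // Real, queryable feature bits can gate other optional paths:
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    // e.g. opts.ResourceBindingTier or opts.TiledResourcesTier.

    return path;
}
```

It's just a branch chosen once at startup rather than a separate build, but the point above stands: every extra path is extra code to write, test and maintain per vendor.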



...

and yet .. in Microsoft's own words, DX12 has nothing to do with MANTLE - at all!!!!! ; look at the link I just posted in reply to bogus' post

the only relation between Mantle and DX12 is that AMD forced MS' hand in getting their asses in gear

ps: I already said that the only API MANTLE is based on is GLNext (Vulkan) - I never denied that

---

you are ignoring the words in your own quote:

We’ve spoken to several sources with additional information on the topic who have told us that Microsoft’s interest in developing a new API is a recent phenomenon, and that the new DirectX (likely DirectX 12) will substantially duplicate the capabilities of AMD’s Mantle. The two APIs won’t be identical — Microsoft is doing its own implementation — but the end result, for consumers, should be the same: lower CPU overhead and better scaling in modern titles.

why did MS suddenly get interested in developing DX12? because AMD published MANTLE on the desktop - a low-level API that was similar to DX11.X, the API that was exclusive to the XB1


Mostly because it just means more work for the developers, who are already under intense stress from the publishers, or are lazy by nature (I live with lazy programmers, trust me, they can be very lazy). Though, this would be a pretty ideal solution. Most games already run a pre-check to determine which graphics settings to default to (think of Skyrim). 

This; it requires writing essentially two executables.


and yet .. in Microsoft's own words, DX12 has nothing to do with MANTLE - at all!!!!! ; the only relation between Mantle and DX12 is that AMD forced MS' hand in getting their asses in gear

look at the link I just posted in reply to bogus' post

ps: I already said that the only API MANTLE is based on is GLNext (Vulkan) - I never denied that

---

you are ignoring the words in your own quote:

why did MS suddenly get interested in developing DX12? because AMD published MANTLE on the desktop - a low-level API that was similar to DX11.X, the API that was exclusive to the XB1

You mean to tell me, a company did not admit that their work is based on the work of others? This is shocking news! Not... It is like saying "This guy didn't lie, he said so!".

 

Way to ignore the mountain of evidence given to you by @bogus and @Notional. Also, let me fix this common flaw in your wording: Mantle is not based on Vulkan. Vulkan is based on Mantle. The way you worded it (repeatedly) is completely backwards, and is impossible. MS only got interested in DX12 because Mantle showed how vastly superior it was to DX11, and MS decided "Hey, they did this to get X amount of performance, let's do the exact same thing". 

 

Face it zMeul, you made the same mistake as the fanboys you loathe. You let a brand blind you from seeing the truth objectively, and now you are caught in a corner. You really should have taken the out I offered you.



You mean to tell me, a company did not admit that their work is based on the work of others? This is shocking news! Not... It is like saying "This guy didn't lie, he said so!".

and AMD tells the truth, because what? reasons .. right?!?!

no, I don't blindly trust MS' words, nor AMD's - that's the difference between us

there are a couple of facts that you blatantly ignore:

  • MS and SONY requested AMD to build specific hardware for their consoles
  • MS had its own low level API for the XBox One (XB1) - DirectX 11.X
  • MS rejected AMD's proposal that it should run MANTLE
  • at this point, AMD starts pushing MANTLE on Windows
  • MS, forced by AMD's decision to do a low level API, starts evolving DX11.X into DX12 for the Windows desktop
  • DX11.X specific features are now part of DX12

these are the facts

 

 

here's a question for you: if DX12 is MANTLE for all intents and purposes, why isn't current AMD hardware fully DX12 capable?


and AMD tells the truth, because what? reasons .. right?!?!

no, I don't blindly trust MS' words, nor AMD's - that's the difference between us

there are a couple of facts that you blatantly ignore:

  • MS and SONY requested AMD to build specific hardware for their consoles
  • MS had its own low level API for the XBox One (XB1) - DirectX 11.X
  • MS rejected AMD's proposal that it should run MANTLE
  • at this point, AMD starts pushing MANTLE on Windows
  • MS, forced by AMD's decision to do a low level API, starts evolving DX11.X into DX12 for the Windows desktop
  • DX11.X specific features are now part of DX12

these are the facts

 

If those were facts, we wouldn't be asking for sources now, would we?

The most ridiculous part is that you are so far from what actually happened and you have no fucking clue...

I'll just quote your own words back at you, to teach you some manners:

 

this is a lie - I would like you to source this from AMD's or MS' "mouths" directly 

 

I'm still waiting for you to deliver SOURCES, other than your imagination, that prove that AMD and Frostbite were lying, that MS "rejected" Mantle, and all the other delusional hate-filled bullshit you are trying to spread in this forum like an STD.

I see you are trying to pull a patrick move, but that shit won't stick on me.


Mostly because it just means more work for the developers, who are already under intense stress from the publishers, or are lazy by nature (I live with lazy programmers, trust me, they can be very lazy). Though, this would be a pretty ideal solution. Most games already run a pre-check to determine which graphics settings to default to (think of Skyrim). 

 

On a scale of Kotaku Journalism to Fast Food Workers, how lazy are they?

