
AMD Releasing Public Mantle SDK This Year, Encourages Nvidia and Intel to Use it... For Free.

TERAFLOP

Sadly, Nvidia has too much pride and is too stuck up to EVER use Mantle... I could see Intel implementing it with their Iris Pro graphics, though.

CPU: i7 6700k @ 4.6GHz | CASE: Corsair 780T White Edition | MB: Asus Z170 Deluxe | CPU Cooling: EK Predator 360 | GPU: NVIDIA Titan X Pascal w/ EKWB nickel waterblock | PSU: EVGA 850W P2 | RAM: 16GB DDR4 Corsair Dominator Platinum 2800MHz | Storage: Samsung 850 EVO 500GB | OS: Win 10 Pro x64 | Monitor: Acer Predator X34/HTC VIVE | Keyboard: CM Storm Trigger-Z | Mouse: Razer Taipan | Sound: Audio Technica ATH-M50x / Klipsch Promedia 2.1 Sound System 

 


Awesome! Hopefully this will let Mantle gain some traction (although that still seems unlikely). I'm really hoping it will put some pressure on DX12 to be decent, though.


AMD needed a year to drop the 290X's price from $500 to $370, but how many people bought it while it was overpriced?

 

Show me where FreeSync is free. You won't be updating firmware to enable FreeSync; you'll have to buy a new monitor. I love how AMD calls it free to the industry, and every fanboy imagines he's the industry and will get everything for free.

 

Nvidia has plenty of free SDKs.

 

And why is everyone saying that DX12 and the new OpenGL are Mantle? That's like saying VP9 is H.265 (x265).

Okay, first of all, the price: cryptocurrency hype increases demand and decreases supply. Basic economics.

 

Next, FreeSync. It was just accepted as a VESA standard, so I'm assuming newer models will support it (per the standard, of course). Plus, G-Sync is Nvidia-only. The current lack of FreeSync monitors, compared against Nvidia's five expensive G-Sync monitors, isn't really a huge difference in variety....

Nvidia's SDKs: free, yet only useful if you use an Nvidia product... PhysX, the most appealing of those SDKs, will otherwise just run on the CPU, which kills any chance of playing the game. This is the first move to actually share Mantle, which has been successful so far, with other companies (Intel and Nvidia). The other SDKs Nvidia has are all for Nvidia-specific products, which isn't bad, but I don't think AMD/Intel makes anything that Tegra or the Nvidia Shield can actually use.

No one is saying DX12 and OpenGL are Mantle; however, we can all say they were inspired by Mantle. Similarly, Android is NOT iOS, but was inspired by it (if I'm correct that Android came after iOS). Also, there is an article saying that the Mantle team assisted in the development of DX12.

I don't know about you, but open is better than proprietary. This can be shown through other products, such as Linux and Android. I personally do not use Linux, due to its lack of support for programs that I use.  :(


If it works, great; if not, I don't care. I'd just like my 660 to stop being so shitty. I had to restart my PC three times this morning to get the sound bug to stop happening.

 

'Bout to switch teams out of pure fucking spite. I've had two 660s now with the exact same problem; they claim it's my power supply, but I tried switching it out, and nothing doing.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


This would be great... in a perfect world. In ours, only Intel might do something with it; I think there was already some old talk between Intel and AMD about Mantle use. But screw Nvidia, I can't stand their business attitude. They make great, overpriced products, nothing to argue there, but they just want all the cash. Give us gamers something more: make PhysX open once and for all. Games today still barely have any decent physics, and PhysX runs on, guess what, the CPU, which could make use of Mantle too.

Good news, I guess. Hopefully something happens; Mantle is great work, so why let it go to waste when it's there for free?

They want to make money!!! If Nvidia and Intel use Mantle, then every single game will have to support it, but right now only 1 in 100 games supports Mantle :D

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


Didn't know there were hundreds of AAA games being released yearly.

There are many, many games, and you can count the ones that support Mantle on your fingers.



I don't think Intel would hold a grudge about using it, but Nvidia...

Also, if Intel does use it, then MOAR performance on HD Graphics, which is a good thing, especially for NUCs.

My PC specs; Processor: Intel i5 2500K @4.6GHz, Graphics card: Sapphire AMD R9 Nano 4GB Overclocked @1050MHz Core and 550MHz Memory. Hard Drives: 500GB Seagate Barracuda 7200 RPM, 2TB Western Digital Green Drive, Motherboard: Asus P8Z77-V, Power Supply: OCZ ZS series 750W 80+ Bronze certified, Case: NZXT S340, Memory: Corsair Vengeance series RAM, Dual Channel kit @1866 MHz, 10-11-10-30 timings, 4x4 GB DIMMs. Cooler: CoolerMaster Seidon 240V


I haven't used any AMD products yet, but I love them. Why? Because they do things for everyone: they want the industry to be better, not just to earn more money themselves.


It's funny how people think AMD is doing this for the good of the industry. AMD only cares for the good of the industry when it increases their bottom line. They aren't saints, they are sharks. And as a company they are pretty shitty sharks given how much they earn.

Didn't they already give Mantle up to Khronos? Why would Nvidia, or anyone, need AMD now? They don't. This is PR posturing more than anything.

AMD is doing this to earn goodwill from both consumers and the industry. It gets developers into GCN, at least those who want to tinker with what is, truth be told, the most advanced API currently on the market... and it was only just in beta.

Look at what Google did with Android. Google developed an open-source OS, yet it's still Google who dictates the direction it takes; it's open source, but it's still Google's own OS.

No one is saying AMD are saints; what they are saying is that what AMD is doing is good, and that they are doing it in a good way: without crippling or hurting anyone in the process, free to be used by anyone who wants it, free of charge. From both a business perspective and a consumer perspective, it's a good thing.

AMD gave Mantle access to Microsoft and Khronos. They will implement it the way they want, but one thing remains: there will always be a connection between those APIs and Mantle, and if that means less effort for developers to implement any of those APIs, then it's a good thing.

If you have a limited vision, you might say: now that AMD gave Mantle to Microsoft and Khronos, Mantle is dead.

If you have a wider vision, you will say: now that Mantle is implemented in the two main APIs, AMD can keep pushing Mantle forward with both software and hardware features. AMD doesn't have to wait on Microsoft's and Khronos's slow pace; AMD dictates its own pace. If developers want to try such new features, implementing them will be almost effortless, because Mantle is part of the foundation of the standard APIs. If those features see good adoption, they will later trickle into DX and OpenGL, the same as happened with Mantle 1.0.

AMD just won their freedom when it comes to APIs. They have their own "DNA" in the standard APIs; now it's up to them to keep pushing Mantle forward, to keep it the cutting-edge API that inspires other APIs.

Will AMD be successful with this? Will they be able to keep Mantle relevant? Only time will tell; it's all up to them, for they are free to move in the direction they want without being dependent on DX or OpenGL. The beauty of it is that other hardware vendors will be able to make drivers for this, while AMD still supports both standard APIs. It's a win-win situation.


Let's remind ourselves for a moment of who has actually been sabotaging the performance of their competitor's products... hint: it's not AMD.

http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd

Sorry but you're dumb if you believe that.

All of those titles have mainly come from Ubisoft, and all of those games tanked on both teams.

You know, AMD has had their "Gaming Evolved" titles that are optimized for AMD video cards and no one goes on every AMD related post to point that out. They even put Mantle in certain titles. God forbid Nvidia gives developers their own secret sauce for games - but again these recent GameWorks titles have been from Ubisoft and they've sucked for everyone.

Like I told someone else on the forum, if Witcher 3 runs fine on both vendor's video cards, all of you should be forced to play the shitty Ass Creed: Unity exclusively for a whole year.


Sorry but you're dumb if you believe that.

All of those titles have mainly come from Ubisoft, and all of those games tanked on both teams.

You know, AMD has had their "Gaming Evolved" titles that are optimized for AMD video cards and no one goes on every AMD related post to point that out. They even put Mantle in certain titles. God forbid Nvidia gives developers their own secret sauce for games - but again these recent GameWorks titles have been from Ubisoft and they've sucked for everyone.

Like I told someone else on the forum, if Witcher 3 runs fine on both vendor's video cards, all of you should be forced to play the shitty Ass Creed: Unity exclusively for a whole year.

First thing: those titles came from Ubisoft and from Rocksteady Studios (working for Warner Bros.), and there was similar talk in the early months of Crysis. So it's not just a Ubisoft symptom.

You are missing the point: it's one thing to have games optimized for an IHV's hardware, which is completely fine. It's another thing to have middleware in games that cripples the performance of another IHV's hardware, which was the case pointed out.


First thing: those titles came from Ubisoft and from Rocksteady Studios (working for Warner Bros.), and there was similar talk in the early months of Crysis. So it's not just a Ubisoft symptom.

You are missing the point: it's one thing to have games optimized for an IHV's hardware, which is completely fine. It's another thing to have middleware in games that cripples the performance of another IHV's hardware, which was the case pointed out.

Aaaand you missed my point. It's not Nvidia's fault from what I'm seeing because both teams have had bad performance in these titles. The way Nvidia's effects are implemented is what's causing the issue for the red team, and the way (or lack of, rather) these developers are optimizing their games is showing that they just don't give a shit about framerate and that's bad for everyone.

Aaaand you missed my point. It's not Nvidia's fault from what I'm seeing because both teams have had bad performance in these titles. The way Nvidia's effects are implemented is what's causing the issue for the red team, and the way (or lack of, rather) these developers are optimizing their games is showing that they just don't give a shit about framerate and that's bad for everyone.

No, I didn't miss your point.

If you were informed on the subject, you would know that part of the problem is that, for a long time, not even developers had access to the GameWorks source code; they just had a .dll, the same thing AMD had access to. Why do you think developers called GameWorks "the black box"? Now NVIDIA has thrown some sand in the media's eyes by saying "developers have access to the source code if they pay for a license," yet those developers are still under an NDA and can't disclose the source code to AMD.

So NVIDIA can optimize the games themselves. Developers can now optimize a game for AMD hardware, but if they need help or support doing it, AMD can't do anything, because all AMD can see is a .dll. In other words, AMD can only look at a black box, one that can be changed by a simple update, making all of their effort worthless.

Can you see the catch here? I sure can. NVIDIA doesn't let AMD optimize these games for AMD's own hardware.

As for NVIDIA having issues: it's known that some of the GameWorks code itself is unoptimized (for example, over-tessellation), which takes a toll on any hardware, NVIDIA's included. But the hit is much harder on AMD hardware.

Edit: At least this is my perception of the issue; if I'm incorrect, please correct me.

 


To anyone and everyone who's saying "But they'll make money off it even if it's open source and free":

WOULD YOU HAVE LIKED IT MORE IF IT WERE PROPRIETARY AND COST MILLIONS TO LICENSE??? HOW MUCH "BETTER" WOULD MANTLE BE IF IT WERE PROPRIETARY LIKE GAMEWORKS AND PHYSX???

 

Sorry but you're dumb if you believe that.

All of those titles have mainly come from Ubisoft, and all of those games tanked on both teams.

You know, AMD has had their "Gaming Evolved" titles that are optimized for AMD video cards and no one goes on every AMD related post to point that out. They even put Mantle in certain titles. God forbid Nvidia gives developers their own secret sauce for games - but again these recent GameWorks titles have been from Ubisoft and they've sucked for everyone.

Like I told someone else on the forum, if Witcher 3 runs fine on both vendor's video cards, all of you should be forced to play the shitty Ass Creed: Unity exclusively for a whole year.

 

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

 

 

And no one cares about AMD optimizations harming Nvidia cards, because AMD either doesn't do that at all or does it waaaaay less than the stupid green team.

 

Aaaand you missed my point. It's not Nvidia's fault from what I'm seeing because both teams have had bad performance in these titles. The way Nvidia's effects are implemented is what's causing the issue for the red team, and the way (or lack of, rather) these developers are optimizing their games is showing that they just don't give a shit about framerate and that's bad for everyone.

 

I'm pretty sure it's not in a developer's best interest to add tessellation to the point that it hurts performance; and since NVIDIA had better tessellators than AMD, it hurt AMD cards more. How can you justify that as a game developer's intent?

Quote

The problem is that this is an nVidia product and scoring any nVidia product a "zero" is also highly predictive of the number of nVidia products the reviewer will receive for review in the future.

On 2015-01-28 at 5:24 PM, Victorious Secret said:

Only yours, you don't shitpost on the same level that we can, mainly because this thread is finally dead and should be locked.

On 2016-06-07 at 11:25 PM, patrickjp93 said:

I wasn't wrong. It's extremely rare that I am. I provided sources as well. Different devs can disagree. Further, we now have confirmed discrepancy from Twitter about he use of the pre-release 1080 driver in AMD's demo despite the release 1080 driver having been out a week prior.

On 2016-09-10 at 4:32 PM, Hikaru12 said:

You apparently haven't seen his responses to questions on YouTube. He is very condescending and aggressive in his comments with which there is little justification. He acts totally different in his videos. I don't necessarily care for this content style and there is nothing really unique about him or his channel. His endless dick jokes and toilet humor are annoying as well.

 

 


Should just wait for DX12 and hope it will be just as good. If DX12 is as good as promised, Mantle will be pointless.


Actions speak louder than words. AMD has always done things that benefited the entire industry instead of just themselves. Mantle, FreeSync and OpenCL are great examples of this. Nvidia on the other hand likes to keep things very much locked down and proprietary even if it's bad for their users or the industry as a whole.

G-Sync went into development before FreeSync. Nvidia pushed AMD.

Mantle is genuinely good now that anyone can use it.

OpenCL was started by Intel and was instrumental in their taking out IBM and Broadcom in the old server and supercomputer era.

Also, Nvidia contributes far more to driver development for Unix and BSD systems. AMD's graphics drivers suck for all things *nix and BSD.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I hope Mantle gets used by everyone; it would make game performance a lot better.

Potentially.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Actions speak louder than words. AMD has always done things that benefited the entire industry instead of just themselves. Mantle, FreeSync and OpenCL are great examples of this. Nvidia on the other hand likes to keep things very much locked down and proprietary even if it's bad for their users or the industry as a whole.

G-Sync was started by NVIDIA before AMD even mentioned anything about FreeSync. NVIDIA pushed AMD to release FreeSync; I bet NVIDIA used G-Sync to push the industry. OpenCL was not started or designed by AMD; Apple designed it. If you're talking about what AMD and NVIDIA have contributed to OpenCL, the answer is they both have done quite a bit. Here's the timeline of the vendor implementations. However, they did push Microsoft to release DX12; I'll give them that.



Good. I'll be curious to see if Intel does something to help leverage idle threads too. Let's hope other companies experiment with it! :)

 

Spoiler

Senor Shiny: Main- CPU Intel i7 6700k 4.7GHz @1.42v | RAM G.Skill TridentZ CL16 3200 | GPU Asus Strix GTX 1070 (2100/2152) | Motherboard ASRock Z170 OC Formula | HDD Seagate 1TB x2 | SSD 850 EVO 120GB | CASE NZXT S340 (Black) | PSU Supernova G2 750W  | Cooling NZXT Kraken X62 w/Vardars
Secondary (Plex): CPU Intel Xeon E3-1230 v3 @1.099v | RAM Samsung Wonder 16GB CL9 1600 (sadly no OC) | GPU Asus GTX 680 4GB DCII | Motherboard ASRock H97M-Pro4 | HDDs Seagate 1TB, WD Blue 1TB, WD Blue 3TB | Case Corsair Air 240 (Black) | PSU EVGA 600B | Cooling GeminII S524

Spoiler

(Deceased) DangerousNotDell- CPU AMD FX 8120 @4.8GHz 1.42v | GPU Asus GTX 680 4GB DCII | RAM Samsung Wonder 8GB (CL9 2133MHz 1.6v) | Motherboard Asus Crosshair V Formula-Z | Cooling EVO 212 | Case Rosewill Redbone | PSU EVGA 600B | HDD Seagate 1TB

DangerousNotDell New Parts For Main Rig Build Log, Señor Shiny

 


It makes pretty good business sense for AMD to do this, since it can get more developers to support Mantle. Then they port Mantle to Linux and have a huge leg up in selling their video cards for SteamOS systems. I really hope it catches on, so we can finally stop paying $100 for Windows just to play AAA games on better-than-console hardware.


I bet NVIDIA used G-Sync to push the industry.

 

If G-SYNC was really meant to push the industry, then why is it proprietary? Do you really think industry adoption is faster when the technology is proprietary and costly instead of free and open source?

 

And if they REALLY ACTUALLY made G-SYNC to push the industry, then I'll say AMD has done a waaaaay better job than NVIDIA by making FreeSync a VESA standard.


 

 


If G-SYNC was really meant to push the industry then why is it proprietary?

It's the same thing as Google Fiber. When Google released Fiber, other companies started releasing their own fiber connections. Before that, no company cared. G-Sync could've been a way to motivate AMD to create an open form of the same technology.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch

