
AMD Releasing Public Mantle SDK This Year, Encourages Nvidia and Intel to Use it... For Free.

TERAFLOP

Impressive, while 2 of them don't even need a PC to run it. Explain to me why BF4 runs like shit without Mantle. I'm seeing lots of people reporting CPU bottlenecks in DX mode that disappear entirely with Mantle, even on Intel CPUs like the 4670K/4770K. Rather than fixing the massive overhead their DX drivers suffer from, they're exploiting it to show off how great Mantle is. Let's not claim Mantle is AWESOME when their DX drivers are broken.

That's funny, because when I played BF4 I had the same FPS in Mantle as I did in DX11 mode. So please tell me how they are using "broken" DX11 drivers to make Mantle look better? Mind you, I was playing in multiplayer with 64 players.

i7 4770K @ 4.5GHZ, NH-D14, Kingston HyperX Black 8GB, Asus Z87-A, Fractal Design XL R2, MSI TF IV R9 280x, BTFNX 550G


Everyone here is already hoping for Nvidia to support Mantle, and I'm not following that logic. Even if Nvidia announced support a few days later, they would need to find a way to get more performance out of it, or that choice wouldn't make any sense. The fact that they've done no PR about this pretty much sums it up: Mantle has no use for them. Nvidia would happily adopt something that increased their GPUs' performance by a huge amount; sadly, Mantle isn't offering that.

 

Nvidia management has been all about proprietary solutions for quite some time now, and is using GameWorks to sabotage performance on AMD GPUs at the price of also sabotaging their own customers' performance, although to a lesser degree.

In that kind of environment there won't be much reflection about using Mantle. But not for technical reasons.

 

Begging brands to use Mantle, which isn't even open source so that everyone could improve the API, and which performs pretty much the same as DirectX, while not giving a damn about improving your DX overhead so you can show off how much of a difference Mantle makes for some nice PR, isn't pushing tech forward at all. Intel asked AMD to release their source code and AMD refused; that's what you call holding tech back.

If AMD now decided to release their source code so Nvidia could tweak it and make massive improvements for their OWN GPUs only, AMD would be fooled, which is why they aren't offering the source code. AMD is the only one here holding tech back for its own glory, not Nvidia or Intel. Why isn't AMD improving their DX drivers to the point where they outperform Nvidia in CPU-limited scenarios, so we can see Nvidia responding and trying to release a driver with better performance?

 

You are mixing up different concepts here. API standard <> open source. There can be open source implementations of a standard; in the case of OpenGL, for example, there is Mesa.

But a standard by itself is just documentation about what the interface looks like and how it should behave.

That AMD hasn't released the Mantle documentation so far is a different discussion.

But if they ever want broad usage of it, they will have to put it out there. And then everybody could create their own implementation of it, assuming it isn't designed too much around AMD hardware-specific quirks.

That last point, and the question of who would decide future changes to the API, would be the real discussion if there were any real expectation that Nvidia will support Mantle.
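To make that distinction concrete, here's a minimal sketch in C (every name is invented for illustration; nothing here comes from the actual Mantle or OpenGL headers). The "standard" is just the declared interface, and any vendor, or an open-source project the way Mesa does it for OpenGL, can ship its own implementation behind it.

```c
/* hypothetical_api.h -- the "standard": only declarations, no vendor code.
   Every name here is invented for illustration. */
#ifndef HYPOTHETICAL_API_H
#define HYPOTHETICAL_API_H

typedef struct hypDevice hypDevice;            /* opaque handle */

int  hypCreateDevice(hypDevice **out_device);  /* returns 0 on success */
void hypDestroyDevice(hypDevice *device);

#endif

/* vendor_a.c -- one possible implementation of that same interface.
   Another vendor (or an open-source project) could ship a completely
   different file behind the same header and still conform to the spec. */
#include <stdlib.h>

struct hypDevice { int internal_state; };

int hypCreateDevice(hypDevice **out_device)
{
    *out_device = malloc(sizeof **out_device);
    if (*out_device == NULL)
        return -1;
    (*out_device)->internal_state = 0;
    return 0;
}

void hypDestroyDevice(hypDevice *device)
{
    free(device);
}
```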

 


That's funny, because when I played BF4 I had the same FPS in Mantle as I did in DX11 mode. So please tell me how they are using "broken" DX11 drivers to make Mantle look better? Mind you, I was playing in multiplayer with 64 players.

 

Don't feed the trolls/fanboys/flamebait generators ;).


Until the source code is provided, not just the development kit, they really shouldn't be celebrated as proponents of open-source.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Good guy AMD

CPU: AMD Ryzen 7 3800X GPU: ASUS Strix Radeon R7 5700 XT OC Edition Motherboard: ASUS Crosshair VIII Hero RAM: G.Skill Trident Z DDR4-3200 CL14 SSD: 2x Samsung 970 EVO Plus 1TB PSU: Corsair AX860 Cooling: NZXT Kraken X72 Case: Corsair Crystal Series 570X Mirror Black Display: BenQ XL2420Z  Keyboard: Corsair K95 RGB Cherry MX Blue Mouse: Logitech G502 Proteus Spectrum Audio: Logitech G633 Artemis Spectrum


The only way NV will admit Mantle is great and adopt Mantle (indirectly/secretly) is by adopting DirectX 12. Cause DirectX 12 is basically DirectX 11.x + Mantle.

 

It's more like DX12 = DX11.3 + "big Mantle inspiration"... I won't say it's just Mantle, because people will call me out for it, angrily.

 

Ooops. Someone already said it.


Like I said, NVIDIA is being successful doing what they are doing. That doesn't mean AMD would do the same, nor does it mean AMD would be successful with such a model.

If you take a step or two back you can see the whole picture: where the companies were, what they did, what they are doing, and where they are heading. For all three major players, NVIDIA, Intel, and AMD, the paths are clear. To claim one would take another's path just doesn't make much sense, because they are too different from each other. All of them have had setbacks.

Yet all with the same goal: to make money.

What AMD Radeon is doing is spreading GCN as far and wide as they can (lower margins), so they can profit from it later through high volume, both in in-house APUs and GPUs and in semi-custom designs. NVIDIA seems to be focused on higher margins, and they create more value in their ecosystem so their clients spend more with them (G-Sync, Shield, even their remote control only works on NVIDIA hardware, or at least it did). Both are valid and proven business models. Of course it's not as simple as this; I'm just simplifying.

 

 

I agree with all that, that isn't the point I was trying to make.

 

 

We were discussing open source drivers. Get a life and stop trolling me. Furthermore, they actually are relevant to Nvidia's bottom line. If Nvidia's cards work better, their compute power is higher, and they're more appealing to makers of supercomputers.

 

Oh, someone's having trouble keeping up again, I see.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


The only way NV will admit Mantle is great and adopt Mantle (indirectly/secretly) is by adopting DirectX 12. Cause DirectX 12 is basically DirectX 11.x + Mantle.

 

 

Ooops. Someone already said it.

 

Oh please not this again, it has been done to death in so many other threads already.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Until the source code is provided, not just the development kit, they really shouldn't be celebrated as proponents of open-source.

 

It's not even out yet, why are you complaining already?


It's not even out yet, why are you complaining already?

SDKs are generally binary files you can call functions from. They're basically black boxes.
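As a rough, Linux-only sketch of what "black box" means in practice: you can load a prebuilt binary (here glibc's libm, just as a stand-in for any vendor SDK) and call a function out of it through dlopen/dlsym, but you have no way to see or change what that function does internally.

```c
/* Minimal sketch of calling into a binary you have no source for: load
   libm at runtime and call cos() through a function pointer. A vendor SDK
   works the same way, just with its own headers and .lib/.dll files. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("libm.so.6", RTLD_LAZY);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
    if (!cosine) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    /* We can call it, but the implementation is a black box to us. */
    printf("cos(0.0) = %f\n", cosine(0.0));

    dlclose(handle);
    return 0;
}
```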

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Oh please not this again, it has been done to death in so many other threads already.

 

I understand. But I'm just saying it for the record. :rolleyes:

Now if anyone has a hundred dollars to spare, I want to read this article: http://semiaccurate.com/2014/03/18/microsoft-adopts-mantle-calls-dx12/

Again, just for the record. Please ignore my post if you don't have a hundred dollars to spare! :lol:


The only way NV will admit Mantle is great and adopt Mantle (indirectly/secretly) is by adopting DirectX 12. Cause DirectX 12 is basically DirectX 11 + Mantle.

 

Bingo. Mantle uses DX library effects. It is the same damn thing as DX 12 will be, except Nvidia will be supported.

 

http://www.eteknix.com/amd-demonstrates-mantle-can-be-easily-ported-to-directx-12/

 

[Image: amd_mantle-dx12_port.png]

 

I use Mantle in Sniper Elite 3 on my R9 290 Tri-X. It runs flawlessly: I saw FPS increases on my overclocked 4770K, and I run the game at 4xSSAA (the same thing as native 4K). Runs like a champ. With a mild overclock on my R9 290, my FPS in the game's included benchmark is a little higher than these guys got on their 290X.

 

http://www.pcper.com/reviews/Graphics-Cards/Sniper-Elite-3-Performance-Maxwell-vs-Hawaii-DX11-vs-Mantle

 

But yeah "Mantle Sucks". I am only running a game at 4k resolution of the consoles at damn near 60 FPS on a GPU that cost 290 bucks...Oh look. 4xMSAA on Dragon's Age Inquisition, instead of FXAA like AC Unity. I don't think anyone here plays MMO's like Guild Wars 2 and understands just how badly we need a low level API on certain types of PC exclusives. Mantle is being used in Star Citizen for a reason and Roberts said the game was CPU bound for a reason. My 4770k at 4.5ghz plays Guild Wars 2 like crap at times. We need a low level API for computer exclusives even on GOOD cpu's. Console ports? That is the fault of the developer. We see gains on GOOD CPU's but they are not giant. They are still gains though... On a very CPU bound game like DA I? My 4770k doesn't see huge swings. An I7-920 OVERCLOCKED sees 40 percent gains. So yeah...I will gladly take 5 FPS more lows on a 4.5ghz 4770k. 

 

http://www.pcgameshardware.de/Dragon-Age-Inquisition-PC-236767/Specials/Technik-Test-1142136/

 

[Image: Benchmarks_Dragon_Age_1080p Mantle benchmark chart]

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


SDKs are generally binary files you can call functions from. They're basically black boxes.

Hmm. I'm curious. Isn't Gameworks released to devs as an SDK? If so, then devs couldn't do anything if a Gameworks effect used tessellation like crazy.

I'm still trying to pinpoint the root cause of Batman Arkham Origins and Crysis 2 over-tessellation reports. Was it Gameworks effects itself? Or were the devs paid to do that kind of shit?


I understand. But I'm just saying it for the record. :rolleyes:

Now if anyone has a hundred dollars to spare, I want to read this article: http://semiaccurate.com/2014/03/18/microsoft-adopts-mantle-calls-dx12/

Again, just for the record. Please ignore my post if you don't have a hundred dollars to spare! :lol:

 

Sorry, I don't have a hundred dollars, and even if I did I would seriously think twice about wasting it on SemiAccurate.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Hmm. I'm curious. Isn't Gameworks released to devs as an SDK? If so, then devs couldn't do anything if a Gameworks effect used tessellation like crazy.

I'm still trying to figure out the Batman Arkham Origins and Crysis 2 over-tessellation reports. Was it Gameworks effects itself? Or were the devs paid to do that kind of shit?

 

To put it simply? The two cards are different hardware, and each is better at different things. Nvidia has the GameWorks library do everything AMD is worse at, in stupidly high amounts, with damn near no visual payoff for either piece of hardware. AMD could do the same thing. AMD also does some things much better: higher AA/downsampling/supersampling thanks to bandwidth (and that has nothing to do with screwing over Nvidia), and certain compute workloads (which they could screw over Nvidia with).

Nvidia is also screwing over their own customers though, which makes it even funnier. You are doing things differently than the console version for no reason at all, with higher overhead. This game (ACU) doesn't even have freaking reflections in a mirror when you look at it, lol, just like Watch Dogs.

AC Unity performance? It should look like DA:I performance. A single powerful card should be doing in AC Unity what it does in DA:I, at 4x SMAA, or 2x SMAA at minimum, for 60 FPS. DA:I is just as hard on the CPU.

It literally takes something like GameWorks to keep either the top AMD or the top Nvidia GPU from getting anywhere near 4x the resolution, FPS, or other settings.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Hmm. I'm curious. Isn't Gameworks released to devs as an SDK? If so, then devs couldn't do anything if a Gameworks effect used tessellation like crazy.

I'm still trying to pinpoint the root cause of Batman Arkham Origins and Crysis 2 over-tessellation reports. Was it Gameworks effects itself? Or were the devs paid to do that kind of shit?

Gameworks is generally optimized, but it doesn't have the fine granularity a really good programmer would need.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


To put it simply? The two cards are different hardware, and each is better at different things. Nvidia has the GameWorks library do everything AMD is worse at, in stupidly high amounts, with damn near no visual payoff for either piece of hardware. AMD could do the same thing. AMD also does some things much better: higher AA/downsampling/supersampling thanks to bandwidth (and that has nothing to do with screwing over Nvidia), and certain compute workloads (which they could screw over Nvidia with).

Nvidia is also screwing over their own customers though, which makes it even funnier. You are doing things differently than the console version for no reason at all, with higher overhead. This game (ACU) doesn't even have freaking reflections in a mirror when you look at it, lol, just like Watch Dogs.

AC Unity performance? It should look like DA:I performance. A single powerful card should be doing in AC Unity what it does in DA:I, at 4x SMAA, or 2x SMAA at minimum, for 60 FPS. DA:I is just as hard on the CPU.

It literally takes something like GameWorks to keep either the top AMD or the top Nvidia GPU from getting anywhere near 4x the resolution, FPS, or other settings.

AMD's theoretical compute is better, but real-world compute is in Nvidia's favor. The only exception is coin mining, where AMD happens to have dedicated hardware for some operations that the hashing algorithms use heavily. If AMD's compute were better than Nvidia's, you'd see more FirePros used in supercomputers.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AMD's theoretical compute is better, but real-world compute is in Nvidia's favor. The only exception is coin mining, where AMD happens to have dedicated hardware for some operations that the hashing algorithms use heavily. If AMD's compute were better than Nvidia's, you'd see more FirePros used in supercomputers.

 

Depends on the compute. Take Nvidia HairWorks (which is straight-up DirectCompute): if the library were open (which it should be), performance should look like this. The old Titan is actually MUCH better than the 980. Both cards have strengths. My R9 290 craps on GTX 970s in ubersampling/downsampling, and I really love to do that with old games. The 970 would be better for DirectCompute. AMD killed it with Litecoin mining. The Titan is still freaking awesome for DirectCompute, which is why those things cost so damn much and people used them in place of enterprise cards in Hackintoshes and the like.

 

http://www.videocardbenchmark.net/directCompute.html

 

On an honest port, Nvidia GameWorks should only be used for the additional stuff. That, along with PhysX, should be the only difference, and only if those options are chosen over higher res/AA. Just like in Tomb Raider: the performance discrepancy only happens when TressFX is on. When it's off? Whichever is the better DX GPU wins, and that can change depending on what AA/supersampling/downsampling you are using.

 

I often skipped over Physx for higher AA/res (on my GTX 770). People do the same with Tress FX and Tomb Raider.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Sorry, I don't have a hundred dollars, and even if I did I would seriously think twice about wasting it on SemiAccurate.

 

Of course. Keep thinking. Eventually I hope you'll come to a conclusion. -_-

Don't worry, I've gone through the same process too. It's not like you're the first.


That's funny, because when I played BF4 I had the same FPS in Mantle as I did in DX11 mode. So please tell me how they are using "broken" DX11 drivers to make Mantle look better? Mind you, I was playing in multiplayer with 64 players.

When it didn't kick in, yeah. Go Google driver overhead tests; there are plenty available.

 

 

Nvidia management has been all about proprietary solutions for quite some time now, and is using GameWorks to sabotage performance on AMD GPUs at the price of also sabotaging their own customers' performance, although to a lesser degree.

In that kind of environment there won't be much reflection about using Mantle. But not for technical reasons.

 

 

You are mixing up different concepts here. API standard <> open source. There can be open source implementations of a standard; in the case of OpenGL, for example, there is Mesa.

But a standard by itself is just documentation about what the interface looks like and how it should behave.

That AMD hasn't released the Mantle documentation so far is a different discussion.

But if they ever want broad usage of it, they will have to put it out there. And then everybody could create their own implementation of it, assuming it isn't designed too much around AMD hardware-specific quirks.

That last point, and the question of who would decide future changes to the API, would be the real discussion if there were any real expectation that Nvidia will support Mantle.

Not sure what you're trying to say, and Nvidia wasn't sabotaging AMD's performance with GameWorks. It's AMD's own fault for having huge driver overhead.


Of course. Keep thinking. Eventually I hope you'll come to a conclusion. -_-

Don't worry, I've gone through the same process too. It's not like you're the first.

 

And what process do you think I am going through?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I'm guessing with DirectX 12 on the horizon AMD wants to incorporate Mantle into whatever they can before it becomes irrelevant.

 

DirectX 12 is a low-level API too, and it was developed in cooperation with AMD.


But yeah "Mantle Sucks". I am only running a game at 4k resolution of the consoles at damn near 60 FPS on a GPU that cost 290 bucks...Oh look. 4xMSAA on Dragon's Age Inquisition, instead of FXAA like AC Unity. I don't think anyone here plays MMO's like Guild Wars 2 and understands just how badly we need a low level API on certain types of PC exclusives. Mantle is being used in Star Citizen for a reason and Roberts said the game was CPU bound for a reason. My 4770k at 4.5ghz plays Guild Wars 2 like crap at times. We need a low level API for computer exclusives even on GOOD cpu's. Console ports? That is the fault of the developer. We see gains on GOOD CPU's but they are not giant. They are still gains though... On a very CPU bound game like DA I? My 4770k doesn't see huge swings. An I7-920 OVERCLOCKED sees 40 percent gains. So yeah...I will gladly take 5 FPS more lows on a 4.5ghz 4770k.

 

Yup, and with the right implementations and the right optimizations, we get something like this: http://semiaccurate.com/2014/10/28/look-civilization-beyond-earth/

[Image: Mantle-Civ.png]


 

When it didn't kick in, yeah. Go Google driver overhead tests; there are plenty available.

 

 

Not sure what you're trying to say, and Nvidia wasn't sabotaging AMD's performance with GameWorks. It's AMD's own fault for having huge driver overhead.

 

 

That is literally the stupidest thing I have ever heard, Faa. AMD can't access the library, so any optimization is guesswork. Do not pass go. It is literally that simple. This only happens in GameWorks games which use the library at their core, and it did not start with Ubisoft.

 

http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd

 

You are saying that a GTX 770 should perform like an R9 290. That is straight-up asinine. I just upgraded from one to the other and the GPUs are not close... at all. Not even a little bit. One can run Shadow of Mordor, which uses a customized ASSASSIN'S CREED ENGINE, at 4K at 30+ FPS, and the other can't even come close, even at much lower settings where the VRAM is less of a problem.

 

It's like comparing a Chevette to a Camaro in a race and then saying the Camaro sucks because it has to run the race with flat tires while pulling a boat. I guess Monolith has...

 

[GIF: its-magic-shia-labeouf-gif.gif]

 

Funny thing? That game has an Nvidia splash logo on the screen, and the GTX 980 gets beaten at 4K...

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.

