Mantle Will Boost Performance By 20-50%, Nvidia Can Add Support.

It makes it harder, but it does make it better if it is used correctly. I have yet to see a game which fully utilizes PhysX and delivers a better game experience with it.

The problem with that is that they then reduce the non-PhysX-related effects, so anyone with a low-end Nvidia card or an AMD card will be less likely to buy the game. Mantle improves performance; PhysX alienates many gamers and makes it a lot harder to port to console, which is where all the money is. In theory PhysX is awesome, but the reality is that it's not used enough, and not used well enough.

cpu: intel i5 4670k @ 4.5ghz Ram: G skill ares 2x4gb 2166mhz cl10 Gpu: GTX 680 liquid cooled cpu cooler: Raijintek ereboss Mobo: gigabyte z87x ud5h psu: cm gx650 bronze Case: Zalman Z9 plus


Listen if you care.

Cpu: intel i7 4770k @ 4.2ghz Ram: G skill  ripjaws 2x4gb Gpu: nvidia gtx 970 cpu cooler: akasa venom voodoo Mobo: G1.Sniper Z6 Psu: XFX proseries 650w Case: Zalman H1

The problem with that is that they then reduce the non-PhysX-related effects, so anyone with a low-end Nvidia card or an AMD card will be less likely to buy the game. Mantle improves performance; PhysX alienates many gamers and makes it a lot harder to port to console, which is where all the money is. In theory PhysX is awesome, but the reality is that it's not used enough, and not used well enough.

Exactly why I said it needs to be on a good game, but that means being backed by Nvidia and throwing away the AMD and Intel consumers,

even if the next-gen consoles can handle PhysX.

Exactly why I said it needs to be on a good game, but that means being backed by Nvidia and throwing away the AMD and Intel consumers,

even if the next-gen consoles can handle PhysX.

But sadly PhysX has never been at the point where we want it to be. In tech demos it's amazing, but it destroys performance. Bottom line is that many Nvidia users don't give two shits about it anymore.

cpu: intel i5 4670k @ 4.5ghz Ram: G skill ares 2x4gb 2166mhz cl10 Gpu: GTX 680 liquid cooled cpu cooler: Raijintek ereboss Mobo: gigabyte z87x ud5h psu: cm gx650 bronze Case: Zalman Z9 plus


Listen if you care.

Cpu: intel i7 4770k @ 4.2ghz Ram: G skill  ripjaws 2x4gb Gpu: nvidia gtx 970 cpu cooler: akasa venom voodoo Mobo: G1.Sniper Z6 Psu: XFX proseries 650w Case: Zalman H1

Meh... AMD needs G-Sync more than Nvidia needs Mantle.

DICE's Johan Andersson discussing Mantle. Apparently future support for both newer AMD architectures and Nvidia would be possible. He also mentions that it's easier to use than OpenGL and could make a potentially good combination with Linux and SteamOS.

http://techreport.com/news/25651/mantle-to-power-15-frostbite-games-dice-calls-for-multi-vendor-support

Guys, come on...

 

Mantle isn't something that couldn't be done on an NVIDIA card. Hell, DICE already said that Frostbite includes some low-level NVIDIA API stuff...

 

You think all the major developers can just ignore this (2/3 of the GPUs aren't AMD ones)?

AMD might have an advantage for half a year or maybe a whole year...

But up till now we haven't even seen any improvements, only big marketing words...

They already explained how it works in detail.

You can't say how much performance you will get from Mantle, because it depends on the dev, just like it does on consoles.

Some devs get 20%, others get 2x performance. It all depends on the effort of the dev.

And it will get even faster over time, because once you have it in your engine you can extend it from that point more and more.

It can also use your onboard GPU for stuff like post-processing, which is huge, because we all have onboard GPUs on our CPUs but we don't even use them.
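Just to picture that onboard-GPU idea, here's a purely hypothetical sketch, not real Mantle code (the API itself isn't public, and every type and function name below is invented for illustration): with an explicit API the application, not the driver, decides which device runs which work, so a post-processing pass could in principle be queued on the integrated GPU while the discrete GPU is already busy with the next frame.

```cpp
#include <vector>

// Hypothetical explicit-API sketch; none of these types or calls are real Mantle API.
// The point: the application, not the driver, chooses which device runs which work.
struct CommandBuffer { std::vector<int> commands; };     // stand-in for recorded GPU work

struct Device {
    CommandBuffer recordScenePass()   { return {}; }     // heavy main rendering (stub)
    CommandBuffer recordPostProcess() { return {}; }     // lighter post-processing (stub)
    void submit(const CommandBuffer&) {}                  // no per-call driver validation
    void waitIdle() {}                                    // explicit sync, chosen by the app
};

void renderFrame(Device& discreteGpu, Device& integratedGpu) {
    // Discrete GPU renders the scene for frame N...
    discreteGpu.submit(discreteGpu.recordScenePass());
    // ...while the integrated GPU post-processes frame N-1 in parallel.
    integratedGpu.submit(integratedGpu.recordPostProcess());
    // The app decides when to wait; frame N-1 is then ready to present.
    integratedGpu.waitIdle();
}

int main() {
    Device discrete, integrated;
    renderFrame(discrete, integrated);
}
```

How much that actually buys you depends entirely on how the developer splits and synchronizes the work, which is exactly the "no fixed gain" point.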

And there is no way that Nvidia will not take it if AMD allows them to use it.

Because this would mess up the complete Nvidia lineup, where they would have to drop the price of a GTX 780 to $300 or less.

I have a feeling that they will have Mantle support with Maxwell.

 

RTX2070OC 

If it's developed in Mantle, it's hard to compare to DX, because if they use Mantle and get more performance headroom on the CPU or GPU, they can keep adding more objects in development under Mantle vs DX and keep the performance target the same.

Like the draw-call comparison to consoles... they get more information on screen delivered from a lower-end GPU equivalent.
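A rough way to see the "same performance target, more objects" argument (the per-call costs below are invented for illustration, not measured numbers for DirectX or Mantle):

```cpp
#include <cstdio>

// Back-of-envelope draw-call budget at a fixed frame-rate target.
// Both per-call CPU costs are assumptions, picked only to show the shape of the argument.
int main() {
    const double frameBudgetMs    = 1000.0 / 60.0;        // ~16.7 ms per frame at 60 fps
    const double cpuShareMs       = frameBudgetMs * 0.5;  // assume half the frame goes to issuing draws
    const double costPerCallThick = 0.010;                // ms per draw call, "thick" driver path (assumed)
    const double costPerCallThin  = 0.002;                // ms per draw call, "thin" explicit path (assumed)

    std::printf("thick API: ~%.0f draw calls per frame\n", cpuShareMs / costPerCallThick);
    std::printf("thin API:  ~%.0f draw calls per frame\n", cpuShareMs / costPerCallThin);
    // Same 60 fps target, but the thin path leaves room for roughly 5x more draw calls,
    // i.e. more objects on screen from equivalent hardware.
    return 0;
}
```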

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 

There is this great news from the tech world, and how does the forum reply?

With a bunch of hardcore Nvidia "fanboys" (a.k.a. people who cannot think clearly) hoarding the topic with posts like "duhh nv can use dis so amd sux and yay dey got a fanboy".

Really now, really...

By the way they're spelling these, I don't even think those "fanboys" are buying the GPUs themselves...

It's a great f***ing piece of technology that pushes the industry forward once more. Leave it at that, unless NV's willing to license out G-Sync to the red team too.

Please, no fanboyism.

Meh... AMD needs G-Sync more than Nvidia needs Mantle.

No, it's the opposite. Most people look at benchmarks to buy GPUs; when they see an R9 280X ($300) getting close to a Titan/780 ($700+), they'll get the AMD card.

And that is only assuming a 30% increase.

G-Sync isn't something people relate to the way they do a raw fps number. You cannot give it a value unless you try it.

And who needs G-Sync if you can get 60 fps all the time...

Idk if you realise it, but a 50% (optimistic) fps boost at launch means any GPU that was running at 40 fps will cap most screens' refresh rate (40 fps × 1.5 = 60 fps, the refresh ceiling of most monitors).

Anything I write is just a comment, take it as such; there are no guarantees associated with anything I say.

ATX Portable rig (smaller than prodigy(LOL)) :  Nmedia 2800 | Gigabyte Z77x-ud3h  | Corsair HX1000 | Scythe Big Shuriken | i5 3570K  |  XFX R9 290 DoubleD | Corsair Vengeance 32GB

Idk if you realise it, but a 50% (optimistic) fps boost at launch means any GPU that was running at 40 fps will cap most screens' refresh rate.

And don't forget, if it's 50% at launch, the next Frostbite game will start from there and could get even higher performance if they add more to it. O_O

RTX2070OC 

my 7970s and 7870 are happy! (different systems)

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB

No, it's the opposite. Most people look at benchmarks to buy GPUs; when they see an R9 280X ($300) getting close to a Titan/780 ($700+), they'll get the AMD card.

And that is only assuming a 30% increase.

G-Sync isn't something people relate to the way they do a raw fps number. You cannot give it a value unless you try it.

And who needs G-Sync if you can get 60 fps all the time...

Idk if you realise it, but a 50% (optimistic) fps boost at launch means any GPU that was running at 40 fps will cap most screens' refresh rate.

60 fps with G-Sync will be flawless... hell, even 30-50 fps would look pretty good... while AMD Mantle will be screen-tearing and stuttering like mad, even if they get their untested and unproven boost in performance.

G-Sync is proven... Mantle is not.

When we have 4K monitors and cards barely pushing 60 fps... G-Sync will be necessary.

60 fps with G-Sync will be flawless... hell, even 30-50 fps would look pretty good... while AMD Mantle will be screen-tearing and stuttering like mad, even if they get their untested and unproven boost in performance. G-Sync is proven... Mantle is not. When we have 4K monitors and cards barely pushing 60 fps... G-Sync will be necessary.

But then again, there are just 16 days until we see proof, so let's not get ahead of ourselves calling it bad.

Even so, about what you said: if G-Sync won't be on AMD cards too, then it will still suck. But for lower-end graphics cards that manage 21-30 fps (in some games) even without G-Sync, a 5-10 fps improvement will be awesome for a lot of people.

CPU: AMD Ryzen 9 5900X 12; GPU: GeForce RTX 3080 Gigabyte Vision OC V2 10GB; PSU: EVGA 750W 80+ Gold Certified; RAM: 4x32GB (w/RGB xd); SSD: 1xM.2 Samsung 980 Pro 1TB, 1xM.2 Samsung 970 Pro 1TB, 1xWD 6TB HDD; OS: 10; Monitor: 2xAorus IPS 27" (2560x1400)Keyboard: Corsair K95; Mouse: Mionix Naos 7000 w/ Steelseries QcK mousepad.

Laptop - HP Omen 15" w/5800U, GPU 3070, 1TB M.2 WD Black, 16GB RAM.

60 fps with G-Sync will be flawless... hell, even 30-50 fps would look pretty good... while AMD Mantle will be screen-tearing and stuttering like mad, even if they get their untested and unproven boost in performance. G-Sync is proven... Mantle is not. When we have 4K monitors and cards barely pushing 60 fps... G-Sync will be necessary.

So you mean... in a few years, when 4K monitors become reasonable for most people to buy, graphics cards by then will not have improved enough to push a consistent 60+ FPS at 4K resolution? <sarcastic tone>Yeah, you're right - G-Sync will be necessary.</sarcastic tone> Seriously, if you buy your card now and do no upgrades, then yeah, G-Sync will probably be required in 2-4 years when most people start buying 4K monitors. Otherwise, I imagine the cards of that day will be able to run a 4K monitor at 60 fps. Now, if they can manage affordable 4K 120/144 Hz in that time, then G-Sync will probably be needed.

Amazing news. Now Mantle is a real competitor to G-Sync. Now all we need is G-Sync for AMD! :)

Honestly, I don't see why everyone is comparing a new API to monitor refresh-rate scaling.

60 fps with G-Sync will be flawless... hell, even 30-50 fps would look pretty good... while AMD Mantle will be screen-tearing and stuttering like mad, even if they get their untested and unproven boost in performance. G-Sync is proven... Mantle is not. When we have 4K monitors and cards barely pushing 60 fps... G-Sync will be necessary.

Except for the part that when you get 60 fps (with vsync), G-Sync does nothing and the image will not tear noticeably. G-Sync only slows your panel down to match your card.

Mantle is free; G-Sync isn't.

Idk why people whine about free stuff.

Also, another thing you said is that G-Sync is proven while Mantle is not. Breaking news for you: G-Sync isn't out until next year; Mantle will be out in 1 month.

Idk if you realise it, but no logical person makes buying decisions based on these features until they're out, so calm way down and wait.
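A toy model of that first point (made-up numbers, not measurements of either technology): with fixed 60 Hz vsync a finished frame waits for the next refresh tick, while a variable-refresh display shows it as soon as it's done, so the two only diverge once the GPU drops below the refresh rate.

```cpp
#include <cmath>
#include <cstdio>

// Toy model: how long until a finished frame reaches the screen.
// Fixed vsync rounds up to the next 60 Hz refresh tick; a variable-refresh
// (G-Sync-style) display refreshes the moment the frame is ready.
int main() {
    const double refreshMs = 1000.0 / 60.0;
    const double renderTimesMs[] = {1000.0 / 60.0, 20.0, 25.0, 33.3};  // 60, 50, 40, 30 fps

    for (double render : renderTimesMs) {
        double fixedVsync = std::ceil(render / refreshMs) * refreshMs;  // wait for the next tick
        double variable   = render;                                     // present immediately
        std::printf("render %5.1f ms -> vsync presents every %5.1f ms, variable refresh every %5.1f ms\n",
                    render, fixedVsync, variable);
    }
    // At exactly one refresh interval (a held 60 fps) the two columns match,
    // which is the claim above: hold your refresh rate and adaptive sync has nothing left to fix.
    return 0;
}
```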

Anything I write is just a comment, take it as such; there are no guarantees associated with anything I say.

ATX Portable rig (smaller than prodigy(LOL)) :  Nmedia 2800 | Gigabyte Z77x-ud3h  | Corsair HX1000 | Scythe Big Shuriken | i5 3570K  |  XFX R9 290 DoubleD | Corsair Vengeance 32GB

Honestly, I don't see why everyone is comparing a new API to monitor refresh-rate scaling.

Well, both are exciting technologies bringing us more fluid gaming, of course in different ways. But if we had both, on both platforms, that would be amazing. For now, you can only choose one :/

Except for the part that when you get 60 fps (with vsync), G-Sync does nothing and the image will not tear noticeably. G-Sync only slows your panel down to match your card.

Mantle is free; G-Sync isn't.

Idk why people whine about free stuff.

Also, another thing you said is that G-Sync is proven while Mantle is not. Breaking news for you: G-Sync isn't out either.

Idk if you realise it, but no logical person makes buying decisions based on these features until they're out, so calm way down and wait.

Even with 60 fps and V-sync (which is impossible to get), it wouldn't be in sync with the monitor.

There would still be timing differences, which cause input lag. (G-Sync is the only solution for a truly stutter- and lag-free picture.)

And G-Sync is proven, because they already showcased it.

And Mantle is also proven, because it's just a low-level API, nothing more, which has been in consoles for years now.

It does the same thing as a console API does.

All gains come from the developer, not from the API. Mantle has no fixed performance gain!!!

 

RTX2070OC 

Comparing some of these technologies is pretty silly...

Do I want to look good wearing this shirt... or a single shoe?

Some things just shouldn't be compared.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 

 

This hurts my heat enough to lay down....

You need better sentence structure.

FICTION AT THIS POINT, with a tiny bit of fact... being that it exists... and only a few people have seen it. What if they decide not to release it due to a flaw... but it's been announced... Still fiction...

So until the people who have used it post here and comment on their experience, or YOU experience it yourself and post your thoughts on the benefits for gamers...

Everyone should have a nice big cup of.........

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 

This hurts my heat enough to lay down....

You need better sentence structure.

FICTION AT THIS POINT, with a tiny bit of fact... being that it exists... and only a few people have seen it. What if they decide not to release it due to a flaw... but it's been announced... Still fiction...

So until the people who have used it post here and comment on their experience, or YOU experience it yourself and post your thoughts on the benefits for gamers...

Everyone should have a nice big cup of.........

G-Sync has already been released for one ASUS monitor and as upgrade kits.

And Mantle has no fixed performance gains, so it doesn't matter how much it gains in Battlefield 4, because you could get double that in another game, or only half of it.

RTX2070OC 

AMD is such a bro.

| 5820k+EK supremacy nickel+acetal white 4.5Ghz | X99 Deluxe | Enthoo Luxe | 2x gtx780+komod NV full cover block | Corsair AX1200i | WD blue 500gb |

| Kingston V300 120gb | Samsung 840 Evo 500gb | Bitspower D5 vario+Res combo | primochill advanced LRT tubing (Solid White) |

| Alphacool Nexxos MONSTA dual 120mm Black Ice nemesis GTX360 triple 120mm | Noctua NF-F12 X4 | Bitspower true silver 1/2ID 3/4 OD compressions (various angles) |

A 20% boost is very good if that's a worst-case scenario; that's more than enough to allow a card like the 7950 running Mantle to surpass a 7970 running OpenGL/DirectX.

 

The biggest revelation for me was the quote below:

"Mantle can basically “see” multiple GCN-based GPUs as a single GPU"

This will work wonders for CrossFire users.

*drool comes from mouth* If that's true then, man, I'll never have to replace my quadfire 7970s.
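For what that single-GPU illusion could look like from the engine's side, here's a hypothetical sketch of app-driven alternate-frame rendering under an explicit API (every type and name is invented; AMD hasn't detailed how Mantle actually exposes multiple GPUs):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical sketch: with an explicit API the engine itself could spread frames
// across several GPUs (alternate-frame rendering). None of this is real Mantle code.
struct Gpu {
    int id;
    void renderFrame(int frame) { std::printf("GPU %d renders frame %d\n", id, frame); }
};

int main() {
    std::vector<Gpu> gpus = {{0}, {1}, {2}, {3}};   // e.g. quadfire 7970s
    for (int frame = 0; frame < 8; ++frame) {
        // Round-robin: each GPU takes every 4th frame, so to the game the four
        // cards behave like one device with (roughly) four times the throughput.
        gpus[frame % gpus.size()].renderFrame(frame);
    }
    return 0;
}
```

In practice scaling like that depends on the engine keeping the GPUs fed and in sync, so "4x" is the ceiling, not a promise.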

As I posted in another post:

It's not a matter of trusting Linus or not. Sure, it can make frame tearing and the visual look of fps jumping up and down feel less extreme, but until I play with it, I'll argue that it won't feel smooth.

My logic being that anyone who's ever picked up an FPS on PC will know the difference between 30 fps and 60 fps. The best way I can explain my personal feel is that sure, 30 fps LOOKS smooth while playing a shooter, but it in no way FEELS smooth. At lower fps the game feels sluggish and less responsive, even if visually it looks relatively smooth. So great, G-Sync will make my random drops below 60 LOOK smooth, but will it somehow make the game FEEL smooth too? Until I can play with it, I'm straight up guessing no.

I'll take my chances with just getting purely higher fps, thank you very much.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 

Don't forget Nvidia has cherry-picked situations to present G-Sync in the best light possible. (Kinda like if AMD said, 'See? Mantle gives a 60% fps increase*.') Honestly, if a game can't hold a stable frame rate, it's the developer's problem; I shouldn't have to purchase a $120 module just so I can play their game.

For a pro gamer, 90 fps on a 144 Hz monitor (Mantle, free) is better than 64 fps on a 144 Hz monitor underclocked to 64 Hz (G-Sync, $120). Besides, when the frame is refreshing that fast, you shouldn't notice stutter or lag.
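In frame times, the comparison above works out to:

$$\frac{1000}{90} \approx 11.1\ \text{ms per frame} \qquad \text{vs.} \qquad \frac{1000}{64} \approx 15.6\ \text{ms per frame}$$

so the free option also shaves roughly 4.5 ms of latency off every frame.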

 

In theory G-Sync sounds great. In theory Mantle sounds great. That is the current situation.

 

*in one game

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ
