
Mantle Will Boost Performance By 20-50%, Nvidia Can Add Support.

I hope that Mantle supports Nvidia as well. Maybe Nvidia can stop being tools and let AMD license G-Sync.


I'm afraid this might not benefit higher-end cards :/ It seems a lot like Mantle helps the CPU side in driving the GPUs... but doesn't actually help the GPUs be better utilized themselves... Hence I think we might see these improvements on low- to mid-range hardware, but not so much on the high end... I wanna see some benchmarks!!!! :3



Don't forget Nvidia has cherry-picked situations to present G-Sync in the best light possible. (Kinda like if AMD said, 'See? Mantle gives a 60% fps increase*.') Honestly, if a game can't hold a stable frame rate, it's the developer's problem; I shouldn't have to purchase a $120 module just so I can play their game.

 

For a pro gamer, 90 fps on a 144Hz monitor (Mantle, free) is better than 64 fps on a 144Hz monitor underclocked to 64Hz (G-Sync, $120). Besides, when frames are refreshing that fast, you shouldn't notice stutter or lag.

 

In theory G-Sync sounds great. In theory Mantle sounds great. That is the current situation.

 

*in one game

 

The thing I think people keep forgetting, yourself included, is that your example deals with constant framerates. Show me one game, just one, where you get a constant FPS. Even Mantle isn't supposed to give constant FPS. FPS jumps around, which causes tearing and microstuttering, which affects how we perceive games. G-Sync solves this by keeping the FPS as stable as it can. Now you might ask why that is important. How about this: you get 90 FPS in game X at its top end, but it dips to 50 FPS every thirty seconds. How does that affect the playing experience? For me, it affects it a lot, and I can tell when my FPS drops. There is no one answer for every single person; not everyone can tell the difference between 90 FPS and 50 FPS, and some are more sensitive than others. You're being disingenuous with your example, because like I said, you aren't telling the whole story.


What an ignorant thing to say: you thank AMD for letting Nvidia use its superior development language by not buying AMD cards?

If anything, AMD should be commended for their efforts to push the industry forward and give every single gamer a better experience.

 

Don't worry, AMD will get X amount from each GPU sold by Nvidia.

 

Licensing allows AMD to have their cake and eat it too!


As I posted in another thread:

It's not a matter of trusting Linus or not. Sure, it can make frame tearing and the visual look of FPS jumping up and down feel less extreme, but until I play with it I'll argue that it won't feel smooth.

My logic being that anyone who's ever picked up an FPS on PC will know the difference between 30 fps and 60 fps. The best way I can explain my personal feel: sure, 30 fps LOOKS smooth while playing a shooter, but it in no way FEELS smooth. At lower fps the game feels sluggish and less responsive, even if it looks relatively smooth visually. So great, G-Sync will make my random drops below 60 LOOK smooth, but will it somehow make the game FEEL smooth too? Until I can play with it, I'm straight up guessing no.

I'll take my chances with just getting pure higher fps, thank you very much.

There is no doubt that G-Sync is a necessary technology to solve, once and for all, the timing issues between displays and graphics chips, and to eliminate the artifacts that come with those timing differences.

 

But yeah, there seem to be some people who think that framerate won't matter anymore with G-Sync. In reality, if you are getting 20 fps, there is a 50 ms gap between consecutive frames, so there is going to be lag between your inputs and the on-screen response. Syncing the monitor with the graphics card won't solve that, because the card is still taking too long to get the next frame out.
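
The arithmetic behind that figure is worth spelling out: frame time is just the reciprocal of frame rate, so no amount of display syncing can push input-to-screen delay below it. A quick sketch:

```python
# frame time (ms) = 1000 / fps: at 20 fps there are 50 ms between frames,
# so the on-screen response always trails your inputs by at least that much.
def frame_time_ms(fps: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / fps

for fps in (20, 30, 60, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):4.1f} ms per frame")
# 20 fps -> 50.0 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms
```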

 

 


The thing I think people keep forgetting, yourself included, is that your example deals with constant framerates. Show me one game, just one, where you get a constant FPS. Even Mantle isn't supposed to give constant FPS. FPS jumps around, which causes tearing and microstuttering, which affects how we perceive games. G-Sync solves this by keeping the FPS as stable as it can. Now you might ask why that is important. How about this: you get 90 FPS in game X at its top end, but it dips to 50 FPS every thirty seconds. How does that affect the playing experience? For me, it affects it a lot, and I can tell when my FPS drops. There is no one answer for every single person; not everyone can tell the difference between 90 FPS and 50 FPS, and some are more sensitive than others. You're being disingenuous with your example, because like I said, you aren't telling the whole story.

Frame jumps don't really cause tearing or microstutter. Microstutter is an issue with frame-time variance that can also cause frame lags. The lag would be at worst 6.9 ms on a 144Hz monitor, which is below what NASA says the human eye can see. Tearing happens when two frames mash together in one scan; again, at 144Hz fewer frames tear.
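
To make the frame-time variance point concrete, here is a minimal sketch (the frame-time traces are made up purely for illustration): two runs with the same average fps, where only the second one microstutters.

```python
import statistics

# Both traces below average 100 fps, but the second alternates
# 5 ms and 15 ms frames: same mean, uneven pacing = microstutter.
smooth = [10.0] * 8          # frame times in ms, perfectly even pacing
stutter = [5.0, 15.0] * 4    # frame times in ms, same mean, uneven pacing

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 / statistics.mean(times)
    print(f"{name}: {avg_fps:.0f} fps average, "
          f"frame-time stdev {statistics.stdev(times):.1f} ms")
# smooth: 100 fps average, frame-time stdev 0.0 ms
# stutter: 100 fps average, frame-time stdev 5.3 ms
```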

 

You will notice a drop in framerate even with G-Sync (50 fps doesn't look like 90 fps regardless of the equipment). G-Sync reduces lag and tearing; microstutter will be there, but the lags will not.

 

Also, the specific example I gave was for pro gamers. They want to see enemies as fast as possible. 90 fps with drops to 50 is better than 65 with drops to 35. The Planetside 2 devs have implemented an option to choose between smooth framerates and the highest frame rate possible, demonstrating my point that some people prefer a higher fps over a "smooth" experience.

 

I would argue that the pendulum demo is "disingenuous," as it is a demo designed to make G-Sync look good. We also don't know all the facts behind the Tomb Raider demo (i.e. monitor and computer specs).



Hopefully Nvidia can have Mantle so that AMD can have G-Sync. That would be cool.


 

Mantle Q&A with Chris Roberts of Star Citizen, Daniel Baker and Tim Kipp of Oxide, Jurjen Katsman of Nixxes, Guennadi Riguer of AMD, and Johan Andersson of DICE.



A 20% boost is very good if that's the worst-case scenario; that's more than enough to allow a card like the 7950 running Mantle to surpass a 7970 running OpenGL / DirectX.

 

The biggest revelation for me was the quote below:

"Mantle can basically “see” multiple GCN-based GPUs as a single GPU"

This will work wonders for CrossFire users.

Yeah, especially for those select games that don't support SLI/CrossFire.

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


Yo Nvidia! How about trading G-Sync for Mantle, huh?



I have a GTX 660, so I'm more interested in G-Sync than Mantle (for obvious reasons), but we just have to wait and see how big the improvement over DX is. Maybe both could license these technologies to each other so everyone gets the best experience possible :)


Damn, if the Mantle hype doesn't pay off, there will be a huge avalanche of rage directed at AMD.


The thing I think people keep forgetting, yourself included, is that your example deals with constant framerates. Show me one game, just one, where you get a constant FPS. Even Mantle isn't supposed to give constant FPS. FPS jumps around, which causes tearing and microstuttering, which affects how we perceive games. G-Sync solves this by keeping the FPS as stable as it can. Now you might ask why that is important. How about this: you get 90 FPS in game X at its top end, but it dips to 50 FPS every thirty seconds. How does that affect the playing experience? For me, it affects it a lot, and I can tell when my FPS drops. There is no one answer for every single person; not everyone can tell the difference between 90 FPS and 50 FPS, and some are more sensitive than others. You're being disingenuous with your example, because like I said, you aren't telling the whole story.

Tetris


Tetris

 

Minesweeper.



I hope you all realize that for there to be any gain at all, the game needs to have been designed and optimized for Mantle. Since PC game companies don't like to spend time/money optimizing an engine for something that only a portion of gamers have (PhysX, anyone?), you won't see any significant number of people using it. The fact of the matter is that average users are what matter to game studios, and a lot of "average" users don't even pick their own GPU; they run laptops, pre-built systems, and integrated graphics (which have slowly been getting better).

 

Mantle is a pipe dream, just like everything else that has come out with hopes of revolutionizing gaming. It isn't going to magically turn your 270 into a 780 killer, and even thinking of it that way is silly. Best-case scenario, it might give you a boost in what settings you can play at, on a card that otherwise might have to lower them a bit to keep framerates steady, and that's in the handful of "AMD"-oriented titles that get developed using this. NVIDIA will just ignore it; they aren't going to use it.


20%-50% would allow an R9 280X to match a GTX 780 & an R9 270X to match a GTX 770.

Which is basically $200 worth of performance.

But if Mantle is supported on Nvidia's cards as well, then the 780 and 770 will get a similar boost. Having Mantle will be a good thing all around, as long as it's not locked down to AMD cards.


Since PC game companies don't like to spend time/money optimizing an engine for something that only a portion of gamers have (PhysX, anyone?), you won't see any significant number of people using it. The fact of the matter is that average users are what matter to game studios, and a lot of "average" users don't even pick their own GPU; they run laptops, pre-built systems, and integrated graphics (which have slowly been getting better).

The hope is that Nvidia and Intel will add support, now that we know it's technically possible. It remains to be seen what the obstacles to that are in terms of licensing costs, etc.

 

Mantle is a pipe dream, just like everything else that has come out with hopes of revolutionizing gaming.

Way too cynical. It's an API like OpenGL or Direct3D, and the hope is that it will allow developers to make better use of hardware resources.

Also, nobody said it's going to revolutionize gaming; Mantle is more important to developers than it is to gamers.


I hoped it would be more like a 100% increase, but this is for sure better than nothing. I'm not gonna turn down free performance ^^



But if Mantle is supported on Nvidia's cards as well, then the 780 and 770 will get a similar boost. Having Mantle will be a good thing all around, as long as it's not locked down to AMD cards.

Actually, no: Mantle has been specifically optimized for AMD's GCN architecture, so in its current form Nvidia cards would only benefit from the more efficient rendering path, i.e. more draw calls, which is the CPU performance side of Mantle.

All other optimizations would remain AMD-specific. This includes all the GPU performance benefits: for example hardware MSAA, compute- and DMA-specific pipelining, multi-GPU scaling, global illumination, APU-specific physics, AI and pathfinding, and a whole bunch of other techniques.

Nvidia would have to commit to Mantle first, then develop its own hardware-specific optimizations and implement them within the Mantle rendering path.
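
To illustrate why the draw-call side alone would still be worth something to any vendor, here is a toy model (all per-call costs are hypothetical, not measurements): the CPU cost per draw call caps how many calls fit into a frame budget.

```python
# Toy model: assume a fixed CPU cost per draw call. Lower per-call
# overhead means more draw calls fit in the same 60 fps frame budget.
FRAME_BUDGET_MS = 1000.0 / 60   # ~16.7 ms per frame to hold 60 fps

def max_draw_calls(cost_per_call_us: float) -> int:
    """Draw calls that fit in one 60 fps frame at a given CPU cost each."""
    return int(FRAME_BUDGET_MS * 1000.0 / cost_per_call_us)

# Hypothetical per-call CPU costs for a thick vs. a thin API:
for api, cost_us in (("high-overhead API", 40.0), ("thin API", 5.0)):
    print(f"{api}: ~{max_draw_calls(cost_us):,} draw calls per frame")
# high-overhead API: ~416 per frame; thin API: ~3,333 per frame
```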


20%-50% would allow an R9 280X to match a GTX 780 & an R9 270X to match a GTX 770.

Which is basically $200 worth of performance.

 

Yes, but only time will tell whether every major game developer will use it. It would be nice to see Nvidia chasing AMD for a change; they have been slacking for some time...


Yes, but only time will tell whether every major game developer will use it. It would be nice to see Nvidia chasing AMD for a change; they have been slacking for some time...

All new non-sports EA games will use it, and so will all new CryEngine games.

 


