
AMD Mantle vs Nvidia G-Sync

I'm Batman

Mantle is a long-term project from which the entire industry benefits. Even for users who do not have cards that benefit from Mantle, it will push Microsoft to improve DirectX and possibly inspire OpenGL developers.

G-Sync is a marketing concept that allows Nvidia to make more money off less demanding users by telling them "you can still use your midrange card (or old Kepler card) with G-Sync for smooth gameplay," but within a year that card will have to be replaced by a better one, assuming they still want to play demanding games at high resolutions/settings.

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

G-Sync is out of the question for me since I do not game on 120-144 Hz monitors; plus, if someone already had a monitor, I doubt they would cough up the money again to get one with it. Also, for people who do not have a Kepler GPU it becomes even more expensive.

 

So at the moment I will go with Mantle. I am not fussed if AAA titles don't take it on board; I am hoping it makes indie games a lot better.

 

 

Mantle is a long-term project from which the entire industry benefits. Even for users who do not have cards that benefit from Mantle, it will push Microsoft to improve DirectX and possibly inspire OpenGL developers.

G-Sync is a marketing concept that allows Nvidia to make more money off less demanding users by telling them "you can still use your midrange card (or old Kepler card) with G-Sync for smooth gameplay," but within a year that card will have to be replaced by a better one, assuming they still want to play demanding games at high resolutions/settings.

 

Also on this... I could see Nvidia using this to get into the monitor market, forcing people to buy a new monitor so it's more compatible with future GPUs.

 

I can see this having compatibility issues, meaning you need to buy a monitor to match the GPU.

Its all about those volumetric clouds

 

 


I'll be completely honest. PhysX at the moment is not something I would consider a factor when making the decision on which card to get. It's just way too proprietary, and there are definitely not enough games that utilise it, let alone utilise it well, to impact my decision in the slightest. PhysX is not a necessity and I'm sure a lot of people would agree.

Disagree with you. Lots of great & popular video games use PhysX and it's a great feature.

| CPU: i7 3770k | MOTHERBOARD: MSI Z77A-G45 Gaming | GPU: GTX 770 | RAM: 16GB G.Skill Trident X | PSU: XFX PRO 1050w | STORAGE: SSD 120GB PQI +  6TB HDD | COOLER: Thermaltake: Water 2.0 | CASE: Cooler Master: HAF 912 Plus |


I'm really excited about both of these technologies. I feel like they're areas that have been massively neglected (hardware on Nvidia's side and software on AMD's).

MANTLE

-It's exciting to think what a truly optimized engine might actually gain over DirectX

-I don't think it will catch on unless the Steam Box really skyrockets

-Encourages a new iteration of DirectX to finally come out. We're due for an update.

-Can cause further differentiation between the two hardware vendors and more favoritism between different games if each company adopts their own standard.

-Mantle is ultimately furthering software implementation for games, and when paired with TrueAudio and the other things AMD has, it makes a nice little bundle

G-Sync

-I question why it hasn't been done before now

- I want to see the performance increase over panels with very high refresh rates like plasma.

-I'm more excited for this to be implemented with movies because they only run at 24 Hz

-I want to see how expensive it will be after it becomes a standard

-If it's all it's cracked up to be, it should make most games much more immersive.

-G-Sync to me is probably something I won't be able to afford so I won't allow myself to become too excited by it.

I view both as standards that will eventually see widespread adoption. Right now they're just starting, though, and will take a while to build up. I think Mantle is ultimately going to make the bigger difference. When BF4 comes out and (if) it gets a 50% improvement, everyone is going to turn up their noses at Microsoft and say, "Hey, you're holding out on us." This will increase support for the Steam Box and cause Microsoft to do some thinking and try to make the PC market better instead of worse.


I may be in the minority, but I think G-Sync is HUGE.  MASSIVE.

 

Gamers have always had to make a compromise.  If we want the more consistent visual experience offered by V-Sync, we have to accept a capped 60fps (for most of us), higher input delay, and jarring drops in framerate if the GPU drags the tiniest bit.  If we want the highest framerate possible and a nice continuum of possible framerates, we have to accept horrible tearing that destroys the visual fidelity of the game.

 

Adaptive V-Sync is kind of a hack solution that doesn't fix the problem; I get terrible tearing when using it and so I haven't bothered with it.

 

With G-Sync, we finally can have a no-compromises gaming experience.  None of the delay and stuttering that comes with V-Sync, and none of the tearing that normally comes with an unrestricted framerate.  If my framerate hovers between 50 and 80 in a particular game, I will be able to have a silky smooth experience, which has not been possible until now.

 

Even for those with dual Titans pushing everything at 120/144 frames per second, I presume G-Sync will offer framerate capping without delays or tearing since the monitor and GPU are working in tandem.  Rather than the GPU drawing a frame then waiting for the monitor to refresh, they will be able to sync with each other.  And any drop below 120 will be unnoticeable and silky smooth with no tearing.
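
For anyone curious about the mechanics, here is a minimal sketch of that difference in plain C. The frame render times are made-up numbers and this is not anything from Nvidia; it only models a fixed 60 Hz refresh (a finished frame waits for the next tick, and with double buffering the GPU stalls until the swap) against a variable-refresh panel that shows each frame the moment it is ready.

/* Minimal sketch: fixed 60 Hz V-Sync vs. a variable-refresh display.
 * The render times below are invented examples, not measurements. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double tick = 1.0 / 60.0;                /* fixed 60 Hz refresh interval */
    const double render_s[] = { 0.014, 0.019, 0.012, 0.022, 0.016 }; /* hypothetical */
    const int n = (int)(sizeof render_s / sizeof render_s[0]);

    double vsync_start = 0.0, vrr_start = 0.0;     /* when each pipeline may start a frame */
    double vsync_prev = 0.0, vrr_prev = 0.0;       /* when the previous frame was shown */

    for (int i = 0; i < n; ++i) {
        /* V-Sync: the finished frame waits for the next refresh tick,
         * and with double buffering the GPU stalls until that swap. */
        double vsync_done  = vsync_start + render_s[i];
        double vsync_shown = ceil(vsync_done / tick) * tick;

        /* Variable refresh: the panel refreshes the moment the frame is ready. */
        double vrr_shown = vrr_start + render_s[i];

        printf("frame %d: rendered in %4.1f ms | v-sync frame gap %4.1f ms | "
               "variable-refresh gap %4.1f ms\n",
               i, render_s[i] * 1000.0,
               (vsync_shown - vsync_prev) * 1000.0,
               (vrr_shown - vrr_prev) * 1000.0);

        vsync_prev = vsync_shown; vsync_start = vsync_shown;
        vrr_prev   = vrr_shown;   vrr_start   = vrr_shown;
    }
    return 0;
}

With those sample times the V-Sync column jumps between roughly 16.7 ms and 33.3 ms from frame to frame (the stutter described above), while the variable-refresh column simply tracks how long each frame took to render.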

 

Mantle is something that's cool for developers to experiment with, but may not mean much to the end user when it's all said and done.  G-Sync will make a massive difference for anyone who invests in the technology, for every game, and users will no longer have to buy an overpowered GPU just to guarantee a smooth gaming experience.

Intel Core i7-7700K | EVGA GeForce GTX 1080 FTW | ASUS ROG Strix Z270G Gaming | 32GB G-Skill TridentZ RGB DDR4-3200 | Corsair AX860i

Cooler Master MasterCase Pro 3 Samsung 950 Pro 256GB | Samsung 850 Evo 1TB | EKWB Custom Loop | Noctua NF-F12(x4)/NF-A14 LTT Special Edition

Dell S2716DGR | Corsair K95 RGB Platinum (Cherry MX Brown) | Logitech G502 Proteus Spectrum | FiiO E17 DAC/Amp | Beyerdynamic DT990 Pro


Well, my eyes aren't that sensitive to tearing. So, I'll vote Mantle.

\[T]/ Praise the Sun!
Super Budget Gaming Build: Intel Pentium G1610, Gigabye GA-H61M-DS2 rev. 3, Kingston Value RAM 4GB CL9 1333MHz, Fractal Design Core 1000, Corsair VS 450, WD 1TB, Powercolor Radeon HD 7750 1GB/GDDR5, (Optional: Asus DRW-24B1ST).
 (Total: $340 USD)


These are completely different things. Why are we comparing them?

3700X, GTX1060, 64GB HyperX 3000, B450 DS3H, 500GB SN750, TT Core G3, PreSonus Audiobox USB96


What Mantle seems like to me is AMD trying to stick their foot in the door of a technology that's perfectly fine.

 

DirectX doesn't need to be replaced; it's doing the job quite well, managing GPUs across PCs and consoles perfectly fine.

 

However, AMD have touted some cool features, all of which sound mostly like a marketing stunt to me, but I guess we'll see next week when BF4 rolls out.

 

Is it better than G-Sync?

Well, that's a pathetic question. They're completely different.

I'm the one who overclocks.

 

CPU: i5 2500k w/CM Cooler Master Hyper 212 EVO OC'd@4.2GHz RAM: Corsair Vengeance 8GB  MOBO: Intel Burrage DP67BG GPU: MSI GTX770 Twin Frozr 2GB HDD: 2x Seagate 2TB Barracuda PSU: Gigabyte ODIN 550W


These are completely different things. Why are we comparing them?

I believe the whole point of this is not to say "Which is the better technology?" but rather "Which is going to make the greater overall impact on the game industry?"

 

Obviously there's no way to directly compare the two since they're totally different things, but we can compare their relative importance.

Intel Core i7-7700K | EVGA GeForce GTX 1080 FTW | ASUS ROG Strix Z270G Gaming | 32GB G-Skill TridentZ RGB DDR4-3200 | Corsair AX860i

Cooler Master MasterCase Pro 3 Samsung 950 Pro 256GB | Samsung 850 Evo 1TB | EKWB Custom Loop | Noctua NF-F12(x4)/NF-A14 LTT Special Edition

Dell S2716DGR | Corsair K95 RGB Platinum (Cherry MX Brown) | Logitech G502 Proteus Spectrum | FiiO E17 DAC/Amp | Beyerdynamic DT990 Pro


I hope Mantle turns out to be a useful tool for developers to more accurately scale performance from console to PC, but I was assuming that with similar architectures, that would be a given anyway.

With G-Sync....

I have a high-refresh monitor. Screen tearing is no longer an issue over 60 fps. It would be great if the software could be configured so that fps under 45 or so would use double frame interpolation, like current HDTVs, to smooth out the choppiness.
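
That idea can be sketched in a few lines of plain C. The timestamps are invented, and this is not how G-Sync or any HDTV actually implements it; the sketch just flags any gap between rendered frames longer than 1/45 s and reports where a synthesized in-between frame would be displayed.

/* Rough sketch of the "interpolate below ~45 fps" idea from the post above.
 * Timestamps are invented; a real implementation would blend or
 * motion-estimate pixels rather than just report a display time. */
#include <stdio.h>

int main(void) {
    const double threshold = 1.0 / 45.0;    /* ~22.2 ms: anything longer means < 45 fps */
    const double frame_t[] = { 0.000, 0.016, 0.046, 0.062, 0.095 }; /* hypothetical */
    const int n = (int)(sizeof frame_t / sizeof frame_t[0]);

    for (int i = 1; i < n; ++i) {
        double gap = frame_t[i] - frame_t[i - 1];
        if (gap > threshold) {
            /* Too long a gap: schedule a synthesized frame halfway through it. */
            double mid = frame_t[i - 1] + gap / 2.0;
            printf("gap %5.1f ms -> insert interpolated frame at t = %.3f s\n",
                   gap * 1000.0, mid);
        } else {
            printf("gap %5.1f ms -> no interpolation needed\n", gap * 1000.0);
        }
    }
    return 0;
}

Worth noting that real motion interpolation has to buffer at least one future frame before it can synthesize the in-between one, so it adds latency, which is the trade-off HDTVs make.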

Air 540, MSI Z97 Gaming 7, 4770K, SLI EVGA 980Ti, 16GB Vengeance Pro 2133, HX1050, H105840 EVO 500, 850 Pro 512, WD Black 1TB, HyperX 3K 120, SMSNG u28e590d, K70 Blues, M65 RGB.          Son's PC: A10 7850k, MSI A88X gaming, MSI gaming R9 270X, Air 240, H55, 8GB Vengeance pro 2400, CX430, Asus VG278HE, K60 Reds, M65 RGB                                                                                       Daughter's PC: i5-4430, MSI z87 gaming AC, GTX970 gaming 4G, pink air 240, fury 1866 8gb, CX600, SMSNG un55HU8550, CMstorm greens, Deathadder 2013

 


I expect both to fail magnificently.

 

Mantle has a chance if it isn't kept proprietary.

 

Nvidia has already failed. If I buy a video card with tech on it, I shouldn't then have to buy a special monitor to get full functionality out of it. It's going to be PhysX all over again.

 

There are going to be a few monitors, they'll be too expensive, 99% of people won't care, and it will be a silly little niche. This is my prediction.

- Silverstone TJ08B-E - Gigabyte Z87M-D3H - i7 4770k @ 4.0GHZ 1.2v - 16gb Kingston HyperX Black 1600 - Gigabyte GTX 770 OC 4GB -


- Silverstone Fortress FT02 - MSI Z77 Mpower - i5 3570k @ 4.0GHZ 1.09v - 8gb Mushkin Blackline 1600 - MSI GTX 670 PE -


- Lenovo T430 (1600x900) - i5 3210m - 8GB DDR3 1333 - nVidia NVS5400M - 256GB mSATA OS - 320GB HDD-


Anything like this that I have seen implemented in the past increases response time, and if this does, even by half a millisecond, I want nothing to do with it. I need things to be instant; anything less renders my PC useless.

Main Machines~ Prometheus - i7 3930k @ 4.5GHz, X79-UD3, HyperX Blu 1600Mhz 24Gb (2x8Gb, 2x4Gb), XFX Double-D 7970, H100i, Military green C70, TX850M, All Corsair SP/AF-120 fans, Samsung 840 Pro 128Gb, 750Gb Seagate Barracuda 7200 RPM x2, NZXT Hue accent lights (wouldn't reccommend, they don't stick well). Laptop~ Asus X202E touch notebook- 500Gb 5400RPM 7mm HDD, 4Gb RAM-not upgradeable, 1.8GHz i3, HD4000


What I would like to know is the following:

 

Would G-Sync be able to get rid of the stuttering/hitching that we see today in some game engines that were poorly optimized? (Far Cry 3 for example).

 

If that's a definite yes, then it would make G-Sync a bit more relevant to me (I play on a 144 Hz BenQ XL2411T).

 

But I still stand firm that these two technologies shouldn't be named in the same sentence; they're completely different things.

 

What I find interesting is that repi (Johan Andersson) mentioned several times over the weekend that Mantle would bring "the robust stable performance that we have on consoles today", so if Mantle is able to close the gap significantly between Min FPS and Max FPS, then it would make something like G-Sync less relevant in Mantle enabled games.

 

I for one am excited for BOTH technologies, but given that I just spent ~300 EUR on my BenQ XL2411T 144 Hz monitor and ~$370 on a Sapphire HD 7970 GHz Vapor-X edition, I don't see myself going out and spending ~$800 more for a problem that might not even exist in my current situation (Mantle-enabled GPU and a 144 Hz monitor).


First time I've actually been a bit frustrated looking at opinions around here, mostly because so many posts are just dead wrong about Mantle and G-Sync, especially G-Sync :( I really wish people would do their own research rather than regurgitate everything they read. :angry:

Beware the irrational, however seductive. Shun the ‘transcendent’ and all who invite you to subordinate or annihilate yourself. Distrust compassion; prefer dignity for yourself and others. Don’t be afraid to be thought arrogant or selfish. Picture all experts as if they were mammals. Never be a spectator of unfairness or stupidity. Seek out argument and disputation for their own sake; the grave will supply plenty of time for silence. Suspect your own motives, and all excuses. Do not live for others any more than you would expect others to live for you.


Clearly G-Sync, because Mantle will die when the 20nm GPUs come out, which should be next year, or else support for the 7000/9000 series will go away.

(Because that's how a low-level API works.)

With G-Sync, 40 FPS on an Nvidia card will look better than 60 FPS on AMD, which could make a midrange Nvidia GPU a better option than a high-end AMD GPU.

A stable 30 FPS is better than higher but fluctuating framerates.

Why will it die exactly? Mantle is an API that's being developed for AMD architectures. To think that they wouldn't support their own stuff is stupid.

TekSyndicate Forum Moderator: https://forum.teksyndicate.com/users/njm1112

5930K@4.3Ghz | 16GB 4x4 2133Mhz | r9-290 | 6TB Raid5 | 250GB 850Evo | 8.1pro | RM750


AMD won't be going away from GCN anytime soon; new architectures can easily be deployed and vary greatly while still supporting what needs to be supported to keep Mantle alive. I don't care much at this time, but I don't mind AMD trying something new, especially with a huge title that will reach many, many people. Anyway, there's no reason for AMD to drop support, and I'd be amazed if they did. Their new GCN is pretty epic, although at this time I run Keplers at home.

Beware the irrational, however seductive. Shun the ‘transcendent’ and all who invite you to subordinate or annihilate yourself. Distrust compassion; prefer dignity for yourself and others. Don’t be afraid to be thought arrogant or selfish. Picture all experts as if they were mammals. Never be a spectator of unfairness or stupidity. Seek out argument and disputation for their own sake; the grave will supply plenty of time for silence. Suspect your own motives, and all excuses. Do not live for others any more than you would expect others to live for you.


G-sync

 

Just like Linus said, Mantle takes time for game makers to adopt. In the meantime, NVIDIA can come up with something too, and it would be tailored to address the issues that Mantle will have.

 

G-Sync: a new monitor in 2014 and that is it, as long as you have a GTX 660 or better.

The question with the two is which will be dominant in the future.

Whoever takes the biggest share of the cake wins.


Why will it die exactly? Mantle is an API that's being developed for AMD architectures. To think that they wouldn't support their own stuff is stupid.

Because a low-level API can only support one architecture; that's how it works.

And they will move away from GCN sooner or later, and so will the support.

Also, it doesn't matter if you get 20 FPS more when the game looks better at 20 FPS less with G-Sync.

 

RTX2070OC 


Because a low-level API can only support one architecture; that's how it works.

And they will move away from GCN sooner or later, and so will the support.

Also, it doesn't matter if you get 20 FPS more when the game looks better at 20 FPS less with G-Sync.

 

Mantle 2.0

 

We've seen this with other APIs; a little different, but still APIs.

 

EDIT: That, and we know Mantle is NOT going away because it's being developed for consoles. The fact that consoles have a long life by nature means that they will keep support for it, and when the PC architecture moves forward, why would they not develop Mantle for more compatibility with their own stuff?

TekSyndicate Forum Moderator: https://forum.teksyndicate.com/users/njm1112

5930K@4.3Ghz | 16GB 4x4 2133Mhz | r9-290 | 6TB Raid5 | 250GB 850Evo | 8.1pro | RM750


I just got a 2560x1600 Dell U3011. Although it's not designed for gaming, I do like gaming at high resolutions. The chances of me adopting either technology will be in about 4-5 years at the earliest, so buying another expensive monitor or an expensive GPU just isn't going to happen. Though I am interested to see how the technology will be implemented.


As many have said, this is not a valid question, so I refuse to vote.

 

But that type of synchronization is really technology that display manufacturers needed to develop, independent of GPU manufacturers. It is very likely that Intel or AMD will develop an open platform similar to G-Sync and take over the market. That type of technology should not be premium nor closed to begin with.

 

Just my opinion.


These are completely different things. Why are we comparing them?

 

Maybe read the thread and you might understand. Both are technologies to reduce tearing and stutter.

FX 8150 @ 4.0 GHz, MSI Radeon 7850 OC Edition, 2 x 4 gigs of GSkill Sniper ddr3 at 2133 MHz, two 120 gig Kingston HyperX 3K ssds, Crosshair V Formula, Antec High Current Gamer 620 watt psu, all wrapped up inside a Cooler Master CM Storm Stryker.

 


As many have said, this is not a valid question, so I refuse to vote.

 

But that type of synchronization is really technology that display manufacturers needed to develop, independent of GPU manufacturers. It is very likely that Intel or AMD will develop an open platform similar to G-Sync and take over the market. That type of technology should not be premium nor closed to begin with.

 

Just my opinion.

I have no doubt that Nvidia has a patent on this (and they should; they spent however many millions on R&D to develop it), so if Intel or AMD want to develop something, it MUST be in a form that doesn't violate the patent. Further, if we get competing technologies, all that does is force consumers to purchase their monitor or monitor upgrade along with their GPU; it splits the market into obligate Nvidia and obligate AMD buyers due to cost to the consumer, which limits competition instead of encouraging it. In my mind, the best-case scenario would be if Nvidia were to license it to AMD (which would be good for both parties: Nvidia increases demand for its product and AMD gets to solve a major issue, that being artifacts). Honestly, if G-Sync works as advertised (all signs point to yes), we should hope it becomes a standard, not a point of competition.

Intel 3570k @ 4.4 GHz |Asus Sabertooth Z77 |EVGA GTX 660 Ti FTW |Kingston HyperX Beast 16 Gb DDR3 1866 (2x8Gb)


|Samsung 840 250 GB |Western Digital Green 2TB 2x |Cooler Master 850w 80+ Gold |Custom Water Cooling Loop |Noctua NF-F12 4x
|Noctua NF-A14 3x |Corsair Carbide 500R (White) |Corsair K95 |Razer Mamba |Razer Megalodon |Samsung SyncMaster T220 2x Computer Bucket List   Greatest Thread Ever   WAN Show Drinking Game  GPU Buyers Guide