AMD Mantle vs Nvidia G-SYNC

I'm Batman

Apparently the Frostbite 3 engine used in Battlefield 4 is based on Mantle. I'm not sure how that works for Nvidia; does the engine use Mantle for AMD and then DirectX for Nvidia GPUs on Windows?

 

Apparently Frostbite 3 is an engine for various platforms (PC, Xbox 360, Xbox One, PS3, PS4), which CLEARLY RULES OUT that Frostbite 3 is BASED on Mantle. It might very well include some Mantle code, just as it includes some Nvidia-specific code. It certainly is NOT BASED on Mantle ...

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


The thing about AMD's Mantle is that it is OPEN, and Nvidia or a third party could come in and add code to the Mantle API to allow the same/similar "to the metal" programming that happens on AMD GCN cards. Win-win. On the other hand, Nvidia's G-Sync will only work on Nvidia 6xx cards and higher, and will probably stay Nvidia-exclusive.

 

Are you sure? I am pretty sure it isn't.

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


Are you sure? I am pretty sure it isn't.

If you're referring to the Nvidia 6XX thing, he's right: it only works on Kepler cards and above.

THE BEAST Motherboard: MSI B350 Tomahawk   CPU: AMD Ryzen 7 1700   GPU: Sapphire R9 290 Tri-X OC   RAM: 16GB G.Skill FlareX DDR4

PSU: Corsair CX650M   Case: Corsair 200R   SSD: Kingston 240GB SSD Plus   HDD: 1TB WD Green Drive and Seagate Barracuda 2TB Media Drive


I honestly am more excited for Mantle.

I'm not planning on buying a $175 upgrade kit for G-Sync, and at least from what Slick said on the WAN Show, it's TN-only [at least for now]. So, I don't care.

 

While Nvidia is screwing up one thing after another at the moment, especially on Linux [which will soon be my main OS], I won't buy Nvidia cards anyway.

 

And tbh, a full hardware-access API and the use of up to 8 cores make Mantle a big deal [in theory, of course, since no one has seen what it actually does], while G-Sync seems to be the next "3D Vision / 144Hz monitor"... I'm using a 120Hz BenQ XL2420T and I couldn't care less about it next to my PB278Q ^^

Yeah, the kit is only for certain TN monitors, but that doesn't rule out buying a monitor in the future. The first monitors with this tech in them are being released in Q1 2014; the brands and number of models haven't been specified, but if there are more down the track it may become a standard.


Well, I'm not a fanboy; I always said I buy the best price/performance option. Linus is right that at the moment nothing is clear, and at the same time very wrong. G-Sync is not reinventing the wheel: in the interview they said it could have been done 5 years ago, and I'm confused why it wasn't done in the first place as a basic procedure, since it's a logical and simple thing, refresh every time a frame is done.

And why is Linus wrong? Well, I guess this kind of technology can't be patented; only the base code or the architecture can be. In that case AMD could easily adopt this technology (what I mean is they could use the built-in hardware in the monitors for their own purposes, with just a modified architecture for their own GPUs). On the other hand, Mantle is based on GPU architecture, which I doubt Nvidia will take up.

And if you think about it, it's mind-blowing: if AMD gets better utilization of its raw power, even a mid-range GPU might perform like a high-end Nvidia card in Mantle games, console/PC style, where the console uses raw power more effectively. And if AMD also uses G-Sync, the end user suddenly gets an even better experience. And as open as Mantle is supposed to be, I guess they could implement physics calculations without Nvidia's PhysX involved. How much of this turns out to be true, time will show. For content development I guess Nvidia will keep winning until Mantle is used in development software, since it should have endless options. Can't wait for the future.

Do you guys imagine how amazing this technology might become? Simple thought: Mantle with G-Sync on something like an Oculus Rift or similar hardware with 3D vision enabled could be so powerful that it would look like reality. In other words, I don't care who wins; I only hope that in the end the end user, meaning us, gets the ultimate experience. Sorry for my spelling; the ideas are pretty clear for anyone to understand.

http://www.speedtest.net/my-result/2823229239

Made at 17:00, so it's the worst time =D I usually have 97+ Mbps down / 90+ Mbps up. -=Fact: Lithuania has some of the best internet in the world and the best internet infrastructure in Europe.=-


If you're referring to the Nvidia 6XX thing, he's right: it only works on Kepler cards and above.

I thought it was pretty clear that I meant Mantle.

 

Also, it has been mentioned several times now that G-Sync might see a licensing scheme ...

Mini-Desktop: NCASE M1 Build Log
Mini-Server: M350 Build Log


The thing about AMD's Mantle is that it is OPEN, and Nvidia or a third party could come in and add code to the Mantle API to allow the same/similar "to the metal" programming that happens on AMD GCN cards. Win-win. On the other hand, Nvidia's G-Sync will only work on Nvidia 6xx cards and higher, and will probably stay Nvidia-exclusive.

Mantle isn't open right now.

As per Robert Hallock, gaming PR at AMD:

https://twitter.com/Thracks/status/383872285351739393


Yeah, the kit is only for certain TN monitors, but that doesn't rule out buying a monitor in the future. The first monitors with this tech in them are being released in Q1 2014; the brands and number of models haven't been specified, but if there are more down the track it may become a standard.

 

http://uk.hardware.info/news/37490/asus-announces-vg248qe-monitor-with-g-sync

 

$280 monitor + G-Sync = $400 monitor. lol.

 

For $400 you can get a PA24... ^^

 

Seriously, I'm not saying "if it has G-Sync I won't buy it"; I'm saying I don't care for it. One of my monitors has Nvidia 3D / LightBoost... I haven't even installed the 3D drivers...

Frost upon these cigarettes.... lipstick on the window pane...


I thought it was pretty clear that I meant Mantle.

 

Also, it has been mentioned several times now that G-Sync might see a licensing scheme ...

Haha, alright, easy, easy. I was never disputing anything about G-Sync at all.

 

Sorry that I didn't see it was so obvious.

THE BEAST Motherboard: MSI B350 Tomahawk   CPU: AMD Ryzen 7 1700   GPU: Sapphire R9 290 Tri-X OC   RAM: 16GB G.Skill FlareX DDR4

PSU: Corsair CX650M   Case: Corsair 200R   SSD: Kingston 240GB SSD Plus   HDD: 1TB WD Green Drive and Seagate Barracuda 2TB Media Drive


The thing about AMD's Mantle is that it is OPEN, and Nvidia or a third party could come in and add code to the Mantle API to allow the same/similar "to the metal" programming that happens on AMD GCN cards. Win-win. On the other hand, Nvidia's G-Sync will only work on Nvidia 6xx cards and higher, and will probably stay Nvidia-exclusive.

I guess you're wrong about Nvidia using Mantle. First, they would have to get a license for that from AMD; second, they would most likely have to change the architecture of their products, and that's where it gets worse: even if they wanted to change and adopt the architecture, it would take time, LOTS of time. The GPUs they're making and releasing now started development at least 2-3 years ago; it's not something you can do in a few months =D I doubt that's ever going to happen. Plus, "Nvidia" and "open platform" in the same sentence doesn't look right to me =DD But the future will show; technology changes ever faster. I think gaming as we know it today will change dramatically in 5 years, like it did when 3D graphics first appeared. I still remember the 2D games I played, and now there are things like TrueAudio, G-Sync, Mantle, 3D, Oculus... DAMN... I hope I'll be playing Crysis 6 on an Oculus =DDD with all that and more enabled.

http://www.speedtest.net/my-result/2823229239

Made at 17:00, so it's the worst time =D I usually have 97+ Mbps down / 90+ Mbps up. -=Fact: Lithuania has some of the best internet in the world and the best internet infrastructure in Europe.=-


http://uk.hardware.info/news/37490/asus-announces-vg248qe-monitor-with-g-sync

 

$280 monitor + G-Sync = $400 monitor. lol.

 

For $400 you can get a PA24... ^^

 

Seriously, I'm not saying "if it has G-Sync I won't buy it"; I'm saying I don't care for it. One of my monitors has Nvidia 3D / LightBoost... I haven't even installed the 3D drivers...

Technically, because the chip is in the monitor, it could be reverse engineered. If AMD wanted to write code to make that work, they could without a license; no way a judge would rule against running code that somehow "talks" to a monitor that someone else owns. Even with zero information, that monitor could be reverse engineered within 30 days. I wonder if Nvidia would sue AMD if they did that. It would be hard to license, and would AMD even ask to be able to? They could just contact a vendor, buy a monitor, and well... when you're a company that works with vendors who sell G-Sync monitors and sell video cards with your GPUs... What am I thinking, that would never happen. G-Sync + Mantle, hmm... seems like a winner. AMD could call it "No-Tear" LOL


Mantle is far more exciting for me... It seems to me that the same effect as G-Sync can be achieved with a higher monitor refresh rate, so the real argument should be G-Sync vs 120/240Hz monitors.

Really, we should be pushing for more 240Hz monitors rather than G-Sync. Owning a 120Hz monitor myself, I don't notice tearing at all anymore, and at 240Hz I'm sure the difference is a lot greater.

Screen tearing is most obvious when your FPS exceeds your refresh rate, so with a 240Hz monitor you'd need an FPS above 240 to get that kind of tearing in the first place, and going by my experience of 60Hz vs 120Hz, tearing at 240Hz should not be perceivable. Not to mention you'd actually be using all those frames that would normally not be shown on a 60Hz monitor.
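
For anyone who wants to poke at the numbers, here's a toy model of where tear lines land; the simplifying assumptions are mine (no vsync, the buffer swap happens the instant a frame finishes, and scanout takes the whole refresh interval):

```python
# Toy tearing model, no vsync. Assumptions (mine): the swap happens the
# instant a frame finishes, and scanout takes the full refresh interval.
def tear_lines(fps, refresh_hz, duration_ms=100):
    frame_time = 1000.0 / fps      # ms between finished frames
    refresh = 1000.0 / refresh_hz  # ms per scanout pass
    tears, t = [], frame_time
    while t < duration_ms:
        pos = (t % refresh) / refresh  # fraction of the screen drawn when the swap lands
        if 0.01 < pos < 0.99:          # swap mid-scanout => visible tear line
            tears.append(round(pos, 2))
        t += frame_time
    return tears

print(tear_lines(40, 60))    # tears can land even with FPS below the refresh rate
print(tear_lines(300, 240))  # above the refresh rate they land far more often
```

In this toy model a tear can land even below the refresh rate; at a higher refresh, consecutive frames are closer together in time, so the discontinuity at the tear line is smaller and harder to see.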

 

In conclusion, 240Hz > G-Sync.


Everyone says don't buy AMD because they suffer from really bad frame stutter and tearing (now largely fixed) and that everyone should buy Nvidia (despite both having relatively the same frame times on single GPUs).
Everyone says we need to get rid of DirectX.

Nvidia creates a technology to reduce tearing on their cards.
AMD creates a new API to help get us away from DirectX.
Wow, this is tough....

Honestly, both are great. The whole vsync = G-Sync thing isn't lost on me; I know that just because vsync is locking you to your monitor's 60fps doesn't mean frames are displayed the moment they're rendered. What g-string, I mean G-Sync, does is make sure the frame from your GPU is displayed at the same time your screen refreshes.
Both require new-ish cards, but Nvidia wanting us to buy a new card and a new monitor will leave it stagnant for at least a while.

That said, much like when Nvidia was getting its arse kicked in the £/perf comparison of their cards (as seen in a previous LTT video), they panicked, and only then did they tell us about frame pacing, which they had known about for years. Now that AMD has nearly fixed those issues, Nvidia has no secret ace up its sleeve anymore, so they're making a new thing of frame pacing with G-Sync, despite the fact that no one seemed to care before (any Nvidia fan want to claim they have really bad frame tearing?... didn't think so).

Mantle will be getting support from a few places, and while it could always fade away into technological obscurity, the idea that it's easy to work with and not as demanding as DX suggests it's easier for devs and could potentially (my own educated guess) give performance benefits similar to what the new Linux kernel is providing, in the area of 50%.
So instead of getting 60fps in BF4 with an R9 280X we could be looking at 90, and a solid 120fps (with vsync) in CrossFire or with an R9 290X (hypothetically speaking).

Neither will win or lose in the end, but the industry will move forward, which is a win for us whichever way you want to play.

Edit: also, anyone saying AMD has frame rate issues hasn't been around for the last month or so; their frame times are for the most part equal to Nvidia's.
This time next year it's entirely possible that AMD has implemented a hardware fix for it AND Mantle has been used in multiple titles and progressed to a very good level, and they may have reverse engineered G-Sync and be able to use it. I can't imagine AMD getting sued because their frame pacing software (whether stolen or not) happens to work fine on any G-Sync enabled monitor.

Nvidia, however, I can't see having half of the things AMD has created over the last few years: TrueAudio, Mantle, TressFX, RAMDisk, APUs/heterogeneous computing, REALLY supporting open projects such as OpenSolaris and things like Linux.

Who's to say you couldn't use a G-Sync Nvidia card paired with a 6800K APU and have TrueAudio and TressFX offloaded to the integrated APU, or have an AMD card with things offloaded the same way we used to have a PhysX card (like a GT 430) to accompany Nvidia graphics cards?
They both make cards, they both make mobile GPUs (Tegra; Radeon/Adreno), and they both have a cloud gaming thing, with "Grid" and "Sky".

Like I said, we win.

Falcon: Corsair 750D 8320at4.6ghz 1.3v | 4GB MSI Gaming R9-290 @1000/1250 | 2x8GB 2400mhz Kingston HyperX Beast | Asus ROG Crosshair V Formula | Antec H620 | Corsair RM750w | Crucial M500 240GB, Toshiba 2TB, DarkThemeMasterRace, my G3258 has an upgrade path, my fx8320 doesn't need one...total cost £840=cpu£105, board£65, ram£105, Cooler £20, GPU£200, PSU£88, SSD£75, HDD£57, case£125.

 CASE:-NZXT S340 Black, CPU:-FX8120 @4.2Ghz, COOLER:-CM Hyper 212 EVO, BOARD:-MSI 970 Gaming, RAM:-2x4gb 2400mhz Corsair Vengeance Pro, GPU: SLI EVGA GTX480's @700/1000, PSU:-Corsair CX600m, HDD:-WD green 160GB+2TB toshiba
CASE:-(probably) Cooltek U1, CPU:-G3258 @4.5ghx, COOLER:-stock(soon "MSI Dragon" AiO likely), BOARD:-MSI z87i ITX Gaming, RAM:-1x4gb 1333mhz Patriot, GPU: Asus DCU2 r9-270 OC@1000/1500mem, PSU:-Sweex 350w.., HDD:-WD Caviar Blue 640GB
CASE:-TBD, CPU:-Core2Quad QX9650 @4Ghz, COOLER:-OCZ 92mm tower thing, BOARD:-MSI p43-c51, RAM:-4x1GB 800mhz Corsair XMS2, GPU: Zotac GTX460se @800/1000, PSU:-OCZ600sxs, HDD:-WD green 160GBBlueJean-A
 CASE:-Black/Blue Sharkoon T9, CPU:-Phenom2 x4 B55 @3.6Ghz/1.4v, COOLER:-FX8320 Stock HSF, BOARD:-M5A78L-M/USB3, RAM:-4GB 1333mhz Kingston low profile at 1600mhz, GPU:-EVGA GTX285, PSU:-Antec TP550w modu, STORAGE:-240gb  M500+2TB Toshiba
CASE:-icute zl02-3g-bb, CPU:-Phenom2 X6 1055t @3.5Ghz, COOLER:-Stock, BOARD:-Asrock m3a UCC, RAM:2x2GB 1333mhz Zeppelin (thats yellow!), GPU: XFX 1GB HD6870xxx, PSU:-some 450 POS, HDD:-WD Scorpio blue 120GB
CASE:-Packard Bell iMedia X2424, Custom black/red Aerocool Xpredator fulltower, CPU's:-E5200, C2D [email protected] (so e8500), COOLER:-Scythe Big shuriken2 Rev B, BFG gtx260 sp216 OC, RAM:-tons..
Gigabyte GTX460, Gigabyte gt430,
GPU's:-GT210 1GB,  asus hd6670 1GB gddr5, XFX XXX 9600gt 512mb Alpha dog edition, few q6600's
PICTURES CASE:-CIT mars black+red, CPU:-Athlon K6 650mhz slot A, COOLER:-Stock, BOARD:-QDI Kinetiz 7a, RAM:-256+256+256MB 133mhz SDram, GPU:-inno3d geforce4 mx440 64mb, PSU:-E-Zcool 450w, STORAGE:-2x WD 40gb "black" drives,
CASE:-silver/red raidmax cobra, CPU:-Athlon64 4000+, COOLER:-BIG stock one, BOARD:-MSI something*, RAM:-(matched pair)2x1GB 400mhz ECC transcend, GPU:-ati 9800se@375core/325mem, PSU:-pfft, HDD:-2x maxtor 80gb,
PICTURES CASE:-silver/red raidmax cobra (another), CPU:-Pentium4 2.8ghz prescott, COOLER:-Artic Coolering Freezer4, BOARD:-DFI lanparty infinity 865 R2, RAM:-(matched pair)2x1GB 400mhz kingston, GPU:-ati 9550@375core/325mem, PSU:-pfft, HDD:-another 2x WD 80gb,
CASE:-ML110 G4, CPU:-xeon 4030, COOLER:-stock leaf blower, BOARD:-stock raid 771 board, RAM:-2x2GB 666mhz kingston ECC ddr2, GPU:-9400GT 1GB, PSU:-stock delta, RAID:-JMicron JMB363 card+onboard raid controller, HDD:-320gb hitachi OS, 2xMaxtor 160gb raid1, 500gb samsungSP, 160gb WD, LAPTOP:-Dell n5030, CPU:-replaced s*** cel900 with awesome C2D E8100, RAM:-2x2GB 1333mhz ddr3, HDD:-320gb, PHONE's:-LG optimus 3D (p920) on 2.3.5@300-600mhz de-clock (batteryFTW)

Technically, because the chip is in the monitor, it could be reverse engineered. If AMD wanted to write code to make that work, they could without a license; no way a judge would rule against running code that somehow "talks" to a monitor that someone else owns. Even with zero information, that monitor could be reverse engineered within 30 days. I wonder if Nvidia would sue AMD if they did that. It would be hard to license, and would AMD even ask to be able to? They could just contact a vendor, buy a monitor, and well... when you're a company that works with vendors who sell G-Sync monitors and sell video cards with your GPUs... What am I thinking, that would never happen. G-Sync + Mantle, hmm... seems like a winner. AMD could call it "No-Tear" LOL

 

Should this actually become a standard [including IPS/PLS monitors], and should it actually be more than a $120-175 gimmick, there will surely be workarounds. Yet Nvidia has a lot of these gimmicks, namely 3D Vision, Shield, Titan, and I don't care for any of them ^^

Frost upon these cigarettes.... lipstick on the window pane...


I agree with Linus! But I feel like we're comparing two completely different things. They are two different technical achievements, and I don't like the fact that we have to pick between feature sets on graphics cards. This might be a bad analogy, but it might be like features on cars: what you find on a Mercedes eventually works its way down to your Toyota a few years down the line (I'm a bit new-school on the computer hardware side, so maybe there's something like this on the computer side too). I hope both of these technologies get implemented somewhere down the line, even if it's not by AMD or Nvidia; I believe it will be a better experience for the end user with either Nvidia or AMD GPUs. I have an Nvidia card now, and I will give G-Sync a try when monitors are available for a good (non-launch-hype) price.

CPU: 3770K | Motherboard: Asus Maximus V Formula | GPU: GTX 1080Ti SLI | Ram: 16gb Corsair Dominator Platinum 2133 | PSU: Corsair AX1200i | Storage: 2x Samsung 840 pros | Case: Corsair 650D | Cooler:  Corsair H100i with Noctus NF-F12 fans | Monitor: Asus ROG Swift PG278Q


I'll wait and see the Mantle presentation in November before I make my judgement.
But I've never had any stutter or tearing, since I run 120+ fps on a 144Hz monitor, so G-Sync doesn't interest me right now.
Mantle sounds more impressive. I say "sounds" because we haven't seen it yet.


How can you even put these two in a fight? One is for making the screen not tear and lag and feel more realistic (G-Sync).

 

The other is for making games more optimized for your PC, like how a console game out of the box is already optimized for your console's hardware; it's to make games run better.

 

I say combine the two if anything; have the best of both worlds lol.


They can't go head-to-head like that.

Meaning: Mantle will improve performance in a large number of games, but not every game will adopt Mantle. Not every studio has the resources to develop for both Mantle and DX11, which is why a large number of games will not take advantage of it, especially indie games without a large budget.

G-Sync, however, will dramatically improve how every game feels and will really improve the experience.

On the other hand, Mantle will not require you to buy a new monitor. So it really depends..

As limiting as DirectX is, going the extra mile for deeper optimization will cost developers more money, and a lot of games will either not optimize the way AAA titles will or not support it at all (in games like Battlefield 4, the Mantle flagship, I bet the performance improvement will be very respectable).

I'm extremely excited for Mantle, and I will play around with it a little and get my own impressions. G-Sync is also exciting, but I'm a bit worried that G-Sync monitors will be expensive compared to the alternatives.

Nvidia said they are planning to make a DIY kit to upgrade monitors, but this might not be practical because there are so many monitors on the market.


Everyone keeps forgetting that the GPU API is not the only part of game software development.... Mantle, like OpenGL, is strictly graphics with no utility layer. DirectX provides a general game programming API that handles several things like networking, sound, file I/O, and hardware I/O, and the list is pretty extensive... I like the theory of Mantle, but in practice it just doesn't work like that, and it's likely that the bulk of the bottlenecking is not at the DirectX graphics API layer. The other thing people are forgetting is that Mantle adoption will be slow if it even kicks off, and only really big studios or developers using off-the-shelf engines will be able to justify catering to Mantle.

Let me put this in another perspective for a second: the Xbox uses, and always has used, DirectX. The bottleneck is really the operating system, not the API; the Xbox operating system is tuned to run games really well, Windows is not. They have different agendas.

And the last point in my argument is that DirectX has been on the scene for a while. It has gone through 11 major iterations, it knows what game developers need, and it has already faced many of the challenges Mantle will need to overcome; some of those may mean sacrificing performance for some functional benefit.

And this concludes my rant about the delusion people seem to have about Mantle: it is simply another graphics API with a slightly different set of calls, doing exactly the same thing others before it have done. Maybe we will get a 5% performance boost at the cost of a bottleneck elsewhere... who knows, but it's certainly not the thing people are hyping it up to be.
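
To put rough numbers on that "where is the bottleneck" question, here's a crude single-frame cost model; every figure in it is a made-up assumption, purely to illustrate when a thinner API helps and when the bottleneck is elsewhere:

```python
# Crude cost model (all numbers invented for illustration): a frame takes
# as long as the slower side, CPU submission or GPU rendering.
def frame_ms(draw_calls, api_overhead_us_per_call, gpu_ms):
    cpu_ms = draw_calls * api_overhead_us_per_call / 1000.0  # CPU-side API submission
    return max(cpu_ms, gpu_ms)  # the slower side sets the frame time

# CPU-bound scene: 5000 draw calls at a hypothetical 10us vs 2us per call.
print(frame_ms(5000, 10, gpu_ms=12))  # 50.0 ms -> a thinner API could win big here
print(frame_ms(5000, 2,  gpu_ms=12))  # 12.0 ms -> now GPU-bound again
# GPU-bound scene: the API layer was never the problem.
print(frame_ms(500, 10, gpu_ms=20))   # 20.0 ms either way
```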


I just came here to vote this topic down to the lowest score, since trash of this level should not be on this forum. Lock it ASAP.


I must say both are fairly good. The biggest reason some people say Mantle is better is the cost. If they can make a module for my QNIX I will go for it, but for me it's hard going from a 1440p 96Hz PLS panel to a TN 1080p panel; the viewing angles are just plain awful. The more I think about G-Sync, the more it makes sense.


I'm saying neither; both rely on other people taking them on, or on becoming an industry standard in 20 years.

We all know Mantle needs the developers.

G-SYNC needs monitor and TV companies to use it.

If neither does, it'll be a fad and burn out.


I really want to hear more arguments about G-Sync versus 144Hz monitors. Today there are even 400Hz or 1000Hz TVs to buy. If G-Sync is only equivalent to more Hz, then what's the point?


Apparently the Frostbite 3 engine used in Battlefield 4 is based on Mantle. I'm not sure how that works for Nvidia; does the engine use Mantle for AMD and then DirectX for Nvidia GPUs on Windows?

Yes. That's why it's not so easy to implement for games that are not made for consoles: it's additional work for devs, because they have to make the game work with both DirectX and Mantle.
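
To make the "additional work" concrete, here's a minimal sketch of how engines typically cope with multiple APIs; all class and method names are invented for illustration, this is the general pattern, not Frostbite's actual code:

```python
# One abstract renderer interface, one concrete backend per API: every new
# rendering feature has to be implemented and tested once per backend.
from abc import ABC, abstractmethod

class RendererBackend(ABC):
    @abstractmethod
    def create_buffer(self, data: bytes): ...
    @abstractmethod
    def submit_draw(self, buffer, shader: str): ...

class D3D11Backend(RendererBackend):
    def create_buffer(self, data):          # would wrap D3D11 buffer creation
        return ("d3d11_buffer", len(data))
    def submit_draw(self, buffer, shader):
        print("D3D11 draw:", buffer, shader)

class MantleBackend(RendererBackend):
    def create_buffer(self, data):          # would wrap the Mantle equivalent
        return ("mantle_mem", len(data))
    def submit_draw(self, buffer, shader):
        print("Mantle draw:", buffer, shader)

def render_frame(backend: RendererBackend):
    buf = backend.create_buffer(b"\x00" * 64)
    backend.submit_draw(buf, "forward_lit")  # engine code stays API-agnostic

render_frame(D3D11Backend())   # Nvidia (and non-GCN) path
render_frame(MantleBackend())  # AMD GCN path
```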


I really want to hear more arguments about G-Sync versus 144Hz monitors. Today there are even 400Hz or 1000Hz TVs to buy. If G-Sync is only equivalent to more Hz, then what's the point?

 

There aren't 400-1000Hz TVs; that's marketing nonsense. From what I understand, they pulse the backlight to give the illusion of smoothness and to have something to say on the box. Plus, even at 144Hz you get stuttering at lower framerates; it's just not as noticeable because there's less variation.

 

E.g. at a constant 40fps (quick sketch after the list):

- on a 144Hz monitor the gap between new frames will jump between 21ms and 28ms

- on a 60Hz monitor the gap will jump between 17ms and 33ms

- on a g-sync monitor it will always be 25ms, every single frame
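
Those numbers are easy to reproduce with a quick back-of-the-envelope sketch (mine, assuming vsync, i.e. each finished frame waits for the next refresh):

```python
import math

# With vsync on a fixed-refresh monitor, a finished frame waits for the next
# refresh, so the visible gap between frames jumps between multiples of the
# refresh interval.
def visible_gaps(fps, refresh_hz, n_frames=12):
    frame_time = 1000.0 / fps       # ms between finished frames
    refresh = 1000.0 / refresh_hz   # ms between refreshes
    shown = [math.ceil(i * frame_time / refresh - 1e-9) * refresh
             for i in range(1, n_frames + 1)]
    return sorted({round(b - a, 1) for a, b in zip(shown, shown[1:])})

print(visible_gaps(40, 144))  # [20.8, 27.8] -> the "21ms and 28ms" above
print(visible_gaps(40, 60))   # [16.7, 33.3] -> the "17ms and 33ms" above
# A g-sync monitor refreshes when the frame is done: a constant 25.0 ms.
```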

 

I'll take Linus' word for how it actually looks, but in theory it should be pretty awesome. And I'm hyped.

Fools think they know everything, experts know they know nothing

