
AMD Mantle vs Nvidia G-Sync

I'm Batman

 

ヽ༼ຈل͜ຈ༽ノ RAISE YOUR DONGERS ヽ༼ຈل͜ຈ༽ノ

 

I'm currently in creases, mate. I was wondering when somebody was going to post this.

CPU: i5-2500K @ 4.8 GHz, MB: Asus Maximus V Formula, CPU cooler: be quiet! Dark Rock Pro 2, GPU: EVGA GTX 660 FTW @ 1.24 GHz, RAM: Corsair Vengeance 8GB 1866 MHz, PSU: be quiet! 730W semi-modular, SSD: Corsair Force 3 240GB, HDD: WD Green 1TB, Case: NZXT H2 with 4 Corsair SP120s, Win7


G-Sync really sounds great. The only problems with it are monetary, plus the fact that only TN panels are being used (so I've read). Nvidia's cards, where I live, are approaching twice the price of AMD's. A 7950 goes for as little as £190 whereas a 770 goes for £310 (both the cheapest reputable brands, MSI and Gigabyte respectively). Then there's the cost of the new monitor. I have no idea how well these will perform (colours, viewing angles, etc.) nor how much they will cost; I'd guess at least £200. That's a significant price to pay overall.

 

AMD have a great price-performance ratio from the start. They see significant performance boosts in Linux (normally up to 50%, and even around 90% in some cases) and further boosts in SteamOS according to Valve (I think Nvidia shares the SteamOS benefit). Add Mantle to that and you have potentially huge performance increases, especially in SteamOS, without doing anything and while saving hundreds of pounds. This might even reach 60 fps minimums in many games, which I think gives much the same effect as G-Sync (?). That's being optimistic, though.

 

I'm honestly stuck, as I was JUST about to buy an AMD card thinking it was a no-brainer, but the announcement of G-Sync has screwed that up. I'm trying to predict the future at this point. It's still looking like I'll go with AMD, but I don't know.



Wait until the upgrade kits come out and reviewers start taking other monitors apart to see if they are compatible.

Intel 3570k @ 4.4 GHz |Asus Sabertooth Z77 |EVGA GTX 660 Ti FTW |Kingston HyperX Beast 16 Gb DDR3 1866 (2x8Gb)


|Samsung 840 250 GB |Western Digital Green 2TB 2x |Cooler Master 850w 80+ Gold |Custom Water Cooling Loop |Noctua NF-F12 4x
|Noctua NF-A14 3x |Corsair Carbide 500R (White) |Corsair K95 |Razer Mamba |Razer Megalodon |Samsung SyncMaster T220 2x Computer Bucket List   Greatest Thread Ever   WAN Show Drinking Game  GPU Buyers Guide

I'm really excited for both, but really worried about the market fragmentation Mantle might cause.



I agree about the market fragmentation. But I don't think Mantle is really going to take off. It's a neat feature for the few companies who are going to use it, but I don't think many will. G-Sync, on the other hand, is the most exciting thing I've heard about since SteamOS. Can't wait to get a 2560x1440 monitor with G-Sync.



I'm on the fence on Mantle: if a ton of console developers start using it then it could end up being big; if not, it just fragments the API space. With G-Sync I honestly think Nvidia should license it to AMD and Intel immediately. It helps Nvidia recoup R&D costs and allows for broader use of the technology without fragmenting the market, while reducing cost to the consumer (more people using it, more economies of scale, cheaper product). The last thing consumers want from G-Sync is to be forced to buy/upgrade a monitor each time they switch GPU teams. G-Sync needs to be an industry standard, not a selling point.


I believe G-Sync will become an industry standard at some point. Nvidia will probably milk it a little to see how many more people they can convert from AMD, and then license it. Seeing as Mantle isn't available for use on consoles, I don't see why developers would set aside the not insubstantial amount of money and time to develop for AMD cards alone. Last I checked, Nvidia is the leader in discrete GPUs, which is what most PC gamers use, so I can't really see the benefit to developers.

 

EDIT: I looked it up and it seems that AMD has taken back some market share from Nvidia lately. (I don't pay much attention to AMD in all honesty.)


NVIDIA G-Sync Overview and Explanation with Tom Petersen - PC Perspective

 

Intel i5 3570k | MSI GTX 670 Power Edition/OC SLI | Asus Sabertooth Z77 | Corsair Vengeance LP 16GB | Fractal Design Newton R2 650W | NZXT Switch 810 SE Gun Metal | BenQ 24" XL2420T 120Hz | Corsair K90  | Logitech G500 / Logtitech Performance MX | Sennheiser PC 360 | Asus Xonar DGX | NVIDIA GeForce 3D Vision 2 Wireless Kit


I have no doubt that Nvidia has a patent on this (and they should; they spent however many millions on R&D to develop it), so if Intel or AMD want to develop something, it MUST be in a form that doesn't violate the patent. Further, if we get competing technologies, all that does is force consumers to purchase their monitor/monitor upgrade along with their GPU; it splits the market into obligate Nvidia and obligate AMD camps due to the cost to the consumer, which limits competition instead of encouraging it. In my mind the best-case scenario would be Nvidia licensing it to AMD (good for both parties: Nvidia increases demand for their product and AMD gets to solve a major issue, namely tearing artifacts). Honestly, if G-Sync works as advertised (all signs point to yes), we should hope it becomes a standard, not a point of competition.

Not true at all. First of all, Nvidia can patent a technology, but they cannot patent the idea. Since the ways of synchronizing GPUs and monitors are basically unlimited, it is very likely that Intel or AMD can develop an open-licence technology that doesn't infringe on G-Sync, which would force Nvidia to abandon their technology and adopt the open one. Why is this important? Exactly for the reason you mentioned: as it stands, users who want this type of technology are pretty much forced to use Nvidia GPUs. And if Nvidia starts licensing G-Sync, that is also not beneficial for consumers, since Nvidia will gain an advantage over the rest of the GPU developers (AMD mainly, but also Intel).

 

So, if I were the head of a competing GPU brand, I would start developing different/better approaches right now, as we speak, and make them open. In the end I benefit (because I am not forced to pay a licence fee), and users benefit (since everyone moves to the open/better technology). That way, I force competing manufacturers to use my technology and abandon their own. The end result is an open standard rather than a closed, limited technology. Every manufacturer benefits from it, so no manufacturer can gain an advantage over another except through build quality: a win-win situation for all.

 

Since the possible implementations of this type of technology are pretty much unlimited, it is just a matter of time before this happens, not a question of "if", and I hope it happens very soon, before real damage is done to the other GPU/display manufacturers. The sooner we get it, the less damage will be done.



I never stated that the concept should/could be patented, only that Nvidia's iteration probably is (and should be); further, you can patent things other than technology (i.e. a process). I have to disagree that Nvidia licensing it to AMD/Intel would be bad or give Nvidia a major hold over the competition. Major technology companies license patents and concepts all the time and it doesn't fundamentally change the market players (Intel's x86 architecture). Open source is not always the answer and a closed platform is not always bad; there are major advantages/disadvantages to both. The key here (in my view) is to achieve one standard for consumers so as to limit the cost of upgrading. The worst thing that could happen would be to fragment the market to where Intel users have one standard, AMD has another, and Nvidia has a third. Consumers would lose. Nvidia has first-mover advantage: they spent the money, did the R&D, and are releasing the product. If it is a big enough solution and catches on fast enough, it probably will become the standard and Nvidia will reap the lion's share of the benefit (as they should, given their investment).



OK, I can't disagree with your points, except one. For example, x86 did influence the major CPU manufacturers at the time, so we actually lost out: instead of having multiple CPU manufacturers, we as consumers have, at best (for the PC market), two options.

 

Also, display technology should not be developed by GPU manufacturers but by display manufacturers. Only in that case could you have it on all platforms. Instead of a fragmented market, with an open licence you would have adoption by all manufacturers. As it stands, we have several problems:

 

Problem 1: Display manufacturers can't implement this technology without an Nvidia licence and/or verification. This is grounds for monopolizing the display market; even worse, Nvidia can set different contract conditions for manufacturer X and manufacturer Y, giving an artificial advantage to manufacturer X, for example. This is not good by any means.

 

Problem 2: GPU manufacturers meet the same fate if they opt to use that technology, and only if Nvidia allows them to (grants them a licence).

 

Now, a fragmented market is a bad outcome, but also an unrealistic one. If both AMD and Intel do the same thing and lock down their platforms, then you will have a fragmented market, and that is definitely what we don't want. However, if either of those two manufacturers creates the same thing with a different (maybe even better) implementation and makes it open, that automatically means cheaper production, and furthermore adoption by a large number of display manufacturers, and in the end it means the end of the locked platform in favour of the open one. It doesn't hurt anyone (not even Nvidia), and it benefits consumers and the majority of other manufacturers.

 

Nvidia is entitled to patent its own technology, of course; that is not even in question. How much effort and money they invested in it is the question. Don't get me wrong, it is great that someone finally addressed these issues, but knowing how things usually end up (especially when someone has a large market share), this is not good at all. So I say: if I were in AMD's/Intel's place, I would be considering other ways of implementing this technology, and making them open, not because I love consumers, but because I don't need/want the damage to myself.



Apologies in advance as I am typing this on my phone, so I will probably miss some stuff; I'll edit and expand when I can get to a keyboard. First, x86 was probably not the best example (though I see you got my general point). Second, I absolutely agree that this is a technology the display manufacturers 'should' have developed; however, as with most things in that market space, the approach seems to be to do as little as possible to innovate (how powerful would GPUs be if we were in the middle of the 4K changeover instead of the beginning? Same with 120/144 Hz). A direct licence to display manufacturers doesn't seem to be the best option unless the technology is available to AMD/Intel as well. In my mind Nvidia would be stupid not to license it, as that would speed adoption and lower costs (though that has never stopped companies from making bad decisions in the past). I disagree that having an open-source competitor solves all problems, as open source doesn't guarantee market success/dominance (iOS thrives alongside Android, Linux has never threatened Windows, etc.) and, as in the case of Android, often leads to market fragmentation in and of itself. Anyway, sorry if this is disjointed or if I failed to address some of your major points; typing on a phone sucks. I'll edit this in a few hours, just wanted to get my general thoughts out.


I came into this thread unbiased, but the smug arrogance of Nvidia fanboys really put me off voting for G-Sync... so I voted Mantle. A silly reason for selecting a poll option, I know.



I totally agree with you that G-Sync should be an industry standard.

But Mantle isn't used on consoles; they have their own APIs.

The problem with Mantle is that only 7xxx/Rx-2xx cards support it, and AMD's market share, even counting the 4xxx/5xxx/6xxx cards, isn't even 50%.

It also needs support from devs, which is hard to get (the PC as a platform already struggles to get support from devs), so I don't see AMD getting that support.

I think G-Sync is just the start and will later become a standard for monitors/TVs/tablets/smartphones/Oculus.

And if Nvidia is smart they will partner up with Sony/MS and add console support.

 

RTX2070OC 


I'm not sure if this has been posted yet, but I think it will help people get a good understanding of G-Sync, displays in general, and what exactly this tech is, because even now most people really don't know what it is. I made false assumptions when I first heard about it, and it wasn't until I heard Carmack discuss high/low persistence in the displays he works on for Oculus VR that I finally 'actually' understood just how important this is.

 

Anyway, watch this video, and then Linus' discussion again with Carmack, who makes the most important point of all.

 

He makes some important points about this. If you still can't imagine how good this can be, take a small window or object on your desktop that has text in it, like Notepad, write a sentence in it, then drag it up, down, and around your screen while trying to read the text as you move it fast: you can't. It's more complicated than just this, but it's one small example of how G-Sync will help. It's a serious problem in VR right now, not so much for us regular gamers, but it matters and it's awesome.

Beware the irrational, however seductive. Shun the ‘transcendent’ and all who invite you to subordinate or annihilate yourself. Distrust compassion; prefer dignity for yourself and others. Don’t be afraid to be thought arrogant or selfish. Picture all experts as if they were mammals. Never be a spectator of unfairness or stupidity. Seek out argument and disputation for their own sake; the grave will supply plenty of time for silence. Suspect your own motives, and all excuses. Do not live for others any more than you would expect others to live for you.



 

It's interesting to note that at some point soon after 45:00 in the first video, from what Tom Petersen says, it sounds to me like licensing the technology is a real possibility. It also sounds like AMD could find a way to utilise the module themselves, and it seems implied that this would be OK/not condemned by Nvidia. He does make it clear, however, that neither of these possibilities will "happen any time soon" and that Kepler GPUs have some hardware on them that helps G-Sync work perfectly.


G-Sync looks promising, but it needs new monitors, while Mantle is free to all AMD GCN users. If Mantle actually delivers, it will blow G-Sync out of the picture.



They aren't comparable. At all. It's like comparing a pie to a cell phone. The only similarity is that both products' success will rely on their adoption rates. Mantle will give AMD cards a performance boost (how much is unknown), but ONLY in games developed with Mantle. G-Sync makes any game look better (IMHO Nvidia would be smart to license this to Intel and AMD quickly to speed adoption), and any monitor with the correct slot can take a G-Sync board (so far only four are confirmed, but as soon as the upgrade kit comes out reviewers will start tearing panels apart to check for compatibility). Mantle will not 'blow G-Sync out of the water', nor will G-Sync destroy Mantle; they do not operate in the same product space. One is hardware, the other is an API (a dev tool).



However, I want to point out one thing that most (or even all, as far as I've seen in the comments) people are missing.

G-Sync is a game changer, but not to the extent people are making out.

If you have a 120+ Hz display, you will most likely have a very smooth experience (even with frame-rate fluctuations) without V-sync; G-Sync will push it to look perfect (in other words, even better). But if you have game-engine stutter, it will be visible on both a non-G-Sync display and a G-Sync display, because it is the game that creates the big latency between frames. Therefore, what the people at Nvidia's press conference said is a half truth, not the entire truth.

What Linus and others saw (in the Nvidia demo) is some sort of "perfect scenario", and probably a "worst scenario" for the regular display, driven by the game engine alone.

Bottom line: developers will still have a lot of work optimizing games, even with G-Sync.
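The point about engine stutter can be sketched with a toy model (purely illustrative; the function and numbers here are made up, not from Nvidia's materials). With V-sync on a fixed-refresh panel, each frame is held for a whole number of refresh intervals; a variable-refresh (G-Sync-style) panel shows each frame for however long it took to render. The quantization penalty disappears, but a genuine engine hitch is long either way:

```python
import math

# Toy model of how long each rendered frame stays on screen.
# frame_times_ms: time the game engine took to produce each frame.
def onscreen_durations(frame_times_ms, refresh_ms=None):
    # refresh_ms=None models a variable-refresh (G-Sync-style) panel:
    # the panel refreshes the moment the frame is ready.
    if refresh_ms is None:
        return list(frame_times_ms)
    # With V-sync on a fixed-refresh panel, a frame is held until the
    # next refresh boundary, i.e. a whole number of refresh intervals.
    return [math.ceil(t / refresh_ms) * refresh_ms for t in frame_times_ms]

engine = [16, 20, 16, 80, 16]                  # one 80 ms engine hitch
gsync = onscreen_durations(engine)             # hitch is still 80 ms
vsync = onscreen_durations(engine, 1000 / 60)  # 20 ms frame balloons to ~33 ms
```

The 20 ms frame that V-sync rounds up to two whole refresh intervals displays at its natural time on the variable-refresh panel, but the 80 ms engine hitch stays visible on both, which is exactly the poster's point.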


Maybe read the thread and you might understand. Both are technologies to reduce tearing and stutter.

The stutters are made by the desync between the monitor and the GPU, so I don't think Mantle will have the kind of optimisation that addresses that. That being said, I would love the idea of having all my cores working together to give me better performance as well as having synchronisation with the monitor... only time will tell if Nvidia is going to pull an "Apple" on the industry.

Intel 4770k : 4.6Ghz @1.285v | Asus Maximum VI Extreme | Asus GTX780 DCUii OC | Corsair Vengeance Pro 16 GB @2400mhz| Corsair AX1200i |

Samsung 840 Pro 256gb | Random mix match 3.5" hard drives |  Asus PB278q

Cooling : EK Supremacy Plexi/Nickle | Swiftech MCP655 Laing D5 | XSPC RX360 | EK D5 X-TOP CSQ |


IMO these two technologies are meant for each other; they'd be a perfect combination. I love the idea of the Mantle API increasing performance in the future while G-Sync, at the same time, eliminates the screen tearing that comes with the increased performance. That will surely make the PC gaming scene awesome-r. :)

"Cough, Cough, Cough"



Any time a frame your GPU puts out doesn't sync with the monitor, it can be noticeable; just watch some 24 fps content on your monitor and watch the panning. Yes, the higher the frame rate the less noticeable this is, but it's still there. This is why, whenever I can, I configure a game to run at a constant frame rate that is either the refresh rate of the monitor or at least a multiple of that refresh rate.
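The 24 fps panning example above comes down to simple arithmetic (an illustrative sketch; the function name is made up). On a fixed 60 Hz display, each source frame becomes visible on the next refresh tick, so 24 fps frames are alternately held for 3 and 2 refreshes (the classic 3:2 cadence), while 30 fps divides evenly:

```python
import math

# How many display refreshes each source frame is held for when the
# content frame rate doesn't divide evenly into the fixed refresh rate.
def refreshes_per_frame(fps, hz, n_frames=8):
    # Each frame becomes visible on the first refresh tick at or after
    # its ideal timestamp; the next frame then takes over the screen.
    shown_on = [math.ceil(i * hz / fps) for i in range(n_frames + 1)]
    return [b - a for a, b in zip(shown_on, shown_on[1:])]

uneven = refreshes_per_frame(24, 60)  # alternating 3s and 2s: judder
even = refreshes_per_frame(30, 60)    # all 2s: smooth panning
```

That uneven hold time is the judder you see when panning 24 fps content on a 60 Hz monitor, and why locking a game's frame rate to the refresh rate (or a clean multiple/divisor of it) pans smoothly.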


  • 1 month later...

As for Mantle, we need to wait for a Mantle-based game (a core/engine made for that technology). The upcoming BF4 Mantle patch won't tell us much about it; maybe a 0-10% increase in fps. One of the developers of BF4 said "not to judge Mantle by its performance in BF4, as it's mostly using it for shadowing" (hearsay).


Sorry, but people, listen to this tune instead; this thread is just bad.

 

Le Bastardo+ 

i7 4770k + OCUK Fathom HW labs Black Ice 240 rad + Mayhem's Gigachew orange + 16GB Avexir Core Orange 2133 + Gigachew GA-Z87X-OC + 2x Gigachew WF 780Ti SLi + SoundBlaster Z + 1TB Crucial M550 + 2TB Seagate Barracude 7200rpm + LG BDR/DVDR + Superflower Leadex 1KW Platinum + NZXT Switch 810 Gun Metal + Dell U2713H + Logitech G602 + Ducky DK-9008 Shine 3 MX Brown

Red Alert

FX 8320 AMD = Noctua NHU12P = 8GB Avexir Blitz 2000 = ASUS M5A99X EVO R2.0 = Sapphire Radeon R9 290 TRI-X = 1TB Hitachi Deskstar & 500GB Hitachi Deskstar = Samsung DVDR/CDR = SuperFlower Golden Green HX 550W 80 Plus Gold = Xigmatek Utguard = AOC 22" LED 1920x1080 = Logitech G110 = SteelSeries Sensei RAW
