The real difference between "FreeSync" and G-Sync

exyia

Mantle is completely open, and since it's a low-level interface between the GPU and the software, all it would take is a game patch and/or a driver patch to add support for Nvidia and Intel GPUs.

It's similar to how many Intel GPUs didn't support OpenCL until Intel wrote drivers for it. Everyone sees OpenCL as AMD-specific even though Intel and Nvidia support it widely; AMD GPUs are simply better at it because they have more raw compute power.

 

[sarcasm] Obviously, since AMD GPUs are better at it, it must be biased. [/sarcasm] All I'm trying to say is that when Intel/Nvidia are on top, it's all fine and dandy, but the moment AMD have something to be proud of and a throne to sit on, someone must come and inspect the finest details of that throne to find the one particle that is not gold.

I'm not saying anything is really bad about Mantle, but it definitely isn't as open as something like OpenCL or OpenGL. It is more open than DirectX, but look at how much support DirectX has. Yes, Nvidia and Intel can use Mantle, as I said, but they would have to build in support for it, and I don't think they would directly support a competitor in something that probably won't help them much. It would be great if they did, but sadly it won't happen.

 

Edit: To clarify, yes, Mantle is open, but there is a big difference between open and open source. Also, yes, it would be possible for Intel and Nvidia to create drivers to support Mantle, but since it was developed with GCN in mind, it would likely perform very poorly on their hardware.

i7 4770k, 16GB Corsair Vengeance

Gigabyte z87, Phanteks Enthoo Primo

7970


I'm not saying anything is really bad about Mantle, but it definitely isn't as open as something like OpenCL or OpenGL. It is more open than DirectX, but look at how much support DirectX has. Yes, Nvidia and Intel can use Mantle, as I said, but they would have to build in support for it, and I don't think they would directly support a competitor in something that probably won't help them much. It would be great if they did, but sadly it won't happen.

AMD has openly stated that if Nvidia will swallow their pride, AMD will gladly give them support for Mantle. I'd just add that by allowing Nvidia to support Mantle, AMD wouldn't only be helping Nvidia; they'd be helping themselves, since it would take their (slightly weaker) CPUs out of the equation in gaming. That puts money back into their pockets as more people buy AMD CPUs, because they no longer need the raw power of an Intel CPU for gaming. If Mantle became the industry standard instead of DirectX, we'd be looking at a lot more money for AMD on the CPU side as well as the GPU side.
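The CPU argument above can be made concrete with a toy bottleneck model: if frame time is roughly the maximum of CPU-side work and GPU-side work, then cutting API/driver overhead on the CPU side can let a slower CPU stop being the bottleneck. All numbers below are invented for illustration, not benchmarks of any real API:

```python
# Toy bottleneck model: frame time ~ max(CPU time, GPU time) per frame.
# All millisecond figures are invented for illustration only.

def frame_time_ms(cpu_game_ms: float, cpu_api_ms: float, gpu_ms: float) -> float:
    """CPU work (game logic plus API/driver overhead) overlaps with GPU work;
    whichever side is slower sets the frame time."""
    return max(cpu_game_ms + cpu_api_ms, gpu_ms)

GPU_MS = 10.0  # the GPU alone could sustain 100 FPS

# A modest CPU with a heavyweight API: CPU-bound at ~71 FPS.
heavy = frame_time_ms(cpu_game_ms=6.0, cpu_api_ms=8.0, gpu_ms=GPU_MS)  # 14.0 ms
# The same CPU with a thin API: GPU-bound again at 100 FPS.
thin = frame_time_ms(cpu_game_ms=6.0, cpu_api_ms=2.0, gpu_ms=GPU_MS)   # 10.0 ms
print(heavy, thin)
```

Under this (oversimplified) model, the thin API only helps while the CPU is the bottleneck, which matches the argument that Mantle would flatter slower CPUs more than it would help a system that is already GPU-bound.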

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


I'm not saying anything is really bad about Mantle, but it definitely isn't as open as something like OpenCL or OpenGL. It is more open than DirectX, but look at how much support DirectX has. Yes, Nvidia and Intel can use Mantle, as I said, but they would have to build in support for it, and I don't think they would directly support a competitor in something that probably won't help them much. It would be great if they did, but sadly it won't happen.

If enough developers and users start demanding Mantle support, I think anything is possible. The mere fact that it would cost Nvidia and Intel R&D money to implement Mantle support into their silicon means they won't do it unless they have to.

1 Timothy 1:15


Nvidia holding the video game industry back with proprietary crap.

Wait, how the hell is this holding back the industry? Studios don't even have to care about this for their games to work, and it's going to work with every 3D game that exists on any platform, whether it's Windows, Linux, or OS X. This is just an add-on that gamers can play with without even knowing it exists. Can you explain how this is going to hold back the industry, please?

this is one of the greatest thing that has happened to me recently, and it happened on this forum, those involved have my eternal gratitude http://linustechtips.com/main/topic/198850-update-alex-got-his-moto-g2-lets-get-a-moto-g-for-alexgoeshigh-unofficial/ :')

i use to have the second best link in the world here, but it died ;_; its a 404 now but it will always be here

 


Who else thinks PhysX is a Gimmick/Joke?

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5' | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


Wait, how the hell is this holding back the industry? Studios don't even have to care about this for their games to work, and it's going to work with every 3D game that exists on any platform, whether it's Windows, Linux, or OS X. This is just an add-on that gamers can play with without even knowing it exists. Can you explain how this is going to hold back the industry, please?

If you program a game for Mantle, you can add more polygons and complexity to the game. Merely adding Mantle support as an afterthought will only make it run smoother and faster on certain cards. For a game developer to want to make a Mantle-only title, the API would need to be more widely adopted.
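The draw-call point can be sketched with a toy budget calculation. The per-call overhead figures are made-up stand-ins, not measurements of DirectX or Mantle:

```python
# Toy model: fixed per-draw-call CPU overhead limits how many draw calls
# (and therefore how many distinct objects/materials) fit in one frame.
# The microsecond costs are illustrative, not real measurements.

FRAME_BUDGET_US = 16_667  # one 60 FPS frame, in microseconds

def max_draw_calls(overhead_us_per_call: float) -> int:
    """Draw calls that fit in one frame if the CPU pays a fixed
    validation/translation cost per call (ignoring all other work)."""
    return int(FRAME_BUDGET_US // overhead_us_per_call)

thick_driver = max_draw_calls(40.0)  # heavy per-call validation in the driver
thin_driver = max_draw_calls(5.0)    # app pre-builds command buffers itself
print(thick_driver, thin_driver)     # the thin path allows ~8x the draw calls
```

This is why a title built for a low-overhead API from the start can afford more distinct objects per frame, while a port that merely bolts the API on afterwards only gains some smoothness.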



Wait, how the hell is this holding back the industry? Studios don't even have to care about this for their games to work, and it's going to work with every 3D game that exists on any platform, whether it's Windows, Linux, or OS X. This is just an add-on that gamers can play with without even knowing it exists. Can you explain how this is going to hold back the industry, please?

 

He means that not making G-Sync compatible with AMD cards is holding the industry back from making this tech widely adopted and enjoyed by all (even console gamers). Instead only those with NVIDIA cards can enjoy it.

 

For now... 


If you program a game for Mantle, you can add more polygons and complexity to the game. Merely adding Mantle support as an afterthought will only make it run smoother and faster on certain cards. For a game developer to want to make a Mantle-only title, the API would need to be more widely adopted.

Ohh, you were talking about Mantle. I thought you meant G-Sync, not Mantle, since you said Nvidia. Sorry, I guess. :P


 


Who else thinks PhysX is a Gimmick/Joke?

PhysX is a very, very, very good technology; it's a shame that its core fundamentals are proprietary, otherwise I'd be all for it.



Ohh, you were talking about Mantle. I thought you meant G-Sync, not Mantle, since you said Nvidia. Sorry, I guess. :P

 

This is not the guy you quoted... Dude, you've confused yourself. The guy you quoted was Kamina. Luke replied to you thinking you were talking about Mantle.


This is not the guy you quoted... Dude, you've confused yourself.

LOL, didn't read the username. I got lost in my train of thought when I read his post, and it didn't cross my mind that it was another person.


 


I love to get my popcorn and read all the forum fanboys fighting it out in these types of threads.

Fite* (MI)


 


I still think that upcoming versions of standards like DVI, HDMI, etc. will simply add support for variable frame rates and thus render G-Sync and FreeSync pointless, as the industry will readily adopt those updated standards.


One more idea for Nvidia to scam enthusiasts. AMD tells it like it is. And G-Sync, hmm: Nvidia sends out monitors to test G-Sync, and it takes cycling through 3-4 video cards to get one to work. Seems legit... I'd love to pay for that. I don't know who has 3-4 current-gen cards just lying around at their disposal to work this buggy idea out. I'll grant that G-Sync, having true hardware in the product instead of software, will probably perform better. But why waste money on a new monitor when you could get this for free? It's sad, because the monitor will cost as much as an R9 290, and I'd rather put that extra money toward two video cards or a larger card. But those are my thoughts.


It's just Nvidia downplaying FreeSync, as it poses a serious threat to a bullet-point "feature" and a possible source of revenue.
The fact remains that FreeSync utilizes VBLANK, which is supported in DisplayPort 1.3 and can simply be added with a firmware update to any compliant monitor.
This includes a number of desktop monitors available today.
 

Finally, as a last minute stirring of the pot, I received an email from AMD's Koduri that indicated that there might be some monitors already on the market that could support variable refresh rate TODAY with just a firmware update.  This would be possible if a display was shipping with a controller that happened to coincidentally support variable refresh, perhaps in an early stage of development for the upcoming DP 1.3 standard.  We are trying to find a list of these monitors so we can talk with them and ask for these necessary changes. 

http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync
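As a rough sketch of what variable VBLANK buys you (illustrative numbers, not from the DisplayPort spec): with a fixed 60 Hz refresh and v-sync, a frame that finishes just after a refresh boundary waits for the next one, while an adaptive panel holds VBLANK and scans out as soon as the frame is ready.

```python
# Sketch: when does a finished frame actually reach the screen?
# Fixed 60 Hz refresh with v-sync vs. adaptive refresh (panel waits in VBLANK).
# All times in milliseconds; purely illustrative.
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms scanout interval at 60 Hz

def display_time_fixed(frame_done_ms: float) -> float:
    """With v-sync on a fixed-refresh panel, a finished frame waits
    for the next refresh boundary."""
    return math.ceil(frame_done_ms / REFRESH_MS) * REFRESH_MS

def display_time_adaptive(frame_done_ms: float, min_interval_ms: float = REFRESH_MS) -> float:
    """With adaptive refresh, the panel scans out as soon as the frame is
    ready, bounded below by the panel's maximum refresh rate."""
    return max(frame_done_ms, min_interval_ms)

# A 20 ms frame just misses the 16.67 ms boundary:
print(display_time_fixed(20.0))     # waits until ~33.3 ms (visible stutter)
print(display_time_adaptive(20.0))  # shown at 20.0 ms
```

The fixed-refresh path is what makes a frame rate just below 60 FPS feel so juddery with v-sync on, and it is the latency gap both G-Sync and FreeSync aim to close.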


"FreeSync" = a good name choice, by the way. And, as logic suggests (it is based on an already-existing standard, patented by ATI in 2002 and taken over by AMD in 2006), it will be open and freely usable by Intel (and later by Nvidia, once they "realize" it is the standard).

It was just a matter of time, and again, as logic suggests, a free/open standard will take over and become the industry standard. Since AMD do hold some patents, I hope they will make the right choice (in both user and company interests).

Good news for users, hopefully.


good discussion here starting @ 10:20

 

Very good discussion, thanks for the link. Well, looks like I've got someone new to watch; discussions about technology like this are one of the very few video types that really get me. ^_^



so...

G-Sync = Desktop elitist (Thanks to Nvidia for actually thinking of us)

Free-Sync = Laptop paddy (Because AMD wanted to help out all the newbies with their first Gaming computer)

 

i like it :D

Character artist in the Games industry.


Very good discussion, thanks for the link. Well, looks like I've got someone new to watch; discussions about technology like this are one of the very few video types that really get me. ^_^

 

You haven't heard of PC Per? The main guy, Ryan, has been on the WAN Show before.

 

They came up with the whole Frame Rating thing a while back, which forced AMD to update their drivers (it's a much more accurate way of capturing FPS in a game). That's when I discovered them, anyway. It's one of my favourite hardware news/review sites (& podcasts) now.

 

I've seen them linked often on here. 


A 24-inch 1440p Dell UltraSharp IPS with variable refresh, please!


You haven't heard of PC Per? The main guy, Ryan, has been on the WAN Show before.

 

They came up with the whole Frame Rating thing a while back, which forced AMD to update their drivers (it's a much more accurate way of capturing FPS in a game). That's when I discovered them, anyway. It's one of my favourite hardware news/review sites (& podcasts) now.

 

I've seen them linked often on here. 

 

You talking about the fan speeds for the 290/290X in press vs. retail samples? I think that issue first popped up in an article on Tom's Hardware, then other sites checked it out (some saw the same thing, some didn't).

 

But yeah, the PC Perspective podcast is great and one of my favorites to listen to.


It's just Nvidia downplaying FreeSync, as it poses a serious threat to a bullet-point "feature" and a possible source of revenue.

The fact remains that FreeSync utilizes VBLANK, which is supported in DisplayPort 1.3 and can simply be added with a firmware update to any compliant monitor.

This includes a number of desktop monitors available today.

 

http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

 

The real questions become: (a) when will this standard come out? (b) How many new monitor designs will adopt it? (c) How many older models can be made compatible via firmware, and how many monitors can even have their firmware updated? (d) How long will it be before it reaches homes near you? And (e) lots of people will still have to buy new monitors anyway, since some won't support the standard and others don't have DisplayPort.

 

The thing is, how long are people willing to wait to see what the 1.3 standard adopts and when it comes out? As PCPer noted, DisplayPort hasn't always been the quickest standard to be adopted, to say the least. Would you be willing to wait until, say, November/December 2014 before it becomes a reality? I know several people who've said that if it isn't out soon, they're selling their cards and buying a G-Sync monitor and an Nvidia card. That would actually hurt AMD more than anything. In all honesty, it was pretty dumb yet smart for AMD to do this at CES: it was really done to take a stab at G-Sync with no real substance. If AMD waits too long, they're going to shoot themselves in the foot and drive people to Nvidia.

