
AMD FreeSync - AMD's Free G-SYNC Alternative

Torand

It kinda worries me that so many people have heard this "suggestion" from AMD and automatically assumed the following:

 

1. It won't cost anything (nothing is free).

2. It will be better than or the same as G-Sync (how? Nvidia knew about VBLANK and chose to go with something else; why?).

3. G-Sync costs more to implement (again, how? Has someone got a link to explain this? And don't post links to sales sites; we know it costs more to buy. That is how marketing works and is not necessarily the result of implementation costs).

4. It will run on anything (even though the article clearly said that was an unknown).

 

Haven't we learnt anything? Wait until a fully working, market-ready example is out for testing before making silly comments about how this will force Nvidia to drop their prices (not that we know if they are even overcharging for it), or making silly decisions like buying a GPU.

 

I agree with 2 and 4, but not about the costs.

 

If newer monitors conform to this new VESA standard, then it should not cost any more for "FreeSync", since it might just be built into newer monitors. G-Sync will most likely cost more because of the R&D and hardware manufacturing. From what I've read, this variable VBLANK is already implemented in video cards and just needs the drivers installed, and once we have monitors that support it, there is not really any special hardware involved.
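To make the "just extend VBLANK" idea a bit more concrete, here's a rough C sketch of the pacing logic a driver could, in principle, apply: hold the panel in its blanking period until the next frame is ready, clamped to an assumed 40-144 Hz panel window. The function name and the limits are made up for illustration; this is not AMD's actual driver code.

/* Conceptual sketch of variable-VBLANK scanout pacing.
 * Hypothetical illustration only -- not AMD driver code. */
#include <stdio.h>

#define MIN_REFRESH_HZ 40.0   /* assumed panel lower bound */
#define MAX_REFRESH_HZ 144.0  /* assumed panel upper bound */

/* Given how long the GPU took to render a frame, decide how long the
 * display controller should keep the panel in vertical blanking, so the
 * next scanout starts when the frame is ready (within the panel's limits). */
static double scanout_interval_ms(double render_time_ms)
{
    double min_interval = 1000.0 / MAX_REFRESH_HZ; /* ~6.9 ms */
    double max_interval = 1000.0 / MIN_REFRESH_HZ; /* 25 ms   */

    if (render_time_ms < min_interval) return min_interval; /* cap at max Hz */
    if (render_time_ms > max_interval) return max_interval; /* panel must refresh anyway */
    return render_time_ms; /* otherwise refresh exactly when the frame is done */
}

int main(void)
{
    double frame_times[] = { 5.0, 10.0, 16.7, 22.0, 30.0 }; /* ms per frame */
    for (int i = 0; i < 5; i++) {
        double iv = scanout_interval_ms(frame_times[i]);
        printf("render %.1f ms -> refresh every %.1f ms (%.0f Hz)\n",
               frame_times[i], iv, 1000.0 / iv);
    }
    return 0;
}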

 

Final judgement should be reserved until after the products are available to the public, of course.

Old shit no one cares about but me.


Just thought I'd say you can change your temperature target in PowerTune to lower the maximum temperature to something you're comfortable with :P . I have an R9 290 and I love it. I purchased mine when it was still the same price as a 7970 GHz Edition (~£300) and I have zero regrets.
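For anyone curious what a temperature target actually does, it's essentially a feedback loop: while the GPU sits above the target it sheds clock speed (and/or ramps the fan) until it settles underneath. Here's a toy C model of that idea; the numbers and the crude thermal model are invented for illustration and are not AMD's actual PowerTune algorithm.

/* Toy model of a temperature-target throttle loop (PowerTune-style concept).
 * The numbers and the simple thermal model are made up for illustration. */
#include <stdio.h>

int main(void)
{
    double target_c  = 75.0;    /* user-set temperature target */
    double temp_c    = 94.0;    /* starting temperature under load */
    double clock_mhz = 1000.0;  /* current core clock */

    for (int step = 0; step < 10; step++) {
        if (temp_c > target_c)
            clock_mhz -= 25.0;          /* too hot: shed clock to reduce heat */
        else if (clock_mhz < 1000.0)
            clock_mhz += 25.0;          /* headroom available: clock back up */

        /* crude thermal model: temperature follows power, power follows clock */
        temp_c = 40.0 + clock_mhz * 0.05;

        printf("step %d: %.0f MHz, %.1f C\n", step, clock_mhz, temp_c);
    }
    return 0;
}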

Tech of Tomorrow had a video with a quote I heavily agreed with: you're not going to buy a GPU and then not use it to its fullest potential (at least the highest base clock). I also like Linus' ideology that if you're going to have a super gaming PC, there is no reason for it not to be quiet.

 

I still want to see more of these new adaptive Vsync technologies.

if you have to insist you think for yourself, i'm not going to believe you.


Tech of Tomorrow had a video with a quote I heavily agreed with: you're not going to buy a GPU and then not use it to its fullest potential (at least the highest base clock). I also like Linus' ideology that if you're going to have a super gaming PC, there is no reason for it not to be quiet.

 

I still want to see more of these new adaptive Vsync technologies.

I do agree with you to a degree :P but I probably could have made do with a 7950, hell, probably even a 7870/50; I got my R9 290 more for future-proofing than anything.

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


I agree with 2 and 4, but not about the costs.

 

If newer monitors conform to this new VESA standard, then it should not cost any more for "FreeSync", since it might just be built into newer monitors. G-Sync will most likely cost more because of the R&D and hardware manufacturing. From what I've read, this variable VBLANK is already implemented in video cards and just needs the drivers installed, and once we have monitors that support it, there is not really any special hardware involved.

 

Final judgement should be reserved until after the products are available to the public, of course.

 

If indeed it does not add to the manufacturing cost of a monitor, what's to say Asus won't charge an extra $50 to put a "FreeSync ready" sticker on the front? It wouldn't be the first time a company has charged a premium for an inferred hardware advantage as opposed to a real one.

 

And everyone is still assuming that it will only require a driver change to implement properly. If that were the case, then why did AMD not run their presentation on a desktop monitor? Why hasn't it already been released? And most importantly, if the technology is already in every GPU and monitor, why is it not already an option in games, like the disable-vsync option?

 

I suggest the reason it is not already a thing right now (even though people claim the hardware already supports it), and why Nvidia chose to go with a different, more comprehensive (read: expensive, R&D-heavy) hardware approach, is that it is either:

 

1. not currently supported by enough hardware

2. not the best implementation (see @LAwLz's post for a good explanation)

3. Nvidia have worked hard to promote themselves as a premium GPU company and don't want to taint that with a sync implementation that is not "premium"

 

 

Don't get me wrong, I am not anti-AMD; I am anti all the hype and assumptions made with very little actual information out there to support them.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


 "During an impromptu meeting in a hotel ballroom this morning, we got an eye-opening demo of a dynamic refresh rate capability that's been a part of Radeon GPUs for several generations. AMD thinks this feature can be combined with triple buffering to deliver G-Sync-like animation smoothness without the cost associated with specialized display hardware."

 

 

http://www.pcper.com/news/General-Tech/AMDs-under-reported-dynamic-refresh-rates


In AMD's assessment, it's possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and triple buffering. The exec initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware. After all, triple-buffering can be implemented by a game developer in software or even enabled via a software switch in a graphics driver control panel. He said AMD used to have an option to force the use of triple buffering in its driver control panel, in fact, and would be willing to consider bringing it back.

 

The exec's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit. That would explain things. I haven't yet had time to confirm this detail with Nvidia or to quiz them about whether G-Sync essentially does triple-buffering in the module. Nvidia has so far been deliberately vague about certain specifics of how G-Sync works, so we'll need to pry a little in order to better understand the situation.

 

 

http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

 

Interesting. I'm not sure whether triple buffering is a good or bad thing in terms of smoothness/latency. Will have to wait for tests.  
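For what it's worth, the smoothness/latency trade-off comes straight from how the three buffers rotate: the GPU always has a free buffer to render into (so it never stalls waiting for the display), but the frame being scanned out can be one behind the newest completed frame. A minimal C sketch of the rotation, purely illustrative and not how any actual driver's swap chain is written:

/* Minimal sketch of triple-buffered presentation.
 * Three buffers rotate roles: one being scanned out, one holding the
 * newest completed frame, one free for the GPU to render into.
 * Simplified to one rendered frame per vblank for brevity. */
#include <stdio.h>

int main(void)
{
    int front   = 0;  /* buffer currently being scanned out          */
    int pending = 1;  /* newest completed frame, waiting to be shown */
    int back    = 2;  /* buffer the GPU is rendering into            */
    int frame   = 0;

    for (int vblank = 0; vblank < 6; vblank++) {
        /* GPU finishes a frame into 'back'; it becomes the new pending
         * frame and the old pending buffer is recycled for rendering.
         * The GPU never has to wait for the display to release 'front'. */
        printf("GPU rendered frame %d into buffer %d\n", frame++, back);
        int tmp = pending;
        pending = back;
        back    = tmp;

        /* At vblank, the display flips to the newest completed frame. */
        tmp     = front;
        front   = pending;
        pending = tmp;
        printf("vblank %d: scanning out buffer %d\n", vblank, front);
    }
    return 0;
}

Whether that extra buffer's worth of latency is noticeable is exactly what the tests will have to show.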


If indeed it does not add to the manufacturing cost of a monitor, what's to say Asus won't charge an extra $50 to put a "FreeSync ready" sticker on the front? It wouldn't be the first time a company has charged a premium for an inferred hardware advantage as opposed to a real one.

 

And everyone is still assuming that it will only require a driver change to implement properly. If that were the case, then why did AMD not run their presentation on a desktop monitor? Why hasn't it already been released? And most importantly, if the technology is already in every GPU and monitor, why is it not already an option in games, like the disable-vsync option?

 

I suggest the reason it is not already a thing right now (even though people claim the hardware already supports it), and why Nvidia chose to go with a different, more comprehensive (read: expensive, R&D-heavy) hardware approach, is that it is either:

 

1. not currently supported by enough hardware

2. not the best implementation (see @LAwLz's post for a good explanation)

3. Nvidia have worked hard to promote themselves as a premium GPU company and don't want to taint that with a sync implementation that is not "premium"

 

 

Don't get me wrong, I am not anti-AMD; I am anti all the hype and assumptions made with very little actual information out there to support them.

 

In general AMD is more budget-oriented with their products, but usually not the best. Considering this, I'm assuming it will be cheaper. That doesn't mean it will be, but that's what I think. I also think it's not an option now because variable VBLANK isn't supported by monitors nowadays, but it may be adopted in the future. That also makes me wonder how long variable VBLANK has been a VESA standard; if it's relatively new, then obviously not very much hardware will support it.

 

Also, it's pretty much guaranteed not to be as good as Nvidia's. There is no way they didn't know about variable VBLANK, and there has to be a reason why they made G-Sync. They probably found a problem with variable VBLANK and it doesn't quite do what they want it to.

 

 

Also, did AMD come up with the term "FreeSync"? Because if they did, I'm hardcore facepalming right now. That is the stupidest name I've ever heard.

Old shit no one cares about but me.


Aaaaaaaannnddd..... Nvidia is thinking about making G-Sync free.

Main rig on profile

VAULT - File Server


Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C


Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)


Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


 

 

 

Also, did AMD come up with the term "FreeSync"? Because if they did, I'm hardcore facepalming right now. That is the stupidest name I've ever heard.

Don't know, but I would certainly hope that if AMD push ahead and bring the concept to market, they choose a half-decent name for it.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


In general AMD is more budget-oriented with their products, but usually not the best. Considering this, I'm assuming it will be cheaper. That doesn't mean it will be, but that's what I think. I also think it's not an option now because variable VBLANK isn't supported by monitors nowadays, but it may be adopted in the future. That also makes me wonder how long variable VBLANK has been a VESA standard; if it's relatively new, then obviously not very much hardware will support it.

 

Also, it's pretty much guaranteed not to be as good as Nvidia's. There is no way they didn't know about variable VBLANK, and there has to be a reason why they made G-Sync. They probably found a problem with variable VBLANK and it doesn't quite do what they want it to.

 

 

Also, did AMD come up with the term "FreeSync"? Because if they did, I'm hardcore facepalming right now. That is the stupidest name I've ever heard.

The VBLANK standard was made in 2002 and updated in 2003; it hasn't changed since then.

(1) high frame rate (2) ultra graphics settings (3) cheap...>> choose only two<<...

 

if it's never been done then i'm probably tryna do it. (((((((Bass so low it HERTZ)))))))


AMD has been doing a lot of great things lately in the GPU business. I hope they can try to get back into the enthusiast CPU market as well :)

CPU Overclocking Database <------- Over 275 submissions, and over 40,000 views!                         

GPU Overclocking Database                                                    


The VBLANK standard was made in 2002 and updated in 2003; it hasn't changed since then.

 

http://en.wikipedia.org/wiki/Coordinated_Video_Timings I'm not sure if that includes variable refresh rates.

 

Also using the term "VBLANK" is sort of incorrect when speaking of VESA standards, since VBLANK is just something that happens on pretty much all displays. http://en.wikipedia.org/wiki/Vertical_blanking_interval
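To put rough numbers on what stretching the blanking interval does: refresh rate is just the pixel clock divided by the total pixels per frame (active plus blanking), so adding blanking lines directly stretches the refresh interval. A quick back-of-the-envelope in C, using common 1080p60-style timings (treat the exact figures as illustrative, not a claim about any particular CVT mode):

/* Rough illustration of how extra vertical blanking lines lower the
 * effective refresh rate. The timing numbers are ballpark 1080p60 values. */
#include <stdio.h>

int main(void)
{
    double pixel_clock_hz = 148.5e6;  /* typical 1080p60 pixel clock          */
    int h_total = 2200;               /* active + horizontal blanking pixels  */
    int v_active = 1080;
    int v_blank_nominal = 45;         /* nominal vertical blanking lines      */

    for (int extra = 0; extra <= 600; extra += 200) {
        int v_total = v_active + v_blank_nominal + extra;
        double refresh_hz = pixel_clock_hz / (h_total * (double)v_total);
        printf("%4d blanking lines -> %.1f Hz\n",
               v_blank_nominal + extra, refresh_hz);
    }
    return 0;
}

That's the same lever a variable-refresh scheme would pull, just adjusted per frame instead of as a fixed mode.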

Old shit no one cares about but me.


But.. but.. I just bought a 780 Ti like 3 hrs ago in high hopes for G-Sync.... well, fml! Well, at least I got PhysX, right?.. right?.. Oh, okay then :(

no regerts.

4770k @4.4 / 16GB @2400 / Plextor MP5X 128GB / MSI Mpower Z87 / MSI GTX 1070 Armor OC / AX860 / XSPC RX240 & EX240 / Koolance 380i / CM 690 II / Qnix 1440p @96Hz / Benq XL2420G

Current Status: Mourning the loss of my 780 ti 


Is it only for laptops?

CPU: FX8350  Mobo: Gigabyte GA-990FXA-UD3  GPU: EVGA GTX770 2GB  RAM: 8GB Crucial Ballistix Tactical Tracer  STORAGE: Crucial M500 240GB / Seagate Barracuda 1TB  PSU: Corsair RM750  CASE: Xigmatek Talon  COOLING: Corsair H100i  KEYBOARD: Quickfire TK  KEYPAD: Razer Nostromo  MOUSE: Razer Naga 2012  HEADSET: Tritton Pro+


no regerts.

Haven't received mine yet coz of shipping.  :(

In the grim darkness of the far future, there is only a GTX 1080, just a single 1080, where my glorious PC once stood....

For that is all I need, For the Emperor of Man, Jen-Hsun Huang, protects. We march for Nvidia, and we shall know no fear!


Too bad I have an Nvidia GPU..

If AMD wants to win the market, they'll probably make it open to all platforms.

Finally my Santa hat doesn't look out of place


in case some people missed it ^_^

 

here is the AMD CES 2014 Press Conference  :lol:

 


If AMD wants to win the market, they'll probably make it open to all platforms.

If it's free, I don't see how it would help AMD win the market; Nvidia people would just use FreeSync with their existing cards. On the other hand, it would hurt G-Sync sales, which would cut into Nvidia's profit.


If it's free, I don't see how it would help AMD win the market; Nvidia people would just use FreeSync with their existing cards. On the other hand, it would hurt G-Sync sales, which would cut into Nvidia's profit.

That is assuming Nvidia GPUs are set up to use what FreeSync relies on to actually work. I'm assuming they aren't able to, or they would have done it already.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


AMD has been doing a lot of great things lately in the GPU business. I hope they can try to get back into the enthusiast CPU market as well :)

Not likely, they are committed to HSA for the near future. Though once software support comes through, HSA will likely be a better performer than their dedicated CPUs in all but the most single-threaded software out there.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage , Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking


That is assuming Nvidia GPUs are set up to use what FreeSync relies on to actually work. I'm assuming they aren't able to, or they would have done it already.

 

Or that it pans out and actually works. I'm really not shocked that people are jumping for joy when all we know is that it was only demo'd on laptops/tablets, whatever they were, and that AMD has said "AMD isn't ready to productize this nor does it have a public go to market strategy", i.e. you ain't seeing it for a VERY, VERY long time. My guess: maybe a year or so, if not longer.


This topic is now closed to further replies.

