
AMD FreeSync - AMD's Free G-SYNC Alternative

Torand

I think it is "free" in that they won't charge a royalty fee for implementing it.

Or maybe even 'you're free to use any gpu you want as long as it supports the standard'

Clearly most people in this thread have just had a crisis in their underpantaloons at the word 'free', imagining a driver update in a week's time bringing them 'freesync'


Erm, since when does the chip not get enough juice? Compared to Greenlight, you can get much more power through to the GPU than on a green team card, currently allowing up to 150% power through the PCIe socket, and overvolting as well (on most cards). Take my 290X in my sig, for example. I will concede that the reference cooler is terrible though.

Go watch the 290(X) vs 780 sploosh showdown from LTT and you'll see what I mean 

i5 4670K | ASUS Z87 Gryphon | EVGA GTX 780 Classified | Kingston HyperX black 16GB | Kingston HyperX 3K 120GB SSD | Seagate Barracuda 3TB - RAID 1 | Silverstone Strider Plus 750W 80Plus Silver | CoolerMaster Hyper 212X | Fractal Design Define Mini
 


Or maybe even 'you're free to use any gpu you want as long as it supports the standard'

Clearly most people in this thread have just had a crisis in their underpantaloons at the word 'free', imagining a driver update in a week's time bringing them 'freesync'

 

As it is a standard already present in most televisions, many people who are using a TV as a monitor will get it "as a driver update".

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Knew it was coming, but still good to get confirmation.


As it is a standard already present in most televisions, many people who are using a TV as a monitor will get it "as a driver update".

If it was implemented on that TV.

Why not demonstrate their concept with supporting TVs instead of tablets? ;)


VBLANK is about power saving and other such things.

No, it's not. That's one thing variable VBLANK intervals can be used for, but that's not the main reason it was developed.

 

 

This section is about FreeSync (I cringe every time I write that) in general, not directed at Vitalius:

It is worth noting that there are some major differences between this VESA standard and G-sync.

 

1) VESA-2003-9 was originally made for CRTs. LCDs work completely differently, so chances are AMD is just using the same kind of signal to tell the monitor to change behavior. If that is the case, then I can almost guarantee you that your current monitor will not support it, because it is not what the standard originally specified. The command will look the same, but what the monitor does with the command is not standardized, so it's completely up to the monitor to decide what to do with it. For example, it might decide to say "no signal" if you send the command.

 

2) We have very little info about how this is done, but my guess is that it is similar to what Intel has been doing with DRRS. What DRRS does is change between fixed refresh rates. So if it detects that you can only push 38 FPS, it will drop the refresh rate of the monitor down to 40Hz instead of 60Hz (see the sketch after this list). If that's what is going on, then it's not nearly as sophisticated as G-Sync, which would match the 38 FPS exactly. That would explain why AMD had a demo with fixed frame rates instead of variable ones like Nvidia had. That is mostly speculation though, since nothing has been confirmed yet.

 

3) The G-Sync module has a pretty big frame buffer it uses for frame duplication if frame rates drop below a certain point. The module also adds support for backlight strobing, which will reduce motion blur.

 

4) AMD doesn't have a good track record here: either the cool stuff they promise never gets released, or the products are pretty terrible once they are available. Just take MST hubs as an example of the former, and Enduro as an example of the latter.

 

5) AMD has no plans to make this into a real product. It is just a demo of what they could do. Quote from Anand:

AMD isn’t ready to productize this nor does it have a public go to market strategy
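To make point 2 concrete, here is a minimal Python sketch of DRRS-style rate stepping versus G-Sync-style exact matching. Everything in it (the rate list, the panel range, the function names) is an assumption for illustration; it's my reading of the two approaches, not anything AMD, Intel, or Nvidia has published:

```python
# Toy comparison of DRRS-style stepping vs. G-Sync-style exact matching.
# The supported rates and panel range are assumptions for illustration.
SUPPORTED_RATES_HZ = [40, 50, 60]

def drrs_pick_rate(fps: float) -> int:
    """Step to the lowest fixed refresh rate that still covers the frame rate."""
    for rate in SUPPORTED_RATES_HZ:
        if rate >= fps:
            return rate
    return SUPPORTED_RATES_HZ[-1]

def gsync_pick_rate(fps: float) -> float:
    """Match the refresh rate to the frame rate exactly, within panel limits."""
    return min(max(fps, 30.0), 144.0)  # assumed 30-144Hz panel range

print(drrs_pick_rate(38))   # 40   (nearest fixed step above 38 FPS)
print(gsync_pick_rate(38))  # 38.0 (exact match)
```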

 

We need a lot more info about this "Freesync" (God damn AMD, that name is awful, please change it) thing before we can say anything concrete about it. Hopefully it will be great, but I feel that a lot of people are just excited because it is AMD announcing it, and this place is filled with AMD fanboys. It's best to wait for the product to come out and be tested before praising it like it's the second coming of Jesus.

 

 

Edit: Found a post by "purehg" on [H]ardForum which seems to make sense (but again, nothing is confirmed):

FreeSync uses variable VBI, meaning the driver needs to set up the proper VBI for the next frame, which requires the driver to predict the future. If the app isn't running at a constant FPS, then FreeSync will fail when the FPS changes, and you will still see stuttering. Also, you need to enable VSYNC, therefore you still have the lag issue that GSync solves by working without VSYNC. Sure, you will have a better experience, but not as good as with GSync. With FreeSync you will have software overhead, and if you predict conservatively you lose FPS; if you predict aggressively you might end up with more stuttering than with plain VSYNC.

GSync solves the problem by holding the VBI until the next frame is drawn, therefore there is no speculation, and it works under all circumstances. You simply can't do that in software, because software runs on the computer, not the monitor. You have to have a monitor smart enough to wait for the next GPU command to do the drawing, and that's why NVIDIA has to do it with a separate board. There is no VESA standard for that.

GSync is "GPU drives VBI," whereas FreeSync is "driver speculates VBI." The outcome can be close, but one is superior than the other.


Wow AMD is good.

 


CPU:Ryzen 9 5900X GPU: Asus GTX 1080ti Strix MB: Asus Crosshair Viii Hero RAM: G.Skill Trident Neo CPU Cooler: Corsair H110


We need a lot more info about this "Freesync" (God damn AMD, that name is awful, please change it) thing before we can say anything concrete about it. Hopefully it will be great, but I feel that a lot of people are just excited because it is AMD announcing it, and this place is filled with AMD fanboys. It's best to wait for the product to come out and be tested before praising it like it's the second coming of Jesus.

 

I think part of the problem is that we know what G-Sync does, and people have that in their head. Now they see something similar in this FreeSync and automatically assume it's the same. While I'll defer to you on knowledge because, honestly, I don't know shit about monitors, it feels like AMD is stretching on this one. I mean, it's not one thing to me, it's several things: the way they demo'd it, the way they presented the information, and stuff like that. It's this feeling inside. I hope I am wrong, because it'll be good for AMD and Nvidia to battle it out (though it'll hurt us a little more short term, price-wise), but hopefully in the long run it will advance more tech.


G SYNC is WAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAY better

All your posts lately have been you hating on AMD  :D Butthurt much?


The scales were even before, but they just tipped slightly into AMD's favour, I think.

Even if G-Sync is better than FreeSync, there's Mantle. And AFAIK, having a higher frame rate slightly offsets the need for a variable refresh rate anyway.

Either way, gaming is getting even more interesting. I just hope nVidia doesn't start working on their own graphics API in retaliation :/

Tea, Metal, and poorly written code.


Man, what a frustrating week not to have a WAN show, haha. In the meantime, I hope we get some more information/discussion from people who seem to have a good bit of knowledge about the subject, like the great post from @LAwLz.

 

The one thing we know for sure now is that this was a crafty marketing move by AMD to kill some of the enthusiasm just as a bunch of G-Sync monitors were getting announced. But on a personal level, I suppose I should thank AMD. It's way too cold here, and I was having trouble getting fully awake this morning. I reached for my tablet and pulled it into bed, checked the LTT news sub-forum, saw this topic title and BAM - just like that I was fully awake. :lol:


Go watch the 290(X) vs 780 sploosh showdown from LTT and you'll see what I mean 

Saw that video; it's all silicon lottery. If you honestly think that is an example of how all 290(X) cards perform under overclocking, you don't know much (no malice intended). But in terms of power delivery, you can do a lot more with an AMD GPU than an Nvidia one; exceptions would include custom BIOSes like the Lightning's LN2 BIOS. I'm also not saying Nvidia cards don't OC, they do, and well, but your statement about power delivery is false.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


No, it's not. That's one thing variable VBLANK intervals can be used for, but that's not the main purpose of why it was developed.

I'm technically correct given the context in which I said it. What I said was in relation to FreeSync and AMD. AMD has been seeking to use VBLANK to reduce power usage in laptops.

I wasn't talking about why it was developed, although I can understand how you would take it that way based on how I said it. I was talking about what it is being used for in this instance. I felt I needed to mention that it was a VESA standard because I thought the person I was replying to believed it was purely an AMD thing (i.e. that they made VBLANK, even though they didn't).

It being used for power savings and it being a VESA standard were meant as two separate pieces of information.

From my understanding, VBLANK is just a fancy way of making the screen refresh only when a new frame is ready. That was either necessary in CRTs and analog TVs for various reasons (unrelated to frame tearing) or just a by-product of whatever VBLANK actually did (the whole CRT magnetic coils and beams and stuff goes right over my head).

As such, the refresh rate is variable, and screen tearing is completely eliminated, because screen tearing is caused by the screen refreshing with pieces of different frames showing rather than all of one of them (effectively).

That's how they explain it in the article, at least. Whether that's how AMD does it, I have no idea, but it makes sense in theory.
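Here's a toy sketch of that tearing explanation (Python; my own illustration of the idea, not how any real scanout hardware is implemented):

```python
# Why tearing happens: the screen reads the front buffer row by row, and with
# a fixed refresh the buffer can flip to a new frame mid-read.

def scanout(rows, frame_at_row):
    """Read the front buffer top to bottom; frame_at_row(r) says which frame
    the buffer holds when row r is read."""
    return [frame_at_row(r) for r in range(rows)]

# Fixed refresh: the buffer flips from frame 1 to frame 2 mid-scanout, so the
# top of the screen shows one frame and the bottom another -> a visible tear.
print(scanout(10, lambda r: 1 if r < 4 else 2))  # [1, 1, 1, 1, 2, 2, ...]

# Variable VBLANK: the refresh waits until frame 2 is fully ready, so every
# row comes from the same frame -> no tear.
print(scanout(10, lambda r: 2))                  # [2, 2, 2, 2, ...]
```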

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here, and look under the spoiler labeled such. A brief history of Unix and its relation to OS X by Builder.

 

 


So if we go out and buy a TV and use it as a monitor, it should work out of the box?

Should, as TVs are most likely to use the VESA standards, but it won't hurt to double-check for Coordinated Video Timings: http://en.wikipedia.org/wiki/VESA

(1) high frame rate (2) ultra graphics settings (3) cheap...>> choose only two<<...

 

if it's never been done then i'm probably tryna do it. (((((((Bass so low it HERTZ)))))))


Should, as TVs are most likely to use the VESA standards, but it won't hurt to double-check for Coordinated Video Timings: http://en.wikipedia.org/wiki/VESA

VESA is just an organization that makes standards. You don't have to implement them if you don't want to. I wouldn't be surprised if next to no monitors or TVs support it.


I think... you are missing something here.

VBLANK is about power saving and other such things. It's a VESA standard. 

AMD using VBLANK as a makeshift VSync is an AMD technology (in how they use it). They are making use of a feature that is there on monitors for something no one has used it for before (AFAIK).

So no. What AMD is doing is unique and theirs. VBLANK isn't theirs, but the way they are using it is theirs.

 

I'm not missing anything; I know what VBLANK is. It's just that it's a way to control the refresh rate on the monitor for power-saving reasons, but in the end, there is a controllable, variable refresh rate.

 

I still think Nvidia would have known to look into something like this before making new hardware. Using an existing standard is a lot cheaper than designing and manufacturing new hardware. It seems kind of stupid that Nvidia didn't do this. 

Old shit no one cares about but me.


VESA is just an organization that makes standards. You don't have to implement them if you don't want to. I wouldn't be surprised if next to no monitors or TVs support it.

Well, the mounting system is quite widespread among TVs, so that led me to believe TVs support more VESA standards than monitors do. Either way, I hope it's as simple as a small mod to existing displays from manufacturers to support VBLANK, so that it would catch on quickly and at no extra cost. Also, AMD should give marketing a shot.

(1) high frame rate (2) ultra graphics settings (3) cheap...>> choose only two<<...

 

if it's never been done then i'm probably tryna do it. (((((((Bass so low it HERTZ)))))))


Is it possible that this would come to Nvidia GPUs too? Both my desktop and laptop have Nvidia.

NZXT Phantom|FX-8320 @4.4GHz|Gigabyte 970A-UD3P|240GB SSD|2x 500GB HDD|16GB RAM|2x AMD MSI R9 270|2x 1080p IPS|Win 10

Dell Precision M4500 - Dell Latitude E4310 - HTC One M8

$200 Volvo 245

 


Is it possible that this would come to Nvidia GPUs too? Both my desktop and laptop have Nvidia.

 

It's a possibility, but I think Nvidia is going to make you pay for their hardware, since they already developed it.

Old shit no one cares about but me.


Seeing as the standard was made in 2002 and most TVs use the VESA standard, there should be no problems using this tech on a TV. Unfortunately, not a lot of monitors do the whole VESA thing... Currently I'm using a 22'' TV, and its refresh rate does seem to adjust to whatever I'm doing, but not continuously, so I would still get tearing with an FPS spike (Half-Life 2 runs at 74 FPS on my PC, and when I go into the monitor settings the refresh rate reads 74Hz).

The VESA standard is only the 60Hz/16ms one.

Variable VBLANK, which you need for this, is NOT a standard.

So this will not work on a normal monitor/TV!
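For reference, the 16ms figure is just the frame interval at a fixed 60Hz refresh; quick arithmetic (nothing vendor-specific):

```python
# Frame interval for a fixed refresh rate: interval_ms = 1000 / rate_hz.
def frame_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(frame_interval_ms(60))  # ~16.7 ms, the "60Hz/16ms" figure above
print(frame_interval_ms(74))  # ~13.5 ms, for the 74Hz case quoted above
```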

 

RTX2070OC 


lolol rip-off 

AMD Ryzen 7950X3D [x2 360 Rad EKWB] | Asus Extreme x670E | RTX TUF 4090 OC@3GHz [EKWB Block] | Corsair HX1500i | G.Skill Neo CL30 6000MHz (2x32GB) | MP700 2TB Gen5 + 980/950 Pro 1TB M.2 | x6 85PRO 512GB | NAS 4x 18TB Seagate Exo Raid 10+Backblaze | Lian-Li o11D XL | Main Screen: Samsung OLED G9 | AUX: LG IPS7 27" (x2) LG CX 55" G-Sync | Copyright©1996-2024 Teletha All rights reserved. ®


Now we've just got to see how it competes with G-Sync as far as performance goes.

The year is 20XX. Everyone plays Fox at TAS levels of perfection. Because of this, the winner of a match depends solely on port priority. The RPS metagame has evolved to ridiculous levels due to it being the only remaining factor to decide matches.

Only Abate, Axe, and Wobbles can save us.


It kinda worries me that so many people have heard this "suggestion" from AMD and automatically assumed the following:

 

1. It won't cost anything (nothing is free).

2. It will be better than or the same as G-Sync (how? Nvidia knew about VBLANK and chose to go with something else; why?).

3. G-Sync costs more to implement (again, how? Has someone got a link explaining this? And don't post links to sales sites; we know it costs more to buy, but that is how marketing works and not necessarily the result of implementation costs).

4. It will run on anything (even though the article clearly said that was an unknown).

 

Haven't we learnt anything? Wait until a fully working, market-ready example is out for testing before making silly comments about how this will force Nvidia to drop their prices (not that we know if they are even overcharging for it), or making silly decisions like buying a GPU.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


I'm quite surprised at the hate Nvidia gets for premium tech and the praise AMD gets for announcements of announcements. Then it deteriorates into some random e-peen contest between two groups of fanboys. Is it simply because Nvidia aims for the premium while AMD usually has more budget-oriented options, and people get envious that they can't justify the premium for themselves? <.< C'mon, it's an enthusiasts' forum!

 

We need a lot more info about this "Freesync" (God damn AMD, that name is awful, please change it) thing before we can say anything concrete about it. Hopefully it will be great, but I feel that a lot of people are just excited because it is AMD announcing it, and this place is filled with AMD fanboys. It's best to wait for the product to come out and be tested before praising it like it's the second coming of Jesus.

Seems very reasonable.

