
HDMI supports VRR, not FreeSync

GnomeKing

From something else I posted, it basically comes down to this: I can't find anything confirming what everyone seems to agree on regarding the Xbox, FreeSync, and Intel. I see the terms being used interchangeably, but they are vastly different.

 

VRR is not FreeSync.

VRR is a VESA standard.

FreeSync is AMD's implementation and requires AMD tech, hence why NVIDIA cannot support it.

 

Copied from another forum:

 

Show me one slide, one article, or anything that shows HDMI is going to support FreeSync and not VRR (they are different technologies, and yes, prove they are different as well if you can). This is like the "Intel will support FreeSync" story that never happened, and that was years ago. Where is that "end of G-Sync, AMD and Intel working together" news? Intel never even said they would support FreeSync, just VRR. If anyone can show me evidence that HDMI/DisplayPort VRR is FreeSync rather than just generic VRR, then I will agree. Until then, they are implementing different things.

Maybe the Xbox supporting it natively will push HDMI toward FreeSync instead of just VRR as a standard. But as of this moment there is no news of HDMI supporting FreeSync, just VRR (which is a different thing).


Can you be more specific? I'm not seeing anywhere that G-Sync is VRR or FreeSync, nor that NVIDIA can use FreeSync tech.

 

FreeSync is royalty-free, free to use, and has no performance penalty.

 

G-Sync is not royalty-free, since it is hardware-based, though it is also performance-free.

 

Royalty-free doesn't mean anyone can use it. It is royalty-free for their hardware partners (display partners), not for anyone else to use.

(Throwing my opinion in here: it is because they can't! FreeSync is AMD tech and can only be used by AMD.) Hence why it requires certain AMD cards, and not just any random display output with a driver update.


Yes, both FreeSync 2 and G-Sync 2 support HDR, and that is fantastic. It needs to happen, and I'd like to know more about the specs of G-Sync 2 and FreeSync 2 (nits, bits per channel, backlight, etc.).

 

 

But I'm still of the mind that they are both exclusive tech, with AMD just saying "open monitor support (because it is free), AMD tech required", while NVIDIA requires both monitor tech and NVIDIA hardware. AMD says only the monitor hardware is needed, so somehow it is "100% open", yet specific tech from AMD is required. AMD's own website says that.

 

AMD says they use standards in all their slides. But it is not a standard if you need certain tech (AMD's), not just the hardware in your monitor.


Just to be clear, literally anything requires some tech or hardware to be built. That isn't what makes something proprietary or not. Just sending out video over a DisplayPort cable doesn't happen by itself, you need to design hardware that will do it. The DisplayPort standard just describes what your hardware will need to be able to do, but AMD and NVIDIA both have to design hardware that will do those things. If we suppose that AMD had implemented DisplayPort and NVIDIA hasn't, yes there is "AMD tech" involved with sending out DisplayPort signals on AMD cards. That doesn't mean NVIDIA can't implement DisplayPort too. The standard is available to them, they can build hardware that implements that standard too. If it was a proprietary standard then they would have no idea where to start.

 

VESA Adaptive-Sync is part of the DisplayPort standard, and describes a standardized method of communicating variable refresh rate video. AMD still had to make hardware capable of sending the signals described by this standard, so there is "AMD tech" involved there, and they had to update their drivers to use it. NVIDIA will have to do the same. That doesn't make it a proprietary standard that NVIDIA "can't" implement. The information on how to communicate with existing FreeSync monitors is available to them, they can implement it if they want to.
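To make that concrete with a sketch (not AMD's actual driver code): an Adaptive-Sync-capable monitor advertises its supported refresh range in its EDID, and the GPU driver reads that range before it will enable variable refresh. A minimal illustration of reading the range from the EDID Display Range Limits descriptor (tag 0xFD); the sample EDID bytes below are fabricated for the example:

```python
# Illustrative sketch: read the vertical refresh range from an EDID's
# Display Range Limits descriptor (tag 0xFD). Real drivers do much more
# (DisplayID blocks, vendor extensions); this only shows the basic idea.

def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the first range-limits descriptor, or None."""
    assert len(edid) >= 128, "base EDID block is 128 bytes"
    for off in (54, 72, 90, 108):          # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # Display descriptors (unlike detailed timings) start with 0x00 0x00;
        # byte 3 is the descriptor tag, 0xFD = display range limits.
        if d[0] == 0 and d[1] == 0 and d[3] == 0xFD:
            return d[5], d[6]              # min/max vertical rate in Hz
    return None

# A fabricated descriptor advertising a 48-144 Hz range, padded to 128 bytes.
fake_descriptor = bytes([0, 0, 0, 0xFD, 0, 48, 144, 30, 160, 60]) + bytes(8)
fake_edid = bytes(54) + fake_descriptor + bytes(128 - 54 - 18)
print(refresh_range(fake_edid))  # (48, 144)
```

A driver would then only enable variable refresh if the advertised window is wide enough to be useful, which is exactly the kind of "AMD tech" involved: reading and acting on the standard, not the standard itself being proprietary.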

 

HDMI 2.1 has added its own protocol for variable refresh, which can be used by AMD and NVIDIA if they want to. Prior to this, AMD had already created their own custom solution for getting variable refresh rates over HDMI, but the details of how this works are unknown to me.
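Conceptually, all of these variable refresh schemes do the same thing: the source extends vertical blanking until the next frame is ready, so the panel's effective refresh rate tracks the frame rate within the window the display supports. A toy illustration (the 48-144 Hz window is an assumed example, not a number from any spec):

```python
# Toy model of variable refresh: the effective refresh rate follows the
# frame rate, clamped to the panel's supported VRR window.
# The 48-144 Hz window is illustrative, not from any specification.

def effective_refresh(frame_time_ms, min_hz=48, max_hz=144):
    """Clamp the instantaneous refresh rate to the panel's VRR window."""
    rate = 1000.0 / frame_time_ms
    return max(min_hz, min(max_hz, rate))

for ft in (5.0, 10.0, 25.0):   # 200 fps, 100 fps, 40 fps frame times
    print(f"{ft} ms frame -> panel refreshes at {effective_refresh(ft):.0f} Hz")
```

Outside the window (here, below 48 fps or above 144 fps) the display falls back to a fixed rate, which is why the advertised range matters so much in practice.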


20 hours ago, Glenwing said:

Just to be clear, literally anything requires some tech or hardware to be built. That isn't what makes something proprietary or not. Just sending out video over a DisplayPort cable doesn't happen by itself, you need to design hardware that will do it. The DisplayPort standard just describes what your hardware will need to be able to do, but AMD and NVIDIA both have to design hardware that will do those things. If we suppose that AMD had implemented DisplayPort and NVIDIA hasn't, yes there is "AMD tech" involved with sending out DisplayPort signals on AMD cards. That doesn't mean NVIDIA can't implement DisplayPort too. The standard is available to them, they can build hardware that implements that standard too. If it was a proprietary standard then they would have no idea where to start.

 

VESA Adaptive-Sync is part of the DisplayPort standard, and describes a standardized method of communicating variable refresh rate video. AMD still had to make hardware capable of sending the signals described by this standard, so there is "AMD tech" involved there, and they had to update their drivers to use it. NVIDIA will have to do the same. That doesn't make it a proprietary standard that NVIDIA "can't" implement. The information on how to communicate with existing FreeSync monitors is available to them, they can implement it if they want to.

 

HDMI 2.1 has added its own protocol for variable refresh, which can be used by AMD and NVIDIA if they want to. Prior to this, AMD had already created their own custom solution for getting variable refresh rates over HDMI, but the details of how this works are unknown to me.

Just to fill you in, the HDMI standard allows some flexibility for adding proprietary functionality. If a manufacturer, say Samsung, had a 7.1-channel HDMI receiver hooked up to one of their flagship TVs and wanted to create a proprietary information path between the two over HDMI, the specification allows for that. Earlier HDMI versions had this to some extent, with CEC, power-state mirroring, and that kind of thing, but from what I understand later versions take it even further. My understanding is that AMD simply took advantage of this ability to add proprietary functionality into HDMI and leveraged it to implement variable refresh rate. Of course, the manufacturer has to build in functionality on their side to be compatible with AMD's proprietary messaging.

So the delivery of custom data packets/messages over HDMI is an official feature; interpreting the content of those messages is not. This approach lets manufacturers build their own scalers or chips that can interpret the messages and implement FreeSync. NVIDIA's G-Sync instead requires the purchase of a G-Sync module, which is just an FPGA that acts as a middleman to an existing scaler, which is why it carries such a high premium.
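As a hedged sketch of what those "custom data packets" look like at the framing level: HDMI sources can send a Vendor-Specific InfoFrame tagged with the vendor's IEEE OUI, and sinks simply ignore OUIs they don't recognize. The OUI and payload below are made-up placeholders (AMD's actual FreeSync-over-HDMI message format isn't public); only the type/version/length/checksum framing follows the standard InfoFrame rules:

```python
# Sketch of building an HDMI Vendor-Specific InfoFrame. The OUI and
# payload below are made-up placeholders; only the framing (type 0x81,
# version, length, checksum) follows the HDMI/CTA-861 InfoFrame rules.

def vendor_infoframe(oui: bytes, payload: bytes) -> bytes:
    assert len(oui) == 3, "IEEE OUI is 3 bytes"
    body = oui + payload
    header = bytes([0x81, 0x01, len(body)])   # type, version, length
    # The checksum byte makes the whole packet sum to 0 modulo 256,
    # so the sink can validate it with a single byte-wise sum.
    checksum = (-sum(header) - sum(body)) % 256
    return header + bytes([checksum]) + body

frame = vendor_infoframe(b'\x00\x01\x02', b'\x01\x30')  # hypothetical message
assert sum(frame) % 256 == 0                  # checksum property holds
print(frame.hex())
```

The sink-side scaler that recognizes the OUI and acts on the payload is exactly the part a monitor vendor has to build in for FreeSync-over-HDMI to work.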

From what I understand, AMD's intent was to put pressure on and/or work with VESA to adopt a more official method for variable refresh rate, and it looks like they succeeded. It's actually pretty similar to how Mantle got folded into Vulkan. It will be interesting to see whether G-Sync still has a place once HDMI's and DisplayPort's baked-in features work well. My guess is that NVIDIA will continue to push G-Sync as a "premium" experience and will probably push toward ultra-low latency at all costs.


2 minutes ago, ShredBird said:

Just to fill you in, the HDMI standard allows some flexibility for adding proprietary functionality. If a manufacturer, say Samsung, had a 7.1-channel HDMI receiver hooked up to one of their flagship TVs and wanted to create a proprietary information path between the two over HDMI, the specification allows for that. Earlier HDMI versions had this to some extent, with CEC, power-state mirroring, and that kind of thing, but from what I understand later versions take it even further. My understanding is that AMD simply took advantage of this ability to add proprietary functionality into HDMI and leveraged it to implement variable refresh rate. Of course, the manufacturer has to build in functionality on their side to be compatible with AMD's proprietary messaging.

I understand that, I meant I don't know the exact details of how AMD's FreeSync implementation over HDMI operates :)


3 minutes ago, Glenwing said:

I understand that, I meant I don't know the exact details of how AMD's FreeSync implementation over HDMI operates :)

I'd be interested in seeing a whitepaper on it myself. It would be interesting to learn how they managed to get the latency down using a mechanism that wasn't designed for that.

