
Nvidia slams AMD FreeSync: "We can't comment on pricing of products that don't exist"

Faa

I don't know. That sounds impossible, and the opposite of what it should do. The monitor can't know what frame rate the game is running at, so it can't tell the GPU to stop rendering. At best, the monitor might ask the GPU what refresh rate to run at. It sounds like a program design problem: should the client tell the seller what to bring, or should the seller ask the client what it wants? If the end result is the same (or in this case similar), it shouldn't matter which approach was taken.

The monitor runs at a certain rate and tells the GPU not to exceed that rate. That's a lot simpler than knowing the frame rate of the game. G-Sync is the GPU calling the shots.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Actually it's the other way around. G-Sync uses a complex two-way system, where the GPU constantly asks the monitor whether it's ready for a frame. When the monitor says yes, a frame is sent.

Adaptive Sync, on the other hand, uses plug and play to tell the GPU upon connection what the monitor's supported minimum and maximum refresh rates are. The GPU then sends frames within that interval, and the monitor displays each one instantly.

Both G-Sync and Adaptive Sync will stutter if the frame rate drops below the monitor's supported minimum.

So Adaptive Sync is simpler, easier, and actually better (remember, Adaptive Sync supports 9-240 Hz in the standard; G-Sync only 30-144 Hz).
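To make the two approaches easier to picture, here is a minimal Python sketch. Everything in it (the Monitor class, the method names, the 30-144 Hz range) is invented for illustration and comes from neither specification; it only mirrors the two handshake styles described above.

```python
# Illustrative sketch only: every name and number here is invented,
# not taken from the G-Sync or Adaptive Sync specifications.

class Monitor:
    def __init__(self, min_hz=30, max_hz=144):
        self.min_hz, self.max_hz = min_hz, max_hz

    def supported_range(self):
        # What a plug-and-play capability report boils down to.
        return self.min_hz, self.max_hz

    def ready_for_frame(self):
        return True  # stub: a real panel is ready between refreshes

    def display(self, frame):
        print("displaying", frame)

def gsync_style_present(frame, monitor):
    """Two-way: the GPU keeps asking the monitor before every frame."""
    while not monitor.ready_for_frame():
        pass  # keep polling until the panel says yes
    monitor.display(frame)

def adaptive_sync_style_present(frame, fps, monitor):
    """One-way after setup: the monitor declared its range at plug-in,
    the GPU self-limits, and the panel draws whatever arrives."""
    lo, hi = monitor.supported_range()
    if lo <= fps <= hi:
        monitor.display(frame)

m = Monitor()
gsync_style_present("frame A", m)
adaptive_sync_style_present("frame B", fps=90, monitor=m)
```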

You're thinking of G-Sync, which does that; it's a technique Nvidia has patented out the wazoo.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I don't care. I'm happy the Green Team gave them a small figurative finger.

Main rig on profile

VAULT - File Server


Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C


Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)


Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


You're thinking of G-Sync, which does that; it's a technique Nvidia has patented out the wazoo.

Everything in my post is correct. You must be confused. 

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


The amount of misinformation and bullcrap is making my brain melt. Why are people arguing about this? It was clearly a click-bait title. Nobody slammed anyone; Nvidia simply said they can't comment on a technology that doesn't actually exist yet. How were they meant to comment on the pricing of a product that doesn't even have a price tag yet?

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


Everything in my post is correct. You must be confused. 

Nope. I've read the RFCs (Request for Comments, i.e. protocol documentation) for both FreeSync and G-Sync. FreeSync lacks a dynamic refresh rate, as shown by the demo being locked at 47-48 fps. It commands the GPU to stop drawing if the monitor is in the middle of a frame, which also happens to slow down general-purpose computation on the GPU. It's a far inferior solution, lower cost or not.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Can we just wait until FreeSync/Adaptive Sync samples actually come out before we either start praising it or start throwing stones at it?

CPU amd phenom ii x4 965 @ 3.4Ghz | Motherboard msi 970a-g46 | RAM 2x 4GB Team Elite | GPU XFX Radeon HD 7870 DD | Case NZXT Gamma Classic | HDD 750 GB Hitachi | PSU ocz modxstream pro 600w


Nope. I've read the RFCs (Request for Comments, i.e. protocol documentation) for both FreeSync and G-Sync. FreeSync lacks a dynamic refresh rate, as shown by the demo being locked at 47-48 fps. It commands the GPU to stop drawing if the monitor is in the middle of a frame, which also happens to slow down general-purpose computation on the GPU. It's a far inferior solution, lower cost or not.

 

Sure you have. VESA disagrees with you: http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also, the 47-48 fps figure from some random German site is a claim that has not been proven.

If your point were correct that the monitor tells the GPU not to draw (which I doubt, since there is no two-way communication), then there would be no point in G-Sync or Adaptive Sync whenever the fps goes above or below the monitor's supported Hz interval.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


source

his ass


So you are saying that both AMD and VESA are lying through their teeth, and we should never trust either company ever again?

No, the presentation was just misleading, that's all.


Sure you have. VESA disagrees with you: http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also, the 47-48 fps figure from some random German site is a claim that has not been proven.

If your point were correct that the monitor tells the GPU not to draw (which I doubt, since there is no two-way communication), then there would be no point in G-Sync or Adaptive Sync whenever the fps goes above or below the monitor's supported Hz interval.

It was proven with a high-speed camera and some basic frame-counting math. Also, VESA can claim whatever it wants publicly, because the RFC covers their ass legally and is accessible to anyone to read. Sorry, but your golden goose lays leaden eggs. I'm going to come across as a cocky bastard for saying this, but you really can't beat me when it comes to doing research and knowing the facts. When FreeSync monitors come out, AnandTech and others can give comprehensive side-by-side comparisons, and you're disappointed, just remember you were warned by a real enthusiast. Also, there IS two-way communication in DVI, HDMI, and DP; look at the RFCs of all three protocols. There's two-way communication in the RAM bus, in PCIe, in USB, in SATA, and in every post-VGA graphics output protocol.
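As an aside, the "high-speed camera plus basic math" method is easy to make concrete: film the panel at a known capture rate, note which captured frames show a visible screen update, and divide. A toy example with invented numbers (chosen only so the result lands near the disputed 47-48 fps figure):

```python
# Toy example of deriving an effective refresh rate from high-speed
# footage. All numbers are invented for illustration only.

capture_fps = 1000  # high-speed camera capture rate
# Indices of the captured frames in which the panel visibly updated:
update_frames = [0, 21, 42, 63, 84]

intervals = [b - a for a, b in zip(update_frames, update_frames[1:])]
avg_interval = sum(intervals) / len(intervals)  # in camera frames
effective_hz = capture_fps / avg_interval       # 1000 / 21 = ~47.6 Hz

print(f"effective refresh rate: {effective_hz:.1f} Hz")
```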

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nope. I've read the RFCs (Request for Comments, i.e. protocol documentation) for both FreeSync and G-Sync. FreeSync lacks a dynamic refresh rate, as shown by the demo being locked at 47-48 fps. It commands the GPU to stop drawing if the monitor is in the middle of a frame, which also happens to slow down general-purpose computation on the GPU. It's a far inferior solution, lower cost or not.

That doesn't sound right. VESA has said that it supports variable refresh rates.

 

Citation from the Adaptive-Sync whitepaper:

Variable refresh rate technologies such as DisplayPort Adaptive-Sync address these issues by providing a mechanism that allows the display refresh rate to change dynamically in response to the rendering frame rate of the game. In the case illustrated in Figure 2, the display will wait until Render Frame B is finished and ready before updating the display.
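In other words, the quoted mechanism amounts to the display stretching the gap between refreshes until the next frame is ready, clamped to its supported range. A toy timing model of that sentence (my own sketch with invented names and an illustrative 30-144 Hz range, not code from VESA):

```python
# Toy model of the quoted behaviour: the display waits for the next
# rendered frame, within its refresh limits. Names/numbers invented.

def next_refresh_ms(frame_ready_ms, last_refresh_ms, min_hz=30, max_hz=144):
    shortest = 1000.0 / max_hz  # cannot refresh faster than max_hz
    longest = 1000.0 / min_hz   # must refresh again before min_hz elapses
    wait = frame_ready_ms - last_refresh_ms
    return last_refresh_ms + min(max(wait, shortest), longest)

# Render Frame B takes 18 ms: the display simply refreshes at the 18 ms
# mark (about 55 Hz, well inside the range) instead of tearing.
print(next_refresh_ms(frame_ready_ms=18.0, last_refresh_ms=0.0))  # 18.0
```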

I don't think VESA has anything to gain from lying.

 

Anyway, I'm going to take Nvidia's side here. You just can't comment on products that aren't out yet. We will have to wait and see what happens when products actually hit the market.


That doesn't sound right. VESA has said that it supports variable refresh rates.

 

Citation from the Adaptive-Sync whitepaper:

 

 

Anyway, I'm going to take Nvidia's side here. You just can't comment on products that aren't out yet. We will have to wait and see what happens when products actually hit the market.

A white paper is not a legally binding source. We see white papers on IP, HTTP, and experimental protocols all the time, even today, but do you know how many of the provably false ones result in sanctions, academic or industrial? Pretty much none. Like I said, VESA can say what they want, but the Adaptive-Sync RFC has no way for the GPU to tell the monitor to stop drawing or delay a frame flush.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


A white paper is not a legally binding source. We see white papers on IP, HTTP, and experimental protocols all the time, even today, but do you know how many of the provably false ones result in sanctions, academic or industrial? Pretty much none. Like I said, VESA can say what they want, but the Adaptive-Sync RFC has no way for the GPU to tell the monitor to stop drawing or delay a frame flush.

But what would they gain from lying? I mean, it's a non-profit organization.

 

I can't seem to find any RFC for it (probably because RFC is an IETF thing, not VESA). Can you please give me a link?


But what would they gain from lying? I mean, it's a non-profit organization.

 

I can't seem to find any RFC for it (probably because RFC is an IETF thing, not VESA). Can you please give me a link?

AMD stands to gain from it, due to the PR and the connection with its 3xx-series GPUs all supporting DP 1.2a or later. Don't forget AMD has a pretty big voice at VESA. I just found another article about FreeSync. I'll be back with the RFC in a minute, though I may need to find a way to host the PDF.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


@dalekphalm

 

Sorry, that was meant as a joke in response to that particular poster's sarcastic reply.

 

Reminds me that sarcasm doesn't work online :D

Owner of a top of the line 13" MacBook Pro with Retina Display (Dual Boot OS X El Capitan & Win 10):
Core i7-4558U @ 3.2GHz II Intel Iris @ 1200MHz II 1TB Apple/Samsung SSD II 16 GB RAM @ 1600MHz


The amount of misinformation and bullcrap is making my brain melt. Why are people arguing about this? It was clearly a click-bait title. Nobody slammed anyone; Nvidia simply said they can't comment on a technology that doesn't actually exist yet. How were they meant to comment on the pricing of a product that doesn't even have a price tag yet?

 

 

Can we just wait until FreeSync/Adaptive Sync samples actually come out before we either start praising it or start throwing stones at it?


FreeSync was, and is, an attempt by AMD to steal some of Nvidia's thunder. They don't have anything to show for themselves.

They showed more than enough during the Bitcoin GPU mining craze.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


The monitor runs at a certain rate and tells the GPU not to exceed that rate. That's a lot simpler than knowing the frame rate of the game. G-Sync is the GPU calling the shots.

That makes no sense. What you're describing basically sounds like V-Sync.


This has really devolved into conspiracy-theory nonsense; the idea that AMD/VESA is "lying" about Adaptive Sync is completely absurd. And to top it off, this whole ridiculous thread is based on a completely sensationalist headline. Obviously Nvidia cannot comment on the prices of a product that's not yet released; there wasn't even any "Nvidia slams AMD" in the first place.


Nope. I've read the RFCs (Request for Comments, i.e. protocol documentation) for both FreeSync and G-Sync.

It was proven with a high-speed camera and some basic frame-counting math. Also, VESA can claim whatever it wants publicly, because the RFC covers their ass legally and is accessible to anyone to read. Sorry, but your golden goose lays leaden eggs. I'm going to come across as a cocky bastard for saying this, but you really can't beat me when it comes to doing research and knowing the facts. When FreeSync monitors come out, AnandTech and others can give comprehensive side-by-side comparisons, and you're disappointed, just remember you were warned by a real enthusiast. Also, there IS two-way communication in DVI, HDMI, and DP; look at the RFCs of all three protocols. There's two-way communication in the RAM bus, in PCIe, in USB, in SATA, and in every post-VGA graphics output protocol.

A white paper is not a legally binding source. We see white papers on IP, HTTP, and experimental protocols all the time, even today, but do you know how many of the provably false ones result in sanctions, academic or industrial? Pretty much none. Like I said, VESA can say what they want, but the Adaptive-Sync RFC has no way for the GPU to tell the monitor to stop drawing or delay a frame flush.

 

You are so full of bullpoo it's beyond comprehension. RFC is an IETF thing and covers internet protocols and other standards concerning the internet. It has NOTHING to do with VESA, DisplayPort, G-Sync, or anything else. There is nothing on their site containing any information on any of those subjects either.

 

Legally binding to whom? VESA is not responsible to anyone other than its own members and whomever it signs contracts with. Those contracts ARE legally binding. Some random internet organization does not hold special legal rights, nor is any organization or hardware vendor required to have their non-internet hardware approved or documented by it. The IETF, or should I say ISOC, which owns it, is a non-profit org that is in no way an authority on hardware. As such, they have no legal right or authority over any hardware standard or over other organizations such as VESA.

 

Your anti-AMD posts are nearing propaganda, and you obviously have some sort of agenda here. Name-dropping random organizations does not make you credible, nor does it back up your made-up arguments.

You are welcome to post links to the "official" RFCs for these technologies, but we both know they don't exist. IF they do, for some strange reason, they are no more relevant, and not at all more legally binding, than anything else VESA releases about its own standard.

 

 

There is ONE question you ask that I will grant you, and that I find interesting myself: what happens in Adaptive Sync when a game produces more frames than the monitor's maximum supported rate? We know the ASUS ROG G-Sync monitor supports a 30-144 Hz range. We know that if the fps goes under 30, stutter will be introduced, like V-Sync. We also know that if the fps goes ABOVE 144, latency spikes:

 

[Image: CS:GO input-lag chart with G-Sync, from Blur Busters]

http://www.blurbusters.com/gsync/preview2/

 

I don't know yet whether Adaptive Sync tells the GPU to hold frames when it hits the monitor's maximum supported Hz. I don't see the point in producing frames that cannot be drawn, but some pro players might disagree here. Either way, we will have to see what happens. Remember, variable refresh rate ONLY works within the monitor's supported interval; that is the case for both Adaptive Sync and G-Sync. If you have any official document stating what happens, feel free to link that as well.
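Since no official document has been produced in this thread, here is the disputed edge-case behaviour written out as a toy model, with both possibilities for the above-max case. This is my own sketch of the debate, not taken from any spec:

```python
# Toy model of the edge cases debated above; my own sketch, not from
# any specification. The 30-144 Hz range mirrors the example monitor.

def panel_behaviour(fps, min_hz=30, max_hz=144, gpu_waits=True):
    if fps < min_hz:
        # The panel cannot hold an image forever: it re-scans the old
        # frame, which is where the V-Sync-like stutter comes from.
        return "repeat previous frame (stutter)"
    if fps > max_hz:
        if gpu_waits:
            # Finished frames queue up until the panel can take them:
            # smooth output, but input latency grows (the spike above).
            return "GPU waits/buffers (latency rises)"
        return "excess frames dropped"  # the other possibility
    return "frame shown as soon as it is ready"

for fps in (20, 90, 200):
    print(fps, "->", panel_behaviour(fps))
```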

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You are so full of bullpoo it's beyond comprehension. RFC is an IETF thing and covers internet protocols and other standards concerning the internet. It has NOTHING to do with VESA, DisplayPort, G-Sync, or anything else. There is nothing on their site containing any information on any of those subjects either.

Legally binding to whom? VESA is not responsible to anyone other than its own members and whomever it signs contracts with. Those contracts ARE legally binding. Some random internet organization does not hold special legal rights, nor is any organization or hardware vendor required to have their non-internet hardware approved or documented by it. The IETF, or should I say ISOC, which owns it, is a non-profit org that is in no way an authority on hardware. As such, they have no legal right or authority over any hardware standard or over other organizations such as VESA.

Your anti-AMD posts are nearing propaganda, and you obviously have some sort of agenda here. Name-dropping random organizations does not make you credible, nor does it back up your made-up arguments.

You are welcome to post links to the "official" RFCs for these technologies, but we both know they don't exist. IF they do, for some strange reason, they are no more relevant, and not at all more legally binding, than anything else VESA releases about its own standard.

There is ONE question you ask that I will grant you, and that I find interesting myself: what happens in Adaptive Sync when a game produces more frames than the monitor's maximum supported rate? We know the ASUS ROG G-Sync monitor supports a 30-144 Hz range. We know that if the fps goes under 30, stutter will be introduced, like V-Sync. We also know that if the fps goes ABOVE 144, latency spikes:

[Image: CS:GO input-lag chart with G-Sync, from Blur Busters]

http://www.blurbusters.com/gsync/preview2/

I don't know yet whether Adaptive Sync tells the GPU to hold frames when it hits the monitor's maximum supported Hz. I don't see the point in producing frames that cannot be drawn, but some pro players might disagree here. Either way, we will have to see what happens. Remember, variable refresh rate ONLY works within the monitor's supported interval; that is the case for both Adaptive Sync and G-Sync. If you have any official document stating what happens, feel free to link that as well.

RFCs are not unique to the IETF; that's a very common misconception. Furthermore, the reason it's bad for the monitor to tell the GPU to stop is that it causes jittering (damn autocorrect). If the GPU calls the shots, it can edit an intermediate frame to prevent tearing via a simple algorithm. For people who want butter-smooth graphics, G-Sync is already engineered from the right direction, whereas FreeSync isn't. If it turns out that FreeSync is perfect out of the box (not likely), then G-Sync will be revised to better handle the edge cases it currently fails at, but in the end FreeSync will jitter in high-performance cases and G-Sync won't.

And before someone on here accuses me of fanboying, I've posted a number of articles on AMD's recent innovations. This just isn't going to be one of them.
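For what it's worth, the "simple algorithm" being claimed here (and it is only this poster's claim about G-Sync, not documented behaviour) would presumably look something like the following hypothetical sketch: hold the last complete frame whenever the panel is mid-scan, and swap only on a refresh boundary.

```python
# Hypothetical sketch of the claimed GPU-side logic: never swap while
# the panel is mid-scan, so no refresh ever mixes two frames (tearing).
# This illustrates the poster's claim, not documented G-Sync behaviour.

def choose_scanout_frame(new_frame, prev_frame, panel_mid_scan):
    if panel_mid_scan:
        # Swapping now would tear; keep showing the last complete frame
        # and present the new one at the next refresh boundary.
        return prev_frame
    return new_frame

print(choose_scanout_frame("B", "A", panel_mid_scan=True))   # -> A
print(choose_scanout_frame("B", "A", panel_mid_scan=False))  # -> B
```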

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


RFCs are not unique to the IETF; that's a very common misconception. Furthermore, the reason it's bad for the monitor to tell the GPU to stop is that it causes jittering (damn autocorrect). If the GPU calls the shots, it can edit an intermediate frame to prevent tearing via a simple algorithm. For people who want butter-smooth graphics, G-Sync is already engineered from the right direction, whereas FreeSync isn't. If it turns out that FreeSync is perfect out of the box (not likely), then G-Sync will be revised to better handle the edge cases it currently fails at, but in the end FreeSync will jitter in high-performance cases and G-Sync won't.

And before someone on here accuses me of fanboying, I've posted a number of articles on AMD's recent innovations. This just isn't going to be one of them.

 

In other words, you have no proof of anything and no sources to back up the wild claims and BS you posted earlier. You have been caught red-handed lying.

If the fps goes over a 144 Hz maximum, I doubt anyone will see any judder or stutter. I just showed you a source proving that G-Sync spazzes out when the fps goes outside the supported interval; there is heavy buffering going on there. I'm not saying that is bad per se, but we don't know exactly how G-Sync deals with the problem. What I CAN say is that you have not yet proven your claim that Adaptive Sync dictates that the GPU stop rendering frames when the supported maximum Hz is passed.

You have nothing to back up your claim about what Nvidia will (or won't) do with G-Sync. You don't know; it's pure speculation. Neither do you know how Adaptive Sync is going to behave in end products.

 

"RFC" is very much a term used only by ISOC and similar bodies for internet standards. It is not a term I've come across in any other industry, dealing with anything other than internet protocols.

 

I'm still waiting for all those fancy sources you claim to have read. I guess you "forgot" where they are.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

