
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

When OLED comes, and OLED is good enough to hold an image at 0 Hz, G-Sync WILL be superior; unless FreeSync makes a second edition that isn't capped at a minimum of 9 Hz.

 

 

Also, AMD had better fix those ghosting issues through drivers, fast.


When OLED comes, and OLED is good enough to hold an image at 0 Hz, G-Sync WILL be superior; unless FreeSync makes a second edition that isn't capped at a minimum of 9 Hz.

 

 

Also, AMD had better fix those ghosting issues through drivers, fast.

Ghosting = the panel makers' problem.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Soooooooo, I guess what I've learned from all the talk about G-Sync and FreeSync is that I need to wait until the tech becomes more mature and the best products arise from both of them.


http://www.blurbusters.com/gsync/preview/

"With G-SYNC, you gain the best of both VSYNC ON and VSYNC OFF worlds. You get the higher frame rates of VSYNC OFF (during the G-SYNC range of 30Hz through 144Hz) without the input lag penalty, and no tearing at all."

Yes, 30-144Hz is the limitation of current monitors; it has nothing to do with the theoretical limit, as I'm trying to explain but you keep blatantly ignoring. Just as FreeSync is capable of doing 9-240Hz, G-Sync is capable of doing 1-240Hz. However, as I keep trying to explain, there are no monitors available that cover such a variable refresh rate range, which is why you see 30-144Hz limits on G-Sync monitors and 40-144Hz limits on FreeSync monitors.
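To put the same point more concretely (a purely illustrative sketch, not either vendor's actual driver code, and the 30-144Hz window here is just an example panel, not any specific monitor): the spec defines the theoretical range, but what actually happens at a given frame rate depends on the window the monitor's panel supports.

# Illustrative only; not NVIDIA's or AMD's real logic.
def vrr_behaviour(fps, panel_min_hz=30, panel_max_hz=144):
    if fps > panel_max_hz:
        # Above the window: the panel can't refresh any faster, so frames
        # are capped (or tear) at the panel's maximum.
        return f"capped at {panel_max_hz} Hz"
    if fps < panel_min_hz:
        # Below the window: the panel must refresh anyway, e.g. by redrawing
        # the last frame, so true frame-by-frame sync is lost.
        return f"panel redraws old frames at {panel_min_hz} Hz"
    # Inside the window: the panel refreshes exactly when a new frame arrives.
    return f"synced at {fps} Hz"

# The spec may allow 9-240Hz (FreeSync) or 1-240Hz (G-Sync) in theory,
# but the window you actually get is whatever the monitor supports.
print(vrr_behaviour(90))   # synced at 90 Hz
print(vrr_behaviour(20))   # panel redraws old frames at 30 Hz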


But that's the point you aren't getting. The G-Sync standard is actually 1-240Hz, but current G-Sync panels only support 30-144Hz. So in the comparison they list G-Sync as only 30-144Hz, yet for their own side they list the "theoretical" limit rather than what the monitors are actually capable of. In other words, they list what G-Sync monitors are capable of, not what the actual G-Sync standard is capable of.

 

 

Read above.

I didn't imagine that G-Sync supported such a range until Tom Peterson alluded to it in the part of the interview I quoted him on earlier. Do you happen to know where this information has been published, though? I couldn't find anything when I searched.


Yes, 30-144Hz is the limitation of current monitors; it has nothing to do with the theoretical limit, as I'm trying to explain but you keep blatantly ignoring. Just as FreeSync is capable of doing 9-240Hz, G-Sync is capable of doing 1-240Hz. However, as I keep trying to explain, there are no monitors available that cover such a variable refresh rate range, which is why you see 30-144Hz limits on G-Sync monitors and 40-144Hz limits on FreeSync monitors.

 

Could you please supply a source for that? Tom from Nvidia has stated that the tech only supports 30-144Hz.

 


 

As for the non-timing stuff, clearly the scaler/TCON/whatever needs to drive the monitor properly. Expecting driver updates to cover every FreeSync monitor ever made is quite silly; how would AMD even do it? No, so far it seems certain settings can remove the ghosting; if not, then the scalers need more updating, or the monitor vendors need to do more tweaking.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Analogy:

The 9-240 Hz range is like having cars that can go up to 240 mph when the only roads available require you to go faster than 40 and slower than 144 mph.

They quoted a theoretical limit, not the one that exists in practice.

It's like Nvidia saying they have 4GB of VRAM: theoretically true, but not entirely accurate.

Welp


Analogy:

The 9-240 Hz range is like having cars that can go up to 240 mph when the only roads available require you to go faster than 40 and slower than 144 mph.

They quoted a theoretical limit, not the one that exists in practice.

It's like Nvidia saying they have 4GB of VRAM: theoretically true, but not entirely accurate.

 

Adaptive Sync is a standard, so it has to be future-proof. OLED should be able to support something like that, which we might see in VR headsets like the Oculus. Further down the line, we might see OLED monitors too. Right now the LCD tech is the limiting factor for the Hz interval.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Analogy:

The 9-240 Hz range is like having cars that can go up to 240 mph when the only roads available require you to go faster than 40 and slower than 144 mph.

They quoted a theoretical limit, not the one that exists in practice.

It's like Nvidia saying they have 4GB of VRAM: theoretically true, but not entirely accurate.

 

That analogy falls apart in Germany where you can, in fact, go 240 mph.

 

When tech enthusiasts start using cars to make comparisons... it makes me cry every time.


I think you may need to remove your green glasses and present some evidence of AMD cards being less reliable; otherwise you're being even more biased than he was.

AMD cooling solutions? I was unaware AMD was even in the cooling market; I have yet to see an AMD cooler being sold. Both AMD and Nvidia reference coolers are loud and suck balls; if you buy them you're either an idiot or more worried about appearance than performance, so AMD running warmer shouldn't even bother you.

You're clearly an Nvidia fanboy; your last line sort of makes that obvious. AMD are far from being left in the dust, and making things open won't make you as much money, sure, but it's less of a douche move. AMD are like that nice guy who gets friend-zoned, lol; Nvidia are the abusive douche that every woman falls all over, lol.

Nvidia's reference coolers have gotten a lot better and actually aren't that loud anymore. Far from silent, but they're hardly the turbines they used to be. Of course they have a cooling solutions team. Someone had to design those coolers.

The problem with your final analysis is that it leaves out one very inconvenient fact: AMD can't develop new tech quickly and rebrands 90+% of its new products year over year because it doesn't have the money, thanks to horrifically stupid financial choices. AMD could learn a thing or two from Nvidia and Intel: price what the market will bear and deal with price wars later.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That analogy falls apart in Germany where you can, in fact, go 240 mph.

When tech enthusiasts start using cars to make comparisons... it makes me cry every time.

I was trying to make things clear for the fanboys who clearly didn't understand what the others were saying.

Welp


Oh noes.

 

Who'd have thought that AMD's knee-jerk, shoehorned software solution to the same problem Nvidia spent years of R&D and hardware development on wasn't going to be just as good as the Nvidia implementation!?

 

Actually, anyone with the most basic common sense thought that.

 

I'm still buying an AMD GPU next, but it was always extremely obvious that FreeSync wouldn't be as good as G-Sync. I'm more surprised people actually thought FreeSync was going to be even in the same neighborhood.

 

The real question is: does Nvidia's implementation warrant the extra buy-in cost of the ecosystem? Hell no. AMD is far superior in this regard.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


Variable refresh rate technology is awesome, but it's completely off the table for me until there are vendor-agnostic solutions. I am in no way comfortable investing in a monitor that requires me to commit to either Nvidia or AMD hardware for its lifetime.


I'm confused. He's saying that it won't work when it's outside the FreeSync FPS range, but isn't that the same as G-Sync? Also, even if G-Sync is a lot better than FreeSync, I don't want to pay $150 extra, maybe $20 extra, so they need to cut back on the price.

FPGAs are always expensive. They're very exotic chips and only 2 foundries make them: Intel and Samsung.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Oh noes.

 

Who'd have thought that AMD's knee-jerk, shoehorned software solution to the same problem Nvidia spent years of R&D and hardware development on wasn't going to be just as good as the Nvidia implementation!?

 

Actually, anyone with the most basic common sense thought that.

 

I'm still buying an AMD GPU next, but it was always extremely obvious that FreeSync wouldn't be as good as G-Sync. I'm more surprised people actually thought FreeSync was going to be even in the same neighborhood.

 

The real question is: does Nvidia's implementation warrant the extra buy-in cost of the ecosystem? Hell no. AMD is far superior in this regard.

 

But Adaptive Sync IS a hardware implementation. And because existing scaler vendors build this tech into their scalers, we get a lot more functionality, like a proper OSD, colour and other tweaks, up to professional calibration, as well as multiple inputs like HDMI and DVI. G-Sync has none of this, as Nvidia had to invent the scaler from the ground up.

 

So far, the actual synced framerate within the interval seems to be equally good between Adaptive Sync and G-Sync, judging from the reviews we've seen. Some of the ghosting issues seem to come from a lack of tweaking by the monitor vendors, but some can be removed by changing settings in the monitor.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


But Adaptive Sync IS a hardware implementation. And because existing scaler vendors build this tech into their scalers, we get a lot more functionality, like a proper OSD, colour and other tweaks, up to professional calibration, as well as multiple inputs like HDMI and DVI. G-Sync has none of this, as Nvidia had to invent the scaler from the ground up.

So far, the actual synced framerate within the interval seems to be equally good between Adaptive Sync and G-Sync, judging from the reviews we've seen. Some of the ghosting issues seem to come from a lack of tweaking by the monitor vendors, but some can be removed by changing settings in the monitor.

Adaptive sync is over DP only. Would you people get all your facts straight before posting?

Also, GSYNC and FreeSync have almost no use for professional applications.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Adaptive sync is over DP only. Would you people get all your facts straight before posting?

Also, GSYNC and FreeSync have almost no use for professional applications.

 

My facts are completely in order. Note how I did NOT state that Adaptive Sync works over HDMI or DVI? Only that Adaptive Sync monitors can have these inputs, for use with laptops, Blu-ray players, gaming consoles, etc.

 

That makes these monitors better than G-Sync monitors by giving more options. Adaptive Sync can be used for power-saving features as well (which is what variable vblank in eDP was originally invented for). This is also a reason why Intel might support AS, as they can use it for power saving. That is very interesting in a professional setting.
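Rough numbers, purely to illustrate the power-saving idea (hypothetical figures, not measurements from any panel):

# Hypothetical figures only: a static desktop doesn't need 60 redraws a second,
# which is why variable vblank doubles as a power-saving feature.
active_hz = 60        # typical desktop refresh rate
idle_hz = 30          # rate the panel could drop to while the image is static
idle_seconds = 3600   # one hour of showing an unchanged screen

saved = (active_hz - idle_hz) * idle_seconds
print(saved)  # 108000 fewer refresh cycles for the panel/pipeline per idle hour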

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I love the idea of variable refresh, but can't get past being locked into a certain GPU ecosystem.

Monitors do not get changed/upgraded with the same frequency as other hardware...

Considering that some people do change GPU makers between builds and would then have a monitor feature rendered completely useless, it's a no-go for me.

/I know this is off-topic, but it is my thought process on the whole variable refresh matter in general...

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


I love the idea of variable refresh, but can't get past being locked into a certain GPU ecosystem.

Monitors do not get changed/upgraded with the same frequency as other hardware...

Considering that some people do change GPU makers between builds and would then have a monitor feature rendered completely useless, it's a no-go for me.

/I know this is off-topic, but it is my thought process on the whole variable refresh matter in general...

 

It's not off-topic, it's a very valid criticism of both.

 

Right now, only crazy early adopters with $500 to spend on arguably mediocre monitors that work great for gaming will do this. I MIGHT do it, but here's my problem: I have Nvidia GPUs right now. That means $700 or so on a ROG Swift, which admittedly is a great monitor for gaming.

But then I have fanboys telling me how FreeSync is so much cheaper. Okay, I'd need $400 in GPUs plus $500 in a monitor anyway. So what the hell am I saving? Nothing. Not now; prices won't actually make sense for the next year or so.

 

If I'm spending that much on a monitor, I'm getting either a PA328Q or a 34UC97: monitors that don't care which GPU you use, that will actually stand the test of time and not lose their feature set if I choose to change GPUs two years down the line, like I have before. Last time, I went from AMD to Nvidia, and all I had to get was the GPUs. Now I would need a new monitor just to get this tech that actually works and improves gaming. But it's also stupid to arbitrarily lock myself into either one right now.

AMD and Nvidia are equally guilty in this. 


It's not off-topic, it's a very valid criticism of both.

 

Right now, only crazy early adopters with $500 to spend on arguably mediocre monitors that work great for gaming will do this. I MIGHT do it, but here's my problem: I have Nvidia GPUs right now. That means $700 or so on a ROG Swift, which admittedly is a great monitor for gaming.

But then I have fanboys telling me how FreeSync is so much cheaper. Okay, I'd need $400 in GPUs plus $500 in a monitor anyway. So what the hell am I saving? Nothing. Not now; prices won't actually make sense for the next year or so.

 

If I'm spending that much on a monitor, I'm getting either a PA328Q or a 34UC97: monitors that don't care which GPU you use, that will actually stand the test of time and not lose their feature set if I choose to change GPUs two years down the line, like I have before. Last time, I went from AMD to Nvidia, and all I had to get was the GPUs. Now I would need a new monitor just to get this tech that actually works and improves gaming. But it's also stupid to arbitrarily lock myself into either one right now.

AMD and Nvidia are equally guilty in this. 

 

Precisely: 60 FPS (or higher) Vsync is still a more reasonable option, given that it's free and works on all platforms and on current and future hardware. All Nvidia (and to an extent, AMD) are doing is introducing a totally unwanted element of exclusivity into PC gaming; that shit needs to be kept in consoles only.

-------

Current Rig

-------


AMD and Nvidia are equally guilty in this. 

 

How is AMD guilty in this? They made an open industry standard that everyone, including Nvidia, can use. Nvidia made a closed-off proprietary solution that AMD has no access to. All the blame, ALL OF IT, is on Nvidia. They could have taken the same industry-standard approach that AMD did and made a standard everyone could use. They chose not to, giving AMD, and everyone else, zero choice but to make their own version.

 

Precisely: 60 FPS (or higher) Vsync is still a more reasonable option, given that it's free and works on all platforms and on current and future hardware. All Nvidia (and to an extent, AMD) are doing is introducing a totally unwanted element of exclusivity into PC gaming; that shit needs to be kept in consoles only.

 

Some FreeSync models are either the same price as or cheaper than the non-FreeSync version. The LG 34UM67 has FreeSync and costs the same as the 34UM65, which does not; the 67 version might even have a lower MSRP in the US. AMD and Nvidia have no say over TV standards/manufacturers, so I don't see how you can make this a console thing. Also, why make consoles the master race? Makes no sense.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


How is AMD guilty in this? They made an open industry standard that everyone, including Nvidia, can use. Nvidia made a closed-off proprietary solution that AMD has no access to. All the blame, ALL OF IT, is on Nvidia. They could have taken the same industry-standard approach that AMD did and made a standard everyone could use. They chose not to, giving AMD, and everyone else, zero choice but to make their own version.

You're right, they could have taken the same approach... if they wanted to delay the release by 1-2 years and then launch something inferior. Are you really yelling at Nvidia for not sharing all their little secrets? Go to Heinz and try to get their ketchup recipe... good luck.

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

