
AMD FreeSync vs Nvidia G-Sync: Ultimate Verdict

TERAFLOP

You can use the frame buffer on the GPU to duplicate frames, which is a far better method considering every frame on all GPUs is stored there prior to being sent to the display.
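For anyone wondering what "duplicating frames from the frame buffer" would look like in practice, here's a toy sketch of the idea. Everything in it (the timings, the function, the numbers) is made up for illustration; it's not real driver code, just the logic of re-scanning the last completed frame whenever the next one arrives late:

```python
# Toy simulation of duplicating frames from the GPU frame buffer:
# if the game hasn't produced a new frame before the panel's maximum hold
# time runs out, the last completed frame is scanned out again.
# Simplified illustration only, not real driver behaviour.

MIN_REFRESH_HZ = 40
MAX_HOLD = 1.0 / MIN_REFRESH_HZ          # longest the panel may show one frame

def schedule_flips(frame_done_times):
    """Given the times (seconds) when the GPU finished each frame, return
    (time, frame_index) pairs for every scan-out, inserting duplicate scans
    of the previous frame whenever the next one is late."""
    flips = []
    t, last = frame_done_times[0], 0
    flips.append((t, last))
    for i, done in enumerate(frame_done_times[1:], start=1):
        while done - t > MAX_HOLD:       # new frame is late: re-scan the old one
            t += MAX_HOLD
            flips.append((t, last))
        t, last = done, i
        flips.append((t, last))
    return flips

# Frames finishing at 30 fps (33.3 ms apart) on a 40 Hz-minimum panel:
print(schedule_flips([0.0, 0.0333, 0.0666, 0.1000]))
```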

 

You have a source? As far as we know, G-Sync has licensing fees.

Yeah

"Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."

[Image: DSC_4620-640x516.jpg – the G-Sync module]

That's the G-Sync module, better known as an Altera Arria V GX FPGA. Nvidia has no right to charge a licensing fee as it's not their product at all.

"But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module."

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion

Stop asking people for a source if you can't even prove your own claims.


Yeah

"Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."

- snip -

That's the G-Sync module, better known as an Altera Arria V GX FPGA. Nvidia has no right to charge a licensing fee as it's not their product at all.

"But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module."

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion

Do you have another source, other than a completely Nvidia-biased one? Preferably one with at least some credibility (I'd prefer word directly from Nvidia).

 

Stop asking people for a source if you can't even prove your own claims.

Do clarify which claims of mine you mean.


Ya know, the problem with this "Ultimate Verdict" is that everyone is mistaking what the panel can do for the limitations of FreeSync. Panel and monitor manufacturers are the ones who control the frame rate range, not AMD. FreeSync's actual range is 9 Hz-240 Hz, so everyone needs to stop blaming AMD for the first crappy implementation by a monitor vendor just trying to beat everyone to market with a half-baked product.

PC: CPU - FX 8350 @4.5 Ghz | GPU - 3x R9 290 @1100 core/1300 memory | Motherboard - Asus Crosshair V Formula Z | RAM - 16 GB Mushkin Redline 1866 Mhz | PSU - Corsair AX 860w | SSD - ADATA SX900 256 GB | HDD - Seagate 3TB 7200RPM | CPU Cooler - Noctua NH D-14 | Case - Cooler Master HAF Stacker 935

Peripherals: Monitor - ASUS VN248H-P IPS | Keyboard - Corsair K70 | Mouse - Corsair M65 | Headphones - ASUS ROG Orion Pro


Who the hell plays games at those framerates?

The main point of a dynamic refresh rate is to smooth out the unavoidable dips in framerate when the game demands more from the GPU. Even if you're gaming at an _average_ of 60fps, the framerate will dip substantially when the action picks up. That's when freesync or gsync has to step in. Sure, you could turn the settings down in order to boost your minimum framerate, but the point of freesync or gsync was to let you avoid doing that, to some extent. Asking "Who the hell plays games at those framerates?" is a bit of a cop-out, depending on how low a minimum framerate we're talking (40fps for now). Asking this question is akin to questioning the whole point of variable refresh rate technology.

 

Linus commented on the WAN show that his experience with the 48-75Hz LG freesync monitor was not that good due to the high minimum of 48Hz. Hopefully it's better on the 40-144Hz Acer or BenQ monitors. ...and if it isn't, hopefully some company will sell a freesync monitor going down to 30Hz (or maybe even 20Hz).

 

Sure, freesync is good, but saying it's just as good as the current state of gsync is a bit of an exaggeration. For better or worse, AMD seems to have produced the cheaper, slightly inferior, but still perfectly adequate graphics solution again.


Freesync isn't free. It might cost a bit less, sure, but it is not free.

I recall seeing G-Sync monitors with multiple inputs. Sure, only one of them (DisplayPort) supports G-Sync, similarly to FreeSync-based monitors.

Competition and consumer demand will push manufacturers to make G-Sync monitors with more inputs.

 

Also, G-Sync is not responsible for color reproduction. The panel is, along with the manufacturer's calibration of the panel and the other components the manufacturer selects or creates to process the incoming signal (e.g. whether a color processor is used, how the look-up table is used/calibrated, an FRC system to emulate colors the panel can't natively produce, say if the manufacturer picks a 6-bit panel over a true 8-bit-per-channel panel).
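To make the color processing point a bit more concrete, here's a tiny toy model of two of the steps mentioned above: running a value through a calibration look-up table, then using FRC (temporal dithering) to approximate an 8-bit level on a 6-bit panel. It's purely illustrative, not any vendor's actual pipeline:

```python
# Toy illustration of two scaler-side color processing steps:
# applying a calibration look-up table, then FRC (temporal dithering) to
# show an 8-bit value on a 6-bit panel.

def apply_lut(value_8bit, lut):
    """Map an 8-bit input level through a 256-entry calibration LUT."""
    return lut[value_8bit]

def frc_level(value_8bit, frame_index):
    """Pick the 6-bit level to show this frame so the temporal average
    approximates the 8-bit target (simple 4-frame FRC pattern)."""
    base, remainder = divmod(value_8bit, 4)   # 8-bit -> 6-bit drops 2 bits
    # Show the higher level on 'remainder' out of every 4 frames.
    return base + (1 if frame_index % 4 < remainder else 0)

identity_lut = list(range(256))
print([frc_level(apply_lut(130, identity_lut), f) for f in range(4)])
# -> [33, 33, 32, 32]: averages to 32.5 on the 6-bit scale, i.e. ~130 on the 8-bit scale
```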

 

Also: wccftech... sensationalist title, and I don't think I need to continue.

This is simply untrue, please do check. The G-Sync module only supports a single DisplayPort input, and because it replaces the monitor's scaler, other inputs can't simply be "added". We're also talking about color processing, not color reproduction. Because G-Sync replaces the scalers employed by monitor makers to drive the panels, special color processing features of brands such as LG, Samsung, Dell and so on cannot be implemented; only what Nvidia implements in the module is available.


This is simply untrue, please do check. The G-Sync module only supports a single DisplayPort input, and because it replaces the monitor's scaler, other inputs can't simply be "added". We're also talking about color processing, not color reproduction. Because G-Sync replaces the scalers employed by monitor makers to drive the panels, special color processing features of brands such as LG, Samsung, Dell and so on cannot be implemented; only what Nvidia implements in the module is available.

Multiplexers. The multiplexer is your friend. You can do many things with multiplexers.

The main point of a dynamic refresh rate is to smooth out the unavoidable dips in framerate when the game demands more from the GPU. Even if you're gaming at an _average_ of 60fps, the framerate will dip substantially when the action picks up. That's when freesync or gsync has to step in. Sure, you could turn the settings down in order to boost your minimum framerate, but the point of freesync or gsync was to let you avoid doing that, to some extent. Asking "Who the hell plays games at those framerates?" is a bit of a cop-out, depending on how low a minimum framerate we're talking (40fps for now). Asking this question is akin to questioning the whole point of variable refresh rate technology.

 

Linus commented on the WAN show that his experience with the 48-75Hz LG freesync monitor was not that good due to the high minimum of 48Hz. Hopefully it's better on the 40-144Hz Acer or BenQ monitors. ...and if it isn't, hopefully some company will sell a freesync monitor going down to 30Hz (or maybe even 20Hz).

 

Sure, freesync is good, but saying it's just as good as the current state of gsync is a bit of an exaggeration. For better or worse, AMD seems to have produced the cheaper, slightly inferior, but still perfectly adequate graphics solution again.

Why would you infer that AMD's solution is cheaper (quality, not cost, I'm assuming) or inferior?  Again, it is the monitor manufacturers that regulate the refresh rate of the monitor, which has nothing to do with AMD's implementation.  FreeSync itself, as it stands right this second, is capable of driving anywhere from 9 Hz to 240 Hz.  This is panel and monitor manufacturers just rushing out products before they've optimized their hardware, just like when the first G-Sync monitors came out and everyone thought they looked like crap.  Wait a month or six, and you will see.  I bet that even if AMD doesn't issue any updates for FreeSync, you will still get more monitors with wider refresh rate ranges, simply due to manufacturers getting past the growing pains of producing a new product.

PC: CPU - FX 8350 @4.5 Ghz | GPU - 3x R9 290 @1100 core/1300 memory | Motherboard - Asus Crosshair V Formula Z | RAM - 16 GB Mushkin Redline 1866 Mhz | PSU - Corsair AX 860w | SSD - ADATA SX900 256 GB | HDD - Seagate 3TB 7200RPM | CPU Cooler - Noctua NH D-14 | Case - Cooler Master HAF Stacker 935

Peripherals: Monitor - ASUS VN248H-P IPS | Keyboard - Corsair K70 | Mouse - Corsair M65 | Headphones - ASUS ROG Orion Pro


Why would you infer that AMD's solution is cheaper (quality, not cost, I'm assuming) or inferior?  Again, it is the monitor manufacturers that regulate the refresh rate of the monitor, which has nothing to do with AMD's implementation.  FreeSync itself, as it stands right this second, is capable of driving anywhere from 9 Hz to 240 Hz.  This is panel and monitor manufacturers just rushing out products before they've optimized their hardware, just like when the first G-Sync monitors came out and everyone thought they looked like crap.  Wait a month or six, and you will see.  I bet that even if AMD doesn't issue any updates for FreeSync, you will still get more monitors with wider refresh rate ranges, simply due to manufacturers getting past the growing pains of producing a new product.

Yeah, in 6 or so months the monitor manufacturers will probably have a 30 Hz minimum monitor out. I was mainly responding to the claim that day-one freesync was as good as current gsync, which seemed kinda misleading since the monitors have a little way to go.

 

When I said "cheaper", I was referring to the cost to produce the monitor, not the quality of the monitor. That _is_ one of the main marketing lines for freesync, right?

 

Quoting the 9-240 Hz figure is like quoting the range on Linus' car's speedometer (which I presume goes up to around 180 km/h). Sure, it could read 180 km/h, but only if you took the speedo out and put it into a completely different car.


Finally, for the very first batch of monitors and for beta drivers, the results are pretty amazing IMO. It's a proper standard (and not a shitty proprietary solution with a silly mark-up) and it works. That's all it needs to do right now. In a month or two it should work for crossfire users, and monitor manufacturers will hopefully get the hang of it and use proper panels with a 30-ish minimum refresh rate. That's the goal for now, and I'm pretty sure that this could work for the second wave of free-sync monitors.

 

I think we can all agree on (and hope for) this scenario, and that Nvidia will make their own implementation of adaptive sync. I want a monitor that supports both, and if Nvidia doesn't accept that and tries to hold onto their shitty module (or tries to pull shenanigans like not allowing monitor manufacturers to support both, etc.), then free-sync it is. That'd be the final nail in the coffin for me personally. I'm thinking of switching back to Nvidia in a few years (obviously depending on the products then, but I am taking it into consideration based on the rumors around Pascal or what comes after that), but if they make a major feature of my monitor useless because they refuse to support the now-existing standard, then fuck them.

 

There is now a working solution that is less of a hassle and works just as well; support it, god damn it!


Sure. Yes, while it isn't dedicated hardware like Nvidia's, the manufacturer needs to implement the optional part of the DisplayPort standard to support it. That costs money in terms of R&D from the monitor manufacturer, support, and, well, the hardware needed. This adds cost to the monitor. The price of the monitor would not remain the same with or without FreeSync support.

I think I understand what you're saying here, but I don't see how that would be any different than what manufacturers of G-Sync monitors would have to endure.  Surely they have to do some R&D of their own, and the support and additional hardware costs of Nvidia's implementation are pretty well known to us.

 

I wanted to call BS a bit on the cost of putting in a port, but after a cursory search I've decided that I want to start a business in this sector fer sum ez monies instead.  Jeepers, dat some pricey tin being stamped out by those unionized robots.

http://au.element14.com/webapp/wcs/stores/servlet/Search?st=displayport&catalogId=15001&categoryId=800000001453&langId=43&storeId=10184

Yet that is pretty much a cost that is shared between both solutions, so it still strikes me as a moot point.


The main point of a dynamic refresh rate is to smooth out the unavoidable dips in framerate when the game demands more from the GPU. Even if you're gaming at an _average_ of 60fps, the framerate will dip substantially when the action picks up. That's when freesync or gsync has to step in. Sure, you could turn the settings down in order to boost your minimum framerate, but the point of freesync or gsync was to let you avoid doing that, to some extent. Asking "Who the hell plays games at those framerates?" is a bit of a cop-out, depending on how low a minimum framerate we're talking (40fps for now). Asking this question is akin to questioning the whole point of variable refresh rate technology.

 

Linus commented on the WAN show that his experience with the 48-75Hz LG freesync monitor was not that good due to the high minimum of 48Hz. Hopefully it's better on the 40-144Hz Acer or BenQ monitors. ...and if it isn't, hopefully some company will sell a freesync monitor going down to 30Hz (or maybe even 20Hz).

 

Sure, freesync is good, but saying it's just as good as the current state of gsync is a bit of an exaggeration. For better or worse, AMD seems to have produced the cheaper, slightly inferior, but still perfectly adequate graphics solution again.

The primary purpose of VRR is to remove tearing from frame to frame by extending the vblank (FreeSync) or filling it with a duplicate frame (G-Sync). Both technologies are meant to be a V-Sync replacement without being hardware-intensive. G-Sync does double buffering (triple buffering with V-Sync on), which keeps games more fluid even at lower frame rates (duplicate frames), although this shouldn't persuade anyone to want to run their games at just 30 frames per second. This is why the FreeSync scope starts at 40 Hz on most first-generation displays: you'll likely be maintaining above 40 FPS on most of these displays even in battle-heavy situations. The reasoning for that is it's not likely that someone will go out and buy a $500 QHD display that supports these technologies to run it in tandem with an R7 260X. If you have the money for one of these displays, then you likely have a machine capable of maintaining 60 frames per second in the games that you play.
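If it helps, here's a deliberately simplified model of the difference once a frame takes longer than the panel can hold (i.e. the frame rate drops below the panel's minimum). This is my own sketch of the behaviour described in the reviews, not vendor code, and the fallback assumptions are spelled out in the comments:

```python
# Simplified sketch of how the two approaches behave when a frame takes
# longer than the panel's maximum hold time (frame rate below the minimum).

PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144
MAX_HOLD = 1.0 / PANEL_MIN_HZ      # 25 ms: longest a frame can stay on screen
MIN_HOLD = 1.0 / PANEL_MAX_HZ      # ~6.9 ms: shortest refresh interval

def freesync_refresh_intervals(frame_time):
    # Inside the range the vblank is simply stretched to match the frame.
    # Below the range, first-generation FreeSync falls back to fixed-rate
    # refreshes (v-sync-like behaviour).
    if MIN_HOLD <= frame_time <= MAX_HOLD:
        return [frame_time]
    return [MIN_HOLD] * round(frame_time / MIN_HOLD)

def gsync_refresh_intervals(frame_time):
    # Below the range the module re-scans the previous frame from its local
    # buffer, so no refresh interval ever exceeds MAX_HOLD.
    intervals, remaining = [], frame_time
    while remaining > MAX_HOLD:
        intervals.append(MAX_HOLD)          # duplicate-frame scan-out
        remaining -= MAX_HOLD
    intervals.append(max(remaining, MIN_HOLD))
    return intervals

print(freesync_refresh_intervals(1 / 30))   # 30 fps: fixed-rate fallback
print(gsync_refresh_intervals(1 / 30))      # 30 fps: one duplicate scan, then the new frame
```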

 

I would agree that 48 Hz is a high minimum for a display, although I think in time these ranges will widen to something like 30-144 Hz. The technology is still in its infancy, with these displays running the first batch of scalers made to support Adaptive-Sync. Like any process, it takes time to refine.

 

I would say FreeSync is just as good as G-Sync. In none of the reviews done have we seen complaints about stuttering, juddering or tearing.

 


Anyone who ACTUALLY wants to see what the differences are should look into this 3 page review:

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion

 

And there are some differences, albeit they're all related to the LCD technology being shit, and both variable refresh rate solutions suffer differently.


Anyone who ACTUALLY wants to see what the differences are should look into this 3 page review:

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion

 

And there are some differences, albeit they're all related to the LCD technology being shit, and both variable refresh rate solutions suffer differently.

Believe it or not, that review wasn't only discredited on this forum but worldwide across the web. In the past couple of days I've even been on foreign forums where they're slamming that very review. Ghosting was fixed on the BenQ simply by changing the display's AMA settings. I think PCPer just took these displays out of the box and immediately proceeded with their review. They're an Nvidia mouthpiece and have been for a long time. The FreeSync implementation is pretty much better all around until you hit low frame rates, which is really a stupid argument: Nvidia brags that it's better at being slow. If you own one of these expensive displays then you likely own an R9 290/GTX 970 or higher GPU to drive your games above 60 FPS on it.


The module limits the inputs which a G-Sync monitor can support. G-Sync monitors only support DisplayPort input and do not allow for anything other than the most basic monitor features. And because monitor makers cannot use their own feature-rich scaler chips to drive these monitors, they cannot include any of their differentiating features in the monitor...  ...and the module does not support any form of audio output.

This is all good. Monitor makers' "features" are all fucking shit. And why the fuck would anyone use a monitor's audio?!

 

 

The G-Sync module also only allows for very preliminary color processing

This is the only possible concern.

I run my browser through NSA ports to make their illegal jobs easier. :P
If it's not broken, take it apart and fix it.
http://pcpartpicker.com/b/fGM8TW


Here's my ultimate verdict: if you have an AMD card, get a FreeSync monitor. If you have an Nvidia card, get a G-Sync monitor.

And don't play below the minimum refresh rate, because why would you anyway?

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


Believe it or not, that review wasn't only discredited on this forum but worldwide across the web. In the past couple of days I've even been on foreign forums where they're slamming that very review. Ghosting was fixed on the BenQ simply by changing the display's AMA settings. I think PCPer just took these displays out of the box and immediately proceeded with their review. They're an Nvidia mouthpiece and have been for a long time. The FreeSync implementation is pretty much better all around until you hit low frame rates, which is really a stupid argument: Nvidia brags that it's better at being slow. If you own one of these expensive displays then you likely own an R9 290/GTX 970 or higher GPU to drive your games above 60 FPS on it.

 

I haven't looked too much into this, but I hope you are right.

Because I really wanted the 34UM67, but I won't touch it before I see proof that the ghosting can be fixed with monitor settings.

If I can't find an example of how it is fixed, I will just wait for a G-Sync 21:9 version.


The FreeSync implementation is pretty much better all around until you hit low frame rates, which is really a stupid argument: Nvidia brags that it's better at being slow. If you own one of these expensive displays then you likely own an R9 290/GTX 970 or higher GPU to drive your games above 60 FPS on it.

It's not _that_ stupid an argument. The point of VRR is to avoid tearing and stuttering even when the framerate dips, and if gsync can take a slightly bigger dip while still hiding the stutters and tears, then that makes it slightly better (although you may pay a lot more than a slightly higher price for gsync). Arguing that you should just own a video card that won't dip below 60fps seems equivalent to arguing that you should just expect less from freesync. That said, if you're an fps fiend who isn't comfortable below even 60fps, then the current Acer and BenQ freesync monitors may already be all you'll need.

 

It's not that I like gsync better (at this stage, I'm liking freesync better for its open standard and (soon) larger selection of monitors). I am simply keen to point out where freesync's weaknesses lie: the monitor's minimum refresh rate establishes a fairly hard limit for how low you should allow your fps to dip, so go for (or at least prefer) the freesync monitor with the lower minimum refresh rate. If you go for the 48-75Hz LG monitor, it means your video card has a slightly higher hurdle to keep your fps over.
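To put numbers on that "hard limit": a monitor's minimum refresh rate is really just a per-frame time budget, which you can work out directly:

```python
# The minimum refresh rate translates directly into a frame-time budget:
# every frame must arrive within 1/min_refresh seconds or the panel falls
# out of its variable-refresh window.

for min_hz in (48, 40, 30):
    budget_ms = 1000 / min_hz
    print(f"{min_hz} Hz minimum -> every frame must land within {budget_ms:.1f} ms")

# 48 Hz minimum -> every frame must land within 20.8 ms
# 40 Hz minimum -> every frame must land within 25.0 ms
# 30 Hz minimum -> every frame must land within 33.3 ms
```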


Good grief.  How many times are we going to see people rationalize the same erroneous information?  FreeSync supports down to 9 Hz.  It's just these first few monitors that fall on their face under 48 fps, because that is what THOSE manufacturers' engineers were able to pull off with the panels they're using and within the time frame those companies wanted to come to market in.

This was the beef I had with leaving this much control with manufacturers to bring their own flavour of FreeSync to the table.  There are just going to be more mountains of disinformation disseminated in attempts to generalize apples-to-oranges comparisons and make it all sound like a pear without even testing further.


No one is going to build a 9 Hz monitor. We have down to 40 Hz now. I predict we will see something down to 30 Hz within the next 6 months, but it's not here YET... which is why labelling something as an "ultimate" verdict at this time is a little misleading. Hopefully when the 30 Hz freesync monitor gets here, any difference in appearance between gsync and freesync at low framerates will disappear.


Come on, are we going to complain about the LG FreeSync monitor? Look at the MSRP: it's about the same price (actually cheaper) for a better spec than the regular version.

http://www.lg.com/us/monitors/lg-34UM65-P-ultrawide-monitor

http://www.lg.com/us/monitors/lg-34UM67-P-ultrawide-led-monitor

 

Again, the beauty of FreeSync is that it's an open standard, so manufacturers are free to implement it across a wide variety of their monitors.

And yes, 48 Hz is quite high for a minimum, but again it's not priced higher than the regular version; there's no price premium here (MSRP), and you get a better spec!

 

I assume this first wave of FreeSync monitors wasn't designed for FreeSync in the first place; that's why there are some quirks here and there. They just got a firmware update to make them FreeSync-capable and robust enough for variable refresh rates.

 

I just hope all monitors from LG, Samsung, BenQ, and others that have some gaming marketing in their brochures include FreeSync or Adaptive-Sync capability in the future.


The problem for me with VRR and current FS monitors is that they still have a fairly high minimum refresh rate. Obviously we would all like to stay well above 40 or 48 fps, but that doesn't always happen. You actually have to have a fairly beefy setup to maintain that at just 1440p all the time in some of the more demanding games. And I certainly didn't buy a $600 GPU setup with a $500 monitor to not run the game at some pretty serious settings.

 

The big issue is with unoptimized games (re: Ubisoft), where VRR would actually make for a much better experience due to how choppy they run and the fact that they take an ungodly amount of GPU power to run at all. Remember how Unity couldn't average 60 fps with SLI 980s? So I'm holding out for the monitors that get closer to the theoretical 9 Hz limit. Imagine a VRR setup that could make such a steaming pile of dung still seem decently smooth even as the game bounced between 50 fps and 20 fps. It will happen, but probably not until the second generation of FreeSync monitors.

Turnip OC'd to 3Hz on air


It's not _that_ stupid an argument. The point of VRR is to avoid tearing and stuttering even when the framerate dips, and if gsync can take a slightly bigger dip while still hiding the stutters and tears, then that makes it slightly better (although you may pay a lot more than a slightly higher price for gsync). Arguing that you should just own a video card that won't dip below 60fps seems equivalent to arguing that you should just expect less from freesync. That said, if you're an fps fiend who isn't comfortable below even 60fps, then the current Acer and BenQ freesync monitors may already be all you'll need.

 

It's not that I like gsync better (at this stage, I'm liking freesync better for its open standard and (soon) larger selection of monitors). I am simply keen to point out where freesync's weaknesses lie: the monitor's minimum refresh rate establishes a fairly hard limit for how low you should allow your fps to dip, so go for (or at least prefer) the freesync monitor with the lower minimum refresh rate. If you go for the 48-75Hz LG monitor, it means your video card has a slightly higher hurdle to keep your fps over.

Only Nvidia sites and marketing could make the low-frame-rate performance of expensive, top-of-the-line products a priority to rate those products on. There is no less to expect from FreeSync unless you enjoy playing games at intolerable frame rates. I run an HD 5870, which is ages old, and I rarely ever dip below 48 FPS in games like Battlefield 3. The point of VRR is to make higher-refresh-rate gaming tolerable without always having to rely on getting 144 FPS instead of 60. It has zilch to do with sub-60 FPS performance as far as I'm concerned, simply because no one is going to go out and spend $500+ on one of these displays without a card capable of 60+ FPS performance.

 

One would argue that 48 FPS is nothing short of easily achievable. That sort of "limitation" is not really a limitation of FreeSync but of the display manufacturer (the XG270HU has a 40-144 Hz range). I would still own the LG even with its 48-75 Hz dynamic range, as I would just use Catalyst's global frame rate lock to limit my frame rate to 75 FPS. The target frame rates for this performance bracket are far beyond 48 FPS. I haven't seen a single user gaming on a 1440p display with a Bonaire GPU. Your argument is no different than Nvidia's official PR that one is better than the other because it's better at being slow, and that we should skimp out on our graphics cards and let the display make up for it.
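For what it's worth, a driver-level frame rate cap like that boils down to simple frame pacing: finish the frame, then wait out the rest of the time budget. Here's a rough sketch of the idea, where render() is just a stand-in for the game's draw call, not a real API:

```python
# Rough sketch of a 75 FPS frame cap via frame pacing. The render argument
# is a stand-in for the real draw call; the cap itself is just sleeping out
# whatever remains of each frame's time budget.

import time

TARGET_FPS = 75
FRAME_BUDGET = 1.0 / TARGET_FPS            # ~13.3 ms per frame

def capped_loop(render, seconds=2.0):
    end = time.perf_counter() + seconds
    frames = 0
    while time.perf_counter() < end:
        start = time.perf_counter()
        render()                                 # draw the frame (stand-in call)
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # wait out the rest of the budget
        frames += 1
    return frames / seconds                      # effective frame rate

print(capped_loop(lambda: None))                 # roughly 75 with a trivial "render"
```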


I'm concerned simply because no one is going to go out and spend $500+ on one of these displays without a card capable of 60+ FPS performance.

>4k

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


Only Nvidia sites and marketing could make the low-frame-rate performance of expensive, top-of-the-line products a priority to rate those products on. There is no less to expect from FreeSync unless you enjoy playing games at intolerable frame rates. I run an HD 5870, which is ages old, and I rarely ever dip below 48 FPS in games like Battlefield 3. The point of VRR is to make higher-refresh-rate gaming tolerable without always having to rely on getting 144 FPS instead of 60. It has zilch to do with sub-60 FPS performance as far as I'm concerned, simply because no one is going to go out and spend $500+ on one of these displays without a card capable of 60+ FPS performance.

 

One would argue that 48 FPS is nothing short of easily achievable. That sort of "limitation" is not really a limitation of FreeSync but of the display manufacturer (the XG270HU has a 40-144 Hz range). I would still own the LG even with its 48-75 Hz dynamic range, as I would just use Catalyst's global frame rate lock to limit my frame rate to 75 FPS. The target frame rates for this performance bracket are far beyond 48 FPS. I haven't seen a single user gaming on a 1440p display with a Bonaire GPU. Your argument is no different than Nvidia's official PR that one is better than the other because it's better at being slow, and that we should skimp out on our graphics cards and let the display make up for it.

 

You get what you pay for; when you buy bleeding-edge tech you may get cut.  You say you are happy to buy the LG panel and then limit your frame rate to 75; that's a bit like cutting off your nose to spite your face.  Why settle for a limited FPS range just so you can buy into freesync?

 

This is one of those new technologies where most people will either spend up and do it properly or wait for the products to advance to a more usable and affordable state.  And besides, it looks like, at current monitor prices, G-Sync only carries a 10-20% premium, if that, and so (depending on the user) it is mostly worth it for the wider frame rate range.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

