
Freesync range 35-90 only, on 144hz asus MG279Q?

Guest Strangerbob

now nvidia won the sync battle without doing anything

 

Except investing an arse ton into developing their own purpose-built scaler, rather than relying on 4 different companies to produce something that monitor makers then have to engineer around to make work with their chosen panels.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


now nvidia won the sync battle without doing anything

the sync battle is not won, it's just getting started. FreeSync will get better for sure.

 

But Nvidia did the groundwork. They were the ones who pushed this whole issue into the public consciousness. They were the ones who decided to tackle the problem of synchronizing dynamic game frame rates with a monitor's refresh. Then they did their R&D and were first to market as well. I'm not an Nvidia user anymore, but credit where it's due... Most of the gaming world didn't bother to stop and realize that this was a problem that needed to be addressed.


FreeSync can go up to 240Hz. The rest of the tech just isn't up to that level yet.

But tbh, does it matter? If FreeSync stops working when you go over 90Hz but the screen still runs up to 144Hz when it can, then what's the issue?

I don't think you need FreeSync, G-Sync or whatever sync when your PC spits out over 90fps on a 144Hz monitor.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


FreeSync can go up to 240Hz. The rest of the tech just isn't up to that level yet.

But tbh, does it matter? If FreeSync stops working when you go over 90Hz but the screen still runs up to 144Hz when it can, then what's the issue?

I don't think you need FreeSync, G-Sync or whatever sync when your PC spits out over 90fps on a 144Hz monitor.

The issue is stutter...

If you let the fps go to 144, they are not constant...

They can drop to 133, 127, 112, then back up to 136...

Without FreeSync, all this fluctuation will cause stutter...

Stutter is actually one of the reasons this whole FreeSync and G-Sync thing matters in the first place...
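To make the stutter point concrete, here's a tiny Python sketch (a simplified model; the fps numbers mirror the ones above) comparing how fluctuating frame times get presented on a fixed 144Hz refresh versus a variable-refresh display:

```python
import math

REFRESH_HZ = 144
TICK_MS = 1000 / REFRESH_HZ  # ~6.94 ms between fixed refresh ticks

# fluctuating frame rates like the ones described above
fps_samples = (133, 127, 112, 136, 133, 127, 112, 136)
frame_times_ms = [1000 / fps for fps in fps_samples]

t = 0.0
fixed, vrr = [], []
for ft in frame_times_ms:
    t += ft
    # vsync on a fixed 144 Hz panel: a finished frame waits for the next tick
    fixed.append(math.ceil(t / TICK_MS) * TICK_MS)
    # VRR (FreeSync/G-Sync): the panel refreshes the moment the frame is ready
    vrr.append(t)

def intervals(times):
    """Time between consecutive on-screen frames, in ms."""
    return [round(b - a, 2) for a, b in zip([0.0] + times, times)]

print("fixed 144Hz:", intervals(fixed))  # mixes ~6.94 and ~13.89 ms: stutter
print("VRR:        ", intervals(vrr))    # tracks the frame times: smooth
```

Even though every frame here renders in a narrow 7-9 ms band, the fixed-refresh panel can only show frames at multiples of its tick, so some frames sit on screen twice as long as others. That uneven pacing is the stutter; VRR simply presents each frame as it finishes.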


now nvidia won the sync battle without doing anything

This doesn't mean FreeSync is crap or doesn't work as expected. The problem here isn't FreeSync at all; it's the manufacturer claiming Adaptive-Sync would work up to at least 120 Hz when we are not seeing that on the product page. Still, in among the manufacturer's misstatements there's a bit of information that has proven quite a few theorists here wrong. I remember a few members not long ago claiming FreeSync could only support 48-75 Hz on an IPS display; this display completely debunks that theory with a 35-90 Hz range on IPS. This is still a great monitor (as it would seem thus far), it just doesn't support Adaptive-Sync frequency ranges as high as the manufacturer initially stated. I would still buy it, as I don't mind playing games at 60 FPS (which is all any single GPU is going to give you at 1440p in modern games anyway), and with a 35-90 Hz range there's plenty of headroom above and below for frame rate swings.


I was about to post about this. I am a bit upset actually, if it's true. 90Hz is a real blow considering it's ROG and was rumored to have 120Hz (which they were looking at overclocking) and then finalized at 144Hz. I was so looking forward to it; it is cheaper and much better than the BenQ competitor. I would actually have been willing to pay over a grand for this screen, but oh well.

Project Devils Delight

Mobo: ASUS B85 PRO Gamer  CPU: Intel Core i5 4670k  RAM: 8GB Kingston HyperX Beast 1600MHz  GPU: PowerColor Devil 13 Dual Core 290X

Case: Corsair Air 540 strips  Storage: Samsung 256GB SSD and a 1TB WD Black (2013 Model)


Do you have a source for your claims? Several well-known reviews state otherwise.

None of them do. All of them say with FreeSync enabled the range goes down. Find me a single reputable one with a counterexample.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


The issue is stutter...

If you let the fps go to 144, they are not constant...

They can drop to 133, 127, 112, then back up to 136...

Without FreeSync, all this fluctuation will cause stutter...

Stutter is actually one of the reasons this whole FreeSync and G-Sync thing matters in the first place...

 

Not on FreeSync, as you can just run a free variable framerate instead of activating vsync. So if you ever go above 90 fps, you will just get screen tearing instead. Obviously that's not a good thing, but it is less noticeable at 100+ fps, and a LOT better than stutter.

 

But I agree, it seems like the first-gen Adaptive Sync monitor controllers are not quite up to the job in a satisfactory way, but surely they will get there. Once they do (this year, I assume), they will render G-Sync redundant. If only Nvidia had built on Adaptive Sync instead of making a G-Sync module, this would all be a non-issue, but Nvidia swears by anti-competitive vendor lock-in, resulting in lower adoption and fewer options at higher prices.
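The in-window vs. out-of-window behavior being argued here can be sketched as a small decision function (the 35-90 limits and the wording of the outcomes are just illustrative, based on this thread):

```python
def freesync_behavior(fps, window=(35, 90), vsync=False):
    """What the display does for a given frame rate, per the discussion above."""
    lo, hi = window
    if lo <= fps <= hi:
        return "in window: refresh tracks frame rate (no tearing, no stutter)"
    if fps > hi:
        # above the window you pick your poison: vsync judder or tearing
        return "above window: vsync stutter" if vsync else "above window: tearing"
    return "below window: falls back to fixed-refresh behaviour"

for fps in (30, 60, 120):
    print(fps, "->", freesync_behavior(fps))
```

The point in the post above is the `fps > hi` branch: with vsync left off, exceeding the 90 Hz ceiling gives tearing rather than stutter, which is the less objectionable failure at high frame rates.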

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Not on FreeSync, as you can just run a free variable framerate instead of activating vsync. So if you ever go above 90 fps, you will just get screen tearing instead. Obviously that's not a good thing, but it is less noticeable at 100+ fps, and a LOT better than stutter.

But I agree, it seems like the first-gen Adaptive Sync monitor controllers are not quite up to the job in a satisfactory way, but surely they will get there. Once they do (this year, I assume), they will render G-Sync redundant. If only Nvidia had built on Adaptive Sync instead of making a G-Sync module, this would all be a non-issue, but Nvidia swears by anti-competitive vendor lock-in, resulting in lower adoption and fewer options at higher prices.

Nvidia also swears by delivering quality and ensuring its customers have a great experience. Do you really think Nvidia would have invested in an expensive FPGA, with fully flexible control over both ends of the hardware and the ability to change every specification of the standard at any time, if it were so easy to pull off variable refresh rates at such high upper limits? The truth is Nvidia is just as interested in maximizing its consumer base and providing the best experience to further that goal. It would do AMD some good to follow suit. And Nvidia does not in any way lock out its competitor; monitor makers are free to implement a secondary scaler with whatever ports they want, including DP 1.2a, making them FreeSync-capable.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


None of them do. All of them say with FreeSync enabled the range goes down. Find me a single reputable one with a counterexample.

 

Not questioning you, but you seem to be more familiar with them than me. Does the BenQ Freesync monitor not have an effective Freesync range of 40-144Hz? I thought that was the whole selling point of that monitor.

Turnip OC'd to 3Hz on air


First, I LOVE being questioned as biased and then actually having the issue we reported on called a "defect." Really brings everything home for me.

 

Secondly, the issue ASUS appears to have is that the scaler they implemented for an IPS display can only support that 35-90 Hz range. Other scalers might have been able to hit other ranges, but no current FreeSync-enabled scaler can hit 35/40-144 Hz on IPS. Why? I honestly don't know. But I know that's the case now.

None of them? Have you checked the Acer and BenQ 144Hz FreeSync monitors?


Nvidia also swears to delivering quality and assuring its customers have a great experience. Do you really think Nvidia would have invested in an expensive FPGA in order to have full flexible control over both ends of the hardware, able to change every specification of the standard at any time, if it was so easy to pull off variable refresh rates at such high upper limits? The truth is Nvidia is just as interested in maximizing its consumer base and providing the best experience to forward that goal. It would do AMD some good to follow suit. And, Nvidia does not in any way lock out its competitor, monitor makers are free to implement a secondary scaler with whatever ports they want, including DP 1.2a, making them FreeSync-capable.

 

Which they could have done by working with the entire industry, instead of just bypassing the monitor controller vendors outright. Also, you don't "invest" in an FPGA; you just buy them and use them. All monitor controller vendors use FPGAs too, to prototype their controllers; the difference is they then use cheaper ASICs to make the final controllers from the FPGA design. G-Sync was fundamentally broken at launch, so they needed the FPGA to fix that in every single monitor. All of this makes everything more expensive, for no good reason.

 

Nvidia is interested in monopolies, hence making everything proprietary to achieve vendor lock-in. The result is as I mentioned: fewer options, higher prices and less competition. We already see the price premiums Nvidia charges for G-Sync monitors, and for most of their cards.

 

So your solution is to have monitor vendors use 2 premium-priced monitor controllers? One of which is overly expensive to fill Nvidia's pocket? So now the monitors should come with a $200 price premium? Hardly a useful solution, especially considering that you can get Adaptive Sync for free (see LG's 29" ultrawide monitor, which costs the same with and without AS).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Not questioning you, but you seem to be more familiar with them than me. Does the BenQ Freesync monitor not have an effective Freesync range of 40-144Hz? I thought that was the whole selling point of that monitor.

If I remember correctly it tends to stop around 105-110 with FreeSync, and overdriving it causes problems.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Which they could have done by working with the entire industry, instead of just bypassing the monitor controller vendors outright. Also, you don't "invest" in an FPGA; you just buy them and use them. All monitor controller vendors use FPGAs too, to prototype their controllers; the difference is they then use cheaper ASICs to make the final controllers from the FPGA design. G-Sync was fundamentally broken at launch, so they needed the FPGA to fix that in every single monitor. All of this makes everything more expensive, for no good reason.

Nvidia is interested in monopolies, hence making everything proprietary to achieve vendor lock-in. The result is as I mentioned: fewer options, higher prices and less competition. We already see the price premiums Nvidia charges for G-Sync monitors, and for most of their cards.

So your solution is to have monitor vendors use 2 premium-priced monitor controllers? One of which is overly expensive to fill Nvidia's pocket? So now the monitors should come with a $200 price premium? Hardly a useful solution, especially considering that you can get Adaptive Sync for free (see LG's 29" ultrawide monitor, which costs the same with and without AS).

It's a perfectly good reason, and it was not broken at launch in the slightest. Beyond that, an FPGA offers the flexibility to add features to existing hardware, which is pro-consumer. It also allows any hidden bugs or limitations to be erased, since FPGAs are fully re-programmable. It's the superior technical solution and certainly the longer-lasting one.

Also, no, the scaler industry moves at a snail's pace. Like Hell Nvidia should have worked with and waited for them. That's exactly what landed AMD in hot water, unable to move the last of its 280+ inventory once Nvidia's solution became the de facto standard and consumer favorite. It would have been better to release their own module while simultaneously trying to get Adaptive Sync into the VESA standard. That would have been optimal from both a business and a consumer perspective.

FreeSync apparently has limitations even BenQ's solution doesn't circumvent. This is the problem with open standards and complex ASICs: there is going to be a flaw found down the road. It's not like mining, where it's purely arithmetic. If Intel can screw up TSX-NI, which is a small instruction set extension, VESA can screw up Adaptive Sync! It's obvious that has happened, and it only makes it more obvious that Nvidia's solution is technically superior. Also, both AMD and Nvidia are interested in monopoly status. It's the very nature of a market in which the products are not direct substitutes and simply cannot be. They both patent-troll the living Hell out of Intel and guard fundamental IP from the 90s, such as vertex shaders, to keep anyone new from competing. AMD is no saint and Nvidia is no demon. Nvidia is superior and the market favorite and prices itself accordingly. Get AMD to make better products and market them appropriately, or actually invite competition, or quit whining. Nvidia acts in accordance with the wants of its shareholders, just like any big company. Consumers are a tertiary concern in business.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I was reading the FreeSync FAQ on the ASUS site.

I have a question.

It says to enable FreeSync in the monitor menu and then select a refresh rate on the computer between 35 and 90Hz...

Forgetting the fact that the monitor is 144Hz capable, how do we limit the fps to 90?

Is it done in Catalyst? Windows? The games? Or does it require a workaround like Afterburner?
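Whichever tool ends up doing it (an in-game limiter or an external cap like Afterburner/RTSS), the underlying idea is the same: pad each frame out to the 1/90 s budget so the output stays inside the FreeSync window. A minimal sketch in Python; the 5 ms fake workload and the function names are made up for illustration:

```python
import time

TARGET_FPS = 90                  # stay inside the 35-90 Hz FreeSync window
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~11.1 ms per frame

def run_frame():
    """Stand-in for the game's simulate-and-render work."""
    time.sleep(0.005)  # pretend a frame takes 5 ms

def run_capped(n_frames):
    """Render n_frames, sleeping away whatever is left of each frame budget."""
    t0 = time.perf_counter()
    for _ in range(n_frames):
        start = time.perf_counter()
        run_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
    return time.perf_counter() - t0

total = run_capped(9)
print(f"9 frames in {total:.3f}s -> ~{9 / total:.0f} fps")
```

Real limiters use busy-wait loops near the deadline for better precision than `sleep`, but the principle is just this: never start the next frame before the current one's 1/90 s slot has expired.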


None of them do. All of them say with FreeSync enabled the range goes down. Find me a single reputable one with a counterexample.

The Anandtech review, for one, states the complete opposite of your speculation. AMD's own internal testing has shown charts proving these displays can hit well above 110 Hz without any drawbacks (actually improving performance in some cases). You said FreeSync is hard-limited to 35-90 Hz even on displays proven to exceed 110 Hz without issue or any additional input lag. Unless you have a source, I, like everyone else, am calling your bluff (keep in mind your claims are the ones in question, so you need to provide the source).


None of them do. All of them say with FreeSync enabled the range goes down. Find me a single reputable one with a counterexample.

 

Here you go:

 

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-

 

From the article:

 

"Even though our LG 34UM67 has a fairly narrow VRR window at just 26 Hz, the BenQ and Acer 2560x1440 will cover a much larger 104 Hz range, diminishing the impact of the negative characteristics above."

 

Pretty sure there aren't any IPS monitors available with VRR windows that span above 100Hz at this time, but there are TN options. The models the article is referring to are the BenQ XL2730Z and the Acer XG270HU. I used PCPer's article for obvious reasons ;)

 

 

 

This is all interesting and whatnot, but right now the cost of available FreeSync monitors is making the term "FreeSync" seem very ironic. The only thing early adopters are ever rewarded with is bugs and limitations.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


The Anandtech review, for one, states the complete opposite of your speculation. AMD's own internal testing has shown charts proving these displays can hit well above 110 Hz without any drawbacks (actually improving performance in some cases). You said FreeSync is hard-limited to 35-90 Hz even on displays proven to exceed 110 Hz without issue or any additional input lag. Unless you have a source, I, like everyone else, am calling your bluff (keep in mind your claims are the ones in question, so you need to provide the source).

Do not twist my words. I said there are limits to having your cake and eating it too: smoothness vs. frame rate vs. color. So far it looks like with FreeSync you cannot have all 3. Also, I've read the Anandtech article, and you're full of crap. And I require no source, because I claimed nothing; I observed that there seems to be a pattern emerging. I did not in any way argue there's a hard limit on FreeSync, I said there appears to be one based on everything we have seen. Also, internal testing is never a valid source. Real products, real reviews and benchmarks, or it's not worth stating.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Here you go:

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-

From the article:

"Even though our LG 34UM67 has a fairly narrow VRR window at just 26 Hz, the BenQ and Acer 2560x1440 will cover a much larger 104 Hz range, diminishing the impact of the negative characteristics above."

Pretty sure there aren't any IPS monitors available with VRR windows that span above 100Hz at this time, but there are TN options. The models the article is referring to are the BenQ XL2730Z and the Acer XG270HU. I used PCPer's article for obvious reasons ;)

That does not contradict what I said. Secondly, there are AHVA (IPS) panels which go to 120 or 144Hz now.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I don't know if it's sad or hilarious that some people pretend this isn't an issue because "you shouldn't run games at over 90 FPS anyway" or similar inane comments. 35-90Hz is a decent range, but come on guys... Are you really going to resort to the same bullshit argument console players use, "you don't need more than ## FPS anyway"? It would be objectively better for the range to be 35-144Hz, and I think a lot of people were expecting it to be. Not just expecting it: a lot of potential buyers will buy the monitor thinking it will work over 90Hz and then get screwed.

Here is what I've gathered from the FreeSync vs G-Sync material I have read (not that much, so correct me if I am wrong).

1) G-Sync is right now a far more polished and superior product. This might change in the future, but if you are looking to buy a monitor with variable refresh rate and want a good product, G-Sync is basically your only choice.

2) AMD's FreeSync certification doesn't seem to hold a lot of water. Again, this might change in the future, but right now it seems like they are just putting it on really poor implementations of Adaptive-Sync, possibly just to get as many products with the branding onto the market as possible. If you're buying a FreeSync certified product you might get something decent, or you might get crap.

3) AMD should force monitor manufacturers to specify which refresh rate range their monitors support. This would avoid a lot of current and future issues.


That does not contradict what I said. Secondly, there are AHVA (IPS) panels which go to 120 or 144Hz now.

 

I'm not trying to prove you right or wrong. One of your posts asked for an example of an available AS monitor with a VRR window that stretches above 100Hz. I provided a linked article, written by a very trustworthy reviewer, that refers to two available monitors with a 40-144Hz VRR range.


If I remember correctly it tends to stop around 105-110 with FreeSync, and overdriving it causes problems.

There might be something wrong with your memory. I think the problems you may be thinking of are FreeSync not working with the monitor's AMA settings, mentioned in the TFT Central review: http://www.tftcentral.co.uk/reviews/content/benq_xl2730z.htm#freesync_problems

 

Whereas the BenQ was never mentioned to have problems with its range, and I have heard it said that it can reach 144 with FreeSync working: https://youtu.be/VkrJU5d2RfA?t=24m33s

 


I don't know if it's sad or hilarious that some people pretend this isn't an issue because "you shouldn't run games at over 90 FPS anyway" or similar inane comments. 35-90Hz is a decent range, but come on guys... Are you really going to resort to the same bullshit argument console players use, "you don't need more than ## FPS anyway"? It would be objectively better for the range to be 35-144Hz, and I think a lot of people were expecting it to be. Not just expecting it: a lot of potential buyers will buy the monitor thinking it will work over 90Hz and then get screwed.

Here is what I've gathered from the FreeSync vs G-Sync material I have read (not that much, so correct me if I am wrong).

1) G-Sync is right now a far more polished and superior product. This might change in the future, but if you are looking to buy a monitor with variable refresh rate and want a good product, G-Sync is basically your only choice.

2) AMD's FreeSync certification doesn't seem to hold a lot of water. Again, this might change in the future, but right now it seems like they are just putting it on really poor implementations of Adaptive-Sync, possibly just to get as many products with the branding onto the market as possible. If you're buying a FreeSync certified product you might get something decent, or you might get crap.

3) AMD should force monitor manufacturers to specify which refresh rate range their monitors support. This would avoid a lot of current and future issues.

 

 

Absolutely. I find it very hypocritical of some members to be using arguments about how unimportant high frame rates are, when only months ago they were arguing that you need more than 100 for super smooth gaming. Some were also arguing how cheap the scalers are, making FreeSync cheaper than G-Sync; now they are trying to argue that the chips are too expensive to put more than one in, to give the option of both FreeSync and G-Sync. Please. More inconsistencies in these arguments than a politician's maiden speech.

 

Take your pick, LTT: you can't pick and choose which logic you use based on the product. You have to apply the same logic to all products, otherwise you are just a biased fanboy.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


This topic is now closed to further replies.
