
42" 4k display with freesync ~ 800 dollars - the korean off brands are coming and gsync won't be able to play

Sammael

Literally everything you are saying has already been contradicted by people who actually know what they are talking about. I don't know what to say to you. Take your fingers out of your ears and pay more attention.

The GPU editors over at the hard forums are on record as stating that none of the single GPUs on the market are sufficient for 4K alone. And the types of games I and a lot of other people enjoy playing, RPGs, fare poorly at 4K.

http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/8

Even on high: an average in the low 50s. Minimums are certainly a lot lower than that. That is what you find acceptable for a card people drop at LEAST $700 on with tax? I don't. My standards are not that low. If yours are, then yes, it's fine at 4K.
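To make the average-versus-minimums arithmetic concrete, here is a minimal sketch; the frame times are made-up numbers for illustration only, not data from the AnandTech run:

```python
# Illustrative only: hypothetical frame times (ms) from a 4K benchmark pass.
frame_times_ms = [18, 19, 20, 19, 21, 35, 19, 20, 42, 19, 20, 21, 19, 38, 20]

# Per-frame instantaneous fps.
fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average fps: total frames divided by total time.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# "1% low": average fps of the slowest 1% of frames (here, the single worst frame).
worst = sorted(fps_per_frame)[:max(1, len(fps_per_frame) // 100)]
one_percent_low = sum(worst) / len(worst)

print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low:.1f} fps")
```

The point the sketch illustrates: an average in the low 50s can coexist with individual frames dropping far lower, which is what the minimums complaint is about.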


There's literally a screenshot posted above of MegaDave playing Crysis 3 at 66fps on a single Titan X. Stop talking out of your arse and read what people are telling you the experience is like. You are talking absolute bullshit when you assert that it is limited to 50 fps.


A single game. If Crysis 3 is all you play then a 980 Ti is fine. If your mode of argument is to simply ignore every game where it is deficient at 4K, then you are cherry-picking. At 1440p it's solid across the board in all but the most cartoonishly unoptimized games. You cannot make that claim at 4K, even on high settings, across the board. Pointing to Crysis 3 is a data point, not a general performance standard. And we don't even know how those fps hold up during different scenes. Pointing the camera at a single guy against a wall? Really? Things get messier graphically, which is why you want some headroom, not something that can, in some cases, with lowered settings and the right camera shots, just barely maybe handle 4K on a single card.


 

But Crysis 3 is still arguably the single most graphically intensive game available. It is the poster child of "this will cripple your PC". Yes, you have to turn down some settings, but those settings have been demonstrated not to affect the visual quality at all.

 

I'm not going to argue any more. You've had three different people who actually use 4K and know what this card is capable of all tell you the same thing, and it's done nothing to shift your misconceptions; you keep arguing falsities as though you were on an equal footing here.

 

You're just wrong.


Incidentally, if you coupled that 980 Ti with a G-Sync 4K display, the experience would be improved. But that brings us back to the cost differential of a G-Sync display vs. the coming onslaught of Korean off-brand adaptive sync displays. 2016 will be a much more compelling upgrade year on multiple fronts.



Firstly, it's a general idea of the performance one can gain with minor tweaking of the graphics settings. Secondly, you are wrong; it's more than just a screenshot taken at the perfect moment to capture a good framerate.


Who needs Korean monitors when you can get the real deal?

Bought my Acer XB270HU for 390€ instead of 799€



And yet that poster child can be less taxing than games like Dragon Age: Inquisition. Poster children for "most graphically demanding" that are not actually the most demanding need to stop being used as argument killers.

Look at these numbers

http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/5

They say those are high settings, but the point is that they are probably looking at more taxing scenes than a single dude against a wall at high settings, and extrapolating typical performance from that.


AHAHAHAHAHAHA

 

Wow, this post. Fanboys...

Why exactly are you calling him a fanboy? He has a valid point. While some FreeSync monitors are absolutely superb, some are pretty awful. The window in which FreeSync can operate differs drastically depending on the panel, and there is no set standard in which they operate. Some will have a window of 45-72 fps, others much lower or higher than that. The experience that FreeSync offers can differ greatly because of this. Fall outside of that window, and it becomes awful. Nvidia has standardized its requirements, making sure all G-Sync monitors have a window from at least 30 fps up to as high as the monitor's refresh rate can go. So if a monitor has a 60 Hz refresh rate, you can expect the window to be 30-60 fps; if it is 144 Hz, you can expect it to be 30-144 fps.

 

If a FreeSync monitor does not explicitly advertise its FreeSync window, you will be taking a gamble on how well it will perform. That is not to say AMD won't improve this in the future, but the way it is being handled as of now is not exactly the best. Calling him a fanboy for that remark seems rather childish, don't you think?
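To illustrate what that operating window means in practice, here is a minimal sketch; the window values and frame rates are illustrative assumptions, not the specs of any particular monitor:

```python
def vrr_status(fps: float, window_min: float, window_max: float) -> str:
    """Describe what a variable-refresh-rate display does at a given frame rate."""
    if fps < window_min:
        # Below the window the panel falls back to fixed refresh:
        # judder with V-sync on, or tearing with V-sync off.
        return "below window: VRR inactive, judder or tearing"
    if fps > window_max:
        # Above the window the display is capped at its maximum refresh rate.
        return "above window: capped at max refresh"
    return "inside window: refresh tracks the frame rate, smooth output"

# Hypothetical comparison: a narrow 45-72 fps FreeSync window vs. a 30-144 fps G-Sync window.
for fps in (28, 40, 75, 100, 150):
    print(fps, "|", vrr_status(fps, 45, 72), "|", vrr_status(fps, 30, 144))
```

The narrower the window, the more of a game's real frame-rate range falls outside it, which is the complaint about panels that only advertise something like 45-72 fps.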



Fair enough, that poster was just highlighting that a graphical drop-down gave solid gains without much of any tangible visual loss. But others, and perhaps you, are taking that to mean it is all that is needed for consistent 66 fps gaming at 4K. And that is a gross misinterpretation of the data. It smacks of buyer's bias, of people looking through the rose-colored glasses of their own purchase, or of their Nvidia love, inflating a card's merits in ALL fields, like single-GPU 4K gaming. I'm not buying it, the hard forum GPU editors are not buying it, the AnandTech guys are not buying it. But "I" am the one taking crazy pills?


You definitely need to be much more careful with FreeSync displays due to the wider variation in performance. Most people should probably wait until others test the displays.


Yeah. Luckily, some manufacturers are starting to put the FreeSync window in their documentation on retail websites. That is a serious boon when picking out which monitors to use. It is especially important on these 4K models, because you need to figure out beforehand what kind of hardware you will need in order to stay within that window consistently. The price premium on G-Sync is not ideal, and I do wish it were cheaper, since Nvidia was able to implement it on already existing laptop panels (they claim the G-Sync module is not required on laptops, because a laptop only requires one input method). Still, seeing AMD pull it off with a plethora of input methods makes me question exactly what the G-Sync module does to differentiate itself from your typical FreeSync panel.

 

I am not exactly an expert on these two technologies, so I can't say for certain. If someone knows, I would love to be informed.
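On the point above about figuring out beforehand whether your hardware will stay inside a monitor's window: one rough sanity check is to take a frame-time log from a game you actually play and see what fraction of frames would land inside a given window. A minimal sketch, with made-up frame times and windows:

```python
# Hypothetical frame-time log (ms) from a game you intend to play at 4K.
frame_times_ms = [14, 16, 22, 25, 30, 12, 18, 21, 35, 16, 19, 28, 24, 11, 17]

def fraction_in_window(times_ms, window_min_fps, window_max_fps):
    """Fraction of frames whose instantaneous fps falls inside the VRR window."""
    fps = [1000.0 / t for t in times_ms]
    inside = [f for f in fps if window_min_fps <= f <= window_max_fps]
    return len(inside) / len(fps)

# Compare a narrow 45-72 fps window against a 30-144 fps window for the same trace.
print("45-72 window: ", fraction_in_window(frame_times_ms, 45, 72))
print("30-144 window:", fraction_in_window(frame_times_ms, 30, 144))
```

A published window plus a frame-time trace is enough to estimate how often the display will actually be syncing, which is why having the window in the retail documentation matters.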



FYI, I am the one who captured the Crysis 3 screenshots. Does it still drop below 60 fps? Absolutely. To say single GPUs aren't yet capable of 4K as a result is buffoonish, though.

 

Not to discredit Anand and the others, because I know they are reputable sources, but during benchmarking their investment into a single game's graphical settings is probably scant: they pick a single setting, run it across all options, bench it, and publish it. You can do better with slight modifications.


 

[H]ardOCP prioritizes highest playable settings when doing reviews. They also actually play the games for each test, for each card, every single time.


Good. I am glad they do.


 

Same. I wish more reviewers would do the same thing. It takes a lot more time, but it's also a lot more accurate.



And you only have to pay $200-300 more for that little bit of peace of mind!


Worth is subjective, is it not? Who are you to deem what is appropriate or not for people to spend their money on? While FreeSync offers almost identical performance to G-Sync, it has been proven by several sources that G-Sync is still a superior technology. No matter how slight that superiority might be, people are still getting something in return for their money. Would I personally pay $200-$300 more for that technology? Probably not. It would depend on the scenario I was in. If I were in need of a monitor, already owned an Nvidia GPU, and could more than afford such a luxury, then sure, I probably would. Not everyone will make that kind of investment.

 

My point is, the term "fanboy" has been thrown around so much that it has lost its meaning entirely. Just because someone offers a differing opinion between two brands does not make them a fanboy. Nvidia has made claims that more features are to be unlocked from the G-Sync module. I have absolutely no idea what those features might be, but they are claiming that the module itself still has some worth to it. Call it a company trying to protect itself after a competitor made a cheaper solution, or take their word for it. Either way, it does not matter. If you yourself do not want to pay $200 for that technology, you do not have to. Plenty of non-G-Sync monitors exist, as well as FreeSync monitors.


 

Well, the Acer XB270HU is the only IPS monitor on the market that can do variable refresh rate up to 144 Hz.


I do not understand how this has anything to do with what I had said in the post you quoted. Can you explain what you meant by it?


There is no FreeSync monitor available that can do that.


Oh. Correct. ASUS did announce the MG279Q, which was advertised as a 144 Hz 1440p IPS monitor, but its variable refresh rate window is 35-90 Hz.

 

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466

 

It launched at the same price as the XB270HU. 

 

A review on it can be found here: http://www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review

 

 

As with most of our reviews of the new variable-refresh-rate monitors, the ASUS MG279Q requires some discussion of the gaming experience it provides. Though it clearly doesn't provide the same range as the ASUS ROG Swift PG278Q or the Acer XB270HU, with their 144 Hz peak variable refresh rate, the MG279Q from ASUS is the first FreeSync monitor that provides enough of a range, and a range in the right location, to make gaming on it not only feasible but enjoyable and recommended.

 

I spent some time with the brand new AMD Fury X card and the MG279Q and played through a host of games over the course of a few hours to see for myself how this display handled it all. The variable refresh rate range of 35 Hz to 90 Hz when in FreeSync mode provides ample room for your graphics card's render rate to find a comfortable location. Any time you are gaming at 35 FPS up to 90 FPS you'll be in that middle zone, where frames are presented without horizontal tearing and without the judder normally associated with V-sync enabled setups.


 

As I've said, there is only one such monitor on the market as we speak. Also, why should I pay more than I did for my Acer XB270HU?

 

I'm sure stores somewhere in the US also sell returned/refurbished Acer XB270HUs, because the quality control is so shit that everyone has to return at least 3 monitors.


Supposedly Crossover is going to update the firmware on the 494k and 434k to support freesync too.


 

 

And that one looks even cheaper.

 

This monitor nerd over on the hard forums ordered one and intends to review it. But he does not have an AMD card to test FreeSync.

 

http://hardforum.com/showthread.php?t=1869702

