Acer hits the ball out of the park with the Z35P

I've been using a 3440x1440 IPS LG 34" UW for about a month now, and I'll never go back to a 1080p 16:9. My UW has the same vertical real estate as my old 27" WS, with about 8" more horizontal real estate. Gaming is great and much more immersive. The Z35P is way overpriced for being a VA panel; if it were IPS, the price would be more reasonable. The 1800R curve is a little tighter than my 1900R, and that's a good thing. The 34" Samsung CF791 is also a VA panel, with a 100Hz refresh rate and a 1500R curve, and it can be found on sale for $700.00 at times. The design of the Z35P just doesn't work for me.
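The "same vertical, ~8 inches more horizontal" comparison checks out if you work the physical dimensions out from the diagonal and aspect ratio. A quick sketch (the 27" WS is assumed to be a standard 1920x1080 panel):

```python
import math

def screen_dims(diagonal_in, px_w, px_h):
    """Width and height (inches) of a screen from its diagonal and aspect ratio."""
    d = math.hypot(px_w, px_h)
    return diagonal_in * px_w / d, diagonal_in * px_h / d

w_uw, h_uw = screen_dims(34, 3440, 1440)  # 34" 21:9 ultrawide
w_ws, h_ws = screen_dims(27, 1920, 1080)  # 27" 16:9 widescreen

print(f'34" UW: {w_uw:.1f} x {h_uw:.1f} in')  # ~31.4 x 13.1 in
print(f'27" WS: {w_ws:.1f} x {h_ws:.1f} in')  # ~23.5 x 13.2 in
print(f'extra width: {w_uw - w_ws:.1f} in')   # ~7.8 in, at nearly the same height
```

So the two panels are within about a tenth of an inch in height, while the ultrawide adds roughly 7.8" of width.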


5 hours ago, 007vsMagua said:

I've been using a 3440x1440 IPS LG 34" UW for about a month now, and I'll never go back to a 1080p 16:9. My UW has the same vertical real estate as my old 27" WS, with about 8" more horizontal real estate. Gaming is great and much more immersive. The Z35P is way overpriced for being a VA panel; if it were IPS, the price would be more reasonable. The 1800R curve is a little tighter than my 1900R, and that's a good thing. The 34" Samsung CF791 is also a VA panel, with a 100Hz refresh rate and a 1500R curve, and it can be found on sale for $700.00 at times. The design of the Z35P just doesn't work for me.

The Z35P is a G-Sync monitor, so $1100 MSRP vs a $700 "on-sale" similar non-G-Sync monitor is about what I'd expect, actually. Maybe slightly more, for the 120 Hz I guess, but it's not as big of a premium as it looks all things considered.


15 hours ago, LAwLz said:

I don't really see how having the extra-wide screen would help with that, and those are some really niche things compared to, for example, a Word document, which benefits more from vertical space.

Besides, two monitors (or three) is far better for when you are using multiple programs, which is something a lot of people doing "productivity" things do.

 

 

TV series, and online videos.

I am not even sure if the majority of movies are 21:9. Maybe the majority of blockbuster movies, but there are a lot of 16:9 movies out there.

 

Oh, I didn't know it was that widely supported. It feels like I hear people complain about a lack of UW support in games all the time.

I run a 34" UW as my primary, and a 27" 16:9 in portrait for my daily tasks. Most of my tasks are strategy development, planning, and some content creation.

 

Word, Adobe PP, PS, AI, ID, DW, Excel, Visio, Outlook, tabs left, right, and center, etc.


On 5/24/2017 at 1:19 PM, Glenwing said:

The Z35P is a G-Sync monitor, so $1100 MSRP vs a $700 "on-sale" similar non-G-Sync monitor is about what I'd expect, actually. Maybe slightly more, for the 120 Hz I guess, but it's not as big of a premium as it looks all things considered.

I do not understand why G-Sync commands such a high price. Is it because Nvidia is charging a high license fee? Why, just because they can? It seems manufacturers are losing out on sales and Nvidia seems greedy. Or is it because a major hardware upgrade is required on the manufacturers' end of things? Well, you know the old saying: "Make hay while the sun shines."


1 hour ago, 007vsMagua said:

I do not understand why G-Sync commands such a high price. Is it because Nvidia is charging a high license fee? Why, just because they can? It seems manufacturers are losing out on sales and Nvidia seems greedy. Or is it because a major hardware upgrade is required on the manufacturers' end of things? Well, you know the old saying: "Make hay while the sun shines."

You need to buy the display controller from NVIDIA, and they charge way more for it than a conventional display controller.

 

Part of this is because of silicon costs: the G-Sync module is an FPGA, and FPGAs are generally used for prototyping, not for production products, at least not mass-produced ones. An FPGA is a configurable circuit you use to check whether your design works before producing an ASIC for it. FPGAs are inefficient in mass production because a lot of the silicon generally goes unused. Designing and producing an ASIC specifically for your application has a lot of development cost, so you really want to make sure your basic circuit design works before you start working on an ASIC. That's what FPGAs are for: they are generic configurable circuits with tons of logic gates that can be set to any arrangement. It's basically like a Lego kit with every piece you could possibly need for any arbitrary design, so the same kit can be used to build all kinds of different designs. But when you transition to mass production, you wouldn't really want to sell that entire kit; once your design is finalized, you would go back and make a package that only includes the pieces you actually used. NVIDIA did not do this for the G-Sync module, for whatever reason. They now have a second-gen G-Sync module which supports HDMI input (not with G-Sync, just for convenience) and other things; I don't know if they still use an FPGA in that one, I haven't looked into it.

 

The G-Sync module also has 768 MB of memory used for frame buffering, for frame duplication so the G-Sync effect can be maintained even if the frame rate falls below the monitor's physical minimum refresh rate. This adds more cost to G-Sync. AMD accomplishes the same thing in drivers, so there is no added cost on the FreeSync side.
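The frame-duplication behavior described above can be sketched in a few lines. This is a hypothetical illustration of the general technique (often called low-framerate compensation), not NVIDIA's or AMD's actual implementation, and the 30-144 Hz panel range is an assumed spec:

```python
import math

def pick_refresh(fps, panel_min_hz=30, panel_max_hz=144):
    """If the content frame rate drops below the panel's physical minimum,
    show each frame an integer number of times so the panel still refreshes
    within its supported variable-refresh range."""
    if fps >= panel_min_hz:
        return fps, 1  # panel can track the frame rate directly
    repeats = math.ceil(panel_min_hz / fps)
    return fps * repeats, repeats

print(pick_refresh(48))  # (48, 1): within range, no duplication needed
print(pick_refresh(20))  # (40, 2): each frame shown twice -> 40 Hz refresh
```

Each duplicated frame has to come from somewhere, which is the point of the module's frame buffer: the monitor replays the last frame it received instead of waiting for a new one.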

 

Also, it's not like you just buy a G-Sync module from NVIDIA and that's that. NVIDIA is involved in the development process, tweaking things like ULMB and optimizing the RTC overdrive for that specific monitor. So there's further cost if you want to make a G-Sync monitor; NVIDIA's dev teams are not free. At least that's how it was with early G-Sync monitors, I don't know if that's still true. FreeSync also has a certification process from AMD which costs money, but it is optional: any monitor that follows the VESA Adaptive-Sync spec should work with FreeSync.


Well, I am going to 'kinda' find out soon, as I have a cheap LG 29" IPS 21:9 75Hz 1440p panel coming, and I'm currently using an Asus ROG Swift 27" TN 16:9 144Hz 1440p panel.

I guess if IPS and ultrawide do it for me, I will sell these and buy a 34" IPS ultrawide when the prices drop a bit.

I play a lot of Squad, and currently the minimum FPS at 1440p is 70 with my rig. Squad is an FPS but not exactly twitch-style, so I am less concerned with 144Hz screens.

CPU: Intel i7 8700K @ 5GHz | Motherboard: ROG Maximus Hero 10 | RAM: Corsair Vengeance 32GB 3600MHz

GPU: MSI Gaming X 1080 Ti | Case: Thermaltake Core P3 | Storage: SSD boot plus Samsung 960 Evo M.2 NVMe storage

PSU: Corsair RM750W Gold | Display: Asus ROG Strix XG32VQ 144Hz 1440p | Cooling: Corsair H100i V2

Keyboard: Roccat Ryos MK FX | Mouse: Roccat Kone Aimo | Audio: MK3 Fostex T50RP + Schiit Magni 3 amp and Modi 2 DAC

Operating System: Win 10

VR: HTC Vive with Audio Strap | Motion Platform: DOF Reality 2 DOF

 

 


21:9 is a fad. 

 

So are curved monitors.  

 

Wait a few years. Game devs will not move towards 21:9 in design.

I am Batman


On 5/26/2017 at 1:19 PM, Glenwing said:

You need to buy the display controller from NVIDIA, and they charge way more for it than a conventional display controller.

 

Part of this is because of silicon costs: the G-Sync module is an FPGA, and FPGAs are generally used for prototyping, not for production products, at least not mass-produced ones. An FPGA is a configurable circuit you use to check whether your design works before producing an ASIC for it. FPGAs are inefficient in mass production because a lot of the silicon generally goes unused. Designing and producing an ASIC specifically for your application has a lot of development cost, so you really want to make sure your basic circuit design works before you start working on an ASIC. That's what FPGAs are for: they are generic configurable circuits with tons of logic gates that can be set to any arrangement. It's basically like a Lego kit with every piece you could possibly need for any arbitrary design, so the same kit can be used to build all kinds of different designs. But when you transition to mass production, you wouldn't really want to sell that entire kit; once your design is finalized, you would go back and make a package that only includes the pieces you actually used. NVIDIA did not do this for the G-Sync module, for whatever reason. They now have a second-gen G-Sync module which supports HDMI input (not with G-Sync, just for convenience) and other things; I don't know if they still use an FPGA in that one, I haven't looked into it.

 

The G-Sync module also has 768 MB of memory used for frame buffering, for frame duplication so the G-Sync effect can be maintained even if the frame rate falls below the monitor's physical minimum refresh rate. This adds more cost to G-Sync. AMD accomplishes the same thing in drivers, so there is no added cost on the FreeSync side.

 

Also, it's not like you just buy a G-Sync module from NVIDIA and that's that. NVIDIA is involved in the development process, tweaking things like ULMB and optimizing the RTC overdrive for that specific monitor. So there's further cost if you want to make a G-Sync monitor; NVIDIA's dev teams are not free. At least that's how it was with early G-Sync monitors, I don't know if that's still true. FreeSync also has a certification process from AMD which costs money, but it is optional: any monitor that follows the VESA Adaptive-Sync spec should work with FreeSync.

+1 Thank you for the awesome reply :)

