BiG StroOnZ

Acer Unveils Predator CG437K-P monitor: 43" VA, 4K, 144 Hz, Adaptive Sync, 1000 nits

Recommended Posts

On 4/13/2019 at 7:20 AM, Arika S said:


DO NOT use this as a PC monitor at full brightness, this should be treated as a TV and sit faaaaar away from it

I beg to differ. I have the PG27UQ, which also hits 1000 nits, and it sits right on my desk, not that far from me. It's really not that bad once you get used to it.

On 4/13/2019 at 4:45 AM, Captain Chaos said:

No way you're going to push that thing with a single 2080Ti.  Looks like SLI will make a comeback.

It's kind of a joke that people expect it to hit 144 Hz, when I wouldn't even try for that if I had a card capable of it. To get over 120 Hz on current DisplayPort bandwidth you have to use chroma subsampling, which kind of defeats the point of having a nice 4K monitor. If you want HDR10, or just 10-bit color over DisplayPort, the most you can do is 98 Hz. If you stick with 8-bit color you can do 120 Hz, which is a fair middle ground.
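Those refresh-rate limits line up with a back-of-envelope bandwidth check. Below is a minimal sketch, not an authoritative timing calculation: it assumes CVT-R2-style reduced blanking (the 80-pixel horizontal and 62-line vertical blanking figures are approximations) and DP 1.4's roughly 25.92 Gbit/s effective payload after 8b/10b line coding.

```python
HBR3_GBPS = 25.92  # approx. effective DP 1.4 (HBR3 x4) payload, Gbit/s

def required_gbps(width, height, hz, bits_per_channel):
    """Uncompressed RGB bandwidth needed, including assumed blanking."""
    h_total = width + 80        # assumed CVT-R2-like horizontal blanking
    v_total = height + 62      # assumed vertical blanking, approximate
    bpp = bits_per_channel * 3  # three channels, no subsampling
    return h_total * v_total * hz * bpp / 1e9

for bpc, hz in [(8, 120), (10, 98), (10, 120)]:
    need = required_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= HBR3_GBPS else "needs subsampling/DSC"
    print(f"4K {hz} Hz {bpc}-bit: {need:.1f} Gbit/s -> {verdict}")
```

With these assumptions, 4K at 120 Hz with 8 bpc and at 98 Hz with 10 bpc both squeeze under 25.92 Gbit/s, while 120 Hz at 10 bpc does not, which matches the figures in the post.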

On 4/13/2019 at 11:12 AM, BuckGup said:

I'd say VA at 144 Hz is preferable over IPS, as I have yet to see a high-res, high-refresh-rate monitor without terrible backlight bleed or color shift. The only exception was curved monitors, since the viewing angle differs from a flat monitor and the bleed isn't as noticeable.

I haven't noticed either of those on my PG27UQ. I'll take IPS over VA any day of the week.

On 4/13/2019 at 11:15 AM, mynameisjuan said:

HDR 400 is fine. I don't get why they are pushing 1000 nits. Does no one realize how bright that is when you're that close?

It looks amazing. HDR with 1000 nits peak brightness is honestly insane, and it makes anything else look kind of bad in comparison. I'm not sure about this monitor, but I know mine lets you set separate brightness levels for SDR content and HDR content.

1 minute ago, Brooksie359 said:

It looks amazing. HDR with 1000 nits peak brightness is honestly insane, and it makes anything else look kind of bad in comparison. I'm not sure about this monitor, but I know mine lets you set separate brightness levels for SDR content and HDR content.

I know it does. I used HDR all the time with my OLED. But my point is that on a monitor it's wicked bright. My LG UK650 hits around 500-600 nits and I have to turn the brightness down. A monitor doesn't compare to TVs or phones, which only take up a portion of our FOV.

1 minute ago, mynameisjuan said:

I know it does. I used HDR all the time with my OLED. But my point is that on a monitor it's wicked bright. My LG UK650 hits around 500-600 nits and I have to turn the brightness down. A monitor doesn't compare to TVs or phones, which only take up a portion of our FOV.

And yet a smartphone at over 1,000 nits is reasonable when you're not staring at a white box. All these peak brightness ratings are just that: peak. What really matters is the Average Picture Level (APL).


Delidded 3770k 4.4GHz | Sapphire Nitro+ Special Edition RX 580 1550MHz/2250MHz  | #2 FireStrike Extreme & #2 Superposition 1080p Xtreme | 32GB DDR3 1600MHz

1 minute ago, S w a t s o n said:

And yet a smartphone at over 1,000 nits is reasonable when you're not staring at a white box. All these peak brightness ratings are just that: peak. What really matters is the Average Picture Level (APL).

And that's exactly why I said FOV. Sure, on a 65" TV or an S10 that starry-night moon might be only a tiny portion of the picture, and its peak brightness is fine. But cram that picture onto a monitor shoved in your face, and the moon is now much bigger and the brightness hits much harder.


VA... meh. Also too big. Nothing bigger than 32" will ever go on my desk. I don't live in a bowling alley.


CPU: 9900k @ 5.35ghz Motherboard: Z390 Aorus Xtreme GPU: EVGA 2080ti FTW3 + HYDROCOPPER @ 2190Mhz RAM: 16GB Vengeance RGB 3600CL17 PSU: 850P2

COOLING: Bent Glass Loop CASE: 900D

3 hours ago, mynameisjuan said:

And that's exactly why I said FOV. Sure, on a 65" TV or an S10 that starry-night moon might be only a tiny portion of the picture, and its peak brightness is fine. But cram that picture onto a monitor shoved in your face, and the moon is now much bigger and the brightness hits much harder.

The moon will be smaller on the phone display, but the phone is also much closer to your face than the monitor, so it could take up just as much of your FOV.
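That trade-off is easy to check with the angular-size formula θ = 2·atan(w/2d). Here is a rough sketch in Python; the screen widths and viewing distances (a 13.5 cm-wide phone at 30 cm, a 95 cm-wide 43" monitor at 75 cm) are assumed, illustrative values, not measurements.

```python
import math

def angular_width_deg(physical_width_cm, distance_cm):
    # Angle subtended by a flat feature: theta = 2 * atan(w / (2d))
    return math.degrees(2 * math.atan(physical_width_cm / (2 * distance_cm)))

# A bright highlight covering 10% of each screen's width
phone = angular_width_deg(0.10 * 13.5, 30)    # assumed phone: 13.5 cm wide, 30 cm away
monitor = angular_width_deg(0.10 * 95.0, 75)  # assumed 43" monitor: 95 cm wide, 75 cm away
print(f"phone: {phone:.1f} deg, monitor: {monitor:.1f} deg")
# -> phone: 2.6 deg, monitor: 7.2 deg
```

With these particular numbers the monitor's highlight subtends nearly three times the angle of the phone's, so the same relative feature fills far more of your field of view, which is the crux of the disagreement above.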

 



3 hours ago, mynameisjuan said:

I know it does. I used HDR all the time with my OLED. But my point is that on a monitor it's wicked bright. My LG UK650 hits around 500-600 nits and I have to turn the brightness down. A monitor doesn't compare to TVs or phones, which only take up a portion of our FOV.

1000 nits doesn't bother me tbh. 

On 4/13/2019 at 11:19 PM, RejZoR said:

I have the old ASUS VG248QE and it's torching my retina with only 350 nits at full brightness... How can these HDR screens even be used without going blind?

It's not 1000 nits all the time. My TV is a 1000-nit-rated unit and I play HDR games all the time, along with HDR media. It only hits full brightness in scenes of extreme brightness: a sky, an explosion, etc. If you think about it, that's not very common, and half the point of HDR isn't the full brightness anyway; it's the contrast between dark and bright, as well as the vibrancy of the panel. I would take HDR over 4K any day, and I think gamers in particular would benefit from it.


My Current Build: PCPartPicker URL My PartPicker Build
 
CPU: AMD - Ryzen 5 1600X @4.0Ghz| Cooling: be quiet! - Dark Rock Pro 4 50.5 CFM | Motherboard: Asus - STRIX X370-F GAMING | RAM: G.SKILL Trident Z RGB 2x8Gb DDR4 @3000MHz | GPU: Gigabyte - GeForce GTX 1080 8GB WINDFORCE OC | Storage Samsung - 860 EVO 250GB M.2-2280 | PSU EVGA - B3 650W 80+ Bronze Certified Fully-Modular ATX | Case: Corsair - SPEC-OMEGA RGB ATX Mid | System Fans: Corsair - ML120 PRO RGB 47.3 CFM 120mm x 4 & Corsair - ML140 PRO RGB 55.4 CFM 140mm | Display Samsung KS9000 |Keyboard Logitech - G613 | Mouse Logitech - G703 | Operating System Windows 10 Pro

This really doesn't interest me at this price. If they want to be competitive with TVs, this needs to be priced lower. I'd be interested at around 900, but this is too much; it's just not worth the extra cost.



Kind of a shame it's not G-Sync. FreeSync monitors are usually lower quality overall, because Nvidia doesn't grant G-Sync certification unless the monitor is good across the board.

 

Also, 43" is too big for a 16:9 monitor on a desk. No one sits 36" away from their monitor, so I'd rather see 32-38".


Workstation: 8600k @ 4.6Ghz || ASRock Z390 Taichi Ultimate || Gigabyte 1080Ti || G.Skill DDR4-3800 @ 2666 4x8GB || Corsair AX1500i || 25 gallon whole-house loop.

HTPC/GuestGamingBox: Optoma HD142X 1080p Projector || 7600K@ 4.6 || Gigabyte Z270 Gaming 9  || EVGA Titan X (Maxwell) || Corsair RM650x || CPU+GPU watercooled 280 rad pull only.

Server Router (Untangle): 8350K @ 4.5Ghz || ASRock Z370 ITX || 2x8GB || EVGA G3 750W || CPU watercooled, 25 gallon whole-house loop.

Server VM/Plex/HTTPS: E5-2699v4 (22 core!) || Asus X99m WS || GT 630 || Corsair RM650x || CPU watercooled, 25 gallon whole-house loop.

Server Storage: Pent. G3220 || Z87 Gryphon mATX || || LSI 9280i + Adaptec + Intel Expander || 4x10TB Seagate Enterprise Raid 6, 3x8TB Seagate Archive Backup, Corsair AX1200i (drives) Corsair RM450 (machine) || CPU watercooled, 25 gallon whole-house loop.

On the Shelf: EVGA X99 micro2, 780, 740 GT, 210 w/ DVI port unsoldered (Hint: it can be done but it ain't easy). 

Laptop: HP Elitebook 840 G3 (Intel 8350U).


Why are IPS panels so much more popular than VA panels? I've had the BenQ BL3200PT (32" 1440p VA, 60 Hz) since 2015 and it's great, especially for movies (those blacks). My only experience is with TN and VA panels, though, and of course the TN panel is plain awful for anything other than gaming/office work.

Does anyone make a 32'' 1440p 144Hz VA panel? That would be sweet...

