
The Acer Predator X27 has almost every feature you might ever want in a monitor

EunSoo
Just now, M.Yurizaki said:

So if it had every high end spec in the book, but it looked like a giant dildo, you'd be happy rocking that on your desk?

Within reason obviously. I just mean most people don't care that much about the stand or logo design if it performs well. 


1 minute ago, Soonercoop21 said:

Within reason obviously. I just mean most people don't care that much about the stand or logo design if it performs well. 

On the flip side, though, if the company is spending resources on designs and flash that A. people don't care about or B. can do more harm than good (like heat sinks that perform poorly because they were designed for the sake of flashiness), they're wasting money that could be better spent on bringing high performance down to the everyman.


Just now, M.Yurizaki said:

On the flip side, though, if the company is spending resources on designs and flash that A. people don't care about or B. can do more harm than good (like heat sinks that perform poorly because they were designed for the sake of flashiness), they're wasting money that could be better spent on bringing high performance down to the everyman.

Completely true, but nonetheless most of the PC gamers I know, even those who are 18-30 years old, see the color red or flashing lights and think that means good performance.


1 minute ago, Soonercoop21 said:

Completely true, but nonetheless most of the PC gamers I know, even those who are 18-30 years old, see the color red or flashing lights and think that means good performance.

And this is where I have a problem with many PC gamers. They're not really PC enthusiasts even though they think they are.


2 hours ago, M.Yurizaki said:

But it's what sells.

 

The irony of "gaming" hardware should really be function over form, but when you spend $2000 on a computer, people seem to think it needs form over function.

 

What's worse, those monitors claim to be HDR capable, but the only two HDR standards require Rec.2020, which is an even wider color gamut.

Yep, was gonna comment. 

 

Here are the standards for the current HDR10 (10-bit HDR):

HDR10 mastering supports up to 4,000 nits peak brightness, with a current 1,000-nit peak brightness target
As close to 100% coverage of the DCI-P3 color space as possible. Usually these sets are marked as Wide Color Gamut, Quantum Dot, Tri-luminance, etc.

For Dolby Vision (12-bit HDR):

Dolby Vision mastering supports up to 10,000 nits peak brightness, with a current 4,000-nit peak brightness target
Dolby Vision mastering supports up to the BT.2020 color space, while HDR10 is mastered for DCI-P3. (Finding a 12-bit set that can actually display the BT.2020 color space... well, frankly, it pretty much doesn't exist for consumers at anything reasonable.)
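Summed up as data, here's a minimal restatement of the figures above in Python (just the numbers from this post; the underlying specs have more nuance than a table like this):

```python
# Mastering targets as described above; not an exhaustive or official spec table.
hdr_formats = {
    "HDR10": {
        "bit_depth": 10,
        "max_mastering_nits": 4000,
        "current_target_nits": 1000,
        "gamut_target": "as close to 100% DCI-P3 as possible",
    },
    "Dolby Vision": {
        "bit_depth": 12,
        "max_mastering_nits": 10000,
        "current_target_nits": 4000,
        "gamut_target": "up to BT.2020",
    },
}

for name, spec in hdr_formats.items():
    print(f"{name}: {spec['bit_depth']}-bit, mastered up to {spec['max_mastering_nits']} nits "
          f"(current target {spec['current_target_nits']}), gamut: {spec['gamut_target']}")
```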

Data Scientist - MSc in Advanced CS, B.Eng in Computer Engineering


3 hours ago, Suika said:

I need to know pricing, because I don't know if I need to sell one kidney or both.

The PG27UQ, which has very similar specs to this Acer panel, had its price leaked at $2k. This panel will probably be in the same price range.

 

2 hours ago, M.Yurizaki said:

But it's what sells.

 

The irony of "gaming" hardware should really be function over form, but when you spend $2000 on a computer, people seem to think it needs form over function.

 

What's worse, those monitors claim to be HDR capable, but the only two HDR standards require Rec.2020, which is an even wider color gamut.

There are actually six HDR standards now.

 

HDR10

Dolby Vision

BBC/NHK HLG

Technicolor/Philips HDR

HDR10+ <-- this is recent, formed by Samsung

Active HDR <-- by LG

 

30 minutes ago, randomhkkid said:

Yep, was gonna comment. 

 

Here are the standards for the current HDR10 (10-bit HDR):

HDR10 mastering supports up to 4,000 nits peak brightness, with a current 1,000-nit peak brightness target
As close to 100% coverage of the DCI-P3 color space as possible. Usually these sets are marked as Wide Color Gamut, Quantum Dot, Tri-luminance, etc.

For Dolby Vision (12-bit HDR):

Dolby Vision mastering supports up to 10,000 nits peak brightness, with a current 4,000-nit peak brightness target
Dolby Vision mastering supports up to the BT.2020 color space, while HDR10 is mastered for DCI-P3. (Finding a 12-bit set that can actually display the BT.2020 color space... well, frankly, it pretty much doesn't exist for consumers at anything reasonable.)

To be fair, the peak brightness is only specified for highlights, such as glare or sunlight. So really this monitor only needs to hit at least 1,000 nits in a 20% window or less; the overall peak can be as low as 300 nits, though that may be detrimental to an optimal HDR experience.
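As a toy illustration of that point, here's a tiny check using only the figures mentioned above; real HDR certification programs define their own test windows and thresholds, so treat these numbers as thread shorthand rather than spec:

```python
# Toy check: a display can deliver convincing HDR highlights if it hits roughly
# 1,000 nits in a small (<= 20%) window, even if its full-screen peak is ~300 nits.
# Thresholds here are just the figures from this thread, not an official spec.
def hdr_highlight_check(window_peak_nits: float, full_screen_nits: float) -> bool:
    highlight_ok = window_peak_nits >= 1000   # small-window highlight target
    sustained_ok = full_screen_nits >= 300    # rough full-screen floor
    return highlight_ok and sustained_ok

print(hdr_highlight_check(1000, 350))  # True: highlights hit the target
print(hdr_highlight_check(600, 350))   # False: highlights get tone-mapped down
```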

 

Dolby Vision is what I call the 'premium HDR'. It's a bit like NVIDIA G-Sync, actually, in that the display requires a proprietary module to decode and process the special HDR signal. This tech may not matter too much for game developers, though, as the majority go with HDR10. I'm surprised NVIDIA didn't take up DV, but DV is pretty much limited to TVs.


QLED HDR beast of a monitor o-o!

 

Pretty similar to the Asus one announced earlier, but this one looks better!

Groomlake Authority


2 minutes ago, Technicolors said:

There are actually six HDR standards now.

 

HDR10

Dolby Vision

BBC/NHK HLG

Technicolor/Philips HDR

HDR10+ <-- this is recent, formed by Samsung

Active HDR <-- by LG

Goddammit. Everyone should stop polluting the ducking market with this crap.


6 minutes ago, M.Yurizaki said:

Goddammit. Everyone should stop polluting the ducking market with this crap.

And the list is steadily growing. It sucks, really, as it's hurting HDR as a whole and making it look even more like a gimmick, which it really isn't if you look into it.


43 minutes ago, randomhkkid said:

Yep, was gonna comment. 

 

Here are the standards for the current HDR10 (10-bit HDR):

HDR10 mastering supports up to 4,000 nits peak brightness, with a current 1,000-nit peak brightness target
As close to 100% coverage of the DCI-P3 color space as possible. Usually these sets are marked as Wide Color Gamut, Quantum Dot, Tri-luminance, etc.

For Dolby Vision (12-bit HDR):

Dolby Vision mastering supports up to 10,000 nits peak brightness, with a current 4,000-nit peak brightness target
Dolby Vision mastering supports up to the BT.2020 color space, while HDR10 is mastered for DCI-P3. (Finding a 12-bit set that can actually display the BT.2020 color space... well, frankly, it pretty much doesn't exist for consumers at anything reasonable.)

 

Mostly right, but I'm pretty sure HDR10 uses REC2020 as its colour space too.

 

Do remember that NOTHING on the market anywhere supports full REC2020. It's called 2020 because the industry wants to achieve a panel capable of the full colour gamut by 2020. I doubt they will succeed, though.
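For a rough sense of the gap, here's a small sketch comparing gamut triangle areas in CIE xy using the standard primary chromaticities; xy area is a crude metric that ignores perceptual uniformity, but it shows how much ground Rec.2020 covers compared to DCI-P3 and Rec.709:

```python
# Rough comparison of colour gamut sizes via the area of each gamut's triangle
# in the CIE 1931 xy chromaticity diagram (standard primary coordinates).
def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

reference = triangle_area(gamuts["Rec.2020"])
for name, primaries in gamuts.items():
    area = triangle_area(primaries)
    print(f"{name}: xy area {area:.4f} ({100 * area / reference:.0f}% of Rec.2020)")
```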

 

Also, wtf Dolby Vision? 10,000 nits? You want to give people permanent eye damage? HDR is nice, but extreme brightness is just stupid. Herpaderp, let's make a standard where movies can do the same as making you look straight into the sun.

 


Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 minute ago, Notional said:

Do remember that NOTHING on the market anywhere supports full REC2020. It's called 2020 because the industry wants to achieve a panel capable of the full colour gamut by 2020. I doubt they will succeed, though.

There's a Rec.2100; does that mean they want everyone to move to that by 2100?


4 hours ago, SCGazelle said:

 

This monitor is literally insane. 

... It is literally insane? 

 

I suspect you have used the non-literal meaning of both "literal" and "insane"!


As awesome as this is, I'd rather it be ultrawide 1440, and maybe OLED (if we're dreaming here).

But my biggest issue is how damn fugly this thing is. Why can't people make a badass monitor that just looks good?

Laptop: Asus GA502DU

RAM: 16GB DDR4 | CPU: Ryzen 3750H | GPU: GTX 1660ti


5 minutes ago, Notional said:

Mostly right, but I'm pretty sure HDR10 uses REC2020 as its colour space too.

 

Do remember that NOTHING on the market anywhere supports full REC2020. It's called 2020 because the industry wants to achieve a panel capable of the full colour gamut by 2020. I doubt they will succeed, though.

 

Also, wtf Dolby Vision? 10,000 nits? You want to give people permanent eye damage? HDR is nice, but extreme brightness is just stupid. Herpaderp, let's make a standard where movies can do the same as making you look straight into the sun.

 

 

It works quite well for the PQ EOTF, the ideal HDR curve.

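For anyone curious what that curve actually is, here's a minimal sketch of the ST 2084 PQ EOTF in Python, using only the published constants; nothing here is specific to this monitor or any particular implementation:

```python
# SMPTE ST 2084 "PQ" EOTF: maps a normalized 10/12-bit code value in [0, 1]
# to absolute luminance in cd/m^2 (nits), with a 10,000-nit ceiling.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code_value: float) -> float:
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    e = code_value ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# Most of the code range is spent on darker levels; the 10,000-nit ceiling
# only appears at the very top of the signal.
for v in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ {v:.2f} -> {pq_eotf(v):8.1f} nits")
```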

 

3 minutes ago, M.Yurizaki said:

There's a Rec.2100; does that mean they want everyone to move to that by 2100?

Funnily enough, it also mentions high frame rates.


1 minute ago, M.Yurizaki said:

There's a Rec.2100; does that mean they want everyone to move to that by 2100?

No, because frack logic. Speaking of logic, why do these standards even support the old, obsolete, and utterly retarded 23.976p, 29.97p and 59.94p (and even a NEW 119.88p)? Just because the 'muricans were utterly incompetent when making the NTSC colour standards. So dumb. Let it die in a horrible fire.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


4 hours ago, Misanthrope said:

AMD has nothing that can push 144 Hz at 4K anyway. Hopefully two or three top Vega cards will get you somewhat close, but alas, where the fuck is Vega?

They could make a FreeSync option. Just ditch the G-Sync module and get it certified. Would be cool to see. Maybe with Vega we'll see them enter the 4K realm (1080-ish performance).

Bleigh!  Ever hear of AC series? 


7 minutes ago, Notional said:

No, because frack logic. Speaking of logic, why do these standards even support the old, obsolete, and utterly retarded 23.976p, 29.97p and 59.94p (and even a NEW 119.88p)? Just because the 'muricans were utterly incompetent when making the NTSC colour standards. So dumb. Let it die in a horrible fire.

It seems dumb to you now, but back then it was what you'd call a clever engineering solution:

 

Quote

Here is why: Originally, the vertical scanning rate was 30 hertz (interlaced, so it was 60 fields per second, 30 frames per second) and the horizontal scanning rate was 15750 hertz. Notice that 15750 is divisible by 30. When the black and white video signal modulates a carrier wave, we get sidebands every 15750 hertz out away from the carrier. Most of the information is close to the carrier, but there is still some information that we cannot tolerate losing that is several MHz away from the carrier.

The scheme with NTSC color was to put a subcarrier for color several MHz out away from the main carrier. They picked a subcarrier frequency of 3.579545 MHz. Then they moved the horizontal scanning frequency to 15734 hertz and the vertical scanning frequency to 59.94 hertz (frame rate to 29.97 hertz). Why? All these frequencies are phase related. Black and white information still falls in sidebands out away from the carrier as it always did. Color information is found in sidebands on either side of the subcarrier, every 15734 hertz.

Here is the genius of the whole thing: the color sidebands fall in between the black and white sidebands. It's like having a bucket of large rocks and saying it is full, then pouring sand in and letting it settle in between the rocks. More stuff in the same bucket.

A comb filter is used to separate the color information from the black and white. A fairly simple scheme was used to do this. Each line of video information is, on average, very similar to the previous one. Also, the phase of the color information changes by 180 degrees each line. Knowing this, they split the video signal into two paths, delay one by 1/15734 of a second (one scan line) and add the two. The color information cancels itself out because of the 180-degree phase shift and the black and white signal is left. To get the color signal without the black and white, the same thing is done except one of the signal paths has an amplifier with a gain of -1 in it. Most if not all black and white TV sets didn't know the difference; the scanning frequencies are close enough to the originals that they worked fine.

 

Reference https://www.physicsforums.com/threads/the-29-97-frames-per-ntsc-standard.334830/

I mean, maybe it's dumb to keep it now, but don't knock it just because it's a number you don't like.
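To put numbers on that, here's a quick back-of-the-envelope check in Python. It assumes the usual derivation where the new line rate was chosen as the 286th subharmonic of the 4.5 MHz sound carrier offset; that detail isn't in the quote above, but the resulting figures match it:

```python
# Back-of-the-envelope check of the NTSC color numbers quoted above.
SOUND_CARRIER_HZ = 4_500_000   # spacing between the video and audio carriers
LINES_PER_FRAME = 525

line_rate = SOUND_CARRIER_HZ / 286        # ~15,734.27 Hz (was 15,750 Hz in B&W)
subcarrier = line_rate * 455 / 2          # ~3.579545 MHz color subcarrier
frame_rate = line_rate / LINES_PER_FRAME  # ~29.97 frames per second
field_rate = 2 * frame_rate               # ~59.94 fields per second (interlaced)

print(f"line rate:  {line_rate:,.2f} Hz")
print(f"subcarrier: {subcarrier:,.2f} Hz")
print(f"frame rate: {frame_rate:.5f} fps")
print(f"field rate: {field_rate:.5f} Hz")
```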


Just now, M.Yurizaki said:

It seems dumb to you now, but back then it was what you'd call a clever engineering solution:

 

I mean, maybe it's dumb to keep it now, but don't knock it just because it's a number you don't like.

 

It's still dumb. They could have increased the line count from 525 to 625 (like PAL), and the problem would have been solved with a clean 30 Hz. This video explains things quite well:

 

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


2 hours ago, M.Yurizaki said:

And this is where I have a problem with many PC gamers. They're not really PC enthusiasts even though they think they are.

Meanwhile, some legit PC enthusiasts hate the "edgy gaming aesthetics". That's one reason I avoid MSI: its main appeal is this whole hardcore gaming marketing that I just shrug at. It really is a complicated market. One of the things I love about the 29UM68-P is its super clean looks; the Asus ROG and Acer Predator are awesome screens, but did they all need to look like something that came from an alien ship? :/

Personal Desktop":

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

I'm in the 32" camp. Then I can slap a Roku or something on it and have it be friendly enough to act as a bedroom TV when I'm not around.


24 minutes ago, Nup said:

They could make a FreeSync option. Just ditch the G-Sync module and get it certified. Would be cool to see. Maybe with Vega we'll see them enter the 4K realm (1080-ish performance).

Again, what would be the fucking point if AMD doesn't have a high-end product out right now? And even when (at this point you could almost say if) they come out with Vega and it's competitive with high-end offerings for maybe 6 months, they'll probably go at least another year without anything that can really push that high a refresh rate at that high a resolution.

 

Sorry, but most of you continue to basically ignore that AMD is almost completely out of the high-end market, which is exactly what this monitor is aimed at. It's been years since their high-end card could legitimately outperform Nvidia's, and there's always an excuse: wait for a better manufacturing process, HBM will give them the edge, no, wait for HBM2 now, wait for DX12, wait for better drivers, wait for fucking Vega that's absurdly delayed. After it's out and it's just not as fast as the 1080 Ti, people will say wait for better drivers, wait for more fringe DX12 games that nobody plays outside of benchmarks like AOTS, then Volta will probably crush them, then they'll be too busy with another Ryzen iteration or the Ryzen APUs, and so on and so on.

 

It never fucking ends. It sucks that we have to pay Nvidia a motherfucking G-Sync tax for the privilege of their stupid fucking DRM, but they get away with it because there's fuck-all from AMD in this segment.

-------

Current Rig

-------


18 minutes ago, Notional said:

It's still dumb. They could have increased the line count from 525 to 625 (like PAL), and the problem would have been solved with a clean 30 Hz. This video explains things quite well.

Well sure, hindsight is 20/20, and NTSC was built around the existing black-and-white standard while PAL was developed with color in mind. And rather than make a new standard that would screw over everyone who had a black-and-white TV or was going to get one (color TV doesn't appear to have been popular until the late 60s, probably due to cost), they did what they did with NTSC.

 

But I agree, keeping it in a digital standard is silly. To which I ask you Europeans: why are you still using 50 FPS?


Just now, M.Yurizaki said:

Well sure, hindsight is 20/20, and NTSC was built around the existing black-and-white standard while PAL was developed with color in mind. And rather than make a new standard that would screw over everyone who had a black-and-white TV or was going to get one (color TV doesn't appear to have been popular until the late 60s, probably due to cost), they did what they did with NTSC.

 

But again, keeping it in a digital standard is silly. To which I ask you Europeans: why are you still using 50 FPS?

 

Again, they could have just increased the line count. But yeah, the second anything digital was a thing, those odd refresh rates should have died in a ditch.

 

50 fps is twice 25, so it's mostly for playback of old content, I guess. But all TVs in Europe fully support 60p, 30p and the like. I would prefer 60p to be the standard.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

