AnonymousReindeer

Member
  • Posts: 8
  • Joined
  • Last visited

AnonymousReindeer's Achievements

  1. Hello, I'm currently looking for a new display for gaming and photography and am currently favouring the LG 27GL850-B. However, I noticed it has a wider gamut, like many or most of the newer screens out there. While I'd say this is a huge advantage for photography, at least for prints (please correct me if I'm wrong), most content out there is still only available in sRGB, which is also the standard for anything on the web. I went down the rabbit hole and even watched Taran's video on colour calibration (https://www.youtube.com/watch?v=QPubgoqtfbk). I also learned that, once activated, even applications like Firefox can do colour management. I own a Spyder 4 colorimeter, so I can create a colour profile of the screen and add it as an ICC profile to Windows/Linux.

Is my understanding correct that once I create such a colour profile for a wide gamut display, and the application supports colour management (the OS itself, Firefox, Lightroom, ...), the colours of sRGB content will be transformed to the expected colours? So for sRGB content the application wouldn't show me colours outside the sRGB space, even though my display is capable of producing them?

Assuming I'm correct so far, there's a second topic that grinds my gears a bit: gaming. From what I found, games may or may not perform colour management. Some even come with their own colour profiles. I assume it's simply an easy way for them to change gamma, brightness, contrast and so on. However, this also means that games on a wide gamut display will inevitably look oversaturated. Inspired by Taran's approach, I created a garbage colour profile for my current sRGB display (I calibrated it with the green channel set to 0) that puts an easily visible orange tint over everything when applied on top of my otherwise default settings. I then started a few games from my Steam library:

  • CS:GO: provides its own colour profile in fullscreen mode, but uses the OS colours in windowed fullscreen
  • Arma 3: provides its own colour profile in fullscreen mode, but uses the OS colours in windowed fullscreen
  • Portal 2: provides its own colour profile in fullscreen mode, but uses the OS colours in windowed fullscreen
  • Lego Ninjago Movie: provides its own colour profile in fullscreen mode; no windowed fullscreen mode
  • Neon Drive: uses the OS's colour profile
  • Rocket League: uses the OS's colour profile

This certainly is a small sample size, but can I roughly expect most of my games to look like garbage if there's no windowed fullscreen mode? I know that many wide gamut displays come with an emulated sRGB mode, but that usually locks many of the display's settings, and changing the OSD all the time is also pretty annoying. Is there a nicer method I've overlooked to force applications without colour management to be interpreted as sRGB and transformed correctly, while applications that do manage colour stop assuming my display is sRGB and transform for my display instead? I feel like with many to most newer displays coming out with wider gamuts this will become an increasing issue, even though many users feel that "the more vibrant colours are great" - they aren't exactly more vibrant, they are most likely just wrong. Colour spaces will probably get even bigger in the future, making these current games look even more off.
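In case it helps to make "the application supports colour management" concrete, here's a minimal sketch of the transform such an application performs, using Python with Pillow's ImageCms bindings to LittleCMS. The profile and image filenames are placeholders; the ICC file stands in for whatever your Spyder 4 measurement produces.

```
from PIL import Image, ImageCms

# Placeholder filenames - substitute your own measured display profile.
display_profile = ImageCms.getOpenProfile("27GL850_spyder4.icc")
srgb_profile = ImageCms.createProfile("sRGB")

img = Image.open("srgb_content.png").convert("RGB")

# Build and apply an sRGB -> display-profile transform. This is roughly
# what a colour-managed application (OS compositor, Firefox, Lightroom)
# does before handing pixels to a wide gamut screen, so sRGB content is
# not stretched across the panel's full native gamut.
transform = ImageCms.buildTransform(srgb_profile, display_profile, "RGB", "RGB")
converted = ImageCms.applyTransform(img, transform)
converted.save("srgb_content_mapped_to_display.png")
```

Applications without colour management skip exactly this step, which is why they end up looking oversaturated on a wide gamut panel.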
  2. As someone who uses both a 4K and a 1080p display regularly: I don't know how close you're sitting to your screen, but sitting a bit more than an arm's length (80-90cm) away I find little benefit in 4K at 27". What's more, on the 4K screen I increased the font size to about 130% for comfortable reading, which, the last time I checked, roughly translates to a 1440p screen of the same size. So personally I'm going from 1080p to 1440p, but I hardly see a reason to consider a 4K screen: it's just much harder to drive, requires font scaling, and high framerates get expensive. 1440p, on the other hand, gives you a sharper image and more screen real estate while being only a bit harder to drive.
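For anyone who wants the numbers behind that, here's a quick back-of-the-envelope comparison in Python, assuming 27" 16:9 panels (the 130% scaling figure is just a personal setting, not a recommendation):

```
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'{name} @ 27": {ppi(w, h, 27):.0f} PPI')

# With ~130% font scaling, 4K behaves like this many "effective" pixels:
scale = 1.3
print(f"4K at 130% scaling ~ {3840 / scale:.0f} x {2160 / scale:.0f} effective")
```

That works out to roughly 82, 109 and 163 PPI respectively, and about 2954 x 1662 effective pixels for the scaled 4K screen, i.e. not far from native 1440p.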
  3. Thank you very much for the info, I missed that part in the pinned post. I also overlooked the FI27Q-P; I'll put that on my list instead, and this also pretty much eliminates the ViewSonic one. Yes, unfortunately there's not a lot of info on the display or even the panel used. The only review with a bit more detail I could find is this one (it's in German, though): https://www.pcgameshardware.de/Monitor-Display-Hardware-154105/Tests/HP-X27i-WQHD-IPS-Gaming-Monitor-1347835/ It also only has DisplayPort 1.2, but I couldn't find any info on whether it offers 10bpc or just 8bpc.
  4. Hello everyone, I'm looking for a new monitor (technically two - dual monitor) for gaming as well as (some) photo editing, plus the usual programming/office/web stuff - but any display can do that, I think. I know that the first two requirements oppose each other a bit, but from my initial research it looks like the situation is better than it was some years ago. So basically I'm looking for 27", 1440p with 144Hz, FreeSync, a flicker-free backlight and decent colour accuracy. That puts me in VA or IPS territory, but no TN. My initial list boils down to:

  • Acer Nitro XV2 XV272UPbmiiprzx
  • BenQ EX2780Q
  • Gigabyte Aorus FI27Q
  • Gigabyte Aorus FI27Q-P
  • AOC Agon AG273QCX
  • AOC Agon AG273QX
  • HP X27i
  • LG UltraGear 27GL850-B
  • Samsung C27HG70
  • ViewSonic VX2758-2KP-mhd

Also:

  • Eizo Foris FS2735 (a tad too expensive, and already quite old)
  • Samsung Odyssey G7 (couldn't find any reviews yet)

Did I overlook anything noteworthy? Or something that's going to be released soon-ish? Or are there any on there I should definitely not get?

Regarding the question in the title: I've noticed that many of the displays offer 10 bit colour with FRC, but only have HDMI 2.0 and DP 1.2 (like the Aorus FI27Q and the ViewSonic). I couldn't find any official information, but it looks like one can't even do 1440p with 10 bit colour at 144Hz (with 4:4:4)? Does anyone have a source/calculator for what the limit is with these things in mind? At least that'd allow me to narrow down my list further. Thanks in advance!
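For reference, here is a rough back-of-the-envelope check, assuming about 10% blanking overhead (in the ballpark of CVT-RB timings) and DP 1.2 providing roughly 17.28 Gbit/s of usable payload (4 HBR2 lanes at 5.4 Gbit/s minus 8b/10b encoding overhead):

```
# Very rough DisplayPort bandwidth check. Assumptions: 4:4:4 RGB, about 10%
# blanking overhead (roughly CVT-RB territory), and DP 1.2 offering
# ~17.28 Gbit/s of payload: 4 lanes x 5.4 Gbit/s x 0.8 (8b/10b encoding).
DP12_PAYLOAD_GBPS = 17.28

def required_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.10):
    bits_per_pixel = 3 * bits_per_channel            # RGB, no chroma subsampling
    pixel_clock_hz = width * height * refresh_hz * blanking
    return pixel_clock_hz * bits_per_pixel / 1e9

for bpc in (8, 10):
    need = required_gbps(2560, 1440, 144, bpc)
    verdict = "fits" if need <= DP12_PAYLOAD_GBPS else "does NOT fit"
    print(f"1440p @ 144Hz, {bpc} bpc: ~{need:.1f} Gbit/s -> {verdict} in DP 1.2")
```

By that estimate, 8 bpc at 144Hz needs around 14 Gbit/s and fits, while 10 bpc needs roughly 17.5 Gbit/s and just misses, which would explain why DP 1.2 monitors typically limit 10 bit output to 120Hz or rely on 8 bit + FRC at 144Hz.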
  5. Yes, I understand that. Sorry, maybe I didn't ask that quite right: why is there virtually no performance difference (in any benchmark) between the 3700x and the 3900x in quad- or octa-core workloads, even though the 3900x's CCX layout always adds latency there*?

*A quad-core workload needs 2 CCXs on the 3900x compared to 1 CCX on the 3700x; an octa-core workload requires interaction between 3 CCXs on the 3900x compared to 2 CCXs on the 3700x.
  6. Thank you for your reply. Yes, that pretty much aligns with what I was thinking. Historically I also don't upgrade very often. However, I dug a bit further into the matter, and it looks like the 3900x has only 3 active cores on each CCX, and only cores on the same CCX have low inter-core latency (view chart here). Therefore, quad-core and octa-core workloads should run better on a 3700x, which has 2 full CCXs on a single CCD (see chart here). However, this isn't really reflected in benchmarks or discussed anywhere. Why's that? Is it just the higher clock frequency? Bad scheduling? The lack of proper quad- or octa-core workloads?
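One way to take the scheduler out of the equation and test the CCX hypothesis directly would be to pin the same multithreaded benchmark once inside a single CCX and once across CCXs, then compare the results. A minimal Linux-only sketch; the core numbers assume a 3700x-style layout and the benchmark binary is a placeholder, so check your actual topology with `lscpu -e` or `lstopo` first:

```
import os
import subprocess

# Assumed core layout for a 3700x (2 CCXs of 4 cores; SMT siblings ignored).
SAME_CCX  = {0, 1, 2, 3}   # all four threads inside one CCX
CROSS_CCX = {0, 1, 4, 5}   # four threads straddling the CCX boundary

def run_pinned(cpus, cmd):
    """Run cmd with its CPU affinity restricted to the given cores (Linux only)."""
    subprocess.run(cmd, check=True,
                   preexec_fn=lambda: os.sched_setaffinity(0, cpus))

benchmark = ["./my_quadcore_benchmark"]   # placeholder workload
run_pinned(SAME_CCX, benchmark)
run_pinned(CROSS_CCX, benchmark)
```

If the two runs score roughly the same, the cross-CCX latency simply doesn't matter for that workload; if they differ noticeably, it points more towards a scheduling question.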
  7. This is probably usually asked the other way round, but I'd like to know what the disadvantages of the 3900x are over the 3700x, as I'm trying to decide between the two. Obviously the 3900x with its 12 cores has a higher absolute price (same price per core around here) and higher power consumption under load. From what I've found in benchmarks, both of them are nearly identical in gaming, with the 3700x more often than not being something like 1 FPS ahead of the 3900x. Furthermore, some badly programmed games (maybe also applications) struggle with the additional cores or won't even start (like Dirt Rally or Dirt 4). I don't know how many there are, and it mainly indicates terrible programming in the first place. In multicore productivity applications the 3900x is clearly superior; however, it looks like performance doesn't scale quite linearly with the number of cores (e.g. x264 encoding), resulting in diminishing returns. So one could say that at the moment the 3900x provides no benefit for gaming and worse value for productivity work. This might change drastically over the next few years, though? Are there any more disadvantages of the 3900x that I'm overlooking?

Some more background information that might help argue this for my specific case: I'm planning on pairing either of them with my Vega 56, something like an ASUS Prime X570-Pro, 32GB RAM and a Noctua NH-D15. I'm using it for gaming (usually not the latest games, though I'm looking forward to e.g. Doom Eternal or Cyberpunk 2077; currently on 1080p but planning on WQHD), photo editing (Lightroom), some video encoding (Handbrake, Shotcut), running VMs locally from time to time (hopefully tinkering with VFIO stuff) and maybe some streaming in the future. Programming and therefore compiling should also be on the list. Thank you all