Logue

Member
  • Posts

    56
  • Joined

  • Last visited

Awards

About Logue

  • Birthday Jul 25, 1993

Profile Information

  • Gender
    Male
  • Location
    Brazil
  • Occupation
    Psychoanalyst

System

  • CPU
    Ryzen 7 3800X
  • Motherboard
    ASUS Crosshair VII Hero
  • RAM
    2x16GB Corsair Vengeance RGB Pro DDR4 3600MHz
  • GPU
    Sapphire Pulse 5700XT
  • Case
    Corsair 680X Black
  • Storage
    Kingston A1000 240GB NVMe M.2 PCIe 3.0 x2 // Seagate Barracuda Pro 7200RPM 6TB // SanDisk Ultra 3D SSD 2TB
  • PSU
    Corsair RM850i 850W 80+ Gold
  • Display(s)
    LG 34GL750 34" FreeSync 144Hz
  • Cooling
    Noctua NH-D15 SE-AM4 (CPU) || 2x Noctua NF-A14 Industrial PPC (3000RPM) intake || 1x Noctua NF-F12 Industrial PPC (3000RPM) exhaust back
  • Keyboard
    Corsair K68 RGB
  • Mouse
    Logitech G502 SE Hero 16k Wired
  • Sound
    Edifier R1100 2.0 42W RMS || FiiO K5 Pro DAC/amp || Beyerdynamic Custom One Pro - HiFiMan Sundara - Sennheiser HD600
  • Operating System
    Windows 10 Pro x64

  1. Ah, I see. Since I'm just gonna use this TV for occasional movie/video watching (apart from the PC, its use will mostly be simple cable TV, which never goes beyond 720p/1080p depending on the channel, and apps like YouTube/Netflix/Prime Video etc.), then probably my best solution regarding 4K HDR is to use 4K@30Hz with RGB 4:4:4 and HDR ON, right? That's the "best" I can do right now with this setup/cable/GPU (4K@30Hz with HDR ON)? The PC has its own monitor; I'm just using the TV to take advantage of HDR and 4K on a big display for movie watching. I'm not worried about gaming or anything like that in this particular case, just movie/video watching. And the only solution for this is to buy a "current" gen GPU like the RTX 30 or Radeon 6XXX cards, right? They have HDMI 2.1, which supports DSC, and that would enable all the bling, right? Like, 4K@60Hz RGB 4:4:4 with HDR ON - that's only possible with HDMI 2.1, is that correct? The TV also has to support it, but I think it does.
  2. Hey everyone! I recently bought a Samsung TV (The Frame 43" - model 43LS03A) and I'm having some doubts about how to connect it to Windows 10 with HDR enabled at 4K (3840 x 2160) at 60Hz. I've tried everything, but I can't have it all: either HDR is disabled, which lets me choose 4K at 60Hz, or it's enabled, which limits the refresh rate to 30Hz. Basically, I can only have either 4K 60Hz with HDR OFF or 4K 30Hz with HDR ON. I've already enabled the Input Signal Plus setting for the HDMI port I'm using with the PC (HDMI 1). I've also tried changing the bit depth in my GPU settings (in AMD Radeon Settings, under "Screen") - tried 8bpc, 10bpc and 12bpc - and tried YCbCr 4:2:0 instead of the default RGB 4:4:4 (Full RGB). None of these settings allowed me to enable HDR at 4K@60Hz. If I have 60Hz selected and turn on the HDR toggle in the Windows 10 settings, the toggle turns on for a brief moment and then instantly turns off, with the screen blacking out for a second before it goes back to "normal". Enabling HDR first and then trying to change the refresh rate results in the same problem: if I have HDR enabled at 4K@30Hz and then try to switch to 4K@60Hz, the new refresh rate gets selected but quickly goes back to 30Hz (it effectively doesn't get applied). Is this a hardware limitation of my PC? A cable problem? A TV problem (the TV doesn't support it)? The system I'm using the TV on is an i5-6600 (non-K) with an RX 580, on Windows 10 Professional 64-bit. I'm not sure if this is a limitation of the HDMI 2.0 cable (I'm not even sure the one I'm using is 2.0). I've found this on Samsung's website: https://www.samsung.com/us/support/troubleshooting/TSG01207728/ but it doesn't work either - setting a custom resolution is worthless since the HDR toggle just turns on and off instantly again. Any help?
P.S.: I've found in the Samsung e-Manual that maybe only one of the HDMI ports allows RGB 4:4:4 at 4K@60Hz, but it doesn't say anything about HDR. AFAIK that port should be HDMI 4, tho' I've also seen people saying to use HDMI 1. IDK what to do, or if I'm doing something wrong, or if this specific 43" model doesn't support 4K@60Hz with HDR ON. Does anyone know? I thought HDMI 2.0 supported 4K@60Hz with HDR ON no problem, or is it a bandwidth limitation?
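The bandwidth question at the end of the post above can be sanity-checked with some back-of-the-envelope arithmetic. A quick sketch (the assumptions here: the standard CTA-861 4K timing of 4400x2250 total pixels including blanking, and HDMI 2.0's 18 Gbit/s TMDS link with 8b/10b coding, i.e. about 14.4 Gbit/s of usable pixel data):

```python
# Back-of-the-envelope HDMI bandwidth check. Assumptions: CTA-861 4K
# timing (4400x2250 total pixels), HDMI 2.0 = 18 Gbit/s link with
# 8b/10b TMDS coding, so ~14.4 Gbit/s is left for actual pixel data.

HDMI20_DATA_GBPS = 18.0 * 8 / 10   # usable pixel-data rate, Gbit/s

def needed_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    """Pixel-data rate for RGB 4:4:4 (3 channels per pixel)."""
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * 3 * bits_per_channel / 1e9

for hz, bpc in [(60, 8), (60, 10), (30, 10)]:
    need = needed_gbps(4400, 2250, hz, bpc)
    fits = need <= HDMI20_DATA_GBPS
    print(f"4K@{hz}Hz {bpc}bpc RGB 4:4:4: {need:.2f} Gbit/s -> "
          f"{'fits' if fits else 'exceeds'} HDMI 2.0")
```

Under those assumptions, 4K@60Hz at 8 bits per channel (about 14.26 Gbit/s) just barely fits, but the 10-bit signal HDR typically uses (about 17.8 Gbit/s) does not, while 4K@30Hz 10-bit fits comfortably - which matches the 30Hz-with-HDR behavior described above, and why 4K@60Hz HDR over HDMI 2.0 generally requires chroma subsampling (4:2:2/4:2:0). HDMI 2.1 raises the link rate to 48 Gbit/s.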
  3. Thanks, I'll keep that in mind. Since I live in Brazil, my display options are somewhat limited - not only by the prices displays end up costing here due to import tariffs, taxes and the dollar conversion, but also by the sheer lack of selection and availability. But thanks for the tip! I know I was really picky with some of the answers; thanks for answering all of them with such patience and kindness. Have a great day, appreciate it a lot. Honestly, LTT should do a video about this post as a whole (not this thread specifically, but a video with these questions in mind would be a great help for noobs like me). A somewhat simpler version of this, anyway: "How to calibrate and set up your PC for high quality content consumption", including topics like madVR, color calibration, content gamut differences, etc. I feel these are such "niche" questions for whatever reason, even tho' I feel like they should be more mainstream, more widely known. I leave this as a humble suggestion, @LinusTech @James @jakkuh_t
  4. Ok, you're probably right. I've done what you said and things look pretty good here. Thanks a lot for all your help, man! You were a godsend! If anyone else is interested, I found this forum post from way back in 2015, which also helped a great deal in understanding all this gamma talk and whatnot: https://www.avsforum.com/threads/madvr-argyllcms.1471169/page-208#post-40108362 I ended up settling on these settings: Gamma 2.2, Relative with 100% black output offset - following the advice from the forum post linked above. I also tried the other settings suggested in that post and didn't find them to be much different, so I stuck with the "default" 2.2 gamma curve.
  5. Hey! So, should I select Gamma 2.2, or use Rec. 1886 in the dropdown menu and then input gamma 2.2? Like, should the tone curve be Rec. 1886 or Gamma 2.2? Also, I can't change the input encoding to Full - it's not an option - and output encoding is greyed out (see 1st image). Another question: note that the calibration I performed wasn't intended for video/madVR (it was for sRGB), so my tone curve (in the calibration/profiling process, before the 3D LUT generation) was set to sRGB. Is that a problem?
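For reference, the two curves in that dropdown aren't quite the same thing: "Gamma 2.2" is a pure power law, while Rec./BT.1886 anchors a 2.4-exponent curve to the display's measured white and black levels, so it flattens out near black on real (non-zero-black) panels. A minimal sketch of the BT.1886 EOTF, using the roughly 120 cd/m² white level and ~1100:1 contrast mentioned elsewhere in this thread purely as example numbers:

```python
# Sketch: ITU-R BT.1886 EOTF vs. a pure power-law gamma.
# Example numbers only: ~120 cd/m2 white, ~1100:1 contrast.

def bt1886(v, lw=120.0, lb=120.0 / 1100, gamma=2.4):
    """BT.1886: L = a * max(v + b, 0) ** gamma, anchored to lw/lb."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

def power_gamma(v, lw=120.0, gamma=2.2):
    """Pure power-law response, scaled to the same white level."""
    return lw * v ** gamma

for v in (0.1, 0.5, 0.9):
    print(f"signal {v:.1f}: BT.1886 {bt1886(v):6.2f} cd/m2, "
          f"gamma 2.2 {power_gamma(v):6.2f} cd/m2")
```

By construction the BT.1886 curve hits exactly the measured black at signal 0 and the measured white at signal 1, which is why it's usually recommended as the video target on displays with limited contrast.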
  6. Maybe while I have your attention... I'm now looking into creating 3D LUTs to use alongside MPC-BE and madVR to watch videos (movies, TV shows, etc.). I've already found plenty of links and tutorials for that. One question I still have, tho': do I need to add MPC-BE's and madVR's executables to the Exceptions list in the DisplayCAL Profile Loader (right-click in the system tray)? Should I do that? And, if so, should I disable both the profile loader AND reset the video card gamma table? Because the 3D LUT generated will already contain all the information that madVR needs to render the video with the correct colors, right? Also, I'm using the 3DLUT Maker program, NOT the main DisplayCAL program, to create the 3D LUTs, because I wanted to use the measurements/profile I already have. I'd just like to make sure I'm supposed to use the default settings for that... Here's what it looks like right now: Are those correct? After creating the 3D LUT there, can I use it in madVR just fine? Or do I HAVE to measure it all over again and do it via the main DisplayCAL program? I see that gamma is set to 2.4, but that's the default for the Rec. 1886 tone curve, right? I don't need to change anything, even tho' my display's gamma is now 2.2? EDIT: After trying it out and creating the profile with the settings above, things look pretty good - better than before the 3D LUT was applied. So I'll assume I did everything right until your next answer. What I found weird was that this process produced four files, not just one. One is a .3dlut, which is what madVR was asking for. However, there's also a PNG and 2 ICM profiles that get generated. I'm assuming I just ignore the other 3 files and what matters to me is the .3dlut file, correct? Here's a screenshot: Also, in the madVR calibration window, there are fields to insert different 3D LUTs depending on the color space. I just created one for BT.709 because it's what I mostly watch/use.
However, could I possibly create 3D LUTs for BT.2020 and DCI-P3 too, even tho' those are wide gamut and my display is not (it's SDR)? If so, which 3D LUTs should I choose for each? As you can see below, there are multiple DCI-P3 and BT.2020 options... Which DCI-P3 and which BT.2020 do I choose?
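On what the .3dlut actually contains: conceptually it's just an NxNxN table mapping input RGB to corrected output RGB, which the renderer samples per pixel with trilinear interpolation - which is why, once it's loaded, no separate profile/gamma-table correction is needed for that video path. A toy sketch of that lookup, using a hypothetical 2-entries-per-axis identity LUT (this illustrates the idea only; it is not madVR's actual .3dlut file format):

```python
# Toy sketch of a 3D LUT: an N x N x N lattice of RGB triples,
# sampled with trilinear interpolation. Identity LUT for illustration.

def make_identity_lut(n):
    """Lattice whose entry at (r, g, b) is just (r, g, b) normalized."""
    return [[[(r / (n - 1), g / (n - 1), b / (n - 1))
              for b in range(n)] for g in range(n)] for r in range(n)]

def apply_3dlut(lut, r, g, b):
    """Look up an RGB triple (components in 0..1) with trilinear interp."""
    n = len(lut)
    # Scale to lattice coordinates; clamp so the +1 index stays in range.
    coords = [min(c * (n - 1), n - 1 - 1e-9) for c in (r, g, b)]
    i0 = [int(c) for c in coords]
    f = [c - i for c, i in zip(coords, i0)]
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):                      # 8 corners of the cube
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                entry = lut[i0[0] + dr][i0[1] + dg][i0[2] + db]
                for ch in range(3):
                    out[ch] += w * entry[ch]
    return tuple(out)

lut = make_identity_lut(2)
print(apply_3dlut(lut, 0.25, 0.5, 0.75))  # identity LUT ~ returns input
```

A real calibration LUT simply stores corrected values at each lattice point instead of the identity, so the whole display correction rides along in the table.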
  7. Yeah, I'll try that! Thanks a lot! And, indeed, in the famous Hardware Unboxed video about color calibrating your display, the guidance is to just select sRGB and leave everything else at default (which includes the 175-patch default setting). That's great to know. I'll just compare my current measurement (I think I used around 3400 patches) with a new one using the defaults, and maybe something in the middle (e.g., 1500 patches). I'll compare the 3 and see if there's any difference in deltaE between them and, if there is, how much. But, yeah, like you said, probably not perceivable by humans.
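For the deltaE comparison mentioned above, it may help to see what the number actually is. The simplest variant, deltaE*76, is just the Euclidean distance between two colors in CIELAB (DisplayCAL's reports typically use the more elaborate deltaE 2000 formula, but the idea is the same); the Lab values below are made up for illustration:

```python
import math

# CIE76 delta-E: Euclidean distance between two CIELAB colors.
# (Modern reports usually use deltaE 2000, which weights lightness,
# chroma and hue differently; this 1976 form just shows the idea.)

def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 20.0, -10.0)   # hypothetical target patch in Lab
measured  = (50.5, 19.2, -10.4)   # hypothetical measured value

print(f"deltaE*76 = {delta_e_76(reference, measured):.2f}")
```

A common rule of thumb is that deltaE around 1 is at the threshold of perception, which is why two profiles whose average deltaE differs by a few tenths tend to look identical.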
  8. WHAT?! How can that be?! Like, why?! Shouldn't 5000 patches render better accuracy than 175 (default)? Same for calibration: I noticed that with slow-speed calibration it takes many more measurements before proceeding to profiling. I mean, if I can ACTUALLY get the same accuracy, then great, but, no offense intended here, how can I trust that answer? Is there evidence to back it up? Like, how do you know that? Cuz I've wasted some 4 hours on some measurements believing it'd be more accurate. LOL. If I can get the same accuracy with less time measuring, great, I'm all for it. Maybe there are diminishing returns? Like, sure, I believe that 5000 patches might be too much (with more than that, DisplayCAL just stopped working - the process froze - for some reason, both times I tried with that many patches). I only set it that high cuz I left calibration running overnight, so it didn't really matter how long it took as long as it was done in the morning, which it was (with 5000). Sure, ok, maybe 5000 is too much. But maybe 175 is too little? Like, around 1000 seems to be a decent compromise, but maybe even that is worthless? Is there anywhere I could read up on that - maybe an article comparing the accuracy of X, 2X and 4X patches?
  9. Thanks a lot for your answers and patience! If anyone is curious and has the same monitor model (LG 34GL750) as I do (I know there are variations from unit to unit, but hey... it may help someone), I've settled on these settings:
     Display settings:
       • Contrast: 70 (around 1100:1 measured, it varies)
       • Brightness: 24 (translates to 120cd/m² measured via DisplayCAL) - I found that after calibration everything gets brighter anyway, so I might as well lower the brightness; the "washed out" effect I was noticing before is now gone and things are starting to """feel""" right (I got used to it, basically, lol)
       • Gamma: Mode 1 (the mode measured to be 2.2 gamma via the uncalibrated report tool in DisplayCAL)
       • Color temperature: User mode -> RGB values: 40 Red, 50 Green, 45 Blue
     DisplayCAL settings:
       • Preset: sRGB
       • Interactive display adjustment enabled, but with low calibration speed and lots of patches (around 5000) for higher accuracy, since this display isn't the most accurate (even after calibration, it says it only covers about 92% of the sRGB gamut with 104% volume)
       • White level: as measured (like I said above, around 120cd/m²)
       • Correction: LCD White LED family (AC, LG, Samsung)
  10. Ok, thanks! Another curiosity moment: is there a difference between calibrating with different gamma modes? Like, if I choose Mode 1 for gamma (2.2) and calibrate to sRGB, is that gonna look exactly the same as if I'd selected Mode 2 (2.4) and then also calibrated for sRGB? Same settings both times in DisplayCAL, the only change being the gamma mode on my display. There is a difference, right? If I understand it all correctly, color will still be accurate, but everything will be "darker" in Mode 2 (2.4)? Is that the only change?
  11. Got it, thanks. I'll try gamma 2.2 for a few days to try and adapt to it. Right, it's what I thought, then. But is there a reference? Like, what value should I aim for in the white level - 120cd/m²? If I go higher than that, what does it mean in terms of accuracy? Let's say sRGB aims for 120cd/m². If I'd like to use my display at 160cd/m² (which is what I set it to), am I giving something up? Like, my dark content will be brighter, is that it? Not that it matters that much for my use case, I'm just curious at this point. Yeah, gamma 2.4 seems better to me because I've gotten used to it, but I'll try to adapt to sRGB and gamma 2.2 over the next few days. Basically, I feel like with gamma 2.2 some dark parts are brighter than they used to be - which I think is what's actually happening, right? Going for gamma 2.2 after having used 2.4, things WILL look brighter (cuz they ARE), especially the dark portions. But blacks (#000000) are still just as black; it's just the darker greys that look brighter now, and I think that's okay. For example, the command prompt and the Windows File Explorer (using dark mode in Windows 10) are much brighter now than they used to be, even tho' they are almost black. It's like there's more gradation now between pure black and 70% black (or something like that). Also, if I want to check whether the calibration works, I should just use the verification tab, right?
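The intuition in the post above can be put in numbers: under a pure power-law gamma, the same 8-bit code value maps to a higher relative luminance at 2.2 than at 2.4, and the effect is largest near black while pure black stays black. A quick sketch (the #1E1E1E dark-mode grey is just an example value):

```python
# Why dark greys look brighter at gamma 2.2 than at 2.4: the same
# code value decodes to a higher relative luminance under the
# shallower exponent, with the gap largest near black.

def relative_luminance(value_8bit, gamma):
    return (value_8bit / 255) ** gamma

v = 0x1E  # 30/255, a typical dark-mode background grey (example)
lum22 = relative_luminance(v, 2.2)
lum24 = relative_luminance(v, 2.4)
print(f"gamma 2.2: {lum22:.5f}, gamma 2.4: {lum24:.5f}, "
      f"ratio: {lum22 / lum24:.2f}x brighter")
# Pure black (code 0) is 0 under both curves; only near-blacks shift.
```

For this value the ratio works out to roughly 1.5x, while at mid-grey and above the two curves are much closer - which matches the "dark UI elements look brighter, blacks unchanged" observation.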
  12. About 1) - the "best" gamma mode would be the one, among the four available, that is closest to gamma 2.2 in the Uncalibrated report, correct? Strangely enough, that's Mode 1. Mode 2 = gamma 2.4 and Mode 3 = gamma 2.6 (pretty dark). Mode 4 is gamma 2.29~2.3, which is kinda weird. However, the "default" (the one selected after you reset the settings to factory defaults) is Mode 2 (gamma 2.4). Should I choose Mode 1, then? I did that and calibrated with it selected - I can't get used to how "bright"/washed out things are. Probably my eyes are used to more contrast/saturation. About 6) - should I set it to what my current monitor brightness is in cd/m² (by measuring it)? Or should I set 120 even if my display's brightness is measured to be 160cd/m²? Maybe that's why it's "washed out"/bright? With this latest calibration I left it on "As measured":
  13. Right, I see. In my case, that may not be much of an issue (again, I'm not doing professional work with colors or photos etc.). I'll just make two profiles with everything else the same, compare the two for my use case and see which looks better. And if I ever need an sRGB profile, I'll have it too. One thing that's happening, however, is that the profiles seem to be removed after I close the main DisplayCAL program. The DisplayCAL Profile Loader is active in the system tray and it's colored (not grayed out as it gets, for example, when you open DisplayCAL), but it doesn't seem to be doing its job. You know the "Download"-looking button you can click in DisplayCAL to install the profile you created after the calibration process is over? Yeah, so... when I click that (even if I'd already done it before), the colors change to the "correct" ones. Great, now they're correct. However, as soon as I close DisplayCAL (with the profile loader still active in the system tray), the colors revert back to what they were before calibration. Is that correct? I had that happen on both PCs I've calibrated - both use AMD GPUs, don't know if that may be a problem with software/driver interaction and Windows 10... During installation, I DID select "Let DisplayCAL handle calibration" on both computers. If I uninstall DisplayCAL and just use Windows calibration (then selecting the "Use Windows display calibration" tickbox in the Color Management settings in Control Panel), it seems to work just fine. Is DisplayCAL doing anything ELSE besides applying the color profile (that'd be the .ICM file, correct?)? There's that "1D LUT" it generates that's applied directly to the GPU or something like that...? Does that remain even if I uninstall DisplayCAL? Is it applied together with the profile when you select it in Windows settings? Is it another file? Or is it a setting only accessible/enabled if DisplayCAL is installed?
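On the "1D LUT" question above: the video card gamma table is a per-channel ramp (stored inside the ICC profile as the vcgt tag) that something - the profile loader, or Windows' own "Use Windows display calibration" option - has to write into the GPU. If nothing reloads it, the correction disappears even though the .ICM file is still installed, which is consistent with colors reverting when the loader isn't doing its job. A sketch of what such a ramp looks like (the gamma values here are illustrative assumptions, not DisplayCAL's actual code):

```python
# Sketch of a video-card "1D LUT" (the ICC vcgt ramp): one table per
# channel, typically 256 entries of 16-bit values, that pre-distorts
# the signal before the panel. Illustrative assumption: correcting a
# panel measured at gamma 2.4 to a target of gamma 2.2.

MEASURED_GAMMA = 2.4   # what the panel is assumed to do natively
TARGET_GAMMA = 2.2     # the desired end-to-end response

def build_ramp(measured=MEASURED_GAMMA, target=TARGET_GAMMA, size=256):
    # Each entry applies gamma (target/measured), so that the panel's
    # own response composes with it to give the target curve.
    return [round(((i / (size - 1)) ** (target / measured)) * 65535)
            for i in range(size)]

ramp = build_ramp()
# End points are preserved: black stays black, white stays white.
print(ramp[0], ramp[255])   # 0 65535
```

The full 3D color correction lives in the profile/3D LUT; this ramp only handles the per-channel tone response, which is why losing it changes the picture without uninstalling anything.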
  14. So, final question: is there any reason not to choose Gamma 2.2 in the Tone Response Curve setting (in the calibration window) and go for sRGB instead? Like, why not Gamma 2.2? Cuz I've seen people say sRGB is bad if you watch video (??) cuz dark portions get significantly brighter (allegedly - I haven't tested it yet; I intend to create 2 profiles with everything the same apart from that setting).
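For what it's worth, the difference between those two tone response curves only really matters near black: the sRGB curve (IEC 61966-2-1) has a linear segment in the shadows instead of a power law, which is exactly why people report brighter dark portions with it when watching video. A small sketch comparing the two:

```python
# sRGB piecewise decoding (IEC 61966-2-1) vs. a pure 2.2 power law.
# The linear segment near black makes sRGB shadows noticeably brighter.

def srgb_eotf(v):
    if v <= 0.04045:
        return v / 12.92                      # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4       # power segment above it

def gamma_22(v):
    return v ** 2.2

for v in (0.01, 0.02, 0.05, 0.5):
    print(f"signal {v:.2f}: sRGB {srgb_eotf(v):.5f} "
          f"vs gamma 2.2 {gamma_22(v):.5f}")
```

In the midtones the two curves nearly coincide, but at a signal of 0.01 the sRGB toe is more than an order of magnitude brighter than the pure power law, so the choice mostly affects shadow rendering.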
  15. Thanks for all the fast answers! I'm gonna do these things later; I'm now calibrating the other computer, which was in far worse shape (pretty old display, an LG W2353V). But thanks a lot - that's pretty much all I wanted to know. And I kinda did exactly what you described in your little step-by-step guide there, except for the sRGB portion. I didn't use sRGB because the first calibration I did came out like crap: the whitepoint was set to "as measured" (even tho' I had selected the sRGB preset... weird, not sure what happened there). But now I know what went wrong (my second calibration worked great because I specified 6500K as the whitepoint, not "as measured"). But then I used the tone response of Gamma 2.2, not sRGB. Also, about the gamma modes on the display: I don't think Mode 4 is gamma 2.6, because it looks a lot like Mode 2 (which supposedly = gamma 2.2). Like I said in the first post, in the manual Mode 4 is described as a mode to be used "when you don't need to adjust the gamma". IDK what that means exactly... Do they just use gamma 2.2? Then why have 4 modes and not just 3, with Mode 2 being gamma = 2.2? And although it is similar to Mode 2, it's not quite the same. I'll check both with the report tool later and come back here with some logs/reports; maybe there's a difference.