Search the Community
Showing results for tags 'hdr'.
-
Hi folks, I've got 500+ high-quality movies on my USB hard disk, and I'm planning to buy an AV receiver (e.g. a Yamaha RX-V6A) to play them. The end goal is to build a home theatre. What would be the best way to play/stream my movies? I have a 65" 4K Google TV (Sony X82L) and a high-end PC (AMD 7800X3D CPU + AMD 7900 XTX GPU). I researched a lot, but I didn't find a proper solution. I don't know whether I should build a Jellyfin media server on my old laptop or whether there is some other way to play the movies through the AV receiver. Please help, guys. Thanks and regards.
-
Guys, I've spent a week troubleshooting and still can't play HDR through my laptop. I spent a few days trying different HDMI cables before checking my laptop specs and finding out that it has an HDMI 1.4 port, not the 2.0 the spec requires. My laptop has two Thunderbolt 3 ports, which was my next attempt, and I was sure it was going to work easily. I bought an adapter (Ugreen 70444) which claims to support 4K60 and uses the VL100 chipset. With the adapter and a good cable, Windows still doesn't recognize my C3 as an HDR-capable display. I tried the latest GPU driver from Intel and the latest driver from Lenovo; nothing changes. Also, the Thunderbolt 3 app doesn't detect anything while the adapter and cable are connected (should it?). I have a collection of Blu-rays ripped to my NAS and just can't wait to watch LOTR in 4K HDR. HELP! I hope this is somehow fixable and I don't need a new computer!
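For reference, the bandwidth arithmetic behind the HDMI 1.4 limitation can be sketched in a few lines. The figures below (594 MHz pixel clock for standard 4K60 timing, 8.16/14.4 Gbps effective payload for HDMI 1.4/2.0 after 8b/10b encoding) are the commonly published nominal values; treat this as a back-of-envelope check, not a spec citation.

```python
# Back-of-envelope: which HDMI generation can carry 4K60?
PIXEL_CLOCK_4K60 = 594e6  # Hz, standard CTA-861 4K60 timing (incl. blanking)

def payload_gbps(bits_per_channel, channels=3.0):
    """TMDS payload: pixel clock x channels x bit depth.
    channels: 3.0 for RGB/4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0."""
    return PIXEL_CLOCK_4K60 * channels * bits_per_channel / 1e9

HDMI_EFFECTIVE = {"1.4": 8.16, "2.0": 14.4}  # Gbps after 8b/10b encoding

# 4K60 8-bit RGB needs ~14.3 Gbps: far beyond HDMI 1.4, just inside HDMI 2.0.
fits = {gen: payload_gbps(8) <= cap for gen, cap in HDMI_EFFECTIVE.items()}
```

By the same arithmetic, 4K60 10-bit HDR doesn't fit HDMI 2.0 as full RGB either, which is why it is normally sent as 4:2:2 or 4:2:0.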
-
So for a while, I was struggling to get my Odyssey G7 to look good with HDR enabled. Games looked extremely desaturated and washed out. Almost a year later I finally discovered what was going on the whole time. Here's an explanation: in SDR, the Odyssey G7 defaults to 50/50/50 for RGB color balance. This is the correct setting, and setting it to 100/100/100 looks very oversaturated. However, in HDR, you have to set the colors to 100/100/100. When switching back and forth, the color balance will then automatically adjust to 50/50/50 or 100/100/100 depending on which mode you're in. Some games now look mind-blowing (such as Horizon Zero Dawn), while others with mediocre implementations look actually acceptable instead of a grey mess. Not sure if this is fixed on newer monitors or firmware, but I thought I'd leave this here in case anyone was struggling like me.
-
Long read, so be ready: I use my TV as the display for my media PC, and sometimes when I laugh, cough, sneeze or talk I leave small drops of saliva on the screen. After cleaning the screen with a microfiber cloth (the same kind used to clean eyeglasses) and water, there was a stain on it that turned out to be either the anti-reflective coating or the Ultra Viewing Angle layer being removed. It was pretty visible when the TV was off or displaying fully black content with any other source of light in the room. I got this TV because of the great black levels it can produce and how bright it is for living rooms.

I decided I would rather have consistency than keep it the way it was, so I tried different types of materials to rub the layer off, from softest to roughest. The one that gave me the best result was paper towels, the same kind you most likely use in the kitchen. It was taking way too much time and a lot of elbow grease, so I actually used a polishing machine and put paper towels on the disc to complete the task. The result is a super glassy finish which I really like a lot.

However, not everything is great, because under some heavy light you can see some micro scratches that look like smudges. They're not noticeable in regular use, but I can't take my mind off them and would like to finish the job with great success. I am thinking of either applying an oleophobic coating to the entire TV or buying another anti-reflective/protective film to get rid of the micro scratches, similar to how the glue of a phone's protective film fills in the gaps created by a scratch after a lot of use. I wanted to know if someone has applied an oleophobic coating to something big like an iPad or a tablet, to know whether it's feasible on a 50-inch TV while maintaining a consistent application without more stains or smudges, or if someone has any other recommendation.
I will add pictures that show: 1. the halfway-done process with napkins and hand work; 2. the finished job while playing Warzone, showing it's almost impossible to see; 3. me polishing the actual screen, in case you have any doubt; 4. a picture with flash showing the "smudges". I hope someone from the Linus Tech Tips team sees this and thinks of it as a great project for a video: crazy ways to remove a scratch from your screen, comparing the results.
-
Hello everyone! For some background context, I've been troubleshooting and trying to get HDR working for 2 months now... I bought a new TV, two new PCs, a new GPU, new cables. I'm slowly ready to rip my hair out in frustration...

Latest status: I ordered brand-new HDMI 2.1 cables to be 100% sure the cables aren't at fault. After noticing nothing happened on Ubuntu, I switched back to Windows for the third time, but for the first time on this new TV. Windows 10 automatically installed GPU drivers, and after a short screen refresh I finally had the HDR10 notification! Windows said it had automatically detected the display and turned it on. Radiating with happiness, I went to get my HDD with HDR content to finally test it. When I came back 15 minutes later, HDR was off, and I could no longer toggle it on in the settings. It shows as supported, but as soon as I toggle it the screen flashes dark and HDR turns itself off after less than a second. I suspected Windows updates had caused this, so I reinstalled Windows again, but this time the first thing I did was disable and block ALL Windows updates. HDR worked again... until I went into settings and changed the resolution from FHD to 4K, at which point HDR immediately turned itself off again. It couldn't be turned back on, same as before. What is causing this!? Could someone please help find the source of this problem? I've been working on my home cinema project for over a year now, and I find it ridiculous that I have to troubleshoot a base feature of a high-end TV for 2 months... I would really like to finally finish this project.

TECHNICAL DETAILS / specifications:
PC:
- Current GPU: AMD RX 6400 4GB HDMI 2.1 ITX (previous: GTX 1050 Ti ITX)
- CPU: Intel i5-6500
- Mobo: Asus B150I Pro Gaming/Aura
- RAM: 16GB DDR4-2133
- PSU: be quiet! Pure Power 9 500W
- SSD: 500GB Crucial MX300 SATA 2.5"
- HDD: WD My Book 8TB (WD Red)
- OS: Windows 10 Pro
AVR: Onkyo TX-SR393 5.2 (HDR / HDR10 / Dolby Vision / HLG)
Current TV: Hisense E7KQ 75" QLED 2023, HDMI eARC (HDR / HDR10 / HDR10+ / Dolby Vision / HLG)
Previous TV: Samsung UE58NU7179 LCD 2018, HDMI ARC (HDR / HDR10 / HDR10+ / HLG)
TV settings: HDMI mode "Enhanced"; firmware up to date (otherwise the TV apparently can't achieve HDR)

I heard HDR auto-detection could be causing this, or some more proprietary settings and formats from the TV manufacturer because "why not" :)... and maybe also some Windows drivers that are being installed automatically? All help appreciated, and thank you in advance!
-
Hi guys. Unfortunately my beloved monitor is showing signs of failing. Both the HDMI and DP signals occasionally fade to black until I remove the cable and reseat it. The fan stays on continuously unless the monitor is unplugged, even hours after a gaming session. These issues are becoming more frequent and I can't find a suitable repair shop in the UK. As such I'm looking for a monitor that can deliver the same performance, especially the HDR performance, that this monitor can. Any suggestions would be amazing as this is going to be a steep investment. My budget is £1700 and I would like a 32in max screen size. Thanks for reading.
-
Hi, I bought the Dell Alienware AW3423DWF and it is my first real HDR monitor. Before, my main monitor was an IPS LCD (Asus VG27AQL1A) with HDR 400 (which I obviously didn't use), so I don't have much experience with HDR at all. But I saw all the LTT videos praising the new QD-OLED panels, and after some research and thinking I ended up buying the AW3423DWF because that seemed like the best choice (at least for me), and when I got it, it was great. First off, the ultrawide format was REALLY nice; it took a little getting used to, but it was great, and the same goes for the perfect blacks. But then there was the HDR. I was very confused, because when I turned HDR on, it looked terrible: washed out, yellowish, with the blandest colors I have ever seen on anything. My old generic IPS LCD monitor looked so much better in comparison that I just couldn't understand what was going on, so I searched and looked for wrong settings or anything I might have done wrong, but I couldn't find anything. I did everything correctly, but it was not in any way like they say in the videos. They are always so impressed with how good it looks, how color accurate it is and how bright it gets, and that last part, the brightness, is at least a little bit noticeable with HDR on; lamps and shiny things do look brighter. So I am now asking here for some help. I have some examples of the difference in color between HDR and SDR, taken with my Samsung Galaxy S21 FE. I tested it in Hitman 3 (Hitman World of Assassination), Marvel's Spider-Man Remastered (PC) and Far Cry 6, as can be seen below. I also have other games with HDR support that I could test if needed. The difference is very noticeable: the top, washed-out one is with HDR on, and the bottom, vibrant one is with HDR off.

The photos might be a little off, but in real life the one with HDR on is very bland, dead and washed out, whereas the one with HDR off has nice, more vibrant colors, though nothing crazy (unlike how the photo might seem). Hitman World of Assassination - Chongqing, all game graphics settings at max. Marvel's Spider-Man Remastered (PC) - rooftop in Chinatown, all game graphics settings at max. Far Cry 6 - Montero Farm, game graphics settings almost at max. Another thing is that all internet browsing is just terrible with HDR, so I have to turn it on and off whenever I go in and out of a game, but that's fine; I mapped a key to toggle Windows HDR on/off. However, even when viewing video in the browser it is still bad, both LTT videos on YouTube and Marvel on Disney+. I have some examples of that too, again taken with my phone. Disney+ - Avengers: Infinity War. YouTube - "Elon Musk vs. MrBeast - WAN Show January 5, 2024", max quality. My monitor and Windows settings are: "HDR Peak 1000" HDR mode on the monitor; HDR, Auto HDR and HDR Video Streaming on in Windows; and no color profile. My setup is: RTX 3070 and Ryzen 7 5800X. If more details are needed, just ask and I will provide them.
-
I recently picked up the new Alienware AW2725DF and it's my first OLED monitor; coincidentally it's also my first experience at all with HDR. If I have HDR disabled, everything looks extremely dim and washed out, and if I enable HDR without changing any settings it looks pretty much identical. The only way I seem to be able to get the monitor to look nice and colourful is to change the "SDR Content Brightness" slider under HDR settings until it's around 80-90%, but even then it looks darker than my old VA panel and only just brighter than my old TN. I tried playing Red Dead Redemption 2 with the SDR Content Brightness cranked up, and it went dim again even when I disabled HDR in-game, and when I enabled HDR and tried to calibrate it, it didn't seem to make a difference. I've tried multiple different HDR modes in the monitor's settings (Desktop, Movie HDR, Game HDR, DisplayHDR True Black and HDR Peak 1000) and they all mostly look very similar, except Peak 1000 has much more ABL than the others. I'm not sure if it's an issue with my panel, because everything I can find online indicates this monitor is plenty bright; a lot of people say they even have it at 80-90% brightness in the monitor settings, when even 100% for me looks very dim, and a lot of people say you shouldn't have the SDR Content Brightness slider above 30%. I tried the Windows HDR calibration and that didn't seem to make a difference, and I downloaded the 2.2 gamma ICC profile online and that didn't help either. The only time I've been satisfied with the brightness was when I was playing Red Dead 2 and it was night time in game: the blacks were inky and the bright spots were actually bright, but I'm assuming that's only because the rest of the display was fully black, so it looked much brighter in contrast.

I play a lot of competitive games, and in those the brightness just isn't anywhere near where I would have expected it to be without maxing out the SDR Content Brightness slider, but the 360Hz and motion clarity are absolutely incredible. I might have been imagining this, but it also seemed like the brightness auto-adjusted after playing for a while; it seemed to dim mid-match, and then by the end of the match or when starting the next one it was brighter again. I'm using the included DP 1.4 cable with the monitor, and my Windows version is Windows 11 Pro 23H2. I'm running this monitor beside a 4K ASUS PB287Q and a 1440p Samsung C27JG50. I've attached an image of the advanced display settings too. PC specs (not sure if needed, but I'll include them anyway): RTX 3070 FE, Ryzen 7 5800X3D, MSI B450M Mortar Max, Corsair Vengeance 8x4GB DDR4 3600MHz. I'll include any more information if needed, thanks!
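For what it's worth, the "SDR Content Brightness" slider maps to an SDR white level in nits. The linear 80-to-480-nit mapping below is the community-reported behavior, not an official Microsoft figure, so take the exact numbers as an assumption:

```python
# Community-reported mapping of Windows' "SDR content brightness" slider
# to the SDR white level in nits: 80 nits at 0, +4 nits per step.
# This is an observed formula, not an official Microsoft specification.
def sdr_white_nits(slider: int) -> float:
    if not 0 <= slider <= 100:
        raise ValueError("slider is 0-100")
    return 80 + 4 * slider
```

On that mapping, the often-recommended ~30% corresponds to a conventional ~200-nit desktop, while 80-90% pushes SDR white to 400+ nits, which is where OLED ABL starts clamping large bright areas.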
-
Hello, just a simple question: I've found this trick to fix the washed-out colors of SDR content when HDR is enabled, I guess... https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm?tab=readme-ov-file If I understood this right, SDR content will look good on an HDR monitor with this? So how can I select the SDR-fix color profile when I disable HDR in Windows, and a standard profile when HDR is enabled? Does that make any sense? Is it possible to change the color profile with Windows Task Scheduler or a batch file? Thanks for your help!
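Background on why that repo's fix works: Windows composites SDR into the HDR signal using the piecewise sRGB decode, while most monitors (and that ICC profile) assume a pure 2.2 gamma. The mismatch is largest in the shadows, which a few lines of Python can show (the transfer functions below are the standard IEC 61966-2-1 sRGB curve and a plain power curve):

```python
# Compare the piecewise sRGB decode against a pure 2.2 gamma near black.
def srgb_eotf(v: float) -> float:
    """Piecewise sRGB decode (IEC 61966-2-1), input/output in 0..1."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    return v ** 2.2

# A dark-grey pixel: the sRGB curve emits noticeably more light than
# gamma 2.2, so shadows decoded through it look raised and grey.
dark = 0.1
lift = srgb_eotf(dark) / gamma22_eotf(dark)
```

Near black, the sRGB curve outputs roughly 1.5x the light of gamma 2.2, which is exactly the raised-shadow, washed-out look described in the posts above.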
-
I've been using RTX Super Resolution and RTX Video HDR since release. So far, the technology has been working flawlessly, and the difference is noticeable when toggling them, which leads me to believe it would be worth adding to my home theater PC. Currently, playing 4K video from Amazon Prime (Invincible; so far I've just got to season 2) with these features uses 10-13% of my RTX 4090. This gives me concern that the absolute lowest-end cards might struggle to maintain the highest quality settings for RTX Super Resolution while doing RTX Video HDR, though I imagine an RTX 3050 can do it perfectly fine (at like 80-90% usage). Anyone else use these features and have low- to mid-tier RTX hardware who can observe GPU usage?
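The "80-90% on a 3050" guess can be sanity-checked with a simple inverse-scaling estimate. The relative-throughput ratio below is a rough assumption, not a benchmark, so the output is only a plausibility check:

```python
# Naive projection: assume VSR + Video HDR cost scales inversely with
# relative GPU throughput. The 4090-vs-3050 ratio is an assumed figure.
OBSERVED_4090_USAGE = 0.13  # upper bound reported in the post
RELATIVE_THROUGHPUT = {"RTX 4090": 1.00, "RTX 3050": 0.11}  # assumption

def projected_usage(card: str) -> float:
    return OBSERVED_4090_USAGE / RELATIVE_THROUGHPUT[card]

# Under this assumption a 3050 lands slightly over 100%, i.e. it may have
# to drop quality levels rather than run everything maxed out.
```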
-
It is not worth it. 1. It's not a true HDR 1000 monitor, which requires at least 850-1150 nits in a 50%-100% window. It's just an HDR 400 (even less due to ABL) OLED that reaches 1000 nits only in a gimmicky 1-2% window. You will get frequent ABL in HDR 1000 mode. 2. Aside from other issues, the monitor flickers in a way that's not good for the eyes.

=============Part 1: flickering================

To record the flickering, the camera shutter speed is set below 1/1000. In the tests below, the monitor flickers. The flickering is not perceptible to human eyes but is much more prone to causing eye strain than traditional DC dimming or high-frequency PWM dimming. Due to OLED's physical properties, changing the current intensity alone will impact both the brightness and the color accuracy. OLED manufacturers have to use PWM combined with an analog algorithm, aka "emulated DC dimming", to display color with moderate brightness control. But OLED still flickers due to the imperfect hybrid implementation. In this case, the AW3423DW is trying to use emulated DC dimming but ends up with a worse result. The flickering frequency is the same as the monitor refresh rate, which is low. To make things worse, due to the lack of a polarizing layer it needs to be used with dim ambient light, and due to ABL its brightness fluctuates. In the particular video, every parry comes with ABL, though the camera doesn't show it clearly. Eye strain can happen very quickly in scenes where brightness fluctuates, even if the overall brightness is less than 400 nits. The combination of these factors is commercially in a grey area as to whether it results in eye damage in long-term use. In general, the flicker is not healthy for the eyes, especially in a dim environment. The package, the manual, and the Dell website only describe "flicker-free" as one of Dell's product features, without any indication of a flicker-free TÜV certification.

There is a TÜV certification on Certipedia stating this model was certified as flicker-free. From the description, the panel is specifically mentioned as a flat panel, so it may be an early model. The marketing trick is that Dell can still trademark their product features as ComfortView, which includes only the low-blue-light TÜV certification. I don't recommend this monitor as a long-term intensive daily driver if a gamer only uses one monitor in a basement for 3 years. If you have multiple monitors and tend to replace them every year, this monitor should probably be fine.

=============Part 2: HDR================

Now let's talk about HDR. The HDR 1000 videos are from the Spears and Munsil UHD HDR Benchmark. This comparison needs at least two exposure settings for an accurate HDR comparison in SDR pictures; this is how to compare HDR in SDR mode. The exposure is set at ISO 100 with shutter speed 1/125, and ISO 100 with shutter speed 1/25. The middle monitor shows the reference luminance level. When the curves flatten at a top level, that level is where 1000 nits is. In the pictures at 1/125 shutter speed, details are preserved on both monitors. In the pictures at 1/25 shutter speed, the details are not preserved, but the brightness difference is more pronounced in SDR. The difference in relative brightness is represented at both settings, though the true HDR 1000 monitor appears overexposed at 1/25 while the AW3423DW appears dim at 1/125. I expect you can see from the comparison that a true HDR 1000 monitor delivers 2x-3x more luminance, aka more contrast to the eyes, than the AW3423DW in some high-APL 800-nit scenes, without losing any details or causing distracting blooming. Therefore, a true HDR monitor delivers more realistic images. In HDR (and also in SDR), apart from fast pixel response, the AW3423DW is not at the same level as a true HDR 1000 monitor due to ABL. The AW3423DW only looks the same when it displays a small window against a large black background.
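To put the two exposure settings above into numbers (standard photographic stop arithmetic; nothing here is specific to these monitors):

```python
import math

# At fixed ISO and aperture, exposure scales linearly with shutter time,
# so 1/25 s admits 5x the light of 1/125 s, i.e. ~2.3 stops (EV).
def stops_between(shutter_fast: float, shutter_slow: float) -> float:
    return math.log2(shutter_slow / shutter_fast)

light_ratio = (1 / 25) / (1 / 125)   # 5x the light
ev_gap = stops_between(1 / 125, 1 / 25)
```

A display that is 2x-3x brighter differs by only 1-1.6 stops, less than the 2.3-stop gap between the two shots; that's why one exposure preserves highlight detail on both monitors while the other makes the brightness difference visible.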
Also, in these high-APL HDR scenes, blooming is not noticeable because the central object is emitting 1000 nits of luminance, making edge blooming unnoticeable to the eyes, and even to the camera. In average HDR content, blooming is not noticeable either unless you actively search for it. I have to warn you that most people don't know HDR very well. They also don't have both monitors to compare, only low-brightness OLEDs. I suggest you don't listen to people who have never used a true HDR monitor, only a mere 200-nit full-field OLED TV. OLED is always a mid-tier monitor, and it is going to stay mid-tier compared to FALD LCD. Its brightness struggles, and so does its contrast. I mean what I say: the contrast of OLED loses to FALD LCD if the brightness is not enough. The infinite (x/0) contrast of OLED is not true contrast, due to the compromise in brightness. The higher the brightness goes without raising the black level, the more contrast the monitor displays, and FALD LCD has more contrast in this regard. It's also why the premium/flagship products out there are always FALD LCD. Most people don't understand this. In some cases, a true HDR monitor of this caliber at SDR 400 nits can look even better than the AW3423DW in HDR, when ABL kicks the OLED below 400 nits. True HDR 1000 vs AW3423DW HDR; YCbCr SDR 400 vs AW3423DW HDR.

=============CONCLUSION================

The OLED or QD-OLED is struggling; its brightness is not enough. And PWM fatigue is more severe because the OLED is trying to use emulated DC dimming but ends up with a worse result. The flicker frequency is the same as the monitor refresh rate, which is low. The flicker is not healthy for the eyes, especially in a dim environment. My eyes become rather irritated when looking at the monitor. It's not as comfortable as other true HDR 1000 monitors, even though those monitors are much brighter. Traditional DC dimming or high-frequency PWM doesn't have this problem and is safe to use long-term.

HDR monitors are eventually going to hit 10,000 nits for image quality. The comparison of the latest QD-OLED against a 4-year-old FALD 512-zone true HDR 1000 monitor is made to prove this point. In order to achieve high-level HDR performance, OLED has to deal with flickering and brightness one way or another, but that won't happen very soon. So I suggest you don't buy it; wait. This monitor won't last even a short period of time, considering that better FALD LCDs are becoming much cheaper.
-
Hi, hoping someone has some insight here- the display is a Dell G2724D. When HDR is turned on, the text/iconography on the screen looks great- properly scaled, free of artifacting etc. But when HDR is disabled the image looks oversharpened/aliased, with kind of outlines around text and pixellated icons. Any ideas as to what could be happening? Some games have poor HDR implementation with vulkan so I'd like to be able to disable it to play them without sacrificing all the image quality :')
-
Hi all. TL;DR: Does either the MKV or the MP4 container support 4K AV1 content with HDR10? I've been having issues getting it to work. More details: recently I've been backing up my own and my father's Blu-ray collections, which consist of both regular Blu-rays and 4K/HDR Blu-rays. I haven't had any trouble with the regular Blu-rays yet, but lately I've noticed some strange issues with the 4K HDR ones. For some context, I recently got an AMD RX 7800 XT, which supports both AV1 decoding and encoding. I decided I should try to compress the 80GB 4K Blu-rays using AV1, because why not! The movie I used as a test subject was my Dune 4K Blu-ray, which contains both Dolby Vision metadata and HDR10 metadata according to dovi_tools. The 4K Blu-rays are encoded using HEVC/H.265, which my graphics card supports for decoding, and starting with the 1.7.x builds, Handbrake also supports AMF AV1 encoding, so I gave that a try. I found out that Handbrake doesn't support AMF decoding at all, so I was limited to around 40fps by CPU decoding. This wasn't that big of a problem personally, but still kind of irritating. So I used the FFmpeg command-line executable, since it supports AMF decoding through both the existing DirectX 11 Video Acceleration API and the new Vulkan Video API. This did technically work, and I was now able to transcode at closer to 80fps! But when I went back to play the 4K AV1 MP4s, VLC wasn't tone mapping the HDR properly and everything was washed out. Strange. I checked that the original media was tone mapped properly by VLC, which it was (not desaturated at least). I did a bit of research and found that some people were reporting that the MP4 container doesn't support all of the metadata needed for HDR (which I eventually proved false... sort of), so I retried the transcode, but this time into an MKV file. Same problem: the HDR10 metadata was detected by VLC, just without tone mapping for some reason.
So the AV1 MP4 didn't have the proper metadata and wasn't getting tone mapped, and the MKV had the correct metadata but still wasn't being tone mapped. I retried the same steps, but with HEVC instead of AV1, so basically I was just compressing the original by reducing the bitrate. That time, both as an MP4 and an MKV, the HDR10 metadata and tone mapping worked properly in VLC. I was able to use dovi_tools again to verify that HDR10 metadata was present in at least the AV1 MKV as well as the HEVC MKV and MP4, so I'm beginning to think it might be a problem with VLC. I tried using Handbrake anyway to create the 4K AV1 files instead of just FFmpeg, and confirmed the same exact behavior: metadata was present for both codecs in an MKV container, but VLC didn't tone map the AV1 video properly. I made sure VLC was updated, same with FFmpeg and Handbrake. Any insights would be welcome.
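One thing worth trying when a transcode loses its HDR look is to tag the output's color metadata explicitly instead of relying on passthrough. The sketch below builds such an ffmpeg command in Python; `-color_primaries`, `-color_trc` and `-colorspace` are real ffmpeg output options, and `av1_amf` is the AMF AV1 encoder mentioned above, but whether static HDR10 metadata (mastering display, MaxCLL) survives re-encoding is encoder-dependent, so verify the result with ffprobe afterwards.

```python
import shlex

# HDR10 signaling: BT.2020 primaries, SMPTE 2084 (PQ) transfer,
# BT.2020 non-constant-luminance matrix.
cmd = [
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "av1_amf",           # AMD AMF AV1 encoder
    "-color_primaries", "bt2020",
    "-color_trc", "smpte2084",
    "-colorspace", "bt2020nc",
    "-c:a", "copy",              # keep the audio untouched
    "output.mkv",
]
print(shlex.join(cmd))
```

If VLC still refuses to tone map a file tagged this way, playing the same file in mpv is a quick way to tell whether the file or the player is at fault.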
-
I can't seem to find a way to force the Steam Deck OLED out of HDR mode. CrossCode is a great game but it really suffers from oversaturation of colors when running in HDR. This is relatively easy to fix in Windows by just turning off HDR in the OS settings, but I can't find a way to do that on the Steam Deck. Streaming the game from my PC works, but can't exactly do this when traveling... I've tried to find commands to do this in steam's launch options but haven't had any luck.
-
I have a Lenovo IdeaPad 520 (i5-7200U, 8GB, Nvidia 940MX 4GB). I installed Windows 11 23H2 last week. I had not noticed it before, but while verifying that everything was in order after the upgrade, I noticed the display colour depth is set to 6-bit in the Advanced Display Settings window. Under "List All Modes" it shows the display is capable of 8-bit (True Colour, 32-bit). I have installed the latest graphics driver from the Intel website, and it's still the same. I cannot change the colour depth through Intel Graphics Command Center. But when I uninstalled all the Intel graphics drivers, the Windows Basic Display Adapter installed automatically, and then it shows 8-bit. But I cannot control brightness or play HDR videos with that driver. Please find me a way to set the display to 8-bit. You know the difference between 6-bit and 8-bit, right? 262,144 vs 16,777,216. That's a lot of colours. Also, the HDR window shows the display can stream HDR videos, while the Advanced Display Settings window shows the colour space as SDR. Please help me out! In the following post it shows 8-bit in spite of the display being 6-bit.
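The two colour counts quoted above are just 2^(3n) for n bits per RGB channel:

```python
# Distinct addressable colors for an n-bit-per-channel RGB display.
def colors(bits_per_channel: int) -> int:
    return 2 ** (3 * bits_per_channel)

six_bit = colors(6)     # 262,144
eight_bit = colors(8)   # 16,777,216
```

Many 6-bit panels approximate the missing colours with FRC (temporal dithering), which is why the difference is often subtle in practice, though banding in smooth gradients gives it away.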
-
There is a strange new Netflix-related issue happening on my PC in the past few weeks. Issue: when I am streaming SDR content in the Netflix Windows app in fullscreen, every time the UI/subtitles come up and fade out, the screen goes black for a few seconds while the audio continues. This makes Netflix unusable in fullscreen for SDR content. SDR content playing in windowed mode seems to work fine. I have tried HDR content on Netflix and it seems to work fine both windowed and fullscreen. Local or YouTube HDR/SDR content (with and without subtitles) has no such issue. More context: early this year I got a miniLED monitor and started watching HDR content regularly. But since Windows 11 does not auto-handle HDR content very well, I keep HDR turned on in Windows settings 100% of the time. Before the above issue, there was another strange behavior: when I fullscreened a Netflix video, the screen would first go dark for a few seconds before running properly. I solved that by following a post on Reddit and changing my non-main monitor's "Content Type Reported to the Display" setting to "Desktop Programs". But, whether due to a recent Netflix update, Windows update or Nvidia driver update, the issue changed to what I described above.
Setup:
- i5-13400F
- 32GB DDR4-3600
- RTX 3080 10GB
- Windows 11 23H2, with all available Windows updates installed
- Nvidia driver 546.33 (also happens with the 2 official versions before)
- Main monitor: INNOCN 27M2V connected via DisplayPort, running at 4K 144Hz, 10-bit, HDR on, G-Sync on, HDCP 2.2 supported (I want Netflix to play fullscreen here)
- 2nd monitor: ASUS VG259QM connected via DisplayPort, running at 1080p 280Hz, 8-bit, HDR off, G-Sync on, HDCP 2.2 supported
- 3rd monitor: Dell U2414H connected via HDMI, running vertically at 1080p 60Hz, 8-bit, no HDR, no G-Sync, no HDCP 2.2 support, "Content Type Reported to the Display" set to "Desktop Programs"

I know, based on https://nvidia.custhelp.com/app/answers/detail/a_id/4583/~/4k-uhd-netflix-content-on-nvidia-gpus, that Netflix 4K is not streaming because my 3rd monitor does not support HDCP 2.2, but 1080p HDR on a 27" monitor is still bearable.
-
Looking for recommendations for a 32" monitor, QHD minimum, with a 165Hz or higher refresh rate. I prefer 16:9 as I run two side monitors as well; this is to replace my center main display. VESA DisplayHDR 400 minimum, but I'd prefer HDR600. I am fond of Samsung's QLED displays, so I have been looking at the Samsung Odyssey G7 (C32G75T) ($550 USD at my local Micro Center). I know this is the 'old' model and that the Neo G7 has mini-LED instead of edge lighting, but the price jump is a bit too much. Are there any comparably priced mini-LED monitors that carry these specs, or just better monitors in general around that price range?
-
Hi everyone. I've been having issues with some games in HDR, namely Battlefield 2042 and Divinity: Original Sin 2 (Definitive Edition). The issue does not appear in HDR mode in Assassin's Creed Odyssey. When I launch a game in HDR mode (which can be toggled in BF but not in Divinity), the colors are washed out, as if there were a coat of grey over the actual colors. I have the issue both in fullscreen and borderless. BUT when I'm in borderless and Alt+Tab to focus on another app on my second monitor, the game looks like it should in HDR (see file). I have tried Win+Alt+B to toggle HDR on and off (in and out of games, while the games are running and while they're closed, with and without a reboot at each step). I have tried every available combination of color config in the Nvidia Control Panel (RGB, YCbCr422, YCbCr444, 8 and 10 bpc). I have tried updating my graphics card BIOS and my motherboard BIOS (can't hurt), reinstalled the games, and reinstalled the Nvidia drivers (and Nvidia Experience). My main monitor is hooked up with DisplayPort; I tried plugging it in via HDMI, but the issue remains. The issue appeared in Battlefield 2042 as of update 4.2.0 (April 2023), and I can't say exactly for Divinity (I started playing in August 2023 and have had the issue since). I really don't understand the issue; thank you for your time and thank you in advance for any help. Config: Displays: Main: https://www.acer.com/ac/en/GB/content/predator-model/UM.HX3EE.P13 Secondary: https://eu.aoc.com/en/gaming/products/monitors/ag241qg AMD Ryzen 9 3900X, Asus ROG STRIX X570-F Gaming, 32GB @ 3600MHz (DDR4 Ballistix Sport LT), RTX 3080 Gigabyte Vision, TP-Link Archer T4E, plus some other stuff like storage and so on. Drivers are up to date. Windows 10, latest update as of this post. PXL_20231124_122548673~3.mp4
-
Hello friends, I am looking to upgrade my laptop's panel (my laptop is a Lenovo ThinkBook 15 IML 20RW; my current panel is an NT156FHM-N61), and I have found that the two best options are:
1. M156NVF6 R1: 1000 nits and a 2000:1 contrast ratio, with 71% NTSC / 100% sRGB / 74% DCI-P3
2. N156HCA-GA4: 500 nits and a 1000:1 contrast ratio, but 96% NTSC / 100% sRGB / 100% DCI-P3, plus anti-glare
They are both 1080p IPS and 60Hz, and there are also other options in between, such as one with a 1500:1 contrast ratio but still only 100% sRGB and 470 nits. I am looking for the panel that provides the most enjoyable/engaging visual experience for web browsing and entertainment, such as YouTube videos and gaming. Just to note, I've gotten very mixed views from the people I've asked about this topic. Give me your thoughts, friends, to help me make the final purchase decision. Thanks for reading.
-
Hello monitor enthusiasts, I was comparing the specs of two monitor panels to see which would give the better-looking/more enjoyable visual experience. The first one (M156NVF6 R1) is 1000 nits with a 2000:1 contrast ratio and 71% NTSC / 100% sRGB / 74% Adobe RGB / 74% DCI-P3 coverage; I'm not sure if it has a matte finish. The second one (N156HCA-GA4) is 500 nits with a 1000:1 contrast ratio, but 96% NTSC / 100% sRGB / 88% Adobe RGB / 100% DCI-P3 coverage and a matte, anti-glare finish. Both are IPS and 60 Hz. So do you think higher brightness and contrast ratio will beat better/wider colour gamut coverage? My use is mostly web browsing, entertainment such as YouTube, and video games (potentially ones that support HDR). Give me your thoughts, and thanks for reading.
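For what it's worth, one way to put the two contrast figures on a comparable scale is to convert each ratio to photographic stops (log2 of the contrast ratio), since each stop doubles the light range the panel can show. A small sketch using the numbers from the post above (the stops conversion is just a rule of thumb, not a full picture-quality metric):

```python
import math

# Panel specs as quoted in the post above.
panels = {
    "M156NVF6 R1": {"nits": 1000, "contrast": 2000, "dci_p3": 0.74},
    "N156HCA-GA4": {"nits": 500,  "contrast": 1000, "dci_p3": 1.00},
}

for name, p in panels.items():
    # log2 of the contrast ratio = dynamic range in stops.
    stops = math.log2(p["contrast"])
    print(f"{name}: {p['nits']} nits, {p['contrast']}:1 "
          f"(~{stops:.1f} stops), DCI-P3 {p['dci_p3']:.0%}")
```

So the brighter panel has roughly one extra stop of dynamic range (~11.0 vs ~10.0), while the dimmer one covers a noticeably wider gamut; neither number alone decides which looks better for mixed use.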
-
Hello all, unfortunately ever since I updated to Nvidia Game Ready driver version 545.84, the colors on my LG UltraGear 32GN63T-B gaming monitor appear washed out with HDR enabled. [Edit] These washed-out colors are present throughout Windows, from the desktop to applications. [Edit] Steps I have taken: I contacted Nvidia support, who had me use the Nvidia Control Panel to change the output color format from RGB to YCbCr 444 and YCbCr 422, unfortunately with no success; walked me through rolling back to the previous driver, version 537.58, with no success; had me use the Nvidia Clean-up Tool to clean install driver version 537.58, with no success; and had me submit an msinfo32 report (which I have attached). I have also factory reset and done a fresh install of Windows 11, and upon installing the latest drivers and enabling HDR, the issue is still present. Right now, the only workaround for my problem is to disable HDR. I understand this is not a simple issue, and all help would be greatly appreciated. System specs: https://pcpartpicker.com/user/Torsten123/saved/c2bY99 MSInfo.nfo
-
Hey guys. I've got a Samsung UA55AU7700KLXL 4K TV running Tizen 2141.2 as its OS (more info in the attached photo). I've been using Prime Video on this TV for a year or so, and it was all fine until a few weeks ago. Whenever I want to stream 4K HDR movies on Prime Video, it keeps switching to HD HDR, and sometimes just HDR, while the movie's playing. I can confirm that neither my internet nor my router is the issue, as the TV says the speed is well around 150 Mbps (photo of the speed test attached). I'd appreciate it if someone could help me out with this, because it's been bugging me a bit too much recently. Thanks in advance :)
-
Hi, I'm looking to buy a used TV. I went to the seller and did a 4K pixel test using the YouTube video "Check Your Screen For Dead/Stuck pixels (up to 4k UHD) PixelFixel". There were no dead or stuck pixels, and all the colors looked fine, except black, which didn't look right. Here is the image.
-
How's it going, everyone? I have an Asus Zenbook 14 OLED Q409ZA; here's the link to the Asus product page. I bought it as an open-box item, and it had burn-in on the screen from when it was a demo unit. The store shipped it off, and they replaced the screen, motherboard, and SSD. The laptop originally came with a 2880x1800, 90 Hz screen, but it now has a 3840x2400, 60 Hz screen, and I am not sure why the screen is different now. The problem is playing 4K or even HDR content. I can't get any higher than 1080p on Prime Video and Disney+, and when I tried some of my purchased 4K movies on YouTube, the best I get is 480p. I have tried the CyberLink Ultra HD Blu-ray Advisor, and it says that I do not have Intel SGX or an advanced protection audio/video path (GPU). I have tried to find SGX in the BIOS but cannot find it as an option, and the Intel SGX tool says it cannot enable/activate it. I have gone through all of the Windows updates, Intel updates, and Asus updates. I am at a loss for what to do or try, and willing to try basically anything. Intel Iris Xe Graphics (Alder Lake-P 682 GT2) - Integrated Graphics Controller, [ASUS] Alder Lake-P GT2, Intel Alder Lake-P PCH, Intel Core i5-1240P, UX3402ZA.310 UEFI BIOS, Windows 11 64-bit. Whatever info you need or want, let me know. Thanks in advance.