
dmtzcain

Member
  • Posts: 7
  • Joined
  • Last visited

Awards

This user doesn't have any awards

dmtzcain's Achievements

  1. Depends. If you can push over 120 FPS at 4K, the 144 Hz configuration is best, together with HDR (if the game supports it), as I mentioned before.
  2. Hope that worked. Cheers!
  3. Right-click on the desktop and open the NVIDIA Control Panel. Under Display > Change resolution you should be able to see the refresh rate; change it there and hit Apply. If you do not see 144 Hz, then under "3. Apply the following settings." check "Use NVIDIA color settings", select YCbCr422 as the Output color format, and hit Apply. Now 144 Hz should show up; select and apply it. Also, right-click the desktop again and go to Display Settings, then turn on "Play HDR games and apps" to take advantage of HDR if you want. (If you would rather script the refresh-rate change, see the sketch after this list.)
  4. Right, High Dynamic Range in photography has been around for a while now. Interestingly, though, what you guys are describing is quite different from what HDR displays can offer. You are both right that Smart HDR on iPhones, like most HDR photography, uses the sensor to capture detail in the darkest shadows and the brightest highlights. The image is then "flattened" so it displays less dark and less bright, but with the detail of both extremes preserved (tone mapping into a standard dynamic range; see the sketch after this list). This is also called HDR, yes, but it is very different from what happens in video and games, where the contrast range is expanded, not compressed. All of this is very confusing. I just found out that NVIDIA can capture HDR images (not photography HDR, but HDR10), and they actually use the OpenEXR format I mentioned before, so the image is not brought down into standard dynamic range and keeps the super-bright and super-dark values.
  5. HDR monitors increase the peak light output (nits) and/or deepen the shadows. To prevent color banding, the panel is either 10-bit or an 8-bit panel uses a lookup table (see the quantization sketch after this list). So far that is my understanding, and I can see the increased contrast and vibrant colors of HDR in video and games. But what about still images? I did some research and found here that some image formats - OpenEXR, Radiance (.HDR), DirectDraw Surface (.DDS), and JPEG XR - can store the increased contrast and wide color gamut in still images. Web browsers can display HDR video, but they don't seem to support these file formats or take advantage of the increased nits and WCG in images that could carry this information, for example a screenshot of an HDR video or game. What gives? Balubish Tech shows in this video that just by encoding a video in HDR there is an increase in highlights and shadows (contrast), even if the source was SDR. Whether he is right or wrong, I do not care, but it got me thinking that it is the image encoding/format that is needed to make HDR images. Even though that encoding seems easy to achieve in Premiere Pro, it does not seem to be as straightforward or well supported in Photoshop. Furthermore, wouldn't any 16-bit RAW photograph be enough to encode/display a wide color gamut? Then the only thing the image would need is the peak brightness. Why do we have HDR video and games but no HDR still images? Cheers!
  6. Thanks, everyone, for replying! It seems that some of you share the opinion that, for roughly the same price, upgrading to the next-tier CPU/GPU or even more RAM yields more performance than investing in an overclockable (Intel) motherboard and fancy cooling. I also agree with the people who said that if overclocking is already possible, why not just do it - as with AMD systems, GPUs, and any Intel hardware you already own that allows it. Thanks, everyone! D
  7. To me, it seems that with the savings from cooling and an overclockable motherboard when building a low- to mid-tier PC, I could upgrade to the next-tier processor or GPU instead. So is overclocking only worth it for high-end hardware? Is the gain from a CPU/GPU upgrade higher than from overclocking? What are your thoughts? Thanks!
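A scripted alternative to the Control Panel steps in post 3: the sketch below changes the primary display's refresh rate through the standard Windows display-settings calls. It is only a minimal sketch, assuming Windows with the pywin32 package installed; it does not touch the NVIDIA-specific color-format setting, which still has to be changed in the NVIDIA Control Panel.

```python
# Minimal sketch: switch the primary display to 144 Hz via the Windows
# display-settings API. Assumes Windows and the pywin32 package.
import win32api
import win32con

# Read the current mode of the primary display.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"Current: {devmode.PelsWidth}x{devmode.PelsHeight} "
      f"@ {devmode.DisplayFrequency} Hz")

# Ask for 144 Hz at the current resolution.
devmode.DisplayFrequency = 144
devmode.Fields = win32con.DM_DISPLAYFREQUENCY

result = win32api.ChangeDisplaySettings(devmode, 0)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print("Refresh rate changed to 144 Hz.")
else:
    # The mode is rejected if the display/cable cannot carry 144 Hz at the
    # current resolution and color format - the reason dropping to
    # YCbCr 4:2:2 in the NVIDIA Control Panel frees up enough bandwidth.
    print(f"Change failed (code {result}).")
```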
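On post 4, here is a rough illustration of the "flattening" described there: mapping scene values brighter than the display can show back into standard dynamic range. It is only a sketch using the simple global Reinhard operator, not what any particular camera or phone actually ships, and it assumes linear-light luminance where 1.0 is SDR reference white.

```python
# Sketch of photographic-style tone mapping: compress an HDR luminance
# range into [0, 1] so it fits a standard-dynamic-range display.
# Assumes linear-light values where 1.0 is SDR reference white.
import numpy as np

def reinhard_tonemap(luminance: np.ndarray) -> np.ndarray:
    """Global Reinhard operator: L / (1 + L).

    Very bright values are compressed hard (10.0 -> ~0.91) while shadows
    are left almost untouched (0.01 -> ~0.0099), which is why detail from
    both extremes survives in the flattened SDR image.
    """
    return luminance / (1.0 + luminance)

# A toy "scene": deep shadow, mid grey, SDR white, and a 10x-white highlight.
scene = np.array([0.01, 0.18, 1.0, 10.0])
print(reinhard_tonemap(scene))  # [0.0099 0.1525 0.5 0.9091] - all fit in SDR
```

An HDR10 signal (or an OpenEXR file) skips this compression and keeps the values above 1.0, which is the difference post 4 is pointing at.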
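On post 5, a quick way to see why bit depth matters for banding: quantize a smooth gradient at 8 and 10 bits and compare the step sizes. This is only a sketch; it uses a plain linear ramp rather than the PQ transfer curve that HDR10 actually specifies.

```python
# Sketch: quantize a smooth gradient at 8-bit vs 10-bit and compare the
# number of distinct levels and the size of each step. Bigger steps over
# a smooth gradient show up as visible bands.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)   # e.g. a smooth sky gradient

for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = np.round(gradient * levels) / levels
    print(f"{bits}-bit: {len(np.unique(quantized))} distinct levels, "
          f"step size = {1 / levels:.5f}")

# 8-bit  -> 256 levels, steps of ~0.0039
# 10-bit -> 1024 levels, steps of ~0.0010 (about 4x finer)
# Spread the same 8-bit steps over the much larger brightness range of an
# HDR display and the bands become obvious - hence 10-bit panels, or an
# 8-bit panel with a lookup table plus dithering.
```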