Glenwing

Moderator
  • Content count
    13,377
  • Joined
  • Last visited

Reputation

  • Agree 716
  • Informative 491
  • Funny 109
  • Thumbs Up 480
  • Likes (Old) 8014

About Glenwing

  • Title
    Mostly Dead

Profile Information

  • Gender
    Not Telling
  1. HDR10 is a standard, not a spec. The HDR10 standard requires displays to have several capabilities: the actual HDR part, plus some other things like 10 bpc color depth and recognition of certain color spaces. Color depth isn't related to HDR; it's just an extra capability that that particular standard requires displays to have, alongside the HDR capability itself. The actual HDR part of the HDR10 standard is a set of specific brightness/contrast capabilities and a special gamma function to make proper use of them (there's a rough sketch of that gamma curve at the end of this list).
  2. RGB and YCbCr 4:4:4 are the two full-color modes. RGB is preferable since it's the native pixel format for computers. A lot of TVs, when they detect an RGB signal, will automatically turn off a lot of the image processing crap that can interfere with computers displaying properly.
  3. Just a note, the "4K" notation refers to the width in pixels, so "4K but wider" would be 5K. 5120×2160 is a 5K resolution, and 6880×2880 would be considered a 7K resolution by most.
  4. Yes
  5. Using a DP to HDMI 2.0 adapter should allow you to use full color.
  6. No, and even if you managed to find one, the ground pin would be purely decorative. The C7 connector only has two pins; it doesn't have a ground pin, so there's nothing for a ground pin on the AC side to connect to. That type of plug is an ungrounded connection by design. You would only have a ground pin if the device used the C5 connector, which is the grounded version of the C7.
  7. I know how upscaling works, but the loss in sharpness isn't really that much on a 4K screen.
  8. Not really. At that high of a resolution it can reproduce pixels quite well. Personally I can't tell the difference between 1440p and 4K in games on a 4K screen other than the UI size. It also depends on how the display handles upscaling though.
  9. At 4K? Not possible; 4K 60 Hz already maxes out the bandwidth of an HDMI 2.0 connection (the bandwidth math is sketched out at the end of this list). If it supports 120 Hz, it will only be at lower resolutions like 1080p.
  10. There aren't many budget 144 Hz monitors with slim bezels. Go for the XG2401 if you're willing to sacrifice that.
  11. Ah, missed that post.
  12. Downscaling from a resolution higher than the display's native resolution will always look blurry.
  13. You restart after?
  14. So uh, I dunno what's going on with the math in this thread... If you have a PC running for 6 hours a day for a year, that's 6 × 365 = 2,190 hours of running per year. Consuming 500 W of power while running means it's consuming energy at a rate of 500 watt-hours per hour, so over 2,190 hours that's 500 Wh/h × 2,190 h = 1,095,000 Wh of energy used (1,095 kWh) over the course of the year. If each kWh costs $0.13 and you have 1,095 of them, the total cost is $0.13 × 1,095 = $142.35 per year. I don't know why there's so much division in the OP; you shouldn't need anything but multiplication.

    If you're running 24 hours a day as miners do, rather than 6, it's four times as much. At $0.13 per kWh, it costs you $113.88 per year for every 100 W of power your system uses. If you're mining then electrical costs will likely be higher than that, since you usually start paying at a higher rate once your usage is high. If you can get a card that uses 100 W less, you'll save hundreds every year running 24/7; it shouldn't take too long to pay for itself. (The same numbers are run through in a short script at the end of this list.)
  15. WIN+R -> "msconfig" -> Boot tab -> Advanced options -> make sure "Number of processors" is unchecked.
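
A few quick sketches to go with the posts above. First, for #1: the "special gamma function" HDR10 uses is the SMPTE ST 2084 "PQ" curve, so here's a minimal sketch of it in Python (the constants are the published ST 2084 values). It just maps a normalized 0–1 signal value to an absolute luminance in nits.

```python
# Rough sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# Maps a normalized signal value (0.0-1.0) to absolute luminance in cd/m^2 (nits),
# up to the 10,000-nit ceiling the curve is defined for.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Signal (0..1) -> luminance in nits, per the ST 2084 EOTF."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# A 10 bpc code value of 512 (mid-scale) lands around 92 nits,
# while full scale (1023) is the 10,000-nit ceiling.
print(pq_eotf(512 / 1023))   # ~92
print(pq_eotf(1.0))          # 10000.0
```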
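
For #9, the bandwidth math, assuming the standard CTA-861 timing for 3840×2160 @ 60 Hz (594 MHz pixel clock) and HDMI 2.0's 18 Gbit/s link (14.4 Gbit/s of actual data after 8b/10b encoding). Different blanking assumptions shift the numbers a little, but not enough to change the conclusion.

```python
# Rough bandwidth check for HDMI 2.0. Assumes the standard CTA-861 timings
# (actual blanking varies by mode, but these are the common fixed TV timings).

HDMI_20_DATA_RATE = 18e9 * 8 / 10   # 18 Gbit/s raw TMDS, 14.4 Gbit/s after 8b/10b

def data_rate(h_total, v_total, refresh_hz, bits_per_px=24):
    """Uncompressed data rate in bit/s for a given total (active + blanking) timing."""
    return h_total * v_total * refresh_hz * bits_per_px

def fits(rate):
    return f"{rate / 1e9:.2f} Gbit/s ({'fits' if rate <= HDMI_20_DATA_RATE else 'too much'})"

# 3840x2160 @ 60 Hz, 8 bpc RGB -> 4400x2250 total, 594 MHz pixel clock
print(fits(data_rate(4400, 2250, 60)))    # ~14.26 Gbit/s, just barely fits in 14.4
# The same timing at 120 Hz needs roughly double that
print(fits(data_rate(4400, 2250, 120)))   # ~28.51 Gbit/s, not happening on HDMI 2.0
# 1920x1080 @ 120 Hz (2200x1125 total) is comfortable
print(fits(data_rate(2200, 1125, 120)))   # ~7.13 Gbit/s
```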
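
And for #14, the same arithmetic as a few lines of Python, using the figures from that thread (500 W, 6 hours/day, $0.13 per kWh).

```python
# Electricity cost: power draw (W) x hours of use gives energy in Wh; the only
# "division" is the Wh -> kWh unit conversion, then multiply by the price per kWh.

PRICE_PER_KWH = 0.13   # $/kWh, the rate used in that thread

def yearly_cost(watts, hours_per_day, price_per_kwh=PRICE_PER_KWH):
    hours_per_year = hours_per_day * 365           # e.g. 6 x 365 = 2,190 h
    energy_kwh = watts * hours_per_year / 1000     # Wh -> kWh
    return energy_kwh * price_per_kwh

print(yearly_cost(500, 6))    # ~142.35 -> $142.35/yr for 500 W at 6 h/day
print(yearly_cost(500, 24))   # ~569.40 -> four times as much running 24/7
print(yearly_cost(100, 24))   # ~113.88 -> the per-100 W, 24/7 figure
```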