About Glenwing

  1. The waste is the difference in price, so it does matter whether it's free. If two monitors are exactly the same except one has extra capabilities, you probably don't want to pay more for that one if you aren't going to use those capabilities; if it's $200 vs. $250, that's a waste of $50. But if they're the same price, the waste is $0. In that case it would actually be a waste *not* to get it: you get more for the same money, and even if you don't plan to use the extra features, there's no money to be saved by skipping them, so you gain nothing and lose future flexibility for no reason. FreeSync monitors, from what I've seen, cost pretty much the same as equivalent non-FreeSync monitors. The first few FreeSync monitors may have carried a small premium for a few months, and you might still see some of that here and there, but the vast majority of newer FreeSync monitors don't even have a direct non-FreeSync version, because the manufacturer would either have to charge the same price (pointless) or accept a lower profit margin, since removing FreeSync doesn't lower the production cost; it doesn't add any cost in the first place. I get that not fully utilizing something you pay for instinctively feels wrong, but put it this way: if you're buying software and, for some reason, the basic version and the premium version are both $50, but you won't use any of the premium features, does it really make sense to call buying the premium version "a waste because you'd be paying for features you won't use," and go buy the basic version instead?
  2. It's a measure of the pixel area; it doesn't include the borders of the screen, which differ between monitors. If you want to match two monitors in height, though, the pixel area is what you'll actually care about when using the computer, not the height of the frame. Also, look for more detailed size descriptions than "24 inches" with no decimal places. The label 24" is used for anything from 23.6" to 24.1"; most 24-inch 16:9 monitors are 23.8" (for IPS monitors) or 24.0" (for TN monitors).
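The sizing point in reply 2 can be made concrete: given a diagonal and an aspect ratio, the width and height of the pixel area follow from the Pythagorean theorem. A minimal sketch (the function name is my own; 23.8" is the typical "24-inch" IPS panel size mentioned above):

```python
import math

def screen_dimensions(diagonal_in, aspect_w=16, aspect_h=9):
    """Return (width, height) in inches of the active pixel area."""
    # diagonal^2 = width^2 + height^2, with width:height = aspect_w:aspect_h
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

w, h = screen_dimensions(23.8)
print(f"{w:.2f} x {h:.2f} in")  # ~20.74 x 11.67 in
```

So a "24-inch" 23.8" panel and a "24-inch" 24.1" panel differ by a few tenths of an inch in height, which is why the decimal places matter when matching monitors.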
  3. That's what I recommend, but if people are going to use shorthand they may as well use it correctly.
  4. NVIDIA manufactures GPUs, not graphics cards (generally speaking). ASUS, EVGA, etc. manufacture graphics cards which have NVIDIA GPUs on them, plus other stuff (memory, power delivery components, and so on). Sometimes NVIDIA does manufacture a whole graphics card, like various Titan cards, in which case they are indeed just sold as "NVIDIA GeForce XYZ," purchased from the NVIDIA website. Sometimes you can also find NVIDIA-branded reference cards sold at brick-and-mortar stores like Best Buy. It's the same reason you see ASUS motherboards and MSI motherboards that use an Intel CPU; you don't expect to just see "Intel motherboards," because Intel doesn't make the whole board, just the CPU.
  5. Please don't bump old threads to ask a new question, create a new thread.
  6. The PB278Q is not a 144 Hz monitor, it came out in 2012 which is long before 144 Hz IPS tech even existed.
  7. It's just another CPU. 1800X is better than the 1800 which is better than the 1700. But there is no special relationship between the 1800 and 1800X compared to the 1700. The X is just creating some in-between model numbers. It doesn't mean anything in particular, just that it is a higher model number than the non-X.
  8. Did you switch the inputs on the monitor?
  9. It's not something widely tested (and it's hard to test beyond just eyeballing it). There isn't a huge amount of variation between monitors, though there is some; I don't know whether any particular manufacturer or series scales better than others. It should be fine; scaling isn't really a big deal for gaming or watching content. Interpolation makes document-size text much harder to read, but it doesn't have much effect on images.
  10. Yes, almost all displays can scale images when the input resolution is lower than the monitor's native resolution, and some monitors are better at it than others. On computers, though, scaling is usually handled by the GPU and is therefore independent of the monitor. You can turn off GPU scaling in your graphics card's control suite, but the default is usually to use it.
  11. If that's how it looks in real life it looks pretty normal to me.
  12. ShadowPlay has existed since long before Maxwell was even released. It's supported on 600 series cards and above (650 Ti Boost and higher).
  13. If you have both of the displays already, and you have the PS3, can't you just plug it into each one and see which one looks better to you?
  14. Use 8 bpc with 4:4:4.
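To see why 8 bpc with 4:4:4 is the usual recommendation, it helps to look at the bandwidth involved. A rough sketch of the uncompressed data rate (active pixels only; the function name and per-chroma component counts are my own illustration, and real links also carry blanking and encoding overhead, so actual requirements are higher):

```python
def data_rate_gbps(h_px, v_px, refresh_hz, bpc, chroma="4:4:4"):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    # 4:4:4 carries full color for every pixel (3 components per pixel);
    # 4:2:2 averages 2 components per pixel, 4:2:0 averages 1.5.
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return h_px * v_px * refresh_hz * bpc * components / 1e9

# 4K 60 Hz at 8 bpc 4:4:4 vs. 10 bpc 4:4:4
print(data_rate_gbps(3840, 2160, 60, 8))   # ~11.94
print(data_rate_gbps(3840, 2160, 60, 10))  # ~14.93
```

Going above 8 bpc raises the data rate, which can force the link to fall back to chroma subsampling; 4:4:4 keeps full color detail, which matters most for text rendering.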