RTX2060Owner

Member
  • Content Count

    13
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About RTX2060Owner

  • Title
    Newbie


  1. Do you think I can keep on using my monitor for the next 10 years and only ever upgrade the PC, then?
  2. But 1680x1050, say, used to be an actual resolution, and nobody uses it anymore; it seems a little weird looking back at it. Will the same happen to 1440p?
  3. So it's okay if I use 2560x1440? It's not unorthodox because movies don't use it? Or weird? Or will I be looking at myself 20 years from now, thinking "Jesus Christ, was I really using such a weird resolution back then?" P.S. Yeah, I know some movies are on film, but you have to create a Digital Intermediate so the consumer can watch it.
  4. Hey. I have a 1440p widescreen monitor. It's called the MSI G27CQ4. I got it some 4 months ago, and I am quite satisfied. The 1440p resolution is definitely an upgrade over 1080p. Usually, I don't even need to keep apps (aside from games) full screen, as there are so many pixels on the screen that two thirds of it is more than enough to see everything I need. But come to think of it, this resolution is pretty "weird", in a way. Bear with me. Movies don't use it; they are either in 1080p (either the 1920 or the 2048 version) or 4K (again, either 3840 or 4096), as shown in the pixel-count sketch after this list. While 1440p is a less than 2x of an
  5. So you think I am already running it in WCG? As in, I am fine? Steam says I am running 32 bits per pixel; does that mean WCG is enabled? (See the bit-depth sketch after this list.)
  6. Ah, okay. I thought that if there were insufficient funds, the card would place the owner in debt, as opposed to straight up being declined. Maybe that's different in the US. Or maybe she is using a card other than a credit card.
  7. Hey guys. My monitor is the MSI G27CQ4. It doesn't have HDR, but I think it has Wide Colour Gamut; the label on the frame says so, and so does MSI's page. When I go into Settings > System > Display, I cannot enable WCG. It says I cannot use apps in the WCG format. Before you ask, yes, I have two different monitors connected to the same PC, but that's not what the problem stems from, as I have the correct one selected. Is it possible to enable WCG on a monitor without HDR? Why does it say it has WCG if I can't turn it on? Or is it turned on automatically? What do you think?
  8. Why would more detail be bad? You mean artifacting? The 4K TVs are not actually mine, but my parents', and I thought I could avoid bothering them by connecting the Blu-ray drive to my monitor instead :) I don't know if you read my post... I already have a PS3. As a matter of fact, I also have a dedicated 1080p Blu-ray player, but I am not using it, since when connected to my monitor the image had an awful pink tint, while the PS3 looks totally fine, for some reason.
  9. I am specifically talking about movies stored on discs, as in not Netflix or anything. But you can also say something about streamed movies, if you want to. I have been thinking about purchasing a 4K Blu-ray player, so that I can watch my favourite movies in high definition (as opposed to DVDs, which have low resolution, or streaming, which is compressed and usually doesn't offer 4K resolution either). I have heard really good things about "The Lord of the Rings" (my childhood movies) in 4K, so I am really excited about it. My monitor is 2560x1440, with no HDR
  10. Guys, I don't know why this conversation strayed to 4K. I am obviously aware that my GPU is quite insufficient for 4K. And that's fine, because it wasn't meant for 4K. Nvidia may have marketed it as a 4K-capable card, but anyone with a brain could tell it wasn't. I believe the 2080 Ti averages 50 FPS in RDR2 on max settings at 4K, so naturally, a card with a -60 suffix isn't meant for that resolution. I just wanted to say his reaction was definitely overblown. This is a really decent card for widescreen 1080p and 1440p, so he definitely went harsh on it. This isn't the
  11. Sorry, I worded that incorrectly. Yes, the 2060 is bad for 4K gaming with newer titles (unless you are fine with 30 FPS, in which case it's great). However, it's obviously not intended for that resolution at that price point. That's like comparing a laptop that's intended for office work to a gaming desktop.
  12. Hey, I am new here! In this video, Linus heavily criticises the RTX 2060. He mocks it by saying it's a crap card for 4K and basically a waste of money. Why? I own it, and it's a pretty good card, in my opinion. I game at 2560x1440, and I am able to maintain very good FPS. I play RDR2 on a mixture of medium and high settings, and I am hitting 60-70 FPS. With lighter titles like Pillars of Eternity, CS:GO, Grim Dawn, etc. I am hitting 100-165 FPS (my refresh rate), either at Ultra or on a mix of Medium and High. The Witcher 3 runs at 120-140 FPS with all post processing
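
A quick sketch for the pixel-count comparison in item 4. This is just back-of-the-envelope Python; the resolutions are the ones already named in the posts (the 1920- and 2048-wide "1080p" versions, the 3840- and 4096-wide "4K" versions, and 2560x1440), not figures from any spec sheet.

```python
# Pixel counts and scale factors for the resolutions mentioned in the posts.
resolutions = {
    "1080p (HD)":  (1920, 1080),
    "2K DCI":      (2048, 1080),
    "1440p (QHD)": (2560, 1440),
    "4K UHD":      (3840, 2160),
    "4K DCI":      (4096, 2160),
}

base_pixels = 1920 * 1080  # plain 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:12s} {w}x{h:<5d} {pixels:>9,d} px  "
          f"{pixels / base_pixels:.2f}x the pixels of 1080p")
```

Running it shows 1440p at roughly 1.78x the pixels of 1080p while 4K UHD lands at exactly 4x, which is the sense in which 1440p sits in an in-between spot that movie masters skip.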
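For the "32 bits per pixel" question in item 5: bit depth and colour gamut are separate properties, and this small sketch makes the distinction concrete. It assumes the usual desktop layout where 32 bpp means 8 bits each for red, green and blue plus 8 alpha/padding bits; that is an assumption about typical desktop modes, not something stated in the posts.

```python
# Bit depth only sets how many levels each colour channel can take;
# it says nothing about how wide the colour gamut is.
def shades_per_channel(bits_per_channel: int) -> int:
    """Distinct levels a single colour channel can represent."""
    return 2 ** bits_per_channel

for bpc in (8, 10):
    levels = shades_per_channel(bpc)
    total = levels ** 3  # combinations of R, G and B
    print(f"{bpc}-bit per channel: {levels} levels, {total:,} colours")

# A wide-gamut panel spreads those same levels over a larger colour space
# (e.g. something closer to DCI-P3 than sRGB), so a monitor can be wide
# gamut in hardware while the desktop still reports a plain 8-bit,
# 32 bpp SDR mode.
```

So a 32 bpp readout in Steam only tells you the channel depth, not whether the wide gamut is in use; as far as I understand, the Windows WCG toggle only unlocks on displays that report advanced-colour/HDR support, which would explain why it stays greyed out here, but treat that part as a guess.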