
About Rdpruitt


  • CPU
    Intel Core i9-9900k
  • Motherboard
    Gigabyte Z390 Designare
  • RAM
    Corsair Vengeance Pro 32GB 3200MHz
  • GPU
    Gigabyte RTX 2080 Ti Gaming OC
  • Case
    Corsair 500D SE
  • Storage
    Samsung 970 Evo 2TB M.2 + 8TB HDD
  • PSU
    Corsair AX 860
  • Display(s)
    Garbage (waiting for a BFGD to upgrade)
  • Cooling
    Corsair H115i Platinum
  • Keyboard
    Corsair K95 Platinum
  • Mouse
    Corsair Dark Core SE
  • Operating System
    Windows 10 Pro 64-bit


  1. Doesn’t...doesn’t everyone use that? It’s a perfectly usable word, and it easily gets the OP’s point across.
  2. Well, technically, my card never gave out; it was just super slow. Triple-A games at 720p low could maybe, maybe, get upper-teens to low-20s fps. The worst part was when I bought big games like Monster Hunter: World when they went on sale, knowing full well I couldn’t play them until I got a better GPU. Had that in my library for months before I could play it. Edit: and your 550 Ti was way more than double the performance of my 5570. So take that. My card sucked more.
  3. I’m seeing all these people saying that it’s time to upgrade their 1060s/1080s to a 20 or 30 series card, while I’m sitting here, having upgraded my only PC from an AMD 5570 to a 2080 Ti last year. Of course that’s an extreme, but I don’t think every generation, or even every other generation, is worth upgrading from. How often do you guys upgrade?
  4. Oooooohhhhhhh, I guess I never realized the "brought to you by". That makes so much sense. Now I feel dumb lol
  5. This has confused me for a while, and I've always forgotten to ask, but I finally remembered today. On the TechLinked channel, in nearly every video, the host says "and now time for the quick bits" and then gives the sponsor of the video. But then after the sponsor, they say "time for quick bits" again! I'm so confused. I know it's a stupid post, but whatever.
  6. So a while ago I heard about something called Nvidia Optimus in laptops with Nvidia cards. How can I tell if a laptop I want will have this feature? Or do all Nvidia cards automatically have it? And does it only work with Intel CPUs, or do AMD ones work as well? Speaking of AMD, do their cards have a feature similar to this?
  7. Banned for not giving the specs of the computer you're trapped in
  8. Thanks for the info. With 10-bit color, uncompressed, DP 1.4 can only do 4K at 98Hz, right?
  9. So after all this time of having DP 1.4, this really is the first monitor with it? And can GeForce cards use it?
  10. So Asus announced a 4K/120Hz/FreeSync 2/HDR 600/43" monitor at CES, the Strix XG438Q. Recently, Asus announced a 4K/144Hz/FreeSync 2/HDR 1000/43" version, probably because of the unexpected competition from Acer. Physically, it looks just like the XG438Q. We don't know if this is a premium version, or if the base version has been upgraded. The thing that confuses me, and what my question is about, is that Asus said that the newly announced 144Hz/1000-nit version is the first monitor ever to have DSC. And this website (https://www.overclock3d.net/news/gpu_displays/asus_rog_and_amd_reveals_the_w
  11. Sorry, I forgot to write my question. It's leaked that this will cost around $1299. That seems like a pretty good price when compared to the Asus PG27UQ. The only real advantage that the PG27UQ has is G-Sync. Why is the Acer CG437KP so 'cheap'? Are there any drawbacks I haven't noticed?
  12. This “monitor” looks perfect! I’ve been waiting for the XG438Q from Asus, which is 4K, 120Hz, HDR 600, and FreeSync 2. The Acer CG437KP is 4K, 144Hz, HDR 1000, and has FreeSync. The Acer one seems better in every way, AND it's (supposedly) cheaper than the estimated price of the XG438Q. The only downsides I can see are that it doesn't have wall mounting, and it only has normal FreeSync, not FreeSync 2, which doesn't matter for me because I have an Nvidia card. (Nvidia cards can't take advantage of FreeSync 2, right? If you use a FreeSync 2 monitor, it won't be any different than a normal
  13. So, I've been seeing things everywhere about the 'ray-traced Minecraft', and everyone arguing that it's path tracing, not RTX. Even though it isn't part of the official RTX brand, would these path-tracing games/shaders utilise the RT cores in an RTX card?
  14. Honestly, the notch isn't that big of a deal. I have an XR and thought the notch would get annoying, especially while playing games or watching videos, but it's really no problem.
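The "DP 1.4 can only do 4K at 98Hz" figure in post 8 can be roughly sanity-checked with a quick calculation. This is only a sketch under stated assumptions, not from the posts themselves: it assumes an HBR3 link (32.4 Gbps raw, about 25.92 Gbps effective after 8b/10b encoding) and reduced-blanking frame totals of roughly 3920 × 2222 pixels; the exact ceiling depends on the blanking timings the monitor actually uses.

```python
# Rough ceiling on refresh rate for uncompressed 4K 10-bit over DP 1.4.
# Assumed values (not from the posts above): HBR3 link with 8b/10b
# overhead, and reduced-blanking totals of about 3920 x 2222 pixels.

EFFECTIVE_BANDWIDTH = 25.92e9   # bits/s: 32.4 Gbps raw minus 8b/10b overhead
H_TOTAL, V_TOTAL = 3920, 2222   # active 3840x2160 plus assumed blanking
BITS_PER_PIXEL = 30             # 10 bits per channel, RGB

# Max refresh = link bandwidth / bits needed per frame
max_hz = EFFECTIVE_BANDWIDTH / (H_TOTAL * V_TOTAL * BITS_PER_PIXEL)
print(f"Max refresh at 4K 10-bit, uncompressed: ~{max_hz:.0f} Hz")
```

With these assumed timings the result comes out just under 100 Hz, which is consistent with the ~98Hz figure quoted in the thread.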