TheKDub
Member
Content Count: 12,881

Everything posted by TheKDub

  1. That's very inaccurate. Many components are made in Taiwan, Japan, and Korea, including things like monitors, CPUs, GPUs, SSDs, and RAM. The Wuhan Coronavirus won't have much impact on PC component prices, if any, and if it did, it wouldn't last for long.
  2. I'd go with the 3900X regardless. The 9900K isn't worth the extra $80 (plus however much more for an adequate cooler, which will probably be at least another $90) for maybe a few extra FPS. Next-gen Ryzen probably won't be here until late this year, so I wouldn't bother waiting unless you don't really need the PC until close to next year.
  3. What if it's supposed to be there to protect the circuitry? I wouldn't just go scrubbing/scraping off things without much reason (such as cleaning a specific area to fix it, or cleaning corrosion.)
  4. Not possible. Also, SLI and Crossfire aren't worth it (at least for gaming). So few games support them that they're not worth the added cost, and you might even end up with more bugs or worse performance, since some games react really poorly to SLI and/or Crossfire.
  5. It shouldn't take much force. Maybe as much as you'd use when writing with a pen or pencil, if that.
  6. Isopropyl alcohol if you do. It won't "dissolve" the dust, but it can help to clean it off. Make sure it sits out for a few hours to ensure all of the alcohol (and water) has evaporated fully before re-connecting to any power sources. Just get some canned air and blast the dust away. You can also use a soft bristled brush (think small paintbrush) to help loosen some stuff that's more caked in. I've also used Q-Tips with isopropyl alcohol to remove dust and remnants of spilled soda from electronics before.
  7. The 3900X is $470. The 9900K is $550 ($80 more). The 9900K is only slightly more powerful in single-threaded applications, but the 3900X dominates it in anything that can use more than 16 threads. If you overclock the 9900K, the gap grows a little more, but keep in mind you're looking at over 200W of power draw under heavy loads, and even higher when overclocked. The 3900X, on the other hand, tops out around 150W. With that added power draw, you also have a ton more heat to deal with. $80 less for 50-100W less power consumption and better (multi-threaded) performance? The 3900X sounds like a great deal to me. I'd go with it without question.
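The quick math in that post can be sketched out like this (the prices and wattages are the rough figures quoted above, not measured values):

```python
# Rough value comparison using the figures quoted in the post:
# street prices and approximate full-load power draw at stock clocks.
cpus = {
    "Ryzen 9 3900X": {"price": 470, "load_watts": 150},
    "Core i9-9900K": {"price": 550, "load_watts": 200},  # higher still when overclocked
}

price_delta = cpus["Core i9-9900K"]["price"] - cpus["Ryzen 9 3900X"]["price"]
watt_delta = cpus["Core i9-9900K"]["load_watts"] - cpus["Ryzen 9 3900X"]["load_watts"]

print(f"9900K costs ${price_delta} more and draws roughly {watt_delta}W more under load")
```

And that's before adding the cost of an adequate cooler for the 9900K, which only widens the price gap.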
  8. The Ryzen 7 2700X doesn't support PCIe 4.0, but don't worry about it; there won't be a performance difference between the GPU running at PCIe 3.0 and PCIe 4.0. For the most part, the only things that can really take advantage of PCIe 4.0's extra bandwidth are SSDs.
  9. If you're planning on using them for gaming, absolutely not. Crossfire (and SLI) are both essentially dead. Very few games still support it, and the performance gains when they do aren't worth the added component cost, heat output, and power consumption.
  10. The physical size of a monitor has no effect. All V-Sync does is cap your FPS so you don't get screen tearing or excessive load on your GPU (though sometimes you want your FPS as high as you can get it, so you're always seeing the most recent frame possible, which makes the game feel more responsive).
  11. Download the one specifically for your computer. You're probably gonna need a different computer to download it if you can't use the internet on the current one, and a flash drive to transfer over the drivers.
  12. Just noticed in that screenshot, and the first one. You have an A4-6300, not an FX-6300. The A4-6300 is a "dual core" CPU (Two cores in one physical module, which is why Task Manager shows 1 core but 2 logical processors.)
  13. Right-click the graph, go to "Change graph to," then "Logical processors." Just took a closer look. Something's definitely fucked. Try clearing the CMOS?
  14. Well, yeah. The Rosewill Glacier is rated C+ tier, and I'm pretty sure Rosewill doesn't have a track record of making decent-quality units even at that. I'd suggest going with something at least B+ tier for your replacement.
  15. That wouldn't matter. Double-check the drive connections. Also take a look at what format the drive is using. For Windows, NTFS and exFAT are the two you'd want (not FAT32, as that has a 4GB file size limit).
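The FAT32 limit mentioned above comes from the filesystem storing file sizes in a 32-bit field, so the largest possible file is 2^32 - 1 bytes (just under 4 GiB). A quick sketch of the check (the 6 GiB file is a hypothetical example):

```python
# FAT32 records file sizes in a 32-bit field, so the largest
# possible file is 2**32 - 1 bytes (just under 4 GiB).
FAT32_MAX_FILE_SIZE = 2**32 - 1  # 4,294,967,295 bytes

def fits_on_fat32(size_bytes: int) -> bool:
    """Return True if a file of this size can be stored on FAT32."""
    return size_bytes <= FAT32_MAX_FILE_SIZE

# Hypothetical example: a 6 GiB video file won't fit on a FAT32 drive.
video_size = 6 * 1024**3
print(fits_on_fat32(video_size))  # False
```

This is why copying a large file to a FAT32-formatted drive fails even when the drive has plenty of free space; NTFS and exFAT don't have this limit in practice.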
  16. Yep, they've been doing this for years. The best solution is to not use any of their garbage equipment in the first place and just buy your own.
  17. Without a 25mm fan, there would be 340mm of clearance for a GPU. If your GPU is 328mm, you have 12mm remaining. In theory, yes, you could put a 12mm fan in that gap, but you'd have to get it into the gap and attach it, and then hope there's just enough wiggle room that the GPU isn't touching it. A subpar fan is still better than no fan, in most cases. "Esport title with esport settings" is incredibly vague; some games are much harder to run than others. An RTX 2070 Super (or RX 5700 XT) would likely do just fine for your purposes. The Gaming X Trio has a third fan and slightly better cooling, but both are fine. I'd personally go with EVGA, as their quality and support are top-notch from what I've heard. (Or, if you were to go with a 5700 XT, which performs roughly the same and costs less, Sapphire would be the way to go; either the Pulse or the Nitro+ {the Nitro+ is better}.)
  18. Why do you "need to get every letter dedicated to vars"?
  19. It's in the #minecraft-info channel on Discord.
  20. 37°C is fine. As long as your load temps aren't going over 80°C or so, there's not much reason to worry.
  21. Interesting. Maybe it's already doing 10-bit then, just not "HDR"?
  22. Not quite. The MacBook would see what the monitor is capable of over that connection and then display whatever options are compatible with it. Most likely, yes. By the sound of it, there's not a huge difference between having "HDR" (not actual HDR) enabled or disabled on this monitor. Probably the biggest difference you'd see is sometimes smoother gradients.
  23. It's likely not even your GPU that's the bottleneck. How much RAM are you allocating (I'd do about 4GB)? What version of MC are you using? Do you use OptiFine (I'd recommend it if you don't)? How's your CPU usage while running Minecraft? Is your FPS being limited in-game or in the Nvidia Control Panel? What's your render distance set to? Have you tried updating your graphics drivers? Also make sure your system power plan is set to Performance and not Balanced or some trash like that.
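On the RAM-allocation point above: the Minecraft launcher passes the heap size to the JVM through its "JVM Arguments" field. A minimal sketch using the standard HotSpot flags (the 4GB cap matches the suggestion above; the 2GB starting size is an assumption):

```text
# Example JVM Arguments for the Minecraft launcher:
# -Xmx sets the maximum heap size, -Xms the initial heap size.
-Xmx4G -Xms2G
```

Allocating much more than this usually doesn't help and can even hurt, since a larger heap tends to mean longer garbage-collection pauses (which show up as stutters in-game).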
  24. Don't worry too much about it. There are a ton of variables that play into how many FPS you get. I get anywhere from 100 to 600 FPS with my i7 8700K OC'd to 4.8GHz and GTX 970. As long as you're not dipping below your monitor's refresh rate, it's not that big of a deal, especially in something like Minecraft. (If this were competitive CS:GO or Overwatch or some other FPS where milliseconds can make or break a shot, then sure, but MC? Nah.)