maximumzero

Member
  • Posts

    46
  • Joined

  • Last visited


maximumzero's Achievements

  1. Yeah, curious how this turned out in the end? Don't think I'd want to risk my GPU, but I'm intrigued nonetheless.
  2. Keeping in mind here that the 16GB 3600/16 is the "old" RAM, and the 32GB 3200/16 is the "new" RAM, are you still recommending sticking with the old? To be fair, I don't think either kit is on my motherboard's QVL.
  3. Hey guys & gals, Back in 2020 when I built my current PC I picked out a Ryzen 5 3600 to power it, and multiple resources recommended DDR4-3600 CL16 RAM as the "sweet spot" option, so I went with a 16GB kit (2x8GB) and dropped it into my motherboard. Keep in mind I'm not doing any sort of overclocking outside of picking the proper XMP profile in my BIOS. Fast forward to 2022 and I've recently upgraded to the Ryzen 7 5800X3D. I've heard through the proverbial grapevine that the 3D's extra cache memory makes it less reliant on system RAM speed & latency. I have a 32GB kit of DDR4-3200 CL16 RAM left over from another project and I'm thinking of replacing my current RAM with it. Would I see any performance decrease in going from my 16GB kit of 3600/CL16 RAM to 32GB of 3200/CL16 RAM, or should I just stick with the 3600 RAM for now and wait to upgrade to 32GB of that same speed in the future? Thanks.
  4. Unfortunately that one is a bit too long to fit in my current PC case, though I am eyeballing a PowerColor Red Dragon 6800 XT over on Amazon for $554.99. I was concerned that the Red Dragon was PowerColor's budget line, but after a bit of research I found out it's their mid-range line, which I am totes okay with. Still weighing that, the RX 6800 version of the Red Dragon, or just going used as mentioned in the opening post.
  5. Found an image that's similar to my situation for reference. Many thanks for your input, guys and/or gals. I've only seen it get up to the mid-70s in gaming, though obviously something like Cinebench just has it pegged at 90°C continuously. There may still be a Dark Rock Pro 4 in my future, but I feel okay sticking with what I have for now.
  6. I'd like to pick up a RX 6800 XT for my PC, but with inventory more or less having dried up at this point, it looks like going used is my only real option. Problem is that I don't want to end up buying a GPU that was mined on, nor do I want to support anyone who was involved in mining. It's easy to dodge some users, as they straight up have actual mining hardware listed for sale, but others just have 10+ listings for GPUs, which screams to me "miner" and not "Gaming enthusiast that likes to collect them all." Unfortunately eBay doesn't allow you to outright block users, so I've been doing my best to screen those that I can.
  7. While replacing my R5 3600 with a R7 5800X3D recently I discovered that the surface of my be quiet! Dark Rock Slim has a small nick in it, as if something had chipped it somehow. I initially thought it was a fleck of thermal paste that had dried up and stuck to the surface, but after liberal use of isopropyl alcohol it remained, and sure enough, when I ran a small toothpick over it I could feel it slightly "drop" into the divot. I'm not entirely sure how or when this happened, as I've only pulled the cooler off a handful of times, primarily to upgrade the BIOS of friends' and family's PCs that I've assisted them in building. I'm assuming and hoping that the thermal paste will "fill in" this nick and that it's small enough that it won't seriously affect performance. How concerned should I be? Don't worry about it, or make this an excuse to finally upgrade to a Dark Rock Pro 4?
  8. So...swapped out the GTS 450s with a GTX 680 and the Folding@Home client is reading "Disabled" for the slot that I attempt to add. Is that GPU not supported either? Nevermind, had to restart the computer after the driver install.
  9. So basically what you're saying is that the Nitro+ is a nicer product but probably not necessary?
  10. Seriously considering picking up a RX 6800 or RX 6800 XT, especially if prices drop. I see Sapphire Technology makes a "Pulse" and a "Nitro+" line; does anyone know what the real differences between these cards are? I presume the Nitro+ has better cooling, higher clocks, and maybe a bit more overclocking headroom, but how much does all of this translate to real-world performance? I'm looking at $618.77 for the Pulse RX 6800 and $654.99 for the Nitro+ RX 6800, a difference of $36.22. The Pulse RX 6800 XT is $710.59, with the Nitro+ RX 6800 XT at $785.98, a much bigger gap of $75.39. There would have to be something more than a minor performance bump to justify the premium on that one, I would think.
  11. So any suggestion on what to do with the GTS 450s short of recycling them?
  12. Helped my brother-in-law build a new computer over the weekend. His GPU hasn't arrived yet, so we pulled a GPU out of an old Alienware PC (a GTX 680) as a temporary fix to get everything else up and running for the moment. In the process he dug two even older GPUs out of his closet, including a SLI bridge. They were unbranded (they were the cards that originally shipped with said Alienware PC), so I was unsure what they were, but I took them off his hands, excited that I would finally have GPUs to fold on. After pulling off the cooler, I discovered they were GTS 450s. Over a decade old at this point, but I presumed they would be enough to at least earn a handful of points daily, more so than the CPU folding I was doing on some of the older computers anyway. This morning I slotted them into an appropriately older PC and booted it up, only for Folding@Home to tell me they're "disabled". Am I to presume that these GPUs are just too old for Folding@Home and are pretty much useless? Oh well, he told me that when his new GPU arrives I can have the GTX 680. Maybe that one will work at least.
  13. So now that the 5800X3D is on store shelves (in spirit anyway) any updated thoughts on this?