Zando_

Member
  • Posts

    15,603

Everything posted by Zando_

  1. Just remember that RAW is what it says it is: it's the raw data dumped from the image sensor with zero processing. A movie in RAW would look like garbage; the films you actually watch have had a bunch of color grading done to make them look... whatever is "good" for the aesthetic the director was going for, be that stylized or realistic. Here's an example from Adobe of a RAW vs JPEG (automatically processed by the camera) image: That aside, the tech does seem really cool. As others have said, I don't know that it'll take off though. There are other reasons companies prefer digital over physical (lower overhead cost, executive paranoia about "ooh scary piracy", etc.) that'll hamper their use, especially for consumers.
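To make that point concrete, here's a toy sketch (not any camera's actual pipeline, all values illustrative) of why a straight raw dump looks dark and flat: sensor values are linear light, while displays expect gamma-encoded values, so even the most basic processing step brightens midtones considerably.

```python
# Toy illustration (not a real camera pipeline): raw sensor values are
# linear light, while displays expect gamma-encoded values, so a raw
# dump looks much darker and flatter than the processed image.

def process_raw(linear, white_balance=1.0, gamma=2.2):
    """Apply a toy white balance + gamma encode to a linear value in 0..1."""
    v = min(linear * white_balance, 1.0)  # white balance, clipped to 1.0
    return v ** (1.0 / gamma)             # gamma encode for display

raw_value = 0.20                        # a midtone at 20% linear light...
display_value = process_raw(raw_value)  # ...encodes to roughly 48%
```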
  2. My grandfather has Parkinson's and is quite sure it has something to do with the 30 years he spent as a plumber working with all sorts of lead, so not a great metric. It won't straight-up kill you, but it can make you more likely to just suddenly degenerate into a shell of your former self when you get older. Study here; there are some newer ones as well.
  3. An 8-pin PCIe and 6+2-pin PCIe are the exact same cable, they just split 2 pins off for those who have a GPU that takes a single 6 or an 8+6. An 8-pin EPS is a different cable, but won't fit unless you have some impressive strength and brute force it (as they are keyed differently), so no worries about confusing cables. You should be good with the dual 6+2 cable you have now + another 8-pin.
  4. 7900XTX seems to peak at ~365W or so, so even if OCed likely not breaking 400W. Your CPU and the rest of the build won't pull 350W unless you have 50 case fans or something. Should be good. Buy new cables. Corsair should sell replacements, Cablemod also makes aftermarket cables if you want them in a different color.
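As a rough budget check using the figures from that post (the 850W PSU size is an assumption for illustration; only the GPU and system estimates come from the post itself):

```python
# Rough PSU budget check. The 850W PSU size is an assumed example;
# the GPU/system figures are the pessimistic estimates from the post.
gpu_peak_w = 400        # ~365W stock peak, rounded up for an OC
rest_of_system_w = 350  # deliberately pessimistic CPU + rest estimate
total_w = gpu_peak_w + rest_of_system_w  # worst-case draw: 750W
headroom_w = 850 - total_w               # an 850W unit still has ~100W spare
```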
  5. Garage with the door open should be good. The random breeze outside could make it harder to avoid the solder/flux fumes. Like when you sit at a fire and the wind decides to keep putting the smoke in your face no matter where you move.
  6. Yes. No. The 1080 has no issue with x8: So the 1060 3GB would be fine.
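A quick back-of-envelope on why x8 isn't a bottleneck for cards of that class; 8 GT/s per lane and 128b/130b encoding are the standard PCIe 3.0 figures, and the function name is just for illustration:

```python
# Back-of-envelope PCIe 3.0 bandwidth. 8 GT/s per lane and 128b/130b
# line coding are the standard PCIe 3.0 figures.
def pcie3_bandwidth_gbs(lanes):
    """Approximate one-direction usable bandwidth in GB/s."""
    transfers_per_s = 8.0      # 8 GT/s per lane
    encoding = 128.0 / 130.0   # 128b/130b line-coding efficiency
    return lanes * transfers_per_s * encoding / 8.0  # bits -> bytes

x8 = pcie3_bandwidth_gbs(8)    # ~7.9 GB/s
x16 = pcie3_bandwidth_gbs(16)  # ~15.8 GB/s, exactly double
```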
  7. Better to have it and not need it than need it and not have it. When you run out of VRAM it'll overflow to system RAM, which is usually ~10x slower, so games will choke. Whereas if you have too much VRAM... nothing happens, the game just won't use it all. So like @BiG StroOnZ said, if you can get the better card then do so.
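For a sense of scale on that slowdown, a rough comparison (both numbers are ballpark assumptions, not measurements from any specific card):

```python
# Order-of-magnitude picture of the VRAM-overflow penalty. Both numbers
# are ballpark assumptions, not measurements from a specific card.
vram_bw_gbs = 320.0    # typical midrange GDDR6 local bandwidth, GB/s
pcie3_x16_gbs = 15.75  # ceiling for fetching from system RAM over PCIe 3.0 x16
ratio = vram_bw_gbs / pcie3_x16_gbs  # ~20x raw gap; the real-world slowdown
                                     # is smaller since only the overflow
                                     # data takes the slow path
```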
  8. The purple artifacts from a failing GPU or failing/unstable VRAM (I've gotten these when pushing OCs too far; they went away when I pulled clocks back) show up as flashing squares (usually green and other colors as well) rippling across your screen; they won't just turn a single in-game light purple. It's either meant to be purple like that or the game itself has derped out over that one light. Here's an example of artifacting I grabbed off google:
  9. Yep, the 270X is the main thing holding it back. I ran my 4930K (6c version of the 4820K) at 4.7GHz with my 1660 Ti without issue. Especially if you're like me and play at 60Hz, these old CPUs can remain usable for quite a while. At stock clocks you may have some issues with more demanding games, but these Ivy Bridge chips pretty much all do 4.5GHz or higher and are easy to cool (I ran mine on a 280mm AIO and I don't recall it ever breaking into the 80s so there was still thermal headroom left). If you're on a super tight budget then OC the CPU and try to snag a newer GPU. Otherwise as @RONOTHAN## said, you can probably get a good price for that board if you're willing to wait a bit (they don't sell super quickly as only folks like me who like this old stuff buy 'em). I got my EVGA X79 Dark, 4930K, and 16GB Vengeance RAM for $300 shipped back in 2019. ASUS Rampage boards went for a bit more at the time. I don't know if their value has dropped much... from a quick look on ebay they go for anywhere from $83 - $270 for the board alone, most look to be around $120-170.
  10. Money. For online games, having a bunch of real players (though not the full peak player count) would definitely help with stress testing, and you piss fewer people off if your services fail. But you can do the same with an open beta, so it's not an argument for the preorder early-access thing.
  11. Oh nice, that looks like a standard 2.5” SATA HDD. No difference vs a 3.5” drive other than the size; they both work the same. I’d plug it in and run your speed tests again. If it’s faster over SATA and the SMART data doesn’t say it’s about to fail, clone over your main HDD (on Windows I use Macrium Reflect to do this) or take this as an excuse to do a clean install, which freshens everything up. If you’re using an activated copy of Windows, make sure you have your key either written down somewhere or linked to your MS account if you use one (mine is linked to my MS account) before you do a clean install.
  12. What external case is it? If it's a standard 2.5" HDD in an external housing, you can just pull the drive and use it in your PC, most cases have 2.5" drive mounts for this purpose. If it's a newer sealed unit from seagate/wd or similar, on some of the new ones they've resorted to soldering the USB connector directly onto the HDD - replacing the standard SATA connectors - so there would be no point in shucking it (cracking the plastic case off so you can use the drive like a normal one). If you're asking whether you can simply use it as a USB drive and run programs off it.... yeah. Should have no issues doing that, so long as the USB drive doesn't get bumped loose like you mentioned. If you want to run your OS off of it, it may actually end up worse as you might hit USB bandwidth limits, would have to try it and see I guess. Would depend on what OS and what you're doing. I've had no issues booting off HDDs over USB to test if a drive clone worked before swapping it in to the machine, but I never tried to actually run a bunch of programs, just verified the drive was properly duplicated then shut it down.
  13. That's laughably unrealistic and doesn't answer OP's question, so I didn't mention it. If OP got their cooling solution working and kept their chip well below the dew point, condensation would form and short-circuit the system, killing it. That's the simple and concise answer. There's ways to get around condensation, like waterproofing the board (see some of ASRock's old XOC boards, they had a waterproof coating to try and make it easier to keep the board alive when doing LN2 overclocking) or slathering it in petroleum jelly and wrapping everything possible in insulation. Or both, here's an old chiller thread where people listed what they did: https://www.overclockers.com/forums/threads/how-do-you-handle-condensation.790155/. Could obviously look at what modern chiller users are up to and see if there's better solutions, last I saw much about it I believe people who daily-ed a chiller were just keeping them enough above the dew point to not get condensation to begin with, as it's a nasty thing to try and consistently prevent on a daily machine.
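To put numbers on "keeping enough above the dew point", here's the common Magnus approximation (b and c are the standard Magnus constants; the room conditions are an illustrative example, not tied to OP's setup):

```python
# Magnus approximation for dew point: roughly how warm the coolant must
# stay before condensation becomes a risk. b and c are the common Magnus
# constants; example room conditions are illustrative.
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point (C) from air temp (C) and relative humidity (%)."""
    b, c = 17.62, 243.12
    gamma = (b * temp_c) / (c + temp_c) + math.log(rel_humidity_pct / 100.0)
    return c * gamma / (b - gamma)

# In a 24C room at 50% RH the dew point is about 13C, so a chiller kept
# above that coolant temperature shouldn't condense in that room.
dp = dew_point_c(24.0, 50.0)
```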
  14. Yep. It would rapidly kill itself thanks to condensation.
  15. Yep. My older Intel HEDT chips will sit in the 50s-60s under load on a decent cooler, but they're only boosting to 3.1-3.6GHz all-core when those chips are actually capable of 4.2-4.5GHz or higher. I manually push them to those clocks and the temps jump up a lot, into the 70s in games and 80s-90s in the toughest synthetic stress tests. Modern chips are basically doing the overclocking themselves: they come out of the box boosting to 5.0GHz or well above (lots of the higher ones run 6.0 on a few cores), pulling a bunch more current to do so. They're also physically small chips, so not much surface area to pull heat out, which results in similarly high temps despite the efficiency gains over older chips.
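That temperature jump tracks with the usual first-order model that dynamic power scales roughly with frequency times voltage squared; the clocks and voltages below are made-up illustrative values, not measurements from any specific chip:

```python
# First-order model of why manual OCs spike temps: dynamic power scales
# roughly with frequency times voltage squared (P ~ f * V^2). The clock
# and voltage figures below are made-up illustrative values.
def power_ratio(f_base, v_base, f_oc, v_oc):
    """Ratio of overclocked dynamic power to stock dynamic power."""
    return (f_oc / f_base) * (v_oc / v_base) ** 2

# Pushing 3.4 -> 4.5GHz with a 1.05V -> 1.30V bump roughly doubles the
# heat output, which lines up with the temperature jump described above.
ratio = power_ratio(3.4, 1.05, 4.5, 1.30)
```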
  16. 70s is ice cold for modern chips at stock clocks, you're fine.
  17. I have X58, X79, X99, and X299 kit. Lot less passion for it these days, but I like the platforms. Old HEDT isn't very good vs current kit, but it's neat and I like that. My X299 Dark is the only board I have that supports ReBAR, if I didn't have it then I wouldn't have been able to get my ARC A770, as the ARC cards are functionally dependent on ReBAR. Large performance drop and IIRC stutter issues without it.
  18. ReBAR and PCIe 4.0 are separate things. You want PCIe 4.0 for any modern card with an x4 electrical connection. I have ReBAR enabled on my X299 Dark with a 7980XE, which is a PCIe 3.0 board and CPU.
  19. Worth noting that ARC cards still seem to have some issues with drivers: I haven't run my A770 in a while (due to driver issues/shortcomings for what I've been playing/doing lately) so I can't speak on this in the games I play, which aren't usually tested in these videos.
  20. Yep! Looks like you can squeeze this 7900XT in, it's bang on your $700 limit: https://pcpartpicker.com/product/3BQcCJ/sapphire-pulse-radeon-rx-7900-xt-20-gb-video-card-11323-02-20g. Here's TomsHardware's review of the 4070, which includes the 7900XT and 4070 Ti (~$30 over your $700 limit) on the charts as well: https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4070-review/4. This is a tad long so splitting it out to its own paragraph:

      The 7900XT offers better raster performance and more VRAM than the Nvidia cards. Some people do have issues with AMD drivers though (I had issues with my Radeon VII and zero issues with my Nvidia cards; the friend I sold it to had zero issues with it and only ever had problems with Nvidia cards, so it seems luck of the draw). The Nvidia cards offer better raytracing performance, and in many games DLSS is a bit better than FSR, though that matters less given the 7900XT should run most games without needing upscaling to begin with. I believe Nvidia's encoders (for streaming/recording) are slightly better, but it's very close this generation, so probably not a deciding factor unless you're a full-time streamer or youtuber. Nvidia cards will also work better in most non-gaming tasks due to better industry support for CUDA and Nvidia's drivers.

      TLDR: AMD has better normal gaming performance and more VRAM; Nvidia cards have the edge in ray-tracing and upscaling plus some non-gaming tasks. Pick whichever you think will work better for you; assuming you play at 1440p or lower, any of the cards listed will be excellent.

      EDIT: There's actually a 4070 Ti available at the same $699.99: https://pcpartpicker.com/product/9mNYcf/msi-ventus-3x-e-oc-geforce-rtx-4070-ti-12-gb-video-card-rtx-4070-ti-ventus-3x-e-12g-oc.
  21. No prob. Gotta set the list to public if you use the link saved to your profile though, right now I can't see it as it's set to private. IMO it's simpler to just click "edit this part list" and then copy the permalink than change the privacy setting.
  22. Just copy the link man, I'm not downloading a file. PCPP has both a permalink and the option to copy the full list as text so people can read it right on the forum:
  23. Anytime anyone offers any payment method that does not hit your account while you're standing there, run the fuck away. Instant transfer can work instead of cash, but something taking 2 hours is a heeeeeellll no. $380 real dollars is better than $500 maybe dollars IMO.
  24. I didn't mean that it had problems; Nvidia just does not allow a higher TDP, so you will hit voltage and wattage limits before you hit clock limits. I didn't mention thermals though, and on an LP card that could be a concern. Doesn't mean much. I've had the 1050 and 1050 Ti from EVGA, both the dual-fan cooler with a 6-pin and the single-fan, slot-power-only ones, and I don't recall any difference in OC performance. The 6-pin would hypothetically let you go above the 75W the PCIe slot can provide, but then you'll hit the voltage cap, so it doesn't matter.
  25. Vega cards are known for massive transient spikes (giant wattage spikes lasting a couple milliseconds). Your nice unit probably isn't set up to account for these if it's an older design, so it trips OCP/OPP and shuts the PSU off. Lots of good units had this issue when the Vega cards released; it's no mark against their quality, the OCP/OPP settings just had to be updated, and they have been on all the newer refreshes. The crappy unit likely does not care and is just allowing the GPU to do whatever it wants.