Everything posted by Zando_

  1. macOS will use RAM for caching and such if it's free; it'll dump unimportant stuff if the RAM is actually needed. Those mid-2010 models are Core 2 Duos, they are going to absolutely choke and wheeze on... anything past Yosemite or El Capitan, really. So I think the CPU is your slowdown, not the RAM usage.
  2. Nvidia's encoders are a bit better. AMD's have gotten much better on the latest stuff, but still lag behind AFAIK. Not a massive margin, thus the word "slight".
  3. On building it? Or are you looking for feedback on the list? If building, my advice is to remember to plug in your CPU power connector. I forgot that on my first build (and a couple times since), freaked out at first when the PC just wouldn't POST. And as it's a Ryzen PC, remember to grab the chipset drivers from AMD and install those; they're important for Ryzen chips, and include a Windows power plan that makes them perform better.
  4. Do you think you'll need the money before summer when you can work? If not, then I wouldn't see that as a reason to return it. I get feeling a bit remorseful over buying expensive things, but if it helps to rationalize it, a GPU is a pretty good expensive thing to buy. It's not like you blew it on food or something that you'll consume and no longer have. You can keep a good GPU for 3-5 years at least, maybe longer now with how good upscaling tech is (and it does continue to get better). And it's something you'll actually use and enjoy regularly, assuming you're on your PC a decent bit.
The Radeon VII launched back in... 2018, 2019? I forget when exactly. Very cool card (I bought it right at launch, had it overnighted so I got it the morning after they were released), great for a while, then the Adrenalin 2020 package dropped and busted the drivers so badly, for months, that I sold the card. It got to the point where I'd get a black screen, forcing a full hard reboot (hold the power button or unplug the PSU) to get display out back, even just idling on the desktop.
A couple coworkers also had a bad time with AMD's more modern drivers on Polaris cards (400/500 series). And a friend had his 5700 XT Liquid Devil (an overclocking card) forced to 500MHz for months until AMD finally fixed their drivers. Before Adrenalin 2020, AMD's drivers were rock solid in my experience; I never had issues with my RX 480, Vega Frontier Edition, or the 290X I messed around with once. From what I've read, the 6000 and newer series cards are better off... I've seen a friend have BIOS issues with his ASRock 7900 XT, but no driver problems AFAIK.
Again, this is all my experience or anecdotes from friends; if you search around you'll find folks who have had the exact same experience, but with Nvidia cards. As I mentioned before, I sold that card (the Radeon VII) to a friend and he had 0 issues with it. He'd only had driver nightmares with Nvidia, never AMD.
  5. You are severely overestimating how hot the air coming through your radiator is. I've never had air get close to that hot even coming out of custom loop radiators (better at getting heat out of the loop than AIOs) on a much higher output machine (an overclocked 7980XE, which alone will pull similar wattage to modern CPU + GPU combos). The most I've felt is... warm air. Not super hot, just warm. Which means it's likely ice cold to the computer (mid-30s Celsius or less, which it seems to be: if I let the PC idle it will stabilize at low-to-mid 30s Celsius).
I run my shower decently hot, at 108F. That's 42C, cool for a computer. The hottest I can stand after warming up is around 115F, which is 46C, still cool for a computer. Actually hot temps for a PC would burn me, as they'd be 95-110C depending on components, which is near or above the boiling temp of water.
Either airflow option is fine, though with top intake you are fighting against natural convection. I've only had issues with a front-intake rad when running fans at low RPM through a 60mm thick radiator. AIOs use 25-38mm radiators so they have a lot less restriction; I didn't have any issues running an AIO as front intake. Though I did have a much lower draw GPU (1660 Ti, ~130W if absolutely pushed) at the time. I believe the 4090 FE is overbuilt though, so it likely won't care either; you may see 1-3C higher temps with a front AIO intake.
Air cooler then, not AIO. After ~5 years or so you're on borrowed time with an AIO, as the coolant slowly evaporates through the rubber tubing, and most AIOs do not have a way to top it off (this happens in custom loops too, but there you can just add more coolant to the reservoir).
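For scale, the Fahrenheit figures above convert to Celsius like this (a throwaway sketch using the standard conversion formula, nothing beyond the numbers already quoted):

```python
# Fahrenheit-to-Celsius conversion: C = (F - 32) * 5/9.

def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

for label, f in [("hot shower", 108), ("too hot to stand", 115), ("boiling water", 212)]:
    print(f"{label}: {f}F = {f_to_c(f):.1f}C")
# A "hot" 108F shower is ~42.2C, and even 115F is only ~46.1C --
# both well below the 95-110C range where silicon throttles or shuts down.
```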
  6. Are you on an extremely tight budget? I hate shipping things, and I do use DLSS, and would like to use more ray tracing (I currently don't, as my 2060 Super doesn't have the chops for it in Cyberpunk at 4K, even with DLSS on). I've also had bad experiences in the past with an AMD card, so I avoid them personally. Said AMD card was sold to a friend who has only ever had issues with Nvidia cards, so it seems it's luck of the draw which platform works better for you. For me, it'd be worth $91 to keep it.
  7. ^^^ X58 and X79 were a good value proposition for gaming back before Zen/Zen+ had a solid footing. There was a tiny window where X99 was a pretty decent value proposition, but now you can just get a 3600/3700X cheap used, slap it in a $90 motherboard, and call it a day. I believe the old LGA1366/2011/2011-3 stuff is still a good value overseas, thus why you see Chinese companies still making boards for them. On the US market, Ryzen is usually cheaper. This is true, PCIe lanes are one of the few things making it still worth it. Sheer core/thread count also; if you don't need particularly fast cores, just a lot of them, then you can get a nice stack of cores with LGA2011-3 stuff especially. FWIW, no need for a PC to be a good value proposition; I still use X299 despite being keenly aware of its shortcomings, because I find the platforms neat to work with (I am slowly moving away from them, but have yet to actually get rid of my X58/79/99/299 hardware).
As @AbydosOne said, the two Xeons are not cross-compatible. There's LGA2011 and LGA2011-3, same physical socket but some slight pinout differences IIRC. V1/V2 Xeons should be on LGA2011 (X79, I forget the C chipset number for servers), V3/V4 on LGA2011-3 (X99 and its respective C chipset). If you want a better gaming machine, you want a 2011-3 platform with a Broadwell chip (these should be V4 Xeons, though I believe some are still Haswell). Broadwell is the first generation of 14nm chips, and has a pretty strong IPC increase over Haswell (V3) and Sandy/Ivy Bridge (V1 and V2). My i7 6950X performed the same as my i7 5960X at 200MHz lower clocks. If you're just tinkering for fun, then any gen is good 'nuff.
I like Noctuas, so whichever tower cooler of theirs you can get for a fair price. These chips do not put out much heat at all until you get into manual OCing and push 1.3v or higher on vCore, which you won't be doing with a Chinese board and a locked Xeon.
If you don't like the price of Nocs, look at Thermalright, they typically perform the same or better while being cheaper, and they should have a 120mm tower cooler compatible with LGA2011/2011-3. The nice thing about these sockets is that the threaded holes for CPU coolers are already part of the socket itself, so cooler install is typically easy, you dodge a lot of the shenanigans some cooler mounting systems require. Assuming that's one of the better Chinese boards, you should have no issues. A few folks here have had even the older ones and unless you're asking them to do a bunch of advanced stuff, they're pretty much issue-free.
  8. If $91 is worth losing DLSS, better RT performance, slightly better encoder performance/quality, CUDA acceleration, and your time/effort to return and re-order, then yep get the 7800XT.
  9. Elden Ring

    I enjoy Fallout 76, I have 378 hours in it now. There is a lot of inventory management if you don't fork over the $ for Fallout 1st though (which gives you an infinite scrapbox to dump junk into; if you're familiar with Fallout 4 crafting then you'll know the value of being able to hoard massive quantities of junk). And it's worth looking at what build you want to have and working towards that as you level up; swapping between multiple builds isn't easy till you're ~100+ hours in (you get the perks as cards from leveling up, not a board you pick points on like in Fallout 4). Though those are mostly important for the big boss fight events, and usually there'll be some level 1000+ player (I've seen as high as 2200+) with a perfectly minmaxed build who can carry the boss damage.
    The community is generally helpful; there are a lot more players now due to the show, and I have run into... I think 2 trolls so far, that's it. Usually folks are happy to drop items for new players, carry them through daily ops/expeditions/boss events for the XP and rewards, etc. From the random teams I've joined where they used voice chat (there is no text chat, only voice), most people seem to just be regular 9-5er adults chilling after work. I think the nature of Fallout 76 keeps away most of the tryhards/trolls other games often struggle with.
    The C.A.M.P. system (think a smaller version of Fallout 4's settlements, but you can place it nearly anywhere) is nice once you collect enough plans for stuff to build; many of these you just get around the world, the others are from the Atom shop (micro-transaction store). If you get it for free (I have an XBOX code from Amazon Prime, let me know if you want it and I'll DM it to you) or on sale then it's worth giving a shot or two to see if you like it. I didn't at first, then I picked it back up a year later and sunk over 200 hours into it.
I like it enough that I usually pay for Fallout 1st when I'm actively playing it ($12/mo, I manually cut the subscription off and on).
  10. 6 foot USB extension cable. Cheap, should be no perceptible increase in input lag.
  11. Good looking list! I only have 2 suggestions:

PCPartPicker Part List: https://pcpartpicker.com/list/3fGfL9

CPU: AMD Ryzen 7 7700X 4.5 GHz 8-Core Processor ($297.10 @ Amazon)
CPU Cooler: Noctua NH-L12S 55.44 CFM CPU Cooler ($64.90 @ Amazon) - Much, much more capable cooler than the L9. If you cannot stand the fan, get the Chromax version (the fan is the NF-A12x15 if you want to search it up). You want the absolute best cooler you can fit in an SFF build, as it's always a limiting factor for anything above low-end hardware.
Motherboard: ASRock B650I Lightning Wifi Mini ITX AM5 Motherboard ($199.99 @ Newegg)
Memory: G.Skill Flare X5 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory ($104.99 @ Amazon) - Better timings, same speed, same price, still a clean and low-profile cooler design.
Storage: Western Digital Black SN850X 1 TB M.2-2280 PCIe 4.0 X4 NVME Solid State Drive ($81.00 @ Walmart)
Video Card: XFX Speedster SWFT 309 Radeon RX 6700 XT 12 GB Video Card ($329.95 @ Amazon)
Case: Fractal Design Ridge PCIe 4.0 Mini ITX Tower Case ($129.99 @ Amazon)
Power Supply: SeaSonic FOCUS SGX (2021) 750 W 80+ Gold Certified Fully Modular SFX Power Supply ($109.99 @ Amazon)
Case Fan: Noctua A14 PWM chromax.black.swap 82.52 CFM 140 mm Fan ($26.95 @ Amazon)
Case Fan: Noctua A14 PWM chromax.black.swap 82.52 CFM 140 mm Fan ($26.95 @ Amazon)
Total: $1371.81
Prices include shipping, taxes, and discounts when available. Generated by PCPartPicker 2024-04-26 19:38 EDT-0400

And one note: double-check that you can fit those NF-A14s in with the GPU. They are 25mm thick (frame) and 1-2mm thicker than that with the rubber sound-dampening corner pieces, so you need ~30-32mm clearance from the GPU (as you don't want the fans rubbing anywhere if case clearance is slightly off). Though if you want the best noise vs performance fan Noctua makes, look at the NF-A12x25 (specifically the x25 version, not any other NF-A12 variant). They make a Chromax version of it as well now; it's a little more expensive but it is the best fan they have. They've never come out with a 140mm version because they could not get a 140mm fan to perform better than it. It will have the same vertical clearance concerns as the A14s though, ~26-27mm thick with the sound-dampening corner bits.
  12. Gotta compromise somewhere. I didn't realize the performance difference between FSR and DLSS, I only use it when DLSS or XeSS are not an option (I have Nvidia and Intel GPUs). If you think you'll be fully dependent on DLSS and Nvidia's frame gen then just get the best Nvidia GPU you can afford (seems the 4070 Ti in this case). I play at 4K60, and I have a 2060 Super and an Intel ARC A770. The ARC is better for rasterization but has some compatibility issues with the games I play, leading to me using the 2060 Super more often. Enough of my games support DLSS that I don't notice the drop in raw GPU power, and the ones that don't are usually old enough that they'll run playably on not-eyesore settings. It's not exactly the same situation you're in, but similar enough that I thought it was worth sharing.
  13. The 7900 XTX has a solid lead over the 6950XT at 4K, it'd be even better vs the 6700 XT, considering the 6950XT already has a massive lead over that. Tomshardware has their 4K roundup with both the 7900 XTX and 6950XT on the charts here: https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/4. And the 6950XT numbers with the 6700XT on the charts here: https://www.tomshardware.com/reviews/amd-radeon-rx-6950-xt-review/4. 4K wants the most GPU you can throw at it. If you have no other reason to prefer Nvidia than DLSS, I'd say you're better off with more rasterization performance and VRAM. You should be able to get away with a higher native resolution, which should look as good or better than DLSS at a lower internal resolution. The only stickler would be games that only have FSR or DLSS, no XeSS. You'd have to make do with FSR there.
  14. The 4090 is one of the most efficient GPUs for heavy workloads (which is what it's designed for). Using Folding@Home as an example, it's 6th for PPD/kWh: https://folding.lar.systems/gpu_ppd/overall_ranks_power_to_ppd. The only GPUs beating it are other Ada Lovelace (4000 series) cards that are running further inside their efficiency curve than the 4090. It's more efficient than every other GPU before it. The gap between the Ada cards and even the previous generation is pretty wide, and that's one single generation; if you compared it to Maxwell or Kepler (decade-ish old architectures) the results would be comical. And they are: the 4090 does ~2.4m points per kWh, while the Maxwell 980 Ti does... 0.184m points for the same kWh of power draw.
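Putting the two PPD/kWh figures above side by side (numbers as quoted in the post; the lar.systems chart is the actual source):

```python
# Rough efficiency comparison from the Folding@Home PPD-per-kWh numbers above.
rtx_4090_ppd_per_kwh = 2.4e6      # ~2.4 million points per kWh (Ada Lovelace)
gtx_980_ti_ppd_per_kwh = 0.184e6  # ~0.184 million points per kWh (Maxwell)

ratio = rtx_4090_ppd_per_kwh / gtx_980_ti_ppd_per_kwh
print(f"4090 does ~{ratio:.0f}x the work of a 980 Ti per kWh")  # ~13x
```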
  15. It'll likely be fine. Your CPU cooler should have its own fan(s), as should the GPU, and the motherboard usually doesn't need much, so it should be fine with the air pushed around by the CPU/GPU coolers and just natural radiation (of heat, not nukes). The only thing that might see a slight performance drop is the GPU, if it's a modern one that boosts less and less the hotter it gets. It won't throttle, it will just boost less, which is usually a ~3-5ish fps difference. Likely not noticeable unless the GPU is already pushed to its absolute limits (like my 2060 Super I ask to run 4K; that ~3fps vs stock/base clocks is noticeable because it's sometimes the deciding factor between reaching my display's minimum refresh rate - 40Hz - or not).
I've run PCs with not enough fans or straight up in a box with no fans... they got hot. That's about it, no major throttling; when I did this I was using a Maxwell GPU, so it didn't change its boost clocks at all either. And not hot enough to damage the hardware, as others noted it'll shut itself off before that happens.
Yep. And lots of those work PCs don't just have 0 intake fans, there are no intake vents anywhere on the front panel. The PC lives off what the exhaust fan can feebly pull through the gaps in the chassis. They run fine for... well, I've worked for the company I'm at for 7 years and I believe we've still got some of the Skylake machines kicking that were purchased around the time I was hired. I've had a dead one or two, not bad for $350-500 machines that are approaching a decade old in sub-optimal conditions (some are office PCs, others are in and out of hot/humid/dusty production rooms or warehouse/pick line locations).
  16. Elden Ring. Helldivers 2 is rapidly getting up there too. Probably Destiny 2 in its prime (warmind to just before sunsetting), and Star Citizen, though both those games are very far from perfect.
  17. I believe the NF-A12x25s remain the best airflow/pressure/noise balanced fans... they're just rather pricy. What temps were your NVMe drives actually hitting? A 20C increase isn't bad unless it puts them out of safe operating range.
  18. Like @RONOTHAN## said there really isn't much better. Noctua has their F series that focuses on static pressure, but IIRC the Redux P12s are based off the older version of that fan anyways, so they're already pressure optimized fans. You do *not* want the 3000rpm iPPC fans if you're trying to avoid jet engine noises. I have 'em in 140mm form, they are very much a jet engine when spun up. We haven't invented a way to ignore physics, so pushing air harder requires more force, and we get that by spinning fans faster. There's only so much you can do with the blade design itself.
  19. Because most cheap boards only have 4. That's why I mentioned the HBA. Those drives are extremely expensive. Not sure what you mean with the SMART monitoring; that's been a thing since... the mid-90s, it looks like. I've got drives out of Vista PCs that spit out SMART data, and that's really the oldest hardware I've personally used. I've never considered a drive not having it, but I guess some must have lacked it at some point. No. SMART is not that reliable, and fluke failures are a thing. We don't do redundancy for fun. OP hasn't mentioned what drives exactly they intend to use (how many, what capacity), what OS they intend to use, or how much data they intend to store, so we can't really advise on what RAID/ZFS/other array type they should use.
  20. Case is one of the cheapest options with 8 3.5" drive bays - in slots too, and right in front of the intake fans, so the drives will stay cool and be more easily swappable. If OP doesn't need that many drives then yes there are cheaper options. As @DrMacintosh said, sorta depends on how much exactly OP is intending to store. Good point. Can get that same kit in 2x8GB instead of 2x16GB: https://pcpartpicker.com/product/P4FKHx/silicon-power-sp016gxlzu320bdaj5-16-gb-2-x-8-gb-ddr4-3200-cl16-memory-sp016gxlzu320bdaj5. It is only $20 cheaper, so if OP will want 32GB in future, it is cheaper to just get a 32GB kit, as it's less than 100% more cost for 100% more RAM. Also depends on the OS OP intends to use. For a beefier NAS box I'd prefer TrueNAS Scale, and by default that will only use 50% of the RAM for caching, so ~16GB. I believe you can manually override this, but if you then boot up some containers/VMs or something and forget to change the limit, you can run out of RAM and the system will hard crash.
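The "less than 100% more cost for 100% more RAM" point works out like this per GB (the 32GB price is from the list; the 16GB price is inferred from the "$20 cheaper" figure above, so treat it as approximate):

```python
# Cost-per-GB check for the two kits discussed above.
kit_32gb_price, kit_32gb_size = 53.97, 32            # 2x16GB DDR4-3200 CL16
kit_16gb_price, kit_16gb_size = 53.97 - 20.00, 16    # ~$33.97, 2x8GB (inferred)

per_gb_32 = kit_32gb_price / kit_32gb_size
per_gb_16 = kit_16gb_price / kit_16gb_size
print(f"32GB kit: ${per_gb_32:.2f}/GB, 16GB kit: ${per_gb_16:.2f}/GB")
# The 32GB kit costs less per GB, so if OP will ever want 32GB,
# buying it up front is the better deal.
```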
  21. Something like this:

PCPartPicker Part List: https://pcpartpicker.com/list/BVCDVW

CPU: Intel Core i3-12100 3.3 GHz Quad-Core Processor ($121.98 @ Amazon)
Motherboard: ASRock B660M Pro RS Micro ATX LGA1700 Motherboard ($94.99 @ Newegg)
Memory: Silicon Power GAMING 32 GB (2 x 16 GB) DDR4-3200 CL16 Memory ($53.97 @ Amazon)
Case: Antec P101 Silent ATX Mid Tower Case ($109.99 @ Newegg)
Power Supply: Corsair CX750M (2021) 750 W 80+ Bronze Certified Semi-modular ATX Power Supply ($74.98 @ Amazon)
Total: $455.91
Prices include shipping, taxes, and discounts when available. Generated by PCPartPicker 2024-03-29 14:42 EDT-0400

That board has 4 SATA ports, the case fits 8 3.5" drives, and the PSU includes 8 SATA connectors. So grab a used HBA and you can use all 8 drive bays. The board has PCIe x16 and x4 (physical x16 but x4 bandwidth) slots, so you should be able to fit both an HBA and a 10Gb NIC if you have a 10G LAN. I picked the i3-12100 as the 12th gen chips have good idle power draw, the iGPU can be used for en/decoding (which means you don't need a GPU filling a slot and adding more power draw), and most NAS tasks are single-threaded, so there's no need for more cores/threads (if there's something else you want to do that needs them, the i5-12400 is a very good pick).
  22. Yep. It'll even snitch on apps now, it will double-check with you on whether you want to allow the app to track you cross-app (to other apps in general or back and forth from your browser). I always say no to this.
  23. It's the 2nd M.2-ish connector behind the M.2 slot, I believe: the spec sheet notes it's a "High-Speed Custom Solutions Connector (PCIe x4)". That connector looks like what you'd need for x4 PCIe. EDIT: actually, re-looking, I think that is the SATA M.2 slot, and the one below it looks like an M.2 slot for a wifi card? Unless that's integrated on the board. The custom PCIe connector may be on the other side of the board.
You would need a separate PSU to run the drives, yeah. I'd grab a USB 2.0 (not 3.0, it needs to be a 2.0) thumb drive and give Unraid a shot. If you don't need the speed of ZFS - and I assume you don't, as you wouldn't get it over a USB hub to begin with - then Unraid should do what you need as far as NAS duties. It's set up for consumer drives, can handle mismatched arrays, AFAIK it should be fine with USB hubs, and can do stuff like sleep the drives, which will help with power draw. ZFS keeps them spinning always, and will have issues if you stop it from doing that (drives drop from arrays). Honestly, the drives spinning (assuming you are using HDDs) was probably most of the power draw you were seeing. Each drive is ~6-10W, so you're looking at up to 40W for 4 3.5" drives spinning constantly.
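The "up to 40W" figure above is just the per-drive draw multiplied out; a quick sketch using the same ~6-10W-per-spinning-drive range:

```python
# Spinning-drive power estimate for a small HDD array,
# using the ~6-10W per 3.5" drive figure from the post.

def array_power_watts(n_drives: int, watts_per_drive: float) -> float:
    """Total power for n drives each drawing watts_per_drive."""
    return n_drives * watts_per_drive

low = array_power_watts(4, 6)    # 24W if every drive sips 6W
high = array_power_watts(4, 10)  # 40W at the 10W end
print(f'4 spinning 3.5" drives: {low:.0f}-{high:.0f}W')
```

Letting Unraid spin the drives down when idle cuts most of that, which is why it can come out well ahead of ZFS on power.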
  24. ZFS is built for datacenters, it wants/needs full access to and control over the drives. It won't work with RAID controllers unless they're flashed to function as a basic HBA, no shock that it'd dislike a USB hub. Does the NUC have an M.2 NVMe slot? You can get an M.2 HBA with IIRC 4 or 6 SATA ports.
  25. Yep. CPUs are the same basic tech across the board. Enterprise motherboards can use higher quality capacitors and be built a tad better overall, as they're intended for 24/7 operation with minimal downtime. If you're worried about that very small percentage chance of failure, then you can just get a server board for a mainstream chip; ASRock and Supermicro make some.
^^^ 1st gen Threadripper has poor single core performance (very important to many game servers, as they are often single-threaded), and the power draw will be quite high vs a mainstream chip. It does add up when run 24/7. Also, if you're running Windows as the host OS, 1st/2nd and 3rd gen TR still have TPM stutter with Windows (the whole system hitches for a couple milliseconds). AMD fixed this for AM4 but never bothered to for the X399 and TRX40 platforms. I believe if you run Windows 10 with TPM off it should dodge that, but W10 will be EOL sooner rather than later, so given there's 0 advantage to TR, I don't see the point of trying to make it work for this to begin with.
The best machine for this sorta thing is usually a 12th gen Intel based setup, as you can get DDR4 boards for them (cheaper RAM, though DDR5 is very cheap now so this matters less), they have very low idle power draw, and great single core performance. Anything Ryzen that's Zen 2 or newer is excellent as well. What exact chip you want depends on what board you wanna go with, and how many cores/threads you think you need. You can get up to 16c/32t on AM4/AM5.