About drcheeseman


  1. drcheeseman

    Graphics Card Recommendation for Quiet SFF Build

    Thanks for this reply! I also checked out MSI's Twin Frozr cards based on input from outside the forum, which would have been a good option if I wanted to go 2070, but in the end I decided on Sapphire's RX 580 based on some review videos. I was still deciding when I posted this. It's a gaming rig I use when traveling, so I was open to anything from mid-range to high end as long as it's quiet. Based on DrMacintosh's input and outside input from a friend with an MSI card, I went the RX 580 route instead of the RTX 2070 route: most of the games I play while away from home are low- to mid-range in terms of graphics intensity, and if I do want to play a AAA title I'll be playing on a TV rather than a monitor, so I'm sure the RX 580 can hold the 30 FPS needed to push frames to the TV I'm using in any game.
  2. I built a system on a 2400G and want to add a dedicated graphics card. I'm building in a Node 202, so I have a strict two-slot, standard-length requirement; the upper bay already has two slim Noctua fans to move air, and I want to leave them in there. I'm also looking for a card with fan stop as a feature: I tried an Aorus RX 580 Windforce, but the fan noise when it was running was pretty bad (it was also an eBay buy, and while it worked fine, the BIOS was a mining BIOS, so the HDMI and DisplayPort ports had no audio), so I returned it. Open to anything from a 1060 to a 2080, or an AMD option, but the hard requirements are a two-slot solution with fan stop, and ideally tolerable fan noise when the fans are spinning. I'd appreciate any recommendations from people who have had good experiences.
  3. drcheeseman

    Steam Caching Tutorial

    Tip for native Linux installs: I had trouble getting it to actually save anything to the cache directory, and it turned out to be a permissions issue. If you're not seeing files in the /data/cache folder of your server, check the error.log. You'll see lines that look like this:

    asudouser@cache:/media/steamcache$ cat /media/steamcache/logs/error.log
    2018/12/23 05:07:14 [crit] 48#48: *465 open() "/data/cache/cache/ef/db/c3f2f550f48baef4623627398df8dbef.0000000217" failed (13: Permission denied) while reading upstream, client:, server: , request: "GET /serverlist/95/20/ HTTP/1.1", upstream: "", host: "valve614.steamcontent.com"
    2018/12/23 05:07:14 [crit] 40#40: *467 open() "/data/cache/cache/ef/db/c3f2f550f48baef4623627398df8dbef.0000000218" failed (13: Permission denied) while reading upstream, client:, server: , request: "GET /serverlist/95/20/ HTTP/1.1", upstream: "", host: "valve619.steamcontent.com"
    ...
    2018/12/23 05:07:14 [crit] 47#47: *471 open() "/data/cache/cache/ef/db/c3f2f550f48baef4623627398df8dbef.0000000220" failed (13: Permission denied) while reading upstream, client:, server: , request: "GET /serverlist/95/20/ HTTP/1.1", upstream: "", host: "valve176.steamcontent.com"

    The root of the issue is that the user id of the owner of the cache directory on your machine has to match the www-data user inside the container. On Ubuntu that's 33, so you just have to make sure they line up. You can check the permissions inside the container by running:

    docker exec steam-cache ls -al /data

    It should look like this:

    drwxr-xr-x 5 www-data www-data 4096 Dec 19 21:56 .
    drwxr-xr-x 1 root     root     4096 Dec 23 04:56 ..
    drwxrwx--- 2 www-data www-data    0 Dec 23 04:49 cache
    drwxr-xr-x 2 www-data www-data 4096 Dec 19 21:56 info
    drwxrwx--- 2 www-data www-data    0 Dec 23 04:49 logs

    But it may look something like this if your permissions aren't right:

    drwxr-xr-x 5 www-data www-data 4096 Dec 19 21:56 .
    drwxr-xr-x 1 root     root     4096 Dec 23 04:56 ..
    drwxrwx--- 2 www-data 1001        0 Dec 23 04:49 cache
    drwxr-xr-x 2 www-data www-data 4096 Dec 19 21:56 info
    drwxrwx--- 2 www-data 1001        0 Dec 23 04:49 logs

    To fix it, find the numeric user id that www-data maps to inside the container, confirm it matches the www-data user on your local machine (it should on Ubuntu), then recursively set ownership of the cache directory to that user:

    asudouser@cache:/media/steamcache$ docker exec steam-cache cat /etc/passwd | grep www-data
    www-data:x:33:33:www-data:/var/www:/usr/sbin/nologin
    asudouser@cache:/media/steamcache$ cat /etc/passwd | grep 33
    www-data:x:33:33:www-data:/var/www:/usr/sbin/nologin
    asudouser@cache:/media/steamcache$ sudo chown -R www-data:www-data /media/steamcache
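The ownership check above can be sketched as a short script. This is a minimal sketch, assuming a stock Ubuntu host where www-data is UID 33; CACHE_DIR defaults to a hypothetical throwaway demo path, so point it at your real cache mount (e.g. /media/steamcache) before trusting the output:

```shell
# Hypothetical cache path for demonstration; substitute your real mount point.
CACHE_DIR="${CACHE_DIR:-/tmp/steamcache-demo}"
mkdir -p "$CACHE_DIR"

# UID that www-data maps to inside Debian/Ubuntu-based containers.
WANT_UID=33

# Numeric UID of the directory's current owner.
HAVE_UID=$(stat -c %u "$CACHE_DIR")

if [ "$HAVE_UID" -ne "$WANT_UID" ]; then
  # chown by numeric id works even if the host has no www-data user at all.
  MSG="mismatch: owner is $HAVE_UID, want $WANT_UID (fix: sudo chown -R $WANT_UID:$WANT_UID $CACHE_DIR)"
else
  MSG="ownership OK"
fi
echo "$MSG"
```

Chowning by the numeric id (33:33) also works on hosts that don't have a www-data user, since what the container actually checks is the UID, not the name.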
  4. drcheeseman

    Retro Rig Question

    Yes, I remember those well. This is all about showcasing the graphics card design art. My only regret is that the two cards I landed for this project don't have their original boxes to showcase with them; hence my interest in getting a custom case wrap or something similar done up.
  5. drcheeseman

    Retro Rig Question

    Yup, this particular SLI build is NOT about functionality. This is a build meant to be obnoxious, not necessarily super smart in terms of best implementation XD.
  6. drcheeseman

    Retro Rig Question

    Basically what I'm trying to do XD, though the system will also serve a dual role as an emulation rig. I'm planning on dual booting Windows and Linux, with Linux as the foundation for the emulation side of things and Windows around to play retro PC games and modern low-end-geared PC games like Fortnite.
  7. drcheeseman

    Retro Rig Question

    Thanks for the correction; I've always been on the red team myself, but the AMD cards never really had the same flair when it came to graphics card design art, in my opinion.
  8. I'm a fan of the old retro-style cards that had graphics on them, and I recently got my hands on a pair of 8800 GTs with my absolute favorite design. I want to build an SLI system with both cards vertically mounted and then get the whole case custom painted in line with the design. I'd like the rig to be as high-end as possible without overbuilding it for the available graphics, because it would be cool to play stuff like Fortnite if at all possible. Questions are as follows:

     1. What CPU and memory would people recommend for this? I'm leaning towards a Q6600 (LGA 775) or a 4790K (LGA 1150) build with DDR3 memory; leaning towards the 4790K build because I heard 775 motherboards sometimes required special components to enable SLI.
     2. Any leads on extended SLI bridges? Since the cards will be mounted vertically, I may need more than a 120mm bridge. Also, are these cards late enough in the game that SLI bridges were optional?
     3. Any advice on SLI compatibility with such old cards? Will I be restricted to Windows 7 or earlier?
     4. Any leads on a case with 2x120mm rear fans, or one that would be good for mounting dual GPUs vertically? Since the cards have a graphic design, they both need to be mounted in the same direction. I have two Mnpctech Stage 2 Vertical Video Card GPU Mounting Brackets for this purpose, but I'd need a case with a vertical bracket and one (ideally two) 120mm rear fans, so the cards could both be mounted upright and in close proximity (considering the SLI bridge is probably already going to be stretched).

     Thanks in advance for any advice!
  9. drcheeseman

    Proposed Scrapyard Wars Season 8 Theme

    I have a great idea for Scrapyard Wars season 8, but I want to propose it to someone who can consider it without Linus's or Luke's (or the community's) knowledge, so it's a surprise if they like the idea. Does anyone know who comes up with the rules for the different seasons, so I could DM them my idea? Thanks in advance for any help!
  10. drcheeseman

    One Last Argument for Low End/Cheap Graphic Cards

    P.S. Here are screenshots of me running NVENC vs. CPU encodes with ffmpeg. The first times the encoding of a 1-hour video and shows NVENC finishing 7x faster than the CPU. The next two screenshots are CPU traces from the two runs, showing they're analogous to the Plex traces I posted earlier. I also tried to get VAAPI working as a comparison (Plex doesn't support VAAPI, but is planning to), but to no avail, despite the i3-7100 supposedly being on the list of supported hardware for h264 encoding.
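For anyone who wants to reproduce the comparison: a rough sketch of the two invocations I timed (the filenames are placeholders, and h264_nvenc obviously requires an NVENC-capable NVIDIA card and driver), plus a quick check that your ffmpeg build even has the encoder compiled in:

```shell
# CPU encode (libx264) vs. NVENC encode of the same source, e.g.:
#   ffmpeg -i input.mkv -c:v libx264 -preset medium cpu_out.mkv
#   ffmpeg -i input.mkv -c:v h264_nvenc nvenc_out.mkv
# Timing both (e.g. with `time`) is how you'd get a speedup figure.

# Check whether this ffmpeg build lists the h264_nvenc encoder at all:
if command -v ffmpeg >/dev/null 2>&1; then
  if ffmpeg -hide_banner -encoders 2>/dev/null | grep -q h264_nvenc; then
    RESULT="h264_nvenc present in this ffmpeg build"
  else
    RESULT="h264_nvenc not compiled into this ffmpeg build"
  fi
else
  RESULT="ffmpeg not installed"
fi
echo "$RESULT"
```

Note that the encoder showing up in `-encoders` only means it was compiled in; the encode itself will still fail without a supported NVIDIA GPU and driver.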
  11. drcheeseman

    One Last Argument for Low End/Cheap Graphic Cards

    Just for fun, and for anyone who's interested, here's a VCE build. It looks like you'll save an extra $15 if you go with the minimum hardware still on the market that supports VCE, though you may be able to find cheaper parts used to increase the value over the dedicated GT 730 solution I proposed. Without a graphics card, you could probably also use a cheaper proprietary power supply from a used-parts shop and/or go with a slimmer case that doesn't have expansion-card slots. The downside to this approach is of course that Plex doesn't support it, but that might not matter to people building a NAS who want to auto-transcode videos or do something similar. 201802-400 VCE Box.pdf
  12. drcheeseman

    One Last Argument for Low End/Cheap Graphic Cards

    So just for fun, I started up my laptop, tethered it to my phone, and had both my phone and laptop playing different episodes. I did one run with hardware encoding enabled and one with it disabled, and made sure to select videos I hadn't played before (this time I swapped the order: hardware encoding first and CPU second). The traces speak for themselves, I think: CPU usage pegged at first to fill the buffer and then fluctuated a lot between 40-80%, with 70% seeming to be the average (I could hear my CPU fan fluctuating too). The GPU run, by contrast, looked very similar to the single-video example, with the exception of more drama when the videos started. Maybe when it says it's limited to one stream, it means one frame at a time, and which video a given frame comes from is irrelevant to the encoder? Either way, I think I have another counter-example to add to the pile.
  13. drcheeseman

    One Last Argument for Low End/Cheap Graphic Cards

    Hadn't thought of that, so I went and played an episode from a season I hadn't watched yet and observed similar behavior. I'm 100% sure the GPU is in play, given the change in baseline memory usage. As for actual GPU core usage, that query isn't supported by this version of the Linux drivers for this card, so there may be an unknown limitation. Your point about only supporting a single stream is also notable and not something I had seen yet. Do you have a link to any product specs that describe the number of streams NVENC supports for different classes of cards?
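One way to watch session counts directly: newer drivers expose encoder statistics through nvidia-smi's query interface. A hedged sketch (the encoder.stats fields are driver-dependent and, like the core-usage query above, may simply be unsupported on older stacks):

```shell
# Query active NVENC session count and encoder throughput, if the driver supports it.
if command -v nvidia-smi >/dev/null 2>&1; then
  OUT=$(nvidia-smi --query-gpu=encoder.stats.sessionCount,encoder.stats.averageFps \
        --format=csv,noheader 2>&1) || OUT="encoder.stats query not supported by this driver"
else
  OUT="nvidia-smi not available on this machine"
fi
echo "$OUT"
```

Running this while a Plex transcode is active should show the session count tick up, which is a more direct signal than watching memory usage.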
  14. drcheeseman

    One Last Argument for Low End/Cheap Graphic Cards

    I can confirm that's false; aside from the screenshots I provided, I was also able to use the ffmpeg h264_nvenc encoder and confirm the CPU was not being touched while encoding. The article I linked points to a list of supported cards, but that list doesn't include any consumer cards; the paragraph at the top is the main clue as to which consumer cards support NVENC, namely anything with a Kepler, Maxwell, or Pascal GPU. I also have a friend who specializes in GPU architectures who confirmed my interpretation when I was shopping for a compatible card. Basically, I have proof by counter-example on this argument. That's probably another value proposition, but in my case, and possibly others like mine, I already had a working system and just needed to add something to support some level of hardware encoding. Also, specifically for Plex, NVENC is the only option, as Plex doesn't support VCE, so NVIDIA NVENC cards were the only choice. If, however, there's some edge case that needs VCE support but has no other use for the GPU, you're very right that an APU with an HD 7900 or later would probably fit the bill.
  15. TL;DR: If you just need a card that supports NVENC, then price is king and overall card performance doesn't matter. A specific example is turning your existing NAS into a Plex server that can use hardware encoding for h.264 playback when streaming videos directly from your server. So I actually have some disagreement with the statement made in the follow-up video to the low-end graphics card discussion in this video here: But I'm not upset about it, because their context never covered my particular case, and I actually think it would make a good video segment as well. Basically it comes down to one thing for me: NVENC support. I built myself a 10TB ZFS raidz storage server and wanted to install Plex on it (I had it on my desktop, but I power cycle that constantly switching between work and gaming boot drives, so I wanted something that would run 24/7). The server itself is relatively low-end hardware, as it is mostly used for media streaming and long-term, infrequently accessed storage: a Core i3-7100, 8 GB of RAM, and the aforementioned 10TB ZFS pool, running Ubuntu 16.04; very much a budget archive for lots of data. That said, I wanted to stream videos from it to my phone or other devices while away on business trips. To do that without pegging the CPU, I needed NVENC support, which, as the article below points out, mainly depends on the architecture of the card and not on whether it is high-end (it comes down to the GPU being a desktop Kepler, Maxwell, or Pascal GPU): https://help.elgato.com/customer/portal/articles/2471964-which-nvidia-graphic-cards-do-support-nvenc-technology- Thus for my 10TB ZFS raidz I went with a used Zotac GT 730, but I could have easily gotten ANY GT 7XX desktop card with a Kepler GPU.
I'm actually considering that, because there are some low-end entry models that are completely passive, and the value of the Zotac GT 730 seems to have plateaued, so I think I can get my money back on the card I bought initially and upgrade to passive cooling. Regardless, what I observed with my setup was that without hardware encoding, my CPU had all 4 cores very busy when encoding h.264 video to a 4G mobile endpoint. With it enabled, the CPU stayed mostly quiet and, to my surprise, so did the GPU (as observed through nvidia-smi). So when it comes down to it, if you're just trying to get a card for NVENC support, price trumps all, and RAM capacity, outputs, etc. don't need to be considered (I actually use the onboard graphics controller for my rig). I attached two screenshots of me streaming a video to my phone over a 4G connection: the first is the CPU-only run, where you can see utilization fluctuating around roughly 40-60% on all cores as it refills the streaming buffer, versus the GPU-accelerated version, where you just see one core bouncing around (probably doing network-related work) while the GPU takes care of the rest. Take note: the GPU barely moves in terms of utilization, showing only a 32MB increase in memory use. This effectively multiplies the number of clients I can stream to by the number of cores I have. So in this specific example, with no hardware acceleration I could maybe get 2 devices streaming while maxing out my CPU and doing nothing else on my NAS, but with hardware acceleration I could easily encode h264 for 8 clients, or stream to 4 and still have 2 cores' worth of processing power free. There is a limitation, however, in that this only works when directly playing videos remotely: for some reason Plex was inconsistent about whether it would use the GPU to encode while using the sync feature (it seemed to default to CPU/direct file upload), and local-network playback was almost always direct file playback.
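If you want to reproduce the nvidia-smi observation from the paragraph above, a minimal sketch (the query fields assume a reasonably recent driver):

```shell
# One snapshot of GPU utilization and memory; add `-l 1` to poll every second
# while a stream is transcoding (that's where the ~32MB memory bump shows up).
if command -v nvidia-smi >/dev/null 2>&1; then
  SAMPLE=$(nvidia-smi --query-gpu=utilization.gpu,memory.used \
           --format=csv,noheader 2>&1)
  SAMPLE=${SAMPLE:-"no output from nvidia-smi"}
else
  SAMPLE="nvidia-smi not available on this machine"
fi
echo "$SAMPLE"
```

Comparing a snapshot taken before starting a stream with one taken mid-stream is enough to see the memory delta, even on drivers that don't report per-engine utilization.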
The counter-argument is just to get an Nvidia Shield, but if I already have a NAS, or I want my Plex box to also function as a NAS, then adding a $40-$50 card that supports NVENC is much cheaper than the price of a Shield. For someone wanting to build a combined NAS and dedicated Plex server, the value proposition of adding an entry-level Kepler video card to the NAS is hard to beat. I even attached two builds for a 1TB mirrored barebones NAS with 8GB of RAM that would support Plex hardware encoding: just under $400 if you want a slick-looking Fractal Design Node 202 case, or a 'fat' version of the same hardware for just over $330 if you'd rather go with the Cougar MG110 and 3.5" drives. You can obviously swap in a different motherboard to get more SATA ports and add drives to your NAS spec as desired, though if you want to stick with the ASRock embedded solutions, you'll need to switch to a 1x PCIe card such as the Zotac GT 710, which may actually be desirable because it's passively cooled, making your entire NAS box fanless. In summary: I have NO issues with the arguments presented in the videos about the value proposition of GPU performance per dollar. But if you just need NVENC, GPU performance is moot compared to whether the card supports NVENC at all. This is purely an edge case where I only needed an NVENC-enabled card, and after digging to the bottom of the barrel for available cards, I found the NVENC performance was all I needed, and the value proposition for someone trying to do the same is very hard to beat. Thanks for reading, and I would love to see the Linus Media Group team give an overview of NVENC and its utility for non-gamers or NAS owners wanting to take some load off their CPU when streaming videos through Plex. Love the channel and will continue watching even if this post turns into a flame on my build :).
EDIT 201802171209EST: It was noted in the replies that AMD APUs support VCE, a similar hardware encoding feature. That could also work if you're building a budget system from scratch, but for those with an existing system whose socket is incompatible with an AMD APU, I still think a budget Kepler card is the best way to go. Additionally, if you plan on using hardware encoding for Plex (as I was), VCE isn't supported anyway, so an NVIDIA NVENC-enabled card is the only way to go. (VCE: https://en.wikipedia.org/wiki/Video_Coding_Engine) 201802-400 PLEX BOX_Cougar.pdf 201802-400 PLEX BOX_Pretty.pdf