
Emily Young

LMG Staff
  • Posts

    1,213

Everything posted by Emily Young

  1. There were some old builds floating around from the TransGaming days, but I had to compile it myself. It was RIDICULOUSLY annoying to do, as it required Visual Studio and a bunch of other stuff, and Windows isn't exactly an ideal build environment. Here's what I ended up with. You typically only need to copy the d3d9.dll into the same folder as the game's executable, and it only works with DX9 (or OpenGL with the libGL DLLs).
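     A minimal sketch of that "drop the DLL next to the game" step, in Python. The paths here are placeholders for your own build output and game install, and the DLL's architecture (32- vs 64-bit) would need to match the game's:

        # Copy a compiled d3d9.dll next to a game's executable.
        # Hypothetical paths -- adjust for your build output and game folder.
        import shutil
        from pathlib import Path

        dxvk_dll = Path(r"C:\builds\d3d9.dll")     # your compiled DLL (match the game's bitness)
        game_dir = Path(r"C:\Games\SomeDX9Game")   # folder containing the game's .exe

        shutil.copy2(dxvk_dll, game_dir / "d3d9.dll")
        print(f"Installed {dxvk_dll.name} into {game_dir}")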
  2. We FINALLY got our hands on AMD’s new EPYC Rome processor – Let’s see what 64 Zen 2 cores can do, shall we? Buy a Supermicro H11SSL-NC with 64-core EPYC pre-installed: On Newegg: https://lmg.gg/8KVPJ Or buy a Threadripper instead: On Amazon: https://geni.us/RqprkC On Newegg: https://lmg.gg/8KVPL Or maybe settle for a consumer Zen 2 CPU: On Amazon: https://geni.us/0i5Hu On Newegg: https://lmg.gg/8KVPB
  3. We cut the test around 19 minutes into each run (when the first Intel test started throttling hard and the water, appropriately enough, hit 69°C). By that time, AMD was done and its water temperature was lower - notice that it never got a chance to significantly blacken the dye the way the Intel runs did.
  4. We have two 8-core CPUs: One at 105W, the other at 95W – But those numbers are completely arbitrary. Let’s test them and see for ourselves which will heat up faster! Buy a Core i9-9900K: On Amazon: https://geni.us/A7fMnMq On Newegg: https://lmg.gg/8KVmd Buy a Ryzen 7 3800X: On Amazon: https://geni.us/c8jRoGK On Newegg: https://lmg.gg/8KVmy
  5. Oh boy, MCT. Heaven forbid you insert an exFAT-formatted drive...
  6. It definitely was something stupid: We didn't realize the drive wasn't fresh... The problem was just that it was already formatted. It wasn't talking about physical sector size, it was talking about the formatted cluster size. WHY it wouldn't just nuke the drive when adding it to a zpool or otherwise say "hey, there's data on this drive, you sure about this fam" is beyond me.
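     For anyone hitting the same wall, here's a hedged sketch of the workaround I'd expect to work: clear the old filesystem signatures first, then add the disk. The device and pool names below are placeholders, and both commands are destructive, so triple-check them before running anything:

        # DESTRUCTIVE: wipes old filesystem signatures, then adds the
        # now-blank disk to an existing pool. Placeholder names throughout.
        import subprocess

        device = "/dev/sdX"   # placeholder -- verify with lsblk first!
        pool = "tank"         # placeholder pool name

        subprocess.run(["wipefs", "--all", device], check=True)
        subprocess.run(["zpool", "add", pool, device], check=True)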
  7. That didn't take long - A whole Petabyte of storage is now completely full, so come along for the ride while we deploy a band-aid solution! Buy Seagate IronWolf Pro hard drives: On Amazon: https://geni.us/PSstrc On Newegg: https://lmg.gg/8KVmr
  8. Basically, yeah. I was also taken to task on it being too high in the overclocking guide video, as a result of The Stilt's findings on silicon degradation past 1.325V... but he's talking about high-current loads. In other words, even a setting above 1.35V isn't going to reach 1.325V at full load, due to vdroop, unless you've got really aggressive LLC enabled - so it's a safe value. Steve over at Gamers Nexus did a video on undervolting 3rd-gen Ryzen and found that Ryzen can and will undervolt well, unless you go too low.

     In the case of the overclocks I was able to achieve, thermals were the big constraining factor on the 3900X, while the 3700X and 3600 were limited primarily by the silicon - higher voltages did nothing for stability or performance. TL;DR: A 1.35V maximum manual voltage setting is mainly a guideline based on my observations and not wanting to stray too close to the sun on silicon degradation. The maximum safe voltage in general was determined to be 1.47V, although transient voltages will exceed that even at stock, and that's fine - it's sustained voltage you need to worry about.

     A note regarding voltages and hardware monitors: Some core voltages will read "high" at idle because software can't read the state of a parked core, which means the last measurement before parking becomes the "current" measurement, whether it's accurate or not. Ryzen Master can show the status of each core and its true current clock, but unfortunately doesn't show each core's voltage the way HWiNFO does. Much of this will also vary from board to board, BIOS to BIOS, and chip to chip. Early BIOSes had poor voltage management, so in those cases the CPUs might have been starved or otherwise not boosting properly. This is less of a concern now.
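     To put rough numbers on the vdroop point (illustrative values, not measurements): load voltage is approximately the setpoint minus load current times the effective load-line resistance, which is how a 1.35V setting ends up sitting near 1.28V under an all-core load:

        # Illustrative vdroop arithmetic -- none of these values are measured.
        v_set = 1.35        # manually configured core voltage (V)
        i_load = 100.0      # hypothetical all-core load current (A)
        r_ll = 0.0007       # effective load-line resistance (ohms); depends on your LLC setting

        v_load = v_set - i_load * r_ll
        print(f"Effective load voltage: {v_load:.3f} V")   # ~1.280 V, under the 1.325 V ceiling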
  9. This is ridiculously cool and I'm relieved they didn't end up in a landfill somewhere. I'm not sure enough of it still works to make a video on it... I found a Windows Central article that goes over everything they got it to do, and there's not much to see. Still, this is a curio and a piece of gaming history I definitely want to see preserved... I'll see what @LinusTech thinks during our next content meeting.
  10. I feel this will be one of the final hurrahs for Parallels, since it's come to light that macOS Catalina is baking in support for virtualization. Most likely, Apple is working on their own solution, one that utilizes qemu/kvm, which means a dGPU could be passed through (or they could be working on a Virgil implementation). Parallels certainly isn't helped by the fact that it's been becoming less relevant: people switching from Windows to macOS are often already familiar with the platform, and many major applications are cross-platform. Game performance has always been abysmal, and they haven't focused on it in a long time; unless something major changed in the new update, I doubt that's different now, either - but I haven't looked too closely at it yet. If it comes out that Apple's rolling their own VM solution, we'll probably have to test Parallels, since its entire raison d'être is going to be jeopardized. Parallels still gets brownie points for the way it manages to tightly integrate its Coherence mode and what have you, but Apple has a way of applying a layer of polish others lack, so even that may not be quite enough... unless Apple limits it to macOS VMs or to the Mac Pro in particular.
  11. Thanks to LG and Seasonic for sponsoring this video! Buy the LG 34GK950G: http://bit.ly/2QIAY8M Buy on Amazon: https://geni.us/34GK850GLTT Buy Seasonic PSUs on Amazon: https://lmg.gg/8KV6P Buy Seasonic PSUs on Newegg: https://lmg.gg/8KV61 Enter LG's 27GL850 Gaming Monitor giveaway on Instagram at https://lmg.gg/8KV3c In part 1 of this series, we looked at how PC gaming evolved from 2009 through 2014 - Now, our story continues to present day. How far have we come? Watch part 1: Buy a Ryzen 7 3700X: On Amazon: https://geni.us/5oZt On Newegg: https://lmg.gg/8KV3d Buy a GeForce RTX 2080 Ti: On Amazon: https://geni.us/4OGK On Newegg: https://lmg.gg/8KV3y
  12. "According to FIT, the safe voltage levels for the silicon are around 1.325V in high-current loads and up to 1.47V in low-current loads (i.e ST)" Right, so based on that, setting 1.35V and having vdroop bring it down to 1.28V or whatever will be within his recommendation, since he's talking about high-current, multi-core loads. So make sure that your high-load voltages don't exceed 1.325V (that is, don't set a too-aggressive voltage target or LLC) and you're fine.
  13. I believe The Stilt found that 3rd-gen chips begin to degrade past 1.325V under sustained high-current loads, although stock operating voltages with XFR can incidentally pass 1.4V. With 1.35V configured manually, vdroop means it's really running at around 1.28V when all cores are under load, depending on your load-line calibration setting. In any case, setting over 1.35V didn't seem to help stability much in my testing, so for all of those reasons, that's as high as I'd feel comfortable recommending.
  14. Overclocking AMD’s 3rd-gen Ryzen CPUs is pretty different from previous generations – So it’s time for an update to our Ryzen overclocking guide! Buy a Ryzen 5 3600: On Amazon: https://geni.us/AyDSv On Newegg: https://lmg.gg/8KVz7 Buy a Ryzen 7 3700X: On Amazon: https://geni.us/MB2V On Newegg: https://lmg.gg/8KVzM Buy a Ryzen 9 3900X: On Amazon: https://geni.us/Ae4bF On Newegg: https://lmg.gg/8KVzC
  15. Have gaming PCs always looked the same? How did performance scale over time? Let’s take a trip down memory lane and see what gaming PCs looked like in 2009 through 2014! LG 34GK950G: http://bit.ly/2QIAY8M Buy on Amazon: https://geni.us/34GK850GLTT Buy Seasonic PSUs on Amazon: https://lmg.gg/8KV6P Buy Seasonic PSUs on Newegg: https://lmg.gg/8KV61
  16. tfw you aren't verified on any platform but someone asks to prove who you are

    Screenshot_20190804_191503.png

    lewdicrous

      You need your own channel, only then would they learn! /s

  17. Every now and again I like to run through the forums and see if I've had a mention that didn't include an @. Had a few occasions where people were looking for my opinion on something but didn't actually tag me, so it's kind of become a habit at this point. It's been something of a blitz, yeah. I don't really want LTT to be "another Gamers Nexus" since I think Steve does that better than we could and I think there's value to the oftentimes more surface-level reviews we do. However, I do want to round it out with more in-depth coverage where it makes sense - Though in the case of the memory video, of course it turns out that others like Buildzoid and even Tech Deals beat us to the YouTube punch. At least I feel we brought a little more to the table this time.
  18. Everyone knows Ryzen LOVES fast RAM... But AMD made some tweaks to the 3000 series that make it less straightforward. So what should YOU buy for third-gen Ryzen? Buy a Ryzen 5 3600: On Amazon: https://geni.us/AyDSv On Newegg: https://lmg.gg/8KVz7 Buy a Ryzen 7 3700X: On Amazon: https://geni.us/MB2V On Newegg: https://lmg.gg/8KVzM Buy a DDR4-3000 Samsung B die memory kit (for low-speed overclocking): On Amazon: https://geni.us/nJzR On Newegg: https://lmg.gg/8KV3h Buy a DDR4-3600 Samsung B die memory kit: On Amazon: https://geni.us/JOIcw4r On Newegg: https://lmg.gg/8KV39 Buy a DDR4-3800 Samsung B die memory kit: On Amazon: https://geni.us/8Q3M0nA On Newegg: https://lmg.gg/8KV3w
  19. In this case, if you're not doing much besides gaming, the RTX 2070's new encoder (with quality comparable to x264 fast) will provide you with... well, quality comparable to CPU encoding, particularly for streaming. There's little reason to go with AMD's on-stage demo of x264 slow for a stream, so GPU encoding really is a good option all around, so long as it works for you. As for the performance impact, a rough estimate is around 5%. You might see a lower impact with CPU-based x264 encoding given enough threads, but... well, that depends greatly on what the system is doing.

     For overall performance, the 3700X has more "headroom" for system threads and other things (like having a Chrome tab open while streaming to monitor chat / stream quality, plus any overlays in your stream setup), but it won't significantly differ from the 3600 in scenarios where the extra threads aren't being used. If you have the budget headroom, I'd suggest the 3700X for "future proofness", if such a thing exists, since more and more games are taking advantage of higher thread counts and will likely continue to do so as the "traditional" consoles roll with 8 cores.

     In terms of priority, here's what I'd do: RTX 2070 if you can get it for cheap; otherwise, RTX 2060 SUPER or RX 5700, depending on whether you think RTX is worth the extra $50+ USD. If you can help it, do not go lower than this performance level at this budget. Next, quality memory, preferably something with Samsung B die or Micron E die. On its own it won't make a huge difference, but the extra performance (and performance consistency) you can squeeze out of the CPU by tightening the timings is quite tangible. Finally, the CPU choice: If you do anything other than streaming and gaming, or if you're worried about headroom for future titles, aim for the 3700X for the extra threads. If gaming is all you care about and your stream output is handled by GPU encoding, then the 3600 is the obviously better choice, since it's much less expensive and similar in per-thread performance.

     Of far lesser significance is the chipset, unless you're going with an older B450 board or attempting an A320 build. You'll want to do your homework on the motherboard: whether it'll run your CPU and, crucially, run it well. Many B450 boards have poor VRMs, which could make overclocking the 3700X (including via Precision Boost) less viable, and you can forget about it entirely with A320. In addition, I may be wrong, but I don't believe the B450 chipset supports Precision Boost Overdrive. That's not a big deal if you plan to assign per-CCX clocks (recommended), but it's something to keep in mind. It's worth noting that multiple vendors are producing new B450-series boards with expanded BIOSes to support the whole lineup and implement newer features. In making your decision here, do not care one iota about PCI Express 4.0 right now. It's not worth worrying about unless you absolutely need very fast storage (game load times will likely not improve, and GPUs can only take advantage of it in an academic sense).
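     If you want to eyeball the NVENC-vs-x264 quality question yourself, one hedged way is to encode the same clip both ways with ffmpeg at a stream-like bitrate and compare by eye. File names here are placeholders, and h264_nvenc requires an ffmpeg build with NVENC support:

        # Encode the same clip with NVENC and with x264 "fast" for comparison.
        import subprocess

        src = "gameplay.mkv"   # placeholder source recording
        for codec, preset, out in [("h264_nvenc", None, "nvenc.mp4"),
                                   ("libx264", "fast", "x264.mp4")]:
            cmd = ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", "6000k"]  # ~6 Mbps stream bitrate
            if preset:
                cmd += ["-preset", preset]
            subprocess.run(cmd + [out], check=True)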
  20. The 3700X is a good middle ground I'd say. If you wanted more gaming performance, the 3600 makes more sense since it performs about the same for a lower price, but the extra cores are useful for a number of reasons. Regarding overclocking, the 3800X would be better binned from what I understand (we don't have one yet), but so far we're basically getting similar OC results across the board, so it's really not worth worrying about. Despite having to do so much work all at once because of a... Let's call it slightly hurried launch, I had a lot of fun testing the new Ryzen chips, and I suspect you will, too. Pay particular attention to memory timings - We've got a video on Floatplane right now that should hit YouTube sometime in the next week detailing memory performance, and the end result is it's virtually all about timings and the Infinity Fabric clock's relation to the memory clock. Depending on the game, gaming performance can receive a pretty big uplift.
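     The Infinity Fabric relationship mentioned above comes down to simple arithmetic: the DDR4 rating is double the real memory clock, and the sweet spot on 3rd-gen is running the fabric 1:1 with that clock (assuming typical Zen 2 behavior, where the fabric falls back to a 2:1 divider above roughly DDR4-3600/3733):

        # DDR4-3600 -> 1800 MHz memory clock -> 1800 MHz FCLK at 1:1.
        ddr_rating = 3600           # advertised transfer rate (MT/s)
        mclk = ddr_rating / 2       # real memory clock (MHz)
        fclk = mclk                 # 1:1 coupled Infinity Fabric clock (MHz)
        print(f"MCLK = {mclk:.0f} MHz, FCLK (1:1) = {fclk:.0f} MHz")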
  21. Here's my $0.02:

     CPU: I'd go with the 3700X for a streaming setup. It's a little strangely positioned in the product stack, but the extra threads over the 3600 will make a difference when streaming. If it came down to the 3600 vs the 2700, then for the same reason I'd go with the 2700, even though it's slower than the 3600 for gaming. The difference in IPC is tangible, but it's not life-changing, and those extra threads will help keep the stream from dropping frames when something other than your game and OBS decides to use the CPU. The one caveat is if you decide to use GPU encoding instead, in which case the 3600 is the clear choice for better game performance.

     GPU: If you can get the RTX 2070 for cheap, do it. It's faster than the RTX 2060 SUPER in most cases (the 2060 SUPER IIRC has a clock speed advantage but fewer CUDA cores), and it's discontinued - meaning chances are there'll be clearance deals soon if they haven't already started. Beyond that, if you're considering the GTX 1660, I'd lean hands-down towards spending the extra for the RX 5700. It nearly matches the RTX 2060 SUPER in many scenarios, and even its reference design is pretty decent for cooling, unlike the 5700 XT's. In our testing, AMD's new hardware encoding engine wasn't great for streaming, but apparently that's down to poor support in OBS and AMD's insistence to us that it should all just work without any updates. EposVox and Wendell did a video on it, and it actually performs very well.

     The caveat to the GPU selection, of course, boils down to whether you want RTX or not. Support isn't widespread right now, but now that Unity, Unreal, and other engines support it out of the box, you can expect that to change rapidly. If you play a lot of Battlefield V, I would argue that enabling RTX even at the lowest quality setting not only improves the quality of reflections, but can give you an advantage as long as your frame rate can keep up (it really depends on the resolution). The reason is simple: You can see things on reflective surfaces that aren't already in your field of view. That means you can be facing away from someone coming up behind you or around a corner and see their reflection, alerting you to the danger. This is currently exclusive to Battlefield V, mind you; most other RTX implementations are all about shadows, which are mostly eye candy for better immersion. Making things even more complicated, there's also no reason why AMD couldn't make their Radeon cards do DXR fallback like Pascal and newer GeForce cards without RT cores; it's a feature of DX12 that requires no new hardware, though to be clear, the performance penalty is severe (but potentially worth it if you can still achieve 60+ FPS at 1080p).

     A final note: Regardless of which build you go with, I'd recommend getting the fastest RAM you can afford, up to a point. We have a video on that on Floatplane right now, but the TL;DR is that the tighter you can push the timings, the better, even if it's otherwise "slow" memory (e.g. DDR4-2133). Aim for 3200 or preferably 3600 and use DRAM Calculator for Ryzen to try and dial in the timings. The type of DRAM your memory modules use matters here; Samsung B die is ideal, but Micron E die is reportedly good, too. With tight enough timings, you can get a significant performance uplift over stock or even "default" XMP in many games.

     Overall, I don't think you can really go wrong at this budget level right now. Things have been shaken up so completely that even if you don't buy current-gen, you can get what was hot gear last gen for relatively cheap (like the 2700). It's a really good time to build a PC.
  22. This is patently ridiculous for AMD to say. Of course it introduces more complexity for a scheduler - yes, it's CCX-aware, but if you've got a choice between two CCXes for your extra game threads and one of them happens to be on another CCD/chiplet from your main thread(s), there's a whole lot more latency involved in crossing that boundary than just the CCX boundary, and from what I've seen, Windows currently doesn't have any awareness of that. All of our testing was done with the most up-to-date patches and BIOS revision available at the time, including the CCX-aware optimizations in 1903.

     It IS true that there isn't any additional complexity from the game's perspective... at least not in theory. It's all transparent, after all (hence the "topology appears monolithic" comment), and ironically this is also why CCX scheduling was a problem: If it's not known which CCX a thread is on, then it's not possible to ensure any related threads are also on that CCX. To say that no scheduler optimizations can be made to avoid this is, at least from my observations, completely false. Why else would forcing a game to run solely on one CCD improve performance, if not because it was crossing the CCD boundary before? And if the scheduler is supposed to be CCD-aware, then it's currently broken, and the "optimization" would simply be fixing it.

     At any rate, I wouldn't say that wrangling this issue is a showstopper. Ryzen 1st and 2nd gen also had scheduler issues that prevented them from reaching their full potential out of the gate, and that 3rd-gen Ryzen runs as well as it does is a good endorsement in my eyes. I just don't know why AMD would be damage-controlling this rather than looking into it more closely. Maybe it is our setup that's the problem, but to date nobody from AMD has approached me to talk about our results.
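     For anyone wanting to reproduce the "force a game onto one CCD" observation, here's a hedged sketch using psutil. The process name is a placeholder, and the assumption that logical CPUs 0-11 map to CCD0 (as on a 3900X with SMT) should be verified against your own topology:

        # Pin a running game to one CCD's logical processors.
        # Assumes CCD0 = logical CPUs 0-11 (3900X-style layout) -- verify first.
        import psutil

        game = next((p for p in psutil.process_iter(["name"])
                     if p.info["name"] == "game.exe"), None)   # placeholder process name
        if game:
            game.cpu_affinity(list(range(12)))                 # restrict scheduling to CCD0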
  23. Apple’s new Mac Pro will max out at 28 CPU cores, but that’s projected to cost as much as a midrange sports car. Nothing a bare metal Hackintosh can’t fix… Buy a Gigabyte C621 AORUS XTREME On Amazon: https://geni.us/K0UF On Newegg: https://lmg.gg/8KVzS Buy a Xeon W-3175X On Amazon: https://geni.us/lnX1 On Newegg: https://lmg.gg/8KVzr
  24. Replying in "Hi Gabon jr do you know if there be a 2080 supe…":

    Depends on how many games you want to store on the SSD. Realistically, you'd get better overall value out of a smaller but fast NVMe SSD plus a larger SATA SSD (or, if your collection is particularly huge like mine, a 7200RPM HDD). For most games, load times are bound by the CPU/GPU/RAM past a certain point, and storage speed increases won't yield significantly faster load times, especially if the game is the only thing running from the drive. SATA SSDs by and large still hit that point of diminishing returns and are often much less expensive per GB (though this is changing slowly - WD/SanDisk NVMe drives can be relatively inexpensive).
  25. He says it's around 10-series NVENC level of quality with the new plugin, if I'm not mistaken - still good, but not quite up to par - though he's impressed with the AVC encoder. It's unfortunate that we didn't have the time to do a deep dive like Epos and Wendell did... Perhaps I'll ping him next time I'm doing this kind of testing and see if he's made any new observations, since he's far more focused on that side of things than we are.