Frankenburger

Everything posted by Frankenburger

  1. The way it's currently set up, the modem is wired into the Nighthawk, and the mesh is wired into the Nighthawk behind it. It's really not a big enough deal to go out and buy new equipment; I'm just looking to see if there's a way to configure the router/mesh settings. The Nighthawk is 192.168.1.1 and the mesh is 192.168.86.1. I've tried manually changing the mesh's subnet to match the Nighthawk's, but that didn't work.
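     For what it's worth, the root of the problem is that the two routers run separate subnets behind separate NATs, so broadcast/discovery traffic never crosses from one side to the other. A quick Python sketch illustrates the mismatch (the router subnets are the ones from my setup; the device IPs are made up for the example):

     ```python
     import ipaddress

     # The two LANs, assuming both routers use the default /24 mask.
     nighthawk_lan = ipaddress.ip_network("192.168.1.0/24")
     mesh_lan = ipaddress.ip_network("192.168.86.0/24")

     pc = ipaddress.ip_address("192.168.1.50")        # hypothetical PC wired to the Nighthawk
     printer = ipaddress.ip_address("192.168.86.20")  # hypothetical printer on the Mesh

     # Printer/TV discovery relies on broadcast/multicast, which stops at
     # the subnet boundary - each side only ever "sees" its own LAN.
     print(pc in nighthawk_lan)       # True
     print(printer in nighthawk_lan)  # False
     print(printer in mesh_lan)       # True
     ```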
  2. They're both right next to the modem. Bridge mode is not available, only mesh mode.
  3. So there's no way to configure DNS or DHCP between the two so their connected devices can communicate? Unfortunately, eliminating either one is not an option. Both the Nighthawk and the Google Mesh are necessary for different reasons, and getting rid of either would be more of an inconvenience than just manually switching networks when necessary.
  4. Even so, the Mesh has much slower wireless speeds than the Nighthawk. I'd rather deal with the inconvenience of having to switch networks than lower the network speed.
  5. That's not possible. We have 3 systems that require a hardwired connection.
  6. I tried a factory reset of the mesh earlier, and it didn't give me the option to select bridge mode; it automatically locked me into mesh mode. Also, using just one network isn't an option. The reason we got the mesh is that we have dead zones.
  7. I have a Google Mesh network plugged into my Nighthawk router. The Mesh is used for low-priority devices and devices that otherwise couldn't reach my Nighthawk. Because of this, the devices connected to the Nighthawk don't see the devices connected to the Mesh, and vice versa. For example, my PC (wired to the Nighthawk) can't see the printers on the Mesh, and the living room TV (connected to the Mesh) can't see my media server on the Nighthawk. Unfortunately, when setting up the Mesh, it defaults to mesh mode instead of bridge mode, and it won't let me configure it to bridge from the Nighthawk. Hence, I technically have two networks.
  8. Honestly, as long as you're not benchmarking, I wouldn't worry about the 1% and 0.1% lows. Keeping an eye on the frametime graph is going to be a lot more helpful, since it shows you the real-time variance.
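     For reference, here's roughly how those lows are derived from a frametime capture. One common definition is the average FPS over the slowest 1% (or 0.1%) of frames; a minimal sketch with made-up frametimes:

     ```python
     # Made-up capture: mostly 60 FPS frames with a handful of hitches.
     frametimes_ms = [16.7] * 980 + [33.4] * 15 + [50.1] * 5

     def percent_low(frametimes, fraction):
         """Average FPS over the slowest `fraction` of frames (one common definition)."""
         worst = sorted(frametimes, reverse=True)
         count = max(1, int(len(worst) * fraction))
         return 1000.0 / (sum(worst[:count]) / count)

     avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
     print(f"avg FPS:  {avg_fps:.1f}")                            # ~58
     print(f"1% low:   {percent_low(frametimes_ms, 0.01):.1f}")   # ~24
     print(f"0.1% low: {percent_low(frametimes_ms, 0.001):.1f}")  # ~20
     ```

     Notice the average looks fine while the lows tank, which is exactly why the frametime graph tells you more about how the game actually feels.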
  9. Possibly, but if you can pass a RAM stability test, then it shouldn't be a problem.
  10. Stutter can be caused by a number of things: poor graphics settings, background processes, hardware, power, drivers, and even frame times. For me, the most common cause of stutter is Vsync, so I usually disable that and see if it goes away. Here's an example: https://i.imgur.com/G5p0Sb9.jpg (Vsync on, poor frametimes) vs. https://i.imgur.com/TSvGPPA.jpg (Vsync off, great frametimes). I would configure MSI Afterburner to display your frametimes. From there, disable any unnecessary background processes, including gaming and peripheral-related software, and set the game's graphics down to low. If you're still getting stuttering, that should narrow it down to something driver, hardware, or OS related.
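      If you'd rather not eyeball the graph, most of these tools can also log frametimes to a file, and you can flag the spikes programmatically. A minimal sketch, assuming a plain text log with one frametime (in ms) per line - adjust the parsing to whatever your logger actually writes:

      ```python
      # Flag frames that took noticeably longer than average.
      # "frametimes.log" is a placeholder path, not a real Afterburner file.
      def find_spikes(path, factor=2.0):
          with open(path) as f:
              times = [float(line) for line in f if line.strip()]
          avg = sum(times) / len(times)
          # A frame taking 2x the average reads as a visible hitch.
          return [(i, t) for i, t in enumerate(times) if t > factor * avg]

      for frame, ms in find_spikes("frametimes.log"):
          print(f"frame {frame}: {ms:.1f} ms")
      ```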
  11. Monster Hunter World (Custom ReShade @ 5120x2880)
  12. Guild Wars 1 by a long shot. I've spent over 2000 hours in that easily. It's a shame nobody plays it anymore, and GW2 isn't what I wanted as a sequel. Diablo 2 was a close second, and I used to play Counter Strike Source religiously, but those two are hard for me to put a number on since time tracking wasn't a thing back then.
  13. Definitely use NVENC to avoid a hit to your CPU's performance. Make sure you use the advanced options and set the bframes to either 3 or 4. Also, you could run 1080p/60fps if your bandwidth allows for it, but I'd personally go for 1600x900/60. Most people won't be watching your stream in full 1920x1080 anyway, so you might as well size the resolution down a little, which should help retain good image quality.
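      The reasoning behind dropping to 1600x900 is bits per pixel: at a fixed bitrate, fewer pixels means more data per pixel and cleaner motion. Quick back-of-the-napkin math, assuming a 6000 kbps stream:

      ```python
      # Bits per pixel at a fixed stream bitrate (6000 kbps assumed here -
      # plug in whatever bitrate you actually stream at).
      BITRATE_BPS = 6000 * 1000

      def bits_per_pixel(width, height, fps, bitrate=BITRATE_BPS):
          return bitrate / (width * height * fps)

      for w, h in [(1920, 1080), (1600, 900)]:
          print(f"{w}x{h}@60: {bits_per_pixel(w, h, 60):.3f} bpp")

      # 1920x1080@60: 0.048 bpp
      # 1600x900@60:  0.069 bpp  -> roughly 44% more data per pixel
      ```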
  14. I've been running SLI since 2014, currently on SLI 1080Tis. Stability really isn't an issue: either SLI works, or it doesn't. If SLI doesn't work, you can often get it working by changing the compatibility bits. On the rare occasion SLI causes negative scaling and changing the bits doesn't help, you can simply disable SLI for the game's process. My only recommendation would be to run games with DSR, SSAA, or graphics mods. Of course, you don't have to, but SLI is easy to bottleneck at 1080p. Most games simply don't stress modern high-end GPUs enough for SLI to really show its stuff until you start rendering above 1440p.
  15. It really depends on the games you play and how much effort you're willing to put into tinkering with SLI settings. For average users, SLI really isn't worth it, because many games either aren't SLI compatible from the get-go or don't need SLI to run well. For people who don't mind spending time on system tweaks and can accept that SLI won't always work, there's no reason not to go with it. SLI compatibility can easily be expanded with driver tweaks, and older games can benefit from SLI if you supersample and/or use performance-heavy ReShade mods, but this isn't the case for most users.
  16. Spyro Reignited 4k RTGI + SLI
  17. Did you know that the 3 most common GPUs found in gaming rigs are the 1060, 1050Ti, and 1050, making up a combined 30% of the gaming community? Hell, there are more people rocking the 750Ti than the 1080Ti. https://store.steampowered.com/hwsurvey/videocard/ Yes, people run at lower settings than ultra, and probably even lower than high in a number of cases. Why? Because they don't have the hardware required to run at a stable FPS using ultra settings on a 1060 or lower. Even I turn certain settings off or down to low to save performance. Godrays in FO4, Volume Rendering in MHW, SSR in MHW, Volumetric Lighting in RE2R, etc. are all settings I personally turn down or off. A lot of people disable motion blur, and a lot of people don't like depth of field. As I said before, ultra settings are a status symbol. Developers include ultra as a novelty for users to take advantage of if they so choose.
  18. I can easily flip the question back at you: "Why test at ultra settings when most users have rigs that can't run those settings, and will turn them down?" There's a reason optimization guides exist: most people would rather play at a stable 60/120/144 FPS regardless of settings. Most people recognize that ultra settings are pointless without a side-by-side comparison. Sites that simply test games at ultra or on the highest preset are being disingenuous to their viewers. Even though 8k gaming is still years away from being a reasonable expectation in high-fidelity titles, I'm perfectly OK with seeing 8k medium benchmarks, because I'm perfectly OK with seeing 1080p medium benchmarks too. The more testing done, the better.
  19. All the while ignoring the fact that you can run at a higher resolution, at a higher framerate, and with access to graphical mods? I'll take 4k medium/high with ReShade at 144 FPS over 1440p Ultra/Max at 120 FPS any day of the week.
  20. "Max settings" is a status symbol and nothing else. Games are more often optimized around medium settings, and high settings often looks quite similar to ultra/max without a huge performance penalty. There's zero point in turning up the sample counts of shadows, lights, post processing, etc just to edge out over medium/high at 70% a reduction in performance.
  21. What you're seeing is called motion artifacting, and it's honestly about what I'd expect considering you're using the fast preset. You can reduce the amount of motion artifacts by raising the preset level, using custom me or subme variables, or perhaps even changing the tuning, but you will always have some because of hardware limitations and the limitations Twitch puts on streams. Considering how much you need to swing the camera around just to get them to show, I wouldn't worry about it. Even high-profile streamers with dedicated encoding hardware and raised stream limits have artifacts every now and then. I'm gonna level with ya though: the motion artifacts really aren't distracting to me as a viewer. The rapid swinging of the camera is more distracting than the artifacts. Your stream quality looks pretty solid to me.
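      If you ever want to experiment with those me/subme overrides outside your streaming software, you can re-encode a recorded clip and compare the motion artifacts side by side. A rough sketch driving ffmpeg from Python (filenames and bitrate are placeholders, and it assumes ffmpeg is on your PATH):

      ```python
      import subprocess

      # Re-encode a test clip with the fast preset but stronger motion
      # estimation. The fast preset defaults to me=hex, subme=6; umh/8 is
      # a step up in motion search quality at some CPU cost.
      subprocess.run([
          "ffmpeg", "-i", "gameplay.mkv",          # placeholder input clip
          "-c:v", "libx264", "-preset", "fast",
          "-x264-params", "me=umh:subme=8",
          "-b:v", "6000k", "-maxrate", "6000k", "-bufsize", "12000k",
          "-c:a", "copy",
          "gameplay_umh.mkv",                      # placeholder output
      ], check=True)
      ```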
  22. 970 SLI. You still see some newer games with SLI compatibility, and if a game runs in DX11, there's a chance you can tweak the drivers to get it working manually. Also, if you play older games or plan to revisit older games, 970 SLI should hold up better than the 1060 3GB.
  23. My friends and I still play on occasion; our off times are mostly thanks to conflicting schedules. The battles are still satisfying, especially during peak hours, but the game has gotten to the point where it would really benefit from consolidating the servers. Last I checked, Connery's population is quite small compared to Emerald's, and since the game is about large-scale territory wars, I don't understand why they don't just do a server merge.
  24. I vaguely remember them saying that once the PC version caught up in content, all future releases would be simultaneous between console and PC. That said, the PC and console versions were on the same patch a few weeks before Iceborne dropped, and since the foundation for the PC version was established, I don't understand why they didn't release Iceborne on PC alongside the console version.
  25. It might be earlier. MHW on PC was originally slated for September and ended up dropping in August. The release date for Iceborne on Steam says "expected", so it's not set in stone yet.