FeIIex

Member
  • Posts

    180
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Contact Methods

  • Discord
    Fellex#4210
  • Steam
    FeIIex

Profile Information

  • Gender
    Male
  • Location
    New Brunswick
  • Interests
    Computer hardware, Casual & Retro gaming, History
  • Biography
    Pizza freak
  • Occupation
    Tech Supervisor

System

  • CPU
    Ryzen 7 5800x3D
  • Motherboard
    MSI B550 Tomahawk
  • RAM
    2X32GB Patriot Viper 4 Blackout
  • GPU
    MSI RTX 4090 Gaming Trio
  • Case
    Lian Li 011 Air Mini
  • Storage
    Sabrent Rocket 4.0 2 TB
    ADATA SX8200 Pro 2 TB
  • PSU
    Fractal Design Ion+ 860P
  • Display(s)
    Dell AW3821DW 3840x1600
    Acer ET322QK 3840x2160
  • Cooling
    Noctua NH-D15S
    3x Noctua NF-F12
    2x Noctua NF-A14

  • Keyboard
    Leopold FC750RBT w/ Cherry Blacks
  • Mouse
    Logitech G502 HERO
  • Sound
    Sennheiser HD 660 S Headphones
    Creative Labs Sound Blaster Z
    Blue Yeti
  • Operating System
    Windows 10 Pro
  • Laptop
    MSI Katana GF66 11UE
    i5-11400H 3060 85W
    32GB 3200MHz
  • Other
    Steam Deck 512GB


  1. Nothing like Windows 11 crashing because of a second display, good times. It worked fine until today, and rolling back the GPU drivers I'd updated a week ago didn't help. All crashing stopped immediately once I turned my second monitor off, and returned when I turned it back on!

  2. Built in August 2014. The GPU from the first build was upgraded (280X to 1070) in December 2016, since the old card couldn't run Fallout 4 at settings I found acceptable. Aside from adding an SSD in 2017, my first build lasted until December 2020. It's now used as my parents' desktop, though without the 1070.
  3. I would go with a monitor upgrade first, then better hardware down the road. The 10700K is still a very good CPU. Choosing 1440p would also mean less of a CPU bottleneck, so if it's affordable, that'd be the best choice. You could also keep the monitor for when you get the itch to upgrade again.
  4. Mostly comes down to weird quirks I had with AMD.

     Minecraft with shaders had bad frametimes: every couple of seconds my FPS dropped to sub-10 for a second, then back up. The Nvidia GPU at the exact same settings ran at a constant 144 FPS (16 view distance, Complementary shaders with shadows set to 16, versions 1.17 and 1.18). Even the 1080 Ti I used to have didn't have these issues.

     7 Days to Die runs like crap regardless of hardware, but in Alpha 19 the AMD GPU would drop to sub-20 FPS while moving either my character or the camera, even at 1080p. Coil whine was abnormally loud only in this instance as well; turning shadows off had the game run as expected. That was fixed in Alpha 20, but the issue was never present in Alpha 19 with the Nvidia card.

     Left 4 Dead 2 should be CPU-bound at this point, running at hundreds of frames per second on any modern system. The AMD card would drop to 50 FPS during hordes (90 with Vulkan enabled), where the Nvidia card only drops to 295 in the same situations.

     When the AMD GPU worked as intended, I loved it: Borderlands 3, Metro Exodus Enhanced and heavily modded Skyrim ran like a dream. Unfortunately, for most of what I played it couldn't deliver consistent framerates. All that said, the issues I had (minus 7DTD, since Alpha 20 was out by then) were completely absent when I demoed Manjaro Linux for two months, leading me to believe AMD's Windows drivers are to blame. I had to DDU AMD's drivers a few times due to failed updates, or Windows breaking the Adrenalin software.

     GPUs in question are the 6900 XT and 3080 Ti, paired with a 5900X at 3840x1600. Had a chance to side-grade for a few hundred bucks and took it.
  5. The lights in my room start flickering when I'm playing demanding games; the solution was to undervolt my 3080 Ti. Looking forward to when I upgrade both my GPU and my power outlet!!
  6. I am also a Linux newbie, and recently jumped ship from Windows 10 to Linux for the same reasons. My distro of choice was Manjaro Linux, which is Arch-based rather than Debian-based (the family that Ubuntu and Mint branched off from). My main use cases are web browsing and gaming, and other than the initial setup, I haven't had many issues. My GPU is an AMD card, though. Using Proton and tweaks from ProtonDB, I have been able to get everything I want running well enough, except Metro Exodus Enhanced; the standard version works without any issues. My laptop is currently running Linux Mint, with an aging Nvidia 680M. That experience was not nearly as great, but I can't determine whether it's due to my lack of knowledge, the Nvidia card, or my distros of choice. (Mostly issues with non-native games not running at all, even through Proton with the tricks from ProtonDB applied; I tried both proprietary and open-source drivers.) If you have a spare old HDD/SSD lying around, you can install distros on it until you find the one you like best, without committing right away. I've also heard about creating a separate partition for the /home directory to make distro hopping easier, but I haven't attempted it myself.
  7. I've used audit mode to create base images for our WDS a bunch of times, but I was unaware the system could remember what I had put on there after the image is created (I never went through the installation process afterwards). Good to know we have a reasonable solution for the future, thanks for sharing!
  8. As someone who refurbishes PCs, it is difficult enough that nearly all of our donated PCs don't meet the minimum specifications for Windows 11... But now, once we're forced to start using it in 2025, we need to have our clients set up Microsoft accounts? ;( The PCs we send out go to people who are less technically inclined, and to children, so not having security updates is a no-go.
  9. I replaced a not-even-two-year-old Corsair K95 RGB Platinum with a Model M that I found at work. It was free, but it's a new-to-me board. Probably sticking with this thing until either it bites the dust, or I do.
  10. I'd remove them on my personal machine, mainly to avoid dealing with those tiny cables... (Big hands, big problems.) I also don't use front I/O much personally, but I do know people who do.
  11. Looking good! Whenever GPU prices return to normal, this machine will be able to handle a 3080+ with no sweat.
  12. You can mix the two sticks together, though the 8GB stick will also run at 2400MHz; mixed sticks run at the speed of the slower one.
  13. I know plenty of people here in NB, Canada who much prefer a manual over an automatic, but I am not one of those people. I can completely understand why people love manuals, though.
  14. I do believe that is coil whine; my 6900 XT (reference card made by ASRock) makes a similar sound in super specific situations, in my case 7 Days to Die with shadows on. Any coil whine is drowned out by the fans in everything else. But like you, I don't have a ton of experience with coil whine, just my single-card experience.
  15. So either my 6900 XT reference card suddenly became as good as a custom loop, or the heat sensors are going... Near 100% utilization in Minecraft with extreme shaders at 3840x1600. No driver updates; it was running as I'd expect last night (hotspot at 84C with the same settings). If this is an issue, how would I even go about fixing it? (Could I just set an extremely aggressive fan curve?)

    [screenshot: GPU temperature readings]

    1. FeIIex

      Forgot to mention, the readings are the same across HWiNFO64, Task Manager and AMD's own software.
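The separate /home partition mentioned in the Linux reply above can be sketched as an /etc/fstab layout. This is only an illustration; the UUIDs below are hypothetical placeholders, and the real values come from running `blkid` on your own drives:

```
# /etc/fstab sketch: root and /home on separate partitions.
# UUIDs are hypothetical placeholders; find real ones with `blkid`.
# <file system>            <mount point>  <type>  <options>  <dump>  <pass>
UUID=ROOT-UUID-HERE        /              ext4    defaults   0       1
UUID=HOME-UUID-HERE        /home          ext4    defaults   0       2
```

With this layout, distro hopping means reinstalling only to the root partition and mounting, but not formatting, the /home partition, so user files and most per-user settings survive the reinstall.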
