WoofyDaTechHound

Member
  • Posts

    116
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Contact Methods

  • Discord
    ShadoWoof
  • Steam
    Sgt_ShadoWolf

Profile Information

  • Gender
    Female

System

  • PSU
    Seasonic Focus Plus 850W 80+ Gold
  • Mouse
    Razer Mamba Tournament
  • Sound
    Sennheiser GSP 350
  • Operating System
    Windows 10

Recent Profile Visitors

933 profile views

WoofyDaTechHound's Achievements

  1. Thing is, I'm wanting to get a card soon so I don't lose the battle pass, since I don't even really want to play the game because of how bad it is. I'm hitting 100% VRAM usage right now, and 5.5 GB used is starting to show bottlenecks in other games too.
  2. $70 for just the game, $100 for all the bonus stuff, which seems worth it: it includes the battle pass, 50 levels, and gun stuff. At 1080p on the lowest of all settings, I still can't actually play PvP because of such low frames. Also, the 1060 is still the most used GPU according to the Steam Hardware Survey for September 2022, at around 7% of all users.
  3. Right now I have a 1660 Super and it's starting to get bottlenecked by its 6 GB of VRAM. MW2 multiplayer is actually unplayable, and the campaign looks like a textureless Source engine game. I definitely want to play before the end of the first season so I can enjoy myself, use the $100 I just spent on the damn thing, and not miss out on new content. I've been searching and have kinda settled on the 3060, but with the 40 series knocking at our wallets, is it worth waiting until we get a release date for the 4060? Also, comparing the 12 GB 3060 against an 8 GB 4060, is 8 GB on the edge of bottlenecking, with some titles requiring so much VRAM? I'm also planning to use VR, VTube models, and streaming. I see the new AI frame generation would create frames using AI, taking stress off the GPU, but since this GPU has yet to be released, there's no testing of the new feature to know if it's worth dropping VRAM size for.
  4. Same, but when plugged into what I thought was meant to work, it just never got recognized.
  5. X470 Strix Gaming-F. When I went to plug in my Corsair controllers, I found that the needed headers were not on the board, but I could buy adapters to fit.
  6. The 6800 XT performs within a few FPS of a 3080 Ti at $200 less, so it's a much better option for the price, but I prefer Nvidia. As for Intel, I need to look much, much more into the hybrid core design they're using for better performance.
  7. X470; it's been so long since I last looked at my mobo. However, I need a new board anyway because the one I have lacks all the connectors for fan controllers, RGB controllers, and a few other hookups.
  8. Right now running 16 GB of 3200 MHz RAM on an X470, so a dead chipset. With 6 GB of VRAM, that gets maxed pretty quickly even at moderate settings, and at some high settings it struggles or nearly caps out; the system says there's only about 5 GB of usable VRAM, though. The small cache on the CPU also seems to get maxed easily on newer titles.
  9. Right now I've got a 2700X with 16 GB of 3200 MHz RAM, but along with my GPU (a 1660 Super), the amount of VRAM and L2/L3 cache that gets used in some games makes them almost unplayable. Even Deep Rock Galactic and Warzone struggle because they max out my RAM.
  10. So I've been looking through all the new tech, and I knew DDR5 had become a thing. From my initial research and talking to others, it seems DDR5 right now isn't worth the money for the performance, since it hasn't been optimized enough yet to fully unlock its potential around the 6000 MHz speeds. This seems like another "early adopter" thing: people invest now and wait months or years for the full potential to be unlocked, while getting very similar performance to DDR4 today at a slightly higher price. My dilemma is whether to go Intel and just swap the board and RAM once DDR5 is no longer an early-adopter thing, or to go with a 5000-series Ryzen and stay on DDR4. My current takeaway is to go DDR4 for now and buy into the new tech once the efficiency issues are ironed out.
  11. Everything I'm seeing says otherwise: you record and stream to OBS with it. It plugs from the GPU into the card and then to the monitor, so it just captures the game's output, sends it to OBS, and then to your screen. It seems to pull a lot of stress off the CPU with the onboard encoder.
  12. Why wouldn't it, when it has an H.264 encoder onboard? Why would the encoder be there otherwise?
  13. Yeah, that's what I've selected, as I won't touch 9th gen. But I'm wondering if I should get an HD60 Pro to do all my encoding. I'm not a fan of AMD, as their options just don't compete with Intel in terms of performance, though they do have a lower price point on a less optimized architecture.
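
Several of the posts above hinge on watching VRAM usage (hitting 100%, ~5.5 GB used of 6 GB). A minimal sketch of one way to check this on an Nvidia card, assuming `nvidia-smi` is on the PATH; the helper and function names here are illustrative, not from any of the posts:

```python
import subprocess

def parse_vram_csv(line: str) -> float:
    """Parse one 'used, total' line from nvidia-smi CSV output
    (--format=csv,noheader,nounits) and return percent used."""
    used, total = (float(x) for x in line.split(","))
    return 100.0 * used / total

def vram_percent_used(gpu_index: int = 0) -> float:
    """Query nvidia-smi for current VRAM usage on one GPU."""
    out = subprocess.run(
        ["nvidia-smi",
         f"--id={gpu_index}",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return parse_vram_csv(out)

# Example: a 6 GB card with 5.5 GB in use, roughly the
# situation described in posts 1 and 8 above:
# parse_vram_csv("5632, 6144") is about 91.7% used.
```

Polling this in a loop while gaming is a simple way to confirm whether stutter lines up with VRAM filling, before deciding a card upgrade is the fix.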