
jerubedo

Member
  • Posts

    1,444
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Female
  • Location
    Long Island (New York)
  • Interests
    Gaming, Programming, PC Building, Anime, Comics, Movies.
  • Occupation
    Senior Java Developer

System

  • CPU
    i7-9700K @ 4.8GHz all-core undervolted
  • Motherboard
    Gigabyte Z390 Aorus Pro
  • RAM
    16GB G.Skill Trident Z 3200MHz C14
  • GPU
EVGA GTX 1080 Ti Founders Edition
  • Case
    Corsair Obsidian 750D Airflow Edition
  • Storage
    Samsung 970 Pro 1TB
  • PSU
EVGA Supernova G2 750W
  • Display(s)
    Dell U2415
  • Cooling
    Corsair H150i
  • Keyboard
    Logitech G105
  • Mouse
    Logitech G402
  • Operating System
    Windows 10 Pro
  • PCPartPicker URL

Recent Profile Visitors

1,712 profile views
  1. Well, it really depends on what you're looking for. If you want just the highest possible FPS, period, out of the 6800, then yes, you'll "need" a CPU upgrade. If you want 90-95% of the performance the 6800 has to offer, then no, you really don't need to upgrade. As for whether the 5000 series will run on B350, it's doubtful. Even support on 400-series boards has supposedly been delayed until 2021. I'd check with your board's manufacturer, though; it's possible some of them will provide support for the 5000 series, although I'd think it unlikely.
  2. In ANY way? Yes. Realistically? Not really. The 3600 is more than enough for up to 144Hz in most games. Beyond that, you'll probably want something better. The 5900X would be the one to look at, at a premium of course. Outside of that, any 5000-series chip would be the one to go for, or maybe Rocket Lake. Intel seems to think it can still keep the gaming crown, but I'm doubting it after seeing the 5000-series numbers.
  3. @Falkentyne Okay, here's the data I've gotten.
     At the wall:
     - 579W at a scene running 75 FPS
     - 605W at a scene running 55 FPS
     So the wattage went up, but of course that's total system draw, so which component actually used more power? No way to tell from the wall alone.
     Software readings (AIDA64):
     - 75 FPS scenario: GPU 353W, CPU 110W
     - 55 FPS scenario: GPU 315W, CPU 150W
     So according to software, a 38W drop on the GPU, which sure does point to the CPU being the problem (a 9900K at 5.0GHz, by the way). However, the card's TDP is 350W, so 315W is roughly 90% of its potential. Something still seems off, though: in the 75 FPS scenario the card is getting 0.21 frames per watt, while the 55 FPS scenario is only getting 0.17 frames per watt. Seems odd, although software-based wattage readings could be off. (A worked version of that frames-per-watt math is sketched below the post list.)
  4. Oh, interesting! I'll have a look at power draw too, then! That might shed some more light on this.
  5. @Falkentyne I just re-tested 4K. I think I made a typo last night. At that same scene it's 35, not 45. I corrected that post with an edit just now.
  6. Yes, I've monitored each thread, and the most-loaded thread is at 80%. Also, if it were a CPU limitation, GPU usage would not be 96%. An example of a clear CPU bottleneck is Flight Simulator 2020: there I get only 45% GPU usage and only ~45 FPS, with two maxed-out threads and overall CPU usage of 85%+. (A rough heuristic for telling the two cases apart is sketched below the post list.)
  7. At the scene I was at? I can largely get 60 FPS or over as well, but at certain spots like the one pictured it stays at a constant 51. What average do you get on the benchmark run?
  8. RTX Ultra with DLSS off and Extra Details off actually does the trick. I guess I can try it for a bit.
  9. That would only hold true if the GPU weren't being utilized close to 100%. If a 1080p load puts 96% utilization on the GPU and it still can't maintain 60 FPS, then that GPU, regardless of what it is, is not able to handle the load being asked of it. Another good example is Red Dead Redemption 2. With everything maxed it pulls about 75 FPS at 1080p in the most demanding scenes (the snowy areas at the beginning, for example) at 99% GPU utilization. At 4K those same scenes sit right around 60 FPS. That makes it just fine for 144Hz 1080p or 60Hz 4K. A side note: most people assume Saint Denis is the most demanding area, but it's not. Those snowy areas are the worst-case load.
  10. Well, given what you actually have, you did as well as anyone could. As @brob said, if you want to spend some extra cash to tidy up, buying shorter PSU cables would certainly help, but that comes at a premium. In that tower, the motherboard only mounts in one place. You can just swap the outer panels around: put the panel with the feet on the side, put the side panel on the bottom, and change the orientation of the front panel. That effectively puts the motherboard in the horizontal position like you see in other pictures. I'm not sure how that helps them, though; everything internally would still be mounted just the same.
  11. There are a few considerations here:
      1) Games right now are using 4-6GB in 4K, but that's largely because consoles only have about 6GB of VRAM available to them in games (the other 2GB is reserved for the system and its various processes, since memory is shared on both PS4 and Xbox One). Devs will absolutely eat up any and all resources given to them. PS5 and Series X will both have 16GB of shared memory. Believe me when I say we will see games using close to that! It likely won't be in any kind of optimized fashion, but that brings me to number 2:
      2) Poorly optimized games. Some games will use more than 6GB of VRAM simply from being poorly optimized. "They're just poorly optimized examples and not representative" doesn't hold as an argument because, well, if they're games you want to play, then that's what you're stuck with, unless the devs fix the optimization, which I would not count on.
      3) Textures are constantly improving. 8K textures are just starting to gain some traction (nowhere near mainstream, of course). That's not to say these cards are even capable of 8K gaming, but using 8K textures at 4K (effectively texture down-sampling) still yields a nice uplift in quality, and that will eat up VRAM. (A rough back-of-the-envelope estimate of what an 8K texture costs in VRAM is sketched below the post list.)
  12. Yeah, I've already identified that the two biggest settings are RTX, of course, and Extra Details. Unfortunately, I see a WORLD of difference in reflections between RTX off and on. The extra detail is noticeable as well: at 100% the background objects are super sharp, but at 0% they're blurry as hell, albeit then I get 60 FPS. 50% is a good compromise, but it's still not 60 FPS (only 57), so my mentality is: if it's still not 60, I might as well roll with 51 instead of 57 and get all the goodies.
  13. Oh yeah, other games are fine, but this worries me about upcoming releases. I play a lot of Ubisoft games. Valhalla is next for me after this lol. It will probably be the same. I really hoped the 3090 would power through dev issues and bull crap.
  14. See my previous response: yes, at 4K DLSS brings me back up to 51 FPS. But that's it. On that same scene, of course.
  15. No, it gets worse at both 1440p and 4K. At 1440p it goes down to 47 FPS at the same scene (with 99% usage). And at 4K it goes down to 45 FPS (still with 99% usage). EDIT: 35 FPS at 4K, not 45.
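
A quick note on post 3 above: the frames-per-watt comparison there is easy to reproduce. The sketch below is a minimal illustration using only the GPU figures quoted in that post (the AIDA64 software readings); the frames_per_watt helper is made up for illustration and isn't from any monitoring tool.

```python
# Frames-per-watt check using the GPU readings quoted in post 3.
# The figures are the AIDA64 software readings reported there;
# the helper itself is just illustrative arithmetic.

def frames_per_watt(fps: float, gpu_watts: float) -> float:
    """Return frames delivered per watt of reported GPU power."""
    return fps / gpu_watts

scenarios = {
    "75 FPS scene": {"fps": 75, "gpu_w": 353},
    "55 FPS scene": {"fps": 55, "gpu_w": 315},
}

for name, s in scenarios.items():
    print(f"{name}: {frames_per_watt(s['fps'], s['gpu_w']):.2f} frames per watt")

# Prints roughly 0.21 and 0.17 frames per watt, matching the post:
# the slower scene is also the less efficient one, which is why a pure
# CPU limit doesn't fully explain the drop.
```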
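
Post 6's reasoning (near-100% GPU usage with no pegged CPU thread points at the GPU; low GPU usage with maxed-out threads points at the CPU) can be summed up as a rough rule of thumb. This is only a sketch under assumed thresholds; the function name and the 95% cut-offs are illustrative, not taken from any profiling tool.

```python
# Rule-of-thumb bottleneck classifier for the argument in post 6.
# Thresholds are illustrative assumptions, not measured values.

def likely_bottleneck(gpu_util: float, per_thread_cpu_util: list) -> str:
    busiest_thread = max(per_thread_cpu_util)
    if gpu_util >= 95:
        return "GPU-bound (the GPU is already near its limit)"
    if busiest_thread >= 95:
        return "CPU-bound (at least one thread is pegged)"
    return "unclear (possibly engine limits, I/O, or a frame cap)"

# The two cases described in the posts:
print(likely_bottleneck(96, [80, 70, 65, 60]))    # 96% GPU, busiest thread 80% -> GPU-bound
print(likely_bottleneck(45, [100, 100, 85, 70]))  # Flight Simulator 2020 case -> CPU-bound
```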
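
To put a number on point 3 in post 11, here is a rough back-of-the-envelope estimate of what a single uncompressed 8K texture costs in VRAM. The resolution, the 4 bytes per pixel, and the mip-chain factor are all assumptions for illustration; real games ship block-compressed textures, so actual costs are considerably lower.

```python
# Back-of-the-envelope VRAM cost of one uncompressed 8K texture.
# All inputs are assumptions: 8K UHD resolution, RGBA8 (4 bytes/pixel),
# and roughly one third extra for a full mip chain.

WIDTH, HEIGHT = 7680, 4320     # 8K UHD
BYTES_PER_PIXEL = 4            # uncompressed RGBA8
MIP_CHAIN_FACTOR = 4 / 3       # full mip chain adds about one third

base_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
with_mips = base_bytes * MIP_CHAIN_FACTOR

print(f"Base level:   {base_bytes / 2**20:.0f} MiB")   # ~127 MiB
print(f"With mipmaps: {with_mips / 2**20:.0f} MiB")    # ~169 MiB

# Even a few dozen textures at this size add up to multiple gigabytes,
# which is why higher-resolution texture packs eat VRAM so quickly.
```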