Mister Woof

Member
  • Posts

    9,485
  • Joined

  • Last visited

Everything posted by Mister Woof

  1. It will still do better than a 2700x. I'd say they should consider what their workload is mostly going to be - if it's CPU render then perhaps consider a similarly priced 5900x depending on their motherboard. If it leans more to games, then 5800X3D might be a better choice and is a little more forgiving on motherboards. If it is this: https://www.asus.com/motherboards-components/motherboards/prime/prime-b450m-a/ I would say a 5900X is probably a bit too much for it. If you're spending more time doing render then perhaps consider upgrading the platform.
  2. Didn't quote you, but it's also difficult to gauge CPU usage on individual cores due to SMT, since the OS reports logical threads rather than physical cores, so it can be hard to determine. GPU utilization is easy at a glance.
  3. You can be CPU limited in games without getting even close to 100% utilization. I'd use GPU utilization as a better metric: if your GPU isn't at 96%+ usage while you game, then you probably have a CPU limitation. The 5800X3D will absolutely result in large game performance increases; however, you won't really feel much of that with just a 1080ti. That said, some games that lean heavily on the CPU will still gain in the frametimes.
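The rule of thumb above can be sketched as a tiny check. The 96% figure is just the post's rule of thumb, not a hard spec:

```python
# Rough bottleneck heuristic: if the GPU isn't pegged (~96%+ busy)
# while gaming, the CPU is probably the limiter. The threshold is
# the forum rule of thumb above, not a measured cutoff.

def looks_cpu_limited(gpu_utilization_pct: float, threshold: float = 96.0) -> bool:
    """True when GPU usage is low enough to suggest a CPU limitation."""
    return gpu_utilization_pct < threshold

print(looks_cpu_limited(99.0))  # GPU-bound -> False
print(looks_cpu_limited(70.0))  # likely CPU-bound -> True
```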
  4. only benefit of the 13900k over the 13700k in gaming is factory clock speed and a slight increase in cache. they both have the same number of P cores, and another 8 E cores aren't going to do much. but yeah, if you've got the money for a 4090, why not get the best?
  5. been running mine about 6 months now, never had a thermal throttle issue, even in the standard 10-minute CBR23 test.
  6. unless you're in a mini ITX build or some other thermally constrained scenario, undervolting and its accompanying stability issues/testing isn't worth your time. Most often CPUs won't be pulling full load in typical workloads anyway, and the majority of your power usage is going to be at idle, in which case it's kind of a wash. As for full-load power, the chip gets your job done faster anyway, so that scales somewhat relative to power cost. we're talking about generally less power consumption than a few old-school light bulbs.
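The power-cost point above is easy to put numbers on. This is back-of-envelope math only; the 100 W delta, 2 hours/day of heavy load, and $0.15/kWh electricity price are made-up illustration figures, not measurements:

```python
# Back-of-envelope cost of extra full-load power draw over a year.
# All inputs are assumed figures for illustration.

def yearly_cost_usd(extra_watts: float, hours_per_day: float,
                    price_per_kwh: float = 0.15) -> float:
    """Cost of an extra sustained load over one year of daily use."""
    kwh = extra_watts / 1000 * hours_per_day * 365
    return kwh * price_per_kwh

# e.g. 100 W extra under full load, 2 hours of heavy work per day:
print(round(yearly_cost_usd(100, 2), 2))  # -> 10.95
```

About eleven dollars a year for a fairly heavy case, which is the "kind of a wash" the post is getting at.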
  7. In general, most people use PCIE slots for capture cards, sound cards, or network cards at most. All of those can be solved with USB, and iGPU/GPUs can do a lot of work that makes additional graphics cards pointless. Most people will never populate any of those slots. I do agree that HEDT needs a comeback, in which case more PCIE slots could be a thing, but for the average user, more M.2 slots are just more useful. Nobody wants to hang a bunch of SATA drives anymore either.
  8. USB, iGPU, and increased GPU capabilities mean consumers don't really need more PCIe slots. There are more use cases for storage in the form of M.2 than for PCIe slots. Your issue is niche.
  9. Should be an improvement in the 1% and 0.1% lows. The 1600 is incredibly slow, slower than Haswell. The 6600 is plenty fast for new games, and 8GB is plenty for 1440p in most scenarios. I'd agree that you should have gone with a 5600/5800X3D, though.
  10. Don't get the 5600G. As stated earlier, it has half the cache, plus only PCIe 3.0.
  11. As a longtime consumer overclocker....not worth your time. Run everything at the most reasonable settings for stability and performance, and enhance your computing experience in other ways. Most of the time you're looking at single digit performance differences (especially in modern hardware) and in real world that isn't noticeable enough to be worth your time. Work on noise levels and heat.
  12. userbenchmark sucks but i would absolutely rather play games on an i3-13100F than a threadripper system.
  13. You can be CPU bottlenecked even if your utilization is low. Overall utilization especially is not necessarily representative of a CPU limitation. The operating system views the 9900k as 16 logical processors, and if a game can only really use 4 cores (8 logical threads, if it can even use hyperthreading) then you can be at 50% utilization and still be fully capped out. Some games aren't well multi-threaded, especially online games. At this point in time the single-threaded IPC of Skylake is pretty old. As far as 4.9GHz vs 5GHz... there's no real-world difference in 100MHz. There's barely any difference between 4.7 and 5.0. Ultimately, you're looking at 8-year-old IPC, which will sometimes struggle in modern games. My wife still uses a 9900k, and in a lot of the games she plays, which are mostly simulation games, there are times when they start to chunk along, and it's because of that low IPC.
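The utilization math above works out like this (a trivial sketch; the 9900k exposes 16 logical processors as described in the post):

```python
# Why overall CPU % hides a bottleneck: a 9900k shows up to the OS as
# 16 logical processors. If a game saturates only 8 of them (4 physical
# cores with hyperthreading), the overall graph reads ~50% even though
# the game is fully CPU-capped.

def overall_utilization(busy_threads: int, logical_processors: int) -> float:
    """Overall CPU % when only some logical processors are saturated."""
    return 100 * busy_threads / logical_processors

print(overall_utilization(8, 16))  # -> 50.0, yet fully capped
print(overall_utilization(4, 16))  # -> 25.0, without hyperthreading
```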
  14. Probably because the chip lineup won't be huge. I think @GuiltySpark_ is right, it will be some minor release.
  15. I wouldn't bet on it. 6th gen and 7th gen used 100/200 series boards, and even though 8th and 9th used the same socket they locked them out and needed 300/400 series.
  16. The 4080 pulls less power than its previous-generation counterparts. I run a 3080 and 13700k on an old 850W Seasonic Focus Plus Gold; should be fine.
  17. I have a few PCI WiFi cards for a few systems... oops, misread; yeah, they are PCIe.
  18. It's not that big a difference, sure, but they aren't the same CPU. Especially considering "same CPU" has been a thing in the past... The i9-11900k is the same CPU as the i7-11700k, just slightly higher frequency, which is a BIOS setting. The i7-7700k is the same CPU as the i7-6700k, just slightly higher frequency, which is also a BIOS setting. The 13700k has hardware differences and IPC improvements over the 12900k, in the form of more L2 cache and higher frequency, and IIRC the memory controller is better. The 13700k is a better 12900k for less money.
  19. the 4000 series wasn't very significant, just some APUs if I'm not mistaken
  20. While I wouldn't go so far as to say the 13700k would obliterate the 12900k....they aren't quite the same either. The 13700k is a "better" 12900k. It's a refined process, has higher boost, and more cache.
  21. there'll be times when it isn't, esp. in games that are network-sensitive, no matter what hardware you have
  22. What do you do with your computer? What GPU do you have, and when do you plan to upgrade it? The 5800X3D is a good drop-in upgrade, but honestly, if you aren't on a very high-end GPU and aren't unhappy with your FPS lows, then I'd just wait for a bigger upgrade. Coming from someone who went from a 10900K to a 13700K, which is IMO a much larger jump than 5600 to 5800X3D, I can't really say the difference is big enough to warrant the $800 price tag. Even if it was only $329 like your price tag... it's iffy.
  23. do you live by a microcenter? Their specials can often dictate best value choices outside of normal channels
  24. well
      i5-12600k ~= i5-13400f
      i7-12700k < i5-13600k
      i9-12900k < i7-13700k
      or just get a 7600 for less and get about the same perf. in games