
WereCat

Member
  • Posts

    17,135
  • Joined

  • Last visited

Reputation Activity

  1. Agree
    WereCat reacted to Alex Atkin UK in Asus TUF 4090 questionable cooling choice?   
    Indeed, and given that the FE card runs its RAM hottest and is clearly still within spec, anything cooler is well within acceptable limits.

    I wouldn't be surprised if the VRAM is rated up to 105C so around 80s is no big deal.
  2. Funny
  3. Informative
    WereCat reacted to Mr.Stork in FFMPEG vs Handbrake giving me different results   
    The QP 21 is in a comment; the actual code sets it to 30 at the top. Check the 7th line.
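    This is just standard script behavior: a value on a commented-out line never takes effect, only the active assignment does. A tiny sketch of the situation (hypothetical variable and file names, not the poster's actual script):

    ```shell
    # Hypothetical sketch: the commented-out value is ignored by the shell;
    # only the active assignment reaches the encoder flag.
    # QP=21        <- comment only, has no effect
    QP=30          # <- the value the encoder actually receives
    echo "would run: ffmpeg -i in.mp4 -c:v libx264 -qp $QP out.mp4"
    ```

    The echoed command shows `-qp 30`, matching what the encoder would actually be given, regardless of what the comment says.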
  4. Informative
    WereCat got a reaction from Rym in Asus TUF 4090 questionable cooling choice?   
    Capacitors and inductors don't need cooling.
    The memory controller is in the GPU die, so it's cooled.
    MOSFETs and VRAM are cooled, as you can see in the pictures you posted. Thermal pads contact them when the cooler is mounted and transfer heat to the cooler.
  5. Funny
    WereCat reacted to BotDamian in Petition to change the Screwdriver price   
    Hello fellow Comrades,
    I would like to create a petition to change the LTT Screwdriver price to 69.69USD.
     
    Thanks for your time.
    (Vote agree or 1v1)
  6. Like
    WereCat reacted to Origami Cactus in 5800X3D help me find benchmark or anyone with an actual experience   
    Yes, physics simulations especially are much faster on the 5800X3D because of the V-Cache. Look at the F1 22 or Assetto Corsa Competizione benchmarks in the 5800X3D reviews to see how much faster they are.
    But that is because of the single-threaded nature of those simulations; I don't know how Blender smoke simulations work.
  7. Like
    WereCat got a reaction from Origami Cactus in 5800X3D help me find benchmark or anyone with an actual experience   
    I've only found this Phoronix review, but I have no idea what these benchmarks are about. I guess many are just machine learning, where the 5800X3D wins by quite a bit, but I have no idea about the rest.
     
    https://www.phoronix.com/review/amd-5800x3d-linux/8
     
    Example where the 5800X3D beats the 5950X because of V-Cache... still no idea what this benchmark is about, though.
     

     
    edit:
    GPAW is a density-functional theory (DFT) Python code based on the projector-augmented wave (PAW) method and the atomic simulation environment (ASE).
     
    Wish I knew how it fares in Blender for these tasks too. Simulations... at least some seem to benefit a lot from cache.
  8. Like
    WereCat reacted to Slave2school in 5800X3D help me find benchmark or anyone with an actual experience   
    I can't find any information online regarding your specific request either. Buy one, test it and post up. If it sucks you can sell it.
  9. Like
    WereCat got a reaction from Slave2school in 5800X3D help me find benchmark or anyone with an actual experience   
    Warzone has massive swings due to CPU overhead in many locations, depending on how many players are there.
    Arma 3 is... well, an Arma game, with massive CPU overhead as always. I often play at 40-50FPS depending on what's going on.
     
    In the games where I have FPS issues, the 5800X3D fares much better than the 5950X.
    In games where I don't have FPS issues, even my 3900X is fine. I know what to expect for gaming.
     
    I want to know... IS the 5800X3D faster than the 5950X for what I do?
    Not all workloads benefit from extra cores/threads. WILL my workload benefit more from the massive cache of the 5800X3D than from the 16-core 5950X?
     
    That too but I haven't played Valheim for quite a while.
  10. Like
    WereCat got a reaction from Origami Cactus in 5800X3D help me find benchmark or anyone with an actual experience   
    Warzone has massive swings due to CPU overhead in many locations, depending on how many players are there.
    Arma 3 is... well, an Arma game, with massive CPU overhead as always. I often play at 40-50FPS depending on what's going on.
     
    In the games where I have FPS issues, the 5800X3D fares much better than the 5950X.
    In games where I don't have FPS issues, even my 3900X is fine. I know what to expect for gaming.
     
    I want to know... IS the 5800X3D faster than the 5950X for what I do?
    Not all workloads benefit from extra cores/threads. WILL my workload benefit more from the massive cache of the 5800X3D than from the 16-core 5950X?
     
    That too but I haven't played Valheim for quite a while.
  11. Like
    WereCat reacted to Slave2school in 5800X3D help me find benchmark or anyone with an actual experience   
    It must be Valheim!
  12. Like
    WereCat got a reaction from Origami Cactus in 5800X3D help me find benchmark or anyone with an actual experience   
    I'm in a kind of a weird spot.
    I have Ryzen 9 3900X and I would love to either upgrade to R9 5950X or R7 5800X3D.
     
    I absolutely want that sweet FPS boost from the 5800X3D, but at the same time I don't want worse performance than I have now for workloads. I work in Blender, and all the CPU benchmarks for Blender I found focus on CPU-based tile rendering, which is useless to me since I use GPU-based rendering.
     
    What I'm interested in is the viewport and physics simulation in Blender, but so far I haven't been able to find any coherent benchmark for this. Some of the smoke simulations I use take around 20GB to 50GB of RAM, so it's a very memory-heavy task. That makes me wonder whether the extra cache on the 5800X3D even matters at all with such a huge data flow, or whether the extra CPU headroom of the 5950X would simply plow through it much faster. But I don't really do stuff like this often, so even if I had worse performance it wouldn't matter that much. However...
     
    I spend most of my time in the viewport adjusting stuff. Sometimes it takes a long time moving things in Edit Mode when working with many verts, sculpting complex geometry, or dealing with hair, and my CPU is pegged at 100%. I would expect the 5800X3D to be significantly better than at least the 3900X for this, but again... I could be wrong. This is also a somewhat memory-demanding task, so the cache could be very helpful, but I don't know if it matters when it's using around 5GB to 10GB of RAM.
     
    Also, one thing I've never had to deal with: single-CCD Ryzens tend to have only half the memory write speed of dual-CCD Ryzens, so I don't know if that would be a big hit to performance as well. I'm not going to buy anything soon, but I'm definitely moving to a higher tier on AM4 eventually instead of going AM5 or LGA1700, once these chips appear on the 2nd-hand market for cheaper. For now I'm just trying to make sense of it.
  13. Agree
    WereCat got a reaction from Arika in A PLAGUE TALE REQUIEM - What's wrong with them?   
    Ultra settings are useless 
  14. Funny
    WereCat got a reaction from Bananasplit_00 in Massive quad-slot cooler for unreleased 900W GeForce RTX 40 series GPU has been pictured   
    It's not a space heater because it will heat your room. It's a space heater because NVIDIA is fighting against the inevitable expansion of the universe. 
  15. Funny
    WereCat got a reaction from DildorTheDecent in Massive quad-slot cooler for unreleased 900W GeForce RTX 40 series GPU has been pictured   
    It's not a space heater because it will heat your room. It's a space heater because NVIDIA is fighting against the inevitable expansion of the universe. 
  16. Agree
  17. Funny
    WereCat reacted to BiG StroOnZ in Massive quad-slot cooler for unreleased 900W GeForce RTX 40 series GPU has been pictured   
    Summary
    In a Chiphell post, it was explained that this cooler was supposed to be used for an RTX 40 series card with a peak power of 900W; since that is of course peak power, the actual TDP could be much lower. The leaker apparently has other photos, but only some were published. In the released pictures you can see a massive four-slot cooler frame and heatsink for a supposedly unreleased Ada Lovelace GPU.
     
    My thoughts
    Now it's obvious this cooler was meant for what was described as "The Beast", or the Ada Titan. But many news outlets have since claimed that card has been cancelled and there will be no Ada Titan. MLID claimed the reason the Ada Titan was cancelled was that it was "melting power supplies, and sometimes melting itself." I'm wondering if this cooler will be repurposed for the 4090 Ti, as the 4090 is not the full-fat AD102 die, so there's still room to release a fully enabled AD102. Utilizing this cooler would definitely keep the temps in check, like the 4090 cooler does; the 4090 is not thermally limited by any means. We will probably have to see what RDNA 3 brings to the table before NVIDIA makes a move on releasing a more powerful card, though. 
     
    Sources
    https://videocardz.com/newz/massive-quad-slot-cooler-for-unreleased-900w-geforce-rtx-40-gpu-has-been-pictured
    https://www.pcgamer.com/look-the-laurel-and-hardy-of-nvidia-graphics-card-coolers/
    https://www.chiphell.com/thread-2452314-1-1.html
  18. Agree
    WereCat got a reaction from Holmes108 in A PLAGUE TALE REQUIEM - What's wrong with them?   
    Ultra settings are useless 
  19. Like
    WereCat reacted to MageTank in Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming   
    Then you have people like me. Buys a 12GB 3080, a 4K 120Hz OLED TV, and a 5950X, pushes an all-core OC of 4.7GHz, only to play 15-year-old MMOs and 20-year-old shooters, lol. 
  20. Agree
    WereCat got a reaction from pas008 in Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming   
    I think we just suddenly hit a point of big diminishing returns. 
     
    You need an RTX 4090 at 1080p/1440p to expose CPU overhead this marginal... at the same time, very few people with a 4090 will play at 1080p, and at 4K the CPU basically doesn't matter (any newish one). 
     
    So there's just a kind of a hole at 1440p where all of this can make sense, depending on what you play. 
  21. Agree
    WereCat got a reaction from MageTank in Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming   
    I think we just suddenly hit a point of big diminishing returns. 
     
    You need an RTX 4090 at 1080p/1440p to expose CPU overhead this marginal... at the same time, very few people with a 4090 will play at 1080p, and at 4K the CPU basically doesn't matter (any newish one). 
     
    So there's just a kind of a hole at 1440p where all of this can make sense, depending on what you play. 
  22. Agree
    WereCat reacted to MageTank in Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming   
    I think we've reached a point in general where the CPU isn't going to matter unless you're trying to push 1080p at 500Hz on that new ASUS panel and don't want to be bottlenecked by the CPU feeding that frame data, lol.
     
    For most people, the compromise is going to be a nice resolution and a nice refresh rate. 1440p 165Hz or 4K 120Hz is going to be fine even on mid-range processors. I'd even go as far as to say that most previous-generation processors from the past 2-3 years will be fine at those resolutions. Even now, we're reaching a point where display technology is starting to bottleneck the power offered by GPUs, and these halo-tier cards are going to be pointless unless new technologies (akin to ray tracing) come out and magically start to tax our hardware again.
     
    Odd to see people splitting hairs over which CPU is faster when both are beyond what 99% of people are going to need.
  23. Funny
    WereCat reacted to Tan3l6 in Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming   
    Nobody plays at 1080p anymore. Except the poor people.
    And nobody plays at ultra except the stupid people.
     
  24. Like
    WereCat got a reaction from Birblover12 in Amazon creating a system to try and stop scalpers from buying up large quantities of high demand products.   
    Jeez, you took this way too personally. 
     
    I'm not a scalper. I know you can't have unlimited stuff and it costs resources and energy. 
    I'm expressing my frustration BECAUSE of scalping, and because we now have to have systems like this in place in the first place. 
     
    Perhaps my analogy was not that great or you took it way too far.
  25. Agree
    WereCat reacted to RONOTHAN## in DDR5 6400MHz C38 vs DDR5 6600MHz C32   
    If you're just gonna enable XMP and call it a day, good luck noticing a performance difference, and good luck getting the 6600 MT/s CL32 kit to enable XMP. 
     
    If you want to do RAM overclocking, 6400 CL38 and 6600 CL32 use the same memory IC (Hynix M-die), and each can do the other's settings. M-die is pretty consistent; even OEM sticks can do 6800MT/s CL34 (given an unlocked PMIC). The only real differences come down to subtimings, max frequency, and tRCD/tRP: subtimings aren't part of the XMP profile, max frequency only ranges between 6800 and 7000MT/s (so both of those kits should clock the same), and 2 ticks on tRCD isn't gonna be noticeable in anything but SuperPi 32M. 
     
    No matter what, you should just get whichever one is cheaper, and that will almost certainly be the 6400MT/s kit. Heck, it's probably not worth going for a kit faster than 5600MT/s CL40, since that's also a Hynix bin if you want to overclock (though it's not guaranteed to be M-die like 6400 CL38 is), and at XMP there isn't gonna be much of a noticeable difference (maybe 5% faster in most realistic tasks, not specific RAM benchmarks).
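    For reference, the headline difference between the two XMP kits in the thread title can be sanity-checked with the standard first-word latency formula (latency in ns = 2000 × CL ÷ transfer rate in MT/s); a quick sketch:

    ```python
    # First-word (CAS) latency of a DDR kit in nanoseconds.
    # One memory clock period is 2000 / transfer_rate_mts ns (DDR transfers
    # twice per clock), multiplied by the CAS latency in cycles.
    def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
        return 2000 * cl / transfer_rate_mts

    # The two kits being compared:
    print(round(cas_latency_ns(6400, 38), 2))  # 6400 CL38 -> 11.88 ns
    print(round(cas_latency_ns(6600, 32), 2))  # 6600 CL32 -> 9.7 ns
    ```

    On paper the CL32 kit has roughly 18% lower CAS latency, which is consistent with the point above that the real-world difference at XMP is small.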