
KaitouX

Member
  • Posts

    1,072
  • Joined

  • Last visited

Reputation Activity

  1. Agree
    KaitouX got a reaction from pedrosenedesi in GPU unstable when undervolted - lost the silicon lottery?   
    My 6700XT is also complete trash for undervolting, anything over -10mV is already unstable in some way. To reduce power consumption I personally found it useful to limit the clock; even something like 2.35GHz already brings the power consumption down by almost 40W, while losing only 5% or so in performance.
  2. Like
    KaitouX got a reaction from DrJankenstein in Power Efficient Machine for dvr-scan   
    At idle Apple silicon is king, but at load it might not be. In some loads both the 13900K and 7950X can be more efficient when optimized for efficiency. It also depends on whether GPUs are usable or not: in some cases an Nvidia GPU could be multiple times faster than the M2 while using a similar amount of power at load, but again, at idle it would be significantly less efficient than the M2, as you would be looking at around 50~80W at idle for a PC with a dGPU compared to about 20W for a Mac Studio.
    An example that may or may not be helpful to you: on Cinebench R23 at stock, the 3990X (280W) seemingly achieves ~58000, the 13900K limited to 88W achieves ~28000, and the 7950X achieves ~30000 while limited to the same 88W. Both are apparently roughly on a similar efficiency level as the M2 Ultra, at least when looking at the CPU only; the entire system might give Apple an advantage, but you could build a 7950X system for much cheaper than a Mac Studio, and that price difference could compensate for the slightly higher power consumption.
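    Out of curiosity, the points-per-watt those numbers imply can be checked with a few lines (scores and power limits are the rough ones quoted above):

```python
# Points-per-watt from the Cinebench R23 figures quoted above
# (scores are approximate, power is the configured limit).
results = {
    "3990X @ 280W": (58000, 280),
    "13900K @ 88W": (28000, 88),
    "7950X @ 88W": (30000, 88),
}

for name, (score, watts) in results.items():
    print(f"{name}: {score / watts:.0f} points/W")
```

    The two 88W chips land at roughly 320~340 points/W versus ~207 for the stock 3990X, which is why they compare so well against the M2 Ultra despite being desktop parts.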
  3. Like
    KaitouX got a reaction from rothbardfreedom in Is this normal behavior for a 13700k? (Thermal throttling)   
    Going by the score I'd guess you set the PL2 to expire after a minute or so, but the Intel spec, I believe, is an indefinite PL2 for K CPUs. Anyway, I would probably lower the PL2 to about 160~180W, as that should keep the temperatures in check while being almost as fast as stock even in all-core workloads. You can also set the PL1 to 180W and the PL2 to 253W, and make it drop to PL1 after a minute; that should keep short all-core loads as fast as stock, while keeping it a bit more controlled in longer ones.
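    To illustrate the PL1/PL2 scheme described above, here's a minimal sketch. Real Intel turbo accounting uses an exponentially weighted window rather than a hard cutoff, so this is a simplification; the 180W/253W/60s values are the ones suggested in the post:

```python
# Simplified PL1/PL2 model: the CPU may draw up to PL2 for the turbo
# time window (tau), then falls back to the sustained PL1 limit.
PL1, PL2, TAU = 180, 253, 60  # watts, watts, seconds

def active_power_limit(seconds_under_load: float) -> int:
    """Active package power limit after a given time at full load."""
    return PL2 if seconds_under_load < TAU else PL1

print(active_power_limit(10))   # short burst: full 253W turbo
print(active_power_limit(120))  # sustained load: drops to 180W
```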
  4. Like
    KaitouX reacted to SlimeCore_ in I5-13600KF - Liquid Freezer II 360 concerns.   
    I have reseated the cooler one more time, and... Cinebench/Aida64 are now maxing out at 80C, which is a lot better!
    One of the standoffs the cooler mounts onto was slightly loose, which probably caused a tiny gap.

    Cyberpunk still runs at 65-80 though; to be more precise, roughly half of the cores are at 65 while the other half is at 75-80.
    Another game I played was "Squad", a milsim; it maxes out at 60-65 both before and after I reseated the cooler.
    Games basically haven't changed.

    Edit: I've found out something interesting worth noting: the pump and fan might both be controlled by the CPU_FAN header on the mainboard, but the pump already maxes out its speed at 40% PWM. So I can still have a fan curve to reduce noise when I'm not doing anything, it just has to begin at 40% to keep my pump at 100%.
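    The fan-curve idea in that edit can be sketched like this: keep 40% PWM as the floor so the pump on the shared header stays at full speed. The temperature breakpoints here are made up for illustration, not measured values:

```python
# Fan curve with a 40% PWM floor, since the pump on the shared CPU_FAN
# header already reaches full speed at 40% PWM. Temperature breakpoints
# are example values.
def pwm_for_temp(temp_c: float) -> float:
    floor, ceiling = 40.0, 100.0  # 40% keeps the pump at max speed
    t_low, t_high = 40.0, 80.0    # illustrative ramp range in C
    if temp_c <= t_low:
        return floor
    if temp_c >= t_high:
        return ceiling
    # Linear ramp between floor and ceiling
    return floor + (ceiling - floor) * (temp_c - t_low) / (t_high - t_low)

print(pwm_for_temp(30))  # idle: stays at the 40% floor
print(pwm_for_temp(60))  # mid-load: ramps up
print(pwm_for_temp(90))  # hot: 100%
```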
  5. Like
    KaitouX got a reaction from SorryBella in Im upgrading Finally from Z270, need a bit of insight/advise.   
    12700K doesn't make sense unless you can get it for significantly cheaper than the 13600K; the 13600K is faster in almost every way.
     
    Depends on what you expect from the performance jump. For gaming there's probably a small but noticeable performance increase in some games; some will stay the same, while some may have big improvements, in particular to the lows, and if you upgrade the GPU the difference would be bigger. For other tasks the 12600K and 13600K are going to be significantly faster: they're 70%+ faster per core, while including 2 extra cores, HT, and E-cores, making for a 3x (12600K) or 4x (13600K) improvement over an OC'd 7600K in heavy MT loads.
     
  6. Agree
    KaitouX got a reaction from Slottr in New Build for a friend [900-1250 euro] 1080P 60FPS+   
    That's even worse, a 500+ EUR GPU that performs only 10% better than one that costs 360.
    The 4060 is also bad: the 3060Ti is 10% faster for 20 euros more, and the 6700XT is a bit over 10% faster than the 4060 with 4GB more VRAM for 40 euros more. The 6650XT is also a better option than the 4060, being less than 10% slower while costing 80 euros less.
  7. Like
    KaitouX got a reaction from SorryBella in 1500-2000€ Productivity focused new build   
    The 7900X can also be limited to the same PPT as the 7900. Going for the cheaper one is the right choice (if the stock cooler isn't going to be used); they are pretty much the same CPU.
     
    The 13700 is faster than the 7700X even if you set the power limit to 88W.
    Also, the 13700 at 88W is about equal to the 7900 at 65W, but the 7900 pulls ahead at 88W, with the 13700 needing ~125W to get similar performance in heavy all-core workloads.
  8. Like
    KaitouX got a reaction from Why_Me in 1500-2000€ Productivity focused new build   
  9. Informative
    KaitouX got a reaction from trivialagate02 in Do clock speeds vary?   
    You probably shouldn't care about the clock speed in the specs, in particular for the lower-end ones; most GPUs are going to go past it regardless, and on higher-end models it usually just means it's factory overclocked.
    As an example, here are the listed boost clocks of three 4070 models:
    4070 Ventus 3X: 2475 MHz
    4070 FE: 2475 MHz
    4070 Gaming X Trio: 2610 MHz

  10. Agree
    KaitouX got a reaction from starsmine in Best RX 6800 XT To Buy?   
    Sapphire and Powercolor are like walmart brands??
    3080Ti for the same price or cheaper than the 6800XT???
    Using userbenchmark unironically????
     
    Sapphire and Powercolor often have some of the best coolers on AMD GPUs. Even if they have their duds like every AIB, most of their coolers are well built.
    The cheapest 3080Ti I see on PCPartPicker is $1030, which is a completely trash price for a GPU that is only 10% faster than the 6800XT, which goes for $490. Even the 4080, which is horrible value, is ridiculously better than the 3080Ti, as it's "only" $100 more but over 20% faster when looking at proper benchmarks done by Hardware Unboxed/Techspot, TechPowerUp, Gamers Nexus, and other reviewers.
     
     
    Now answering the topic question: just go for the cheapest one; for most people that's the best option. If there are multiple models for the same price, try looking for reviews that compare them. Better models are only really worthwhile for people who really want a quieter GPU and don't mind paying significantly more for it.
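    For what it's worth, the value argument above can be put into numbers (prices and relative-performance figures are the rough ones from the post, with the 6800XT as the 1.00 baseline):

```python
# Rough performance-per-dollar using the prices and relative
# performance quoted above (6800XT as the 1.00 baseline).
cards = {
    "6800XT": (490, 1.00),
    "3080Ti": (1030, 1.10),
    "4080": (1130, 1.20),
}

for name, (price, rel_perf) in cards.items():
    print(f"{name}: {rel_perf / price * 1000:.2f} relative perf per $1000")
```

    The 6800XT comes out at roughly twice the perf-per-dollar of either of the others, which is the whole point.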
  11. Agree
    KaitouX got a reaction from IkeaGnome in Best RX 6800 XT To Buy?   
  12. Agree
    KaitouX got a reaction from PC Player in Best RX 6800 XT To Buy?   
  13. Like
    KaitouX got a reaction from thejackalope in Why does everyone hate the 4080?   
    Considering that it's 70% more expensive than the 2-year-old predecessor, and only 50% faster, it's pretty shit.
    Any new GPU that is worse value than the old ones is shit to me.
    The current market isn't what it was 1 year ago. GPUs are mostly below their original MSRP, and in AMD's case well below it, with the 6900XT going for as low as $650 while being pretty close to the 3090 gaming-performance-wise. The 4080, together with the 4090, is among the worst value GPUs you can currently get, only slightly better than some of the cash grabs Nvidia launched during the GPU shortage.
     
    I didn't look that much, but I have yet to see any shop where you would have a hard time getting one. I've seen claims that physical shops are full of 4080s with no interest from consumers, including a few photos/videos from Microcenter that show loads of GPUs (4080s specifically) with no one picking them up.
  14. Informative
    KaitouX got a reaction from Senzelian in Will a 7900 (65W) and a 7600 (65W) output the same amount of heat?   
    They both use the same amount of power at stock; I believe it's an 88W PPT for all 65W Ryzen 7000 parts.

  15. Agree
    KaitouX got a reaction from danwat1234 in Apple fans, start typing your angry comments now…   
    The efficiency numbers are good, but it would be nice to know how they compare against a 13900K/7950X and 4090 that are at a better place on their efficiency curve. For example, the 13900K achieves roughly 85% of stock performance when limited to 142W, while the 7950X is at ~95% of stock while limited to 142W.
    Considering that the 3090 could also be incredibly efficient when undervolted or power limited, I'd guess the same is true for the 4090, but I don't know whether someone has tested it.
    I'm aware most people don't power limit their systems, but it could be an interesting test to see regardless.
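    As a rough sanity check of how much those 142W caps improve perf-per-watt, assuming stock draws of ~253W (13900K PL2) and ~230W (7950X PPT) — those stock figures are my assumptions, not numbers from the post:

```python
# Perf-per-watt gain from power limiting, using the performance
# fractions quoted above. Stock power figures (253W / 230W) are
# assumed, not taken from the post.
def efficiency_gain(perf_fraction: float, limited_w: float, stock_w: float) -> float:
    """Ratio of power-limited perf/W to stock perf/W."""
    return perf_fraction / (limited_w / stock_w)

print(f"13900K @ 142W: {efficiency_gain(0.85, 142, 253):.2f}x perf/W vs stock")
print(f"7950X @ 142W: {efficiency_gain(0.95, 142, 230):.2f}x perf/W vs stock")
```

    Both land around 1.5x stock efficiency, so comparing Apple's numbers only against stock-configured PCs understates what the x86 parts can do.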
  16. Informative
    KaitouX got a reaction from thebeastking1012 in Hyte Y60 and Dark Rock 4 Pro. Will it Fit?   
    Apparently it fits: https://pcpartpicker.com/b/hNNPxr
  17. Informative
    KaitouX reacted to igormp in Apple fans, start typing your angry comments now…   
    While using stuff like llama.cpp is nice, keep in mind that no training is done on those M2s, so they're only relevant for running your fancy chatbot, not for developing new models or fine-tuning them; you're still stuck with an Nvidia GPU.
    Those are not needed at all for running multiple GPUs, and irrelevant when you're at only 4~8 GPUs, so not a deal breaker at all.
    I don't find your claim that useful, because with a pair of 3090s you have way more performance than the M2 Ultra at a fraction of the cost, while also being able to fine-tune those models. Quantization optimizations also apply to the 3090, with Ampere supporting both int4 and int8 acceleration (and Ada even having FP8 support), and thus getting both lower memory usage and FASTER inference (while on M1/M2 you do get the lower memory usage, but no improved inference times; sometimes those even get worse).
     
    Also, the M2 is irrelevant for training those models since you can't properly cluster those.
    Hi, I did:
    Some extra numbers using resnet with fp16 and a batch_size of 128:
    250W - 1074.36 imgs/sec
    275W - 1125.18 imgs/sec
    300W - 1151.24 imgs/sec
    350W - 1241.81 imgs/sec
    370W - 1233.22 imgs/sec (GPU's stock PL)
    390W - 1254.26 imgs/sec (vBIOS max limit)
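    Those numbers make the efficiency trend easy to see once converted to images/sec per watt (same data as listed above):

```python
# Images/sec per watt from the resnet fp16 runs listed above.
runs = [(250, 1074.36), (275, 1125.18), (300, 1151.24),
        (350, 1241.81), (370, 1233.22), (390, 1254.26)]

for watts, imgs_per_sec in runs:
    print(f"{watts}W: {imgs_per_sec / watts:.2f} imgs/sec per watt")
```

    Efficiency falls from ~4.3 imgs/sec/W at 250W to ~3.2 at 390W; the last 140W buys under 17% more throughput.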
     
    Other folks have also done such tests:

    https://www.pugetsystems.com/labs/hpc/Quad-RTX3090-GPU-Wattage-Limited-MaxQ-TensorFlow-Performance-1974/
     

    https://timdettmers.com/2023/01/30/which-gpu-for-deep-learning/#Power_Limiting_An_Elegant_Solution_to_Solve_the_Power_Problem
     
    IIRC, someone also did those tests for the 4090; it was some tech youtuber from what I recall.
     
    That's not workstation hardware, and it has less than 1/8 of the memory bandwidth of the M2 Ultra.
     
  18. Like
    KaitouX got a reaction from Paul17 in RTX 4060 debate, potentially not terrible? *Nope, it is.   
    6600XT and 6500XT? The 6500XT in particular has pretty much the specs you would expect from a 1030/RX550-class GPU, a downgrade from the already subpar 5500XT in pretty much everything, including performance. The 6600XT was also a clear downgrade spec-wise from the 5700(XT), which had a similar MSRP, while also being a downgrade in many ways compared to the much cheaper 5600XT.
  19. Informative
    KaitouX got a reaction from RevGAM in RTX 4060 debate, potentially not terrible? *Nope, it is.   
    My 3060Ti was ~8% slower than stock when power limited to 160W; undervolted, it was slightly faster than stock using about 160W, but that is naturally silicon lottery.
    My guess is that the 4060Ti is ridiculously overclocked out of the box so it doesn't lose to the 3060Ti and Nvidia can somehow "justify" the price and name. I wouldn't be surprised if you were able to drop the 4060Ti to 100~120W while keeping most of the performance.
  20. Agree
    KaitouX got a reaction from LAwLz in AMD accused of (and implicitly admits) preventing sponsored Games from supporting DLSS   
    The issue isn't that FSR/DLSS/XeSS is the only option in some games; the issue is that AMD is supposedly forcing DLSS and XeSS out of the games they're sponsoring.
    Nvidia was clear when they said they don't block alternative technologies in games they sponsor, and that can easily be seen from the number of Nvidia-sponsored games that support FSR and XeSS. The issue on AMD's side isn't limited to DLSS either: from a quick check I did, 66% of the games Nvidia has sponsored since FSR was released include FSR and 33% include XeSS, while 33% of the games sponsored by AMD include DLSS and only 15% include XeSS.
  21. Informative
    KaitouX got a reaction from IkeaGnome in RX 6700 or RX 7600?   
    I did a quick test on "power limiting" my 6700XT with Unigine Superposition 4K Optimized. Like @RONOTHAN## said, the power limit slider is restricted, in my case to -6%, so I downclocked the GPU rather than power limiting it; the result seems to be the same, as the GPU draws less power.
    I already had it downclocked before, because my 6700XT is complete trash at undervolting, where even a -10 offset makes it unstable. So, to lower the noise of the GPU, I messed around with alternatives and ended up finding downclocking the best/easiest option.
     
    The results are the following; the wattage is the number stated by the driver and HWiNFO (TGP), not the actual power draw (TBP):
    Stock (~2.5GHz max) 186W - 9337 points - FPS min 51.5 avg 69.8 max 85.9
    2.4GHz max clock 155W - 8917 points - FPS min 47.3 avg 66.7 max 82.4
    2.3GHz max clock 140W - 8614 points - FPS min 46.0 avg 64.4 max 79.4
    2.2GHz max clock 126W - 8316 points - FPS min 45.3 avg 62.2 max 76.3
    I forgot to get the power average of the 2.1GHz run, so I skipped it.
    2GHz max clock 108W - 7695 points - FPS min 41.7 avg 57.5 max 70.3
     
    The results might not be representative of all games, but at least in this benchmark (and the games I've tested before), downclocking can have pretty good results.
    My experience with downclocking has also been pretty good; I don't really notice any significant performance drop with it limited to ~2.35GHz. I believe my tests at the time I set it up showed about a 5% performance loss while using ~40W less, that being the average across 3 games + this benchmark.
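    Converting the runs above to points per watt makes the trade-off explicit (same TGP and score figures as listed):

```python
# Points per watt from the Superposition runs above (TGP, not TBP).
runs = {
    "stock ~2.5GHz": (186, 9337),
    "2.4GHz cap": (155, 8917),
    "2.3GHz cap": (140, 8614),
    "2.2GHz cap": (126, 8316),
    "2.0GHz cap": (108, 7695),
}

for name, (watts, points) in runs.items():
    print(f"{name}: {points / watts:.1f} points/W")
```

    Every step down the clock ladder improves efficiency, from ~50 points/W at stock to ~71 points/W at 2GHz, while the score only drops ~18%.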
  22. Like
    KaitouX got a reaction from RONOTHAN## in RX 6700 or RX 7600?   
  23. Like
    KaitouX got a reaction from will0hlep in Will there be more AMD 7000 series GPUs in 2023?   
    You listed them from most to least expensive, not by performance. The 7900XTX is ~3% faster than the 4080, and the 7900XT is 5%~10% faster than the 4070Ti. The 6800XT is about the same performance as the 4070, with the 3070Ti being around 15% slower, roughly matching the 6800 non-XT.
  24. Agree
    KaitouX got a reaction from Spotty in AMD accused of (and implicitly admits) preventing sponsored Games from supporting DLSS   
    The same as Nvidia? They could've just done the same as Nvidia and said:
    AMD's comment ignored the actual question, and they have yet to publish anything actually denying that they are blocking DLSS and XeSS implementation in those games.
     
    GN asked AMD about the Starfield deal; the following was the exchange according to them:
    and AMD responded:
    Again, they could have said something generic like "AMD does not and will not block or limit the implementation of alternative upscaling technologies", or, if they can't mention the Bethesda contract, they could just say "We have no comment at this time about the contract with Bethesda, but we have never blocked alternative upscaling technologies", and it would have been all fine.
  25. Agree
    KaitouX got a reaction from LAwLz in AMD accused of (and implicitly admits) preventing sponsored Games from supporting DLSS   