Coachdude

Member
  • Posts: 229
  • Joined
  • Last visited

Reputation Activity

  1. Agree
    Coachdude got a reaction from Beerzerker in Over current after breaking my usb port   
    To add to what the others here have said, it looks like your motherboard isn't quite mounted properly, judging by the angle of the audio jacks at the bottom; they look askew to me, and that's probably what led to the damage to that specific USB port in the first place. So if you feel comfortable remounting it, I'd definitely go ahead and do so...
  2. Like
    Coachdude got a reaction from Boomer_17 in Should I switch my 9 5900X for a 5 7600X/7 7800X3D?   
    Bro, you've got such a good system there already, why waste money on anything right now? You paid good money for your system as is; get your money's worth out of it. I'm still using a 3900X, which is already a good bit slower than your Zen 3 5900X, and it's still getting the job done fine. As long as your PC does what you want it to, why worry about the new stuff? Don't get caught up in the FOMO, IMO.
  3. Agree
    Coachdude got a reaction from MadAnt250 in Should I switch my 9 5900X for a 5 7600X/7 7800X3D?   
    Bro, you've got such a good system there already, why waste money on anything right now? You paid good money for your system as is; get your money's worth out of it. I'm still using a 3900X, which is already a good bit slower than your Zen 3 5900X, and it's still getting the job done fine. As long as your PC does what you want it to, why worry about the new stuff? Don't get caught up in the FOMO, IMO.
  4. Like
    Coachdude got a reaction from Lion925 in What is the worthy upgrade from 9700k and would feel snappy ???   
    Honestly, if you're wanting to stick with DDR4 and aren't planning on moving higher than a 6700XT, I'd just stick with the 9700K for now. Wait for the next releases from both AMD and Intel and see what your options are then. The 9700K is a bit dated, sure, but it's still a decent performer if your main use is just gaming. Personally, I probably wouldn't have gone with this particular i7 in the first place due to the lack of hyperthreading, but it does have 8 cores and can usually clock up to 5 GHz+ or so, and with a modern mid-range card like the 6700XT you should have a perfectly respectable experience in the vast majority of titles out there.
     
    I'd say hold off until you decide to make the leap to a DDR5-based platform. Going from LGA 1151 to AM4 just doesn't make sense to me. It would be faster, sure, but not an insane difference in my opinion. The X3D would be significantly faster in some titles, yes, but you'd be going from one dead platform to another dead platform, and a few years down the line you'd probably be in the same boat you're in now. So again, unless you're willing to move to AM5 or a DDR5 13th/14th gen Intel board, an upgrade strictly for gaming performance just isn't warranted given the money you'd spend to do so, in my opinion.
     
    Just my two cents.
  5. Informative
    Coachdude got a reaction from Joker-_-Glitch in cpu ryzen not boosting   
    I would expect a bit higher than 3.6 GHz in Cinebench honestly, but I do believe R23 and 2024 are a bit heavier than older versions, so perhaps that's just your CPU's limit in those workloads. What kinds of temps are you getting? You can try enabling PBO in the BIOS to see if the higher power limits allow it to boost further, but you're probably going to max out around ~4 GHz or so with default PBO. As a 3900X owner, I can tell you I see around ~4.15 GHz in Cinebench 2024 with PBO enabled, so I wouldn't expect a 3600 to hit much higher than that without manual overclocking. The 3000 series just didn't clock that high, to be honest. But you are hitting at least 3.6 GHz, which is the 3600's base clock, so it is running within spec. So as far as I can tell, everything seems to be working as it should.
     
    Definitely check your temps and try enabling PBO, though; if you want a quick and easy way to get a bit of a boost, that's where I'd start. I wouldn't bother with manual overclocking on Zen 2, to be quite frank. Any gains you might get would be negligible. Zen 2's biggest weakness was the split CCX design, and no realistic amount of clock speed gain is going to offset that. That's where RAM and FCLK tuning comes into play, but that's a whole other thing entirely.
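     
    Side note: if you want a quick way to actually log what clocks (and, where your OS exposes them, temps) you're hitting while Cinebench runs, HWiNFO is the easy option, but a rough Python sketch like the one below also works. This is just a sketch with some assumptions: it needs the third-party psutil package installed, and the "k10temp" sensor name is only a guess for a typical Ryzen-on-Linux setup, so adjust it to whatever your system actually reports.
     
    import time
    import psutil  # third-party: pip install psutil

    for _ in range(30):                                    # sample once a second for ~30 s
        freqs = psutil.cpu_freq(percpu=True) or []         # per-core clocks in MHz, if supported
        if freqs:
            line = f"peak core clock: {max(f.current for f in freqs):.0f} MHz"
        else:
            line = "core clocks not exposed on this platform"
        temps = getattr(psutil, "sensors_temperatures", lambda: {})()  # Linux/FreeBSD only
        if "k10temp" in temps:                             # common AMD Ryzen sensor name (assumption)
            line += f", CPU temp: {temps['k10temp'][0].current:.1f} C"
        print(line)
        time.sleep(1)
     
    Run that in one terminal, kick off the Cinebench run in another, and you'll see roughly how far PBO is actually boosting under load.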
  6. Like
    Coachdude got a reaction from austind903 in CPU upgrade help   
    Zen 2 will definitely bottleneck a 4080 in certain scenarios some of the time. That being said, like always, if it does what you want it to do, as in performing to your expectations FPS-wise and program-wise, I wouldn't mess with it. Choosing the right CPU will depend on whether you lean more towards gaming or more towards work-related tasks: the 5800X3D will be the fastest gaming-wise, but it'll lose ground in work tasks just based on the core count versus the other processors you've listed. I'd use the system as is with the 4080 and see how it performs. If it isn't up to snuff for you, then consider your options and what you think will give you the most value for the tasks you mainly do.
     
    As far as overclocking goes, it won't make much of a difference with Zen 2. Zen 2 still had the split CCX design, and no amount of clock speed you'll realistically hit will make a difference worth noting. Zen 2's weakness is the core-to-core latency that results from the split CCX/cache design. Most of the time it isn't a problem, but you can run into cases where the latency between cores just limits the maximum amount of performance/frame rate you'll be able to achieve.
  7. Agree
    Coachdude got a reaction from Middcore in Bus sim 21 using up 80% of my i7 7gen   
    Not really familiar with the game, but the 7700/7700K are old Kaby Lake (Skylake-based) quad cores, and it isn't unheard of for them to be near maxed out in newer games. That being said, it probably isn't a problem if you're satisfied with your current performance, and it's a decent pairing with the GPU you have.
  8. Like
    Coachdude got a reaction from empresa87 in Is my Ryzen 5 3500x bottlenecking my RTX 2060 Super or do I have another problem?   
    The 3500X is a 6c/6t CPU; from what I've seen, it's fairly typical for it to come close to maxing out in newer, more multithreaded games. Same story for the 9th gen i5s. I'd say overall the 3500X with a 2060S is a decent pairing; I really wouldn't have any complaints about such a system myself. I run a regular 2060 and it runs everything I need it to just fine. If I were you, I'd probably scale back the ray tracing and just enjoy the game with slightly lowered settings, as it still looks good even with settings lowered, in my opinion. If you find yourself needing a bit more CPU grunt in the future, there's always Zen 3, but I really don't think it's warranted with a 2060S. Like I said, there will be some games that are limited by the CPU some of the time, but there will also be plenty of others that aren't and max out the GPU instead. I'd say if you're happy with the performance currently, don't worry about it; just enjoy the system for what it is.
  9. Like
    Coachdude got a reaction from Svetozar N in Need help deciding on an upgrade path!   
    I think the 5800X3D is probably going to be your most logical choice here, assuming you're wanting to keep the same motherboard and RAM. If you do want to switch it out, I'd probably go 13th gen Intel personally, as from what I've seen they're just better all-around processors than most of the 7000 series Ryzens, and you do have the option of going with one of the DDR4-based boards as well. But if you're going to be switching out the board, I'd personally go all the way with DDR5, though that's just me. Regarding PCIe Gen 3, I don't think it'd make much of a difference for a 3080, maybe a few percent here and there, but it wouldn't be drastic. In the end I'd probably just go with the X3D and be done with it. That's assuming this is purely for gaming, as there will be bigger differences between the 13th gen options and the X3D if you're going to be doing work/production tasks with it as well, but that's up to you to decide.
  10. Informative
    Coachdude got a reaction from superfantastic in I recently upgraded my gpu and am wondering if my cpu is holding my gpu back?   
    @superfantastic
    Higher resolutions tend to shift the load onto the GPU more, yes, because it's literally having to render more pixels than at a lower resolution. Regarding graphics settings, turning them to Ultra will usually hit the GPU harder, but some settings can also burden the CPU more, because they increase the number of draw calls and simulations the CPU has to work on; think crowd density in The Witcher 3 or Cyberpunk, for example. It isn't always so cut and dried.
     
    But honestly, I think you're overthinking things a bit. Your hardware is more than capable of pretty much maxing out the great majority of games that currently exist. I always set things a tick or two lower than Ultra, just because the performance trade-off is hardly worth it most of the time, and because my GPU isn't really all that powerful to begin with. But if I'm playing something like Battlefield 5, for instance, I don't really care that the grass looks prettier on Ultra vs. Low, because that isn't going to help me acquire targets any easier. So I put that particular setting to Low because it helps with visibility.
     
    It all depends on what you want out of the game. Is it a single-player, story-focused game where you can afford to trade framerate for better graphics? Then by all means do so. Or is it a fast-paced multiplayer shooter where being able to easily spot enemies at a distance will often win you the engagement? Then lower whatever settings are necessary to achieve that. At the end of the day it's all up to you; there's no right or wrong answer here. It's what makes the flexibility of PC gaming so great, IMO; you have all those options at your disposal. =)
     
    Keep in mind I hardly claim to have all the answers regarding graphical settings and GPU vs. CPU loads. Different game engines perform differently, and there are just so many out there that whatever I say will just end up being anecdotal to my personal experience with the games I have played, and that could be totally different from the ones you play. Personally, I just lower or disable things I don't like, like motion blur or chromatic aberration, or drop settings until I hit my monitor's target refresh rate, which for me is 144 Hz, and call it a day. Most of the time games still look amazing at lower settings anyway, so it isn't really a big concern for me. 
  11. Agree
    Coachdude got a reaction from jaslion in Cpu for unreal engine 4 dev. i5-7600K or i7-3820?   
    So this is just a random layman's opinion, but looking at your flair, it says you're using a 2400G in a living room PC. Honestly, that right there, even being a Zen 1 part, would probably outperform both the 3820 and the 7600K for multithreaded UE4 development. Sandy Bridge-E, whilst good at the time, is starting to age out a bit, and the 7600K is just kind of anemic for anything remotely multithreaded. If it were me, I'd repurpose that 2400G into my main PC and stick the 7600K in the living room PC, as I think the additional threads of the 2400G, even on a slower architecture, would probably come out on top for that kind of work most of the time. 
     
    That being said, I know next to nothing about the specific task at hand, and this is mostly an "educated" guess based on what I know of the performance levels of the CPUs mentioned, so of course you'll have to verify with testing and whatnot. The 7600K will probably have faster single-threaded performance than the 2400G, but not by a whole lot at stock. If you were to overclock it, it would probably end up being around ~15% faster per core than the 2400G, but it still wouldn't close the gap in multithreaded performance because of its lack of hyperthreading. I really don't know which would ultimately end up being faster for UE4 specifically, but I would imagine they would both probably be faster than the 3820. Again, testing is necessary here. 
  12. Agree
    Coachdude reacted to Mister Woof in I5 8400 to i7 8700 worth it for gaming?   
    9400 isn't an upgrade over an 8400 in real-world performance. You won't notice a thing.
     
    8700 would, but as stated, it's old and expensive. Your motherboard limits your options for more powerful CPUs.
     
    I'd just sell the 8400/mobo as a unit and grab a 12400F/B660. Don't waste money on a 5-year-old CPU.
  13. Like
    Coachdude got a reaction from Dog 234 in Is 1.5 Volts Safe for overclocking a i7 920   
    Really depends on whether it's your main system or not. If it is, then I personally wouldn't run it against the wall, just because it's such an old CPU that any performance gains you'd get, while probably a decent amount relative to stock clocks, still wouldn't amount to "good" performance in a lot of things anyway. If, however, this is just a secondary "fuck around" machine, then I'd say sure, why the hell not? In that case its relative age and thus "value" isn't worth losing sleep over, so have fun with it! =) 
  14. Agree
    Coachdude got a reaction from YoungBlade in Is 1.5 Volts Safe for overclocking a i7 920   
    Really depends on whether it's your main system or not. If it is, then I personally wouldn't run it against the wall, just because it's such an old CPU that any performance gains you'd get, while probably a decent amount relative to stock clocks, still wouldn't amount to "good" performance in a lot of things anyway. If, however, this is just a secondary "fuck around" machine, then I'd say sure, why the hell not? In that case its relative age and thus "value" isn't worth losing sleep over, so have fun with it! =) 
  15. Like
    Coachdude got a reaction from frozensun in overclocking 3900X via Ryzen Master   
    I respect your opinion, of course; it's your hardware, so do what you want with it. =) Planned obsolescence is definitely a real thing, but I doubt it has much to do with the voltage tolerances of silicon. I've had both Intel and AMD chips in the past and they still work years later; I don't see this 3900X being any different. 
  16. Like
    Coachdude got a reaction from frozensun in overclocking 3900X via Ryzen Master   
    So I use a Be Quiet! Dark Rock Pro 4 CPU cooler. Idle temps fluctuate between ~35-40 °C depending on ambient. When running Cinebench R23, temps hover around ~76 °C. 
  17. Like
    Coachdude got a reaction from ViruzMage in overclocking 3900X via Ryzen Master   
    So I use a Be Quiet! Dark Rock Pro 4 CPU cooler. Idle temps fluctuate between ~35-40 °C depending on ambient. When running Cinebench R23, temps hover around ~76 °C. 
  18. Like
    Coachdude got a reaction from tuunade98 in what will event viewer say if pc crashes from high cpu temps?   
    If the PC is connected via a UPS, that largely rules out dirty power and would then imply your undervolt probably isn't fully stable. I don't have much experience overclocking and/or undervolting personally, as I run all my hardware at stock, so perhaps someone more knowledgeable could chime in further, but from everything you've told me, I'd guess your undervolt probably isn't fully stable and 2042 just exposed that. You can either test more/play more, see if you end up with any errors or further crashes, and adjust your undervolt accordingly, or revert to stock settings and see what happens. Things like this are really difficult to troubleshoot over a text-based forum just because of how generic the error is; it could be a multitude of things, so you'll just have to check and test on your own. Sorry if this isn't very helpful, but there's only so much one can do without physical access to the machine. 
     
    I'll repeat, though: it sounds to me like an unstable undervolt/overclock. Whatever you end up doing, I hope you get it sorted. Cheers. =)
  19. Informative
    Coachdude got a reaction from ebprince the computer nerd in is first-gen ryzen still decent value?   
    It really depends on what you're going to be doing with it, to be honest. First-gen Ryzen's single-core performance is pretty lackluster nowadays, but if you can get a 1700 for about $100 and you use it for things that can leverage the core count, then it could still be decent value. If your main thing is going to be gaming, though, honestly I think a $150 3600 would probably be better suited for that specifically. You could also wait and see if AMD drops pricing on the 5600X on account of Alder Lake, but I wouldn't expect too much; probably around $250 eventually, if I had to guess. 
     
    Keep in mind the prices listed are what I would personally pay for each part, not necessarily what they're actually going for. I wouldn't pay any more than $100 for a 1700 (or generational equivalent).
     
    Keep in mind also that Intel will be releasing lower-end parts in the Alder Lake lineup, such as the eventual 12400 etc., that might outright obsolete the 5600X in terms of price to performance. So I'd really recommend waiting a bit to see what their entire lineup looks like and what AMD ends up doing in response. 
     
     
  20. Like
    Coachdude got a reaction from CommanderAlex in Post your Cinebench R20+15+R11.5+2003 Scores **Don't Read The OP PLZ**   
    Your system, your rules, of course. Have a good day. 😃
  21. Agree
    Coachdude got a reaction from CommanderAlex in Post your Cinebench R20+15+R11.5+2003 Scores **Don't Read The OP PLZ**   
    I'm not blaming you. I'm just warning you that a static voltage that high on Zen 2 is definitely degradation territory. I'd put it back to stock if I were you. And I wouldn't use any auto-overclocking software like that, because that's definitely way too much voltage for a static OC. 
  22. Agree
    Coachdude got a reaction from CommanderAlex in Post your Cinebench R20+15+R11.5+2003 Scores **Don't Read The OP PLZ**   
    Just a friendly word of advice: you're going to seriously degrade and/or damage your CPU by running a static voltage that high, especially at that kind of clock speed. I'd dial that bitch right the fuck down if I were you. Read up some more if you still want to overclock Zen 2, and run it at stock until you have a firm grasp of things. 
     
    If this is just for this test, it's probably fine, but no way should you run that as a 24/7 overclock. 
  23. Like
    Coachdude got a reaction from Mister Woof in 1600 af vs 3600   
    This right here, but personally I'd even go for the 10400, seeing as it's only 10-ish dollars more than the F version, and it comes with an iGPU. Even if you know you're never going to use it, having it as a backup is more than worth the extra 10 dollars or so they're asking for it. 
     
    That aside, the 10400(F) will match the 3600 and even beat it in some games. It is a bit slower in production-type tasks, but it's still way faster than the 1600 AF and definitely way more deserving of your cash at this price point. 
  24. Agree
    Coachdude reacted to Mister Woof in 1600 af vs 3600   
    10400F for $155 IMO
     
    https://www.amazon.com/Intel-i5-10400F-Desktop-Processor-Graphics/dp/B086MHSTWN
     
    Best bargain right now for a gaming 6-core. Much faster than the 1600AF at about the same price.
  25. Agree
    Coachdude reacted to TheBean in Ram usage not exceeding more than 50 %   
    The game will only use however much it needs. Adding more does not mean that the game will use more.
     
    If I have 8 GB of RAM but the game wants 6 GB, it will use 6 GB.
    If you then upgrade to 16 GB, the game still only wants 6 GB, so nothing will change.
     
    Adding more RAM is only useful if you're already using 80% or more of it, or if you want to be able to keep more things open at the same time.