drumn_bass

Member
  • Posts: 110
  • Joined

  • Last visited

Reputation Activity

  1. Like
    drumn_bass reacted to Zando_ in ASUS Z690 and multiple NVME drives. Drives become unresponsive under load.   
    Yep. Mainstream platforms just don't have enough PCIe lanes. Sounds like you're simply bouncing off the bandwidth limits of the lanes you have available. You have a DMI 4.0 x8 uplink from the chipset to the CPU. That's equivalent bandwidth to PCIe 4.0 x8 IIRC, and all drives in chipset slots plus a lot of your other I/O (USB, NIC, etc.) also go through it. It only takes 2 fast PCIe 4.0 drives running full bore to hit that limit or get very close to it. That's why it's OK with 2 of them, but once you add a 3rd drive of any kind and load them all, it goes over the bandwidth limit and chokes.
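A rough back-of-the-envelope check of that math (the per-lane throughput and the drive speed below are approximations, not measured values):

```python
# Rough sanity check of the DMI 4.0 x8 bandwidth budget described above.
# ~1.97 GB/s of usable bandwidth per PCIe 4.0 lane is an approximation;
# real-world throughput varies with protocol overhead and workload.

GB_S_PER_GEN4_LANE = 1.97
DMI_LANES = 8

uplink_budget = GB_S_PER_GEN4_LANE * DMI_LANES   # ~15.8 GB/s, shared by
                                                 # chipset drives, USB, NIC...

DRIVE_GB_S = 7.0   # a fast Gen4 x4 NVMe running full bore (assumed)

for n in (2, 3):
    demand = n * DRIVE_GB_S
    verdict = "over budget" if demand > uplink_budget else "fits (barely)"
    print(f"{n} drives: {demand:.1f} GB/s vs {uplink_budget:.2f} GB/s -> {verdict}")
```

With these numbers, two drives already sit close to the uplink's ceiling, and a third pushes total demand past it, which matches the behavior described.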
  2. Like
    drumn_bass reacted to RONOTHAN## in Overheating or not? I7-14700k   
    Those chips are designed to be pegged at 100°C, so it's not dangerous. It does mean the CPU is thermal throttling, though, and that you're going to lose some performance compared to what you'd get with a higher end cooler.
     
    If you don't have a contact frame, consider getting one: it's quite a bit cheaper than a new AIO and helps improve temps quite a bit. Though admittedly, for a 14700K, a 360mm AIO is a pretty good idea.
  3. Like
    drumn_bass reacted to Lukjo in 4080 or 4090 (future proofing?)   
    Hard to tell what's future proofing at this point.
    With Nvidia, nothing is future proof... they will lock features out of older cards just to sell the new ones.
    Example? The 4060 having the same performance as a 3060, where the only selling point is DLSS 3.5 and frame gen.
    What's funnier is that there was a driver bug that allowed 20 and 30 series cards to use frame gen, which Nvidia said wasn't possible, but they patched it silently (so much for a hardware-limited feature, huh).
     
    So for future proofing in terms of performance: sure, the 4090 won't ever fall out of favor. I don't really imagine games becoming more performance heavy, unless they become less optimized... but that won't include feature proofing, since Nvidia will lock you out of future features in order to sell their new cards.
    Nvidia will eventually make you consider buying a 50 series card with some shiny new feature (you might not, but you will still consider the thought; that's how they get you).
     
    If not the 50 series, maybe the 60 series will pull you in harder with an even shinier new feature. Maybe DLSS 5.0 with a frame gen that works at lower frame rates, making sure your GPU won't become obsolete, but... only available for 70 series. You get the gist? Nvidia will always have something to pull users in to upgrade, regardless of what they bought and when they bought it.
     
    I see you gave in and bought the 4090. Not a bad purchase, but a bad purchasing decision. Then again, it's your money and your wants.
    I did the same: upgraded from my 3070 to the 6950 XT because I saw a sweet deal (I also saw that my 3070 was quickly becoming obsolete with its lack of VRAM, and seeing how Nvidia is already feature-locking my card into the nothing realm, at least I can hope AMD will keep me up to date with features).
    Bad purchasing decision? Yes, 100%.
    But at the end of the day it's my money, and I won't try to justify my purchases.
  4. Like
    drumn_bass got a reaction from RevGAM in 4080 or 4090 (future proofing?)   
    @Echothedolpin
     
    That does make sense, but on the other hand: if instead of doing that I just buy a top tier card now, then, say, 6 years later, wouldn't it still likely be at least around the same level as a second tier card from a couple of years before that point? I hope that makes sense.
     
    Just to make sure I understand, when you say second tier, do you mean something like a 3080? So when the 50 gen launches I can get a 4080, at the 60 gen a 5080 (or even a 5070/70 Ti), and finally, say, a 6070 Ti when the 70 gen cards launch, assuming a new generation every 2 years. Let's say it costs $500 to upgrade in this fashion; in 6 years I'll be looking at $2000, minus, say, $300 a pop for reselling my previous cards, so $1100. Or I buy a 4090 now for $1440. It's technically a bit more, but I get to enjoy top tier power for the whole 6 years and can still sell it at that time, let's say for $400-500. So actually around the same overall, if not even a bit cheaper, plus less hassle with buying/selling used cards.
     
    I have a feeling, and there are some rumors, that Nvidia might be doing away with the whole new-generation-every-couple-of-years refresh, which might make things even more complicated. Plus I just saw some reports of 4090s actually going up in price in Europe... IDK man, just the uncertainty of how these things will develop, plus the inflation/world state and stuff like that. I kind of feel like, who knows, this might be the last chance to grab something really good before it all goes to sh...
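The two upgrade paths in the post can be sketched out with its own rough numbers (all prices and resale values here are the post's assumptions, not market data):

```python
# Path 1: buy a second-tier card now and step up each generation (~every 2 years).
purchases = 4            # e.g. 4080 -> 5080 -> 6070 Ti -> next tier, over 6 years
cost_per_upgrade = 500
resales = purchases - 1  # each outgoing card is sold; the last one is kept
resale_price = 300

path1_net = purchases * cost_per_upgrade - resales * resale_price
print(f"Stepwise upgrades, net over 6 years: ${path1_net}")   # $1100

# Path 2: buy a 4090 now and sell it in ~6 years.
buy_4090 = 1440
sell_4090 = 450          # midpoint of the $400-500 resale guess
path2_net = buy_4090 - sell_4090
print(f"4090 now, net over 6 years: ${path2_net}")            # $990
```

Under these assumptions the one-big-card path comes out slightly cheaper, as the post concludes, though both totals move a lot with the resale guesses.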
  5. Agree
    drumn_bass got a reaction from RevGAM in 4080 or 4090 (future proofing?)   
    Well, that's a great point. I can technically still live with the 3070, so I suspect whatever I get, I probably won't be using it for 5-7 years anyway. Thanks for your input!
  6. Like
    drumn_bass reacted to Crunchy Dragon in 4080 or 4090 (future proofing?)   
    You might get another 2-3 years or so out of a 4090 compared to a 4080. Both cards will be relevant for at least 6 years, and should be viable choices for at least 10.
     
    Realistically, and this is the obvious pain point with graphics cards in the modern era, you're limited to whatever Nvidia decides for whatever the newest graphics technology is. When you buy a graphics card, you're buying the software and technologies that GPU allows you to use (until such a time as the manufacturer adjusts it) more than you are the actual hardware itself.
     
    If you don't care about the newest and shiniest graphics technology, and maybe do a little bit of overclocking, there's no reason you shouldn't get 10+ years of use out of either card, depending on how the game industry goes and how your use case evolves.
     
    Personally, I've given up entirely on building one PC to rule them all for the conceivable future; I find it more beneficial to do smaller upgrades every 3-5 years, as I feel they become necessary.
  7. Like
    drumn_bass reacted to Tetras in 4080 or 4090 (future proofing?)   
    You have the 4080, you're happy with it, I'd just keep it. For 1440p it is overkill anyway.
     
    In reference to how long they will last, I somehow doubt you won't have a case of upgraditus way before either card is obsolete.
  8. Like
    drumn_bass reacted to emosun in Could the RTX 4090 be considered "worth it" simply due to it's extra VRAM?   
    The bad price to performance ratio gets worse the higher end you go, kind of like a race car: you pay more for smaller % increases.
     
    I'd get a 4080. The 4090 isn't worth the cost unless you just want the clout of a 4090.
  9. Like
    drumn_bass reacted to GTC in 4080 vs 4090   
    This is such stupid logic. "If you can afford X, why not just pay 30% more and get Y?"
    It makes no sense.
    Also, not everyone wants to just spend an extra $1000 on a rig, even if they could afford it.
  10. Like
    drumn_bass got a reaction from MisterBeast2169 in DaVinci Resolve eats up all available memory, then crashes.   
    @MisterBeast2169
     
    You're welcome, my friend 🙂
  11. Like
    drumn_bass reacted to MisterBeast2169 in DaVinci Resolve eats up all available memory, then crashes.   
    I was sitting here for like 5 days with Davinci just crashing every few minutes, with no problems quite like this before. Trying to retrace my steps, and after those gut wrenching days I finally found the key information I needed. I love you.
  12. Like
    drumn_bass got a reaction from MisterBeast2169 in DaVinci Resolve eats up all available memory, then crashes.   
    A quick update in case someone with a similar issue comes across this topic.
     
    With some help from the AMD forums, I now know (tested and confirmed) that the latest beta of DaVinci Resolve, version 18.5, works fine with all AMD drivers. AMD driver 23.4.2 broke something, but apparently an update to Resolve fixes it. So it is a workaround, but there is hope that one of those companies, or both, can figure out what's going on and fix it for good.
     
    In conclusion, two working (though not ideal) solutions are:
     
    1. Downgrade the AMD GPU driver to 23.4.1, or...
     
    2. Download the beta version of DaVinci Resolve (18.5), which works with all AMD drivers at this time, including the latest as of today, 23.4.3.
  13. Like
    drumn_bass got a reaction from OddOod in AV1 / Davinci Resolve / 4070Ti   
    @Paul17
     
    I like this 4070 Ti TUF I got here. It matches my TUF Gaming motherboard (which is OK, I guess; I've had my fair share of issues with it too, and had to replace it once, I have a thread on that here somewhere). There is a bit of coil whine under full load, but it's actually quieter than the XFX MERC 310 RX 7900 XT, and much quieter than the MERC 6950 XT I tried before it. I returned them both.
     
    20 gigs of VRAM and performance closer to a 4080 than a 4070 Ti looked good on paper and in benchmarks, but in 2 weeks or so I experienced several driver and hardware related issues with both AMD cards. The computer sometimes failed to go to sleep, and I had to shut it down manually by holding the power button; others are seeing it too, and it's still unresolved. Removing Adrenalin and using the driver only may be a workaround, but that's not confirmed.
     
    I experienced a memory leak with DaVinci Resolve, causing it to crash after about a minute of use. The issue went away with the 18.5 beta of Resolve. In general, Resolve was laggy while editing, with frequent playback freezes, but exports were great, very quick. Forza Motorsport 7, which should be easy to run, had a stutter once every second. I fixed it by disabling ULPS, then tried again after a clean W11 install with no Adrenalin; it worked better, but still stuttered here and there. Other games had crazy stutter too; even when they displayed high FPS, it was very noticeably not smooth. Might be a FreeSync issue, I don't know. I gave up and exchanged it for the 4070 Ti.
     
    The 7900 XT also uses about 10 times (!) more power at idle. With 2 monitors, one a high refresh 1440p and the second a standard 60 Hz 1080p, it was drawing 85 W at idle, while the 4070 Ti draws 8-9 W. On top of that, I had several game freezes and crashes, and it made my computer boot slower by about 12-15 seconds. And I'm sure there is more; it's just what I ran into, so idk if I can recommend the 7900 XT to anyone.
     
    People often talk about AMD's terrible drivers, and I thought by now they would surely have it figured out, but that was definitely not my experience.
     
    I figured I'll just go with the 4070 Ti for now. I don't have any issues with it so far. Not a fan of only 12 GB of VRAM, but I will probably swap it for something better in a couple of years and pass it down to my kid (who just got my older 3070 as an upgrade from a 1070). Lots of people say Nvidia got too greedy, and I get it: the 4080 is too expensive, the 4070s have only 12 GB of VRAM, so people look at AMD as the better option, and on paper it looks like it. But in real life, for me at least, it was nothing but a pain in you know what... Just thought you should know if you're considering it.
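For what the idle-power gap mentioned above could mean on a power bill, here's a rough annualized estimate (hours per day at the desk and the electricity price are assumptions):

```python
# Annualized cost of the ~85 W vs ~9 W idle draw difference reported above.
IDLE_W_7900XT = 85
IDLE_W_4070TI = 9
HOURS_PER_DAY = 8        # assumed time at the desk, mostly idle/light load
USD_PER_KWH = 0.15       # assumed electricity price

extra_kwh_per_year = (IDLE_W_7900XT - IDLE_W_4070TI) / 1000 * HOURS_PER_DAY * 365
extra_usd_per_year = extra_kwh_per_year * USD_PER_KWH
print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/yr (~${extra_usd_per_year:.0f}/yr)")
```

Noticeable on a bill, though not the main complaint; scale the hours and price to your own situation.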
  14. Informative
    drumn_bass got a reaction from OddOod in AV1 / Davinci Resolve / 4070Ti   
    @Paul17
     
    I did try posting on the BMD forums, about 8-9 hours ago; it said my post needed to be approved by moderators first, and it's still not there. Their forum is kind of weird, to be honest: they insist on using real names, and apparently the content is curated, so they may or may not allow it, and if they do, god knows when.
     
    I'll be surprised if it does in fact require the paid version of the software, because it's available in the free one with an AMD GPU. I tested it again: replaced the 4070 Ti with the 7900 XT, and the AV1 option appeared under MP4. Back to Nvidia... it's gone.
     
    (But I guess it may be possible that it's enabled in the beta for AMD for testing purposes, and once it moves into a stable version it will become a paid-version feature. There is no trial, the free DR is the trial, so idk, there may be a way to test, but I probably can't talk about it here, you know what I mean?)
     
    I also posted on the Nvidia forums 8 hours ago. Silence.
     
    It was a clean Windows 11 installation last night when I first tried it, so there should be no driver conflicts. The option just isn't there, while both BM and Nvidia say it's supported.
  15. Like
    drumn_bass got a reaction from Paul17 in AV1 / Davinci Resolve / 4070Ti   
    @Paul17
     
    I like this 4070 Ti TUF I got here. It matches my TUF Gaming motherboard (which is OK, I guess; I've had my fair share of issues with it too, and had to replace it once, I have a thread on that here somewhere). There is a bit of coil whine under full load, but it's actually quieter than the XFX MERC 310 RX 7900 XT, and much quieter than the MERC 6950 XT I tried before it. I returned them both.
     
    20 gigs of VRAM and performance closer to a 4080 than a 4070 Ti looked good on paper and in benchmarks, but in 2 weeks or so I experienced several driver and hardware related issues with both AMD cards. The computer sometimes failed to go to sleep, and I had to shut it down manually by holding the power button; others are seeing it too, and it's still unresolved. Removing Adrenalin and using the driver only may be a workaround, but that's not confirmed.
     
    I experienced a memory leak with DaVinci Resolve, causing it to crash after about a minute of use. The issue went away with the 18.5 beta of Resolve. In general, Resolve was laggy while editing, with frequent playback freezes, but exports were great, very quick. Forza Motorsport 7, which should be easy to run, had a stutter once every second. I fixed it by disabling ULPS, then tried again after a clean W11 install with no Adrenalin; it worked better, but still stuttered here and there. Other games had crazy stutter too; even when they displayed high FPS, it was very noticeably not smooth. Might be a FreeSync issue, I don't know. I gave up and exchanged it for the 4070 Ti.
     
    The 7900 XT also uses about 10 times (!) more power at idle. With 2 monitors, one a high refresh 1440p and the second a standard 60 Hz 1080p, it was drawing 85 W at idle, while the 4070 Ti draws 8-9 W. On top of that, I had several game freezes and crashes, and it made my computer boot slower by about 12-15 seconds. And I'm sure there is more; it's just what I ran into, so idk if I can recommend the 7900 XT to anyone.
     
    People often talk about AMD's terrible drivers, and I thought by now they would surely have it figured out, but that was definitely not my experience.
     
    I figured I'll just go with the 4070 Ti for now. I don't have any issues with it so far. Not a fan of only 12 GB of VRAM, but I will probably swap it for something better in a couple of years and pass it down to my kid (who just got my older 3070 as an upgrade from a 1070). Lots of people say Nvidia got too greedy, and I get it: the 4080 is too expensive, the 4070s have only 12 GB of VRAM, so people look at AMD as the better option, and on paper it looks like it. But in real life, for me at least, it was nothing but a pain in you know what... Just thought you should know if you're considering it.
  16. Like
    drumn_bass got a reaction from OddOod in AV1 / Davinci Resolve / 4070Ti   
    Thanks guys. Well, it was definitely included when I used an AMD card, in the free version, but IDK, maybe some special treatment for AMD vs Nvidia.
     
    And yes, I tried both the latest stable release of DR, and the latest beta. The AV1 support for AMD was added in the beta. Nvidia was supposedly supported for a few months, and Intel Arc for about a year, from what I can see.
     
    At the end of the day, it's not that big of a deal, just something I assumed would be there, and it's not :( You can't be too picky about free software that is already so powerful it's hard to believe they're giving it away.
  17. Like
    drumn_bass reacted to Needfuldoer in How to safely move my PC   
    Lift with your knees, not your back.
     
    Just hand-carrying your PC next door should be fine. It's not going to get tossed around like it would in shipping.
  18. Like
    drumn_bass got a reaction from cwil1 in How to safely move my PC   
    Yeah, man, it'll be just fine. I think what they mean is if you have to put it in a moving truck, or ship it, you want to either have more support, or remove a GPU, but just carrying it over for a few minutes should have no effect on anything. I carry my PC in and out of my office all the time. Never had any issues.
     
     
  19. Like
    drumn_bass reacted to RONOTHAN## in How to safely move my PC   
    If you're just walking next door, I wouldn't even bother with the box. Just don't shake or drop your computer and you should be fine. 
  20. Like
    drumn_bass reacted to andrewmp6 in 7900 xt or 4070 tits?   
    If you want to save some money, the 6950 XTs are all under $700 US now, and maybe 10% slower, if that, in some games at 1440p.
  21. Like
    drumn_bass reacted to WallacEngineering in 7900 xt or 4070 tits?   
    I'm probably going to get the ASRock Taichi RX 7900-XT for 3440x1440 UltraWide. Seems like the perfect card for high refresh rate at Ultra settings, or very high refresh rate at High settings. My goal is a card that will last at least a few years at 100-120 FPS on High settings, and it looks to be the perfect solution.
     
    I was just talking about this over on my thread about why used GPU prices are stupid right now and you should just buy new:
     
    If you are planning on playing games at standard non-ultrawide 1440p, then the RX 7900-XT will be an absolute MONSTER. You will be able to push almost every game ever made at Ultra settings with a framerate over 120 FPS. I personally don't consider Ultra settings necessary, but it would be rather nice to have.
  22. Like
    drumn_bass reacted to Mista G in Thinking about the 7900 XT/X   
    I'm not familiar with the ASRock brand, so I usually shy away from them. However, I have owned a few XFX and Sapphire cards over the years, and they have all been great in my experience.
  23. Like
    drumn_bass reacted to Ishimuro in DaVinci Resolve eats up all available memory, then crashes.   
    I am happy to report that upgrading from the 18.1 official release to the 18.5 beta indeed fixed the problem. Now 20 gigs of VRAM are plenty xD Sipping around 6 GB while editing GoPro 11 footage.
     
    System: R9 3900X, 32 GB RAM and a 7900 XT Founders.
     
    This forum just rules.
  24. Like
    drumn_bass got a reaction from Ishimuro in DaVinci Resolve eats up all available memory, then crashes.   
    A quick update in case someone with a similar issue comes across this topic.
     
    With some help from the AMD forums, I now know (tested and confirmed) that the latest beta of DaVinci Resolve, version 18.5, works fine with all AMD drivers. AMD driver 23.4.2 broke something, but apparently an update to Resolve fixes it. So it is a workaround, but there is hope that one of those companies, or both, can figure out what's going on and fix it for good.
     
    In conclusion, two working (though not ideal) solutions are:
     
    1. Downgrade the AMD GPU driver to 23.4.1, or...
     
    2. Download the beta version of DaVinci Resolve (18.5), which works with all AMD drivers at this time, including the latest as of today, 23.4.3.
  25. Like
    drumn_bass got a reaction from jsnotlout1 in DaVinci Resolve eats up all available memory, then crashes.   
    @sub68
     
    Never mind man.
     
    I just went into Device Manager - Display adapters - Properties for the AMD GPU - Driver - Roll Back Driver. It did its thing, and I'm back one driver version, from 23.4.2 (which I got just a couple of days ago) to 23.4.1. And IT ******* WORKS! Resolve now works normally. So thank you for the excellent idea/hint! I guess now I can try updating again and see if the issue comes back, then just stay on the previous version for a bit. Or just keep it here since it's already fine, and try an update when the next driver version comes out.
     
    I knew I could count on this community. An AMD driver. After all, they're kind of famous for it.