
maartendc
Member · 2,037 posts
Everything posted by maartendc

  1. It depends on whether you are talking about upgrading an existing system or building a new one. If you are upgrading the GPU and keeping your CPU, or vice versa, sure, get whatever you want. With any given CPU, an RTX 4080 will be faster than an RTX 4070. Period. When building a new system, however, it doesn't make sense to spend your money on a $1600 RTX 4090 and pair it with a $100 Core i3 CPU. You will get better performance in more games by going with, for example, a $1200 RTX 4080 and putting the $400 you saved toward a $500 i7 CPU. The total amount spent is the same, but it really does matter where you spend it. I think that is where the cliche of "is my system bottlenecking" comes from. ---- Just an afterthought: we are kind of in unprecedented times with AM4 right now. Many people own an older AM4 CPU, like yourself with the 3900X (although I wouldn't consider that a 'weak' CPU). These people can potentially jump three generations to a Ryzen 5000 series chip without upgrading anything else. For those people, when upgrading their GPU it might make sense to get a tier lower than they normally would, if that allows for an upgrade to a 5800X3D. Just yesterday I encouraged someone to get a 6950XT + 5800X3D over a 7900XTX + keeping his old CPU. For CPU-heavy titles, that would be the better combo. But like I said, this is kind of unprecedented, because typically you'd have to spend way more on a meaningful CPU upgrade, including the motherboard and maybe RAM, so that choice was not really an option.
  2. This is the correct answer to any and every question of the form "Will X bottleneck Y?" There should be a sticky post about it somewhere at the top of the forums. The answer is: something will always be the limiting factor, either the GPU or the CPU. In general, the best solution is to not overspend on either the CPU or the GPU, but to balance your budget across both.
  3. I'm confused. The GeForce 4000 series architecture is codenamed Ada Lovelace. It is out now. I would agree that it will be a LOONG time before we see the RTX 5000 or RX 8000 series, judging from the fact that the RTX 4000 and RX 7000 lineups aren't even complete yet. There was a full two years between the RTX 3090 and the 4090.
  4. Wow, very informative. This makes a lot of sense. I had no idea their manufacturing of GPUs and server CPUs was connected like that. Well, I guess we will really have to wait until there is absolutely no 6000 series stock left and they are losing money and market share. I guess until Nvidia releases the 4060 and 4050s they don't really have to hurry, since they own the lower and midrange market now.
  5. Lol. That got a chuckle out of me. You make a good point; I had never thought of or heard that comparison before. But I also don't remember PhysX ever gaining the traction that RT has so far. I hardly remember any games that had it; the only time I ever saw it was in Nvidia marketing slides while installing a new driver or something. RT, on the other hand, is being implemented more and more widely. Not sure if they can offload it to the CPU, but anything is possible I guess. Raw performance is raw performance now though. Can't argue with that. I honestly haven't seen any games I would play with RT features compelling enough to take the performance hit. And I wonder if today's RTX cards will have the horsepower to drive RT features a few years from now. Like, can you really run all the new Cyberpunk 2077 RT features at decent framerates on an RTX 2080? I doubt it.
  6. Ok, got it. I'd say it's up to you whether it's worth it. Like I said, the 6950XT gives 50-60% more performance. Some people upgrade every generation for only a +25% performance increase. If spending 650 to get that is worth it to you, then it's a good upgrade. Only you can decide. I'd say 50-60% extra performance is worth upgrading for if you have the money. Also consider that you can probably still sell the 6750XT for like 350 bucks, so the real cost is more like +300.
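     The upgrade math above can be sketched in a few lines. All figures are the rough estimates from the post (650 for the card, ~350 resale, 50-60% uplift), not measured benchmarks, so treat the output as illustrative only:

     ```python
     # Back-of-the-envelope upgrade value, using the rough figures from the post.
     upgrade_price = 650   # price of the 6950XT (estimate from the post)
     resale_value = 350    # estimated resale value of the old 6750XT
     perf_gain_pct = 55    # midpoint of the 50-60% uplift estimate

     # Net cost is what you actually pay once the old card is sold on.
     net_cost = upgrade_price - resale_value
     cost_per_pct = net_cost / perf_gain_pct

     print(f"Net cost after resale: {net_cost}")                  # 300
     print(f"Cost per % of extra performance: {cost_per_pct:.2f}")  # 5.45
     ```

     Swapping in your own local prices is the whole point: the "cost per percent gained" number is a quick way to compare any two upgrade paths.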
  7. Generally this is somewhat true, but not really the case with the 5800X3D. The 5800X3D will give both higher average framerates and better frametime performance. In the Hardware Unboxed review's 8-game average, the 5800X3D compared to the 5800X (generally very similar to your 5700X) gives 22.6% better average framerates and 27.6% better minimum framerates (so better smoothness). This is at 1080p with a 3090Ti (comparable to a 6950XT), though. So this is similar to the performance uplift from the 6950XT to the 7900XTX, at less money spent (+290 EUR vs +450 EUR). At 1440p the difference will be smaller, as you will be GPU limited in more cases. But it still gives a good indication of how much faster the 5800X3D is in games than the 5700X. Also keep in mind that this is an average over 8 games; some games see more difference than others (up to 50%, or almost nothing). In the end, the more expensive 5700X + 7900XTX combo will probably still be faster in some games than the 5800X3D + 6950XT, and it would really depend on the game. In CPU-heavy games like Warzone 2 (with heavy CPU usage because of the massive open world), the 5800X3D + 6950XT would probably win. (The video below shows a guy going from 200 fps to 300 fps with the 5800X3D vs the 5800X in Warzone 1, so +50%.) In more GPU-heavy single-player titles, the 5700X + 7900XTX would probably win. You cannot really go wrong; both are strong combos. LONG STORY SHORT: If you mostly care about Warzone 2 performance, I would get the 5800X3D + 6950XT. It will probably be fastest.
  8. The 7900XTX is only roughly 25% faster than the 6950XT, at roughly double the price... so I'd say it is not good value. The 6950XT should be roughly 50-60% faster than your 6750XT; whether that is worth the money is up to you. It is a significant upgrade, I think. The best value upgrade for your money right now is probably getting the 6950XT and dropping in a 5800X3D (currently around 290 EUR). That will probably give you the best experience in both average framerates and frametimes in most games. With that combo, you should be good for a couple of years to play anything at 1440p. The boost from going to the 5800X3D might even be as big as the boost from the 6950XT to the 7900XTX in certain games...
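     To show why "25% faster at double the price" is bad value, here is a quick price-to-performance comparison. The prices are placeholder figures I made up for illustration (only the "roughly double" ratio and the ~25% uplift come from the post):

     ```python
     # Price-to-performance comparison. Prices are placeholder figures;
     # only the ratios (~25% faster, ~2x the price) come from the post.
     def perf_per_money(relative_perf, price):
         """Relative performance points per unit of currency spent."""
         return relative_perf / price

     perf_6950xt = 100      # baseline performance index
     perf_7900xtx = 125     # ~25% faster per the post
     price_6950xt = 650     # placeholder price
     price_7900xtx = 1300   # "roughly double the price"

     print(f"6950XT:  {perf_per_money(perf_6950xt, price_6950xt):.4f}")   # 0.1538
     print(f"7900XTX: {perf_per_money(perf_7900xtx, price_7900xtx):.4f}")  # 0.0962
     ```

     At those ratios the 6950XT delivers about 60% more performance per unit of money, which is the whole argument in one number.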
  9. Hello all, With the release of the 4070Ti and 4070, and AMD having to drop pricing on the 6950XT drastically to compete with the 4070, I was wondering: why is AMD seemingly so late in releasing their upper-midrange and midrange GPUs this generation? Are they just sitting on stacks of 6950XTs, 6800XTs and 6700XTs that they are hoping to sell before releasing a 7800XT and 7700XT respectively? Or are they just not done with the design / manufacturing of the lower-tier 7xxx models yet? It seems to me they are in a vulnerable position right now, unless they are still selling boatloads of 6950, 6700 and 6800 cards? I kind of doubt it, because these cards are now 2+ years old, and people are holding out for the new gen or going Nvidia. The exception is perhaps the 6950XT, since it is priced so aggressively against the RTX 4070. It seems to me that if AMD released a 7800XT right now, at $600 or $550, with 16GB of VRAM and 6950XT levels of performance, they would sell LOADS of them, since it would just make the RTX 4070 look bad. Thoughts?
  10. I actually played it for hours and hours, right up to the last few missions. I never had a problem with any bugs, game ran fine for me. I just thought it was a very mediocre open world game. The main story was pretty enjoyable, but apart from that, the world just felt empty and lifeless to me. A far cry from The Witcher 3, which had so many interesting side quests around every corner.
  11. I will say Red Dead Redemption 2 on Ultra settings still looks very good, even though it is a few years old. It may not have the latest ray tracing etc., but the atmosphere and just the general quality of the game make it look soooo good. Also, in the newest Call of Duty (Modern Warfare 2?), the Amsterdam level looks insane.
  12. It is so funny to me that Cyberpunk looks impressive visually and still gets the latest Nvidia features patched in etc. .... ... but the game is still just not fun to play :D.
  13. Check whether the PSU manufacturer sells the cables separately, and just buy another proper PCIe power cable with the 8-pin output. What you are doing sounds like a bad idea, but I'm no PSU expert.
  14. From what I can find online, the AMD FirePro W5000 runs on / requires (?) a PCIe 3.0 slot. The minimum graphics requirement for Diablo IV is an NVIDIA® GeForce® GTX 660 or AMD Radeon™ R9 280, so it is not very demanding. Something like a used GTX 970 or faster would be great, as others have mentioned. But check how many PCIe power connectors the PSU has, and make sure you don't get anything that requires more plugs than you have... it depends on the specific model. Other than that, don't bother upgrading anything. See if it runs, and if it doesn't, get a new PC. The Quadro M1000M is about 5x slower than the GTX 660 that is the minimum requirement, so no, it won't run on that laptop.
  15. That's helpful, thanks. Yes, I had a quick look at DDR3 pricing, and it is actually still really cheap. So DDR4 should still be cheap in a few years, I guess.
  16. Hello all, My wife's Framework laptop came with 16GB of DDR4 3200 RAM. With RAM prices being so cheap at the moment, should I upgrade it now to 32GB? Currently she doesn't have any use for more than 16GB, but a few years down the line, with the software she runs (Illustrator, Photoshop, etc.), I think she might make use of it. Is DDR4 being phased out for DDR5, so is now the best time to upgrade? Or will DDR4 SODIMMs still be available at good prices, say, 3 years from now? In other words, will DDR4 slowly creep back up in price from where it is now? Thanks!
  17. Honestly, it has been a while since I have connected my iPhone to a Windows machine; I use a Mac on the daily. It used to be impossible to get an iOS device to show up as removable storage on Windows without iTunes. Maybe they changed that somewhere down the line. On macOS it still doesn't work like a normal removable storage device. You have to go through the Photos app, which I don't like to use. You can't just manually get the photos off and place them in a folder. And cloud storage: sure, you can use that, and I do. You get 5GB of free iCloud storage, which fills up in like 2 weeks, after which they will happily charge you the subscription fee. I just feel they push everyone to iCloud to sell those subscriptions, and make every other solution harder than it needs to be on purpose. Well, there is AirDrop, which only works between macOS and iOS. But it does what you are describing, kind of. You can drag and drop files between devices wirelessly as long as they are on the same Wi-Fi network and Bluetooth is enabled. But it is not like you can see the whole storage of the device and browse the files. Also, it is painfully slow. I don't know whether it actually transfers over Wi-Fi or Bluetooth, but it takes forever to transfer larger files.
  18. If he can test the card and verify it works 100% before buying, I would consider buying used. But there is also the issue of warranty. The performance difference between the 6750XT and the 3080 for the same money is not that big (about 26%). I mean, it is something, but it is still in the same "class" of card; it won't make the difference between a game being playable or unplayable. Personally, I would take a card with a warranty over that performance increase... it depends on your risk tolerance.
  19. Exactly. I don't understand how his channel even has 100 subscribers, let alone 150k. But other media outlets taking his garbage and making articles out of it is certainly not helping. And neither is posting his garbage on here as news items.
  20. True. And indeed, the convoluted ways of transferring files to and from iOS devices are really annoying. I really don't understand why they do this. To force people to get an iCloud subscription? Just last week a colleague was asking how to get photos off their iPhone onto a Windows computer; it is just not intuitive and not a good user experience. And iTunes is not even their main media app anymore, because they have Apple Music and Apple TV etc. iTunes is like this weird legacy software that they still force people to use for certain functionality. Everything is done through iCloud now, including phone backups, but as soon as you connect with a cable, you need iTunes for a lot of things. Weird.
  21. Yeah, but you don't understand, man, they just "changed plans" again. It was totally legit and definitely not made up. He should change his channel's name to "Definitely not made up".
  22. Haha, I knew someone was going to say that. Bro, MLID always prefaces every supposed "leak" he pulls out of thin air by saying "of course this is still subject to change". So basically he can never be wrong, eh? If he was wrong, there was just a "change of plans". If he was right, there happened to be no change of plans. Genius. Even a broken clock is right twice a day.
  23. LOL. The reason Apple forces everything through the App Store is obviously not that it is just "a feature they didn't have yet". They want to be able to take that 30% revenue cut on anything anyone purchases on iOS. Having third-party stores opens the door to them missing out on that revenue. Furthermore, I will posit that 99% of all Android users only install apps through the Google Play Store, so it doesn't really matter too much either way. I've owned several Android phones. I consider myself a tech-savvy user, but even I never saw the need to go outside the Google Play Store. And my mother, grandmother, etc. most certainly wouldn't. The only thing I would see as useful is if this ruling allows developers to sidestep Apple for in-app purchases and subscriptions. That is where the real money is.
  24. I don't deny there are useful homebrew tools and hacks. I am just saying that the #1 reason 95% of people jailbreak or mod their consoles is to play pirated games. You might be part of the other 5%.