
Origami Cactus

Member
  • Posts

    4,846
  • Joined

Everything posted by Origami Cactus

  1. Looks fine, but you are way overspending on the fans and have an overspecced PSU. Drop those fans, use the ones that come with your case, and get a 6600XT as the GPU instead; if you need a bit of extra budget, buy a 650W PSU, which should cost about $70 less. If you are worried about upgrading your GPU in the future: with a 5600X you could run a 3090/6900XT on a 650W PSU and be fine. The RAM also seems kinda expensive, as 32GB kits are like $100 if I converted the currency correctly. But absolutely drop the Noctua fans and get a 6600XT, and you will be golden. I recommend looking at Sapphire cards, they are the best AIB for AMD GPUs.
  2. It puts out less heat (lower TDP), but the silicon itself runs hotter: it has an extra barrier of vcache between the heatspreader and the CPU cores, while on the 5900X the cores are spread over 2 CCDs, so the heat is less concentrated and dissipates better. BUT the chip itself running hotter is completely meaningless; it doesn't reduce lifespan, and it doesn't heat up other components more, since the heat output itself is lower, so idk why you would even be worried about that. The cores running hotter will make literally no difference in your usage of the computer, so I can't understand why you would put ANY weight on CPU core temp when debating between these 2 CPUs. Higher core temp doesn't mean higher heat output! It won't heat up your room more, etc. Look at the gaming performance instead.
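The core-temp-vs-heat-output distinction above can be shown with a toy steady-state model, T_core ≈ T_ambient + P × R_thermal. All numbers below are invented for illustration, not real chip specs:

```python
# Toy thermal model: reported core temperature vs. actual heat output.
# All figures are hypothetical, purely to illustrate the argument.

def core_temp(power_w, r_thermal, t_ambient=25.0):
    """Steady-state core temp: ambient + power * junction-to-ambient resistance."""
    return t_ambient + power_w * r_thermal

# The "x3d-like" part draws LESS power but has a HIGHER thermal
# resistance because of the extra vcache layer over the cores.
plain = core_temp(power_w=140.0, r_thermal=0.35)  # a 5900X-like part
x3d   = core_temp(power_w=105.0, r_thermal=0.55)  # a 5800X3D-like part

print(f"plain: {plain:.1f}C, x3d: {x3d:.1f}C")
# The x3d cores READ hotter, yet it dumps less heat into the room,
# because the room only ever sees the power draw, not the core temp.
assert x3d > plain
```

The point of the sketch: the room heats up in proportion to power (watts), while the sensor reading also depends on how hard it is to get the heat out of the die.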
  3. I currently bike as a time-killing / light exercise, but I have seen people rollerskating and it seems fun and much more of a workout. It also seems more fun than just running. So is there anyone on this forum who does this? Any advice? Should I pick it up as a hobby? I was never good at skating, but a few winters ago I could skate without aid and without falling over, so I am sure I could learn to do the same on asphalt as on ice.
  4. If you only need to connect 1-2 more fans, don't get a fan controller; just get either one 3-way PWM splitter, or two 2-way splitters. Way cheaper, you don't need to install bloatware, and everything will work exactly as it should. RGB is a bit more difficult: some RGB can be used with splitters, some can't.
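One thing to sanity-check with splitters: all the fans draw power through a single motherboard header. Headers are commonly rated around 1 A at 12 V, but that figure is an assumption here; check the board's manual. A quick sketch of the arithmetic:

```python
# Splitter sanity check: the summed current draw of all fans on one
# splitter must stay under the header's rating. The 1 A limit and the
# per-fan currents below are assumptions, not specs from the thread.

HEADER_LIMIT_A = 1.0

def header_ok(fan_currents_a, limit_a=HEADER_LIMIT_A):
    """True if the combined fan current fits within the header's rating."""
    return sum(fan_currents_a) <= limit_a

# Three hypothetical 0.25 A case fans on one 3-way splitter:
print(header_ok([0.25, 0.25, 0.25]))  # True -- fits comfortably
# Five of the same fans would exceed the header's budget:
print(header_ok([0.25] * 5))          # False
```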
  5. It would be fine. It would definitely bottleneck a 4090 at 1440p high-FPS, but even then it would probably hit 144fps, and it is not like you have a 240Hz monitor. So a 3080/3090 would be completely fine.
  6. So when I hit the GPU core with 100% utilization it just ramps fan 1 up to 100%, even if the hotspot temp, memory temp, and GPU core temp are all completely fine. WHY? Here is a screenshot of it happening; you can see that both fans are ramped to 100%, even tho I used a custom fan curve in MSI Afterburner. It only happens in some workloads, not others, and only for a few seconds at a time. I have updated the GPU firmware to get resizable BAR support; I don't think it did this before that. Here is a GPU-Z reading, it shows PerfCap reason "Thermal", but nothing is overheating? Is some part of the GPU die hitting T-junction max and causing that? Because the hotspot is completely fine, it is 7C under the max. Found this thread, but with no solution: https://forum-en.msi.com/index.php?threads/rtx-2080-ti-ventus-fan-problems.331462/ Yes, my GPU is also vertically mounted, but the IO is pointed downwards, and many NZXT H1 users have used the Ventus with success. I got this card used, maybe I should change the thermal pads / thermal paste? Or swap the shroud for 2x Noctua 120mm slim fans? But the reported temps are fine, that is my main issue. OK, temporary fix that seems to have done it: I have to lower my power limit all the way down to 65%, and then overclock to get the lost performance back. If I go even to a 70% power limit it seems to hit some kind of temperature limit, even tho the hotspot is not actually hot.
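For context on what the Afterburner curve in the post is supposed to be doing: a custom fan curve is conceptually just piecewise-linear interpolation from a sensor temperature to a fan duty cycle, and firmware can still override it when some internal sensor trips a limit (which seems to be what's happening above). The curve points here are hypothetical, not MSI defaults:

```python
# Sketch of a custom GPU fan curve: map a temperature reading to a fan
# duty percentage by interpolating between user-set points.
# Curve points are made up for illustration.

CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Piecewise-linear fan duty for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(60))  # halfway between the 50C and 70C points -> 47.5
```

When the card jumps straight to 100% despite a curve like this, the duty cycle is being set by something other than the user curve, which is why the post chases a hidden thermal limit rather than the curve itself.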
  7. Most of these FPS comparisons are just fake number generators; a person with 11 subs doesn't have the money to buy every new GPU and CPU and compare them 24/7, so the vast majority of those are fake. The bottom review looks fine, but it is so old that it is outdated; Ryzen performance has gotten better, and especially in the early days Windows had problems with the dual-CCD design. The only way to know if you would get an FPS improvement is to swap the GPU into the other system, no easy way around that.
  8. You have been given the answer like 50 times, mate. You are heavily CPU bottlenecked. That 6700K is about on par with an R5 1600, which also heavily bottlenecks fast GPUs. The 3900X is much faster; in high-framerate gaming it should get you much higher FPS. But just test it yourself. In mostly GPU-bound loads like the Heaven benchmark there shouldn't be a difference, but in games where the CPU also has to do work, for example Far Cry 6, you might see 20%+ higher FPS with the 3900X. The GPU's power draw is not 100% even if GPU usage is, because not every single transistor is used in every single scenario. The simplest way to understand that: when you are playing a non-raytraced title, the ray tracing cores on your GPU obviously just sit and do nothing. If you are not watching a video while also streaming your game, the hardware video decoder and encoder sit and do nothing too. There are many more parts of the GPU that get used in some situations and not in others, so it is NORMAL.
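The bottleneck argument above boils down to a simple model: the frame rate you see is roughly capped by whichever of the CPU or GPU can handle fewer frames per second. The per-game caps below are invented for illustration, not benchmark data:

```python
# Toy bottleneck model: the slower component sets the pace.
# All FPS caps are hypothetical numbers, not measurements.

def effective_fps(cpu_fps_cap, gpu_fps_cap):
    """The observed frame rate is limited by the slower of the two."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps for a 6700K vs a 3900X driving the same fast GPU:
games = {
    # scenario:          (6700K cap, 3900X cap, GPU cap)
    "gpu_bound_bench":   (200, 240, 120),  # a Heaven-style benchmark
    "cpu_heavy_game":    (90, 130, 160),   # a Far Cry 6-style title
}

for name, (old_cpu, new_cpu, gpu) in games.items():
    before = effective_fps(old_cpu, gpu)
    after = effective_fps(new_cpu, gpu)
    print(f"{name}: {before} -> {after} fps")
# gpu_bound_bench: 120 -> 120 fps  (GPU was the limit, no change)
# cpu_heavy_game: 90 -> 130 fps    (CPU was the limit, big gain)
```

This is why the post says to test with both a GPU-bound benchmark and a CPU-heavy game: only the latter reveals the CPU upgrade.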
  9. Because unlike Nvidia, who cut the 4080 in half with a bandsaw and then called it the same exact model, the 7900XT is only about 10-15% slower than the XTX. It uses the same memory, GPU die, etc., so it is more clearly just a cut-down XTX, unlike the Nvidia 4080 12GB, which clearly used an entirely different GPU die and had 30% less performance. They could have done like they did last gen and named them 7900XT and 7950XT, which I think would have been the best course of action, but that doesn't leave open the flagship slot, where they want to come out with a 4090 competitor next year.
  10. Yes, an M.2 slot that has PCIe support, in other words, supports NVMe drives.
  11. Origami Cactus

    have you ever had your taste in food change as…

    You just need to add a pinch of salt to oatmeal, and cook it with milk instead of water, and it is absolutely fantastic, and nutritious. Even if you cook it with water, just a dash of salt makes all the difference between it tasting bland vs rich.
  12. OOHHH. Yeah, some motherboards disable 1 or 2 SATA ports when you connect an NVMe drive into a specific M.2 slot; it is usually marked in the manual. But glad to hear that you got it to work finally.
  13. It is an issue with the memory controller, and on AM4 the memory controller is in the CPU, not the motherboard. So yes, when you swap the CPU for a newer one, you get a better integrated memory controller (IMC) as well. Don't listen to the guy who says 2 different kits won't work together; they absolutely will, without issues. You can even mix and match memory speeds, it will just run everything at the lower speed. The issue shouldn't really be the sticks, even if they are bottom of the barrel (Corsair Vengeance). Ryzen prefers something with better memory chips, for example G.Skill, but we are only talking about 3200MHz here, so especially on a 5800X it will be totally fine.
  14. To add to that, first and second gen Ryzen processors really disliked having all 4 slots filled with high-speed RAM. While with 2 sticks it might run fine, 4 sticks put too much strain on the IMC, like you said. I also had to run my 3600MHz RAM at 2933MHz on a 1700X, because I decided to go 4x8GB instead of 2x16GB. They greatly improved the memory controller with 3rd gen; a 3600X for example should have no problems running 3200MHz with 4 sticks.
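The mixed-kit behavior described two posts up is simple to state: the memory controller runs every stick at one common speed, which is the slowest stick's rating (assuming the sticks train successfully at all; timings are ignored in this sketch):

```python
# Mixing RAM kits: all DIMMs end up running at one common speed,
# which is the lowest rated speed among them. Timings and training
# quirks are deliberately ignored in this sketch.

def effective_ram_speed(stick_speeds_mhz):
    """All DIMMs run at the lowest rated speed among them."""
    return min(stick_speeds_mhz)

# Hypothetical mix: an old 3200 MHz kit plus a newer 3600 MHz kit.
print(effective_ram_speed([3200, 3200, 3600, 3600]))  # -> 3200
```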
  15. Seems like you formatted a partition, not the whole drive. If you look in Disk Management, you'll see that when you install Windows it creates about 3-4 different partitions, and the boot ones especially are hard to get rid of. I would install Samsung's drive software and try secure erasing the drive; that should get rid of all the partitions that are hard to remove.
  16. Missed the 8 year part, my bad. I would recommend getting off this forum and going straight to Noctua's support; they will most likely send you new fans for free, even out of warranty, and you will get closure on what happened. My best guess would be that the oil lubricating the bearings finally ran out, so it was rubbing metal on metal, which is bad. But if it was a bearing problem, you should have heard it making noise before it burst into flames. I don't think it was the motor, as it should use a brushless motor.
  17. I played it about 10 years ago when it first came out, connected a mouse to my phone over a USB micro-B adaptor, was cool.
  18. Noctua industrial fans come in all sorts of voltages. It is possible you had 5V fans that you connected to a 12V hub, burning them out that way. It is impossible to tell if you don't give the exact info that is on the back of the fan and hub (model number, voltage, power draw, PWM/DC, etc.).
  19. Possible, but I would first make sure that the monitor is running as it should. Are you sure it is running at 180Hz? Go to the NVIDIA Control Panel and make sure it is. Also enable G-Sync Compatible (there are 2 checkboxes you need to enable) while you are there, to get a better experience.
  20. Exactly this. The DDR5 won't make the 12th gen i5 any faster, so it is a ton of money with ZERO performance benefit. A more useful upgrade, as you said, would be to get an M.2 adaptor, and if he wants more CPU performance, an i5 13600K should fit the current motherboard just fine.
  21. That is not a GPU thing, that is 100% a monitor thing. On the monitor itself, go to Settings -> Picture -> Game Adjust -> Response Time and set it to Fast. Should noticeably improve the ghosting.
  22. Origami Cactus

    Remember kids, always use protection. 😛 I just…

    Fixed the photo
  23. Origami Cactus

    I feel slightly vindicated. According to Steve…

    Technically the 7950X can be more efficient, as the underlying architecture is. So yeah, if you leave both processors at stock settings then the 5950X is more efficient, BUT if you lock the 7950X to the 5950X's power limit, it will be noticeably faster. That being said, the 5950X is a pretty nice CPU, so no reason to go out and buy the 7950X just because it is a bit more efficient.
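The power-matched comparison above is just performance-per-watt arithmetic. The scores and wattage here are invented to illustrate the point, not measured 5950X/7950X results:

```python
# Back-of-envelope efficiency: performance per watt at a matched power
# limit. All figures below are hypothetical, purely for illustration.

def perf_per_watt(score, watts):
    """Efficiency metric: benchmark score divided by package power."""
    return score / watts

# Hypothetical multi-core scores at the SAME package power limit:
old_gen = perf_per_watt(score=24000, watts=142)  # a 5950X-like part
new_gen = perf_per_watt(score=28000, watts=142)  # a power-limited 7950X-like part

# When power is matched, the newer architecture does more work per watt,
# which is exactly the "lock it to the same limit" argument above.
assert new_gen > old_gen
print(f"{old_gen:.0f} vs {new_gen:.0f} points per watt")
```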
  24. Yes, physics simulations especially are much faster on the 5800X3D because of the vcache. Look at F1 22 or Assetto Corsa Competizione benchmarks in the 5800X3D reviews to see how much faster they are. But that is because of the single-threaded nature of those simulations; I don't know how Blender smoke simulations work.
  25. You may underestimate just how fast the 5950X is in gaming too. It hits 5GHz, while the 5800X3D tops out around 4.5GHz, so it gains back some of the lost FPS that way. But there are absolutely games, for example sim games, where because the physics calculations are single threaded and the 5800X3D has 96MB of L3 cache accessible by that single thread, it will be much faster. BUT there are also games where the 5950X is faster thanks to the higher clock speed.