
hollyh88

Member
  • Posts: 906

Everything posted by hollyh88

  1. Pretty sure Diablo does not like 8GB VRAM cards at 4K, so that may be your issue. Try 1440p; if it still causes issues, try 1080p.
  2. I'm fairly certain your board does not support it. I would suggest sticking with a 5600 for now until you have the budget to upgrade again.
  3. It's an alright budget board. Generally you would want 8 cores and 16 threads for streaming in my opinion, but at that price it's going to get a bit difficult. You can get a 5600, but that only leaves you with about 310 euros, depending on where you live of course. If you can get a 1080 Ti for that price, do it, unless you find a decent 6700 around that price, in which case take that instead in my opinion.
  4. No, it would hurt the performance of the 6700 XT. What is your motherboard and budget?
  5. That's a better card for gaming performance than the others. Streaming is a bit worse, AMD's H.264 encoder vs Nvidia's NVENC, but it's not much worse, and the VRAM makes up for it. However, I would say it would be better to get a better CPU than the 3600 if you want to stream, but that's a personal opinion.
  6. The RTX 2000 series supports every DLSS feature besides frame gen though. The 2070 will be better for streaming. The 1080 Ti has better performance than the 2070 (the 2070 Super is on par with or beats the 1080 Ti), but VRAM is going to be more important, maybe not at 1080p, but at 1440p it will be. Personally I would suggest going with the 2070 if you play at 1080p, as DLSS is nice to have. If you play at 1440p, definitely go with the 1080 Ti. With the 1080 Ti you still get FSR, which isn't as good quality as DLSS but is alright when it's FSR 2. This all depends on pricing though, as you didn't list that.
  7. Oh yes, I have heard of the same issues for that game. You could try things like using vsync to limit the framerate and see if that helps.
  8. Yes, Gigabyte's fans have some issues. I had the 2070 Super Windforce 3X OC or so, same issue: if the GPU temp, hotspot temp, or memory temp were a bit too high, it went crazy. Did the even lower power limit work? Or the curve undervolt?
  9. The problem is probably your hotspot. It happened with my 2070 Super too until I decreased the power limit. If 90% didn't do it, you have to try it lower; mine, now in my partner's rig, is around 85%. Another method I would suggest is an undervolt curve, or redoing the thermal paste, but whether you should/have to do that depends on the card's age (from when you got it new). Edit: I just noticed (sorry, sitting outside) that your limit is 83 degrees. If it hits that, it will try to get below it. My previous points still stand though: your GPU is getting too hot and you have to cool it down, then this problem should go away. Try it via the undervolt curve or the power slider (and just drag the temp limit to the right). There's a small monitoring sketch after this list if you want to watch temps and power draw while you test.
  10. Can you show all the GPU statistics in HWiNFO while you are playing a game?
  11. Could be the hotspot getting too hot and the GPU fans kicking in to combat it. Try lowering your power limit to 90% and see if it still happens. If yes, drop it lower.
  12. I hope it will be as good as what was shown, and I think it will be. The delays were needed, and there is a drastic difference in fidelity and gameplay between the last showcase and now. Will it have bugs? Probably, it's a big game. But as long as they are the funny Bethesda bugs and not game-breaking bugs, it ain't that bad. I just hope it will be 60+ fps on PC and not capped to 60. That's my main "worry": that they learned and don't tie the in-game physics to the framerate.
  13. I think the difference between Cyberpunk and Starfield is that Starfield was delayed massively, and you can see the massive improvements over the last showcase. And they showed a lot. Cyberpunk didn't and was very secretive; Starfield is not so much.
  14. Thank you for your response. I didn't know it did that; maybe I skipped over the message when I switched from a 2600X to a 3800XT. I guess I'll load defaults just in case and then swap them out. Pretty sure my Arctic Freezer II will be good enough with MX-6 paste. Yeah, I was debating between a 5800X and a 5800X3D, which costs about 100 euros more, but since it will be the last upgrade I will ever do on this mobo, I thought I may as well go the full way, as it's either equal or way better. And the added bonus for Stellaris sounded really nice to me.
  15. So, I decided to upgrade my CPU from a 3800XT to a 5800X3D so I would be golden for a while with the newest GPUs that will come, and with my 6950 XT, as some new games really struggle on this CPU. That's weird to me, but some games just have bad CPU optimisation, for example Jedi: Survivor, where at 1440p in some sections my GPU is doing nothing and I'm hovering at around 50 fps, and that's with FSR set to quality, or off, or even set to high; it doesn't matter, my GPU simply isn't getting enough data from my CPU. That was a clear indication I had to do this upgrade. Also for Stellaris, as I heard this CPU has a massive uplift in that game. So anyway, here is my question, as there is one thing I'm not quite sure about. Since the X3D chips are so picky when it comes to voltages and such, is it important for me to reset the BIOS to defaults (in the BIOS) before I turn it off and swap CPUs, given that some settings are obviously set specifically for my 3800XT? I already have the latest BIOS version installed, and I'm pretty sure nothing in my build would be any issue for a 5800X3D (see my PC build under this message). So would simply loading BIOS defaults be enough, or is it better to clear the CMOS? Thanks in advance!
  16. Even RT isn't that bad; I'm seeing roughly 50-60 fps at 1440p in Cyberpunk with everything on ultra and FSR set to quality, and I'm running a somewhat bad CPU to have paired with it. It's still playable, pretty good even.
  17. I have had this happen from time to time, but only when I have shut down the PC and started it again the next day, and it doesn't always happen. Fixing it is just four mouse clicks and a few seconds for me if you saved your OC profile, so I'd hardly call that a pain. That's the only thing I have had so far.
  18. So ever since going to Windows 11 I have this weird bug where my CPU usage isn't being reported correctly. I see the individual core usages at, for example, 40% or so, but the overall will still say 1%. A bit annoying, of course, when using an overlay from AMD or the like. Is this widespread / is there a fix? (See the cross-check sketch after this list.)
  19. If you tried everything, then it purely comes down to games simply not having proper support for much newer hardware. I've had similar issues on older Nvidia GPUs. This is the first "high end" AMD GPU I have owned, and so far no real issues. So while I get that you want to rant, I just don't see the actual valid points.
  20. My 6950 XT isn't exactly old now. The 2070 Super I had, yeah, but the older the card, the better it's supported, especially with older games. Jedi: Fallen Order boots just fine on my system; the performance is garbage, but that's poor CPU optimisation. So I really think it's either a game issue or a your-PC issue, which isn't nice to hear of course, but it sounds like you have many driver issues that the majority simply doesn't have. So, is your Windows up to date? Your driver up to date? Did you properly uninstall the old Nvidia drivers with DDU BEFORE installing the new AMD drivers? Because something is going on. DDU can miss some driver-related things from time to time, so you may have to do it again.
  21. As a streamer you probably want Nvidia, yes; there are ways with AMD, but Nvidia is simpler. A 3060 simply isn't worth it. I hate to say it, but if you only want to spend within that range, the 4060 Ti (WITH 16GB!) might be worth it, as it will come out for 500 bucks, but... that is a lot of money you could totally spend on something else instead. However, just to tell you, I don't know how long the deal lasts, but I do see a 6800 XT for 510 bucks on Newegg, and with the H.265 encoder it really isn't that bad. But if you must buy Nvidia, then there really isn't anything great that's new, and the 4060 Ti just sucks compared to the 6800 XT in performance and value.
  22. I've had zero issues on my 6950 XT, which I've had new for about a month now. I've even upgraded to Windows 11 and a new BIOS for my B450 to enable SAM, and still zero driver issues, even with older games. So it must simply be older titles being older titles: they crash a lot, they don't play well at all, it can happen. I've had the same issues before on my old 2070 Super, so it really isn't an AMD thing. Older games are much like Russian roulette; you don't know when a game will be the actual bullet that "kills" your PC. As for Need for Speed Rivals, I remember seeing people having VRAM crashes with it on numerous cards that have plenty of it. Not sure about fixes though; you might try running it in compatibility modes.
  23. Yes, 12GB is more than enough, but you need to remind yourself to also ask: is this all I will play and continue to play? If the answer is yes, then go for the 4070. If the answer is no, I'll play more games (especially if they are triple-A), you may want to consider going for a 6800 XT. The 5800X3D shouldn't really take more than 120W on all-core loads, and closer to 50W when gaming. The 6800 XT is a bit more, around 200-280W, which is more than a 4070, but not awful. A 650W PSU should be just fine to run it, as I highly doubt the rest of your system draws another 100-250W (see the rough power-budget sketch after this list). But if you want to be extra safe, yeah, get the 4070, though the VRAM "issue" will remain, especially as you play at 1440p; I already see some games reaching 13GB at 1440p, which wouldn't be fine and will create some issues for you. This will only continue to happen more and more, so be warned.
  24. I have found this 6800 for around 500 euros. I don't know if it's a proper website though, as I'm not Greek. https://www.jmctech.gr/product/vga-sapphire-pulse-radeon-rx-6800-16gb-gaming-gddr6-oc-uefi/?skr_prm=WyIyYTk5MmM0ZS00NGI4LTRmNTItYmRmZC1mYTNiODYzM2E0NzIiLDE2ODQ1NjQwNTQ2OTkseyJhcHBfdHlwZSI6IndlYiIsImNwIjoiYiIsInRhZ3MiOiIifV0 I found it via this website https://www.skroutz.gr/s/26321297/Sapphire-Radeon-RX-6800-16GB-GDDR6-Pulse-Κάρτα-Γραφικών-11305-02-20G.html, which lists several shops selling it below. On that website I also see pretty high-end 6750 XTs going for below 460. So I would suggest checking it out.
  25. That's not true at all. If it comes close to a 3070, or sits between a 3070 and a 3070 Ti, how on earth does that make it a 1080p card? The 3070/3070 Ti was a 1440p card. It's just Nvidia doing Nvidia things, because the only limitation will be the 8GB version. Unless you get the 16GB, but for 500 dollars new (US) you will be able to get so much better: if I look on Micro Center I see several 6800s below 500 and 6800 XTs around 500.
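A minimal monitoring sketch for the power-limit testing mentioned in posts 9 and 11, assuming an Nvidia card with nvidia-smi available on the PATH; the one-second polling interval and the exact fields queried are illustrative choices, not anything the posts specify.

```python
# Poll GPU temperature and power draw via nvidia-smi while testing a lower power limit.
# Assumes an Nvidia GPU and nvidia-smi on the PATH.
import subprocess
import time

QUERY = "temperature.gpu,power.draw,power.limit"

def read_gpu_stats():
    # Ask nvidia-smi for the current temperature, power draw, and power limit as plain CSV.
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    temp_c, draw_w, limit_w = (float(v) for v in out.split(","))
    return temp_c, draw_w, limit_w

if __name__ == "__main__":
    # Log once per second; watch whether the temperature stays clear of the 83 C limit
    # after you drop the power slider or apply an undervolt curve.
    while True:
        temp_c, draw_w, limit_w = read_gpu_stats()
        print(f"temp={temp_c:.0f}C  draw={draw_w:.0f}W  limit={limit_w:.0f}W")
        time.sleep(1)
```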
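For the Windows 11 usage-reporting oddity in post 18, a small cross-check sketch, assuming the third-party psutil package is installed (psutil is an assumption here, not something the post mentions): it compares the average of the per-core readings with the overall figure, which should roughly agree when reporting is sane.

```python
# Compare per-core CPU usage with the overall figure to see whether the
# "cores busy but overall says 1%" mismatch reproduces outside the AMD overlay.
# Assumes: pip install psutil
import psutil

# Sample over one second each; percpu=True returns one percentage per logical core.
per_core = psutil.cpu_percent(interval=1, percpu=True)
overall = psutil.cpu_percent(interval=1)

print("per-core:", per_core)
print("average of per-core readings:", sum(per_core) / len(per_core))
print("overall reading:", overall)
```

The two calls sample different one-second windows, so the numbers won't match exactly, but they should be in the same ballpark rather than 40% vs 1%.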
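And for the PSU question in post 23, a rough power-budget sketch using the ballpark figures quoted in that post; the "rest of system" number is a placeholder to replace with your own estimate.

```python
# Rough PSU headroom check using the ballpark figures from post 23.
PSU_WATTS = 650

cpu_peak = 120          # 5800X3D all-core load per the post (gaming is closer to ~50 W)
gpu_peak = 280          # upper end of the 200-280 W range quoted for a 6800 XT
rest_of_system = 75     # placeholder for board, RAM, drives, fans; adjust for your build

total = cpu_peak + gpu_peak + rest_of_system
headroom = PSU_WATTS - total

print(f"estimated peak draw: {total} W")
print(f"headroom on a {PSU_WATTS} W PSU: {headroom} W")
```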