Everything posted by D2ultima

  1. Your card is probably dead, bro. At this point I'd take it to a shop that can electrically test it for faults instead of trying to blind-fire it.
  2. Oh hey unclewebb, how you doing? Yes, this person needs to not overvolt when they're already overheating.
  3. If you have the option of similar aggregate transfer (i.e. how much data the memory can move in a given amount of time) then take the faster speed. 3200MHz CL16 and 3600MHz CL18 work out to roughly the same real-world latency, sure, but the 3600MHz kit's cycles are shorter, so requests finish in less wall-clock time and peak bandwidth is higher. Because of this my advice has always been to pick the faster memory speed if the memory is otherwise the same.
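That trade-off is easy to sanity-check with the first-word-latency and bandwidth math; a quick sketch, using just the two kits from the post:

```python
# Compare first-word latency and peak bandwidth for the two kits above.
def first_word_latency_ns(mt_per_s: int, cas: int) -> float:
    # DDR transfers twice per clock, so the real clock is MT/s divided by 2.
    return cas / (mt_per_s / 2) * 1000

def peak_bandwidth_gbs(mt_per_s: int, bus_bytes: int = 8) -> float:
    # One 64-bit (8-byte) channel, one transfer per megatransfer.
    return mt_per_s * bus_bytes / 1000

for mt, cl in [(3200, 16), (3600, 18)]:
    print(f"{mt} MT/s CL{cl}: {first_word_latency_ns(mt, cl):.1f} ns, "
          f"{peak_bandwidth_gbs(mt):.1f} GB/s per channel")
# 3200 MT/s CL16: 10.0 ns, 25.6 GB/s per channel
# 3600 MT/s CL18: 10.0 ns, 28.8 GB/s per channel
```

Same latency, more bandwidth, which is the whole argument for taking the faster kit.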
  4. It could be, but if you're overheating that badly, you're overheating; you're not going to keep clocking up like that. Either way, you probably do not need MORE voltage, considering a 13900K out the box should be something like 5.4GHz on all cores, and you can more than likely get away with undervolting it. Try setting all the P-cores to 55x and a -100mV offset to both and see if that works. Worst case, you crash. Best case, your CPU works way better.
  5. Yes, that's a default power LIMIT, not an estimation. Here, let's see: we have a 13600K slightly OC'd to 5.2GHz playing Cyberpunk 2077 with an RTX 3080. Looks like 80W to me? The highest load I've seen in the game was after the Phantom Liberty patch, when I ran optimized settings and turned on RT Overdrive with Ray Reconstruction, but even that is only 114W. It's quite difficult to make a 13600K pull 180W out the box, I think. I've seen 144W under synthetic loads, though. In fact, since my 3080 cannot pull over 378W (except for a spike) because it's an FE card and a 115% power limit calculates to that, even if my CPU could draw 200W flat while gaming, that's still under 600W total, and a high quality 650W PSU would likely be enough, because board/SSDs/RAM/fans don't usually draw over 70W by themselves. Would I RECOMMEND someone buy a 650W PSU for this system? No, not really, but Nvidia's website says you want a 750W PSU for a 3080, and that's simply more than necessary in general.
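That napkin math, written out (these are the worst-case wattages quoted above, not measurements of any one build):

```python
# Worst-case power budget from the numbers in the post.
gpu_spike_w = 378   # RTX 3080 FE at its 115% power limit
cpu_worst_w = 200   # deliberately pessimistic gaming draw for the 13600K
rest_w = 70         # board + SSDs + RAM + fans, rough ceiling

total_w = gpu_spike_w + cpu_worst_w + rest_w
print(f"Worst case: {total_w} W on a 650 W PSU ({650 - total_w} W of headroom)")
# Worst case: 648 W on a 650 W PSU (2 W of headroom)
```

Tight on paper, but it never actually gets there, because the CPU averages 80-114W in practice.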
  6. "TDP" is "Thermal Design Power" and is a direct reference to how many watts of heat must be dissipated by the cooling. The TERM TDP itself is incorrect for what we usually use it for; TGP is the more correct term, but nobody recognizes it and it's rarely used outside of, like, internal engineer speak. Most 4070s have a power limit of 200-220W, give or take. If a CPU is drawing 200W and a GPU is drawing a maximum spike of 250W, that still leaves 150W for the rest of the system on a 600W PSU, which is far more than enough, assuming the PSU is good quality anyway. Most of the time, GPU vendors say you want/need much higher PSUs than are necessary because they want to:
     A - Make certain you have headroom
     B - Make allowances for bad PSUs that won't properly provide full rated power over long periods of time
     C - Help sell bigger PSUs, probably
     My old laptop has a 190W GTX 1080 in it and an i7-7700K (the desktop CPU). I can run that system, including the internal screen, speakers, keyboard, trackpad, four SSDs, wifi, bluetooth, etc, on a single 330W power brick. This means a desktop could run a GTX 1080 with a 190W power limit and a 7700K on a 330W PSU. A 4070 with a 200W power limit isn't much different; it'll also run on a competent 330W PSU in such a system. My 13600K pulls between 60W and about 90W while I'm gaming at 5.2GHz. If I had a 4070 in here drawing 200W, that would mean that on average I would be perfectly fine with a 350W PSU, maybe 450W if you wanted to be really safe. There is no harm in having a bigger PSU, and it helps when you want to upgrade your GPU in the future, but most of the "minimum PSU required" posts do not actually hold any water. MOST of them, anyway... 4090s and 13900Ks actually can suck absurd levels of power, and the suggested 850W+ PSUs are not a joke when your GPU is gaming at 450W and your CPU is sucking 250W in Cyberpunk, for a nice toasty 700W of power draw.
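If you want that reasoning as a reusable rule of thumb, here's a minimal sketch; the 50W "rest of system" and 25% headroom figures are my illustrative choices, not anything official:

```python
# Napkin PSU sizing from component power draws. The rest-of-system and
# headroom numbers are illustrative assumptions, not official guidance.
def suggested_psu_watts(gpu_w: float, cpu_w: float,
                        rest_w: float = 50, headroom: float = 0.25) -> float:
    peak = gpu_w + cpu_w + rest_w
    return peak * (1 + headroom)

# The post's example: a 200 W 4070 plus a 13600K gaming at ~90 W.
print(round(suggested_psu_watts(200, 90)))  # 425 -> about the "450W to be really safe" figure
```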
  7. People who have experienced both and settled on one usually have the best opinions. I'm not asking about a 10fps difference at 300fps; I'm asking for a notable, marked difference where not using an E-core system is actually significantly beneficial. Otherwise you're just splitting hairs, because in most cases people will already be at a GPU limit. Except it generally does, as long as you do anything other than purely game. I went from intel (non-E-cores) to intel (E-cores) and have access to an AMD machine too, as well as a low power laptop also with E-cores. I've used a fair number of systems and done a lot of optimizations in my time; my systems don't slow down, and I don't lose performance if I don't reboot for a month like many people do. I can positively say that it literally might as well be magic for system usage, especially multitasking, livestreaming and high CPU load tasks. Lack of stutter with videos playing on a second monitor while a game is up, or with the OBS preview window open, especially in borderless windowed mode (which quite a few newer games actually force, lacking an exclusive fullscreen mode), overall Windows smoothness, and things I didn't even REGISTER as stutter or problems before using these new systems make me recommend it EXTREMELY HIGHLY, and most of this is stuff that will never show up in benchmarks. Hell, I could even casually browse and use other programs while running Cinebench R23; I know for sure that such situations would normally cause a lot of lag in the past. If you don't notice any difference, or don't do anything where you'd notice a difference, fine. If you wanna say they're not perfect, fine, I'll even agree. But I felt you kinda overblew the issues (especially with game performance, because I have seen the SLIGHT improvements to fps that some games get by disabling E-cores, and it generally isn't worth the loss in functionality), when too much good is gained that AMD can't match for someone to call it an equal stage. To me it's a lot more like:
     7800X3D = more fps in most games if you have like a 4090
     Intel = *begins listing benefits and ends 10 lines downrange*
     No brainer to me.
  8. I'd really love to see this plethora of games where having E-cores on is a large detriment to the experience, since all I can think of off the top of my head where it makes a marked difference is CS2 (which Valve is fixing already) and your aforementioned Assassin's Creed, where... it's a ubishit game, I don't expect much else. Not just my friends; how about the tons of people on this post who keep talking about issues they had on the AMD platform that just weren't present on intel? Yes, because they either:
     1 - Close everything and game, and just want the most possible CPU-limited FPS, and this is all they realistically care about for a PC
     2 - Don't know what they're missing
     Just because something is popular doesn't mean it doesn't have downsides; you act like AMD is flawless and intel just has issues. That's not the case, or even close to it. I never denied the 7800X3D usually gets the most fps in games; I simply said the rest of the benefits from E-core tech outweigh the slight reduction in max FPS you'll get in some titles. And for that simplified reason, my judgement is intel is better until AMD makes similar tech, and probably raises RAM speed support for DDR5.
  9. If you really want DLSS, DLAA, Frame Generation, NVENC, AV1 encoding, nice RT performance? I would understand your choice, but... it's still a hefty price to pay for not much raw rasterization benefit. I would say you'd need to move to the 4080 or 4090 to see a real difference, but that is probably too much of an expense? I assume a high refresh rate? What stutter and low FPS are you referring to? What does the rest of your system look like? As for whether the 6900XT is enough for now, I'd probably look at a 5000 series card, which will probably be out next year if anything (my guess).
  10. This is why I suggested the Ventus 2X OC, actually. The cooling on the triple fan cards isn't necessary, especially for the 4070. I would cut the water cooling for the CPU and get one of the following; they should be very cheap:
     Thermalright Frost Commander 140
     Thermalright Phantom Spirit
     Thermalright Peerless Assassin
     Deepcool AK620
     The Frost Commander stands out a little in performance, but otherwise these all perform fairly similarly, and if you see one much cheaper than the rest, take that one. They will save you quite a bit. You should be able to find 6000MHz RAM for a similar price or even cheaper (especially if going intel). I see your point about the motherboard for intel; I was going to suggest the MSI Z790 Pro A WIFI, but it will probably still be expensive over there, which you don't want. In your case, I suppose, aiming for a 1500 euro limit or thereabouts might just be problematic. Is the case a hard choice? Is mITX necessary? Cheaper cases and generally comparable mobos are around if it isn't. I can vouch for a Fractal Design Pop Air as I'm using it right now and it's been quite good. Whether he keeps that CPU or gets a 13600K, 600W is more than enough for those chips plus a 4070, which will only draw roughly 200W. That's the best fiddling advice I could give, whether he/you/both choose to stick with AMD or not; at least the CPU cooler choice will be good. I moved from intel to intel. I have access to AMD. I have access to another intel with E-cores. The experience of using a system with E-cores is pretty much incomparable if it's on W11, IMO. As for games that have issues, the vast majority of them do not gain massive amounts of fps by disabling E-cores, and the overall benefits of E-cores are better. Plus, all of my friends who are enthusiasts and have used an AMD system for quite a while have complained about random stutters, so I'mma leave that as a knock too. You say it's a double-edged sword, but to me if anything it's like a greatsword for attacking and a butter knife for hitting oneself.
  11. Have you tried putting it in a different slot on your motherboard, if one exists?
  12. Power spikes 100% exist, but I wouldn't worry about them unless you're very close to maximum PSU power and likely to trip it. The more correct term would be TGP, for Total Graphics Power, but it's not recognizable to enough people so they just call it TDP... however, CPU TDP is and has been the Power Limit 1 (PL1) value for all intel CPUs since as far back as their 2nd gen Core series (I am unsure about 1st gen and prior, and it's too old to matter right now). I understand why people say "TDP" is meant for the coolers, and to a degree coolers do speak in TDP, but considering their performance varies based on contact size per chip, with some generations being very easy to cool and others not for various reasons, you can just ignore it. It will be a lot less complicated if you ignore it, and also if you realize that TDP is useless for non-laptops, because almost everyone buys unlocked CPUs and boards that let you raise PL1 to astronomical levels anyway. It absolutely can, and likely without much difficulty. Overclocking Nvidia cards since Pascal has generally let you raise performance at similar power draw, because the voltage associated with the clockspeed (and the memory clocks/type) are what mostly determine performance. If they ACTUALLY just overclocked it (i.e. raised the stock clocks in a similar manner to what you would achieve with overclocking software like Afterburner) then the power draw would barely change. Gigabyte has been having problems with their GPUs' PCBs cracking. I do not remember if it only affected the 4090s, but I would 100% avoid them. Get an MSI instead; they have good cards this generation, and a 4070 doesn't use that much power; the Ventus 2X OC should be great and is $550 USD according to PCPP, so should also be rather cheap. As for the R5 7600, it's a fine gaming chip, but for overall system usage (especially if you multitask like watching videos or streams while you play, or stream yourself, etc) I would suggest grabbing a 13600K or something instead. I know a lot of people like to recommend AMD, but those E-cores make a system so smooth and lovely to use it's just magical in my experience, especially if you use Windows 11 with it. If you still want the AMD, by all means go for it, but my vote goes to intel until AMD gets similar tech to E-cores.
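On the PL1 point: if you're on Linux with the intel_rapl powercap driver, you can read the limits your board actually enforces; a minimal sketch (sysfs paths can vary per system, and reading them may need root):

```python
# Read the enforced package power limits via Linux's powercap interface.
# Assumes the intel_rapl driver is loaded; on typical systems constraint 0
# is the long-term limit (PL1) and constraint 1 the short-term one (PL2).
from pathlib import Path

rapl = Path("/sys/class/powercap/intel-rapl:0")  # package-0 domain
for n, label in [(0, "PL1 (long term)"), (1, "PL2 (short term)")]:
    uw = int((rapl / f"constraint_{n}_power_limit_uw").read_text())
    print(f"{label}: {uw / 1_000_000:.0f} W")
```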
  13. Overclocking the CPU will help some, but in general your CPU is just kind of weak for what this game desires. As mentioned above, 5800X3D is your best option without a new board/RAM replacement set.
  14. Right, so you have primary storage listed... which implies secondary storage exists. Are those games you mentioned installed on an HDD? Did you disable or delete your paging file, or put it on an HDD?
  15. ASUS 4090s have a seriously high rate of very loud/annoying coil whine, so I would not suggest them this generation. The MSI Suprim X (not the Liquid X) is excellent and probably the best 4090 (so I've heard), but it doesn't make enough difference to pay a big amount of money for. Only buy the Liquid X if you want your card to be both cool and relatively quiet, as almost all the coolers that come with 4090s, even the Founders Edition, are heavily overbuilt and will keep the cards more than cool enough under most any realistic load you could throw at them. Also avoid Gigabyte cards, as they have a running track record of cracking the PCB (the board the card is physically built on), with Gigabyte refusing to fix/replace under warranty for people.
  16. Modern Nvidia GPUs can spike above their power limits more commonly than you'd think, and the spikes are usually way higher than this. If there WERE a problem, your card would probably crash from lack of power, which it won't do from anywhere within perhaps 50W of its default power limit. It will not damage the card in any way. Finally, don't worry about it. It's normal and fine.
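If you want to watch your own card do it, a rough sketch using the pynvml bindings (pip install nvidia-ml-py). Software polling undersamples millisecond-scale transients, so treat the sampled peak as a floor rather than the true spike:

```python
# Poll GPU power draw against its enforced limit for ~10 seconds.
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # API reports mW

peak_w = 0.0
for _ in range(1000):  # ~10 ms per sample
    peak_w = max(peak_w, pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000)
    time.sleep(0.01)

print(f"Enforced limit: {limit_w:.0f} W, highest sampled draw: {peak_w:.0f} W")
pynvml.nvmlShutdown()
```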
  17. From my old memory (and I do mean very old, circa 2009), an MXM 3.0 card won't work in an MXM 2.0 socket. Also, the system BIOS likely does have a whitelist of some kind, as mentioned above. I would not put any real stock or faith into getting that machine running, honestly.
  18. Mmm, my opinion in general is that a 4070 is the card you want, tbh. The 7700XT is not exactly a bad card and it is worlds ahead of a 4060Ti, and it's actually surprisingly cheap on PCPartPicker at $430 for the Sapphire Pulse variant, but I also value DLSS, NVENC and the ease of using RT a lot, so my thoughts gravitate toward a 4070, which you can find for $520 on PCPartPicker (plus it's quite a bit stronger without RT). In the end it's up to you, but my true suggestion is getting a decent GPU right now and upgrading the rest of the system later. You will thank yourself for it, especially since you have a 240Hz monitor. As for how well a 7700XT handles games, HERE is an excellent video, and HERE is a backup. The 6800XT isn't an option we're recommending to you right now, though; it's just there so you can see the 7700XT being tested in more games. You can translate the 1440p tests to 1080p by ROUGHLY adding 30% more frames, give or take (e.g. ~100fps at 1440p suggests somewhere around 130fps at 1080p). It's not perfect, but if you can't find any 1080p testing and you can find 1440p, go with that.
  19. I also second the idea of ignoring Passmark, since they aggregate EVERYONE who runs the benches. This means the overall score can include people who dunno what they're doing, dragging the average down. The more popular a card, the higher the likelihood of this happening, too.
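A toy illustration of why that skews things (the numbers are made up): a handful of misconfigured systems pulls the mean well below what a healthy card actually scores, while the median barely notices:

```python
# Made-up scores: 90 healthy systems, 10 badly misconfigured ones.
from statistics import mean, median

scores = [100] * 90 + [40] * 10  # e.g. stuck power limits, single-channel RAM

print(f"mean: {mean(scores):.0f}, median: {median(scores):.0f}")
# mean: 94, median: 100
```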
  20. See, in your case I would just outright consider selling your existing system wholesale, then. A 14600K or 14700K will probably be out by the time you're ready to upgrade, and a decent board wouldn't be too expensive. E-cores help LOADS when you are doing something intensive yet still using the PC otherwise, so it'll help you with your editing (I'm quoting a friend verbatim here who uses those kinds of programs, as well as Maya and the like; I haven't used those programs recently). New DDR5 RAM is very cheap too; I saw 32GB of 6400MHz CL32 go for $117 the other day. In short, I think you probably would just benefit a lot more from a system overhaul even if it takes you a while longer; maybe buy the GPU first as it will directly benefit you, then change out the whole system later? You could sell your old system, keep it as backup hardware, or simply use it as a local server for hosting files.
  21. You're going to need to give more information than that. By a lot. An 80W 3070 versus a 110W 4060 might perform much closer than you would imagine. In GENERAL I always suggest the newest Nvidia GPU one can afford, because the tech simply lasts longer that way even if the performance is the same, due to backend improvements that won't show up so quickly. A solid example: AMD says that while FSR 3's frame generation mode will function on the 1000 series, it's not going to be great, because it was designed largely as an async compute task, which the 1000 series wasn't that good at compared to the 2000 series onward. Or how the 4000 series can use AV1 encoding, which wasn't even a heavily touted launch feature for the cards. But in the world of laptops, a properly powered 3070 and a properly powered 4070 are going to perform roughly the same, except for raytracing, video encoding, etc, which will simply be better on the 4070. You're looking at a 4060, though, which is probably a fair dip in performance. Anyway, give us the laptops in question, the budgets you have, the intended use cases, country of origin, etc.
  22. If you can get a 4070, it should still be fine. They're 200W max, and your CPU isn't likely to pull over 100W on its own either; an extra 200W for the rest of your system is fine. Even if you account for larger power spikes or anything, 100W for the rest of the system is generally fine. I was going to advise against the 4060Ti until I saw it was $370 on PCPP, which is a fair shave under the cheapest 4070 I found at $510. I thought the 4060Ti used to be $450, but I guess prices dropped around the lower mid range this time. But I would still suggest at least a 4070 for you if you are capable of buying it. There are QUITE a few games out there that would benefit even at 1080p, and you also get the ability to use DLAA in some games for SUPERB anti-aliasing if you don't need DLSS for upscaling.
  23. I would not bother with SLIing them. I would probably pick the Gigabyte, since the 1000 series had decent Gigabyte cards if I'm remembering correctly. I would probably try selling both 1070s and getting something newer if at all possible; you will SIGNIFICANTLY benefit from this.
  24. While I'd use the 3070, if you're not keeping both PCs in use (or if the second PC is going to be used as a server and has an iGPU), I'd consider selling both GPUs and buying a 4070, a 7800XT or a 4070Ti (if you can afford it) instead. The extra vRAM will be helpful, and access to frame gen and AV1 encoding on the Nvidia cards will go a long way. The AMD card is just great rasterization for its cost, though since I only recommend Sapphire for AMD cards, cheaper 4070s like MSI's Ventus 2X OC can end up costing less in the USA, so your mileage may vary.