
D2ultima

Member
  • Posts

    4,396
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About D2ultima

  • Birthday Nov 06, 1989

Contact Methods

  • Steam
    d2ultima
  • Twitch.tv
    d2ultima
  • Twitter
    D2ultima

Profile Information

  • Gender
    Male
  • Location
    Trinidad and Tobago
  • Interests
    Gaming, PCs, anime, reading, livestreaming, food
  • Biography
    Just a guy who loves tech in a country that's technologically stagnant.
  • Occupation
    Currently NEET
  • Member title
    Livestreaming Master

System

  • CPU
    i5-13600k
  • Motherboard
    ASUS Strix Z690-F Gaming Wifi
  • RAM
    2 x 16GB DDR5 6000MHz cl38
  • GPU
    RTX 3080 FE
  • Case
    Fractal Design Pop Air
  • Storage
WD Black SN850X 2TB, WD Black SN770 1TB
  • PSU
    Seasonic PRIME 1000W 80+ Gold
  • Display(s)
    Acer Nitro XV272U Vbmiiprx
  • Cooling
    Deepcool AK620
  • Keyboard
    Steelseries Apex 7
  • Mouse
    Logitech G900
  • Sound
    Audeze Mobius
  • Operating System
    Windows 11 Pro
  • Laptop
    Clevo P870DM3
  • Phone
    OnePlus 6T
  • PCPartPicker URL

Recent Profile Visitors

8,329 profile views
  1. Your card is probably dead, bro. At this point I'd take it to a shop that can electrically test it for faults instead of trying to blind-fire fixes at it.
  2. Oh hey unclewebb, how are you doing? Yes, this person needs to stop overvolting a chip that's already overheating.
  3. If the aggregate transfer rate is similar (i.e., how much data moves per second), take the faster speed. 3200MHz CL16 and 3600MHz CL18 perform roughly the same number of operations per second, sure, but the 3600MHz memory has a chance to complete requests in one of its (shorter) cycles where the 3200MHz kit would need two. Because of this, my advice has always been to pick the faster memory speed when the kits are otherwise the same.
  4. It could be, but if you're overheating that badly, overheating is the problem: you're not going to keep clocking up while the chip throttles. Either way, you probably do not need MORE voltage. A 13900K out of the box should run something like 5.4GHz on all cores, and you can more than likely get away with undervolting it. Try setting all the P-cores to 55x with a -100mV offset on both and see if that works. Worst case, you crash; best case, your CPU works much better.
  5. Yes, that's a default power LIMIT, not an estimation. Here, let's see: a 13600K slightly OC'd to 5.2GHz playing Cyberpunk 2077 alongside an RTX 3080 looks like 80W to me. The highest load I've seen in the game was after the Phantom Liberty patch, when I ran optimized settings and turned on RTO with Ray Reconstruction, and even that was only 114W. It's quite difficult to make a 13600K pull 180W out of the box, I think; I have seen 144W under synthetic loads, though. In fact, since my 3080 cannot pull over 378W (except for a spike), because it's an FE card and its 115% power limit works out to that, even if my CPU could draw a flat 200W while gaming, that's still under 600W combined, and a high-quality 650W PSU would likely be enough, because the board/SSDs/RAM/fans don't usually draw over 70W by themselves. Would I RECOMMEND someone buy a 650W PSU for this system? No, not really, but Nvidia's website says you want a 750W PSU for a 3080, and that's simply more than necessary in general.
  6. "TDP" is "Thermal Design Power" and is strictly a reference to how many watts of heat must be dissipated by the cooling. The TERM itself is incorrect for what we usually use it for; TGP is the more correct term, but nobody recognizes it and it's rarely used outside of internal engineering speak. Most 4070s have a power limit of 200-220W, give or take. If a CPU is drawing 200W and a GPU spikes to a maximum of 250W, that still leaves 150W for the rest of the system on a 600W PSU, which is far more than enough, assuming the PSU is good quality anyway. Most of the time, GPU vendors say you want/need much bigger PSUs than necessary because they want to: (A) make certain you have headroom, (B) make allowances for bad PSUs that won't properly deliver their full rated power over long periods, and (C) probably help sell bigger PSUs. My old laptop has a 190W GTX 1080 in it and an i7-7700K (the desktop CPU). I can run that system, including the internal screen, speakers, keyboard, trackpad, four SSDs, wifi, bluetooth, etc., on a single 330W power brick. That means a desktop could run a GTX 1080 with a 190W power limit and a 7700K on a 330W PSU; a 4070 with a 200W power limit isn't much different and would also run on a competent 330W unit in such a system. My 13600K pulls between 60W and about 90W while I'm gaming at 5.2GHz. If I had a 4070 in here drawing 200W, on average I would be perfectly fine with a 350W PSU, maybe 450W if you wanted to be really safe. There is no harm in having a bigger PSU, and it helps when you want to upgrade your GPU in the future, but most of the "minimum PSU required" posts do not actually hold water. MOST of them, anyway... 4090s and 13900Ks actually can suck absurd levels of power, and the suggested 850W+ PSUs are not a joke when your GPU is gaming at 450W and your CPU is pulling 250W in Cyberpunk, for a nice toasty 700W of power draw.
  7. People who have experienced both and settled on one usually have the best opinions. I'm not asking about a 10fps difference at 300fps; I'm asking for a notable, marked difference where not using an E-core system is actually significantly beneficial. Otherwise you're just splitting hairs, because in most cases people will already be at a GPU limit. Except it generally does matter, as long as you do anything other than purely game. I went from Intel (non-E-cores) to Intel (E-cores) and have access to an AMD machine too, as well as a low-power laptop also with E-cores. I've used a fair number of systems and done a lot of optimization in my time; my systems don't slow down, and I don't lose performance if I don't reboot for a month like many people do. I can positively say that E-cores might as well be magic for system usage, especially multitasking, livestreaming, and high-CPU-load tasks. Lack of stutter with videos playing on a second monitor while a game is up or the OBS preview window is open, especially in borderless windowed mode (which quite a few newer games force, lacking an exclusive fullscreen mode), overall Windows smoothness, and things I didn't even REGISTER as stutter before using these new systems make me recommend it EXTREMELY HIGHLY, and most of this is stuff that will never show up in benchmarks. Hell, I could even casually browse and use other programs while running Cinebench R23; I know for sure such situations would have caused a lot of lag in the past. If you don't notice any difference, or don't do anything where you'd notice a difference, fine. If you wanna say they're not perfect, fine, I'll even agree. But I feel you kinda overblew the issues (especially around game performance, because I have seen the SLIGHT fps improvements some games get, and it generally isn't worth the loss in functionality), when too much good is gained that AMD can't match to call it an even matchup.
To me it's a lot more like: 7800X3D = more fps in most games if you have something like a 4090; Intel = *begins listing benefits and ends ten lines downrange*. No-brainer to me.
  8. I'd really love to see this plethora of games where having E-cores on is a large detriment to the experience, since all I can think of off the top of my head where it makes a marked difference is CS2 (which Valve is already fixing) and your aforementioned Assassin's Creed, where... it's a ubishit game, I don't expect much else. Not just my friends; how about the tons of people in this thread who keep talking about issues they had on the AMD platform that just weren't present on Intel? Yes, because they either (1) close everything, game, and just want the most CPU-limited FPS possible, which is all they realistically care about in a PC, or (2) don't know what they're missing. Just because something is popular doesn't mean it doesn't have downsides; you act like AMD is flawless and only Intel has issues. That's not the case, or even close to it. I never denied the 7800X3D usually gives the most fps in games; I simply said the rest of the benefits from E-core tech outweigh the slight reduction in max FPS you'll get in some titles. For that simplified reason, my judgement is that Intel is better until AMD ships similar tech, and probably raises its supported DDR5 RAM speeds.
  9. If you really want DLSS, DLAA, Frame Generation, NVENC, AV1 encoding, and nice RT performance, I would understand your choice, but... it's still a hefty price to pay for not much raw rasterization benefit. I would say you'd need to move to a 4080 or 4090 to see a real difference, but that is probably too much of an expense? I assume a high refresh rate? What stutter and low FPS are you referring to, and what does the rest of your system look like? As for whether the 6900XT is enough for now, I'd probably look at a 5000-series card, which will probably be out next year if anything (my guess).
  10. This is why I suggested the Ventus 2X OC, actually; the cooling on the triple-fan cards isn't necessary, especially for the 4070. I would cut the water cooling for the CPU and get one of the following, which should all be very cheap: Thermalright Frost Commander 140, Thermalright Phantom Spirit, Thermalright Peerless Assassin, or Deepcool AK620. The Frost Commander stands out a little in performance, but otherwise they all perform fairly similarly, so if you see one much cheaper than the rest, take it; it will save you quite a bit. You should be able to find 6000MHz RAM for a similar price or even cheaper (especially if going Intel). I see your point about the motherboard for Intel; I was going to suggest the MSI Z790 Pro A WIFI, but it will probably still be expensive over there, which you don't want. In your case I suppose aiming for a 1500-euro limit or thereabouts might just be problematic. Is the case a hard choice? Is mITX necessary? Cheaper cases and generally comparable mobos are available if not. I can vouch for the Fractal Design Pop Air, as I'm using it right now and it's been quite good. Whether he keeps that CPU or gets a 13600K, 600W is more than enough for those chips plus a 4070, which will only draw roughly 200W. That's the best fiddling advice I can give, whether he/you/both choose to stick with AMD or not; at least the CPU cooler choice will be good. I moved from Intel to Intel, I have access to AMD, and I have access to another Intel with E-cores. The experience of using a system with E-cores is pretty much incomparable on W11, IMO. As for games that have issues, the vast majority of them do not gain massive amounts of fps by disabling E-cores, and the overall benefits of E-cores are better. Plus, all of my friends who are enthusiasts and have used an AMD system for quite a while have complained about random stutters, so I'mma leave that as a knock too.
You say it's a double-edged sword, but to me it's more like a greatsword for attacking and a butter knife for hitting oneself.
  11. Have you tried putting it in a different slot on your motherboard, if one exists?
  12. Power spikes 100% exist, but I wouldn't worry about them unless you're very close to your PSU's maximum and likely to trip it. The more correct term would be TGP, for Total Graphics Power, but it isn't recognizable to enough people, so everyone just says TDP. However, CPU "TDP" is and has been the Power Limit 1 (PL1) value for all Intel CPUs since as far back as their 2nd-gen Core series (I'm unsure about 1st gen and prior, and it's too old to matter now). I understand why people say TDP is meant for coolers, and to a degree cooler specs do speak in TDP, but considering their performance varies with contact size per chip, with some generations being very easy to cool and others not for various reasons, you can just ignore it. Things get a lot less complicated if you ignore it, and if you realize TDP is useless outside laptops, because almost everyone buys unlocked CPUs and boards that let you raise PL1 to astronomical levels anyway. It absolutely can, and likely without much difficulty: overclocking Nvidia cards since Pascal has generally let you raise performance at similar power draw, because the voltage associated with the clock speed (and the memory clocks/type) mostly determines performance. If they ACTUALLY just overclocked it (i.e., raised the stock clocks the same way you would with overclocking software like Afterburner), the power draw would barely change. Gigabyte has been having problems with their GPUs' PCBs cracking; I don't remember whether it only affected the 4090s, but I would 100% avoid them. Get an MSI instead; they have good cards this generation, and a 4070 doesn't use that much power. The Ventus 2X OC should be great and is $550 USD according to PCPP, so it should also be rather cheap.
As for the R5 7600, it's a fine gaming chip, but for overall system usage (especially if you multitask, like watching videos or streams while you play, or stream yourself), I would suggest grabbing a 13600K or something instead. I know a lot of people like to recommend AMD, but those E-cores make a system so smooth and lovely to use it's just magical in my experience, especially on Windows 11. If you still want the AMD, by all means go for it, but my vote goes to Intel until AMD gets similar tech to E-cores.
  13. Overclocking the CPU will help some, but in general your CPU is just a bit weak for what this game demands. As mentioned above, a 5800X3D is your best option without replacing the board and RAM.
  14. Right, so you have primary storage listed... which implies secondary storage exists. Are the games you mentioned installed on a HDD? Did you disable or delete your paging file, or put it on a HDD?
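The memory comparison in post 3 can be checked with quick arithmetic: first-word latency is the CAS count divided by the actual clock, and DDR's real clock is half the transfer rate. A minimal sketch (Python; the function name is mine):

```python
# First-word latency in nanoseconds.
# transfer_rate_mts is in MT/s; DDR's real clock is half that, so one
# cycle takes 2000 / transfer_rate_mts nanoseconds.
def first_word_latency_ns(transfer_rate_mts: float, cas: int) -> float:
    return cas * 2000 / transfer_rate_mts

print(first_word_latency_ns(3200, 16))  # 10.0 (ns)
print(first_word_latency_ns(3600, 18))  # 10.0 (ns)
# Identical latency, but the 3600 kit moves ~12.5% more data per
# second -- the "pick the faster speed" argument from post 3.
```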
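The undervolting advice in post 4 leans on the fact that dynamic CPU power scales roughly with voltage squared at a fixed clock. A back-of-the-envelope sketch (Python; the 1.30 V baseline is an illustrative assumption, not a measured value for any specific chip):

```python
# Dynamic power ~ C * V^2 * f, so at a fixed clock the power ratio
# is just the square of the voltage ratio.
def power_ratio(v_new: float, v_old: float) -> float:
    return (v_new / v_old) ** 2

# Hypothetical 1.30 V stock voltage with a -100 mV offset applied.
ratio = power_ratio(1.30 - 0.100, 1.30)
print(f"~{(1 - ratio) * 100:.0f}% less power at the same clock")  # ~15%
```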
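The PSU math in posts 5 and 6 is just a worst-case sum of component draws compared against the unit's rating. A sketch using the numbers from post 5 (Python; the 70 W figure for board/RAM/SSDs/fans is the post's own rough estimate, and the function name is mine):

```python
# Naive worst-case power budget: sum the peak component draws and
# see how much of the PSU's rating is left over.
def psu_headroom_w(psu_rating_w: int, draws_w: dict) -> int:
    return psu_rating_w - sum(draws_w.values())

draws = {
    "gpu": 378,                # 3080 FE at its 115% power limit
    "cpu": 200,                # pessimistic gaming draw
    "board_ram_ssd_fans": 70,  # rough estimate from the post
}
print(psu_headroom_w(650, draws))  # 2 -> technically fits on 650 W
```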