Everything posted by D2ultima

  1. Even if it wasn't, forcing devs to implement it natively (especially if that means DX12- or Vulkan-based only; it's unclear whether it does) is simply extra work, for no reason whatsoever. It's different with Nvidia TWIMTBP titles, where Nvidia helps the devs get things working, but as seen with AC Unity and most Ubishit titles in general, as well as the baseline quality of AAA games on PC, even TWIMTBP titles don't like SLI very much or are simply far too unoptimized (well, NVLink solved that, but then games would benefit from hacked profiles and whatnot…
  2. Your GPU core is not where your vRAM is, and it's possible your card has bad VRM cooling and/or the thermal pads that pull heat from your vRAM have degraded. Listing which card you have would help figure out whether the cooler is the problem. Good answer, I agree with what you've said. I'd say 6GB is still a very solid baseline for vRAM right now, but I don't know if that will hold up in 2 years, and as much as "future proofing lol can't do it", I don't recommend GPUs that become a crippling factor in any time under 2 years (especially in notebooks, where…
  3. No, it's required. And yes, SLI is dead: on pre-Ampere you can at least make your own profiles with compatibility bits and force it on, but now if the devs don't want to spend extra time uselessly adding SLI support to their games, it's not working, which means... lmao
  4. If you are a tinkerer, and are willing to manually fiddle with your stuff, two 2080Tis will definitely provide a noted benefit in a lot of games, more than a single 3080 can. However, that means buying a new 2080Ti, which I just do not recommend unless it's around $400 or less (USD, anyway). If the 2080Ti truly is not enough for you, sell it yourself and grab a 3080 when the Ampere refresh comes out in a year (3080Ti or 3080 Super or whatever it will be). In YOUR case, you most certainly should just grab a 3080. The 3080 is being compared to the 2080 because of price, but…
  5. You cannot connect them via SLI. I don't even understand what you're asking; you did not word it properly. Just try it and see. However, it would probably be better to just use the new card alone. It's Maxwell all over again.
  6. You can use old cards to drive monitors and, afaik, as PhysX cards. I do not think your 970 will be a fast enough PhysX card for a 3070, however, and it might be better to leave PhysX to the 3070 itself, but it should run extra monitors just fine.
  7. Microcode unlocking is a BIOS hack that removes multiplier limits and requires early BIOS versions, which should in theory not support the 4710HQ he has, since that CPU is a refreshed model. Also, this is irrelevant to power limit 1 being stuck at 47W, which is an EC setting designed specifically for 4xxxHQ chips. Even if you unlocked the multiplier and set it to 5GHz, it doesn't mean you'll get over 47W PL1. You'll just throttle the CPU all the way down to 3GHz with the high voltage 5GHz would need.
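To illustrate why a 5GHz overclock would blow straight past a 47W PL1, here's a rough back-of-the-envelope sketch in Python. Dynamic CPU power scales roughly with V² × f; the voltage figures below are assumptions for illustration only, not measured values for any 4xxxHQ chip:

```python
# Rough dynamic-power scaling: P is roughly proportional to V^2 * f.
# Baseline numbers: 47W PL1 and 3.5GHz all-core turbo are from the post;
# the voltages are illustrative guesses, not measurements.
base_freq_ghz = 3.5    # 4710HQ all-core turbo
base_volt = 1.05       # assumed voltage at stock turbo
base_power_w = 47.0    # hard PL1 cap on 4xxxHQ

target_freq_ghz = 5.0
target_volt = 1.35     # assumed voltage a 5GHz overclock would need

est_power_w = base_power_w * (target_freq_ghz / base_freq_ghz) \
    * (target_volt / base_volt) ** 2
print(f"Estimated draw at 5 GHz: {est_power_w:.0f} W")  # far above the 47 W cap
```

With any plausible voltage bump, the estimate lands well over double the PL1 ceiling, which is why the chip would throttle back down rather than hold 5GHz.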
  8. You can't disable power limit throttling on 4xxxHQ chips without powercut. It's not possible. I'm the author of that mobile i7 info guide someone quoted above, and whatever anyone else said isn't going to work. Intel made sure, on purpose, that these chips don't draw any more power.
  9. That's your available system power, so yes, you have enough power to "feed the computer". Increasing the power limit would let the CPU pull its max speed at any given time, as long as it isn't overheating. It's really simple: PL1 and PL2 are "Power Limit 1 and 2". PL1 is the indefinite power draw limit, and PL2 is the "burst performance" power draw limit. The longest PL2 can sustain itself is 2.5 minutes without any BIOS trickery, during which the CPU can draw as much power as it allows (by default 57W for Haswell HQ chips). It works as sort of a way to give you a short bit of extra pow…
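The PL1/PL2 behavior described above can be sketched as a tiny Python model. The 47W/57W values and the 2.5-minute window are the Haswell HQ defaults from the post; the function itself is a simplification, ignoring the exponentially-weighted averaging real firmware uses:

```python
def allowed_power(elapsed_s, pl1=47.0, pl2=57.0, tau=150.0):
    """Return the package power cap (watts) at a given time under load.

    Simplified model: the CPU may draw up to PL2 for the turbo window
    (tau seconds = 2.5 minutes here), then falls back to PL1 indefinitely.
    """
    return pl2 if elapsed_s < tau else pl1

print(allowed_power(10))    # 57.0 -> inside the burst window
print(allowed_power(200))   # 47.0 -> sustained limit afterwards
```

So raising PL1 is what actually changes sustained performance; PL2 only governs the short burst at the start of a load.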
  10. Building off of what Unclewebb said, the 4710HQ is +100MHz across the board over the 4700MQ, and can attain a maximum of 3.5GHz all-core turbo and 3.7GHz single-core turbo. For Haswell: 47xx can do +200MHz, 48xx can do +400MHz, 49xx can do +600MHz, and 49xxMX chips are completely unlocked. I do not know if you will have enough power to apply to the HQ chips, though, as all 4xxxHQ chips are hard-limited at a PL1 of 47W (without using some form of powercut, anyway) and likely cannot sustain higher clockspeeds under heavier loads.
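The tier list above reduces to simple arithmetic; here's a quick sketch using the headroom values from the post (the helper name is mine, and MX parts are left out since they're fully unlocked):

```python
# Extra turbo headroom per Haswell mobile quad tier, in MHz,
# per the tiers listed in the post above (49xxMX is unlocked).
HEADROOM_MHZ = {"47xx": 200, "48xx": 400, "49xx": 600}

def max_turbo_mhz(stock_turbo_mhz, tier):
    """Stock turbo plus the overclocking headroom for the given tier."""
    return stock_turbo_mhz + HEADROOM_MHZ[tier]

# 4710HQ: 3.7GHz single-core stock turbo, 47xx tier -> 3.9GHz ceiling
print(max_turbo_mhz(3700, "47xx"))  # 3900
```

Whether the chip can actually hold those clocks under an all-core load is a separate question, since the PL1 cap bites first.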
  11. If you read the guide you'd see I stopped suggesting SLI on mainstream boards like yours, that you should upgrade the rest of your computer first, and that buying the single strongest GPU you can get is better than SLI-ing two midrange cards. You'd also see that "barely increase" and "a lot of tearing" are not what happens; rather, he would simply find many games that don't scale very well, and he doesn't have a CPU properly capable of driving it.
  12. I'd say 2080Ti, but if a 2080S is the most you can get, then that. Or you can wait for the next set of cards, if your existing PC works "well enough" for the moment.
  13. Can't explain Minecraft's shoddy Java-ness to you. The "video" clock might be the old "shader" clock, which stopped being adjustable around Fermi, IIRC. I am only guessing, and I am unfamiliar with whatever Linux tool you are using.
  14. I don't know. I haven't tried it. I don't know what a 1070, a card I don't have, will get you in modded minecraft. I don't know if forcing AFR will even allow the game to render anything.
  15. I don't know about MC using SLI, so I would ditch both 1070s and pick up at least a 2070 and just enjoy your life
  16. The reason the guide isn't updated for newer things is that all three guides I have written (they're in my signature) were written before this board changed to IPB v4, and when I edit them they often completely break formatting and force me to re-write the ENTIRE guide over and over. As for what I wrote, I have indeed verified that it was true... just talking about 2GB of vRAM in 2020 is far too old. But I cannot edit the guides properly. I am aware that caching in vRAM is a totally different thing. I am aware that GDDR3 is non-existent and GDDR5 should be compared…
  17. I'm saying you can plug one screen into one card and another screen into the other card, and you can have one card encoding while you game on the other without hindering system performance, except for writing to the disk. It's not "SLI", but it is a way to make use of your P106 if you were to purchase something like a 2060. Otherwise, don't even bother; just get a newer card.
  18. AFR is the only way SLI currently works in any modern title. Scissor rendering, which is what you're describing, hasn't been used for a long time, and I don't know how many games use it on Linux. I would honestly consider getting a stronger single card instead of SLI-ing, especially without NVLink; you can always put a gaming card in, run your extra monitors off the P106, and use it for rendering in your programs.
  19. My bad for not answering both in one post; I hit submit without thinking. Anyway, I'm pretty sure SLI doesn't do "rendering"; at best you could give a different task to each card. The other card is NOT bricked; you will simply see weird performance as one falters. I've had a card die on me (the master, no less) and the unit still functioned, albeit oddly.
  20. Get a 2080Ti, or wait for the next-gen cards and get one of those. I've said it multiple times in the comments: even with the bandwidth situation resolved, if you're not willing to do a lot of fiddling with Nvidia Profile Inspector and forcing SLI on games that don't otherwise use it, then don't bother. In this case, since you are aiming for 4K 60 with current unoptimized games, the vRAM alone pushes the 2080Ti into the better category for you. Nobody can comment on unreleased games, but when I tried Star Citizen AGES ago (like 5-6 years) on my 780M SLI setup, it seemed to sc…
  21. Then get an okay video card now and wait for the next-generation stuff to come out, which people are assuming will be the middle of this year (I think so too, considering previous release time frames). Or a 2080Ti. I do not suggest SLI if you aren't going to sit and tinker with NVPI and throw a ridiculous amount of effort and time into figuring out profiles for games that don't otherwise support multi-GPU out of the box. Note: doing that CAN provide quite rewarding experiences; I know of multiple games (TW3, DQ 11, Betrayer, UT 4, R6S, etc.) that function with benefits in SLI when you…
  22. That machine looks like a large waste of money on your SSDs, PSU, and RGB RAM coolers. There is no point to RAIDing NVMe SSDs, and QVOs are trash QLC NAND. NVLink is a fix for when bandwidth is too low for optimum scaling: on previous gens, without PCI/e 3.0 x16/x16 + LED/HB bridges, you get visual bugs, negative scaling, or single-digit % scaling. It is most useful when you force SLI on games that don't normally have a profile or that give visual bugs in SLI, like Dragon Quest XI, or that function badly in SLI, like Witcher 3 with TAA enabled with minimal percenta…
  23. You don't need to go out of your way for a board, unless you're on mainstream, where you need a PCH chip (which does add some latency, but the scaling will be there). Once you hit HEDT you simply need a CPU with enough lanes, so any Threadripper, the 7900X or higher, or the 9xxxX chips (any of them; even the base one has 44 lanes) will do it. Since he has a 7800X he only has 28 lanes, which cannot provide the 32 required for good SLI with a pre-NVLink bridge. As the author of the SLI info guide I must say that if you're not on NVLink or an x16/x16 + HB bridge setup, there's little real point, and even then you still need manua…
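The lane math above is easy to sanity-check in a few lines of Python. The lane counts (28 for the 7800X, 44 for the base 9xxxX parts, 32 needed for x16/x16) are the figures from the post; the function name is just for illustration:

```python
def can_run_x16_x16(cpu_pcie_lanes):
    """Pre-NVLink SLI wants PCIe 3.0 x16 to each of two cards = 32 lanes."""
    return cpu_pcie_lanes >= 32

# Lane counts quoted in the post above.
for cpu, lanes in [("i7-7800X", 28), ("i9-7900X", 44), ("Threadripper", 64)]:
    print(f"{cpu}: {lanes} lanes -> x16/x16 SLI ok: {can_run_x16_x16(lanes)}")
```

This is why the 7800X falls short: 28 lanes forces at least one card down to x8, which is exactly the bandwidth situation the HB bridge or NVLink exists to paper over.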
  24. The N is short for "notebook", as the laptop community refers to it. You need at minimum PCI/e 3.0 x16 to each GPU, or you risk severe scaling issues and an outright inability to use SLI properly in some games. NVLink SLI removes this requirement, but is unavailable for anything other than 2000-series cards (or later, I would guess).