D2ultima

Member
  • Content Count

    4,365
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About D2ultima

  • Title
    Livestreaming Master
  • Birthday Nov 06, 1989

Contact Methods

  • Steam
    d2ultima
  • Twitch.tv
    d2ultima
  • Twitter
    D2ultima

Profile Information

  • Location
    Trinidad and Tobago
  • Gender
    Male
  • Interests
    Gaming, PCs, laptops, 3D gaming, reading, livestreaming
  • Biography
    Just a guy who loves tech in a country that's technologically stagnant.
  • Occupation
    Currently NEET

System

  • CPU
    i7-7700K
  • Motherboard
    Clevo P870DM3
  • RAM
    4 x 8GB DDR4 2400MHz 17-17-17-39 (needs fixing)
  • GPU
    GTX 1080N 8GB x2 (SLI)
  • Case
    P870DM3 chassis
  • Storage
    850 Pro 256GB, 850 EVO 500GB M.2, Crucial M4 512GB, Samsung PM961 256GB
  • PSU
    780W Eurocom PSU
  • Display(s)
    AUO B173HAN01.2 17.3" 120Hz laptop display + 1360 x 768 Sharp TV (second screen)
  • Cooling
    P870DM3 un-modified internal cooling
  • Keyboard
    P870DM3 internal keyboard
  • Mouse
    Logitech G502 Proteus Core
  • Sound
    Corsair Vengeance 1500 v2 & Steelseries H-Wireless
  • Operating System
    Windows 10 Pro x64 (garbage)

Recent Profile Visitors

7,980 profile views
  1. Even if it wasn't, forcing devs to implement it natively (especially if that means DX12- or Vulkan-based only; it's unclear whether it does or not) is simply... extra work, for no reason whatsoever. It's different with Nvidia TWIMTBP titles, where Nvidia helps the devs work with it and tries to get things working, but as seen with AC Unity and most Ubishit titles in general, as well as the basic quality of AAA games on PC, even TWIMTBP titles don't like SLI very much or are simply far too unoptimized (well, NVLink solved that, but then games would benefit from hacked profiles and whatnot
  2. Your GPU core is not where your vRAM is, and it's possible your card has bad VRM cooling and/or the thermal pads used to pull heat from your vRAM have degraded in quality. Listing what card you have would help in figuring out if the cooler is the problem. Good answer, I agree with what you've said. I would say 6GB is still a very solid baseline for vRAM right now, but I don't know if that will hold up in 2 years, and as much as people say "future proofing lol, can't do it", I don't recommend GPUs that will become a crippling factor in any time under 2 years (especially in notebooks where
  3. No, it's required. And yes, SLI is dead, because on pre-Ampere you could at least make your own profiles with SLI bits and force it on, but now, if the devs don't want to spend extra time uselessly adding SLI support to their games, it's not working, which means... lmao
  4. If you are a tinkerer, and are willing to manually fiddle with your stuff, two 2080Tis will definitely provide a notable benefit in a lot of games, more than a single 3080 can provide. However, that means buying a new 2080Ti, which I just do not recommend unless it's around $400 or less (USD, anyway). If the 2080Ti truly is not enough for you, sell it yourself and grab a 3080 when the Ampere refresh comes out in a year (3080Ti or 3080 Super or whatever it will be). In YOUR case, you most certainly should just grab a 3080. The 3080 is being compared to the 2080 because of price, but t
  5. You cannot connect them via SLI. I do not even understand what you're asking; you did not word your question properly. Just try it and see. However, it would probably be better to just use a new card alone. It's Maxwell all over again
  6. You can use old cards to drive monitors and, afaik, as PhysX cards. However, I do not think your 970 will be a fast enough PhysX card for a 3070, and it might be better to leave PhysX to the 3070 itself, but the 970 should run extra monitors just fine.
  7. Microcode is a BIOS hack that unlocks multiplier limits and requires early BIOS versions, which in theory should not support the 4710HQ he has, since that CPU is a refreshed model. Also, this is irrelevant to power limit 1 being stuck at 47W, which is an EC setting designed specifically for 4xxxHQ chips. Even if you unlocked the multiplier and set it to 5GHz, that doesn't mean you'll get over 47W PL1; you'll just throttle the CPU all the way down to 3GHz with the high voltage 5GHz would need.
  8. You can't disable power limit throttling on 4xxxHQ without powercut. It's not possible. I'm the author of that mobile i7 info guide someone quoted above, and whatever someone else said isn't going to work. Intel made sure on purpose that these chips can't draw any more power.
  9. That's your available system power, so yes, you have enough power to "feed the computer". Increasing the power limit would let the CPU pull its max speed at any given time, as long as it isn't overheating. It's really simple: PL1 and PL2 are "Power Limit 1 and 2". PL1 is the indefinite power draw limit, and PL2 is the "burst performance" power draw limit. The longest PL2 can sustain itself is 2.5 minutes without any BIOS trickery, during which the CPU can draw as much power as it allows (by default 57W for Haswell HQ chips); see the first sketch after this list. It works as sort of a way to give you a short bit of time extra pow
  10. Building off of what Unclewebb said, the 4710HQ is +100MHz across the board over the 4700MQ, and can attain a maximum of 3.5GHz all-core turbo and 3.7GHz single-core turbo. For Haswell: 47xx can do +200MHz, 48xx can do +400MHz, 49xx can do +600MHz, and 49xxMX are completely unlocked (see the second sketch after this list). I do not know if you will be able to apply enough power to the HQ chips, though, as all 4xxxHQ chips are hard-limited to a PL1 of 47W (without using some form of powercut, anyway) and likely cannot sustain higher clockspeeds under heavier loads.
  11. If you read the guide, you'd see that I stopped suggesting SLI on mainstream boards like yours, that you should upgrade the rest of your computer first, and that buying the single strongest GPU you can get is better than SLI-ing two midrange cards. If you read the guide, you'd also see that "barely increase" and "a lot of tearing" are not what happens; rather, he would simply find that many games don't scale very well, and he doesn't have a CPU properly capable of driving it.
  12. I'd say 2080Ti, but if a 2080S is the most you can get, then that. Or you can wait for the next set of cards, if your existing PC works "well enough" for the moment.
  13. Can't explain Minecraft's shoddy Java-ness to you. The "video" clock might be the old "shader" clock, which stopped being adjustable around Fermi, IIRC. I am only guessing, and I am unfamiliar with whatever Linux tool you are using.
  14. I don't know. I haven't tried it. I don't know what a 1070, a card I don't have, will get you in modded Minecraft. I don't know if forcing AFR will even allow the game to render anything.
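
To make the PL1/PL2 behaviour in post 9 concrete, here is a minimal Python sketch. It assumes the Haswell HQ defaults quoted above (47W PL1, 57W PL2, 2.5-minute burst window) and models the burst window as a simple timer; real hardware tracks an exponentially weighted moving average of power rather than a fixed clock, so treat this as a rough approximation, not how the EC actually accounts for it.

    # Rough model of PL1/PL2 power limiting as described in post 9.
    # Numbers are the Haswell HQ defaults quoted above; the timer-based
    # burst window is a simplification of the real averaged accounting.
    PL1_WATTS = 47.0     # indefinite (sustained) power draw limit
    PL2_WATTS = 57.0     # "burst performance" power draw limit
    TAU_SECONDS = 150.0  # max burst duration: 2.5 minutes

    def allowed_power(requested_watts: float, seconds_under_load: float) -> float:
        """Power the CPU is actually permitted to draw at this moment."""
        limit = PL2_WATTS if seconds_under_load < TAU_SECONDS else PL1_WATTS
        return min(requested_watts, limit)

    # A 70W request is clamped to 57W during the burst, then 47W after.
    for t in (0, 60, 149, 150, 300):
        print(f"t={t:>3}s -> {allowed_power(70.0, t):.0f}W")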
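And a second sketch for the turbo-bin arithmetic in post 10: the extra bins per SKU tier are the ones listed there, applied to the 4710HQ's stock turbo clocks; the function and names are just illustrative.

    # Haswell mobile overclock headroom per SKU tier, from post 10.
    EXTRA_MHZ = {"47xx": 200, "48xx": 400, "49xx": 600}  # 49xxMX: fully unlocked

    def max_turbo(tier: str, stock_turbo_mhz: int) -> int:
        """Stock turbo clock plus the tier's extra multiplier bins."""
        return stock_turbo_mhz + EXTRA_MHZ[tier]

    # 4710HQ stock turbos: 3.5GHz all-core, 3.7GHz single-core (+200MHz cap)
    print(max_turbo("47xx", 3500))  # 3700 -> max all-core after unlock
    print(max_turbo("47xx", 3700))  # 3900 -> max single-core after unlock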