D2ultima

Member
  • Content count

    4,222
  • Joined

  • Last visited

Awards


This user doesn't have any awards

About D2ultima

  • Title
    Livestreaming Master
  • Birthday 1989-11-06

System

  • CPU
    i7-7700K
  • Motherboard
    Clevo P870DM3
  • RAM
    4 x 8GB DDR4 2400MHz 17-17-17-39 (needs fixing)
  • GPU
    GTX 1080N 8GB x2 (SLI)
  • Case
    P870DM3 chassis
  • Storage
    850 Pro 256GB, 850 EVO 500GB M.2, Crucial M4 512GB, Samsung PM961 256GB
  • PSU
    780W Eurocom PSU
  • Display(s)
    AUO B173HAN01.2 17.3" 120Hz laptop display + 1360 x 768 Sharp TV (second screen)
  • Cooling
    P870DM3 un-modified internal cooling
  • Keyboard
    P870DM3 internal keyboard
  • Mouse
    Logitech G502 Proteus Core
  • Sound
    Corsair Vengeance 1500 v2 & Steelseries H-Wireless
  • Operating System
    Windows 10 Pro x64 (garbage)

Contact Methods

  • Twitter
    D2ultima
  • Steam
    d2ultima
  • Twitch.tv
    d2ultima

Profile Information

  • Gender
    Male
  • Location
    Trinidad and Tobago
  • Interests
    Gaming, PCs, laptops, 3D gaming, reading, livestreaming
  • Biography
    Just a guy who loves tech in a country that's technologically stagnant.
  • Occupation
    Currently NEET

Recent Profile Visitors

3,560 profile views
  1. LOLOLOLOLOLOLOLOLOLOLOLOL O O O O O O O O O O O O O LOLOLOLOLOLOLOLOLOLOLOLOL
  2. The SLI information guide

    But you're asking for a paradox. You're saying you want devs to lower bandwidth requirements so they can use SFR... but the problem with AFR IS heavy bandwidth requirements. SFR is unnecessary if we have the bandwidth or if AFR-friendly tech is used, and SFR requires even more bandwidth than AFR, even if it works better with TAA or allows SMAA T2x. It's a lose-lose situation.

    What we need is for devs to optimize, and to not use AFR-unfriendly tech just because it's easier/cheaper to implement. I.e. make games for PC first, then port to consoles. Why is it that consoles, running at sub-low specs for "1080p" and "4K", manage to hold 30fps, while PC titles would barely do the same if you ran them on console-level hardware? It makes no logical sense; it's all on devs to fix things.

    That, or we get significantly more bandwidth available to us. If we could skip the extremely late PCI/e 4.0 spec and jump right into the 5.0 spec, PCI/e 3.0 x16 becomes PCI/e 5.0 x4, which means x8/x8 for SLI is more than enough, and a 4-lane link from the chipset to the CPU would not be saturated by Thunderbolt 3, or a single NVMe drive, or an eGPU, etc.

    Bandwidth is key right now. The tech of the last 3 years has exploded, and the interfaces we have are either poorly designed (HB bridge, NVMe on the M.2 NGFF interface, etc) or simply don't have the bandwidth to keep up (4-lane link between chipset and CPU, no 10 gigabit ethernet readily available, no gigabit wifi available, etc). Optimization in games will help greatly, but for everything else we just need more bandwidth.
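    To put rough numbers on the bandwidth claims above, here's a minimal Python sketch. Only the per-lane signaling rates and the 128b/130b encoding are spec facts; the rest is plain arithmetic. It shows why PCI/e 5.0 x4 would match today's 3.0 x16, and why the 4-lane chipset link tops out around 32Gb/s:

        # Per-lane raw rates (GT/s) for PCI/e generations, per the PCI-SIG specs
        GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0, "5.0": 32.0}
        ENCODING = 128.0 / 130.0  # 128b/130b line encoding used from gen 3 onward

        def link_gbps(gen, lanes):
            """Usable bandwidth of a PCI/e link, in gigabits per second."""
            return GT_PER_LANE[gen] * ENCODING * lanes

        print(f"3.0 x16: {link_gbps('3.0', 16):.1f} Gb/s")  # ~126 Gb/s
        print(f"5.0 x4:  {link_gbps('5.0', 4):.1f} Gb/s")   # ~126 Gb/s -- same pipe as 3.0 x16
        print(f"3.0 x4:  {link_gbps('3.0', 4):.1f} Gb/s")   # ~31.5 Gb/s -- the chipset<->CPU cap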
  3. The SLI information guide

    Well see, that's the OTHER issue. SFR can have a visible line depicting where one card is rendering and where the other is, and it has something akin to screen tearing when something on one portion ends up on the other; think about spinning 90 degrees up/down or left/right. And further to that, it requires even MORE bandwidth than AFR does.

    In fact, the entire reason four-way SLI was so bad was that it was SFR of AFR. Two cards ran half the screen in AFR mode, the other two ran the other half in AFR mode, and then the first pair ran "SFR" with the second pair, as if they were two larger GPUs instead of four individual ones. And scaling was actually so bad that three-way SLI was better in a lot of games, despite two-way SLI having the best scaling. Theoretically, the scaling in four-way should be equal to two-way without a bandwidth bottleneck, since it's two two-way AFR pairs (up to 95% scaling each) rather than trying to AFR four cards. People were pushing for SFR with DirectX 12, but it's not that we needed DirectX 12 to do it. It was possible the whole time; it was just totally inferior because it needed too much bandwidth.

    Bandwidth is the primary issue in everything right now: PCI/e SSDs; the maximum lane count the chipset can provide (Thunderbolt 3 can't even use its rated 40Gb/s unless directly connected to the CPU on an x8 bus; both AMD and intel have a maximum chipset <--> CPU connection of PCI/e 3.0 x4, which is only 32Gb/s, shared with multi-PCI/e-SSD setups, etc); the inter-GPU bandwidth (which NVLink could have fixed... just saying); AMD's entire CPU crutch being the need to increase inter-CCX bandwidth to reduce latency; etc. Tech is pushing forward and bandwidth is holding everything back.

    This is why I am an advocate of XDMA and also a hater of AMD's old CrossfireX solution: XDMA is such a brilliant design that even a PCI/e 2.0 x8/x8 setup provides more inter-GPU bandwidth than PCI/e 3.0 x8/x8 + HB bridge does on Nvidia cards. There is a latency issue, but AMD seems to have dealt with it well enough, and Nvidia could surely deal with it better than AMD can, so I don't know why they bothered with bridges. They could make bridgeless x8/x8/x8 PCI/e 3.0, and tri-SLI would probably have blown past the competition entirely, maybe even pushing three-way scaling all the way up to two-way levels. That's just speculation, of course; I'm simply extrapolating from what I already know to make it make sense in my head.

    Yeah, TAA can be done superbly well or terribly. But that's true of everything from the Unreal Engine 4 kit. There are games that run like the god of IT itself blessed them using UE4, and then there are games that run like garbage. Dead by Daylight and The Culling are good examples of games that run terribly (or they used to; I don't know if optimization happened recently). On the other hand, Tekken 7 can pretty much be run on a toaster with a screen attached, PUBG is laughably optimized, and Unreal Tournament 4 you could max out at 120fps on a Kepler midrange card at 1080p, last I checked. CryEngine 3 can also get very optimized, as Prey proved to us. It's just a matter of whether devs put in the work... either they don't care, the publishers don't care, or both.
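    The "SFR of AFR" argument above can be sketched numerically. This is a toy illustration only: besides the ~95% two-way figure from the post, every efficiency number below is invented to show the shape of the argument, not measured:

        # Toy model: N-way speedup = hardware multiple x scaling efficiency.
        two_way   = 2 * 0.95            # 1.90x -- the ~95% two-way AFR scaling cited above
        three_way = 3 * 0.70            # 2.10x -- invented 70% three-way AFR efficiency
        # Four-way as "SFR of AFR": two AFR pairs (1.9x each), joined by an SFR layer
        four_way_ok      = (2 * 0.95) * (2 * 0.60)  # 2.28x if the SFR layer manages 60%
        four_way_starved = (2 * 0.95) * (2 * 0.50)  # 1.90x once bandwidth starves the SFR layer
        print(two_way, three_way, four_way_ok, four_way_starved)
        # -> a bandwidth-starved four-way setup falls to three-way levels or below,
        #    which matches the behavior described in the post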
  4. The SLI information guide

    People still fight it, don't feel otherwise, haha. I even have people telling me they think it's all, like, G-Sync issues or other random things. Someone looked at the proof I sent and said they didn't believe it, because Gamers Nexus did that stupid video testing synthetics and games whose development started before PCI/e 3.0 was available, meaning most of them were built around 2.0 x8/x8 on intel mainstream... which, as you can imagine, means 3.0 x8/x8 was more than enough for the era those games were created in. More bandwidth matters when scaling is bad, not when scaling is good, because bad scaling is usually a bandwidth problem if the engine isn't outright anti-multi-GPU and it's not something the devs need to fix (Dark Souls 3 scaled negatively with multi-GPU until the first DLC's launch; the patch that carried DLC1 fixed SLI scaling, and Nvidia drivers weren't updated when this happened on my old system).

    Oh, it certainly means there is a bottleneck somewhere, incompatible SLI profile or not. But what I meant was that 99% utilization on each card doesn't mean good scaling either, which you just understood anyway.

    That's the entire problem, though: SLI was not, and should not be right now, something you have to tip-toe around settings for. TAA is anti-SLI in nature because it requires data from previous frames, and this kills scaling because you need bandwidth. Once again, look at my "The Bandwidth Issue" section. There's a user who bumped scaling significantly in Witcher 3 at 4K by using a PLX chip on a mainstream (Haswell) motherboard. There's another user who jumped from 41fps to 78fps in R6 Siege at 4K by going from x16/x8 to x16/x16, with TAA on. So yes, 20% scaling is pretty much solely a bandwidth problem for the large majority of titles.

    LED bridge and x16/x16 for 4K and under. For 5K+ I'd recommend a HB bridge, because it can provide more bandwidth and it's possible Nvidia has coded it so that the extra bandwidth kicks in above 4K (HB is recommended for 5K+ as per their official charts).

    The problem is that a lot of game tech these days is basically lazy/cheap implementations that are easy to use but need too much bandwidth, or are outright incompatible with AFR. There is little reason (except TAA) to use tech that is anti-AFR... it's just easier to implement, so devs do it that way and let users be salty. It wouldn't be a problem if optimization were a thing; console games are sometimes optimized even 5 times better than PC titles are. And these are usually the titles that need multi-GPU the most, too, due to how badly they can run on PC.
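    Doing the arithmetic on that R6 Siege example (user-reported numbers quoted above):

        # Same dual-GPU setup at 4K with TAA; only the bus split changes
        fps_x16_x8  = 41.0
        fps_x16_x16 = 78.0
        uplift = (fps_x16_x16 - fps_x16_x8) / fps_x16_x8
        print(f"{uplift:.0%} more fps from bus width alone")  # ~90%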
  5. The SLI information guide

    You dropping to 70% utilization might very well be a broken implementation of things, but utilization % means nothing for scaling. If you look at my bandwidth issue section, you'll see in particular a shot comparing GPU utilization between x16/x8 and x16/x16 in R6 Siege, where both cards show extremely high utilization in both cases but the scaling is off the charts on 16/16 versus 16/8. There are all sorts of similar issues like that all over.

    TAA adds a lot of CPU and GPU load when implemented well, and it kills scaling. It needs a LOT of bandwidth between cards, more than a simple HB bridge will provide. Not "can" provide, mind; WILL provide. Nvidia killed the excess bandwidth provided by the HB bridge and diverted it into frame pacing improvements. So your games don't scale any better at 4K and below than they would on a simple LED bridge; they're just a little smoother due to more consistent frametimes. Which means the need for x16/x16 still exists.

    Since you don't have x16/x16 for your 1080Ti SLI due to the 5820K, it's possible that your bad performance is a result of that. Maybe get a cheap 6850K for around $275 (they go for around that price a lot lately) and see if, and how much of, a difference it makes. Are you using a LED or HB bridge at 5K? I hope it's not a flex bridge.

    Also, "reputable" hardware reviewers don't even get the bandwidth issue for SLI. I've seen Digital Foundry try 8K gaming with a couple of 1080Tis... and a 6700K. Like, no. Please delete the video.
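    For context on the 5820K point, the suggestion comes down to CPU PCI/e lane budgets. A trivial check using the published lane counts for those chips:

        # CPU PCI/e 3.0 lane budgets (published spec values) and the best
        # two-card split a stock board can feed from them
        CPU_LANES = {"i7-5820K": 28, "i7-6850K": 40, "i7-6700K": 16}

        def best_two_card_split(lanes):
            if lanes >= 32:
                return "x16/x16"
            if lanes >= 24:
                return "x16/x8"
            return "x8/x8"

        for cpu, lanes in CPU_LANES.items():
            print(f"{cpu}: {best_two_card_split(lanes)}")
        # i7-5820K: x16/x8   -- the poster's current limit
        # i7-6850K: x16/x16  -- the suggested upgrade
        # i7-6700K: x8/x8    -- why that 8K Digital Foundry test was doomed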
  6. The SLI information guide

    Something like that, though it used to be better. SLI apparently hit 70 to 95% scaling in a lot of titles in the past; I remember when it was near double for me as well. Firestrike still provides about a 95% scaling rate. Games now are all in the 20% range, though.
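    For clarity, "scaling" here means the extra fps the second card adds over a single card. A one-line helper, fed with the figures from this post:

        def scaling_pct(fps_single, fps_dual):
            """How much of a second card's fps actually shows up, as a percentage."""
            return (fps_dual / fps_single - 1.0) * 100.0

        print(scaling_pct(100, 195))  # 95.0 -- the Firestrike case
        print(scaling_pct(100, 120))  # 20.0 -- typical recent games, per the post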
  7. The SLI information guide

    If a GTX 1080Ti or Titan Xp is not enough for your needs, get a second card. If you must have a laptop and a GTX 1080 is not enough for your needs, get a second card. Otherwise, I just see it as too much headache for too little reward, especially since I consider it mostly worthless at 3.0 x8/x8, which is what the majority of people will run it on. The need for a $500 CPU (last-gen intel, current-gen AMD) or a $1000 CPU (current-gen intel, which needs a delid to be worth anything) plus a $200+ board just to get 3.0 x16/x16, with core counts unnecessary for gaming on the AMD side or current-gen intel side of things, is a lot for someone to invest into a system just for gaming. Not that it's bad to invest that much; most people just don't consider PCs and gaming important enough to justify it.

    Even if a single GPU is a lot weaker than two GPUs a step or two lower, the fact that the single card will deliver its full power 100% of the time while the multi-GPU setup will beat it only about 40% of the time (default Nvidia profiles, no fiddling) to 70% of the time (fiddling with NVPI) is something pretty heavy to consider... and I can guarantee you that most people will not be the fiddly kind.
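    As a rough illustration of that trade-off: the 40%/70% figures come from the post, but the relative card speeds below are invented placeholders, not benchmarks:

        # Hypothetical speeds relative to one big card (= 1.0): two lesser cards
        # give 1.5x combined when SLI works, and 0.8x (one lesser card) when it doesn't.
        def average_speed(p_sli_works, sli_speed=1.5, fallback_speed=0.8):
            return p_sli_works * sli_speed + (1 - p_sli_works) * fallback_speed

        print(average_speed(0.40))  # ~1.08x -- default profiles: barely ahead of the big card
        print(average_speed(0.70))  # ~1.29x -- with NVPI fiddling, if you'll actually fiddle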
  8. The SLI information guide

    GTA V uses SLI by default, even on pre-launch drivers, and you can open Nvidia Control Panel to change the SLI setting otherwise. I did not put pictures of that because it is foolproof. Anything beyond that, you use Nvidia Profile Inspector and input the values corresponding to the game. I linked a page explaining that and how to do it, but here is how it looks for your convenience:
  9. Haven't seen anything from EVGA; they don't even register on my radar as a maker of good laptops. Razer is only on my radar because they're so terrible I have to make sure people DON'T buy them. As for OP's question, a P950HR or P650HS is probably up your alley for non-gamey-looking but good stuff.
  10. If anybody reviews a Razer notebook (or any product) and gives it a bad review, or even says they're not blown away by it, they stop getting products from Razer to review. So... good luck finding legit reviews.
  11. The SLI information guide

    Can't say; I only know the compatibility list that's in the guide. But my suggestion is to get rid of them and buy a 1080 or 1080Ti or something if you can afford it. It'll serve you better.
  12. He's moving to a new site sometime soon, from what I know; I don't know all the details off the top of my head. I don't know if this is still the case; there was a big falling-out on an NBR thread, and I asked them to just lay out what was needed, and neither did. I would say, however, that Prema says personal mods are STILL a thing; if someone wants one, they can still get one by contacting him and donating, if Prema thinks you're ok. But I won't promise anything. I'd still say buying from OBSIDIAN and trying to get a personal mod is better than trying to get one from some other people.
  13. Macbook Pro Alternatives?

    You're losing it; I'ma burn you at the stake. Plus, repasting voids your warranty with Razer, unless HID's warranty supersedes Razer's... Also, ICD won't work. It's bare heatpipe to die, remember? You need something to fill the gap, or the kind of paste they use. Repasting is pointless for the average user.
  14. Intel chips haven't had faulty sensors for a very long time. AMD is AMD, and is almost guaranteed to have some program or other be incapable of reading the temps properly. But it isn't going to be a problem for his notebook; that's just Razer's shit cooling.

    I don't know what you're talking about, didn't you see this? They have world-leading engineering and design! And now support! Enterprise-grade notebooks like Thinkpads and Precision Ms, with next-business-day service where someone comes to your home/workplace to FIX your problem, eat your heart out! Razer's so good they solve the issues on the phone in 35 minutes! /s

    You should. That will rapidly shorten the life of the system, and Razer's cooling subsystem might as well not exist. The whole laptop is likely to die many years before it would have if Razer were not the creator. Thermal limits are not something any computer should be hitting out of the box, especially with, and let me stress this, LIGHT loads such as playing Counter-Strike. Thermal limits should be reserved for synthetic tests designed to brutalize a CPU, such as Prime95, or seriously heavy loads such as gaming + livestreaming with heavy-compression CPU-based encoding, or H.264/H.265-based video rendering, etc. Anything else and your OEM has produced shit, and I implore people not to accept shit as a product they should keep and/or use. A reminder that this is a 15W CPU, and that figure includes the power the iGPU uses.

    You should try Gelid GC Extreme, or if your contact is particularly flat, Thermal Grizzly Kryonaut.

    I don't understand you. You basically described a completely broken laptop above, but you stuck with it instead of finding an alternative, yet you calmly say you won't buy another in the near future. I'd be a lot more livid if my stuff didn't work properly out of the box and there was no known way to fix it.

    Correct, that "extended periods" bit is the important part.
  15. Nvidia's Max-Q broken down

    The article's been updated; this is the most complete form as far as I know: https://www.notebookcheck.net/Opinion-Nvidia-s-Max-Q-Maximum-efficiency-minimum-performance.232038.0.html By Sunday we'll probably have it properly sorted out.