D2ultima

Member
  • Posts: 4,396
  • Joined
  • Last visited

Reputation Activity

  1. Like
    D2ultima got a reaction from lillianrmiller in Help me choose GPU   
    You're going to need to give more information than that. By a lot. An 80W 3070 versus a 110W 4060 might perform much closer than you would imagine.
     
    In GENERAL I always suggest the newest Nvidia GPU one can afford, because the tech simply lasts longer that way even if the raw performance is the same, due to backend improvements that won't show up so quickly. For example, AMD has said that while FSR 3's frame generation mode will function on the 1000 series, it's not going to be great there, because it was designed largely as an async compute task, and the 1000 series wasn't that great at async compute compared to the 2000 series onward. Or how the 4000 series can use AV1 encoding, which wasn't even a heavily touted launch feature for the cards.
     
    But in the world of laptops, a properly powered 3070 and a properly powered 4070 are going to run roughly the same in performance, except for raytracing, video encoding, etc., which will simply be better on the 4070. But you're looking at a 4060, which is probably a fair dip in performance.
     
    Anyway, give us the laptops in question, your budget, the intended use cases, the country you're buying from, etc.
  2. Like
    D2ultima got a reaction from AI_Must_Di3 in RTX 2080Ti less prone (maybe) to crashing on 80% power target   
    Your card is probably dead bro
     
    At this point I'd take it to a shop that can electrically test it for faults instead of trying to blindfire it.
  3. Like
    D2ultima got a reaction from unclewebb in Help OC 13900k, Gigabyte z790 Aorus Master to 5.5ghz stable on all 8C/16T   
    Oh hay unclewebb how you doing?
     
    Yes this person needs to not overvolt when already overheating.
  4. Agree
    D2ultima got a reaction from podkall in ATX 3.0 really required for 40 series nvidia?   
    People who have experienced both and settled for one usually have the best opinions
     
    I'm not asking about a 10fps difference at 300fps. I'm asking for a notable, marked difference where not using an e-core system is actually significantly beneficial. Otherwise you're just splitting hairs, because in most cases people will already be at a GPU limit.
     
    Except it generally does, as long as you do anything other than purely game. I went from Intel (non-e-cores) to Intel (e-cores) and have access to an AMD machine too, as well as a low-power laptop also with e-cores. I've used a fair number of systems and done a lot of optimization in my time; my systems don't slow down, and I don't lose performance if I don't reboot for a month like many people do. I can positively say that it literally might as well be magic for system usage, especially multitasking, livestreaming and high-CPU-load tasks. Lack of stutter with videos playing on a second monitor while a game is up, or with the OBS preview window open, especially in borderless windowed mode (which quite a few newer games actually force, lacking an exclusive fullscreen mode), overall Windows smoothness, and things I didn't even REGISTER as stutter or problems before using these new systems make me recommend it EXTREMELY HIGHLY, and most of this is stuff that will never show up in benchmarks. Hell, I could even casually browse and use other programs while running Cinebench R23; I know for sure that such situations would normally cause a lot of lag in the past.
     
    If you don't notice any difference, or don't do anything where you'd notice a difference, fine. If you wanna say they're not perfect, fine, I'll even agree. But I felt you kinda overblew the issues (especially with game performance, because I have seen the SLIGHT improvements to fps that some games get from disabling e-cores, and it generally isn't worth the loss in functionality). Too much good is gained that AMD can't match for anyone to call it an equal playing field.
     
    To me it's a lot more like
    7800X3D = More fps in most games if you have like a 4090
    Intel = *begins listing benefits and ends 10 lines down range*
     
    No brainer to me
  5. Agree
    D2ultima got a reaction from filpo in Will a 4060ti and i5 10400f run on a 500w power supply?   
    If you can get a 4070, it should still be fine. They're 200W max, and your CPU isn't likely to pull over 100W on its own either; that leaves 200W for the rest of your system, which generally needs under 100W, so even accounting for larger power spikes you're fine.
     
    I was going to advise against the 4060 Ti until I saw it was $370 on PCPP, which is a fair shave under the cheapest 4070 I found at $510. I thought the 4060 Ti used to be $450, but I guess prices dropped around the lower mid-range this time. I would still suggest at least a 4070 if you're capable of buying it: there are QUITE a few games out there that would benefit even at 1080p, and you'd also have the ability to use DLAA in some games for SUPERB anti-aliasing when you don't need DLSS for upscaling.
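    As a rough sanity check of that math, here's a back-of-envelope sketch (the wattage figures are illustrative assumptions, not measurements):

```python
# Rough PSU headroom estimate. All wattages are illustrative
# assumptions for this example build, not measured figures.
PSU_WATTS = 500

draws = {
    "RTX 4070 (max board power)": 200,
    "i5-10400F (gaming load, generous)": 100,
    "Motherboard/RAM/SSD/fans": 75,
}

total = sum(draws.values())
spike = total * 1.3  # crude 30% allowance for transient spikes

print(f"Sustained estimate: {total}W of {PSU_WATTS}W")
print(f"With transient allowance: {spike:.0f}W")
print("OK" if spike <= PSU_WATTS else "Consider a bigger PSU")
```

    Even with that crude 30% spike allowance you land under the 500W mark, which is why I'm not worried here.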
  6. Like
    D2ultima got a reaction from Adam670 in 4070 ti crashing in games (overwatch and cyberpunk 2.0)   
    Firstly, I would also try DDU as suggested above: clean and restart in SAFE MODE, then install the new drivers fresh.
     
    Secondly, why do you have -245MHz on your afterburner? If this is what you meant by undervolting your GPU, that is not how you undervolt (though it should fix any issues with your card running at too high a clockspeed for its voltage).
     
    Have you tried maxing out your power limit in Afterburner as well?
     
    For what it's worth, you undervolt like this. This works from the Nvidia 1000 series up through the 4000 series as of this writing.
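    If you want to double-check what your card is actually allowed to draw before touching the curve, here's a minimal read-only sketch using the NVML Python bindings (assuming the nvidia-ml-py package is installed; it only queries values, it changes nothing):

```python
# Read-only check of the GPU's power limits and current clocks via NVML.
# Requires: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
h = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(h)  # milliwatts
cur = pynvml.nvmlDeviceGetEnforcedPowerLimit(h)
draw = pynvml.nvmlDeviceGetPowerUsage(h)
clk = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_GRAPHICS)

print(f"Power limit range: {lo / 1000:.0f}W - {hi / 1000:.0f}W")
print(f"Enforced limit:    {cur / 1000:.0f}W")
print(f"Current draw:      {draw / 1000:.0f}W at {clk} MHz core")

pynvml.nvmlShutdown()
```

    If the enforced limit reads below the max constraint, your power limit slider isn't maxed out yet.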
  7. Agree
    D2ultima reacted to Bagzie in RTX 3080 VS 3090   
    As someone who ran a 3090 on a 750W PSU for a while, a 1000W PSU is definitely not the "minimum".
     
    The problem originally was a lot of people trying to run the 3090 with PSUs not designed to handle transient load spikes, because they were usually group-regulated multi-rail crappers from the stone age.
     
    That being said, an 850W is fine provided you are not running a 250W CPU with a million case fans, and the PSU is good quality.
     
    I couldn't get the power to spike over 550W on the 3090, which honestly isn't any higher than my current 7900 XT spikes to.
     
    The 6950 XT I used, on the other hand, spiked even higher. Still didn't overwhelm the PSU though.
  8. Informative
    D2ultima reacted to TatamiMatt in Advice on new gaming rig   
    You've got the name right on the 6-heatpipe version; the Phantom Spirit was only released a little while ago. The PS is just a 7-heatpipe version of the PA.
     
    Review
     
    NH-D15 Comparison
     
    It's an absolute beast for the price point it's at.
  9. Like
    D2ultima reacted to TatamiMatt in Advice on new gaming rig   
    The Peerless Assassin is good, but go for the Phantom Spirit: a new rev of the Peerless Assassin with an extra heatpipe, about $5-10 more, and cooling on par with the NH-D15, AK620, etc.
  10. Agree
    D2ultima got a reaction from Mike Mike in The video RAM information guide   
    You clearly did not read what I said.
  11. Informative
    D2ultima got a reaction from Henru in The video RAM information guide   
    Ok. I did an SLI guide, and now it's time to do a vRAM/memory bandwidth guide. A lot of people seem to be confused about vRAM in general. Well, here we go.
     
    Let's clear up some misconceptions about vRAM! 
     
     
    What does vRAM size have to do with gaming?
     
     
    Bonus: What happens if I don't have as much vRAM as a game asks for at certain settings?
     
     
    And now resolution, Anti Aliasing and operating systems
     
     
    And now about multiple monitors
     
     
    Now, onto memory bandwidth and memory bus and such. You may wanna skip this if you know already and are only here for the vRAM size portion above, but I might as well be thorough if I'm doing this. Spoiler tags save the day!
     
    vRAM types & memory clocks
     
     
    Next, memory bus and memory bandwidth! (There's a quick worked example at the bottom of this post.)
     
     
    Extra: Memory bus + mismatched memory size section
     
     
    And the GTX 970 gets its own section! Hooray!
     
     
    FAQ
     
     
    Windows 10 and nVidia 353.62
     
     
    Final tidbits and stuff
     
     
    I started writing this guide mainly for the top section, to debunk misinformation people seem to have regarding vRAM and its relation to memory bandwidth, but I figured I might as well just go the full mile and explain as best I can what most people need to know about GPU memory anyway. If I've somehow screwed up somewhere, let me know. I probably have. I'll fix whatever I get wrong. And thank you to everyone who has contributed and corrected things I didn't get right! Unlike my SLI guide, much of the information here was confirmed post-writing.
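    For the memory bus and bandwidth section above, here's a quick worked sketch of the standard formula (theoretical bandwidth = effective memory clock x bus width / 8); the card specs here are just illustrative examples:

```python
# Theoretical memory bandwidth = effective memory clock (MT/s)
#                                * bus width (bits) / 8 (bits per byte).
# Example specs are illustrative, not a full card database.
def bandwidth_gbps(effective_mts: int, bus_bits: int) -> float:
    return effective_mts * bus_bits / 8 / 1000  # GB/s

examples = {
    "GTX 980 (7000 MT/s GDDR5, 256-bit)": (7000, 256),
    "GTX 980 Ti (7000 MT/s GDDR5, 384-bit)": (7000, 384),
}

for name, (mts, bus) in examples.items():
    print(f"{name}: {bandwidth_gbps(mts, bus):.0f} GB/s")
```

    Same memory clock, wider bus, 50% more bandwidth; that's the whole reason bus width matters as much as clocks do.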
     
    If you want the SLI information or the mobile i7 CPU information guide, they're in my sig!
     
    Moderator note: If you believe any information found in this guide is incorrect, please message me or D2ultima and we will investigate it, thank you. - Godlygamer23
  12. Informative
    D2ultima got a reaction from Gondawn in The SLI information guide   
    Hi everyone. I originally wrote this guide over at the kbmod forums, but as it turns out, that forum is as dead as Aeris in FF7. This forum is more lively, and thus I figured it'd be good to copy over my guide for all to read. This is a real-world, layman's-terms assessment of what SLI does and how it works. I have never used CrossfireX and therefore cannot say that all of these will hold true for it. The original guide (no longer updated there) is over at http://kbmod.com/forums/viewtopic.php?f=22&t=6212
     
    ------------------------------------------------------------------------------------------------------------------------------------------------
     
    Basically, this is a guide meant to explain the upsides and downsides of SLI. It's mainly geared toward people who have midrange or last generation cards who wonder if they should get multiple cards or simply upgrade to a stronger one. This will list pretty much everything you will ever want to know about SLI in as great a detail as I can possibly create. It WILL BE A LONG READ. Also note that I have never attempted to Crossfire any cards, so this guide is MAINLY directed toward SLI, and while most of the ups/downs will be common to both SLI and CrossfireX, THIS IS NOT A CROSSFIRE GUIDE. There are large differences and I am not in a position to explain in-depth about CrossfireX.
     
     
    First, I will clear up some fairly common misconceptions about SLI. 
     
     
    What can I SLI? (970 SLI issue information and potential fix)
     
     
    Now that that's done, let's get into the benefits of SLI. There's some benefits I'll list that most people don't actually know.
     
     
    And now here come the downsides!
     
     
    Resolved and/or no-longer applicable downsides to SLI (If you have SLI, read this section to see if any of these fixes apply to you).
     
     
    My thoughts and suggestions section.
     
     
    The bandwidth issue
     
     
     
    I wish to add that as far as performance is concerned, two GPUs will far outstrip what one GPU can do, unless you're using two entry-level midrange cards and comparing them against a flagship enthusiast card (for example, two 960s versus a superclocked 980Ti). I DO like SLI, but SLI isn't for everyone, and with the recent terrible state of SLI support that I see in constant decline, as well as Maxwell and Pascal's anti-multi-GPU design, I can no longer recommend it to... well... anyone, really. If the current high-end GPU isn't enough performance for you, SLI is the way to go, sure. But I would take a single stronger GPU over SLI-ing two weaker GPUs as long as that single GPU is 25% or more better than one of the weaker GPUs that would be SLI'd (i.e. I'd take Card A over Card B SLI if Card A is 25%+ faster than a single Card B). The number of recent titles (that'll actually need GPU grunt, unlike many older titles with newer cards) where the single card will simply do a lot better than the SLI setup is going to be very high, and there is no guarantee that SLI will even properly work with nVidia Profile Inspector bits forcing (Dead by Daylight, for example, is a popular new title that will not get positive SLI scaling without flickering characters no matter what I do). This is, I believe, more the developers' fault than nVidia's; however, nVidia's readiness to discard SLI is also apparent. They know it doesn't work well and are not showing intent to fix it, as seen with GTX 1060s being incapable of SLI despite being stronger than GTX 970s in raw performance.
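    To put rough numbers on that 25% rule of thumb, here's a toy calculation (the scaling percentages are hypothetical, not benchmarks):

```python
# Toy comparison: a single Card A (25% faster per card) vs Card B in SLI.
# Scaling efficiencies below are hypothetical, purely for illustration.
card_b_fps = 60
card_a_fps = card_b_fps * 1.25  # Card A is 25% faster than one Card B

for scaling in (0.0, 0.5, 0.8):  # no profile / mediocre / good SLI scaling
    sli_fps = card_b_fps * (1 + scaling)
    winner = "SLI" if sli_fps > card_a_fps else "single Card A"
    print(f"SLI scaling {scaling:.0%}: {sli_fps:.0f} fps vs {card_a_fps:.0f} fps -> {winner}")
```

    At 0% scaling (no working profile) the single card wins outright, and the problem is exactly how many recent titles land in that bucket.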
     
    Further to the above bashing of the state of multi-GPU, here is a nice article's summary page for performance in multi-GPU games in 2015 and later titles, to back up the statements I make in here, since I often get people telling me I'm deluded or some other kind of nonsense when I make such claims.
     
    NB: I add to the benefits or detriments lists when I remember something, discover something, or technology changes, to keep the guide up to date. I wish I could speak more about Maxwell, but unless someone sends me a pair of Maxwell GPUs and heatsinks for my Clevo, I'm not going to be able to test much, unfortunately.
     
    If you want the vRAM information or mobile i7 CPU information guides, they're in my sig!
     
    Moderator note: If you believe any information found in this guide is incorrect, please message me or D2ultima and we will investigate it, thank you. - Godlygamer23
  13. Agree
    D2ultima got a reaction from Dimondminer11 in How to OC my laptop RAM?   
    And? RAM benefits you straight. You can get 3000MHz RAM for laptops. You're telling this person that better RAM speed than the almost certainly 2133MHz 15-15-15-35 shit RAM that laptops come with by default is pointless, and that attempting to OC it grants no benefit but many downsides.
     
    This is false.
     
    He will see benefits from faster/tuned RAM; he's simply unable to actually overclock it with his system.
  14. Agree
    D2ultima got a reaction from Dimondminer11 in How to OC my laptop RAM?   
    https://youtu.be/43g3OTK2AbE?t=4m42s
    You're pretty much factually wrong.
    Since the "AW" model line (i.e. post-M17x, M18x, etc.) the BIOS has been secure-flash locked. You cannot flash a BIOS. Period. You can flash something to an empty BIOS chip and replace the existing BIOS chip, but good luck with that.
    If you don't have the ability to adjust your timings, then you can't do it. You might have success using Intel XTU to change the timings, but I can't guarantee that it'll even let you fiddle with your timings.
     
    Don't listen to them. Good RAM makes a big difference. It raises minimum framerates, and flat out can reduce CPU loads in games. It's even better in some cases to get better RAM than to overclock, as you will see in the link higher up in this post.
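    For a rough sense of the raw numbers, here's a back-of-envelope sketch of theoretical dual-channel bandwidth only (timings and the actual workload will shift the real-world picture):

```python
# Theoretical dual-channel DDR4 bandwidth:
# transfers/s (MT/s) * 8 bytes per transfer * 2 channels.
def ddr4_dual_channel_gbps(mts: int) -> float:
    return mts * 8 * 2 / 1000  # GB/s

for mts in (2133, 2666, 3000):
    print(f"DDR4-{mts}: {ddr4_dual_channel_gbps(mts):.1f} GB/s theoretical")
```

    That's roughly 34 GB/s at 2133 versus 48 GB/s at 3000, before timings even enter the picture.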
     
    Almost everybody knows jack shit when talking about RAM but they're quite ready to speak as if they do.
  15. Informative
    D2ultima got a reaction from flyinglaserkiwi in The video RAM information guide   
  16. Like
    D2ultima got a reaction from xWeegix in The SLI information guide   
    Even if it wasn't, forcing devs to implement it natively (especially if that means DX12- or Vulkan-based only; it's unclear whether it does) is simply... extra work, for no reason whatsoever. It's different with Nvidia TWIMTBP titles, where Nvidia helps the devs work with it and tries to get things working, but as seen with AC Unity and most Ubishit titles in general, as well as the basic quality of AAA games on PC, even TWIMTBP titles don't like SLI very much or are simply far too unoptimized (well, NVLink solved that, but then games would still benefit from hacked profiles and whatnot, so *shakes head*).
    SLI is 1000% dead. I guarantee that. Nvidia killed it because it doesn't want to do any more work on it, since they are the ones who made all the game profiles, and banning driver profiles is the final nail in the coffin.
     
    Previously, if you had PCIe 3.0 x16/x16 and an HB bridge on Pascal/Maxwell, you could get improved, positive and/or bug-free scaling in games that otherwise didn't support it, even UE4 titles, like Dragon Quest XI (scaling but buggy and flashing on 3.0 x8/x8 with an HB bridge), Unreal Tournament 4 (almost no scaling on x8/x8), and Witcher 3 with TAA on (low scaling on x8/x8), usually with custom SLI profiles (UE4 titles need these). Even games that had fairly solid SLI performance could improve from modified bits or other profiles, like how Black Ops 3 benefited from the Battleforge SLI profile over its own Nvidia-provided one.
     
    Since bandwidth became such an issue (which NVLink did fix), I had been telling people: single strongest card first, then SLI if you want it, unless they wanted to spend time on the guru3D forums with the master list of modded SLI profiles and/or tinker with bits to get optimum performance. And there were lots of gains to be had with NVLink...
     
    But now they're saying they aren't allowing us to modify driver bits with Nvidia Profile Inspector for SLI, or force it on games that don't otherwise support it, even when they could. That means Dragon Quest XI, UT4, etc., which support and scale well with SLI if you have enough inter-card bandwidth (read: NVLink), and likely a whole host of other Unreal Engine 4 games and others I can't remember off the top of my head, are now forever locked to ONE card.
     
    SLI is dead, there is no saving it; Nvidia killed it and admitted to killing it, and I highly doubt they're going to bring it back. Even if they insist they will "work with developers" to "get a lot of games supported", it doesn't mean developers will want to put in that extra work for free, even if you could SLI all the way down to whatever a 3050Ti will be, which is why pre-existing SLI support for the vast majority of games just doesn't really exist these days. If you think there's some hope for it, feel free to buy a couple of 3090 cards, run MSI AB + RTSS in every game, and tell us how often your games actually properly load both cards.
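    If anyone actually wants to measure how often both cards get loaded, instead of eyeballing overlays, here's a minimal read-only polling sketch with the NVML Python bindings (assuming the nvidia-ml-py package):

```python
# Poll per-GPU utilization to see whether a game actually loads both cards.
# Read-only; requires: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

try:
    while True:
        readings = []
        for i, h in enumerate(handles):
            util = pynvml.nvmlDeviceGetUtilizationRates(h)
            readings.append(f"GPU{i}: {util.gpu:3d}% core / {util.memory:3d}% mem")
        print(" | ".join(readings))
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```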
  17. Agree
    D2ultima got a reaction from Retro Gamer in The SLI information guide   
  18. Agree
    D2ultima got a reaction from Retro Gamer in The SLI information guide   
    I'd suggest the R9 380 4GB wholeheartedly. SLI nothing under a 980Ti if you can help it.
  19. Like
    D2ultima got a reaction from MultiGamerClub in The SLI information guide   
  20. Informative
    D2ultima got a reaction from thekingofmonks in The video RAM information guide   
  21. Agree
    D2ultima got a reaction from WhitetailAni in The video RAM information guide   
  22. Funny
    D2ultima got a reaction from Abdullah Bhutta in The video RAM information guide   
    It's 5:35am and I have no idea if you're serious or joking.
  23. Agree
    D2ultima got a reaction from BTGbullseye in The SLI information guide   
  24. Agree
    D2ultima got a reaction from BTGbullseye in The video RAM information guide   
    Your GPU core is not where your vRAM is, and it's possible your card has bad VRM cooling, thermal pads that have degraded and no longer pull heat off the vRAM properly, or both. Listing what card you have would help in figuring out whether the cooler is the problem. 
    Good answer, I agree with what you've said. I would say 6GB is still a very solid baseline for vRAM right now, but I don't know if that will hold up in 2 years, and as much as "future proofing lol can't do it", I don't recommend GPUs that will become a crippling factor in any time under 2 years (especially in notebooks, where they cannot be swapped out).
  25. Informative
    D2ultima reacted to BTGbullseye in The video RAM information guide   
    Better to get educated at the risk of sounding stupid, than to actually be stupid.
    Yes.
    That is not how VRAM works. That's like saying "can I just connect another PC to mine with a USB cable to increase the system RAM". (not being condescending, just giving an example of approximate equivalence)
    It's not really implemented in any games, and there is no indication that it ever will be. The process is incredibly inefficient in all but a few non-gaming workloads (it will actually reduce framerates significantly), requires enterprise-grade GPU hardware (like the $2000-$8000 Quadro GPUs), and requires software implementation of the process.
    Honestly, I wouldn't go for any GPU with less than 8GB of VRAM anymore, as 8GB cards are currently available in the $150 range when new. (RX 570 8GB, or $200 for similar performance to your GTX 980 with an RX 5500 XT.)