
Frankenburger

Member
  • Posts

    1,683
  • Joined

  • Last visited

Reputation Activity

  1. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    Lower the power target and raise the core clock until you hit your desired results. It won't be as precise as editing the clock/voltage curve, but it'll achieve basically the same thing in the end.
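As a rough sketch of what the power target slider actually does (assuming the GTX 980's 165 W reference board power; your card's BIOS limit may differ):

```python
# Sketch: how a power-target percentage maps to a wattage cap.
# 165 W is the GTX 980's reference board power; partner cards may differ.
REFERENCE_POWER_W = 165.0

def power_cap_watts(target_percent: float) -> float:
    """Wattage the card is allowed to draw at a given power target %."""
    return REFERENCE_POWER_W * target_percent / 100.0

print(power_cap_watts(80))   # 80% target -> 132.0 W
print(power_cap_watts(100))  # stock -> 165.0 W
```

The card then picks whatever clock/voltage point fits under that cap, which is why you raise the core clock to compensate.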
  2. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    Exactly
  3. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    Undervolting = restricting power while raising the default clock curve to maintain performance
     
    Specific power points have fixed clock speeds. If all you do is lower the power target, then you are reducing power consumption while also using a lower clock speed.
     
    For example, a default clock/voltage curve may tell your GPU to run at 1400MHz at 900mv. If your card uses 1050mv at 100% and runs at 1800MHz, and you want to maintain a comparable clock speed, you need to tell the GPU to run faster, otherwise you'll be running at 1400MHz.
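Using the hypothetical numbers above, the required core-clock offset works out like this (a sketch; in practice you'd approach the target in small steps and stability-test each one):

```python
# Numbers from the example: the default curve gives 1400 MHz at 900 mV,
# while stock boost is 1800 MHz at 1050 mV (100% power).
stock_clock_at_900mv = 1400  # MHz the default curve allows at the undervolt point
target_clock = 1800          # MHz you want to keep after undervolting

# Core-clock offset to dial in so the 900 mV point runs at the stock boost clock:
offset_mhz = target_clock - stock_clock_at_900mv
print(f"+{offset_mhz} MHz offset")  # +400 MHz
```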
  4. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    The only thing you risk is system instability, which is easily corrected if you undervolt too aggressively. The only way to damage a video card using MSI Afterburner is to overvolt it to unsafe levels, which is basically an impossibility since the power target is locked.
     
    Undervolting is perfectly safe. Just make sure you don't allow the undervolt to apply automatically until you know it's stable. If anything, undervolting may increase GPU longevity. Less power means less electricity running through the GPU, means less heat, means less work on the cooler. Increasing the core clock won't change that.
  5. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    165W is generally what GTX 980s consume under load, so there's definitely a decrease in power, especially if the average power consumption is below 161W. Though I've seen GPU-Z report weird values when using MSI Afterburner to undervolt. My suggestion would be to reference Afterburner's sensors to verify everything is running as it should.
  6. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    I honestly can't say. Power target and voltage don't scale linearly with each other. Also, there are slight variations in each processor that could affect efficiency. I believe the GTX 980 uses around 1050mV at stock, but I could be wrong. Though if it does, I would expect it to be around 900mV.
  7. Like
    Frankenburger got a reaction from Isaac Clarke in how to undervolt a gtx 980 reference card?   
    You're welcome.
     
    The best way to confirm it's using less power is to lock the fan speed and compare the GPU temps from stock to undervolted. If the temps go down while undervolted, then all is well.
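A minimal sketch of that comparison, assuming you've already logged temperatures at a locked fan speed for both runs (e.g. from Afterburner's sensor log); the sample numbers here are made up:

```python
def avg(samples):
    """Mean of a list of temperature samples."""
    return sum(samples) / len(samples)

# Hypothetical per-minute GPU temps (deg C) at a locked fan speed.
stock_temps     = [71, 72, 73, 73, 72]
undervolt_temps = [64, 65, 65, 66, 65]

delta = avg(stock_temps) - avg(undervolt_temps)
print(f"Undervolt runs {delta:.1f} C cooler on average")
if delta > 0:
    print("Temps went down at the same fan speed: the card is drawing less power.")
```

Locking the fan speed matters because the default fan curve would otherwise mask the difference by spinning down on the cooler run.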
  8. Like
    Frankenburger got a reaction from BenniAdioz in Audio & Video stutter on Ryzen 7 1700 with x264, no stutter with NVENC   
    Looks like it was a timing issue between the computer's hardware and the sync of NDI Source. Disabling Device Timestamps on the desktop audio and setting NDI to Network sync on the encoding system resolved the issue.
     
    Thank you for brainstorming with me. I really appreciate it!
  9. Like
    Frankenburger got a reaction from Paul Rudd in Gaming screenshots   
    MHW Iceborne
    Can't decide if I like the shorter or longer hair more on my character.

  10. Like
    Frankenburger got a reaction from Ithanul in Gaming screenshots   
    Spyro Reignited
    4k RTGI + SLI
     
     
  11. Informative
    Frankenburger got a reaction from The Old Myron Siren in Would GTX 1080 SLI Be a Real Dumb Idea?   
    I've been running SLI since 2014, currently on SLI 1080Ti.
     
    Stability really isn't an issue. Either SLI works, or it doesn't. If SLI doesn't work, then you can often get it working by changing the compatibility bits. In the rare occasion SLI causes negative scaling, and changing the bits doesn't help, you can simply disable SLI for the game's process.
     
    My only recommendation would be to run games with DSR, SSAA, or with graphics mods. Of course, you don't have to, but SLI is easy to bottleneck at 1080p. Most games simply don't stress modern high-end GPUs enough for SLI to really show its stuff until you start rendering above 1440p.
  12. Agree
    Frankenburger got a reaction from Zando_ in Would GTX 1080 SLI Be a Real Dumb Idea?   
    I've been running SLI since 2014, currently on SLI 1080Ti.
     
    Stability really isn't an issue. Either SLI works, or it doesn't. If SLI doesn't work, then you can often get it working by changing the compatibility bits. In the rare occasion SLI causes negative scaling, and changing the bits doesn't help, you can simply disable SLI for the game's process.
     
    My only recommendation would be to run games with DSR, SSAA, or with graphics mods. Of course, you don't have to, but SLI is easy to bottleneck at 1080p. Most games simply don't stress modern high-end GPUs enough for SLI to really show its stuff until you start rendering above 1440p.
  13. Like
    Frankenburger got a reaction from Totyxd in My pc only turns on when i remove the pcie cables from graphics card   
    My money is on the absence of power cables. I would either try a different card, or plug the card into a different computer to verify the issue is in fact with the card itself before flashing it.
  14. Like
    Frankenburger got a reaction from Tonberry in Gaming screenshots   
    I did and one other did, yes. There are a couple of others in my gaming community who've played MH4U, but they only play MHW infrequently at the moment.
    I've been playing since the first MH, and have played through most of the other installments along the way. Aside from MHW (~650 hours), the one I've sunk the most time into was Generations (~125 hours). I have around 90 hours in MH4U, which is actually about average for me with past installments.
  15. Like
    Frankenburger reacted to xAcid9 in Gaming screenshots   
    Looks great imo.
     
    30% is pretty good; mine was over 2x that performance drop, from 130-ish FPS down to 40-50 FPS.
    Yeah, I also only use it for taking screenshots now.
     
    The flickering depth buffer is network-related; you need to find a ReShade build with network detection turned off, or compile it yourself. They added that to avoid triggering cheat detection in some online games.
  16. Like
    Frankenburger got a reaction from xAcid9 in Gaming screenshots   
    It has about a 30% hit on performance with the settings I'm using. I'm going for a 60 FPS target, but a flickering depth buffer makes it distracting when the RT cuts in and out. Also, this is using the original demo build PG released a few months ago, so it's not representative of where the plugin is at now. Even if the depth buffer didn't cut in and out, the visual artifacts of the plugin make it unusable IMO. Great for screenshots, not so good for gaming. I'm sure it'll be fixed when the plugin hits full release.
     
    Anywho
     
    RT Off
    RT On
     
  17. Like
    Frankenburger got a reaction from Olivejuice in 980ti temp increase with cpu upgrade?   
    That's looking a lot better, yeah. Don't worry so much about the score. As long as the average FPS is mostly consistent (around a 3 frame margin of error), then you should be good. Furmark is pretty good when it comes to consistency with FPS.
  18. Like
    Frankenburger reacted to Olivejuice in 980ti temp increase with cpu upgrade?   
    Looks like everything's back to normal - after three runs I'm hitting about 5950-5960. I don't know what the margin of error is for Furmark, but I think that's close enough?

  19. Like
    Frankenburger got a reaction from Olivejuice in 980ti temp increase with cpu upgrade?   
    Nvidia's default fan curve favors silence when idle. It's not surprising the idle temps would go up by a few degrees. The temp increase shouldn't be a concern, but you shouldn't be losing almost 50% of your performance. It makes me wonder if a driver didn't install correctly, if the BIOS is messed up somehow, or if you have a faulty CPU. Try doing a BIOS update and reset, and reinstall the chipset and GPU drivers.
  20. Agree
    Frankenburger reacted to RasmusDC in BOTTLENECK ISSUE   
    bullshit...
     
    I have a 3570K, a 4790K, and my 8700K. Taking a GTX 970, which is nearly the same as a 1060 (actually faster than a 1060 3GB), the performance increments between CPUs are small. They are there, but at 1080p it does not matter.
     
    A 1080 on a 3570 will make everything run easily... YES, you will lose some frames compared to a faster CPU in some games... but it is still a nice PC.
  21. Agree
    Frankenburger got a reaction from aham in So if games have to suport multi GPU setups?   
    The way games are rendered today is fundamentally different from the 1990s and early 2000s. AFAIK, Voodoo cards used scan-line interleave, where each GPU would render every other line. I don't know if there were compatibility issues back then, since that was before my time. More modern multi-GPU setups require compatibility layers to be enabled within the drivers. When Nvidia's SLI hit the market in the early 2000s, it took Nvidia about two years to add enough compatibility layers to officially support around 100 games.
     
    Today is a whole different beast though. Nowadays, there's rendering modes that simply don't work with multi GPUs depending on how it's implemented. Temporal AA is famous for not being compatible with SLI, because it screws with the alternate frame rendering (AFR) mode.
     
    Games don't need to "support" multi-GPU setups per se, because most of it is still handled by the drivers, but there is a certain amount of responsibility that developers need to account for when developing a game. If developers use rendering methods that mess with the foundation of modern multi-GPU technology, then there's nothing AMD or Nvidia can do about it.
  22. Agree
    Frankenburger reacted to Mira Yurizaki in Skyrim 4k RTX mods (star wars bf II 2005)   
    The ray tracing model this mod likely uses is called screen-space ray tracing (http://casual-effects.blogspot.com/2014/08/screen-space-ray-tracing.html), which means anything you can't see on your monitor does not contribute to the lighting. And this is literally anything you can't see, like, say, the part of an object that's behind some world geometry. Games that use DXR ray tracing don't use screen-space ray tracing. It takes everything into account, including things that aren't visible on screen.
  23. Agree
    Frankenburger got a reaction from Mira Yurizaki in Skyrim 4k RTX mods (star wars bf II 2005)   
    Like I said before, that's using ReShade. It's not the same as RTX.
     
    Pascal Gilcher/Marty McFly's ray tracing plugin uses screen space to calculate ray tracing. It's functionally different from RTX.
     
    You can follow his progress here https://www.patreon.com/mcflypg
  24. Agree
    Frankenburger got a reaction from goto10 in So if games have to suport multi GPU setups?   
    The way games are rendered today is fundamentally different from the 1990s and early 2000s. AFAIK, Voodoo cards used scan-line interleave, where each GPU would render every other line. I don't know if there were compatibility issues back then, since that was before my time. More modern multi-GPU setups require compatibility layers to be enabled within the drivers. When Nvidia's SLI hit the market in the early 2000s, it took Nvidia about two years to add enough compatibility layers to officially support around 100 games.
     
    Today is a whole different beast though. Nowadays, there's rendering modes that simply don't work with multi GPUs depending on how it's implemented. Temporal AA is famous for not being compatible with SLI, because it screws with the alternate frame rendering (AFR) mode.
     
    Games don't need to "support" multi-GPU setups per se, because most of it is still handled by the drivers, but there is a certain amount of responsibility that developers need to account for when developing a game. If developers use rendering methods that mess with the foundation of modern multi-GPU technology, then there's nothing AMD or Nvidia can do about it.
  25. Like
    Frankenburger got a reaction from meisterveda in Help getting to 3200 ram speeds.   
    XMP is garbage and often creates more headaches than it's worth. The best way to hit the rated speeds of the modules is to key in the speeds, timings, and voltages manually. You can usually find the rated timings and voltages on either the product page of the RAM or the sticker on each module. Do this one stick at a time to ensure you don't have a bad module.
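To illustrate why hitting the rated speed matters, first-word latency follows directly from the CAS latency and transfer rate (standard DDR arithmetic; the kit numbers below are just examples):

```python
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    """First-word latency in ns: CL cycles at the memory clock (MT/s divided by 2)."""
    return cl * 2000 / mt_per_s

# Example: DDR4-3200 CL16 vs the DDR4-2133 CL15 JEDEC fallback a board
# often drops to when XMP fails to train.
print(cas_latency_ns(16, 3200))  # 10.0 ns
print(cas_latency_ns(15, 2133))  # ~14.1 ns
```

Running at the fallback speed costs both bandwidth and real latency, which is why manually keying in the rated values is worth the effort.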