
Codewyn

Member

Codewyn's Achievements

  1. I added the display driver version to the specs in the OP. I'm not sure why clock speed would make a difference, though, since this has happened only intermittently over the last few months (constantly only in the last week). Most of the time it runs fine on the same games with the same settings, sometimes for longer sessions. While I have experienced this across multiple driver versions, the version previous to the current one (listed in the specs) was fine for a while; the brunt of it started at the tail end of my time on that version and continues into the current one.
  2. I’m experiencing what appears to be a display issue, but I want to detail it in case someone can point to another cause or otherwise elaborate on it. I bought a custom-built PC about a year ago - hardly used, built only a month or so before I bought it (I knew the guy well enough to know he wasn’t giving me a lemon). Here are the specs:

     MOTHERBOARD: ASUS Z97-PRO (WiFi AC)
     CPU: Intel Core i7-4790K (liquid cooled)
     GPU: EVGA GeForce GTX 980 4GB SC GAMING ACX 2.0 - 04G-P4-2983-KR (x2/SLI)
     SLI BRIDGE: Eathtek New ASUS Nvidia 120mm Long VGA Card SLI flexible bridge cable
     DRIVERS: GeForce Game Ready Driver v365.10 (5/2/16)
     RAM: Kingston HyperX Fury 8GB PC3-12800 DDR3 SDRAM (KHX1600C10D3/8G) (x2)
     POWER: EVGA SuperNOVA 750 G2 (220-G2-0750-XR)
     DISPLAY: BenQ XL2720Z 144Hz 1ms 27" Gaming Monitor (x3/Triple)

     Other than a botched Windows 10 upgrade from 8.1 that forced me to reformat a few months back, I didn’t experience any issues with it - that is, until I decided to add two more of the same monitors and one more of the same graphics card. I’ve been running it for a couple of months now with triple monitors (144Hz over DisplayPort) and SLI, and a few times, seemingly at random, one of my monitors will start to cut out while gaming and finally go out altogether - the message it shows says something like the signal is out of range (it usually happens on the first/left-most monitor, but all three have been affected). The first or second time that happened, I found a post recommending dropping them all from 144Hz to 120Hz, so I did, and it seemed fine. Then I decided to give 144Hz another shot, and it ran fine on that for a while.
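     As an aside, here is some rough back-of-the-envelope power math on the parts above. The per-component figures are my assumptions taken from the published TDP spec sheets (and real transient draw can spike higher), so treat this as a sketch rather than a measurement:

     ```python
     # Rough power-budget sanity check for the build above.
     # TDP figures are assumptions from published spec sheets;
     # transient draw under load can briefly exceed TDP.
     parts_watts = {
         "i7-4790K (88 W TDP)": 88,
         "GTX 980 #1 (165 W TDP)": 165,
         "GTX 980 #2 (165 W TDP)": 165,
         "motherboard/RAM/SSD/fans/pump (estimate)": 100,
     }
     psu_watts = 750  # EVGA SuperNOVA 750 G2

     total = sum(parts_watts.values())
     headroom = psu_watts - total
     print(f"Estimated load: {total} W of {psu_watts} W ({headroom} W headroom)")
     ```

     On paper that leaves a fair bit of headroom on the 750 W unit, though TDP arithmetic alone can't rule out a faulty PSU.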
Recently, though, I was in-game (Rust, which doesn’t like my triple setup and flickers under it, so I only run it on the center monitor, though it spans all three) while running YouTube in a browser on one of the other monitors, when the first monitor started going in and out (with progressively longer outages), and then the right-most monitor followed. So I quit the browser, then the game, and was on the desktop when I hit an issue I hadn’t encountered before: I’m not entirely sure what was happening, but it almost seemed like the graphics card was resetting - it first disabled SLI and then started playing the hardware disconnect/reconnect sounds while the display blanked. This repeated each time I did something in Windows, like opening the Start menu, the GeForce graphics settings, or the GeForce Experience app. Restarting had no effect; it kept happening. I thought it might be related to the display driver, so I downloaded the latest version (which appeared to be the one I already had) and did a clean reinstall. I then re-enabled SLI, reconfigured the GeForce Surround setting (for triple-display setups), and restarted. That seemed to resolve it. The next day I started a game again and it ran fine for a while, until the left monitor started going in and out again (again, progressively longer each time). So I exited the game, and the issue resumed: the disconnect/reconnect sounds, the primary display going in and out (with the OS seemingly resetting the display, so my cursor would re-center on-screen and any open menus would close), and SLI disabled again in the settings. I then noticed a new GeForce display driver was available, so I downloaded it and did a clean install. Again, the issue seemed resolved once I set everything back the way I had it in the display settings and restarted. But the issue still persists under load.
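Incidentally, the disconnect/reconnect behavior above looks like the driver crashing and being restarted, which Windows records in the System event log. Here's a minimal sketch of scanning a log exported as text from Event Viewer for those entries - the file name and the exact message wording are my assumptions, so adjust them to match a real export:

```python
# Minimal sketch: pull out probable display-driver reset (TDR) entries
# from a System event log exported as text from Event Viewer.
def find_tdr_events(lines):
    """Return the lines that mention the Nvidia kernel driver resetting."""
    markers = ("nvlddmkm", "Display driver", "stopped responding")
    return [ln for ln in lines if any(m in ln for m in markers)]

# Example against a couple of sample lines (the wording mirrors the
# usual Event Viewer message; a real run would read an exported log file):
sample = [
    "Display driver nvlddmkm stopped responding and has successfully recovered.",
    "The DNS Client service entered the running state.",
]
for ln in find_tdr_events(sample):
    print(ln)
```

If the timestamps on those entries line up with the moments a monitor cuts out, that would point at the GPU/driver side rather than the cables or the monitors.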
Does anyone have any ideas on what is causing this and, moreover, how I can resolve it? I’ve been trying to pin down the culprit. Is it a bad video card? How can I be sure if it is? Is it bad DisplayPort cables? I didn’t buy cheap ones ("Accell B142C-010B UltraAV DisplayPort to DisplayPort 1.2 Cable with Latches - 10 Feet (3 Meters)", which have great reviews on Amazon, especially for 144Hz at 1080p). Could it be a bad SLI bridge? Any way to determine that? Could it be a power issue, i.e. insufficient wattage? What’s the best way to confirm that if it’s a possibility? Could it be a bad controller board in the monitors themselves? I have my doubts about that (correct me if you think otherwise). Are there any tools I can run or techniques I can employ to shed further light on this? I can provide any additional info you request if you think it will help. This is a pretty frustrating experience and has basically rendered my machine inoperable for its intended use; the sooner I can get it figured out and fixed, the better. Any help would be much appreciated! Thanks in advance.
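One thing I plan to try for narrowing down GPU vs. cable: logging per-GPU temperature, power, and load while gaming with nvidia-smi (which ships with the GeForce driver) and checking whether the dropouts coincide with a spike on one card. A sketch, assuming nvidia-smi is on PATH - the parsing helper is mine, not part of the tool:

```python
# Sketch: query nvidia-smi for per-GPU temperature/power/utilization.
# Assumes nvidia-smi (installed with the GeForce driver) is on PATH.
import subprocess

QUERY = "index,temperature.gpu,power.draw,utilization.gpu"

def parse_row(row):
    """Parse one 'csv,noheader,nounits' row from nvidia-smi into a dict."""
    idx, temp, power, util = (field.strip() for field in row.split(","))
    return {"gpu": int(idx), "temp_c": float(temp),
            "power_w": float(power), "util_pct": float(util)}

def sample_gpus():
    """Query nvidia-smi once; returns one parsed row per installed GPU."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return [parse_row(r) for r in out.strip().splitlines()]

# Example of what one parsed row looks like (made-up reading):
print(parse_row("0, 71, 163.52, 98"))
```

Running `sample_gpus()` in a loop during a gaming session and eyeballing the log afterward should show whether one of the two cards behaves differently when a monitor drops.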