About DividedByZero

  1. I've got a few questions about overclocking the i7-9700K. I currently have it running at 5 GHz with a static 1.35 V.

     1. Should I worry about voltage fluctuations? One thing I noticed when changing the voltage settings is that the voltage fluctuates quite a bit from what I set it to. With it set at 1.35 V it frequently peaks at a bit over 1.4 V, and when I set the voltage to auto it would sometimes spike as high as 1.46 V. Should I be worried about this? 1.4 V is a little too close for comfort for me personally. I currently have LLC set to level 1, and lowering it didn't seem to change the spiking at all.

     2. Why does the frequency fluctuate so much when an AVX offset is set? Whenever I change the AVX offset from 0, the frequency gets pretty erratic even under no load. I had it set to 3 and also tried 2, but the frequency would constantly jump between 4.6 GHz and 4.9 GHz (I was testing at 4.9 GHz in this scenario). For now I've set it back to 0 and everything is fine again.

     3. Should I worry about a crash during the XTU benchmark? 5 GHz doesn't seem to quite survive the built-in XTU benchmark. I don't notice any crashes in games or Cinebench, so should I care that it can't pass that one benchmark?

     4. With such an overclock and a hardline loop, what kind of temperatures should I be expecting? Cinebench runs at 75°C, and in GTA V it sits anywhere between 60°C and 80°C depending on where I am on the map. It doesn't seem to spike above 80°C, though.

     I dislike making posts like this because you can usually find answers online, but the answers I found to these same questions varied wildly from one another, so I wanted some consistent answers.
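The gap between the set voltage and the voltage the CPU actually sees is largely the loadline (Vdroop), which is exactly what the LLC setting shapes; overshoot on load release is the flip side of the same mechanism. The sketch below shows the steady-state droop calculation under assumed values: the loadline resistances and the ~130 A current draw are illustrative guesses, not measurements from this system.

```python
# Hypothetical Vdroop sketch. The loadline resistances and current draw
# below are illustrative assumptions, not figures from the post above.

def v_under_load(v_set, i_load_amps, loadline_mohm):
    """Effective CPU voltage after loadline droop: V = Vset - I * R_LL."""
    return v_set - i_load_amps * (loadline_mohm / 1000.0)

# Assumed: 1.35 V set, ~130 A heavy AVX load on a 9700K at 5 GHz.
# A high LLC level flattens the loadline toward 0 mOhm, which keeps
# voltage higher under load but worsens transient overshoot.
print(round(v_under_load(1.35, 130, 1.6), 3))  # ~spec-like loadline -> 1.142
print(round(v_under_load(1.35, 130, 0.4), 3))  # aggressive LLC      -> 1.298
```

The point of the sketch is only that a flatter loadline trades droop under load for larger spikes on load transients, which is consistent with the spiking described above.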
  2. Well, with a 15 AWG wire and such a short run, the voltage drop won't be severe enough to matter, which is kind of what I meant. Voltage sensing will always matter, though, because under heavy load it lets the PSU adjust the voltage back up. I doubt I can just leave the sense wires unconnected, as the PSU will probably think no voltage is going out and disable its output. Thanks for the help though.
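The "voltage drop won't matter" claim above is easy to sanity-check with V = I·R. The sketch below does the arithmetic; the ~10.4 mΩ/m figure for 15 AWG copper is a standard wire-table value, while the 7 A current and 0.6 m run length are assumptions picked to resemble a typical custom cable, not measurements from this build.

```python
# Rough voltage-drop check for the claim above. The current and run
# length are assumed values; 15 AWG copper resistance is a standard
# wire-table figure (~0.0104 ohm/m).

OHM_PER_M_15AWG = 0.0104  # 15 AWG copper, approximate

def voltage_drop(current_a, run_m, ohm_per_m=OHM_PER_M_15AWG):
    """Round-trip drop over a supply/return wire pair: V = I * R * 2L."""
    return current_a * ohm_per_m * run_m * 2

# Assumed: ~7 A on one 12 V wire, 0.6 m cable run.
print(round(voltage_drop(7, 0.6), 3))  # ~0.087 V
```

At roughly 0.09 V the drop is well inside the ATX ±5% tolerance on the 12 V rail, which supports the point that sensing exists mainly to compensate under heavy load rather than because short runs drop a lot of voltage.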
  3. I don't see all that much purpose in them. Since they're sense wires, I assume they'll have to be connected? If that's the case, I'll just bridge them on the PSU side. Having two wires in one crimp on the 24-pin will look pretty ugly after cable sleeving.
  4. I'm using a Corsair RM850 black label and intend to make custom sleeved cables for it, so I took some measurements of the old cables. The 24-pin one was a peculiar one, as it had an 18-pin and a 10-pin connector on the power supply side. My measurements showed that the extra 4 pins were +3.3 V, +5 V, +12 V, and ground. They were wired in parallel on the 24-pin side, with two wires crimped into one pin. Is there any point to this other than making the cable proprietary or splitting the current between the two wires? Will it hurt if I just leave them out, since it won't change the voltages/signals on the motherboard side?
  5. I tried the graphics card in a completely different system; same issues. So I'm just going to send it back and request either a refund or a replacement.
  6. Swapping in my old PSU did not fix it either.
  7. An older driver makes no difference, and actually made it worse, seeing as it crashed much sooner than it did before. Trying the older PSU right now.
  8. I'll try the older driver first and then I'll give that a shot. Neither the GPU nor the CPU is overclocked, so I shouldn't be dealing with power-limit issues, but it may be another issue with the PSU.
  9. My apologies, it seems it isn't the yellow label but the black label. I thought the yellow label was the 2019 one. Apart from that, I have an old yellow-label RM750 to test with.
  10. Corsair RM850 Yellow Label. Should've mentioned that in the original post, my bad.
  11. I'm currently using a ZOTAC RTX 2080 Super Twin Fan graphics card in my new build. All the hardware in this build is new, and today is the third day running it. However, today weird crashes started happening: after anywhere between one and one and a half hours of playing GTA V, my displays stop receiving a signal, the fans on the graphics card ramp up to 100%, and the RGB logo turns off. The display no longer turns on at that point, but the PC is still running, since I can still hear Discord and continue talking on it; I just have no video output. The temperatures and power usage are all in check under load and never exceeded their limits. This seems to be a known issue, but none of the suggestions I found on the internet fixed it. On top of that, Event Viewer shows the following: "Display driver nvlddmkm stopped responding and has successfully recovered."

      Here's a list of things I tried:
      - Re-seating the card
      - Using a different plug on my PSU to power the card
      - Reinstalling the driver
      - Checking voltages on the motherboard; none deviate greatly from their target values

      The cable that came with my PSU connects with a single 8-pin on the supply side and splits into an 8-pin and a 6-pin. Could it be that the one port is power-limited and my GPU is exceeding that limit? I'm at a loss for what to do now. One thing I haven't tried is downgrading to a previous driver version, and I'll try that shortly.

      From all of the testing I've done, the card just seems borked altogether, so I'm going to send it back and have it refunded or replaced.
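The power-limit worry about the split 8-pin to 8+6-pin pigtail can be sanity-checked against the nominal PCIe power budgets. The figures below are the conventional PCIe CEM connector limits; the 250 W board power is the reference RTX 2080 Super figure and is an assumption for this particular ZOTAC card.

```python
# Sanity check on the power-limit worry above. Connector budgets are the
# nominal PCIe CEM conventions; 250 W is the reference 2080 Super board
# power and is an assumption for this specific card.

PCIE_SLOT_W = 75   # power delivered through the motherboard slot
PIN8_W = 150       # nominal 8-pin PCIe connector budget
PIN6_W = 75        # nominal 6-pin PCIe connector budget

available = PCIE_SLOT_W + PIN8_W + PIN6_W
board_power = 250  # assumed reference RTX 2080 Super figure

print(available)                 # 300
print(available >= board_power)  # True
```

On the GPU side the connectors have headroom, so if there were a bottleneck it would be the single PSU-side 8-pin feeding the whole pigtail, though modular PSU-side connectors are generally rated for that; given the later testing in another system, a faulty card remains the more likely explanation.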
  12. I don't have experience with that specific monitor, but I have a non-curved AOC monitor with the same specs and panel type. I've never had any problems with it; the image quality probably isn't the best out there, but it's good nonetheless.
  13. I've been looking for a good RGB controller that suits my build, but I'm not having any luck finding one, so I'm hoping I might find some help here. I have the following RGB components in my build:

      - 6x EK-Vardar EVO 120ER RGB fans
      - 1x EK-Supremacy Classic RGB - Nickel + Plexi CPU block
      - 1x EK-FC RTX 2080 +Ti Classic RGB - Nickel + Plexi GPU block
      - 1x EK-Classic DP Side PC-O11D G1 D-RGB + DDC 3.2 PWM distribution block

      Except for the distribution block, everything uses typical 12 V RGB control. My motherboard has a 5 V header to take care of the distribution block, plus two additional 12 V RGB headers for the CPU and GPU blocks, so that pretty much just leaves the fans. I'd prefer a controller compatible with PolyChrome Sync so I can keep everything under one piece of software, but that's not a must. I'd also prefer individual control over the fans.
  14. Sounds like a good idea. Thanks for the help.