jonathan13

Member
  • Posts

    295
  • Joined

  • Last visited

Awards

This user doesn't have any awards

4 Followers

System

  • CPU
    i7-6850k
  • Motherboard
    MSI X99A Gaming Pro Carbon
  • RAM
    Corsair Dominator Platinum 32GB (DDR4-2800)
  • GPU
    EVGA GTX 1080 FTW submerged
  • Case
    Phanteks Enthoo Pro M Acrylic
  • Storage
    ADATA 240GB SSD
  • PSU
    Corsair G2 650W
  • Display(s)
    Acer B346C Ultrawide IPS
  • Cooling
    EKWB Predator 360
  • Keyboard
    Corsair Strafe Cherry MX Blue RGB
  • Mouse
    Corsair M65 Pro RGB (white)
  • Sound
    HyperX Cloud 2
  • Operating System
    Windows 10

Recent Profile Visitors

1,433 profile views

jonathan13's Achievements

  1. Appreciate the explanation! Unfortunately, I tried every level of LLC and it didn't help; it was already set to level 3 from when I overclocked the CPU. Oh well. I'm leaning towards the PSU being bad. I'm gonna snatch one up and test it out, and if that fixes it, I'll RMA this one. If it doesn't, I'll throw the whole computer out the window.
  2. I'll give it a shot. Any reason you'd suspect Vdroop when it's happening from an overclocked GPU? The two don't seem related to me (not saying you're wrong - just thinking out loud; there's a quick Vdroop sketch after this list). Also, if it were a Vdroop issue, wouldn't it just bluescreen instead of cutting off? I've never had an OC cause a computer to just cut off.
  3. Looking for a little help troubleshooting my gaming rig. The specs are as follows:
       • 8700K OCed to 4.9GHz
       • 32GB HyperX Fury OCed to 2400MHz
       • Asus Z370-I mini-ITX board
       • Nvidia Titan Xp
       • Seasonic Focus Plus Gold 650W PSU
     When I overclock my Titan and try to game, the computer shuts off like someone has flipped the power switch on the back of the PSU. It boots right back up after this happens. However, if I don't OC the Titan at all, it games fine and doesn't cut off. I know my OC is stable because Superposition stress testing confirms it, and to be honest I never even reached the limit of stability before settling on +125 on the core and +500 on the memory. I'm heavily leaning towards a bad PSU, since I'm not experiencing a crash but a full power cut (see the rough power estimate after this list). The reason I'm doubting myself is that this Seasonic unit is a Tier 1 PSU. It's not a garbage PSU by any means, and I'm nowhere near the 650W this unit has to give. Any suggestions or insight would be greatly appreciated!
  4. For anyone who stumbles upon this post in the future: as of now, there is no way to use two controllers at once. Unless you want to run splitters and give up individual fan control, it is impossible with the current software. It's a shame Cooler Master F'ed this up. I think it is inexcusable that they will sell you two kits for $180 with no mention of the fact that you cannot control all of those fans. Sad. I ordered RGB splitters, and now all 8 fans in this build are running off of the ASRock on-board RGB headers on this X370 Taichi board. It works perfectly fine this way, although you do lose individual fan control.
  5. I understand what you mean, but does Cooler Master then expect me to purchase two kits ($90 apiece), and then also purchase splitters separately to add a second one of their own kits? In addition, if I do that, there is no way to individually control each fan, which is one of the selling points of this setup. The manual is the most lacking piece of informational literature I have ever seen.
  6. In my current build, I have two sets of Cooler Master RGB fans (3 fans per pack). This means I am running two Cooler Master RGB "controllers". The software Cooler Master provides for these fans and controllers only recognizes one of the two. I have no way to select the other controller and the fans hooked up to it; when I select a new color and hit apply, it only applies to one of the controllers and its fans. Any help would be GREATLY appreciated!
  7. That is what I am leaning towards thinking at this point. Neat that they decided to give the full core. I just wonder if this is a normal occurrence and people with 3GB 1060 laptops just haven't noticed they have the full 1280 CUDA cores.
  8. So more than likely, even though this particular laptop seems to have the full 1060 core, there isn't a useless, disabled 3GB of VRAM hanging around? Also, do you know if this is a laptop-wide thing with the 1060 using the full core, or just something MSI has chosen to do in this case?
  9. You think disabling VRAM being dumb is a reason a large company wouldn't do it? lol
  10. Right. That is what I am wondering. Is this something that only MSI has done, and does it leave the possibility that another 3GB of VRAM has simply been disabled to hit a price point?
  11. As everyone knows, Nvidia no longer distinguishes the laptop versions by adding an "M" to the name. With that being said, I think you are asking if it is the MXM version of the card, which it is not. This GPU is soldered to the main board.
  12. I hope it's okay that I post this question here rather than in the laptop section; I feel like the people better equipped to answer are probably here. Either way, if it isn't okay, mods please move this to the laptop section. I had a comment on one of my YouTube videos today where I had reviewed an MSI laptop. The commenter pointed out that he had the same laptop with the same 3GB GTX 1060. He noted that when he looked at software showing the particulars of his GPU, it stated he had the full 1280 CUDA cores that normally come with a 6GB GTX 1060, rather than the cut-down 1152 (there's a quick SM-count check after this list). Is this standard on the laptop versions of the GTX 1060, or is it because MSI has used the full GPU and just disabled 3GB of VRAM? You can probably tell where I am going with this: I wonder if it would be possible to flash the internal 1060 to unlock the full 6GB of VRAM. Obviously, if this didn't work, it isn't as simple as buying a new 1060 to replace the one you just bricked, so I don't have the intestinal fortitude to try it. Regardless, I am still curious. Any insight would be greatly appreciated!
  13. Comparing shared system memory to dedicated VRAM on a discrete mobile GPU is apples to oranges. The discrete GPU would, in most scenarios, always do better than the integrated GPU on the CPU die; the exception would be an older discrete GPU. In other words, your integrated GPU isn't going to be powerful enough to ever need 8GB of VRAM.
  14. I'm running whatever the latest update of CC is; sorry, I am out of town for the holidays without access to my rig, so I can't give you an exact build number. Currently, I am running an EVGA GTX 1080 FTW Hybrid, but I have also used the non-Hybrid 1080 FTW, the 1070 Superclocked, the 1060 Superclocked, and the MSI 1050 Ti Gaming X. All ran fine with CUDA acceleration on the latest version of CC. I am sure it is something simple that we are all missing. I haven't been using Premiere Pro long enough (6 months) to give any useful suggestions, unfortunately.
  15. matrixmodulator - Unfortunately, I cannot comment with a solution, but I can tell you that I have used several of the Pascal-generation cards with Premiere Pro CC. For me, it automatically recognized that I had a CUDA-equipped card and set CUDA acceleration as the default. I have also used Polaris with PP CC, and when switching back to Pascal, it again automatically detected the card and switched the option to CUDA acceleration (there's a quick nvidia-smi check after this list for confirming the GPU is actually being used). Good luck brother! It is definitely possible, and I am sure that someone here will be able to provide some help.
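A minimal sketch of the Vdroop arithmetic mentioned in posts 1-2, assuming Intel's loadline model; both numbers below are made up for illustration. LLC works by lowering the effective loadline resistance:

```python
# Loadline model: Vcore_under_load = Vset - I_load * R_loadline.
# Both constants are hypothetical, just to show the shape of the effect.
V_SET = 1.30         # volts dialed in BIOS (assumption)
R_LOADLINE = 0.0016  # ohms; higher LLC levels shrink this effective resistance

for amps in (40, 80, 120):
    vcore = V_SET - amps * R_LOADLINE
    print(f"{amps:>3} A load -> {vcore:.3f} V at the cores")
```

Whether a sag like that bluescreens or hard-cuts depends on what gives out first, the CPU or a PSU protection circuit.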
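A back-of-the-envelope power estimate for the rig in post 3; every wattage here is an assumption, not a measurement, but it shows why millisecond transient spikes can trip a 650W unit's protection even when steady draw looks safe:

```python
# Rough steady-state draws (assumed, not measured).
parts = {
    "i7-8700K @ 4.9 GHz under load": 150,
    "Titan Xp at stock power limit": 250,
    "Titan Xp OC headroom":           50,
    "Board, RAM, SSD, fans":          75,
}

steady = sum(parts.values())
spike = steady * 1.4  # GPUs can transiently spike well past steady draw

print(f"Estimated steady draw: {steady} W")
print(f"Possible transient spike: {spike:.0f} W vs a 650 W rating")
```

If a spike crosses the unit's over-current or over-power trip point, the PSU shuts off instantly with no bluescreen, which matches the symptom described.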
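For the core-count question in posts 7-12, here is one way to check, assuming pycuda is installed (pip install pycuda) alongside a working NVIDIA driver. Pascal carries 128 CUDA cores per SM, so a full GP106 (10 SMs) reports 1280 cores and the cut-down variant (9 SMs) reports 1152:

```python
import pycuda.driver as cuda

cuda.init()
dev = cuda.Device(0)
sm_count = dev.get_attribute(cuda.device_attribute.MULTIPROCESSOR_COUNT)

CORES_PER_SM_PASCAL = 128  # per-SM core count for Pascal-family GPUs
print(dev.name())
print("SMs:", sm_count)
print("CUDA cores:", sm_count * CORES_PER_SM_PASCAL)
```

Note this only reports enabled units; it can't reveal cores or VRAM that were fused off at the factory.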
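And for the Premiere Pro CUDA question in posts 14-15, a quick sanity check (my own suggestion, not an Adobe tool): poll nvidia-smi while a render runs; sustained nonzero GPU utilization implies the CUDA renderer is actually doing work:

```python
import subprocess
import time

# Sample GPU utilization once a second for ten seconds during a render.
for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(out.stdout.strip())
    time.sleep(1)
```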