Cany

Member
  • Posts

    57
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Contact Methods

  • Steam
    Heimdall
  • Origin
    DrPikaJu

Profile Information

  • Gender
    Male
  • Location
    Germany, Near Stuttgart

System

  • CPU
    i7-4700MQ
  • RAM
    16GB
  • GPU
    NVidia GeForce GTX 770M
  • Storage
    Samsung 850 Evo 500GB
  • Mouse
    Logitech G502 Proteus Core
  • Operating System
    Windows 10

  1. That is in fact correct: the tools show the physical clock frequency the sticks run at. The data rate, however, is double that, because a data transfer is triggered on both the rising and the falling edge of the clock. That's why it's called DDR (Double Data Rate).
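The clock-vs-data-rate relationship above can be sketched numerically (a toy illustration; the function name is made up, not vendor code):

```python
# DDR naming illustrated: the effective transfer rate is twice the
# physical I/O clock, because data moves on both clock edges.

def effective_rate_mt_s(io_clock_mhz):
    """Effective data rate in MT/s for DDR memory at the given I/O clock."""
    return io_clock_mhz * 2  # one transfer per rising edge, one per falling edge

# Example: a stick reported as 1600 MHz by monitoring tools
print(effective_rate_mt_s(1600))  # prints 3200 -> marketed as "DDR4-3200"
```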
  2. Hello guys, I need your help. I own a GTX 1080 FE manufactured by Gigabyte, and I don't like the temperatures this card reaches (82°C+) or the noise the blower-style cooler makes. I found out that the EVGA GTX 1070/1080 SC coolers are based on the same reference PCB design, and I found one nearby that I went to pick up. As you can see on the product pages, the cards are designed a little differently: I am missing the plate that covers the VRAM chips and the MOSFETs on the PCB. The cooler itself fits perfectly, and I asked the seller if he might still have the baseplate for VRAM and VRM cooling. He said it will work without it and that I should just rely on direct airflow from the fans. I am a little unsure and not willing to risk my GPU (mostly because I don't have the funds to source a new one if it fails). What do you guys think? Is it possible to use the better EVGA SC cooler without heatsink contact for the VRMs/VRAM? Thanks in advance!
  3. Take a look at the spec sheet I provided in the post. The card has 3x DP 1.4, which supports 4k@60Hz, and one HDMI 2.0b, which also supports 4k@60Hz. So the ports are there.
  4. I found that the 1060 3GB supports 8k at 60Hz. Theoretically the card should be able to handle four 4k streams at 60Hz. Why would they limit it to 3 monitors?
  5. I have this Zotac GTX 1060 3GB, which has 3x DisplayPort 1.4 and one HDMI 2.0b. It supports a resolution of up to 7680x4320. I want to run four 4k monitors at 60 Hz. Is this GPU capable of that? I don't need to run games, and only rarely a video on one screen. I haven't found anything useful on the internet yet, but my suspicion is that the 8k figure is given for 30 Hz, so I might only be able to run two 4k monitors at 60 Hz.
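The four-monitors question above comes down to link bandwidth, and a rough sanity check is easy to do (active-pixel math only, ignoring blanking intervals and encoding overhead, so real figures are somewhat higher; the 25.92 Gbit/s payload figure is the DP 1.4 HBR3 rate after 8b/10b coding):

```python
# Back-of-the-envelope bandwidth for one uncompressed video stream.

def stream_gbit_s(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (active pixels only, no blanking)."""
    return width * height * hz * bits_per_pixel / 1e9

four_k_60 = stream_gbit_s(3840, 2160, 60)
print(round(four_k_60, 2))  # 11.94 Gbit/s

# DP 1.4 (HBR3) carries roughly 25.92 Gbit/s of payload per connector,
# so each DisplayPort can drive one 4k@60 monitor comfortably; four
# monitors would use all three DP ports plus the HDMI 2.0b port.
```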
  6. You sound like someone who really wants this to be true, so much so that you might get yourself into trouble. You can't be sure: he might even sell you his OWN iPhone and later reclaim it with his girlfriend's help, and you'd be sitting there losing $250.
  7. The Apple account owner information is linked to the IMEI number of the phone, which cannot be changed. Also, you cannot reset a phone if you do not have the passcode, and it is very rare for someone not to set one. I would be highly suspicious that he is selling you some counterfeit Chinese knock-off, or just outright scamming you.
  8. Using only one stick, you will lose the benefit of dual-channel memory, which helped quite a bit on my laptop.
  9. I don't think your CPU is the problem. Do you have another PSU to swap in and check? Maybe borrow something from a friend just for troubleshooting purposes. How is it with gaming? Does it crash there often? The browser actually uses GPU-accelerated rendering to speed up pages, so borrowing a GPU from a friend to test would be good too.
  10. Inconsistent voltage can easily get you a BSOD with varying messages. I do not know how a PSU handles voltage drops from the wall, so that really might be the issue. (Or the UPS had a bug and just needed a restart?)
  11. Hello stereorainbow, and welcome to the LTT forum. This sounds like a pretty tough one to troubleshoot. My only remaining guesses would be a) a faulty motherboard or b) a faulty PSU. If you happen to have spare parts lying around, or another computer, I would start by swapping the PSUs. Those are the only two components you haven't tested yet, and it might explain why things improved after removing the 1660, since you don't draw as much power without it. Have you tried a CMOS reset? Worth a try, too.
  12. I'll try to make this quick. I own a Gigabyte Aero 15X v8. The SSD the laptop originally came with suddenly started to act weird (recognized but not accessible; not writable; sometimes not readable; needed several restarts to function properly; etc.), to the point where it failed completely. Luckily I was able to RMA just the SSD without sending in the whole laptop. After setting up the new system, I experienced weird framerate drops in games, like 144 -> 90 -> 144. I blamed it on bad thermal interface material and ordered some thermal pads. When they arrived I went to work and found two blown capacitors (pictures) near the VRMs for the GPU. The awkward thing is that the laptop functions properly (if you disregard the FPS drops). Gigabyte's RMA team is saying I "opened it up too much", even though I asked via their eSupport platform whether replacing the thermal interface voids the warranty, and they said it does not. While opening the laptop I was grounded, and I disconnected the battery and discharged all capacitors, so I am very confident I am not responsible for the damaged parts. So I have multiple questions: Does anyone know if those blown capacitors will affect or damage the laptop in the long run? Like I said, at the moment it functions properly. Can I solder new parts in? If so, does anyone know which ones they are? (Last option if Gigabyte refuses the RMA and it dies.) Does anyone know what the capacitors are for? Maybe not the GPU? I really think those are the real cause of the SSD dying; probably there was a stray current somewhere that damaged both them and the SSD. I appreciate any help.
  13. I may have found why it happened. My first undervolting attempts were made by changing individual points of the voltage/frequency curve. By doing that I increased the slope of some parts of the curve (as I wanted a lower voltage at earlier MHz stages). A steeper curve can result in performance loss (as mentioned here). So I changed my approach: I first overclocked the GPU to the point where I wanted the stock frequency to be, and then flattened all curve points after that. Same performance, with slightly lower temperatures now.
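The "overclock, then flatten" idea above can be sketched in a few lines (a toy illustration with invented curve values; real tools like MSI Afterburner edit the curve graphically, and `flatten_after` is a made-up helper):

```python
# Made-up V/F curve: voltage in mV -> requested frequency in MHz.
curve = {700: 1500, 800: 1700, 900: 1850, 1000: 1950, 1050: 2000}

def flatten_after(curve, target_mhz):
    """Clamp every curve point to target_mhz, so the GPU never requests
    more voltage than the first point that reaches the target frequency."""
    return {mv: min(mhz, target_mhz) for mv, mhz in curve.items()}

flat = flatten_after(curve, 1850)
print(flat)  # points above 900 mV are clamped to 1850 MHz
```

The effect is that the GPU tops out at the chosen frequency while idling at the lowest voltage that reaches it, instead of climbing a steeper, hand-bent curve.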