KzE
Member · 33 posts

1. Sooooo, after an incredibly stupid amount of testing by myself, bribing a two-man horde of drunk IT friends and months of despair (including questioning my sanity), it turned out to be a classic RTFM moment. Yep. Aaaaaanyway, the PC runs as it should, it is hanging on my wall, and I finally installed and played Witcher III, game of the year 2017. (The game most appropriate for this build, which you realise when you scroll aaaaaaaall the way up and read the time stamp of the first entry of this topic.)

But does it keep its cool? The main question of this build was always: will the cooling fins deliver enough cooling capacity to a system on fire? Gondor called for help, so I lit the combined beacon of FurMark and CPU Burner to see what happens. I ran both tests at the same time to simulate a heavy load which would never occur in a real-world application. The GPU stayed surprisingly cool while the CPU got much hotter, though considering the load, not too hot either. I wonder if anyone has an idea as to why the GPU is so much cooler.

The temps rise fast at the beginning, and as the pump starts to work harder the curve flattens. It ramped up to 4700 RPM at the end; unfortunately I did not record the RPM at each data point. The hotter the cooling fins get, the bigger the difference to ambient temperature (room at 21 °C) and the better the cooling efficiency. Compared to a forced-air-cooled system, the mass of the cooling fins, the pipes and about 3.5 litres of cooling liquid makes a big difference. This is why I think the system needed a full 1.5 h to reach thermal max: there was a lot of thermal mass to be "filled" first. I guess you could just hook up a water loop to a 200 L rain barrel and run your PC safely for at least 10 hours. Good idea for an episode of "bad cooling ideas", right @Linus?

At around 1 h 30 min it seemed to me that the CPU's temperature had reached a plateau; within 20 minutes it only went up 1 °C. I turned the burners off to see how the system would cool down. (graph past 90 minutes)

Thermal Imagery

The surface temperature was monitored with a FLIR camera module for a smartphone; it is the cheapest way to get thermal imagery. Looking at the gray graph you see some irregularities, which I think are due to the "cheap" module. A curve is noticeable, but there are some peaks, like around 45 min, that seem out of place. To visualize it all, I recorded the whole test as a thermal time-lapse video showing the full 2 hours: 20191010 Wall PC time lapse.MP4

You'll notice that some of the elements stay cold. This is because wherever there is an angle fitting, the brackets which transport the heat from the copper pipes to the cooling fins couldn't be installed. If you count the big squares as one full cooling element, there are a total of 19 elements. Of these, 4.5 are not used (marked blue), which is 23.7% of them. So almost a quarter of the potential is wasted. But at least it looks sexy.

At the beginning I wanted to use solid copper blocks the pipes would go into directly, eliminating the need for angle fittings and providing a much more direct and efficient heat transfer. But as mentioned earlier, that would have been 70 kg more and something like 1000 bucks in copper alone. I am glad the cooling capacity is enough nevertheless, but it made me think about how I could have activated the unused elements. The diagram only shows the copper loop, not the flex-tube loop to the pump in the back. The pic below shows the two different mounting ways.
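For anyone who wants to sanity-check the thermal-mass argument with numbers, here is a rough back-of-the-envelope sketch in Python. Only the ~3.5 litres of coolant is a real figure from this build; the copper mass, the heat input and the temperature rise are assumptions I picked for illustration, so treat the result as an order-of-magnitude estimate, not a measurement.

```python
# Rough warm-up estimate for the loop. Everything except the 3.5 L of
# coolant is an assumed placeholder value, not a measurement.
C_WATER = 4186   # J/(kg*K), specific heat of water
C_COPPER = 385   # J/(kg*K), specific heat of copper

coolant_kg = 3.5    # ~3.5 L of (mostly water) coolant, from the build log
copper_kg = 30.0    # assumed mass of pipes + fins actively in the loop
heat_in_w = 300.0   # assumed heat dumped into the loop by FurMark + CPU Burner
delta_t = 25.0      # assumed rise from 21 °C ambient to a mid-40s plateau

# Energy needed to raise the whole thermal mass by delta_t, ignoring
# losses to the room (so this is a lower bound on the warm-up time).
energy_j = (coolant_kg * C_WATER + copper_kg * C_COPPER) * delta_t
print(f"~{energy_j / heat_in_w / 60:.0f} min to 'fill' the thermal mass")
```

With these assumptions it comes out around half an hour; in reality the fins dump more and more heat into the room as they warm up, which is why the real curve flattens and the system took the full 1.5 h to plateau.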
I will come back with photos of the finished product. I'm still working out how to do the backlighting. Does anyone know a good LED strip system I can hook up to the PC? I need about 5 m of strip though, so all these solutions with 50 cm LED strips from PC tuners don't really cut it, or are too expensive.
2. In this thread a Vectorworks employee says the following: "We don't (and likely won't) support the more advanced features in the RTX line before that card hits end of life anyway, .." https://forum.vectorworks.net/index.php?/topic/61654-graphics-card-gtx-1070-ti-or-rtx-2060/
3. So I tried my GTX 1080 in a friend's PC and there it does 125 fps. This means the GPU itself runs fine. But the same card in my wall PC, with an ordinary Corsair 650 W PSU, only delivers 4 fps. This indicates a problem not related to my dual 250 W mini-PSU setup... no? It just doesn't make sense to me.
4. I did the whole thing with DDU (Display Driver Uninstaller) every time, and installed:

EVGA driver:
- newest
- the version that worked two years ago
- oldest version

nVidia driver:
- newest
- the version that worked two years ago
- oldest version

FPS / Watts: with every version the fps never went past 6 in the stress test with the GTX 1080. I tried my friend's GTX 970 again and got 22 fps out of it; it drew 278 watts. My GTX 1080 got 6 fps and drew 156 watts. I find this very interesting, as if the card is somehow throttled. For this whole setup I used a brand-new, never-used 650 W Corsair PSU, so there is no way my dual mini-PSU setup has anything to do with it (the mini PSUs were entirely disconnected). HWMonitor received all sensor data with every driver version and the card was never too hot. The device manager showed all versions as installed without any problems. What else can I do?

GTX 970 results
GTX 1080 results
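If anyone wants to dig into the "somehow throttled" suspicion, here is a minimal monitoring sketch using NVIDIA's NVML Python bindings (pip install nvidia-ml-py), run while FurMark is looping. GPU index 0 and the 30-second sampling window are my assumptions; `nvidia-smi -q -d PERFORMANCE` reports the same throttle reasons without any code.

```python
# Poll power, clock, temperature and throttle reasons once per second.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the card is GPU 0

REASONS = {
    pynvml.nvmlClocksThrottleReasonSwPowerCap: "power cap",
    pynvml.nvmlClocksThrottleReasonHwSlowdown: "HW slowdown",
    pynvml.nvmlClocksThrottleReasonSwThermalSlowdown: "thermal slowdown",
}

try:
    for _ in range(30):  # sample for ~30 s under load
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0
        mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        active = [name for bit, name in REASONS.items() if mask & bit] or ["none"]
        print(f"{watts:6.1f} W  {mhz:4d} MHz  {temp:2d} °C  throttle: {', '.join(active)}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

A 1080 sitting at 156 W and 6 fps in FurMark while staying cool would normally show a throttle reason or a suspiciously low core clock here, which narrows things down more than fps alone.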
5. Sounds intriguing. We didn't install GeForce Experience, though.
6. So I ran into really big problems with this build. In this topic I am asking people for help: https://linustechtips.com/main/topic/1046103-gpu-or-psu-causes-crash-wall-pc/ I separated it from here so this thread stays a build log only.
7. For two years I have been building a wall PC. Now I have problems with the hardware and I am seeking your help, since I seem to have tried everything and have no solution. The build log of the PC is here so you know how complex it is; you really should take a look there first: https://linustechtips.com/main/topic/744224-wall-pc-build-log/

What's the problem:
- The computer boots normally and works fine without the GPU driver installed.
- As soon as the GPU driver (EVGA GTX 1080) is installed, the computer crashes either during startup or shortly after login. No error code or bluescreen, just black.
- The PC runs stable in safe mode (where presumably the GPU driver is not active).

What I (and friends) have tried:
- Replaced the GPU driver with the newest driver; tried the Nvidia as well as the EVGA driver.
- Replaced Win7 with Win10 (works until Win10 goes online and updates drivers).
- Updated the BIOS.
- Tried a different monitor.
- Tried different monitor cables.
- Tried HDMI / DVI.
- Disabled the on-board Intel graphics.
- Tried the RAM in a different PC to check if the sticks are good; they are.
- Tried different SSDs.
- Disconnected all non-vital SSDs.
- Prime95 stress-tested the CPU, all good.
- Tried all PCIe riser cables by themselves and in combination, all good.
- Completely disconnected the two HDPlex 250 W PSUs and ran the entire PC through a regular PSU (see pic), no change.
- Borrowed a working GTX 970 from a friend, set up with the dual PSUs, still crashes.
- Put the GTX 970 in the PCIe slots directly, still crashes.
- The GTX 970 with a regular PSU works (even with the PCIe riser cables).

So the only stable configuration is when both the PSUs and the GPU are replaced. But all this at least eliminates the following error sources:
- Mainboard
- CPU
- RAM
- SSD
- Riser cables

What really haunts me is that initially, in 2017, the computer worked in the same configuration before the frame was completed (see table test pic). I used FurMark to stress the GPU and Prime95 to stress the CPU and proved that it works under simultaneous load. At that time I didn't have the cooling rig, so I used 3 fans on a radiator. Those fans are not used now, which results in less power usage. This eliminates the following error source:
- Not enough power from the dual-PSU setup.

The power draw graph of the GTX 1080 suggests that power could be an issue (pic 2), even though it was already proven two years ago that it isn't. So I hooked up the regular power supply to the GPU alone and ran the rest of the PC through the other PSUs:
- The PC doesn't crash, but the GPU performs at only 4 fps in FurMark.

You likely want to point out the special setup with the two micro PSUs connected through the add2psu adapter. But remember, the PC still crashes when those are taken out of the equation entirely, and initially the setup worked flawlessly with those two PSUs two years ago. A test done by eteknix.com with the same mobo, CPU and a GTX 980 Ti (same power draw) shows 368 watts power draw under full combined load (pic).

List of hardware (and which PSU each part is hooked up to, see attached assembly diagram):
- PSU: 2x HDPlex 250 Hi-Fi DC-ATX, 250 W each, max. peak 300 W (www.hdplex.com)
- PSU connector: add2psu (http://www.add2psu.com/); this connector does not share the power!!
- Power: 2x Dell PA-9E AC adapter, 240 W (recommended by HDPlex)

Master PSU:
- Mainboard: ASRock Z170 Extreme7+ (LGA 1151, ATX)
- CPU: Intel Core i7-6700K Box (LGA 1151, 4 GHz, unlocked), 100 W
- Storage: Intel SSD
- RAM: HyperX Fury (2x 16 GB, DDR4-2133, DIMM 288)

Slave PSU (201 W total components, max. peak 296 W):
- GPU: EVGA GeForce GTX 1080 FE (8 GB), 178 W, max. peak 273 W
- Pump: EKWB EK-DBAY D5 PWM, 23 W

PC assembly diagram
Original test setup with dual stress test (both GPU / CPU) successful for 4 hours:
Stress test at the time:
GTX 1080 power consumption
When I used a regular PSU (didn't help)
Mainboard diagram
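To make the rail maths explicit, here is a tiny Python tally of the slave rail using only the wattages quoted in the list above (nothing here is measured, it is just the quoted numbers):

```python
# Slave-rail budget from the hardware list above.
ADAPTER_W = 240  # Dell PA-9E AC adapter feeding each HDPlex

slave_typical = {"GTX 1080 (typical)": 178, "pump": 23}
slave_peak = {"GTX 1080 (max. peak)": 273, "pump": 23}

for label, rail in (("typical", slave_typical), ("peak", slave_peak)):
    total = sum(rail.values())
    print(f"slave rail {label}: {total} W "
          f"({total / ADAPTER_W:.0%} of the 240 W adapter)")
```

At typical draw the slave rail sits at 201 W, about 84% of its 240 W adapter; at the quoted peaks it hits 296 W, which is within the HDPlex's 300 W peak rating but briefly more than the adapter is rated for continuously. As said above though, the crash persists even with both mini PSUs completely removed, so this alone can't be the whole story.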
8. Thanks guys. @cluelessgenius for now it works, and if I need to replace it I will look into that. Thanks. @lenien77 actually two years now... it's not so much fun anymore, because right now I am dealing with lots of stupid problems -> next post
9. Insert locking

There's light gleaming at the horizon, dear friends of the computer crafts! Both the cooling rig and the hardware insert are finished. Now they need to marry. The rig has been sitting on my CNC forever, and the CNC projects are piling up. A first test showed that the pump somehow didn't want to fit, so the opening in the glass had to be widened. With this thin mill it worked pretty well. Once it fit, we attached two clear blocks which turn to hold the insert in place. The aluminium plates are always in place and prevent the insert from falling out in the other direction. To be able to work on it, we propped it up.

Oh, and yeah, tadaaa:

But will it blen.. I mean, boot?
10. Water test

First I had to switch the pump's in and out, since the CPU block cares about the direction of flow. Gosh, so needy. RTFM, right?

Time to test whether I tightened all connections correctly everywhere. With all these fittings I have about 30 of them, and some 30 more in the cooling rig. If you do this, run the pump with a jumper adapter; in the picture it is the 24-pin Molex with the red cable (just to the right of the dry coupling). 5$ or so, really handy. This way your hardware has no power, and if there is a leak, no harm will be done. It ran for about 36 hours successfully.

Pic shows the process of the first fill. Oh, and look, a folded PCIe riser cable and a big moist turd of foreshadowing.

I used this bucket so any air in the piping can escape here.

Front view with the CPU/GPU piping. Definitely looks cleaner now that it is filled.

I had to find out that coloured fluid apparently settles pretty fast; here after 2 days or so. But it mixes up again within seconds. Is this normal?
11. Cable management

Here's my excuse for cable management. It sure cleaned things up, but there's just too much, and it'll be hidden behind it all anyway. The only thing that bothers me a bit is that the cables will be visible from the front when they cross over from behind one tray to the next. (Look at the renderings and you'll know what I mean.) I might try to put them all close together later on to create a similar appearance to the copper piping.

Have you ever had access to a labeling machine? People, don't do it. It's addictive. I have many random things in my apartment now that are labeled. Like "chair", "bowl", or "cat". Keep your hands off it, or use only under supervision.

Yeah, take this, 3-feet riser cable! So ghetto... any ideas how to make it hold that fold? Yes, it really needs to be that way around. No, there's no performance drop with one or more riser cables; I did my research.

Also, the flex tubing is in. Quick and dirty. Don't judge me. The dry couplings will allow me to take this insert out of the main cooling rig. They are by alphaUNcool. What a shitty product, but I couldn't find anything else. The sealing came off on two of them after just a few uses. 2x 30$ gone. But once you've started such a project you just keep throwing money at it here and there and apathetically stop counting.

Here's the riser cable on the other side. Took me half a day to find a good-looking black one. After all, this is the prominent side. Plugged into a nearby PCIe slot; apparently they are all 16x, props to my adopted IT guy for the clarification.
12. Power supply

What a mess in the back. Before I clean that up, the two laptop power adapters need to go in. After that I'll see how much space I have left to do the "cable management". The quotation marks let you know that I'll be stretching that term.

A while ago I discovered one can buy velcro in bulk, on a roll. Guys, and ladies, this will change your life forever. It ties just about anything and is really easily detached and attached and detached and attached and detached and attached and detached and attached and detached... reusable, you might say.

In the end there will be a cable bundle exiting below the PC, but I want as few cables as possible. I guess I'll even overcome my disgust for wireless peripherals. Two power cables are too much, and the cord from the power adapter is too long too.

Power adapter cable, soldered and double shrink-tubed. What's the thin green one for?

I made the classic power cords into a Y: one end for the wall plug and two ends for the two power adapters. I am not putting the pictures of that modification here; they are NSFW.