Alphy13

Member
  • Posts

    13
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Alphy13's Achievements

  1. Update: I took the whole thing apart. Once it was out on the table, it booted up just fine. I put it all back together one piece at a time, making sure it would still boot at each step. In the end, the whole thing worked with no issues. A couple of days later, it went back to its strange almost-boot loop, so I tried shorting the PS_ON pin, but this didn't change anything. Should I replace the motherboard, the CPU, or both and just go with current gen?
  2. My computer won't power on. Any troubleshooting tips? How can I check the motherboard or CPU without a replacement? I left my computer with a game paused, and when I came back it was stuck continually trying to power on. It cycles through a bunch of boot codes that aren't listed in my motherboard manual for about 5 seconds, and then the power cuts out. It never even gets to the BIOS screen. I tried unplugging EVERYTHING: GPU, USB (internal and external), all of my drives, displays, front panel; no change. I hooked the power supply up to a voltmeter and found a stable 12 V and 5 V on each rail while the computer did its attempted boot (see the rail-tolerance sanity check at the end of this list). If I unplug the motherboard power connector and short the PS_ON pin, the supply stays on with the same voltages for as long as the pin is shorted. This is true with or without my other devices plugged in and drawing power. I'm not sure how I can test my motherboard or CPU without buying a replacement, not knowing which is the issue. Please help!!! i5-6600K, GA-Z170X-Gaming 7, HyperX FURY Black (2x8GB), EVGA 970 4GB SSC Gaming, EVGA 220-P2-0650-X1 SuperNOVA 650 P2, Samsung 850 EVO 250GB 2.5-Inch SATA III Internal SSD. (I normally have an additional set of RAM, a Blu-ray drive, and an assortment of HDDs and NVMe drives, but I disconnected it all for troubleshooting.)
  3. Sorry for the necro, but this is still a top result on search engines. My experience with nice aftermarket coolers has been very different: to keep the card from thermal throttling, you have to run the cooler at noise levels noticeably above those of an AIO. I have an AIO on my GPU and it works great. Best I can tell, the forums are full of people who fall into one of two camps: do EVERYTHING (full custom loop with top-of-the-line water blocks) or do the minimum (buy the best off-the-shelf parts and go about your day). Both are totally fine, but that doesn't mean the middle-of-the-road options are worthless. The noise reduction and longevity I get from having an AIO on my GPU are well worth it to me. And I can keep the AIO when I upgrade my GPU and save $50 by buying a card with a less fancy aftermarket cooler. Perhaps what I'm missing is volume levels: if you exclusively use headphones, or play your games at high volume, maybe you don't notice the difference in noise coming from your computer.
  4. That's some impressive cable management... How long is the run from the PCs to the radiators? Have you had any issues with the coolant? @AnonymousGuy seems to have had some issues with his, but @AntVenom didn't seem to have too much trouble with it.
  5. Using heat exchangers between the fluids would let you use a different type of block in each system, and I think it would be safer overall. However, it would add a tremendous amount of cost and complexity: the exchangers are expensive themselves, and they remove the main advantage of the aquarium, since you'd now need a reservoir for each PC, which adds to the maintenance burden. As long as you know what's in the "radiator," for lack of a better word, you can avoid galvanic corrosion. The radiator is probably copper, so using copper blocks in your PCs would solve that. Thanks for bringing this up, as galvanic corrosion could easily become an issue in any project like this. Using antifreeze is difficult because it is glycol-based and will eat up flexible tubing and plastic pump parts. I would hate to lock each of those stations into place relative to each other by using hardline tubing.
  6. My concern with NN is that in attempting to open up the internet, we will actually restrict it. Right now, it is partially outside the government's domain. NN would trade corporate regulation and throttling for government regulation and throttling. If one ISP starts restricting you in ways you don't like, you can choose another. The NN bill, as it was written, would need to be enforced, and it would have allowed the government to do so. How do you enforce it? The government could easily require ISPs to put up hardware that tracks all traffic and makes sure everything is "neutral," but where does that stop? As it is, the main factors keeping new ISPs from introducing new technologies and more competitive services are all the existing restrictions and regulations. More regulation is going to put more power in the hands of the huge corporations.
  7. Definitely not! But it's a cheap way to get a big reservoir. I think you will need a relatively large thermal mass to make this work (rough math at the end of this list). An aquarium is cheap and easy and should make maintenance easier.
  8. LTT's new LAN party room inspired me to put to paper an idea I've had for quite a while: whole room water cooling, revived. Each PC has two blocks and a pump. Each system's pump is independent, so it only runs when that system is on. The GPU can use a regular CPU block (consider something like the NZXT G12), and minimal air cooling will be sufficient for the rest of the PC. The heat exchanger then only has to worry about maintaining the temperature of the aquarium.
  9. This thread is for sharing ideas and projects for water cooling multiple systems. It was inspired by LTT's old "Whole Room Water Cooling." If you have any crazy ideas, or success with similar projects in the past, please share them here.
  10. I think the future will see a move to optical computing. Optical transistors have already been built and are extremely fast and efficient. Our internet traffic already travels as multiple different wavelengths of light. Adding a higher-energy wavelength to the line could act as the power source, and multiple streams could be processed as different wavelengths at the same time. Then information could enter and exit as light without much conversion to and from electricity. Our limitation is that the components required to convert from electricity to light and back again are very expensive... for now.
  11. Can you share how you have this set up? Do you have your 3 PCs in parallel or series?
  12. Monitors: Ultrawide 3440x1440. I have loved mine and I can't go back. It's far more real estate than 1080p but not nearly as demanding as 4K.
      Caching server: Everything about a server is meant to be remote. Put it in the server closet and just route the LAN party room's networking through it.
      Networking: Dual 10G from the caching server to a switch in the LAN party room, and then 1 Gig to each of the rigs (bandwidth math at the end of this list).
      GPU: The 1070 is a good call! Stick with it.
      Rig placement: Put the rig on top, next to the monitor. Show those off!
      AC: Put an air return at the top of the room. It will make a huge difference.
      The above is the practical version... The fun version would be to revive whole room water cooling with the heat exchanger.
      Just a note: don't be afraid of virtualization. Some of the most demanding loads in the world run on virtualized servers. I would recommend setting up two identical hardware servers in your server closet and running KVM or VMware on top of them. Then you can have plenty of VMs that can dynamically move between the physical machines as demand changes or if one fails. This works best with a separate storage server that hosts the VMs' boot drives, but you shouldn't have a problem there. It may sound complicated, but y'all have done virtualization really well in the past, and a setup like this will save you a ton of work on many projects to come. You can then use the same virtualization setup for network routing and the caching server. This is going to be a great project! Thanks for taking the time to hear us out.
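
A few quick numbers behind the posts above. First, the rail-tolerance sanity check mentioned in the no-boot post: the ATX spec allows ±5% on the 12 V and 5 V rails, so readings inside these windows mean the PSU side looks healthy and suspicion shifts to the board or CPU:

$$12\,\mathrm{V} \pm 5\% = 11.40\text{--}12.60\,\mathrm{V}, \qquad 5\,\mathrm{V} \pm 5\% = 4.75\text{--}5.25\,\mathrm{V}$$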
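
Second, the rough thermal-mass math behind the aquarium suggestion. The 100 L tank and the 500 W combined load are illustrative assumptions, not measurements; water's specific heat is about 4.18 kJ/(kg·K):

$$\Delta T = \frac{P\,t}{m\,c} = \frac{500\,\mathrm{W} \times 3600\,\mathrm{s}}{100\,\mathrm{kg} \times 4180\,\mathrm{J/(kg\cdot K)}} \approx 4.3\,\mathrm{K\ per\ hour}$$

So even with no heat exchanger running at all, a single gaming PC dumping its full load into the tank would only warm it a few degrees per hour, which is why a big, cheap reservoir buys so much headroom.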
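
Third, the bandwidth math for the dual-10G caching-server uplink, assuming the worst case where every rig pulls a full 1 Gb/s during a simultaneous game update:

$$\frac{2 \times 10\,\mathrm{Gb/s}}{1\,\mathrm{Gb/s\ per\ rig}} = 20\ \mathrm{rigs\ at\ full\ line\ rate}$$

so the uplink shouldn't be the bottleneck for a room of a dozen or two seats, provided the cache's storage can keep up.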