minutellim (Member · 31 posts)

Everything posted by minutellim

  1. I've got one coming Saturday from Amazon, so I'll see what happens.
  2. When I have the onboard graphics enabled alongside a standalone card like I do, is the game only using one chip? I was under the impression that each one would simply drive whichever monitor was plugged into it. Also, wouldn't I have had the same problem before I replaced the monitor if that were the case?
  3. Hey guys! Normally, I'm pretty good with hardware, but this has me a little stumped. I had two 1080p monitors and one 1440x900 monitor hooked up to my computer. Today, I bought a 1080p monitor of the same type as my others to replace the 1440x900. When I plugged in the new monitor, my FPS in LoL went from the 150 range down to 30-40. Anyone know what gives? I knew replacing the 1440x900 monitor with a 1080p one would be a bit harder to run, but I think something else is going on here. I have two of them plugged into my 750 Ti (I'm a filthy casual), and the other is plugged into the onboard graphics. I have tried having two plugged into the onboard and one on the 750 Ti, but I get the exact same results. The only combination I haven't tried is having all of them plugged into the 750 Ti, because I don't have a means to do so. Thanks in advance! Specs: i5-6600, 16GB DDR4, GTX 750 Ti, ASRock H170M
  4. There are several results on Google if you search for "Wireless Bridge". Here is one that involves flashing your router to DD-WRT, but I think you can do this with several commercial APs. http://www.cnet.com/how-to/reuse-an-old-router-to-bridge-devices-to-your-wireless-network/ -EDIT- Here's just one example of a commercial product that can do this straight out of the box: http://www.amazon.com/D-Link-Wireless-Gigabit-Extender-DAP-1522/dp/B001769K3O It's a coincidence, but this is on the front page right now.
  5. I'm almost positive that you can do this with an access point set to client mode. It will pick up the wireless signal, and then you can connect an ethernet cable to another computer or a switch that can push that signal to multiple computers.
  6. Haha! I know that feeling all too well. I think it happens to all of us. Good luck with your builds. I recently went through and built a pfSense box and a NAS, so if you have any more questions, just ask me.
  7. I gotcha. I'm still confused as to why you needed three of those 4-port NICs that you linked, though.
  8. Ah, I think I see what you're trying to do. You're wanting to just plug all of your devices into the router itself, right? While pfSense can do bridging between the ports, it's kind of clunky and not really what it's for. A router should route traffic between different networks, not different devices. When people add more LAN ports to their pfSense router, they're usually setting up multiple LANs. For a home user, most of the time you only want one LAN, so you only need two ethernet ports on your router. One will go from your modem to your router, and one will come out of the router and go into a network switch. Then, all of your devices will connect to the switch.
  9. What NICs are you buying that are more expensive than buying a pfSense router? My suggestion is to build a dedicated router and build a dedicated server.
  10. I don't think this is something you should consider. I think that it's doable, but it adds so much confusion to things that I just don't see how it's worth it, especially for someone that hasn't used any of this kind of software before. I've looked at a lot of mini-ITX motherboards, and I'm guessing you're looking at something like the ASRock C2550. All of the ITX boards that I have seen have a dedicated port for IPMI which, as far as I know, cannot be used in a typical networking capacity like you're expecting. Both FreeNAS and pfSense advise against running them in VMs, let alone both. I say don't do it.
  11. Did you happen to read the full requirements in that forum post about hosting a DayZ server? If everything in that thread is true, there's almost no way you're running one from home, unless I'm confused about something.
  12. I did not know this. I have a Skylake system booting unRaid from a USB drive, and I had no problem installing Windows 10 from USB on a Skylake system either. Is this specific to FreeBSD?
  13. Unless he expects his website to be getting hundreds of hits at a time or querying large data sets, I think a G3258 with 8GB of RAM is more than enough.
  14. Yeah. I'm going to reiterate what @djdwosk97 said and suggest going with Intel. You could get a G4400 for $30 more, and it's a much better processor.
  15. I'm a little confused. You now have it running on a 2600K instead of the A4-3300 you mentioned in your previous post? That's still a very good processor.
  16. For what he's wanting to do, I think this is entirely overkill. If he's just hosting a TS server and a website for his friends, he could get away with an embedded system for next to nothing. It wouldn't be future proof, but I don't think one of the J1900 Celeron solutions would even break a sweat. He'd also be able to run it with all desktop components. I don't think a Xeon and ECC RAM are what you need here.
  17. That's even overkill for one Plex stream. Theoretically, that could do around three streams at 1080p (rough math in the sketch below this list). I prefer overkill in my builds, but if you wanted to scale it back to a Pentium G4400, that will easily do one 1080p stream and still have a good bit of headroom. You could probably get two out of it, but I haven't tried it myself.
  18. I agree. I think unRaid is worth the money. It is a terrific piece of software, especially with the virtualization built in. It's incredibly easy to use, and you can pretty much get it set up and forget that it's even there. My problem comes down to whether or not I think it's right for my use case, and I'm not sure the answer is quite as clear. I've been thinking about trying this. My unRaid trial is going to expire this weekend, so I might go with this option until I decide whether I want to go with unRaid or FreeNAS.
  19. I recently built a NAS, and I have been running unRaid so far with no problems. I generally like it, but for some reason I am having trouble bringing myself to shell out the $60 for something that I can do for free with other platforms. I'm also currently out of SATA ports on my motherboard, so I'd need an HBA card if I'm going to add more drives. My options are to continue using unRaid and just get an HBA for my motherboard, or I could get a proper motherboard, snag some ECC RAM, and switch over to FreeNAS. For some reason, I just don't like the thought of continuing to use the cheap ASRock board I bought for server use, and I'm really hung up on spending the $60 on unRaid even though I think it's a fantastic product. The only thing I use unRaid for, other than the hard drive pooling, is virtualizing one Ubuntu VM for hosting my web projects, but I believe I can do that with jails. The motherboard/CPU currently in the system wouldn't be wasted, as I would probably use them to build a programming rig anyway, since I'm tired of connecting my laptop to an external monitor and such whenever I want to work on projects.
  20. The build looks good for the most part. I ended up just going for the 2TB version of the Reds. If you need more storage, it might be good to shell out for the 4TB version. It seems that the 3TB version of the Reds has a much higher failure rate than the other capacities. As for the separate LAN, you can do this fairly easily. You can buy a router or a switch that supports VLANs. Then it would just be a matter of setting up your room to be on a separate VLAN. The two networks wouldn't have any way of talking to each other, so you wouldn't be able to use a network printer or get files off of other computers that aren't connected to that network without additional configuration. I think this is what you want, though. You'd still have internet access through your personal LAN.
  21. I searched for a nice NAS case for a while. I wanted to go with a micro ATX motherboard so that I could have more expansion slots compared to mini-ITX. The Node 804 was really the only micro ATX case that I was even considering. In the end, I decided that I didn't absolutely need the additional expansion slots and went with the mini-ITX Node 304, and the only reason I did that was because I got one for $40 versus the $110 price tag of the 804. In my research, the only options I was considering, if I needed something other than mini-ITX, were the 804 or a full-sized chassis.
  22. As others have said, get a different power supply and use the Intel i3. It's just better, and a little bit cheaper. There are EVGA Bronze power supplies that can be had for <$50. The case is nice. I almost went with it, but I decided on getting the Node 304. It can only hold 6 drives, but that gives you a good amount of expandability at a lower price. I figure 6 drives, especially if you're using 4TB drives, will last you several years. While others have said to dump the SSD, I have one running as a cache drive and generally like it. There's almost no need for it, though. I really only get about double the transfer speed, which only really matters for large files, and I rarely transfer very large files.
  23. That's pretty much my build, but I went with a Skylake i3 instead of the older version. There's not much gain in terms of performance, but I figured it's better to go with the newer socket, especially since it's about the same price. I notice you only have two drives. Are you going to be doing any redundancy, or are you just going to put both drives in there? I'd recommend getting three if you want a parity drive (see the capacity sketch below this list).
  24. His speed test is in kbps, not Mbps. There's a problem somewhere unless he's on an ISDN connection or something (quick conversion below this list).
  25. A couple of months ago, I built a simple server that I was using as an all-in-one NAS, web server, and Plex Media Server. When I first built it, I put it into a really neat looking HTPC case that sits in my entertainment center. The HTPC case only has room for 2x 3.5" drives, which didn't seem like a problem at the time, but now that I have thrown a Blu-ray drive into it, there is only room for the one drive, which is quickly filling up. I am now faced with the decision to either move the components of the server to a larger case and keep the all-in-one solution, or build an entirely new server to use for NAS and other server-type stuff, keeping the media center intact. I know this ultimately comes down to my decision, but what do you guys typically prefer? I really like the look of the Silverstone HTPC case that I'm using, so I kind of hate to remove it from my entertainment center, but its components are way overkill for what it would be doing if I kept the system, since its functionality would be relegated to a simple Plex server, something the all-in-one would do very well, too. I also kind of like the idea of having them separated, just for more fun tweaking systems, but I thought I would see what you guys typically think.
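
On post 17 above: here is a minimal sketch of the rough math behind the stream estimates, assuming the commonly cited Plex guideline of roughly 2,000 PassMark points per simultaneous 1080p transcode. The G4400 score used below is an approximate figure for illustration only, not a measured result.

```python
# Rough Plex transcode headroom, assuming the commonly cited guideline of
# ~2000 PassMark points per simultaneous 1080p transcode.
PASSMARK_PER_1080P_TRANSCODE = 2000

def max_1080p_transcodes(passmark_score: int) -> int:
    """Estimate how many simultaneous 1080p transcodes a CPU can handle."""
    return passmark_score // PASSMARK_PER_1080P_TRANSCODE

g4400_score = 3700  # approximate PassMark score for a Pentium G4400 (assumption)
print(f"Pentium G4400: ~{max_1080p_transcodes(g4400_score)} comfortable 1080p transcode(s)")
```

That lines up with the post: one comfortable 1080p stream with headroom to spare, and maybe a second if you push it.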
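On post 23 above: a quick sketch of why a third drive pays off with single parity, assuming an unRaid-style array where the largest drive is dedicated to parity. The drive sizes are placeholders.

```python
# Usable space in a single-parity pool (unRaid-style): the largest drive is
# dedicated to parity, and every remaining drive counts toward storage.
def usable_capacity_tb(drive_sizes_tb):
    """Usable capacity when the largest drive is used for parity."""
    if len(drive_sizes_tb) < 2:
        raise ValueError("need at least one data drive plus one parity drive")
    sizes = sorted(drive_sizes_tb, reverse=True)
    return sum(sizes[1:])  # everything except the (largest) parity drive

print(usable_capacity_tb([4, 4]))     # two 4TB drives   -> 4 TB usable (half the raw space)
print(usable_capacity_tb([4, 4, 4]))  # three 4TB drives -> 8 TB usable, same protection
```

With only two drives, parity costs you half your raw capacity; with three, you keep two thirds of it for the same level of protection.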
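On post 24 above: the unit mix-up is a factor-of-1,000 difference. A quick conversion, using a made-up reading purely for illustration:

```python
# kbps vs. Mbps is a factor of 1000; the reading below is made up.
reading = 512
print(f"{reading} kbps = {reading / 1000} Mbps  (dial-up/ISDN territory)")
print(f"{reading} Mbps = {reading * 1000} kbps  (a fast modern connection)")
```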