
arter97

Member
  • Posts

    6
  • Joined

  • Last visited

Awards

This user doesn't have any awards

arter97's Achievements

  1. I'm looking into building a home-made VPN server. I need about 9 RJ45 connectors on this computer to make an aggregated VPN (via Linux). I'm planning to use 8 connectors for connecting to the internet and the last one for feeding the home ASUS router.

     (It's a long story, but yes, I can use 8 connections from the ISP, and I have confirmed that using all of them at the same time results in linearly scaled performance, so it's not being held back by QoS or anything. I have another server at work, which is connected to the internet at Gigabit speed, so I'm planning to use MultiPath TCP to make an aggregated VPN server; see the MPTCP sketch after this list.)

     While I was looking for a suitable motherboard/NIC for this setup, BTC (mining) motherboards caught my eye. In Korea, where I live, a NIC with 4 RJ45 ports costs $150, while a NIC with only 1 RJ45 port costs $4 (both Gigabit). So to get 8 RJ45 ports, I can go the $300 route or the $32 route. The catch with the latter is that I need a motherboard with 8 PCIe x2 slots, and it seems like mining motherboards are perfect for this. Yes, I know that such slots go through the PCH, which would theoretically add latency, but since I'm aggregating 40 Mbit connections, I'm not expecting it to be a bottleneck at all.

     What I'm wondering is whether there are any gotchas with such a setup. Maybe BTC motherboards only work with graphics cards? Has anyone run into problems with a similar setup? Thanks.
  2. I'm looking for a new laptop under a kilogram for ultimate portability. The only feasible option seems to be the 2018 LG Gram, but its 1080p screen resolution is a major deal breaker for me. It seems like there are no laptops with a 1440p or 4K panel that are under a kilogram. (Since I don't use Windows in favor of Ubuntu (Unity), I don't have to worry about poor UI scaling.)

     I consider myself part of a very small consumer group when choosing a laptop. Here's my preference:

     a. iGPU only. I don't play games, and a dGPU is a major regression in battery life, even more so on Linux.
     b. Under a kilogram. I've been using a 2016 LG Gram for 2 years, and I literally can't use other laptops that are heavier. Please don't advise me to use a 1.3 kg laptop.
     c. 1440p or higher. I use a 4K monitor on my desktop with 150% scaling, and it absolutely pains me to use a 1080p laptop on the go.

     Since there are no laptops for me, I'm planning to DIY my own. I've spent days trying to figure out whether I can use a panel from eBay with a compatible connector, but I can't find a concrete answer. Almost everyone else asking a similar question has a laptop that offers a higher resolution panel in the same model series, which is almost guaranteed to work.

     So on to the main question: Are all eDP connections standardized? Can I assume a 4K panel with the same connector will work on my laptop? I'm wondering what role the motherboard plays in handshaking the eDP connection. Can a BIOS mod make it work? I'm even thinking of flashing the panel's EEPROM (EDID) myself, if that's what breaks compatibility; there's an EDID-reading sketch after this list.

     The internet is full of success stories of upgrading a 1366x768 panel to 1080p, or a 1080p panel to 4K on models that actually offer such a panel as an option upon purchase; they're just doing it DIY. I wonder if there are any known success stories of upgrading 1080p to 1440p or 4K when no similar model is known to support a 1440p or 4K panel. If anyone knows of a similar attempt being made before, please let me know.
  3. Xeon or other workstation CPUs are definitely not what we're targeting. Ryzen and possibly cheaper mainline Intel CPUs are. I think most people would be willing to take the price benefit despite the possible increase in downtime, as computer hardware is getting more and more stable.
  4. It's provided. People who've used a virtualized graphics stack such as QEMU's QXL will understand what that means. That sentence is not about gaming at all. QXL is barely usable.
  5. Previously, it was cheaper to just buy a few modestly spec'ed PCs instead of getting a high core-count PC and setting up a multi-user environment. But now, with enormous multicore CPUs being affordable, we think it makes more sense than ever for a single PC to serve multiple users at the same time. We can plug in multiple monitors, keyboards and mice and just use it. My teammates and I think this is the future, and we wanted to make this concept easily accessible to everyone. I want to hear what you all think of this.

     While there are many guides on how to set this up, almost all of them are overwhelmingly complicated - matching hardware specs and using advanced Linux techniques. As a person who was interested in this tech years before 6-core and 8-core mainstream CPUs were a thing, I can comfortably say that the UX was/is horrible. We had to use multiple graphics cards since QEMU's integrated graphics stack is not performant, and USB devices were a nightmare, especially when used headlessly. We wanted to avoid all of these major issues and allow non-techies to use this advanced tech as well.

     People can just avoid these kinds of solutions and get cheap alternatives such as a Raspberry Pi or a Chromebook. But still, there is nothing like running a full, proper desktop OS such as Windows, and that's what we're targeting. Using multi-session RDP is also not comparable, imo, as that still needs some kind of client device to remotely log in.

     I know many people will be wondering how gaming is handled. While the majority of people here are probably gamers, we're currently targeting value over performance - people who are fine with IGP-level performance. This means we want to enable multiple OSes to use a single graphics card. That requires a virtualized graphics device to be passed to the OS, which in turn means worsened graphics performance. We have been able to achieve 1080p60-ish performance using a single RX460 with 4 concurrent users, while the stock graphics driver in QEMU - QXL - doesn't even come close. (There's a rough QEMU launch sketch after this list.) We want to wait until SR-IOV-powered graphics cards are a thing before experimenting with gaming.

     I'm sorry if this sounds way too much like an advertisement, but I'm genuinely curious. What do you think? We think this is the future. If there were a simple and elegant solution to this concept, we think the entire world could save a serious amount of resources (hardware, power, waste, etc).
  6. I'm excited for recent Linux development around Polaris GPUs. NVIDIA is not on par...
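A minimal MultiPath TCP sketch for the aggregated-VPN post (item 1), assuming an upstream Linux kernel with built-in MPTCP support (5.6 or later); the older out-of-tree multipath-tcp.org kernel instead applies MPTCP to ordinary TCP sockets, so no application change is needed there. The host name and port are placeholders, and the extra subflows over the 8 NICs would be configured separately (e.g. with `ip mptcp endpoint`).

```python
# Sketch: open an MPTCP socket on Linux and fall back to plain TCP if the
# kernel lacks it. Host and port are hypothetical placeholders.
import socket

# IPPROTO_MPTCP is 262 on Linux; socket.IPPROTO_MPTCP exists from Python 3.10.
IPPROTO_MPTCP = getattr(socket, "IPPROTO_MPTCP", 262)

def mptcp_connect(host: str, port: int) -> socket.socket:
    """Connect with MPTCP, falling back to regular TCP on older kernels."""
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, IPPROTO_MPTCP)
    except OSError:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # plain TCP fallback
    s.connect((host, port))
    return s

if __name__ == "__main__":
    # "work-server.example.com" stands in for the Gigabit-connected server at work.
    conn = mptcp_connect("work-server.example.com", 443)
    print("Connected from", conn.getsockname())
    conn.close()
```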
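For the panel-swap post (item 2), here is a small sketch that reads the current panel's EDID from sysfs and decodes the native resolution from the first detailed timing descriptor; inspecting the EDID is a sensible first step before considering any EEPROM flashing. The DRM connector name (card0-eDP-1) is an assumption and varies by machine.

```python
# Sketch: read the laptop panel's EDID via sysfs and print its native mode.
# The connector path is an assumption; list /sys/class/drm to find yours.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-eDP-1/edid")  # hypothetical connector name

def native_resolution(edid: bytes) -> tuple[int, int]:
    """Decode the first detailed timing descriptor (bytes 54..71) of the base EDID block."""
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

if __name__ == "__main__":
    data = EDID_PATH.read_bytes()
    # A valid base EDID block is 128 bytes and starts with the fixed 8-byte header.
    if len(data) < 128 or data[:8] != bytes.fromhex("00ffffffffffff00"):
        raise SystemExit("No valid EDID found at " + str(EDID_PATH))
    width, height = native_resolution(data)
    print(f"Panel native mode: {width}x{height}")
```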
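For the multi-seat post (item 5), here is a rough sketch of the baseline it compares against: one KVM guest per seat using QEMU's stock QXL graphics and a SPICE display. The graphics path that actually shares one RX460 across 4 users is not shown, since the post doesn't specify it. Disk image paths, SPICE ports, and VM sizes are placeholders.

```python
# Sketch: start one KVM guest per seat with QEMU's stock QXL device and SPICE.
# This is the slow baseline mentioned in the post, not the RX460-sharing setup.
import subprocess

def launch_seat(index: int, disk_image: str) -> subprocess.Popen:
    """Start the guest for seat `index`, listening on SPICE port 5900 + index."""
    cmd = [
        "qemu-system-x86_64",
        "-enable-kvm",                      # hardware-assisted virtualization
        "-cpu", "host",
        "-smp", "4",                        # placeholder vCPU count per seat
        "-m", "4G",                         # placeholder RAM per seat
        "-drive", f"file={disk_image},if=virtio",
        "-vga", "qxl",                      # stock virtualized GPU (QXL)
        "-spice", f"port={5900 + index},disable-ticketing=on",
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    # Four seats, each backed by its own (placeholder) qcow2 image.
    seats = [launch_seat(i, f"/var/lib/seats/seat{i}.qcow2") for i in range(4)]
    for proc in seats:
        proc.wait()
```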