Combining gigabit connections to get 10 gbit networking
Adding on to what others have stated - NOT WORTH IT.
At a minimum you'd need a 20-port managed switch and a bunch of cabling, and even then it only works some of the time. It's also compute heavy on both machines. A rough sketch of what just the Linux side looks like is below.
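For context, this is a minimal sketch of a two-port LACP bond on Linux with iproute2. The interface names (eth0/eth1) and the address are assumptions, and the managed switch ports have to be configured as an 802.3ad group separately:

    # Assumed: eth0 and eth1 are the two 1GbE NICs to aggregate
    ip link add bond0 type bond mode 802.3ad
    ip link set eth0 down
    ip link set eth0 master bond0
    ip link set eth1 down
    ip link set eth1 master bond0
    ip link set bond0 up
    ip addr add 192.168.1.50/24 dev bond0   # example address, match your LAN

And even after all that, a single TCP stream still tops out at ~1Gbps, because LACP hashes each flow onto one physical link.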
What I'd actually suggest for most people -
Mikrotik CRS305 switch - $120ish (QNAP also has some decent but pricier options)
SFP+ DAC cable - $15ish
SFP+ 10GbE NIC - $30ish (Mellanox ConnectX-3 EN CX311A)
10GBASE-T SFP+ to RJ45 transceiver - $40ish
RJ45 10GbE NIC - $70ish (might be $100ish right now due to supply chain BS)
1 CAT6 cable
This will get you in the game, it's hassle free, and it'll just work: no going WTF when your server can only push 1.5Gbps at a time (with awful latency) because whatever crazy hacky solution you came up with doesn't support RoCE.
Going down a step, you could just "settle" for 2.5GbE. An RJ45 switch runs around $100ish, a NIC around $25ish, and a lot of newer motherboards have 2.5GbE built in. 2.5Gbps is probably fine in most cases.
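Whatever tier you land on, it's worth sanity-checking the real link speed rather than trusting the spec sheet. A minimal iperf3 run, assuming the server is reachable at 192.168.1.50 (example address):

    # on the server/NAS
    iperf3 -s

    # on the client: 10 second test, 4 parallel streams
    iperf3 -c 192.168.1.50 -t 10 -P 4

If a 10GbE link reports well under ~9.4Gbps here, suspect the cable, transceiver, or CPU before blaming the protocol.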
On 5/28/2022 at 1:54 AM, Hjallerrboii said: What kind of costs are we talking? What if I could do with combining 3 or 4 gigabit connections?
You'll likely end up spending about as much getting ten 1GbE cables working over SMB Multichannel as it would cost to just do it the right way.
Similar story with 4. You'd also likely get better real-world speed, and better latency, with 2.5GbE networking than trying to get 3-4x 1GbE cables working together.
There's a reason people usually don't do this (at most they run 2x 1GbE because it's "easy enough").
Also, SMB Multichannel only really works well if EVERYTHING is running Windows. It's a huge hassle otherwise. That, and I probably suck at setting up routing tables under Linux.
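If you do want to try it with a Linux server anyway, Samba needs multichannel switched on explicitly. A minimal sketch; the NIC names are assumptions for a dual-port box:

    # /etc/samba/smb.conf
    [global]
        server multi channel support = yes
        interfaces = eth0 eth1      # assumed NIC names
        bind interfaces only = yes

On the Windows client, the PowerShell cmdlet Get-SmbMultichannelConnection will show whether multiple links are actually being used.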
----
One other alternative, if you just care about performance for ONE person...
iSCSI network share, then set up block-level caching on your client system (LVM cache on Linux, PrimoCache on Windows). This will usually work better than going gung-ho with faster networking (rough sketch of the Linux route at the end of this post).
[Screenshots: before and after benchmarks, on a 10GbE link]
This is admittedly throwing a BUNCH of hardware at the issue (my NAS is probably overkill, and I'm using a 58GB Optane stick as the cache on my client).
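For the Linux route, this is a minimal sketch. The portal address, device names, and cache size are all assumptions; the commands are standard open-iscsi and LVM:

    # log into the iSCSI target (assumed portal 192.168.1.10); the LUN
    # then appears as a local block device, e.g. /dev/sdb
    iscsiadm -m discovery -t sendtargets -p 192.168.1.10
    iscsiadm -m node --login

    # volume group spanning the iSCSI disk and a fast local cache
    # device (assumed /dev/nvme0n1, e.g. an Optane stick)
    pvcreate /dev/sdb /dev/nvme0n1
    vgcreate vg_nas /dev/sdb /dev/nvme0n1

    # data LV on the iSCSI disk, cache pool on the NVMe, then attach
    lvcreate -n data -l 100%PVS vg_nas /dev/sdb
    lvcreate --type cache-pool -n fastcache -L 50G vg_nas /dev/nvme0n1
    lvconvert --type cache --cachepool vg_nas/fastcache vg_nas/data

After that you format and mount vg_nas/data like any local disk, and hot blocks get served from the local cache instead of crossing the network.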