
LAN connection for 100 gaming PCs

XzzDSA

Heyo o/

I made a topic similar to this one regarding powerline. Since I have now decided that's NOT going to happen, I'm going to use proper cables to get LAN to all of the PCs...

Here's the deal:

I've got a hall with space for 100+ people with PCs.
I want to get a SOLID connection to everyone in there.
I have NOT got any Ethernet ports in the hall, but I have Ethernet ports in rooms close to it.
The idea is to run 10 Gigabit (Cat6) cables in from the other rooms and split it through 7-10 switches.
I'm going to run 2 Cat6 cables (one from either end of the hall).

Now I've encountered my first problem.
How do I get the 10 Gigabit connection to every switch if the switches themselves only support Gigabit (10/100/1000)? Should I just get a few small 10 Gigabit switches (4-port) and daisy-chain them? Or what is the simplest and most solid way of doing this?

The switches I'm going to use for the PCs' connections are the TP-LINK TL-SG1016D (16-port Gigabit switch).

Also, what are the differences between Cat6 cables (UTP, FTP, S/FTP)?

 

Also, funds are somewhat limited, but pretty flexible.

 

Any help is very much appreciated! :)

 

Regards
   XzzDSA

Everyone have a cool signature. I don't, so I thought I would write something.

- Cool right?


The switches will need to support 10Gbps. You don't need 10Gbps, however; 1Gbps will be fine for just gaming.


The switches will need to support 10Gbps. You don't need 10Gbps, however; 1Gbps will be fine for just gaming.

I won't be running 10+ Gigabit cables into the hall though... Splitting 1Gbps across 100 machines seems like a bad idea.



I won't be running 10+ Gigabit cables into the hall though... Splitting 1Gbps across 100 machines seems like a bad idea.

I would use switches with perhaps 24 Gigabit ports and two 10 Gigabit uplinks, then connect the 10 Gigabit uplinks into one 10 Gigabit switch.

 

Going to have a scout around online for your best option.

Main Machine:  16 inch MacBook Pro (2021), Apple M1 Pro (10 CPU, 16 GPU Core), 512GB SDD, 16GB RAM

Gaming Machine:  Acer Nitro 5, Core i7 10750H, RTX 3060 (L) 6GB, 1TB SSD (Boot), 2TB SSD (Storage), 32GB DDR4 RAM

Other Tech: iPhone 15 Pro Max, Series 6 Apple Watch (LTE), AirPods Max, PS4, Nintendo Switch, PS3, Xbox 360

Network Gear:  TP Link Gigabit 24 Port Switch, TP-Link Deco M4 Mesh Wi-Fi, M1 MacMini File & Media Server with 8TB of RAID 1 Storage


http://www.netgear.co.uk/business/products/switches/managed/M5300-28G.aspx#tab-techspecs

 

Perhaps 4 of these? 

 

Just saw the price of them so perhaps not :lol:



Try contacting the local ISP and asking for some sponsorship, or whether they could turn up the bandwidth and help with something else.

Case: NZXT Phantom 410 Cpu: i7 3820 at 4.3 Ghz Motherboard: Gigabyte GA-X79-UP4 | Gpu: XFX hd 7970 Ghz 1100Mhz/1600Mhz Ram: Crucial BallistiX Elite 16GB 2x8 Gb 1866 Mhz


SSD's: 2x Kingston HyperX 3K SSD 120 GB in RAID 0 HDD:Seagate Barracuda 2TB 7200 Rpm


Cpu cooler: Corsair H100i with SP120's | Psu: Corsair TX 850W My monitors: 3x BenQ 24"  GL2460 = Eyefinity


I would use switches with perhaps 24 Gigabit ports and two 10 Gigabit uplinks, then connect the 10 Gigabit uplinks into one 10 Gigabit switch.

 

Going to have a scout around online for your best option.

I'm going for 16 ports, because that's what fits the number of machines per row.

24 ports would just go to waste, and it costs more. I'd need the same number of switches if I still want the cabling to be convenient for the guests.

 

Try contacting the local ISP and asking for some sponsorship, or whether they could turn up the bandwidth and help with something else.

Oh, the bandwidth is not the problem; the incoming and outgoing connection is faster than I would ever be able to use... we are talking GigaBYTES of bandwidth.

It's just the internal Ethernet connections I'm trying to figure out.

 

 

I wonder if I can get some 16-port switches with some extra 10Gbps ports for chaining them together?

But if what @Ssoele says really is true, then maybe it'll hold up with just the 1Gbps switches... hmmm...



I don't understand the 10 Gigabit thing. Why have you chosen 10G? You don't need it for gaming.

You're planning on connecting 16-port switches together.

Gigabit would be fine, but if you really need it you can set up Link Aggregation between the switches to give you a higher backbone speed. I.e. you can set up 2 cables between each switch to give you 2G between them; the more cables, the more you can carry over that link. But for gaming it's not necessary.

Just use some Cat5e (rated up to 1G) and plug them all together. You can have a centralised switch that they all connect to, or just daisy-chain the switches; it's up to you.

Remember though, if you want to set up Link Aggregation you need a switch that will support it. But it's still not necessary.
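As an illustration of the Link Aggregation idea above: LAG is normally configured on the switches themselves (and that menu is vendor-specific), but the same 802.3ad/LACP mechanism on a Linux box looks roughly like this sketch. The interface names eth0/eth1 and the address are assumptions; the switch ports it connects to must have LACP enabled.

```shell
# Sketch only: bond two NICs with 802.3ad (LACP) using iproute2.
# Assumes interfaces eth0 and eth1 and an LACP-capable switch; run as root.
ip link add bond0 type bond mode 802.3ad
ip link set eth0 down
ip link set eth0 master bond0
ip link set eth1 down
ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.1.2/24 dev bond0   # example address, adjust to your LAN
```

One caveat worth knowing: LACP balances traffic per flow, so a single download still tops out at one link's speed; it's the aggregate across many users that improves.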


I don't understand the 10 Gigabit thing. Why have you chosen 10G? You don't need it for gaming.

You're planning on connecting 16-port switches together.

Gigabit would be fine, but if you really need it you can set up Link Aggregation between the switches to give you a higher backbone speed. I.e. you can set up 2 cables between each switch to give you 2G between them; the more cables, the more you can carry over that link. But for gaming it's not necessary.

Just use some Cat5e (rated up to 1G) and plug them all together. You can have a centralised switch that they all connect to, or just daisy-chain the switches; it's up to you.

Remember though, if you want to set up Link Aggregation you need a switch that will support it. But it's still not necessary.

When running a lot of connections off of a single cable, from my understanding you need the throughput.

I've no doubt that 10Gbps might be too much. But unless you can convince me otherwise, or point me to where I can get convinced, I'll still think 1 or 2Gbps is too low for running 100 machines...

This is not some crappy ol' living-room LAN - there are a lot of machines communicating, downloading and all that, all while everyone needs low latency and no spikes because some idiot decided to download the whole series of Star Wars movies in ProRes.

Understand where I'm coming from: I've got knowledge of PCs in general and consider myself pretty good at what I do as a hobbyist, but I'm a control freak when it comes to this and haven't got much experience with networks.

So if you think 1 or 2Gbps is enough to run a whole room, then convince me :)

Thanks for the input though, it IS much appreciated :)



You do not need 10 gigabit for 100 gaming machines. You can literally have 1 gigabit and be fine.


You do not need 10 gigabit for 100 gaming machines. You can literally have 1 gigabit and be fine.

Please do convince me, because I'm sceptical.

It's not that I think you aren't competent, but proooveee it to me.

The LAST thing I need is a whole room not being able to play a game of League or whatever because the setup is not sufficient.



Ahhh, you didn't say you were going to allow people to download lots of things.

So I have some questions:

What is your download/upload speed to the Internet?

Proposed budget?

And the equipment you want to use or plan on using?


In and out is several GigaBYTES of bandwidth - we are running on DTU's network, which is a big university in Denmark.

We allow anyone to download as long as the content isn't illegal, obviously. But we want people to be able to download a game and so on in a timely manner. THAT is why I want the throughput - obviously, if we were all JUST gaming, sure, 10Gbps is overkill. But if you want to do other stuff in the meanwhile (YouTube, etc.) and a lot of people are doing that, then from my experience and observations (which again, are not a lot) you need the throughput.

Sorry if I haven't made this clear before now :/

The budget... Well... the MAX is maybe 1700-2000 US$. But remember, this is Denmark, and everything is a bit more expensive than in, e.g., the US.



Well whatever you do you basically have two options:

 

1. Star topology. Have a central switch that every table (which also has a switch) and the outside connection run into. Do LAG or whatever if you think you REALLY need more than 1Gbps. Alternatively, if you can get it going this way, just run a separate cable from the "other room" to each individual table. Not sure how much your UNI will like that, though. Ask them what the best option is.

 

2. Bus topology. Possibly easier to set up, but much harder to troubleshoot if someone knocks out a cable. You have the outside connection running to the first table with however many cables; work out how to set it up properly. Again, ask. Then run a couple of cables from table 1 to table 2 using LAG, then a couple of cables from table 2 to table 3, etc. For performance? Not ideal, especially for the guys on the last table... but it could be a LOT easier to set up if the tables are close and there's no well-defined "central" place.
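To put rough numbers on the two layouts, here's a worst-case even-sharing sketch using figures from elsewhere in this thread (6 tables of 14 PCs, 1Gbps links — assumptions, not measurements). The two functions look at different bottleneck links, which is exactly the point: in a star each table only contends for its own uplink, while in a chain everything funnels through the head link.

```python
def star_share_mbps(pcs_per_table=14, link_mbps=1000):
    """Star: each table has its own run to the core, so a PC only
    contends with the other PCs on its table for that uplink."""
    return link_mbps / pcs_per_table

def chain_share_mbps(tables=6, pcs_per_table=14, link_mbps=1000):
    """Bus/daisy chain: all outside-bound traffic funnels through the
    single link at the head of the chain, shared by every PC."""
    return link_mbps / (tables * pcs_per_table)

print(round(star_share_mbps(), 1))   # 71.4 Mbps per PC, worst case
print(round(chain_share_mbps(), 1))  # 11.9 Mbps per PC, worst case
```

PC-to-PC traffic between tables behaves differently, but for internet-bound load this is why the last table in a chain has the worst time.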

Fools think they know everything, experts know they know nothing


Well whatever you do you basically have two options:

 

1. Star topology. Have a central switch that every table (which also has a switch) and the outside connection run into. Do LAG or whatever if you think you REALLY need more than 1Gbps. Alternatively, if you can get it going this way, just run a separate cable from the "other room" to each individual table. Not sure how much your UNI will like that, though. Ask them what the best option is.

 

2. Bus topology. Possibly easier to set up, but much harder to troubleshoot if someone knocks out a cable. You have the outside connection running to the first table with however many cables; work out how to set it up properly. Again, ask. Then run a couple of cables from table 1 to table 2 using LAG, then a couple of cables from table 2 to table 3, etc. For performance? Not ideal, especially for the guys on the last table... but it could be a LOT easier to set up if the tables are close and there's no well-defined "central" place.

Alright... So I started to plan out how I'm going to run cables and whatnot.

We're going to have exactly 84 machines in the room. 1024/84 = 12Mbit. I am able to get more, and so I will. I'll probably run two cables, so that's around 24Mbit of bandwidth, theoretically, for everyone. Not too sure how many switches I should connect...

I'm going to have 6 rows of 14 machines, so one cable per 3 rows; that's a total of 2Gbit. Theoretically.

Then running the switches in serial, from one to the other. Not sure how this will affect performance, or if I should maybe try to keep it to two switches per cable. If so, I need a 3rd cable in, which is possible.

But I've got no clue how bandwidth is handled within a cable. I guess 1Gbps is the TOTAL bandwidth, back and forth, so that includes any up- and download, right?

What do you think?

Also, every cable and switch is going to be taped to the floor/tables, so it is actually impossible to accidentally pull one out. Either way, everyone is pretty well behaved, and we haven't had problems with people "pranking" by pulling out cables before :P
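The back-of-envelope division in this post can be sketched as a tiny function (a toy worst-case calculation assuming every machine pulls traffic at once, which in practice almost never happens):

```python
def per_machine_mbps(uplinks, machines=84, link_mbps=1000):
    """Worst-case even split of the hall's uplink cables across all machines."""
    return uplinks * link_mbps / machines

print(round(per_machine_mbps(1), 1))  # 11.9 Mbps each with one cable in
print(round(per_machine_mbps(2), 1))  # 23.8 with two
print(round(per_machine_mbps(3), 1))  # 35.7 with three
```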

 



Alright... So I started to plan out how I'm going to run cables and whatnot.

We're going to have exactly 84 machines in the room. 1024/84 = 12Mbit. I am able to get more, and so I will. I'll probably run two cables, so that's around 24Mbit of bandwidth, theoretically, for everyone. Not too sure how many switches I should connect...

So here's the thing: people aren't machines. If you suddenly decided that everyone should download a game that nobody had, all at once? Then maybe. But I doubt that'll happen. More likely you'll get the usual sort of "crowd" behaviour. You have 84 people? Let's say 50% decide they all want to play this game, and of them 50% already have the game. At a LAN that's pretty close to a realistic "worst case". But even then, some of the people will have started the download pre-emptively and others will be late to the party.

 

Let's say it's a 20GB game, and for the sake of argument say 20 people want this game and all of them are sitting together, so even your multi-cable thing doesn't really help them much. Of these people, 5 started the download 20 mins early, 5 were 20 mins late and 10 were in the middle. Rounding to the "worst case" every time: the first group get "200Mbps" and take ~15 mins to download it; the second group get 100Mbps and download the first ~10GB in their 20 mins. Speed slows to 66Mbps; the second group takes another 15 mins at this speed while the third group gets halfway through. Speed jumps back up to 200Mbps, and the third group finishes in another 7 mins. So the longest wait with that made-up scenario was 35 mins.

 

If you can get more bandwidth total? That's cool. Just don't think that 1Gbps is realistically going to slow down to 12Mbps. Game downloads are the worst kind of traffic for this, *but* even then people don't all hammer it at their theoretical MAX all at once. And even if they do, people will just let the game download in the background and play/do something else.

 

edit:

Also, I have to add: regardless of what you say about the sort of content that will be at your LAN, it's pretty much guaranteed that a fair amount of files will be shared between machines. In which case it would make sense to do some LAG between switches, whatever your topology is, just to make sure that that sort of traffic doesn't eat into your 'net traffic too much. Also, if you were *really* serious, you might find a way to have a bit of read-only network storage with some of the big files you know people want. Tell people what games are on it, and that if they want to play them, they should buy them on Steam and copy the files into their library.
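The staggered-download story above can be turned into a toy fluid model (assumptions: one shared 1Gbps link, perfectly fair sharing between whoever is downloading, 20GB ≈ 160,000 Mb per user, arrival times as in the scenario):

```python
def simulate(arrivals_s, size_mb=160_000, link_mbps=1000, dt=1):
    """Return wait time in seconds per user, sharing one link equally.

    Steps through time in dt-second ticks; each tick, the link's capacity
    is split evenly among everyone who has arrived and isn't finished.
    """
    remaining = {i: float(size_mb) for i in range(len(arrivals_s))}
    waits = {}
    t = 0
    while remaining:
        active = [i for i in remaining if arrivals_s[i] <= t]
        share = link_mbps * dt / len(active) if active else 0.0
        for i in active:
            remaining[i] -= share
            if remaining[i] <= 0:
                waits[i] = t + dt - arrivals_s[i]
                del remaining[i]
        t += dt
    return waits

# 5 start 20 min early, 10 on time, 5 start 20 min late
waits = simulate([0] * 5 + [1200] * 10 + [2400] * 5)
print(max(waits.values()) / 60)  # longest wait in minutes, roughly 30
```

Which lands in the same ballpark as the rough ~35 minute figure above — nowhere near the "everyone gets 12Mbps forever" worst case.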

Edited by skywake


