100 Gbps Network Speeds

EricTrujillo

So I was at work today and an interesting thought came into my head. Obviously 10 Gbps is possible at the moment and it's not out of reach, but how about 100 Gbps?

 

I've seen networking equipment around the web that supports it, and although it's very expensive and purely fiber optic at this point, what else would bottleneck the system? Are there storage devices or protocols out there written to read/write at those speeds? Setting price aside, how difficult would it be to achieve such a thing?


400Gbps is just around the corner and will be on the market this year. GPU-based systems can easily saturate 100Gbps links for number crunching and certain tasks oriented to take advantage of GPUs. I'm working on a small project now for a customer, as part of a larger engagement, where they need lossless 100Gbps links between all their GPU clusters.

 

It's not going to be cheap, but I would say that at current prices it's not crazy expensive, depending on what you need. For a switching platform with some decent L3 support, they are generally around $1,000 or so per 100Gbps port. Scale up the feature set needed and you can go much higher, but for a lot of large-scale data centers that just need switching and some L3 capabilities, $1,000 or less per port is pretty nice.



Using the internet at these speeds is probably unreasonable at this time. Datacenters and niche users like NASA can have some pretty insane speeds, but primarily these speeds are used locally, for things like transferring massive amounts of data to and from servers.



100Gb is possible with fiber. You just have to have the memory speed to support it; the hard drive is the bottleneck more than anything.
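As a rough sanity check on the hard-drive point, here is a back-of-the-envelope sketch in Python; the ~200 MB/s per-drive figure is an assumed ballpark for sequential reads on a modern spinning disk, not a number from the post:

```python
import math

LINK_BYTES_PER_S = 100e9 / 8   # 100 Gbps expressed in bytes/s = 12.5 GB/s
PER_HDD_BYTES_PER_S = 200e6    # assumed ~200 MB/s sequential per hard drive

# Number of drives you'd have to stripe across just to keep the link busy
print(math.ceil(LINK_BYTES_PER_S / PER_HDD_BYTES_PER_S))  # -> 63
```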

 

I went to Google to do the conversion, and this is what I found... that's a new one on me, Google. xDxD (yes, apparently that's a real thing, but I was confused... dyslexia)

[Screenshot: Google's unit-conversion result]


3 minutes ago, suchamoneypit said:

Using the internet at these speeds is probably unreasonable at this time. Datacenters and niche users like NASA can have some pretty insane speeds, but primarily these speeds are used locally, for things like transferring massive amounts of data to and from servers.

Datacenter interconnects are oftentimes going to use bundles of 100GbE as well. ISP backbone networks have a lot of 100GbE, and I know of several companies that do it at their larger headend sites and locations too. They don't do it from each branch, but they've got a couple of 100Gb or 40Gb connections to the ISP on their headend aggregation stuff.



10Gb/s is common in the enterprise space.



13 minutes ago, Lurick said:

400Gbps is just around the corner and will be on the market this year. GPU-based systems can easily saturate 100Gbps links for number crunching and certain tasks oriented to take advantage of GPUs. I'm working on a small project now for a customer, as part of a larger engagement, where they need lossless 100Gbps links between all their GPU clusters.

 

It's not going to be cheap, but I would say that at current prices it's not crazy expensive, depending on what you need. For a switching platform with some decent L3 support, they are generally around $1,000 or so per 100Gbps port. Scale up the feature set needed and you can go much higher, but for a lot of large-scale data centers that just need switching and some L3 capabilities, $1,000 or less per port is pretty nice.

 

It’s theoretical at this point. I would be using it for large file transfers. I was thinking more about how easily it could be done. My main concern is finding a storage device that could read and write that fast. I know it can be done with a couple of cards, some fiber-optic cable, and a capable switch.


6 minutes ago, EricTrujillo said:

 

It’s theoretical at this point. I would be using it for large file transfers. I was thinking more about how easily it could be done. My main concern is finding a storage device that could read and write that fast. I know it can be done with a couple of cards, some fiber-optic cable, and a capable switch.

For file transfers you're going to be hard-pressed to hit anywhere near that unless you've got one or two dozen really solid SSDs in something like a RAID 10 or RAID 0, or unless you want to pay a TON of money.
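To put numbers on that, here is a quick sketch assuming a hypothetical ~550 MB/s of sequential throughput per SATA SSD, striped RAID 0-style:

```python
import math

LINK_BYTES_PER_S = 100e9 / 8   # 100 Gbps = 12.5 GB/s
PER_SSD_BYTES_PER_S = 550e6    # assumed ~550 MB/s sequential per SATA SSD

# Drives needed in aggregate to saturate the link (ignoring RAID overhead)
print(math.ceil(LINK_BYTES_PER_S / PER_SSD_BYTES_PER_S))  # -> 23, about two dozen
```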



7 minutes ago, Lurick said:

For file transfers you're going to be hard-pressed to hit anywhere near that unless you've got one or two dozen really solid SSDs in something like a RAID 10 or RAID 0, or unless you want to pay a TON of money.

Something like a PCI Express SSD? I know SATA 3.2 has a 16 Gbps limit, but I haven’t seen any actual devices built to that spec.


1 minute ago, EricTrujillo said:

Something like a PCI Express SSD? I know SATA 3.2 has a 16 Gbps limit, but I haven’t seen any actual devices built to that spec.

Even most PCIe SSDs are going to be limited to PCIe 3.0 x4 speeds in most cases, so you'll need several of them as well. If you can get a solution that combines several of them in a single PCIe 3.0 x16 slot, then you'll be able to utilize a 100GbE link for transfers, since that slot gives you over 100Gb of bandwidth.
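The arithmetic behind that, as a sketch using nominal PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding, ignoring protocol overhead):

```python
# Usable PCIe 3.0 bandwidth by lane count, before protocol overhead
RAW_GTPS_PER_LANE = 8e9          # PCIe 3.0: 8 gigatransfers/s per lane
ENCODING_EFFICIENCY = 128 / 130  # 128b/130b line coding

def pcie3_gbps(lanes: int) -> float:
    return RAW_GTPS_PER_LANE * ENCODING_EFFICIENCY * lanes / 1e9

print(f"x4:  {pcie3_gbps(4):.1f} Gb/s")   # ~31.5 Gb/s, well short of 100GbE
print(f"x16: {pcie3_gbps(16):.1f} Gb/s")  # ~126 Gb/s, enough for a 100GbE link
```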



3 hours ago, Mornincupofhate said:

easily be done

I assume you don't factor expense into that?

 

But otherwise you can get some off-the-shelf stuff for 100GbE. There's some 40 gig equipment, like Arista switches, you could roll used for under $1k. The main bottlenecks are generally storage or routing capacity, depending on what you're trying to do.



7 hours ago, beersykins said:

I assume you don't factor expense into that?

 

But otherwise you can get some off-the-shelf stuff for 100GbE. There's some 40 gig equipment, like Arista switches, you could roll used for under $1k. The main bottlenecks are generally storage or routing capacity, depending on what you're trying to do.

Let's not bring that evil "A" word into the mix now :P



6 hours ago, Lurick said:

Let's not bring that evil "A" word into the mix now :P

I'm not very familiar with them; do they just have garbage products or something?


20 hours ago, JCBiggs said:

100Gb is possible with fiber. You just have to have the memory speed to support it; the hard drive is the bottleneck more than anything.

 

I went to Google to do the conversion, and this is what I found... that's a new one on me, Google. xDxD (yes, apparently that's a real thing, but I was confused... dyslexia)

[Screenshot: Google's unit-conversion result]

I just wanted to make a clarification.

 

100Gbps = 100 gigabits per second, not gigabytes (or gibibytes, for that matter).

 

Still a huge number, but much smaller than what you've listed.

 

100 Gbps = 11,920.9 MiB/s (12,500 MB/s)
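For anyone who wants to verify the conversion themselves, a minimal sketch; the only assumptions are the standard 8 bits per byte and 2^20 bytes per MiB:

```python
BYTES_PER_S = 100e9 / 8   # 100 gigabits/s -> bytes/s

print(f"{BYTES_PER_S / 1e6:,.1f} MB/s")     # 12,500.0 MB/s (decimal megabytes)
print(f"{BYTES_PER_S / 2**20:,.1f} MiB/s")  # 11,920.9 MiB/s (binary mebibytes)
```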



16 hours ago, beersykins said:

I assume you don't factor expense into that?

 

But otherwise you can get some off-the-shelf stuff for 100GbE. There's some 40 gig equipment, like Arista switches, you could roll used for under $1k. The main bottlenecks are generally storage or routing capacity, depending on what you're trying to do.

Expenses for what? Installing a new open source networking stack?

I wasn't talking at all about hardware in my previous post.


Outside of data centers and normal ISPs, you have deep-sea fiber, where the combined speeds are measured in terabits per second. Granted, it's all aggregated, so it's not a single connection, but it's damn impressive still.


1 hour ago, Mornincupofhate said:

Expenses for what? Installing a new open source networking stack?

I wasn't talking at all about hardware in my previous post.

It doesn't happen without hardware, my man.



On 8/13/2018 at 6:36 PM, Lurick said:

Datacenter interconnects are oftentimes going to use bundles of 100GbE as well. ISP backbone networks have a lot of 100GbE, and I know of several companies that do it at their larger headend sites and locations too. They don't do it from each branch, but they've got a couple of 100Gb or 40Gb connections to the ISP on their headend aggregation stuff.

Indeed, they have large backbones. A fiber TV distributor uses 1.45 Gbps for one channel at 1080p, and a 4K channel is carried as four 1080p signals, which totals 5.8 Gbps coming in from the TV station to their server. Then they compress and redistribute that signal over landlines and satellite. At over 230 channels (for one specific provider; I know a guy who actually works there), that is a huge amount of bandwidth coming into their system. It becomes even crazier when TV shows are live, since the data can't be processed before broadcast; it has to be done in real time, so their equipment definitely supports very high bandwidth.
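Taking the post's own figures at face value, a rough tally of the aggregate ingest; the real mix of 1080p and 4K channels is unknown, so both extremes are shown:

```python
CHANNELS = 230
HD_GBPS = 1.45          # quoted rate for one 1080p channel
UHD_GBPS = 4 * HD_GBPS  # 4K carried as four 1080p signals = 5.8 Gbps

print(f"all 1080p: {CHANNELS * HD_GBPS:,.0f} Gbps")   # ~334 Gbps
print(f"all 4K:    {CHANNELS * UHD_GBPS:,.0f} Gbps")  # ~1,334 Gbps of ingest
```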


We are installing a 100Gbps Ethernet ring using Nokia gear... it's dang fast, but the cost is high. Our 100Gbps CSFPs are around $14k apiece for 10km optics. The 40km optics are roughly double that.

 

[Photo: 100Gbps CSFP optic]

 

This is the 10km CSFP optic we use for the shorter links. You can see they are quite a bit larger than SFP+ optics for 10Gbps.

[Screenshot: router interface info showing the 100Gbps link]

 

This is the interface info showing the 100Gbps link.

 

We are using this as our backbone for all services. Internet will be the largest consumer of bandwidth, but even that, in most cases, won't come close to saturating this connection anytime soon. Even our data center is mostly 10Gbps.

