Internet speed record shattered at 178 terabits per second

EJMB

Not sure if this has been posted before but might as well...

 

Summary

UCL engineers, together with two companies, Xtera and KDDI Research, achieved a data transmission rate of 178 terabits a second (178,000,000 megabits a second).
They claim it would be possible to download the entire Netflix library in less than a second. 

This was achieved by transmitting data through a much wider range of colors of light, or wavelengths, than is typically used in optical fiber through new Geometric Shaping (GS) constellations. They are basically patterns of signal combinations that alter the phase, brightness and polarization of the wavelengths, in order to fit more information into light without the wavelengths interfering with each other. 
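For a rough sense of where headline numbers like this come from, aggregate WDM capacity is approximately channels × symbol rate × bits per symbol × polarizations. The figures below are purely illustrative assumptions (channel count, baud rate, and constellation size are not the paper's actual parameters), and the sketch ignores FEC overhead and shaping gain:

```python
from math import log2

def aggregate_capacity_bps(channels, symbol_rate_baud, constellation_size, polarizations=2):
    """Back-of-the-envelope WDM capacity estimate.

    Real systems carry fewer information bits per symbol than log2(M)
    because of FEC overhead and probabilistic/geometric shaping.
    """
    bits_per_symbol = log2(constellation_size)
    return channels * symbol_rate_baud * bits_per_symbol * polarizations

# Illustrative numbers only: ~200 wavelength channels at 50 Gbaud,
# 1024-point constellations, two polarizations.
capacity = aggregate_capacity_bps(200, 50e9, 1024, 2)
print(f"{capacity / 1e12:.0f} Tb/s")  # 200 Tb/s, same order of magnitude as the record
```

The point is only that widening the usable wavelength range (more channels) multiplies directly into the total, which is exactly what the UCL team exploited.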

The data behind the world's first image of a black hole was so massive that it had to be stored on half a ton of hard drives and transported by plane. At the record speed, the same dataset could be downloaded in less than an hour.
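To sanity-check that claim: 178 Tb/s works out to about 22.25 TB transferred per second. The ~5 PB size used below for the black-hole dataset is an often-cited figure for the Event Horizon Telescope campaign, assumed here rather than taken from the article:

```python
RATE_BPS = 178e12           # 178 terabits per second
RATE_BYTES = RATE_BPS / 8   # ~22.25 terabytes per second

def transfer_seconds(size_bytes, rate_bytes_per_s=RATE_BYTES):
    """Idealized transfer time, ignoring all protocol overhead."""
    return size_bytes / rate_bytes_per_s

eht_dataset = 5e15  # ~5 PB, an assumed figure for the black-hole imaging data
print(f"{transfer_seconds(eht_dataset):.0f} s")  # ≈ 225 s, well under an hour
```

So "less than an hour" is, if anything, conservative at the full record rate.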

 

 

Quote

"While current state-of-the-art cloud data-center interconnections are capable of transporting up to 35 terabits a second, we are working with new technologies that utilize more efficiently the existing infrastructure, making better use of optical fiber bandwidth and enabling a world record transmission rate of 178 terabits a second," said Lidia Galdino, lead researcher on the study.

 

I am very much interested in this as it will open the avenue for fully cloud-based computing for everything (and I mean everything) in the world, where everything can be accessed in an instant. There will be some socio-political concerns with that possibility... but it might be an inevitability moving forward.

 

Sources

  1. Optical Fibre Capacity Optimisation via Continuous Bandwidth Amplification and Geometric Shaping. DOI: 10.1109/LPT.2020.3007591
  2. https://newatlas.com/telecommunications/internet-speed-record178-terabits-per-second/
  3. https://www.techexplorist.com/ucl-engineers-achieved-record-internet-speed-178-terabits-per-second/34838/

13 minutes ago, huilun02 said:

So phones physically connected to landline optical cables and ridiculous data bills

 

Genius.

Nope, not what I meant 😂 I'm not sure if you're being sarcastic, but alright.

 

I meant having all the computing and storage elements centralized in "cloud" areas. In the future, the only things we'd have at home as personal computers might be screens and I/O devices connected to the cloud, with all your data and computing needs handled from it.

 

For phones, wearables and mobile devices, 5G will probably handle the wireless portion of this "cloud" future.


6 minutes ago, Moonzy said:

178,000,000 megabits a second

 

how many jpegs is that?

A lot. A whole lot. 

Main System: Phobos

AMD Ryzen 7 2700 (8C/16T), ASRock B450 Steel Legend, 16GB G.SKILL Aegis DDR4 3000MHz, AMD Radeon RX 570 4GB (XFX), 960GB Crucial M500, 2TB Seagate BarraCuda, Windows 10 Pro for Workstations/macOS Catalina

 

Secondary System: York

Intel Core i7-2600 (4C/8T), ASUS P8Z68-V/GEN3, 16GB GEIL Enhance Corsa DDR3 1600MHz, Zotac GeForce GTX 550 Ti 1GB, 240GB ADATA Ultimate SU650, Windows 10 Pro for Workstations

 

Older File Server: Yet to be named

Intel Pentium 4 HT (1C/2T), Intel D865GBF, 3GB DDR 400MHz, ATI Radeon HD 4650 1GB (HIS), 80GB WD Caviar, 320GB Hitachi Deskstar, Windows XP Pro SP3, Windows Server 2003 R2


21 minutes ago, EJMB said:

I am very much interested in this as it will open the avenue for fully cloud-based computing for everything (and I mean everything) in the world, where everything can be accessed in an instant. There will be some socio-political concerns with that possibility... but it might be an inevitability moving forward.

Can we please not do cloud for everything?

 

The best argument will always be "what if it goes down".

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


10 minutes ago, Trik'Stari said:

Can we please not do cloud for everything?

 

The best argument will always be "what if it goes down".

I hope they don't, but a lot of people want a hassle-free experience where everything is done for them... not a fan of it, to be honest.

 

I'm more concerned about who controls the cloud...you can only imagine what kind of power they would hold. 😬


Very cool. I'll keep waiting for my turn to move on from ADSL speeds.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Oh wow!

I was present when the previous record (1.6 terabit) was set at DreamHack 2018. Even got to meet Alex and Brandon! The fact that the record was beaten a hundred times over is insanely cool.

Potato


10 hours ago, Moonzy said:

178,000,000 megabits a second

 

how many jpegs is that?

/8 for megabytes

and say a modern phone/camera creates a 10 MB jpeg file, that'd be 2,225,000 jpegs.
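That arithmetic checks out. As a quick sketch (the 10 MB JPEG size is the poster's assumption):

```python
link_mbps = 178_000_000        # 178 Tb/s expressed in megabits per second
link_mb_per_s = link_mbps / 8  # megabytes per second
jpeg_size_mb = 10              # assumed size of a modern phone-camera JPEG

jpegs_per_second = link_mb_per_s / jpeg_size_mb
print(f"{jpegs_per_second:,.0f} jpegs/s")  # 2,225,000 jpegs/s
```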


Super Mario Bros was around 31 kilobytes.

So at this speed you could transfer the whole Super Mario Bros game roughly 718 MILLION times in 1 second.

 

Maybe something more modern... like Call of Duty: Modern Warfare:

A 200 GB game at this transfer speed would mean roughly 111 copies of Call of Duty being transferred in 1 second.

Or for the impatient gamer awaiting his single instance of CoD to download, that'd take about 0.01 seconds to complete.

 

P.S. That's without any TCP overhead taken into consideration.
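Redoing those numbers explicitly (decimal units, no TCP overhead; the game sizes are the rough figures quoted above):

```python
rate_bytes = 178e12 / 8   # ~22.25 TB/s

mario_bytes = 31e3        # Super Mario Bros, ~31 KB
cod_bytes = 200e9         # Call of Duty: Modern Warfare, ~200 GB

copies_per_second = rate_bytes / mario_bytes   # SMB copies per second
cod_per_second = rate_bytes / cod_bytes        # CoD copies per second
cod_download_s = cod_bytes / rate_bytes        # time for one CoD download

print(f"{copies_per_second / 1e6:.0f} million copies of SMB per second")  # ~718 million
print(f"{cod_per_second:.0f} copies of CoD per second")                   # ~111
print(f"{cod_download_s:.3f} s for one CoD download")                     # ~0.009 s
```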


Time for Linus to upgrade his servers again...

Phone 1 (Daily Driver): Samsung Galaxy Z Fold2 5G

Phone 2 (Work): Samsung Galaxy S21 Ultra 5G 256gb

Laptop 1 (Production): 16" MBP2019, i7, 5500M, 32GB DDR4, 2TB SSD

Laptop 2 (Gaming): Toshiba Qosmio X875, i7 3630QM, GTX 670M, 16GB DDR3


1 hour ago, huilun02 said:

So the mobile devices turn into bricks whenever there is no connection, and still ridiculous data bills?

 

Genius

yes and no

The ultimate goal seems to be having every bit of the habitable earth (I'm not a fan of this, to be honest) connected in some manner through wireless technology such as 5G, or whatever generation is current at the time, at the bare minimum. Failsafes such as redundant local nodes/stations and backup power storage and generation would most likely be put in place to prevent downtime. So no, it most likely won't turn into a brick: local stations would likely keep functioning should central areas be disabled, and I'm sure every phone would retain some sort of capability, although much more limited. But as you alluded to, there's still the possibility of regular or catastrophic events, such as an EMP from a nuclear blast, natural disasters destroying facilities, or loss of power. A multitude of things could go wrong, so I agree with you that they may turn into bricks as well.

We'll see 😆


I was debating posting this but glad someone did as it's really awesome to see this stuff coming around. The Australian fiber comb is really what excites me as that could mean huge shifts in bandwidth for data center applications and really boost capacity without much/any changes to the fiber.

To put this into perspective as to how many (non-DWDM) network ports this is:

800Gb ports = 223 total ports

400Gb ports = 445 total ports

100Gb ports = 1780 total ports

 

At 36 ports of 100Gb per linecard you're looking at just over three Nexus 9516 chassis fully maxed out with linecards. The 9516 is a 16 slot 21RU chassis (~3 feet tall). Power draw for ONE chassis fully loaded is around 19.5kW under max load. So you're looking at something like 65kW to push that much bandwidth across three chassis. Now I do want to state this example is not using DWDM which can put multiple lambdas on a single cable and compact things down but just for scale of what it could cost in terms of power/footprint from a data center perspective.

 

If you use something with DWDM, you could probably get it into a single chassis and drop the power draw and footprint a lot; it just depends on the number of lambdas you get on a single fiber as to how many ports/linecards you would need. So if you assume around 100 lambdas totaling 10 Tb/s on a single fiber, then you're looking at 18 fibers, so you can see how things shrink down.
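The port counts above are just 178 Tb/s divided by the port speed, rounded up:

```python
from math import ceil

TOTAL_GBPS = 178_000  # 178 Tb/s in gigabits per second

def ports_needed(port_gbps, total_gbps=TOTAL_GBPS):
    """Number of ports of a given speed needed to carry the full record rate."""
    return ceil(total_gbps / port_gbps)

for speed in (800, 400, 100):
    print(f"{speed}Gb ports: {ports_needed(speed)}")
# 800Gb: 223, 400Gb: 445, 100Gb: 1780
```

The chassis math follows the same way: 1780 ports at 36 ports per linecard is ~50 linecards, which at 16 slots per chassis is just over three fully loaded Nexus 9516s.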

Current Network Layout:

Current Build Log/PC:

Prior Build Log/PC:


12 hours ago, Moonzy said:

178,000,000 megabits a second

 

how many jpegs is that?

About 700,000 jifs.


13 hours ago, EJMB said:

 

They claim it would be possible to download the entire Netflix library in less than a second. 
 

 

178 Tbps is roughly 22.3 TB transferred per second; there is no way the Netflix library is that small. Even the Open Connect appliances they install in ISP datacenters for faster speeds to end users contain ten times that, and those don't contain the entire library.

 

Quote

I am very much interested in this as it will open the avenue for fully cloud-based computing for everything (and I mean everything) in the world, where everything can be accessed in an instant. There will be some socio-political concerns with that possibility... but it might be an inevitability moving forward.

This isn't even remotely intended for actual widespread use. These speeds are possible only under incredibly specific conditions and at a price that no consumer or business can afford. At best you'll see this used for site-to-site connections between research institutes and the like, where they can actually utilize that data rate. Most of the world is still on 10 Mbps or less; the high end for consumers has only in the last year or two started moving into >1 Gbps speeds for home networks, and less than that for internet. Even the largest companies often don't have more than a 400 Gbps uplink, and getting even that costs millions. By the time a regular consumer can download at 178 terabits, we'll likely have colonies on Mars.

My Build:

Spoiler

CPU: i7 4770k GPU: GTX 780 Direct CUII Motherboard: Asus Maximus VI Hero SSD: 840 EVO 250GB HDD: 2xSeagate 2 TB PSU: EVGA Supernova G2 650W


20 minutes ago, Murasaki said:

That's great! Won't ever experience it.

 

/s

 

Eh depends on your age, give it 40 years and you might.


Wait, there are existing 35-terabit deployments? Who the fuck has that shit, literal network backbone providers?

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


I have a feeling that the entire Netflix catalog is larger than 22TB.

MacBook Pro 16 i9-9980HK - Radeon Pro 5500m 8GB - 32GB DDR4 - 2TB NVME

iPhone 12 Mini / Sony WH-1000XM4 / Bose Companion 20


11 hours ago, S w a t s o n said:

Wait there's existing 35 terabit deployments? Who the fuck has that shit, literal network backbone providers?

Using DWDM, yes. Data center interconnections for example use this as well.

Current Network Layout:

Current Build Log/PC:

Prior Build Log/PC:

