
Researchers Release Tool That Can Scan the Entire Internet In Under an Hour

Story at /.
 

dstates writes

"A team of researchers at the University of Michigan has released Zmap, a tool that allows an ordinary server to scan every address on the Internet in just 45 minutes. This is a task that used to take months, but now is accessible to anyone with a fast internet connection. In their announcement Friday , at the Usenix security conference in Washington they provide interesting examples tracking HTTPS deployment over time, the effects of Hurricane Sandy on Internet infrastructure, but also rapid identification of vulnerable hosts for security exploits. A Washington Post Blog discussing the work shows examples of the rate with which of computers on the Internet have been patched to fix Universal Plug and Play, 'Debian weak key' and 'factorable RSA keys' vulnerabilities. Unfortunately, in each case it takes years to deploy patches and in the case of UPnP devices, they found 2.56 million (16.7 percent) devices on the Internet had not yet upgraded years after the vulnerability had been described."

Dat title. So misleading and yet true.


Although I do not like what this implies the NSA could do (with their surveillance and such), it does give great power to the average person. Perhaps... too much power.

† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here and look under the spoiler labeled as such. A brief history of Unix and its relation to OS X by Builder.

 

 


cool :D 

If you tell a big enough lie and tell it frequently enough it will be believed.

-Adolf Hitler 


I call utter bullshit. Anyone that has taken a class in algorithm efficiency and computer logic would realize that those claims are a joke. That would be like running supercomputer protein folding on a cell phone; to parse that much information you need some serious hardware. Look at how big Google is: they run a search engine that covers most of the surface net, and they have massive datacenters and servers to make sure it all works, yet you have this tiny university coming along and saying it can keep an index of everything ever connected to the net on a single server. Yeah, excuse me while I go laugh at them.

 

No form of sorting is efficient enough; quicksort would still be too slow. Watch, I bet you any money they will say they used bubble sort and that it is surprisingly efficient (for you programmers out there, you should get this joke).


I call utter bullshit. Anyone that has taken a class in algorithm efficiency and computer logic would realize that those claims are a joke. That would be like running supercomputer protein folding on a cell phone; to parse that much information you need some serious hardware. Look at how big Google is: they run a search engine that covers most of the surface net, and they have massive datacenters and servers to make sure it all works, yet you have this tiny university coming along and saying it can keep an index of everything ever connected to the net on a single server. Yeah, excuse me while I go laugh at them.

 

No form of sorting is efficient enough; quicksort would still be too slow. Watch, I bet you any money they will say they used bubble sort and that it is surprisingly efficient (for you programmers out there, you should get this joke).

From their site:

"ZMap is capable of performing a complete scan of the IPv4 address space in under 45 minutes, approaching the theoretical limit of gigabit Ethernet.

ZMap can be used to study protocol adoption over time, monitor service availability, and help us better understand large systems distributed across the Internet."

So yeah, from what I understand they are only scanning the IP addresses. Not the file systems behind them (the actual computers), but only the traffic and protocols regarding the connection.

In other words, it is not exactly a lot of information imo, compared to the storage on all the computers behind those IP addresses. If it approaches the theoretical limit of gigabit Ethernet, that is roughly 100MB/s or more (the hard ceiling is 125MB/s), meaning someone would need something like Google Fiber to make full use of it (but if it is running on servers, they should have enough bandwidth).

At 100MB/s for 45 minutes, that is 45 * 60 * 100 = 270,000MB, or 270GB of information.

Just my thoughts.
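A quick sanity check on that arithmetic (my own numbers; 125MB/s is just the theoretical ceiling of gigabit Ethernet, and 100MB/s is the rough figure used above):

```python
# How much data fits through the link over a 45-minute scan?
SCAN_SECONDS = 45 * 60

for label, mb_per_sec in [("100MB/s (rough figure above)", 100),
                          ("125MB/s (gigabit line-rate ceiling)", 125)]:
    total_mb = SCAN_SECONDS * mb_per_sec
    print(f"{label}: {total_mb:,} MB = {total_mb / 1000:,.1f} GB")

# 100MB/s -> 270,000 MB = 270.0 GB
# 125MB/s -> 337,500 MB = 337.5 GB
```

So the 270GB figure checks out, and even at full line rate the whole scan would move well under 350GB.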



It's impossibruuu to do such things in that amount of time.

It's all BS that came out because of the NSA leaks and shit.


What, no Windows version......

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


That is a very bold claim. Not sure how true it actually is. Think about the amount of data it would have to sift through.


That is a very bold claim. Not sure how true it actually is. Think about the amount of data it would have to sift through.

Again, as I have said, they are basically saying they can scan every IPv4 address and check the traffic of that address along with UPnP stuff regarding it. But that is it.

They do not scan the actual "internet". They do not scan the pages of the websites, nor the computers connected to the addresses. They only look at the addresses, the firmware/vulnerabilities of said addresses, and the traffic regarding said addresses.

At least, that is all they mention seeing with the ZMap program. 270GB for all that seems about right imo. Maybe a little on the light side, but eh.
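To put that 270GB in perspective, spreading it over the whole IPv4 space gives only a few dozen bytes per address (my own rough arithmetic), which is about the size of a single minimal TCP packet on the wire, so it is consistent with one probe per host rather than any crawl of content:

```python
# Rough per-address budget if ~270GB of traffic covers all of IPv4
TOTAL_BYTES = 270e9
ADDRESSES = 2**32

print(f"~{TOTAL_BYTES / ADDRESSES:.0f} bytes per address")  # ~63 bytes

# A minimal Ethernet frame carrying a TCP SYN is roughly 54-64 bytes,
# so ~63 bytes per address is in the right ballpark for one small probe each.
```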



Story at /.

 

Dat title. So misleading and yet true.


Although I do not like what this implies the NSA could do (with their surveillance and such), it does give great power to the average person. Perhaps... too much power.

Wow, that sounds so impossible and almost reminds me of the guy that wanted to print the entire internet. Also, it's better for the average person to have that kind of power instead of it just being restricted to organizations like the NSA. Ultimately nobody should have the ability to surveil someone else; however, since I doubt we can get away from that at this point in time, it is the lesser of two evils to have everyone have access to that kind of power instead of just one group of people. (Also, if everyone had this access I think people would want to get rid of it quicker, so win-win. :D)

