In “The Internet Reborn” (free registration necessary for this long Technology Review article), Wade Roush says that “a grass-roots group of leading computer scientists is working on replacing today's Internet with a faster, more secure, and vastly smarter network: PlanetLab.”
For all its innovations, though, PlanetLab will not displace the existing Internet.
Instead, the PlanetLab researchers, who hail from Princeton, MIT, the University of California, Berkeley, and more than 50 other institutions, are building their network on top of the Internet. But their new machines — called smart nodes — will vastly increase its processing power and data storage capability, an idea that has quickly gained support from the National Science Foundation and industry players such as Intel, Hewlett-Packard, and Google.
Since starting out in March 2002, PlanetLab has linked 175 smart nodes at 79 sites in 13 countries, with plans to reach 1,000 nodes by 2006.
Traffic on the regular Internet is handled by routers, which can do little more than recognize data packets and pass them along to the next router. By contrast, PlanetLab's smart nodes are full-fledged computers that can do much more than forward traffic, such as analyzing traffic patterns to block attacks by viruses or worms.
This is the goal of one PlanetLab project, Netbait, developed by researchers at Intel and UC Berkeley. Let's switch to PlanetLab's own words.
Netbait is a PlanetLab service that provides distributed detection of machines infected with Internet worms. It provides an architectural framework which allows for detailed analysis of both the extent and nature of worm propagation by leveraging the PlanetLab testbed's geographical diversity and multiple network viewpoints. Each participating machine runs a set of simplified network services which log all incoming requests. Such machines are then federated, log data is continuously collected, and pattern matching is performed to identify well-known signatures of various worms and viruses (e.g., Code Red, Nimda, etc.) and to index the relevant data and make it available via remote procedure calls. The end result is a service which network administrators, or even programs (e.g., a daemon that controls firewall rules), might subscribe to in order to automatically isolate compromised machines and prevent the spread of infection.
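The core loop described above, collecting request logs from federated machines and matching them against well-known worm signatures, can be sketched in a few lines of Python. The signature table below is hypothetical: the `default.ida` and `cmd.exe` probe strings are the widely documented footprints of Code Red and Nimda, but the actual rules and interfaces Netbait used are not given in the article.

```python
import re

# Hypothetical signature table mapping worm names to request patterns.
# Real Netbait rules are not described in the article; these two regexes
# reflect the commonly documented Code Red and Nimda probe strings.
WORM_SIGNATURES = {
    "code-red": re.compile(r"/default\.ida\?N+"),
    "nimda": re.compile(r"/(scripts|_vti_bin).*cmd\.exe|root\.exe"),
}

def classify_requests(log_lines):
    """Map each worm name to the set of source hosts whose requests matched.

    Each log line is assumed to start with the source address, followed by
    the request itself (a simplification of a real HTTP access log).
    """
    hits = {}
    for line in log_lines:
        src, _, request = line.partition(" ")
        for worm, pattern in WORM_SIGNATURES.items():
            if pattern.search(request):
                hits.setdefault(worm, set()).add(src)
    return hits
```

A subscriber such as a firewall-controlling daemon could then poll this index and isolate any host that appears in it.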
Another PlanetLab project is CoDeeN, a Content Distribution Network (CDN) whose goal is to eventually become a free “public Akamai.”
There is also the Scriptroute project, a system for distributed Internet debugging and measurement. Roush explains.
Developed at the University of Washington, it's a distributed program that uses smart nodes to launch probes that fan out through particular regions of the Internet and send back data about their travels. The data can be combined into a map of the active links within and between Internet service providers' networks, along with measurements of the time packets take to traverse each link.
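The combining step Roush describes can be sketched as a simple aggregation. Assuming, hypothetically, that each probe returns a traceroute-style list of (router, cumulative round-trip time) pairs, the latency attributed to a link is the RTT increase between consecutive hops, averaged over every probe that crossed that link:

```python
def link_latencies(probes):
    """Combine traceroute-style probe results into a per-link latency map.

    `probes` is a list of paths; each path is a list of
    (router, cumulative_rtt_ms) pairs in hop order. The value for a link
    (a, b) is the mean RTT increase observed between hops a and b.
    """
    totals = {}  # (hop_a, hop_b) -> [sum_of_deltas_ms, probe_count]
    for path in probes:
        for (a, rtt_a), (b, rtt_b) in zip(path, path[1:]):
            entry = totals.setdefault((a, b), [0.0, 0])
            entry[0] += rtt_b - rtt_a
            entry[1] += 1
    return {link: total / count for link, (total, count) in totals.items()}
```

This is only the map-building half; launching the probes themselves is the distributed part that Scriptroute runs on the smart nodes.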
Let's finish this round of innovations with OceanStore, a distributed storage system.
OceanStore encrypts files (memos and other documents, financial records, digital photos, music, or video clips), then breaks them into overlapping fragments. The system continually moves the fragments and replicates them on nodes around the planet. The original file can be reconstituted from just a subset of the fragments, so it's virtually indestructible even if a number of local nodes fail.
OceanStore's goal is to produce software capable of managing 100 trillion files, or 10,000 files for each of 10 billion people.
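The fragment-and-reconstruct idea can be illustrated with a toy erasure code. This sketch adds a single XOR parity fragment, so the file survives the loss of any one of its k+1 pieces; OceanStore's actual coding scheme, which is not detailed in the article, tolerates many simultaneous losses, and the encryption step is omitted here.

```python
def fragment(data, k):
    """Split data into k fragments plus one XOR parity fragment.

    The file can then be rebuilt from any k of the k+1 pieces. Data is
    zero-padded to a multiple of k, so callers should track the true length.
    """
    size = -(-len(data) // k)            # fragment size, rounding up
    data = data.ljust(size * k, b"\0")   # zero-pad to a multiple of k
    frags = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = bytearray(size)
    for f in frags:
        for i, byte in enumerate(f):
            parity[i] ^= byte
    return frags + [bytes(parity)]

def reconstruct(pieces, k):
    """Rebuild the (padded) file from any k of the k+1 indexed pieces.

    `pieces` is a list of (index, fragment) pairs; index k is the parity.
    """
    present = dict(pieces)
    missing = [i for i in range(k) if i not in present]
    if missing:                          # XOR parity recovers one lost piece
        (m,) = missing
        recovered = bytearray(present[k])
        for i in range(k):
            if i != m:
                for j, byte in enumerate(present[i]):
                    recovered[j] ^= byte
        present[m] = bytes(recovered)
    return b"".join(present[i] for i in range(k))
```

Scattering such pieces across geographically diverse nodes is what makes the loss of any single site survivable.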
Here is Roush's conclusion.
PlanetLab aims to transform today's dumb, simple Internet communications system into a smarter and much more flexible network that can ward off worms, store huge amounts of data with perfect security, and deliver content instantly.
If you have time, Roush's article contains many more details.
Source: Wade Roush, Technology Review, September 26, 2003 [Roland Piquepaille's Technology Trends]