
Peer-to-Peer for Academia

Bandwidth Issues

And finally, bandwidth, one of the fundamental features of Internet2. I've saved this for last among my technical topics because the popularity of file-sharing systems on college campuses and the negative reaction of many system administrators deserve a good share of time.



By decentralizing data and redirecting users to download directly from other users' computers, Napster reduced the load on its servers to the point where it could cheaply support tens of millions of users. The same principle is used in many commercial peer-to-peer systems; I just mentioned it in relation to McAfee ASaP. In short, peer-to-peer can distribute not only files but also the burden of supporting network connections. The overall bandwidth required on the Internet remains the same as in centralized systems, but bottlenecks are eliminated at central sites and, equally importantly, at their ISPs.
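
To make that division of labor concrete, here is a minimal sketch in Python of a Napster-style central index. This is not Napster's real protocol; the class name, peer addresses, and song titles are invented for illustration. The point is that the central site handles only small metadata messages, while the multi-megabyte files move directly between peers.

    # Minimal sketch of a Napster-style index: the central service stores only
    # metadata (which peer claims to have which song), so the heavy transfers
    # happen directly between peers. Names and data are illustrative only.
    from collections import defaultdict

    class CentralIndex:
        def __init__(self):
            # song title -> set of peer addresses that claim to share it
            self.catalog = defaultdict(set)

        def register(self, peer_addr, songs):
            """A peer announces the songs it is willing to share."""
            for title in songs:
                self.catalog[title].add(peer_addr)

        def lookup(self, title):
            """Return peer addresses only; the index never serves the file itself."""
            return sorted(self.catalog.get(title, ()))

    if __name__ == "__main__":
        index = CentralIndex()
        index.register("10.0.0.5:6699", ["Ode to Joy", "Some Garage Demo"])
        index.register("10.0.0.9:6699", ["Ode to Joy"])
        # A client picks one of these addresses and downloads the file directly,
        # so only a few hundred bytes of metadata ever cross the central site.
        print(index.lookup("Ode to Joy"))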


How much bandwidth does a simple peer-to-peer system like Napster save? Let's look at some rough estimates made by a company called CenterSpan, which makes a peer-to-peer content-sharing system called C-Star. They estimate that, if you put together Napster and the various Gnutella systems and all the knock-offs, you'd see about 3 billion songs traded every month. Sounds like a high number, but it's been replicated elsewhere and could be pretty accurate. If you delivered all those songs from a central server, you'd need 25,000 T1 lines costing 25 million dollars a month. Peer-to-peer has to be more efficient.
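
For the skeptics, here is the same estimate as a back-of-envelope calculation in Python. The 3 billion songs per month is CenterSpan's figure from above; the average song size, T1 capacity, and T1 price are my own assumptions, and the calculation spreads the traffic evenly over the month, so peak loads would be even worse.

    # Rough check of the CenterSpan estimate. The per-song size, T1 capacity,
    # and T1 price are assumptions, not figures from the article.
    SONGS_PER_MONTH = 3e9        # CenterSpan's estimate for all P2P song trading
    AVG_SONG_MB = 4              # assume roughly 4 MB per MP3
    T1_MBPS = 1.544              # capacity of one T1 line
    T1_COST_PER_MONTH = 1_000    # assume roughly $1,000 per T1 per month

    SECONDS_PER_MONTH = 30 * 24 * 3600
    total_bits = SONGS_PER_MONTH * AVG_SONG_MB * 8e6   # 1 MB = 8e6 bits here
    avg_mbps = total_bits / SECONDS_PER_MONTH / 1e6

    t1_lines = avg_mbps / T1_MBPS
    print(f"Average load: {avg_mbps:,.0f} Mbps")
    print(f"T1 lines needed: {t1_lines:,.0f}")
    print(f"Monthly cost: ${t1_lines * T1_COST_PER_MONTH:,.0f}")

With those assumptions the script lands at roughly 24,000 T1 lines and about 24 million dollars a month, which is in the same ballpark as the figures above.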

Many network administrators will now protest that Napster was a bandwidth hog and overloaded their campus networks. This is not a problem caused by peer-to-peer: The load would have been just as bad had all those students exchanged files over FTP or some other protocol. Music files are just plain big, and if it suddenly became the hippest thing on the planet to exchange PowerPoint presentations or 100-page PostScript files (like term papers), the load would be just the same.

I'm not surprised that colleges would complain about Napster's bandwidth requirements, because I hear the same wringing of hands over education in general. I hear there are too many applicants to top colleges. Excuse me, but wouldn't it be good to educate more students? Instead of saying there are too many applicants, why don't you work on increasing the availability of high-quality course offerings? I know you don't have tenure-track positions for all the people awarded doctorates, but it's not your job to offer everyone a position; it's your job to educate them.

College administrators have fallen into the same rut as telephone companies that are slow to roll out high-bandwidth lines, or the recording industry that is shutting down Napster. These institutions all find it more profitable to manage scarcity than to offer abundance.


I'll apply the same reasoning now to Napster. The reason tens of millions of people used it is that it opened up the wonderful universe of music. Napster was much more than a free source of popular tunes; it represented exploration, a striving to know the unknown, a widening of cultural horizons. Yes, I know most of the stuff traded over Napster was junk, but much of what Beethoven wrote was also junk; just ask any musicologist. The point is that you need to cut a wide swath to encourage new experiences and new sounds. Universities should be excited by the spirit of curiosity shown by Napster. It was a flowering of cultural opportunities never before seen in the world, and that's why there were so many downloads. Let's provide bandwidth for the material people want instead of complaining that they want it.

I know you have to pay for bandwidth, so somehow you have to charge for it too. I'm sure your users like to have things cost-free, and the next best alternative to cost-free is flat-rate. If you have to move to some kind of chokepoint or metered pricing, I don't have a right to criticize you, but I'd like to offer a couple of points of comparison for you to consider first.

One frequently made observation is that Internet access and use are much greater in the United States, where local calls are priced at flat rates, than in most countries where local calls are metered. Observers tend to conclude that flat-rate pricing encourages experimentation, and they suggest that many innovative uses of the Internet arose within this environment of experimentation. The whole phenomenon called "surfing," for instance, was a critical phase in the growth of the World Wide Web.

A second fascinating historical point of comparison is the New York City subway system. When it opened in the early 1900s, the city's leaders had to decide whether to base fares on how far people rode, or to charge a nickel for every ride regardless of distance. They chose the latter, a flat-rate system. Historians believe this decision led to the rapid spread of New Yorkers out of Manhattan and into the surrounding boroughs, creating a richer and more thriving city.

Social Impact and Public Excitement


Peer-to-peer excites people because they can participate and make a difference. Even something as impersonal as SETI@home, where users downloaded software that performed calculations in the background, attracted millions of volunteers. And many said they did it because they felt like they were part of something. Just think how much more sense of ownership and pride can evolve around systems where you share ideas and content that have personal meaning to you.
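
For readers who never ran it, the SETI@home model is easy to picture as a loop: fetch a chunk of data, crunch it quietly in the background, and send the answer back. The toy Python below illustrates that fetch-compute-report cycle; the functions and the fake "analysis" are stand-ins I invented, not the project's actual client code.

    # Toy illustration of the volunteer-computing loop popularized by SETI@home:
    # fetch a work unit, analyze it in the background, report the result.
    # Everything here is a local stand-in; the real client spoke to project servers.
    import random
    import time

    def fetch_work_unit():
        """Stand-in for downloading a chunk of radio-telescope data."""
        return [random.random() for _ in range(100_000)]

    def analyze(samples):
        """Stand-in for the signal analysis; here we just report the peak value."""
        return max(samples)

    def report(result):
        print(f"Reporting result to the project server: peak={result:.6f}")

    if __name__ == "__main__":
        for _ in range(3):       # a real client loops indefinitely at low priority
            report(analyze(fetch_work_unit()))
            time.sleep(0.1)      # yield the CPU, simulating background operation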

University professors are already feeling anxious about students who share personal notes on the Web. Some professors have tried to force students to remove these notes, all the while uttering the same irrelevant shibboleths that plague peer-to-peer now: the professors claim to be worried about quality; they think they own intellectual property rights to the ideas they put forward in class (not true!); and so forth. Everybody knows the professors are just scared to death of having their work exposed to scrutiny.

Peer-to-peer will up the ante even further. We have no idea what students will think of sharing next. Their experiments should be welcomed, because they will make the university more transparent and force professors to teach better. Remember what happened--excuse me if this is trite--when Alexander Fleming discovered a foreign mold in one of his petri dishes. Instead of throwing it out, he started to research it, and discovered penicillin.

There's still a lot of innovation left for computer technology. The next time you need rapid interaction, efficient data sharing, and the combined processing of inputs from many different sources, I urge you to look at what peer-to-peer can offer.

Andy Oram is an editor for O'Reilly Media, specializing in Linux and free software books, and a member of Computer Professionals for Social Responsibility. His web site is www.praxagora.com/andyo.

