
The Great Rewiring

by Richard Koman
08/20/2001

A year ago, peer-to-peer was a nebulous concept, and many people had trouble understanding exactly what it meant and how it encompassed everything from Napster and Gnutella to instant messaging and distributed computing. For many, Clay Shirky captured the essence of P2P with his observation that most users were essentially the "dark matter of the Internet" -- intermittently connected PCs cut off from the DNS system that makes machines visible and reachable.

Shirky kicks off the O'Reilly Conference on P2P and Web Services in September with a keynote titled "The Great Rewiring." OpenP2P.com editor Richard Koman reached Shirky by phone to talk about this "rewiring," Web services and the future of P2P.

Richard Koman: Your keynote is called "The Great Rewiring." What does that mean?

Clay Shirky: The argument I'm advancing is partly historical and partly rooted in recent changes in technology. The argument I want to make is that what we're seeing is the result of a bunch of forces that were put in place about 15 years ago. In the first years of the 1980s, both halves of what we think of as the modern computing landscape launched. On January 1, 1983, the ARPANET switched over to running on the Internet Protocol and became, for all intents and purposes, the Internet we recognize today. Then in January 1984, we saw the PC with a GUI launch in the form of the Apple Macintosh. And for 10 years those two revolutions didn't connect to one another. PCs were very rarely connected directly to the Internet. The two revolutions were sort of going on parallel tracks.

Then a period that I'm calling "the great wiring" came about -- from '94 to '99 -- when Mosaic, and then later Netscape and IE, gave people a reason to connect their PCs directly to the Internet for the first time. But those connections were really kind of weak and lame -- temporary IP addresses and so forth. And the change that peer-to-peer brought about seems to me to be a "great rewiring": a way of rethinking the ways in which the personal computer, with all of the expectations that it brings to the user in terms of local speed and graphic interfaces, can be connected more directly to the Internet.

Koman: So this is specifically a P2P innovation? You put P2P right at the center of this rewiring?

Shirky: Peer-to-peer is the more kind of philosophical question; Web services right now is a more technological set of questions. But I think the two can actually inform one another quite a bit. It's plain to anybody looking at the peer-to-peer movement that one of the things it's critically lacking is an agreed-upon set of infrastructure and data standards. This is what Web services is trying to create, obviously, at its core. It seems likelier to me that peer-to-peer will converge on standards pioneered by the Web services people, rather than on standards arising directly out of the peer-to-peer world.

Koman: As far as standards in the peer-to-peer world, it seems like it's been a complete nonstarter for the past year.

Shirky: Exactly right. The only things we have in the peer-to-peer world that are even starting to look like standards are essentially bilateral interop agreements around certain technologies: "Here is a way you can write XML to the Groove network" and "Here is a way to use Jabber to pass XML documents in real time." But really those are hardly standards. They're barely application frameworks.

Web services, by starting with a standards-driven goal, stands a much better chance in my mind of providing not only the standards and infrastructure for Web services but the spill-over: standards and infrastructure for peer-to-peer. So the obvious win for peer-to-peer from Web services is better interoperability. And I think there are a couple of important things the Web services people can learn from the peer-to-peer people as well.

The first is that HTTP is not the be-all and end-all of transport mechanisms. What the peer-to-peer people have been really good at is finding different transport mechanisms for different needs. Jabber will handle things in real time, logging presence and identity. Napster took the idea of HTTP but rewrote it in asynchronous mode so you wouldn't get traffic jams. Companies like 3Path are building on top of SMTP as a transport mechanism. There are people looking at BEEP or BXXP. And one of the things I think the Web services people are going to learn from peer-to-peer is that there are reasons to use protocols other than HTTP in certain circumstances. When the wealth of innovation in the peer-to-peer world is exposed, I think it's going to be very valuable for the Web services people.
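As a concrete, purely illustrative sketch of that point, the fragment below moves the same XML payload over two different transports using only Python's standard library. The host names, path, and mail addresses are hypothetical placeholders, not anything drawn from the interview.

```python
# A minimal sketch (not from the interview): the same XML payload carried over
# two different transports. Host names, path, and addresses are hypothetical.
import http.client
import smtplib
from email.mime.text import MIMEText

PAYLOAD = "<query><artist>Frank Sinatra</artist></query>"

def send_over_http(host: str, path: str) -> int:
    """POST the XML payload over HTTP, the default Web services transport."""
    conn = http.client.HTTPConnection(host)
    conn.request("POST", path, body=PAYLOAD,
                 headers={"Content-Type": "text/xml"})
    status = conn.getresponse().status
    conn.close()
    return status

def send_over_smtp(mail_host: str, sender: str, recipient: str) -> None:
    """Send the same payload as a mail message, with SMTP as the transport."""
    msg = MIMEText(PAYLOAD, "xml")
    msg["Subject"] = "xml-request"
    msg["From"] = sender
    msg["To"] = recipient
    with smtplib.SMTP(mail_host) as smtp:
        smtp.send_message(msg)

# send_over_http("service.example.net", "/rpc")          # hypothetical endpoint
# send_over_smtp("mail.example.net", "a@example.net", "b@example.net")
```

The message itself is unchanged in both cases; only the delivery mechanism differs, which is the substance of the argument above.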

The other thing I think Web services can learn from peer-to-peer is that the idea of client-server is an attitude about transactions, not an attitude about machines. What peer-to-peer has shown us is that a machine can as easily be a client in one second and a server in the next, or indeed a client and a server at the same time. Notions like "Oh, there's the Web server and there's the Web browser, and those are two fundamentally separate things" are, I think, going to break down in the Web services world. And we've seen a great deal of innovation with clervers and transceivers and nodes and all the other words the peer-to-peer people have for a device that acts as both client and server in different environments.
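One way to picture that dual role is a single process that serves and fetches at the same time. The sketch below, with a hypothetical port and neighbor URL, is a minimal illustration of the idea rather than any particular product Shirky mentions.

```python
# A minimal sketch of a node playing both roles at once: it answers HTTP
# requests from other peers while fetching from a neighbor in the same
# process. The port and neighbor URL are hypothetical.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ShareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server role: hand our local "shared" data to whoever asks.
        body = b"list of files this peer is sharing\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

def run_peer(port: int, neighbor_url: str) -> None:
    # Server role: listen for requests from other peers in the background.
    server = HTTPServer(("0.0.0.0", port), ShareHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Client role: in the same process, ask a neighboring peer what it shares.
    with urllib.request.urlopen(neighbor_url) as response:
        print(response.read().decode())

# run_peer(8080, "http://peer.example.net:8080/")  # hypothetical neighbor
```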

Koman: What is the Web services people's understanding of client-server right now? Fairly traditional?

Shirky: It's almost completely traditional. Essentially, they are trying to take what the Web did for publishing and apply it to computing in general -- to say that simple requests and structured replies can form the backbone of everything from exposing business processes to remote procedure calls. But they're still very much in this mode of request-and-response, where one machine is consistently acting as the client and the other machine is consistently acting as the server. And I think what we've seen with Napster is that if you're downloading from me a Frank Sinatra song, a Motown song and a Talking Heads song, I might think, "Well, that guy's got really diverse taste. Why don't I go see what he's doing?" So at the same time I'm being a server to you, you are being a server to me.

In the Web services world, you can readily imagine, instead of a single centralized cross-border tax-calculating engine, two machines essentially communicating with one another bilaterally: "Hey, here's what the tax rates are in my country. What are they in your country?" And that kind of two-way conversation is, I think, going to become a more integral part of Web services than people are now imagining.
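Purely as a sketch of that two-way conversation -- the rate, port, and counterpart URL below are invented -- each machine runs the same code, exposing its own tax rate over XML-RPC while querying the other's:

```python
# A sketch of the bilateral conversation: each side exposes its own tax rate
# over XML-RPC and asks the counterpart for its rate. The rate, port, and
# counterpart URL are invented for illustration.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

LOCAL_TAX_RATE = 0.19  # hypothetical local rate

def serve_local_rate(port: int) -> None:
    """Server role: let the other machine query our rate."""
    server = SimpleXMLRPCServer(("0.0.0.0", port), logRequests=False)
    server.register_function(lambda: LOCAL_TAX_RATE, "get_tax_rate")
    threading.Thread(target=server.serve_forever, daemon=True).start()

def ask_counterpart(url: str) -> float:
    """Client role: ask the counterpart machine for its rate."""
    proxy = xmlrpc.client.ServerProxy(url)
    return proxy.get_tax_rate()

# serve_local_rate(9000)
# their_rate = ask_counterpart("http://counterpart.example.net:9000/")
```

Because both sides run the same code, neither is permanently "the client" or "the server"; the roles exist only per transaction.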

Koman: So does that equal something more like "peer services?" I mean is the term "Web services" a holdover from this traditional view of client-server?

Shirky: Yes, absolutely. "Web services" is a holdover from traditional client-server, and it started with the notion that HTTP was such a flexible transport protocol that it was going to become the baseline of everything new, and that all we needed to replace was HTML -- and we needed to replace it with XML. I think what we're seeing is that of the traditional Web services stack -- HTTP, SOAP, WSDL and UDDI -- the only thing that's really questionable is the HTTP part. It might as well be SMTP or FTP or even BXXP, or what have you. But a little bit like peer-to-peer, I think the label "Web services" is going to stick even if the definition becomes mutable. You know, five years ago "the Web" meant HTML documents mediated by HTTP. "The Web" now means the publicly accessible Internet. So I think we're not only going to keep the word "Web services"; inasmuch as it becomes popular, it's going to change the meaning of the word "Web" away from the narrower protocol-driven definition and toward the larger public-Internet definition.
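For reference, the SOAP layer of that stack is just an XML envelope. The example below, with an invented operation name and namespace, makes the point that nothing in the message itself depends on HTTP:

```python
# An illustrative SOAP 1.1 envelope (the GetTaxRate operation and the
# urn:example:tax namespace are invented). The envelope is plain XML and says
# nothing about HTTP, which is why the transport underneath -- HTTP, SMTP,
# Jabber, or BXXP -- is the one interchangeable layer of the stack.
SOAP_ENVELOPE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetTaxRate xmlns="urn:example:tax">
      <Country>US</Country>
    </GetTaxRate>
  </soap:Body>
</soap:Envelope>"""

if __name__ == "__main__":
    # Hand this same envelope to any of the transport sketches above.
    print(SOAP_ENVELOPE)
```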
