
Interoperability, Not Standards

by Clay Shirky
03/15/2001

"Whatever else you think about, think about interoperability. Don't think about standards yet."

Nothing I said at the O'Reilly P2P conference in San Francisco has netted me more flak than that statement. To advocate interoperability while advising caution on standards seems oxymoronic -- surely standards and interoperability are inextricably linked?

Indeed, the coupling of standards and interoperability is the default for any widely dispersed technology. However, there is one critical period where interoperability is not synonymous with standardization, and that is in the earliest phases of work, when it is not entirely clear what, if anything, should be standardized.

People working with hardware, where Pin 5 had better carry voltage on all plugs from the get-go, need a body creating a priori standards. In the squishier field of software, however, the history of RFCs demonstrates a successful model where standards aren't created out of whole cloth, but ratify existing practice. "We reject kings, presidents and voting. We believe in rough consensus and running code," as David Clark put it. Standardization of software can't proceed in a single giant hop, but requires some practical solution to point to first.

I take standardization to be an almost recursive phenomenon: a standard is any official designation of a protocol that is to be adopted by any group wanting to comply with the standard. Interoperability, meanwhile, is much looser: two systems are interoperable if a user of one system can access even some resources or functions of the other system.
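
To make the distinction concrete, here is a minimal sketch of interoperability without a standard (everything in it -- the two services, their query formats, the field names -- is hypothetical, not drawn from any real P2P system): a thin adapter lets a client written for one service reach some of a second service's resources, even though the two never agreed on a common protocol.

    # Hypothetical sketch: interoperability via a one-off adapter, with no shared standard.
    # Neither "service A" nor "service B" refers to a real P2P system.

    def search_service_a(keyword):
        # Service A speaks its own ad hoc query format and returns (name, address) pairs.
        return [("song.mp3", "10.0.0.12")]

    def search_service_b(query_terms):
        # Service B expects a list of terms and returns dicts with different field names.
        return [{"filename": "song.mp3", "host": "10.0.0.99"}]

    def adapt_b_to_a(results_b):
        # The adapter translates B's answers into the shape A's client already expects.
        return [(r["filename"], r["host"]) for r in results_b]

    def federated_search(keyword):
        # A user of system A can now reach some of B's resources -- interoperability --
        # without either side adopting a group-wide standard.
        return search_service_a(keyword) + adapt_b_to_a(search_service_b([keyword]))

    print(federated_search("song"))

The point of the sketch is that the adapter is bilateral and disposable; a standard, by contrast, asks every member of a group to change what it emits.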

Because standardization requires a large enough body of existing practice to be worth arguing over, and because P2P engineering is in its early phases, I believe that a focus on standardization creates two particular dangers: risk of premature group definition and damage to meaningful work. Focusing on the more modest goals of interoperability offers a more productive alternative, one that will postpone but improve the eventual standards that do arise.

Standardization and Group Definition

A standard implies group adoption, which presupposes the existence of a group, but no real P2P group exists yet. (The P2P Working Group is an obvious but problematic candidate for such a group.) The only two things that genuinely exist in the P2P world right now are software and conversations, which can be thought of as overlapping circles:

  • There is a small set of applications that almost anyone thinking about P2P regards as foundational -- Napster, ICQ and SETI@Home seem to be as close to canonical as we're likely to get.
  • There is a much larger set of applications that combine or extend these functions, often with a view to creating a general purpose framework, like Gnutella, Jabber, Aimster, Bitzi, Allcast, Groove, Improv, and on and on.
  • There is a still larger set of protocols and concepts that seem to address the same problems as these applications, but from different angles -- on the protocol front, there are attempts to standardize addressing and grid computing with things like UDDI, XNS, XML-RPC, and SOAP, and conceptually there are things like the two-way Web, reputation management and P2P journalism.
  • And covering all of these things is a wide-ranging conversation about something called P2P that, depending on your outlook, embraces some but probably not all of these things.

What is clear about this hodge-podge of concepts is that there are some powerful ideas here about unlocking resources at the edges of the Internet and democratizing the Internet as a media channel.


What is not clear is which of these things constitute any sort of group amenable to standards. Should content networks use a standard format for hashing their content for identification by search tools? Probably. Would the distributed computation projects benefit from a standard client engine to run code? Maybe. Should the people who care about P2P journalism create standards for all P2P journalists to follow? No.
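
On the first of those questions, the kind of narrow agreement at stake is illustrated by the sketch below (the choice of SHA-1 and the file name are assumptions for illustration, not a proposal): two search tools that hash content the same way will compute the same identifier for the same file, without agreeing on anything else.

    # Minimal sketch of content identification by hash
    # (assumed convention: SHA-1 over the file's raw bytes).
    import hashlib

    def content_id(path):
        # Return a hex digest any tool following the same convention would compute.
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Two independently written search tools can now recognize the same file:
    # print(content_id("some_shared_file.mp3"))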

P2P is a big tent right now, and it's not at all clear that there is any one thing that constitutes membership in a P2P group, nor is there any reason to believe (and many reasons to disbelieve) that there is any one standard, other than eventually resolving to IP addresses for nodes, that could be adopted by even a large subset of companies who describe themselves as "P2P" companies.

Standardization and Damage to Meaningful Work

Even if P2P had a crystal-clear definition at this point -- one within which it was clear which sub-groups should be adopting standards -- premature standardization would still risk destroying meaningful work.

This is the biggest single risk with premature standardization -- the loss of that critical period of conceptualization and testing that any protocol should undergo before it is declared superior to its competitors. It's tempting to believe that standards are good simply because they are standard, but to have a good standard, you first need a good protocol, and to have a good protocol, you need to test it in real-world conditions.

Imagine two P2P companies working on separate metadata schemes; call them A and B. For these two companies to standardize, there are only two options: one standard gets adopted by both groups, or some hybrid standard is created.

Now if both A and B are in their 1.0 versions, simply dropping B in favor of A for the sole purpose of having a standard sacrifices any interesting or innovative work done on B, while the idea of merging A and B could muddy both standards, especially if the protocols have different design maxims, like "lightweight" vs. "complete."
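
To see the dilemma concretely, here is a hypothetical sketch of what A's and B's records for the same file might look like (the field names and values are invented for illustration): the "lightweight" scheme records the bare minimum, the "complete" one nests detail, and a hybrid either discards B's detail or saddles A's implementers with structure they never wanted.

    # Hypothetical metadata for the same file under two design maxims.

    # Scheme A: "lightweight" -- flat, minimal, easy to implement.
    record_a = {
        "title": "song.mp3",
        "size": 4210000,
        "hash": "ab54d28690c1",  # truncated for illustration
    }

    # Scheme B: "complete" -- nested, descriptive, harder to implement.
    record_b = {
        "resource": {
            "title": "song.mp3",
            "byte_length": 4210000,
            "digests": [{"algorithm": "sha1", "value": "ab54d28690c1"}],
            "provenance": {"publisher": "unknown", "license": "unspecified"},
        }
    }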

This is roughly the position of RSS and ICE, or XML-RPC and SOAP. Everyone who has looked at these pairs of protocols has had some sense that they solve similar problems, but as it is not immediately obvious which one is better (and better here can mean "most lightweight" or "most complete," "most widely implemented" or "most easily extensible," and so on), work continues on both.

This could also describe things like Gnutella vs. Freenet, or further up the stack, BearShare vs. ToadNode vs. Lime Wire. What will settle these things in the end is user adoption -- faced with more than one choice, the balance of user favor will either tip decisively in one direction, as with the fight over whether HTML should include visual elements, or else each standard will become useful for particular kinds of tasks, as with Perl and C++.

Premature standardization is a special case of premature optimization, the root of all evil, and in many cases standardization will have to wait until something more organic happens: interoperability.
