Linux and the Tools Philosophy
07/25/2000
Let's look at one of the foundations that Linux is built on: the "tools philosophy" of Unix.
A product of the late 1960s, the Unix operating system and its related software were created by Ken Thompson, Dennis Ritchie, Brian Kernighan, and many other hackers at Bell Labs; its name was a pun on "Multics," another operating system of the time.
Much has happened in the history of Unix since those early years, and there have since evolved many "flavors" and variations of Unix-like operating systems, of which Linux has become the most popular.
That this operating system has survived in recognizable form for more than thirty years should tell us something about the soundness of its design considerations. And one of these considerations -- perhaps the most enduring -- is the "tools" philosophy.
The closed-source, anti-Unix approach
Most operating systems are designed with a concept of files, come with a set of utility programs for handling these files, and then leave it to the large "applications" to do the interesting work: a word processor, a spreadsheet, a presentation designer, a Web browser. (When it just so happens that a few of these applications recognize each other's file formats or share a unified interface, they're called "suites.")
Each of these monolithic applications presumably has an "open file" command to read a file from disk and open it in the application; most of them, too, come with commands for searching and replacing text, spellchecks, printing the current document, and so on. The program source code for handling all of these tasks must be accounted for separately, inside each application -- taking up extra space both in memory and on disk.
And in the case of proprietary software, all of the actual program source code is kept from the public -- so other programmers can't use, build to, or learn from any of it. This kind of closed-source software is presented to the world as a kind of magic trick: if you buy a copy of the program, you may use it, but you can never learn how the program actually works.
The result is that the code to handle essentially the same function inside all of these different applications must be written by programmers from scratch, separately and independently, each time. The progress of society as a whole is set back by the countless hours of time and energy programmers waste inefficiently reinventing the same software functions to perform the same tasks, over and over again.
The synergy of tools
Unix-like operating systems do not put so much weight on application programs. Instead, they come with a lot of small programs called "tools." Each tool is generally capable of performing a very simple, specific task, and performing it well -- one tool does nothing but output the file(s) or data passed to it, one tool spools its input to the print queue, one tool sorts the lines of its input, and so on.
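As a small sketch of this idea (the filename here is hypothetical, and `wc` stands in for the print-spooling tool, which needs a configured printer to run), each of these single-purpose tools is an ordinary command of its own:

```shell
# Create a throwaway file to demonstrate with (hypothetical name):
printf 'pears\napples\n' > notes.txt

cat notes.txt    # outputs the file's contents, nothing more
sort notes.txt   # outputs its lines in alphabetical order
wc -l notes.txt  # counts its lines
```

Each command reads input, does its one small job, and writes output -- and that is all it does.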
An important early development in Unix was the invention of "pipes," a way to pass the output of one tool to the input of another. By knowing what the individual tools do and how they are combined, a user could now build powerful "strings" of commands.
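A minimal sketch of such a "string" of commands, using standard tools -- here, counting which lines occur most often in some sample input:

```shell
# Each tool does one small job; pipes chain them into a larger one.
printf 'banana\napple\ncherry\napple\n' |  # emit four lines of sample data
  sort |                                   # group identical lines together
  uniq -c |                                # collapse duplicates, prefixing a count
  sort -rn                                 # order by count, most frequent first
```

None of the four tools knows anything about "finding the most frequent line," yet their combination performs exactly that task.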
Just as the tensile strength of an alloy like steel is greater than the combined strength of its components -- chiefly iron and carbon -- multiple tools can be combined to perform a task unpredicted by the function of any individual tool. This is the concept of "synergy," and it forms the basis of the Unix tools philosophy.