
(Slightly skeptical) Annotated Collection of Quotes from Eric Raymond's CatB Book


Eric Raymond is a very controversial figure. The author of the famous "Cathedral and the Bazaar" paper is widely considered the leading Linux evangelist and a spokesman for the Open Source movement. But sometimes he behaves like a preacher speaking about family values, or even like the leader of some obscure cult, addressing his followers or his enemies with emotional insults as the only available weapon. Sometimes he behaves like a "Libertarian commissar" making ridiculous statements. Some of them are really funny and probably deserve to be included in the corpus of Internet humor, like the quote from his speech "On Socially Responsible Programming". His personality also has a dark side -- a long and vicious vendetta against RMS and the FSF that damaged the Open Source movement and to a certain extent split the community into a Linux camp and a GNU camp.

In September 1998 Microsoft's "Halloween Documents" -- internal corporate memos -- were leaked to the public. In October they provoked a big discussion about the future of Linux and about the commentaries on the memos posted by Eric Raymond. A very nice parody of Eric's early paper, The Circus Midget and the Fossilized Dinosaur Turd, was created. Approximately at this time the term "raymondism" was born, denoting an extremely biased view of OSS, bordering on blind-folded chauvinism.

We will use the abbreviations CatB for "The Cathedral and the Bazaar", HtN for "Homesteading the Noosphere" and tMC for "The Magic Cauldron". The quotes below belong to earlier versions of CatB and HtN. The latest versions of ESR's papers can be found in the O'Reilly book.

Work on the O'Reilly book also led to updates of CatB and HtN.

See also: Interesting Quotes from interviews and Speeches

Quotes from CatB

All quotes are taken from [ESR1998a] "The Cathedral and the Bazaar," First Monday, Vol. 3, No. 3, March 2, 1998.

Collection of annotated quotes from the Cathedral and the Bazaar

This is not obvious and requires a closer examination of the cost of catching bugs late in the development cycle. For Linux this is not an issue, as Linux is a clone of Unix and most design issues were already solved. But in the general case, if the bug is due to a design defect (as it can be in Perl), it can cost significantly more (two orders of magnitude) to find and fix it at this phase. Delegation of debugging is nice for the project leader, but a situation in which a couple of dozen gifted programmers waste time trying to find the same bug with no coordination is not that efficient...

...I then make a sustained argument from the Linux experience for the proposition that "Given enough eyeballs, all bugs are shallow"...
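The "given enough eyeballs" claim can be restated as a toy probability model -- mine, not ESR's. If each reviewer independently found a given bug with probability p per release, the chance the bug survives all of them would be (1 - p)^n. The function and the numbers below are purely illustrative, and the independence assumption is exactly what the objection above disputes: a dozen programmers chasing the same bug without coordination is duplicated effort, not independent coverage.

```python
# Toy model (illustrative only, not from CatB): probability that a bug
# escapes n reviewers who each spot it independently with probability p.

def survival_probability(n_reviewers: int, p_find: float) -> float:
    """Chance a bug survives one release cycle of independent review."""
    return (1.0 - p_find) ** n_reviewers

# Under the independence assumption, even a tiny per-reviewer detection
# rate is overwhelmed by scale:
few_eyes = survival_probability(5, 0.05)      # small closed team
many_eyes = survival_probability(1000, 0.05)  # bazaar-style crowd
assert many_eyes < few_eyes
```

The model also breaks down for deep architectural bugs, where the per-reviewer detection probability is near zero regardless of how many reviewers there are.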





...Linus Torvalds's style of development - release early and often, delegate everything you can, be open to the point of promiscuity - came as a surprise. No quiet, reverent cathedral-building here - rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches (aptly symbolized by the Linux archive sites, who'd take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles.

...If I'm correct, they'll help you understand exactly what it is that makes the Linux community such a fountain of good software - and help you become more productive yourself.

Perhaps this should have been obvious (it's long been proverbial that "Necessity is the mother of invention") but too often software developers spend their days grinding away for pay at programs they neither need nor love. But not in the Linux world - which may explain why the average quality of software originated in the Linux community is so high.

That is pretty close to the idea of Microsoft's wide beta programs. The source code is not available to all (you need an NDA to get it), but the mechanism is similar and the audience is much wider. Is Microsoft the ultimate bazaar organization?

Another strength of the Unix tradition, one that Linux pushes to a happy extreme, is that a lot of users are hackers too. Because source code is available, they can be effective hackers. This can be tremendously useful for shortening debugging time. Given a bit of encouragement, your users will diagnose problems, suggest fixes, and help improve the code far more quickly than you could unaided.

IMHO the hijacking of the Minix community was the cleverest Linux hack (along with the adoption of the GPL). This way he got a lot of skilled developers and a community that could appreciate their efforts. Without them his efforts would probably have collapsed, OSS or no OSS. The argument about the invention of the development model does not look realistic...

In fact, I think Linus' cleverest and most consequential hack was not the construction of the Linux kernel itself, but rather his invention of the Linux development model. When I expressed this opinion in his presence once, he smiled and quietly repeated something he has often said: "I'm basically a very lazy person who likes to get credit for things other people actually do." Lazy like a fox. Or, as Robert Heinlein might have said, too lazy to fail.
There is a significant difference, from the software engineering point of view, between developing the Emacs C core and the Lisp code pool.

...In retrospect, one precedent for the methods and success of Linux can be seen in the development of the GNU Emacs Lisp library and Lisp code archives. In contrast to the cathedral-building style of the Emacs C core and most other FSF tools, the evolution of the Lisp code pool was fluid and very user-driven. Ideas and prototype modes were often rewritten three or four times before reaching a stable final form. And loosely-coupled collaborations enabled by the Internet, a la Linux, were frequent.

Was it really the opposite? Was it not clever hidden cathedral-building, with Linus as the ultimate dictator and Linux as one more cathedral -- an expression of the worshipers' feelings toward Unix...

...But by a year later, as Linux became widely visible, it was clear that something different and much healthier was going on there. Linus' open development policy was the very opposite of cathedral-building. The sunsite and tsx-11 archives were burgeoning, multiple distributions were being floated. And all of this was driven by an unheard-of frequency of core system releases.

Microsoft made more than 1000 builds of NT before the release of 4.0. Probably they did several builds a day too, you never know ;-). To what extent the high frequency of builds is related to cultivating co-developers is unclear. I suspect that this can backfire, as people will soon get tired of it. In any case, as the system matures this becomes a bad idea that confuses everybody...

...Linus' innovation wasn't so much in doing this (something like it had been Unix-world tradition for a long time), but in scaling it up to a level of intensity that matched the complexity of what he was developing. In those early times (around 1991) it wasn't unknown for him to release a new kernel more than once a day! Because he cultivated his base of co-developers and leveraged the Internet for collaboration harder than anyone else, this worked.

For Linus this is probably the minimum-effort path, but for other developers it is not. Several talented people spent time looking at the same obscure bug in Linus' code.

...So, if rapid releases and leveraging the Internet medium to the hilt were not accidents but integral parts of Linus' engineering-genius insight into the minimum-effort path, what was he maximizing? What was he cranking out of the machinery?

...Put that way, the question answers itself. Linus was keeping his hacker/users constantly stimulated and rewarded - stimulated by the prospect of having an ego-satisfying piece of the action, rewarded by the sight of constant (even daily) improvement in their work.

Architecture-related bugs are far from shallow, no matter the number of eager co-developers. Here Linux was partially saved by being a clone of Unix, so few architectural problems were present (security is one -- it was not high among the Unix design priorities). But in the general case this is not true. Moreover, jumping to debugging without paying proper attention to the architecture can be a self-defeating strategy.

...In the bazaar view, on the other hand, you assume that bugs are generally shallow phenomena - or, at least, that they turn shallow pretty quick when exposed to a thousand eager co-developers pounding on every single new release. Accordingly you release often in order to get more corrections, and as a beneficial side effect you have less to lose if an occasional botch gets out the door.


This looks like a Machiavellian recommendation: be humble and keep power; mention the contributions of others often, without mentioning any names; the media will do the dirty job of self-promotion for you anyway.

...Interestingly enough, you will quickly find that if you are completely and self-deprecatingly truthful about how much you owe other people, the world at large will treat you like you did every bit of the invention yourself and are just being becomingly modest about your innate genius. We can all see how well this worked for Linus!

...But the problem with being clever and original in software design is that it gets to be a habit - you start reflexively making things cute and complicated when you should be keeping them robust and simple. I have had projects crash on me because I made this mistake, but I managed not to with fetchmail.

I disagree. The Linux Documentation Project was a clever move, but it now suffers from a series of problems. Neither in quality nor in quantity can it compete with commercial publishers like O'Reilly and MCP. Linux man pages can probably compete with Solaris, but generally they are pretty user-unfriendly. Linux documentation books are stagnant. The only active area is small documents like HOWTOs, but OSS bureaucracy is taking its toll on the authors even here...

...Many people (especially those who politically distrust free markets) would expect a culture of self-directed egoists to be fragmented, territorial, wasteful, secretive, and hostile. But this expectation is clearly falsified by (to give just one example) the stunning variety, quality and depth of Linux documentation. It is a hallowed given that programmers hate documenting; how is it, then, that Linux hackers generate so much of it? Evidently Linux's free market in egoboo works better to produce virtuous, other-directed behavior than the massively-funded documentation shops of commercial software producers.

With proper tools, a program like fetchmail could probably be developed as shareware by a single talented developer. There is no real need for a community of developers here (although it can help) -- this is not an operating system or a compiler. If somebody wanted it as a commercial product, the commercial developer would probably hire no more than a couple of programmers, a couple of testers and a manager (who would also participate in development) for such a project. For any significant commercial developer like Corel or Borland, a lot of existing code could be reused.

...And perhaps not only the future of open-source software. No commercial developer can match the pool of talent the Linux community can bring to bear on a problem. Very few could afford even to hire the more than two hundred people who have contributed to fetchmail!




Not true according to Linus Torvalds, who cautioned that "developers are not guaranteed success just because they adopt an open-source strategy against competitors who have not."

"People think just because it is open-source, the result is going to be automatically better. Not true. You have to lead it in the right directions to succeed. Open source is not the answer to world hunger," Torvalds said.
-- "Linus Torvalds says open source not a guarantee of success" (InfoWorld)


... Perhaps in the end the open-source culture will triumph not because cooperation is morally right or software ``hoarding'' is morally wrong (assuming you believe the latter, which neither Linus nor I do), but simply because the closed-source world cannot win an evolutionary arms race with open-source communities that can put orders of magnitude more skilled time into a problem.

...Eric Hahn, Executive Vice President and Chief Technology Officer at Netscape, wrote me shortly afterwards as follows: "On behalf of everyone at Netscape, I want to thank you for helping us get to this point in the first place. Your thinking and writings were fundamental inspirations to our decision."





Torvalds pointed to Netscape's Mozilla, an open-source Java-based browser that held some promise but failed to attract many developers.
-- "Linus Torvalds says open source not a guarantee of success" (InfoWorld)

Netscape is about to provide us with a large-scale, real-world test of the bazaar model in the commercial world. The open-source culture now faces a danger; if Netscape's execution doesn't work, the open-source concept may be so discredited that the commercial world won't touch it again for another decade.



Complete List of Relevant Quotes from HtN

All quotes correspond to [ESR1998b] Eric S. Raymond, "Homesteading the Noosphere," First Monday, Vol. 3, No. 10, October 1998.

...We relate that to an analysis of the hacker culture as a 'gift culture' in which participants compete for prestige by giving time, energy, and creativity away.

...For many years the FSF was the single most important focus of open-source hacking, producing a huge number of tools still critical to the culture. The FSF was also long the only sponsor of open source with an institutional identity visible to outside observers of the hacker culture. They effectively defined the term 'free software', deliberately giving it a confrontational weight (which the newer label 'open source' just as deliberately avoids).

...It is very widely used in the open-source world. North Carolina's Sunsite is the largest and most popular software archive in the Linux world. In July 1997 about half the Sunsite software packages with explicit license terms used GPL.

...The typical pragmatist attitude is only moderately anti-commercial, and its major grievance against the corporate world is not 'hoarding' per se. Rather it is that world's perverse refusal to adopt superior approaches incorporating Unix and open standards and open-source software...

...For many years, the pragmatist point of view expressed itself within the hacker culture mainly as a stubborn current of refusal to completely buy into the GPL in particular or the FSF's agenda in general. Through the 1980s and early 1990s, this attitude tended to be associated with fans of Berkeley Unix, users of the BSD license, and the early efforts to build open-source Unixes from the BSD source base. These efforts, however, failed to build bazaar communities of significant size, and became seriously fragmented and ineffective.

...Not until the Linux explosion of early 1993-1994 did pragmatism find a real power base. Although Linus Torvalds never made a point of opposing RMS, he set an example by looking benignly on the growth of a commercial Linux industry, by publicly endorsing the use of high-quality commercial software for specific tasks, and by gently deriding the more purist and fanatical elements in the culture.

...In a reinforcing development, the pragmatist part of the culture was itself becoming polycentric by the mid-1990s. Other semi-independent communities with their own self-consciousness and charismatic leaders began to bud from the Unix/Internet root stock. Of these, the most important after Linux was the Perl culture under Larry Wall. Smaller, but still significant, were the traditions building up around John Ousterhout's Tcl and Guido van Rossum's Python languages. All three of these communities expressed their ideological independence by devising their own, non-GPL licensing schemes.

...In practice, however, such 'forking' almost never happens. Splits in major projects have been rare, and always accompanied by re-labeling and a large volume of public self-justification. It is clear that, in such cases as the GNU Emacs/XEmacs split, or the gcc/egcs split, or the various fissionings of the BSD splinter groups, that the splitters felt they were going against a fairly powerful community norm.

...The taboos of a culture throw its norms into sharp relief. Therefore, it will be useful later on if we summarize some important ones here.

...Examined in this way, it is quite clear that the society of open-source hackers is in fact a gift culture. Within it, there is no serious shortage of the 'survival necessities' - disk space, network bandwidth, computing power. Software is freely shared. This abundance creates a situation in which the only available measure of competitive success is reputation among one's peers.

...This illustrates an interesting point about the hacker culture. It consciously distrusts and despises egotism and ego-based motivations; self-promotion tends to be mercilessly criticized, even when the community might appear to have something to gain from it. So much so, in fact, that the culture's 'big men' and tribal elders are required to talk softly and humorously deprecate themselves at every turn in order to maintain their status. How this attitude meshes with an incentive structure that apparently runs almost entirely on ego cries out for explanation.

...A taboo against ego-driven posturing therefore increases productivity. But that's a second-order effect; what is being directly protected here is the quality of the information in the community's peer-evaluation system. That is, boasting or self-importance is suppressed because it behaves like noise tending to corrupt the vital signals from experiments in creative and cooperative behavior.

...The hacker culture's medium of gifting is intangible, its communications channels are poor at expressing emotional nuance, and face-to-face contact among its members is the exception rather than the rule. This gives it a lower tolerance of noise than most other gift cultures, and goes a long way to explain the example in public humility required of its tribal elders.

...This trend has interesting implications for the near future. In early 1998, Linux looks very much like a category-killer for the niche 'free operating systems' - people who might otherwise write competing OSs are now writing Linux device drivers and extensions instead. And most of the lower-level tools the culture ever imagined having as open-source already exist. What's left?

...Applications. As the year 2000 approaches, it seems safe to predict that open-source development effort will increasingly shift towards the last virgin territory - programs for non-techies. A clear early indicator is the development of GIMP, the Photoshop-like image workshop that is open source's first major application with the kind of end-user-friendly GUI interface considered de rigeur in commercial applications for the last decade. Another is the amount of buzz surrounding application-toolkit projects like KDE and GNOME.

...Typically, a benevolent-dictator organization evolves from an owner-maintainer organization as the founder attracts contributors. Even if the owner stays dictator, it introduces a new level of possible disputes over who gets credited for what parts of the project.

...In this situation, custom places an obligation on the owner/dictator to credit contributors fairly (through, for example, appropriate mentions in README or history files). In terms of the Lockean property model, this means that by contributing to a project you earn part of its reputation return (positive or negative).

...Pursuing this logic, we see that a 'benevolent dictator' does not in fact own his entire project unqualifiedly. Though he has the right to make binding decisions, he in effect trades away shares of the total reputation return in exchange for others' work. The analogy with sharecropping on a farm is almost irresistible, except that a contributor's name stays in the credits and continues to 'earn' to some degree even after that contributor is no longer active.

...Some very large projects discard the 'benevolent dictator' model entirely. One way to do this is turn the co-developers into a voting committee (as with Apache). Another is rotating dictatorship, in which control is occasionally passed from one member to another within a circle of senior co-developers (the Perl developers organize themselves this way).

...While technical arguments over design might seem the most obvious risk for internecine conflict, they are seldom a serious cause of strife. These are usually relatively easily resolved by the territorial rule that authority follows responsibility.

Another way of resolving conflicts is by seniority - if two contributors or groups of contributors have a dispute, and the dispute cannot be resolved objectively, and neither owns the territory of the dispute, the side that has put the most work into the project as a whole (that is, the side with the most property rights in the whole project) wins.

These rules generally suffice to resolve most project disputes. When they do not, fiat of the project leader usually suffices. Disputes that survive both these filters are rare.
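The customs quoted above amount to a three-step decision procedure: territory first, then seniority, then the project leader's fiat. A minimal sketch of that order -- mine, not ESR's, with all names hypothetical:

```python
# Illustrative sketch (not from HtN) of the dispute-resolution order the
# essay describes: territory, then seniority, then the leader's fiat.

def resolve_dispute(side_a, side_b, territory_owner=None, leader_pick=None):
    """Return the winning side's name under the customs described above.

    side_a / side_b: dicts with 'name' and 'contribution' (total work done).
    territory_owner: whoever owns the disputed territory, if anyone.
    leader_pick: the project leader's choice, used only as a last resort.
    """
    # Rule 1: authority follows responsibility.
    for side in (side_a, side_b):
        if side["name"] == territory_owner:
            return side["name"]
    # Rule 2: seniority -- the side with more total work on the project wins.
    if side_a["contribution"] != side_b["contribution"]:
        return max(side_a, side_b, key=lambda s: s["contribution"])["name"]
    # Rule 3: fiat of the project leader.
    return leader_pick

# Territory trumps seniority: the owner wins despite less total work.
assert resolve_dispute({"name": "ann", "contribution": 10},
                       {"name": "bob", "contribution": 500},
                       territory_owner="ann") == "ann"
```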

...Ultimately, all of these conflict-resolution mechanisms rest on the wider hacker community's willingness to enforce them. The only available enforcement mechanisms are flaming and shunning - public condemnation of those who break custom, and refusal to cooperate with them after they have done so.

...Many cultures use hidden clues (more precisely 'mysteries' in the religio-mystical sense) as an acculturation mechanism. These are secrets which are not revealed to outsiders, but are expected to be discovered or deduced by the aspiring newbie. To be accepted inside, one must demonstrate that one both understands the mystery and has learned it in a culturally approved way.

The hacker culture makes unusually conscious and extensive use of such clues or tests.

...Some have gone so far as to suggest that hacker customs are merely a reflection of the research community's folkways and are actually (for most) acquired there. This probably overstates the case, if only because hacker custom seems to be readily acquired by intelligent high-schoolers!

There is a more interesting possibility here. I suspect academia and the hacker culture share adaptive patterns not because they're genetically related, but because they've both evolved the most optimal social organization for what they're trying to do, given the laws of nature and the instinctive wiring of humans. The verdict of history seems to be that free-market capitalism is the globally optimal way to cooperate for economic efficiency; perhaps, in a similar way, the reputation-game gift culture is the globally optimal way to cooperate for generating (and checking!) high-quality creative work.

This point, if true, is of more than (excuse me) academic interest. It suggests from a slightly different angle one of the speculations in The Cathedral And The Bazaar; that, ultimately, the industrial-capitalist mode of software production was doomed to be out-competed from the moment capitalism began to create enough of a wealth surplus for many programmers to live in a post-scarcity gift culture.

...The culture's (and my own) understanding of large projects that don't follow a benevolent-dictator model is weak. Most such projects fail. A few become spectacularly successful and important (Perl, Apache, KDE). Nobody really understands where the difference lies. There's a vague sense abroad that each such project is sui generis and stands or falls on the group dynamic of its particular members, but is this true or are there replicable strategies a group can follow?

Complete List of Quotes from "The Magic Cauldron"

All quotes are taken from [ESR1999a] "The Magic Cauldron" (bold italics are mine --NNB):

...``The Cathedral and the Bazaar'' [CatB] described the ways in which decentralized cooperative software development effectively overturns Brooks's Law, leading to unprecedented levels of reliability and quality on individual projects.

...Indeed, widespread use of open-source software tends to increase its value, as users fold in their own fixes and features (code patches). In this inverse commons, the grass grows taller when it's grazed on.

...Unfortunately, there's a serious superbeing shortage, so patch author J. Random Hacker is left with two choices: sit on the patch, or throw it into the pool for free. The first choice gains nothing. The second choice may gain nothing, or it may encourage reciprocal giving from others that will address some of J. Random's problems in the future. The second choice, apparently altruistic, is actually optimally selfish in a game-theoretic sense.
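J. Random Hacker's choice can be sketched as an expected-payoff comparison. This toy model is my own illustration, not ESR's analysis; p_reciprocity, benefit and friction_cost are made-up parameters:

```python
# Toy game-theoretic sketch (illustrative, not from tMC): compare the
# expected payoff of hoarding a patch versus contributing it.

def expected_payoff(contribute: bool, p_reciprocity: float = 0.3,
                    benefit: float = 10.0, friction_cost: float = 1.0) -> float:
    if not contribute:
        return 0.0  # sitting on the patch gains nothing
    # Contributing costs some friction but may trigger reciprocal fixes.
    return p_reciprocity * benefit - friction_cost

# With these made-up numbers, contributing is "optimally selfish":
assert expected_payoff(True) > expected_payoff(False)
```

Raising friction_cost flips the sign of the contribution payoff, which matches the friction-cost argument that follows.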

...The real free-rider problems in open-source software are more a function of friction costs in submitting patches than anything else. A potential contributor with little stake in the cultural reputation game (see [HtN]) may, in the absence of money compensation, think ``It's not worth submitting this fix because I'll have to clean up the patch, write a ChangeLog entry, and sign the FSF assignment papers...''. It's for this reason that the number of contributors (and, at second order, the success of) projects is strongly and inversely correlated with the number of hoops each project makes a user go through to contribute. Such friction costs may be political as well as mechanical. Together they may explain why the loose, amorphous Linux culture has attracted orders of magnitude more cooperative energy than the more tightly organized and centralized BSD efforts and why the Free Software Foundation has receded in relative importance as Linux has risen.

...There are other reasons for closing source that are outright irrational. You might, for example, be laboring under the delusion that closing the sources will make your business systems more secure against crackers and intruders. If so, I recommend therapeutic conversation with a cryptographer immediately. The really professional paranoids know better than to trust the security of closed-source programs, because they've learned through hard experience not to. Security is an aspect of reliability; only algorithms and implementations that have been thoroughly peer-reviewed can possibly be trusted to be secure.

...Join the Apache group. The Apache server was built by an Internet-connected group of webmasters who realized that it was smarter to pool their efforts into improving one code base than to run a large number of parallel development efforts. By doing this they were able to capture both most of the advantages of roll-your-own and the powerful debugging effect of massively-parallel peer review.

...The advantage of the Apache choice is very strong. Just how strong, we may judge from the monthly Netcraft survey, which has shown Apache steadily gaining market share against all proprietary webservers since its inception. As of June 1999, Apache and its derivatives have 61% market share -- with no legal owner, no promotion, and no contracted service organization behind them at all.
Can the rise of Apache be attributed entirely to the OSS model, or were there other important factors? Why are IBM and other Apache resellers not considered contracted service organizations?

...The Apache story generalizes to a model in which software users find it to their advantage to fund open-source development because doing so gets them a better product than they could otherwise have, at lower cost.

...Open source makes it rather difficult to capture direct sale value from software. The difficulty is not technical; source code is no more nor less copyable than binaries, and the enforcement of copyright and license laws permitting capture of sale value would not by necessity be any more difficult for open-source products than it is for closed.

...The difficulty lies rather with the nature of the social contract that supports open-source development. For three mutually reinforcing reasons, the major open-source licenses prohibit most of the sort of restrictions on use, redistribution and modification that would facilitate direct-sale revenue capture. To understand these reasons, we must examine the social context within which the licenses evolved; the Internet hacker culture.

...Despite myths about the hacker culture still (in 1999) widely believed outside it, none of these reasons has to do with hostility to the market. While a minority of hackers does indeed remain hostile to the profit motive, the general willingness of the community to cooperate with for-profit Linux packagers like Red Hat, SUSE, and Caldera demonstrates that most hackers will happily work with the corporate world when it serves their ends. The real reasons hackers frown on direct-revenue-capture licenses are more subtle and interesting.

...One reason has to do with symmetry. While most open-source developers do not intrinsically object to others profiting from their gifts, most also demand that no party (with the possible exception of the originator of a piece of code) be in a privileged position to extract profits. J. Random Hacker is willing for Fubarco to profit by selling his software or patches, but only so long as JRH himself could also potentially do so.

...The final and most critical reason has to do with preserving the peer-review, gift-culture dynamic described in [HtN]. License restrictions designed to protect intellectual property or capture direct sale value often have the effect of making it legally impossible to fork the project (this is the case, for example, with Sun's so-called "Community Source" licenses for Jini and Java). While forking is frowned upon and considered a last resort (for reasons discussed at length in [HtN]), it's considered critically important that that last resort be present in case of maintainer incompetence or defection (e.g. to a more closed license).

...By open-sourcing the still-widely-popular Netscape browser, Netscape effectively denied Microsoft the possibility of a browser monopoly. They expected that open-source collaboration would accelerate the development and debugging of the browser, and hoped that Microsoft's IE would be reduced to playing catch-up and prevented from exclusively defining HTML.

...This strategy worked. In November 1998 Netscape actually began to regain business-market share from IE. By the time Netscape was acquired by AOL in early 1999, the competitive advantage of keeping Mozilla in play was sufficiently clear that one of AOL's first public commitments was to continue supporting the Mozilla project, even though it was still in alpha stage.

...O'Reilly Associates, publishers of many excellent references volumes on open-source software, is a good example of an accessorizing company. O'Reilly actually hires and supports well-known open-source hackers (such as Larry Wall and Brian Behlendorf) as a way of building its reputation in its chosen market.

...The Linux operating system, however, drives home a lesson that we should probably have learned years ago from the history of the Internet's core software and other branches of engineering -- that open-source peer review is the only scalable method for achieving high reliability and quality.

...The network effects behind TCP/IP's and Linux's success are fairly clear and reduce ultimately to issues of trust and symmetry -- potential parties to a shared infrastructure can rationally trust it more if they can see how it works all the way down, and will prefer an infrastructure in which all parties have symmetrical rights to one in which a single party is in a privileged position to extract rents or exert control.

...It is not, however, actually necessary to assume network effects in order for symmetry issues to be important to software consumers. No software consumer will rationally choose to lock itself into a supplier-controlled monopoly by becoming dependent on closed source if any open-source alternative of acceptable quality is available. This argument gains force as the software becomes more critical to the software consumer's business -- the more vital it is, the less the consumer can tolerate having it controlled by an outside party.

...From the analysis presented in [CatB], we can expect that open source has a high payoff where (a) reliability/stability/scalability are critical, and (b) correctness of design and implementation is not readily verified by means other than independent peer review. (The second criterion is met in practice by most non-trivial programs.)

...As for application area, we observed above that open-source infrastructure creates trust and symmetry effects that, over time, will tend to attract more customers and to outcompete closed-source infrastructure; and it is often better to have a smaller piece of such a rapidly-expanding market than a bigger piece of a closed and stagnant one. Accordingly, for infrastructure software, an open-source play for ubiquity is quite likely to have a higher long-term payoff than a closed-source play for rent from intellectual property.

...We may sum up this logic by observing that open source seems to be most successful in generating greater returns than closed source in software that (d) establishes or enables a common computing and communications infrastructure.

...In summary, the following discriminators push towards open source:

reliability/stability/scalability are critical
correctness of design and implementation cannot readily be verified by means other than independent peer review
the software is critical to the user's control of his/her business
the software establishes or enables a common computing and communications infrastructure
key methods (or functional equivalents of them) are part of common engineering knowledge.

...A first-order effect of this internal market structure is that no node in the net is indispensable. Developers can drop out; even if their portion of the code base is not picked up directly by some other developer, the competition for attention will tend to rapidly generate functional alternatives. Distributors can fail without damaging or compromising the common open-source code base. The ecology as a whole has a more rapid response to market demands, and more capability to resist shocks and regenerate itself, than any monolithic vendor of a closed-source operating system can possibly muster.

...Despite endless talk of open standards, despite numerous alliances and consortia and agreements, proprietary Unix fell apart.

...This is quite unlikely to happen to Linux, for the simple reason that all the distributors are constrained to operate from a common base of open source code.

...In a latter-day take on John Gilmore's famous observation that the Internet interprets censorship as damage and routes around it, it has been aptly said that the hacker community responsible for Linux interprets attempts at control as damage and routes around them. For Red Hat to have protested the pre-release cloning of its newest product would have seriously compromised its ability to elicit future cooperation from its developer community.

...There is one other respect in which the infusion of real money into the open-source world is changing it. The community's stars are increasingly finding they can get paid for what they want to do, instead of pursuing open source as a hobby funded by another day job. Corporations like Red Hat, O'Reilly Associates, and VA Linux Systems are building what amount to semi-independent research arms with charters to hire and maintain stables of open-source talent.

...This makes economic sense only if the cost per head of maintaining such a lab can easily be paid out of the expected gains it will enable by growing the firm's market faster. O'Reilly can afford to pay the principal authors of Perl and Apache to do their thing because it expects their efforts will enable it to sell more Perl- and Apache-related books. VA Linux Systems can fund its laboratory branch because improving Linux boosts the use value of the workstations and servers it sells. And Red Hat funds Red Hat Advanced Development Labs to increase the value of its Linux offering and attract more customers.

...Working to earn goodwill, and valuing it as an asset predictive of future market gains, is hardly novel either. What's interesting is the extremely high valuation that the behavior of these firms suggests they put on that goodwill. They're demonstrably willing to hire expensive talent for projects that are not direct revenue generators even during the most capital-hungry phases of the runup to IPO. And, at least so far, the market has actually rewarded this behavior.

...The overall trends are clear. We mentioned before IDC's projection that Linux will grow faster than all other operating systems combined through 2003. Apache is at 61% market share and rising steadily. Internet usage is exploding, and surveys such as the Internet Operating System Counter show that Linux and other open-source operating systems are already a plurality on Internet hosts and steadily gaining share against closed systems. The need to exploit open-source Internet infrastructure increasingly conditions not merely the design of other software but the business practices and software use/purchase patterns of every corporation there is. These trends, if anything, seem likely to accelerate.

...In a future that includes competition from open source, we can expect that the eventual destiny of any software technology will be to either die or become part of the open infrastructure itself. While this is hardly happy news for entrepreneurs who would like to collect rent on closed software forever, it does suggest that the software industry as a whole will remain entrepreneurial, with new niches constantly opening up at the upper (application) end and a limited lifespan for closed-IP monopolies as their product categories fall into infrastructure.

...Finally, of course, this equilibrium will be great for the software consumer driving the process. More and more high-quality software will become permanently available to use and build on instead of being discontinued or locked in somebody's vault. Ceridwen's magic cauldron is, finally, too weak a metaphor -- because food is consumed or decays, whereas software sources potentially last forever. The free market, in its widest libertarian sense including all un-coerced activity whether trade or gift, can produce perpetually increasing software wealth for everyone.