Nikolai Bezroukov. Portraits of Open Source Pioneers
For readers with high sensitivity to grammar errors access to this page is not recommended :-)
"Knuth began TeX because he had become annoyed at the declining quality of the typesetting in volumes I-III of his monumental "Art of Computer Programming" (see Knuth, also bible). In a manifestation of the typical hackish urge to solve the problem at hand once and for all, he began to design his own typesetting language. He thought he would finish it on his sabbatical in 1978; he was wrong by only about 8 years."
The Jargon File
Knuth took a decade off from writing The Art of Computer Programming to create the TeX typesetting language. The third edition of The Art of Computer Programming is now typeset in TeX, and that in itself was another landmark event. As Donald Knuth noted in his Amazon.com interview:
I've been accumulating corrections and emendations in my own personal copies of the books for 25 years, and people have written to me and said, "Don, do you know that there's a typo on page such and such?" I always knew about these mistakes and I wasn't happy to see thousands of copies printed every year having these mistakes in them. But I also knew that correcting them was a lot of work, as there are many, many cross-references. And my biggest project was to work on the volumes that haven't yet been finished. So, my original plan was simply to make an errata list for volumes 1, 2, and 3 that I could put up on the Web. I created a big database of corrections--there were quite a lot, about 200 pages for each volume--and posted them. When I showed this list of changes to a meeting of the TeX user group [TeX is a computer typesetting system developed by Mr. Knuth. Ed.], one of the guys in the audience volunteered for the hard work of putting the revisions in electronic form. He wound up creating many megabytes of data from which to generate the book. All I needed to do was double-check the corrections. All in all, several volunteers spent a couple of years of their lives doing the detail work and getting the revisions ready. In January of this year, I received volumes 1, 2, and 3 in electronic form, and used them to generate 2,000 laser-printed pages incorporating my hundreds of pages of errata, which looked something like The Art of Computer Programming. When a book exists as a computer file, you have a different feeling about it because you know it's something that you can easily improve. This is my life's work after all--I've spent 35 years on it--and I saw many, many places where I could make it better. So I spent the last seven months making this book into something special. Of course, I'm not unbiased, but in my humble opinion, I've gotten close to something that I can be really proud of. 
It's a much better book than I would have dared to attempt with the old method of correcting galleys by hand.
Many people do not realize that one of the first major open source projects was neither GNU nor Linux. It was TeX. Knuth developed the first version of TeX in 1977-1978 in order to avoid problems with the typesetting of the second edition of his TAoCP volumes. The program proved popular and he produced a completely rewritten version in 1982, which is the basis of what we use today. The full source code of the program was later published in book form (TeX: The Program), while the user manual appeared as The TeXbook (Addison-Wesley, 1984, ISBN 0-201-13447-0, paperback ISBN 0-201-13448-9). At that time there were several proprietary typesetting systems, but they were not very flexible and had problems with the complex formulas and sophisticated mathematical layouts required for TAoCP.
Paradoxically, the development of TeX was done largely in parallel with the development of troff at Bell Labs, and in a way Donald Knuth "out-programmed" the very talented researchers at Bell Labs. At the same time it cannot be denied that he duplicated a lot of work already done by the Unix team. Moreover, due to his lack of exposure to Unix at the time, some of his solutions were less elegant.
Bell Labs was one of the first organizations that tried to create a set of computer typesetting tools. It started experimenting with typesetting around the same time as Knuth. J. E. Saltzer had written "runoff" for CTSS. Bob Morris moved it to the GE 635 and called it "roff". Ritchie rewrote that as "rf" for the PDP-7, before there was UNIX. At the same time, in the summer of 1969, Doug McIlroy rewrote roff in BCPL, extending and simplifying it. Joseph Ossanna wrote troff and maintained it until his death in 1977.
As many people probably know, the initial version of Unix for the PDP-11 was formally developed to support publishing. In 1971 the Unix developers wanted to get a PDP-11 for further work on the operating system. In order to justify the cost of this system, they proposed to implement a document formatting system for the AT&T patents division. The key component of this system was roff/troff. Nearly a decade later, Ted Dolotta created the memorandum (-mm) macros, with a lot of input from John Mashey; thereafter, Eric Allman wrote the BSD -me macros. Because troff required a commercial license, it was later reimplemented for use with free versions of Unix: the groff formatter suite used on most free BSD systems was written by James Clark in the late 1980s as a replacement for UNIX troff, with ideas from SoftQuad and other extended versions of troff. Here is a relevant quote from Groff History:
`troff' can trace its origins back to a formatting program called `runoff', written by J. E. Saltzer, which ran on MIT's CTSS operating system in the mid-sixties. This name came from the common phrase of the time "I'll run off a document." Bob Morris ported it to the 635 architecture and called the program `roff' (an abbreviation of `runoff'). It was rewritten as `rf' for the PDP-7 (before having UNIX), and at the same time (1969), Doug McIllroy rewrote an extended and simplified version of `roff' in the BCPL programming language.
The first version of UNIX was developed on a PDP-7 which was sitting around Bell Labs. In 1971 the developers wanted to get a PDP-11 for further work on the operating system. In order to justify the cost for this system, they proposed that they would implement a document formatting system for the AT&T patents division. This first formatting program was a reimplementation of McIllroy's `roff', written by J. F. Ossanna.
When they needed a more flexible language, a new version of `roff' called `nroff' ("Newer `roff'") was written. It had a much more complicated syntax, but provided the basis for all future versions. When they got a Graphic Systems CAT Phototypesetter, Ossanna wrote a version of `nroff' that would drive it. It was dubbed `troff', for "typesetter `roff'", although many people have speculated that it actually means "Times `roff'" because of the use of the Times font family in `troff' by default. As such, the name `troff' is pronounced `t-roff' rather than `trough'.
With `troff' came `nroff' (they were actually the same program except for some `#ifdef's), which was for producing output for line printers and character terminals. It understood everything `troff' did, and ignored the commands which were not applicable (e.g. font changes).
Since there are several things which cannot be done easily in `troff', work on several preprocessors began. These programs would transform certain parts of a document into `troff', which made a very natural use of pipes in UNIX.
The `eqn' preprocessor allowed mathematical formulae to be specified in a much simpler and more intuitive manner. `tbl' is a preprocessor for formatting tables. The `refer' preprocessor (and the similar program, `bib') processes citations in a document according to a bibliographic database.
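In typical use these preprocessors were simply chained ahead of troff in one pipeline, each rewriting its own sections of the input into plain troff requests. A representative command line (illustrative, not a verbatim historical invocation; the file names and the -ms macro package are examples) looked like this:

```sh
# refer resolves citations, tbl rewrites table sections, eqn rewrites
# equation sections; troff then formats the resulting troff input.
refer -p bibdb paper.tr | tbl | eqn | troff -ms > paper.out
```

This chain of small single-purpose filters is what the text above means by "a very natural use of pipes in UNIX".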
Unfortunately, Ossanna's `troff' was written in PDP-11 assembly language and produced output specifically for the CAT phototypesetter. He rewrote it in C, although it was now 7000 lines of uncommented code and still dependent on the CAT. As the CAT became less common, and was no longer supported by the manufacturer, the need to make it support other devices became a priority. However, before this could be done, Ossanna was killed in an auto accident.
So, Brian Kernighan took on the task of rewriting `troff'. The newly rewritten version produced a device independent code which was very easy for postprocessors to read and translate to the appropriate printer codes. Also, this new version of `troff' (called `ditroff' for "device independent `troff'") had several extensions, which included drawing functions.
Due to the additional abilities of the new version of `troff', several new preprocessors appeared. The `pic' preprocessor provides a wide range of drawing functions. Likewise the `ideal' preprocessor did the same, although via a much different paradigm. The `grap' preprocessor took specifications for graphs, but, unlike other preprocessors, produced `pic' code.
James Clark began work on a GNU implementation of `ditroff' in early 1989. The first version, `groff' 0.3.1, was released June 1990. `groff' included:
- A replacement for `ditroff' with many extensions.
- The `soelim', `pic', `tbl', and `eqn' preprocessors.
- Postprocessors for character devices, POSTSCRIPT, TeX DVI, and X Windows. GNU `troff' also eliminated the need for a separate `nroff' program with a postprocessor which would produce ASCII output.
- A version of the `me' macros and an implementation of the `man' macros.
- A front-end which could construct the, sometimes painfully long, pipelines required for all the post- and preprocessors.
Development of GNU `troff' progressed rapidly, and saw the additions of a replacement for `refer', an implementation of the `ms' and `mm' macros, and a program to deduce how to format a document (`grog').
It was declared a stable (i.e. non-beta) package with the release of version 1.04 around November 1991.
Beginning in 1999, `groff' has new maintainers (the package was an orphan for a few years). As a result, new features and programs like `grn', a preprocessor for gremlin images, and an output device to produce HTML output have been added.
I think that one of the reasons for performing this colossal work, and essentially beating a very talented AT&T team on their own turf, was Donald Knuth's love of typesetting. Otherwise, working with some vendor like Adobe to improve an existing tool so that it became suitable for texts with complex mathematical formulas, such as TAoCP, would have been a better solution to the problem. After all, TAoCP is not an open book. And free software is only free if you don't place a monetary value on your own time. From this point of view, eight years of Donald Knuth's time was a huge investment which probably could have been more productively spent on writing another volume of TAoCP. After all, troff can be extended to do all the things TeX can, and that could have been done by somebody other than Knuth. The only benefit that I see is that computer science development stalled after, say, 1990, and due to the decade consumed by work on TeX, Knuth got into a better position to systematize a more or less static body of knowledge. The field had simply lost its dynamism. You can see this loss of dynamism, and the beginning of the "sclerosis" of computer science, for yourself by browsing old CACM issues: while in the early 1980s almost every issue contained groundbreaking articles, by the late 1980s they became much less impressive, almost completely dull reading with articles of questionable quality, not only because the editors were asleep at the wheel but because there was nothing better.
Anyway, eight years were lost on a project that was peripheral from the point of view of the TAoCP goals, but as a side effect Donald Knuth emerged as one of the open source pioneers. Unlike most open source pioneers, he published not only his code but also the underlying algorithms, and that puts him above pure "code junkies" ;-)
Here is how Donald Knuth justified writing TeX in his Advogato interview:
Advogato: The first questions that I have are about free software. TeX was one of the first big projects that was released as free software and had a major impact. These days, of course, it's a big deal. But I think when TeX came out it was just something you did, right?
Prof. Knuth: I saw that the whole business of typesetting was being held back by proprietary interests, and I didn't need any claim to fame. I had already been successful with my books and so I didn't have to stake it all on anything. So it didn't matter to me whether or not I got anything financial out of it.
There were people who saw that there was a need for such software, but each one thought that they were going to lock everyone into their system. And pretty much there would be no progress. They wouldn't explain to people what they were doing. They would have people using their thing; they couldn't switch to another, and they couldn't get another person to do the typesetting for them. The fonts would be only available for one, and so on.
But I was thinking about FORTRAN actually, the situation in programming in the '50s, when IBM didn't make FORTRAN an IBM-only thing. So it became a lingua franca. It was implemented on all different machines. And I figured this was such a new subject that whatever I came up with probably wouldn't be the best possible solution. It would be more like FORTRAN, which was the first fairly good solution [chuckle]. But it would be better if it was available to everybody than if there were all kinds of things that people were keeping only on one machine.
So that was part of the thinking. But partly that if I hadn't already been successful with my books, and this was my big thing, I probably would not have said, "well, let's give it away." But since I was doing it really for the love of it and I didn't have a stake in it where I needed it, I was much more concerned with the idea that it should be usable by everybody. It's partly also that I come out of traditional mathematics where we prove things, but we don't charge people for using what we prove.
So this idea of getting paid for something over and over again, well, in books that seems to happen. You write a book and then the more copies you sell the more you get, even though you only have to write the book once. And software was a little bit like that.
It is interesting to note that TeX was begun somewhat later than Unix but was finished long after Unix became a major player in the OS arena. One can say that Knuth wrote an open source alternative to Unix troff (see below).
TeX is a composition engine (strictly speaking, an interpreter, not a compiler). It is essentially a batch engine, although a limited amount of interactivity is possible when processing a file, to allow error recovery and diagnostics.
To produce his own books, Knuth had to support the rich set of academic publishing conventions and typesetting styles: footnotes, floating insertions (figures and tables), etc. To simplify his work he developed an input language that permits typesetting complex math expressions. As a markup language TeX is relatively low level (skip so much space, change to font X, set this string of words in paragraph form, ...), but it can be extended by macro commands.
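As an illustration of what "relatively low level" means here, a minimal plain TeX document mixes direct commands (spacing, fonts, math) with user-defined macros built on top of them. The macro name below is made up for the example:

```tex
% Plain TeX: low-level commands plus a user macro built on them.
\font\big=cmbx12           % load a bold font at 12pt
\def\keyword#1{{\big #1}}  % macro wrapping the low-level font switch
\vskip 12pt                % "skip so much space"
\keyword{Example.}
The quadratic formula:
$$x = {-b \pm \sqrt{b^2 - 4ac} \over 2a}$$
\bye
```

Systems such as LaTeX are essentially large macro packages layered over these primitives.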
The handling of footnotes and similar structures is so well behaved that "style files" have been created for TeX to process critical editions and legal tomes. It is also (after some highly useful enhancements in about 1990) able to handle the composition of many different languages according to their own traditional rules, and for this reason (as well as for the low cost) it is quite widely used in Eastern Europe.
From the start, it became very popular among mathematicians, to the extent that many mathematical publications accept papers only in TeX. Some of the algorithms in TeX have not been bettered in any of the composition tools devised in the years since TeX appeared. The most obvious example is paragraph typesetting with sophisticated line breaking.
TeX produces a "device independent" output file – .dvi – that must then be translated to the particular output device being used (a laser printer, inkjet printer, typesetter; in the "old days" even daisy-wheel printers were used). The DVI translator actually accesses the font shapes, either as bitmaps, Type 1 fonts, or pointers to fonts installed in a printer with the shapes not otherwise accessible. PostScript is one of the most popular "final" output forms for TeX.
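The division of labor that the .dvi format enables can be sketched in a toy example: the engine emits device-independent operations, and a separate driver per device translates them. The opcodes and drivers below are invented for illustration; real DVI is a compact binary format with its own opcode set.

```python
# Toy sketch of the .dvi idea: device-independent operations,
# translated by per-device drivers. Opcodes here are invented.
ops = [
    ("font", "cmr10"),   # select a font
    ("text", "Hello"),
    ("down", 12),        # move down 12 units
    ("text", "world"),
]

def render_terminal(ops):
    """Driver for a character device: maps vertical moves to newlines."""
    out = []
    for op, arg in ops:
        if op == "text":
            out.append(arg)
        elif op == "down":
            out.append("\n")
        # 'font' is not applicable on a dumb terminal, so it is ignored
    return "".join(out)

def render_postscript(ops):
    """Driver for a PostScript-like device: keeps fonts and coordinates."""
    y, out = 0, []
    for op, arg in ops:
        if op == "font":
            out.append(f"%% select font {arg}")
        elif op == "down":
            y += arg
        elif op == "text":
            out.append(f"0 {-y} moveto ({arg}) show")
    return "\n".join(out)
```

Note that the character-device driver simply ignores commands that are not applicable, which is the same strategy nroff used relative to troff.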
Despite its age, TeX still holds its ground for the production of books and journal articles in research mathematics. Few other tools, proprietary or otherwise, can handle complex formulas and layout so well and produce high-quality, publication-ready output.
Here is an old (1999) profile by Steve Ditlea that I managed to find which covers some little-known aspects of the creation of TeX (Sept-Oct 99, "Rewriting the Bible in 0's and 1's"):
Now intent on completing his scriptures, the 61-year-old Knuth (ka-NOOTH) leads what he calls a hermit-like existence (with his wife) in the hills surrounding the university, having taken early retirement from teaching. He has unplugged his personal e-mail account, posting a Web page (www-cs-faculty.stanford.edu/~knuth/) to keep the software multitudes at bay by answering frequently asked questions such as, “When is Volume 4 coming out?”
About once a month during the academic year, Knuth comes down from the heights to a basement lecture room in the Gates Computer Science Building at Stanford to deliver one of his “Computer Musings” lectures, usually about some aspect of his current work on The Art of Computer Programming. These talks draw computer science students, visiting professors, software engineers from nearby companies and an occasional CEO. On a balmy day earlier this year, the topic and listeners are different. To celebrate the publication of the third volume of his collected papers, Digital Typography, the associates of the Stanford University Libraries have invited an audience of fans of the printed word to hear Knuth talk about creating the TeX system for scientific and mathematical publication. Wearing a black T-shirt over a long-sleeve black shirt, his bald pate glistening in the overhead lights, he appears suitably monkish before about 70 acolytes and colleagues.
Hesitatingly, his words fighting his innate Lutheran modesty, he begins: “My main life’s work and the reason that I started this whole project is to write a series of books called The Art of Computer Programming—for which I hope to live another 20 years and finish the project I began in 1962. Unfortunately, computer programming has grown over the years and so I’ve had to write a little more than I thought when I sketched it out.” The faithful laugh knowingly.
Knuth relates his detour into digital typography during the 1970s. This was a time of enormous change in the typesetting industry, as computer systems replaced the hot type that had been used since the day of Gutenberg. Computer typography was less expensive, but also less esthetically pleasing—especially for complex mathematical notation. Recalls Knuth:
“As printing technology changed, the more important commercial activities were treated first and mathematicians came last. So our books and journals started to look very bad. I couldn’t stand to write books that weren’t going to look good.”
Knuth took it upon himself to write every line of code for software that yielded beautiful typography. He drew the name of his typesetting program from the Greek word for art—the letters are tau epsilon chi (it rhymes with “blecch”). Says Knuth: “Well over 90 percent of all books on mathematics and physics” are typeset with TeX and with its companion software, Metafont, a tool Knuth developed to design pleasing type fonts.
He is quick to acknowledge the contribution of the type designers, punch cutters, typographers, book historians and scholars he gathered at Stanford while developing TeX. Some are in the audience. He tells them: “TeX is what we now call open-system software—anybody around the world can use it free of charge. Because of this, we had thousands of people around the world to help us find all the mistakes. I think it’s probably the most reliable computer program of its size ever.”
Anyone who doubts this claim by the decidedly unboastful Knuth can find confirmation from Guy Steele, one of TeX’s first users and now a distinguished engineer at Sun Microsystems. TeX, says Steele, was one of the first large programs whose source code was published openly. Steele says Knuth’s publication of the TeX code in a book, along with full comments, made it so that “anyone could understand how it works and offer bug fixes.” With academe’s top scientists and mathematicians as beta-testers, an extraordinary quality control team helped perfect TeX. (The TeX development effort was a model for today’s open-source software movement, which has given the world Linux—an operating system that is beginning to compete with Microsoft Windows.)
Perfectability is a major preoccupation of Knuth’s. The only e-mail address Knuth maintains gathers reports of errata from readers of his books, offering $2.56 for each previously unreported error. (The amount is an inside joke: 256 equals 2 to the 8th power—the number of values a byte can represent.) Knuth’s reward checks are among computerdom’s most prized trophies; few are actually cashed.
He takes this error business very seriously. Engraved in the entryway to his home are the words of Danish poet Piet Hein:
The road to wisdom?
Well, it's plain
and simple to express:
Err
and err
and err again
but less
and less
and less.
In a variation on this theme of perfectability, Knuth’s contribution to computer science theory in the pages of The Art of Computer Programming has been his rigorous analysis of algorithms. Using methods in his book, the operations used to translate machine instructions into equations can be tested to determine whether they are optimal. Improving a program then becomes a question of finding algorithms with the most desirable attributes. Not that theoretical proofs can replace actually running software on a computer. In an often-cited remark he mentions on his Web page, he once warned a colleague: “Beware of the above code; I have only proved it correct, not tried it.”
In Knuth’s Stanford talk, perfectability was again a theme. He followed the pages in his volume on Digital Typography beyond its introductory chapters to the longest section in the book, which attacks a crucial problem in typography. He calls his listeners’ attention to “one of the main technical tricks in the TeX system: the question of how to break up paragraphs so that the lines are approximately equal and good.”
Poor spacing between words and ugly choices for line breaks had been among the major computer typography gaffes that launched Knuth on his TeX crusade. Odd word chasms, ladders of hyphens, and orphaned bits of text resulted from the rigid algorithms used to program line breaks without regard for visual elegance. Knuth’s solution: have the computer use trial-and-error methods to test how each paragraph of text can best be broken up. Instead of “greedy” algorithms packing in the most words on a line—standard in computer typography before and after TeX — Knuth’s computation-intensive method evaluates beauty.
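The contrast between a greedy line breaker and Knuth's computation-intensive approach can be sketched in a few lines of Python. This is a drastic simplification of the Knuth-Plass algorithm used in TeX (no stretchable glue, no hyphenation, no penalties; the "badness" of a line is just its squared unused space), but it shows why minimizing badness over the whole paragraph can beat packing each line greedily:

```python
# Greedy vs. Knuth-style "total fit" line breaking (toy model).
INF = float("inf")

def line_cost(words, i, j, width):
    """Badness of setting words[i:j] on one line of the given width."""
    length = sum(len(w) for w in words[i:j]) + (j - i - 1)  # words + spaces
    if length > width:
        return INF                 # overfull line: forbidden
    return (width - length) ** 2   # squared slack penalizes loose lines

def greedy_breaks(words, width):
    """Pack as many words as fit on each line (assumes every word fits alone)."""
    lines, start = [], 0
    for k in range(1, len(words) + 1):
        if line_cost(words, start, k, width) == INF:
            lines.append(words[start:k - 1])
            start = k - 1
    lines.append(words[start:])
    return lines

def total_fit_breaks(words, width):
    """Choose break points minimizing total badness over the paragraph."""
    n = len(words)
    best = [0.0] + [INF] * n       # best[j]: minimal cost of words[:j]
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            cost = line_cost(words, i, j, width)
            if j == n and cost != INF:
                cost = 0.0         # the last line may run short for free
            if best[i] + cost < best[j]:
                best[j], back[j] = best[i] + cost, i
    lines, j = [], n
    while j > 0:
        lines.append(words[back[j]:j])
        j = back[j]
    return list(reversed(lines))

words = "aaa bb cc ddddd".split()
print(greedy_breaks(words, 6))     # greedy leaves a very loose middle line
print(total_fit_breaks(words, 6))  # total fit spreads the slack more evenly
```

On this tiny input the greedy breaker fills the first line tightly and is then forced into a very loose second line, while the total-fit breaker accepts a slightly loose first line to keep the paragraph as a whole more even, which is exactly the trade-off TeX's paragraph builder makes.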
Knuth seems born to the task of promoting beauty on the printed page — via computational methods. “I had a love of books from the beginning,” he tells his audience. “In my mother’s collection, we found the first alphabet book I had. I had taken the letters and counted all the serifs.” He is proud of his early literacy, telling a writer that he was the youngest member of the Book Worm Club at the Milwaukee Public Library. His interest in typographic reproduction also came early in life. One of his earliest memories of pre-desktop publishing was helping his father, Ervin, with the mimeograph stencils for printing the church newsletter in the basement. Like his father’s newsletter, TeX was meant to be a homebrew project, on a manageable scale. “The original intent was that it would be for me and my secretary,” he tells TR in an interview in his home’s second-floor study. Leaning back in the black lounge chair, Knuth acknowledges that the long journey into TeX was intended to be a quick side trip: “I was going to finish it in a year.”
Events took a different turn. In 1978, Sun’s Steele—then an MIT grad student visiting Stanford—translated TeX for use on MIT’s mainframe computer. Suddenly, Knuth recalls, “I had 10 users, then 100. Each time it went through different levels of error. In between the 1,000th and 10,000th user, I tore up the code and started over.” Knuth says he realized then that TeX wasn’t just a digression, it was itself part of the vision. “I saw that this fulfilled a need in the world and so I better do it right.”
A key turning point in the spread of TeX was a lecture Knuth gave before the American Mathematical Society (AMS). Barbara Beeton, a staff specialist in composition systems for AMS and a longtime official of the Portland, Ore.-based TeX User’s Group, remembers the occasion: “He was invited to deliver the Josiah Willard Gibbs Lecture. Albert Einstein and John von Neumann had been among the previous speakers. Knuth talked about his new typesetting system for the first time in public.” Knuth was preaching to the choir; the assembled mathematicians were familiar with how printing quality had declined. Adds Beeton: “TeX was the first composition system meant to be used by the author of an article or book” as opposed to a publishing house. Soon after, AMS became the original institutional user of TeX, employing Knuth’s system to publish all of its documents and journals.
As word spread and more users took advantage of his free software (written for an academic mainframe computer but soon made available for PCs), Knuth found himself studying the history of printing to find solutions for narrow applications. Often as not, his research proved fruitless and he would have to come up with his own answer. For ceremonial invitations, he created new fonts; for musical typesetting he solved difficult alignment problems. “I had so many users,” he recalls. “From wedding invitations and programs for the local symphonic orchestra to computer programs.”
For nearly nine years, Knuth’s foray into typography occupied him full time—pulling him away from work on the programming book that he considered his true calling. “I had to think of the endgame,” he says. “How could I responsibly finish TeX and say: This is not going to change anymore? I had to work out a four-year strategy to extricate myself” and return to The Art of Computer Programming.
Knuth’s solution: with the release of TeX version 3.0 in 1990, he declared his work complete. Disciples will have to maintain the system. Knuth says he will limit his work to repairing the rare bugs brought to his attention; with each fix he assigns one more digit to the version number so that it tends to pi (the current version is 3.14159).
One result of Knuth's decision to stop making major changes to TeX is that the TeX file format has remained unchanged. "It's the only software where you can take the file for your paper from 1985 and not have to convert it to print out the same way today," notes David Fuchs, a senior researcher with Liberate Technologies (formerly Network Computer Inc.), who was a grad student at Stanford during the development of TeX. Fuchs estimates that there are 1 million TeX users worldwide; many employ special-purpose commercial packages built around the TeX "kernel," such as LaTeX (a command-oriented macro language) and TeXDoc (optimized for software documentation).
“On the downside, TeX is limited in its appeal because it’s not WYSIWYG,” Fuchs admits, employing the acronym for “what you see is what you get”—the standard term describing text processing software that displays formatting on screen as it will appear on the printed page. Rather than offering real-time onscreen interactivity, TeX requires a markup language typed into a document and interpreted by the computer; you see what you get only after it is in print. Despite its unintuitive user interface, TeX has developed a dedicated core of production professionals who will accept no substitute. “Why would anyone want anything else?” asks Paul Anagnostopolis, a Carlisle, Mass.-based publishers’ consultant and author of TeX-based software for book composition. “A lot of people don’t care about WYSIWYG.”
Copyright © 1996-2018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: September 12, 2017