Nikolai Bezroukov. Portraits of Open Source Pioneers
For readers with high sensitivity to grammar errors access to this page is not recommended :-)
Therefore, if anyone is in Christ, he is a new creation; old things have passed away; behold, all things have become new.
Contrary to what many believe, scripting languages are the most innovative part of the open source movement. After all, the Linux kernel is just a re-implementation of the Unix kernel, and at least one free Unix kernel (in some respects a better one) existed before the Linux kernel was developed: the FreeBSD kernel.
Perl and other scripting languages were something really new in the 1990s. Before Perl only two rudimentary scripting languages had been implemented -- AWK and REXX. Each was an important innovation, but neither went far enough. Perl was the first scripting language that went "far enough."
Perl smashed the popular belief that scripting is an "unsafe", "second rate" or "prototype" solution. Java zealots often dismiss scripting as "quick and dirty stuff that is somehow less significant than the programming behind compiled commercial applications" (forgetting that Java is a "compiled on the fly" language; they in turn are dismissed by C and C++ zealots for the same reasons ;-)
But if a project dies, it does not matter what the implementation language was; so for a real project with a tough schedule a higher-level language is of paramount importance, and here scripting languages have a huge edge over Java.
Also, Perl suffers less from the excesses of the OO cult than, say, Python or Ruby. The overhead of "objectifying" every piece of data is too much of a performance hit for high-volume text processing. That did not prevent BioPython from displacing BioPerl, though. Fashion and groupthink rule in the programming languages area; nothing succeeds like success. So in a way Perl is the language for independent thinkers who were not seduced by OO fads and think that "modules are enough."
I like the comment made recently by robdelacruz at reddit.com:
Like you said, the way to appreciate Perl is to be aware that it is part of the Unix package. I think Larry Wall said it best: "Perl is, in intent, a cleaned up and summarized version of that wonderful semi-natural language known as 'Unix'."
See Softpanorama Scripting Page for more details.
Perl was a breakthrough that is difficult to appreciate in retrospect, because it achieved so much in establishing the legitimacy of scripting languages as a separate, distinct class of programming languages. It also violated a lot of common CS dogmas and still managed to achieve tremendous success, which actually discredited orthodox views on computer language design. Maybe that's why it created so much animosity in CS departments, and why they pushed first Java and then Python as the first language for students, to rehabilitate themselves. The animosity reminds me of the jeremiads against PL/1, another, albeit older, revolutionary language that was a precursor of C.
With all due respect to PHP, Python, Ruby and other scripting languages, it was Perl that served as the icebreaker navigating a previously unknown and dangerous Arctic path. Those who followed it later faced mostly clean water (maybe due to global warming -- much better hardware, which also legitimized the overhead inherent in interpretive languages and typical for Python and Ruby ;-).
Perl also introduced several innovations in the lexical structure of languages. One is the wide use of "tagged literals" (the quote-like operators q, qq, qr, qx), an idea which much later found its way into Python as f-strings (introduced in Python 3.6). Still, the idea of dynamic, user-selectable delimiters for literals remains a unique Perl feature. Other language designers were simply too afraid to replicate it, as it is a clear kick in the chin to the adherents of language design orthodoxy.
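A minimal sketch of these quote-like operators, showing how the programmer chooses the delimiter and how each operator produces a different kind of value:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# q{} is a single-quoted literal; the programmer picks the delimiter,
# so no escaping is needed when the string contains quote characters.
my $msg = q{It's a "quoted" string};

# qq{} interpolates variables, like a double-quoted string.
my $lang  = 'Perl';
my $hello = qq{Hello from $lang};

# qx{} runs an external command and captures its output (like backticks).
my $out = qx{echo hello};

# qr{} compiles a regular expression into a first-class value
# that can be stored in a variable and passed around.
my $re = qr{^\d+$};

print "$hello\n";
print "numeric\n" if '12345' =~ $re;
```

Any punctuation character (or paired brackets) can serve as the delimiter, which is the "dynamic delimiters" idea mentioned above: q{...}, q(...), q!...! and q/.../ are all equivalent.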
Another interesting innovation of Perl is the statement that allows you to define a namespace (the package statement). Perl was the first major language that allowed explicit manipulation of the namespaces of a program.
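A short sketch of the package statement (the Counter namespace below is a made-up example): a package switch changes which symbol table subsequent globals land in, and that symbol table is itself an ordinary hash the program can inspect.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The package statement switches the current namespace; global symbols
# declared after it land in that namespace's symbol table.
package Counter;

our $count = 0;                 # package global: $Counter::count
sub incr { return ++$count; }   # lives in the Counter:: namespace

# Switch back to the default namespace.
package main;

# Symbols in another package are reached with the :: qualifier.
Counter::incr();
Counter::incr();
print "count is $Counter::count\n";

# A symbol table is an ordinary hash (%Counter::), so a program can
# manipulate its own namespaces at run time.
print "incr is in the symbol table\n" if exists $Counter::{incr};
```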
Only TCL can compete with Perl for this "icebreaker" role, as it innovated in the area of macro languages: it was a language that could serve both as a macro language for applications and as a shell for the OS. Unfortunately TCL's success was very limited, and it later found its niche as a scripting language in networking. Truth be told, REXX -- not an open source language, but a proprietary one developed by IBM -- did similar things before TCL: it served as both shell and macro language in VM/CMS, on the Amiga, in OS/2, and even in IBM DOS 7.
I have to say that Larry Wall is one of the most interesting and insightful people to show up in the scripting language design arena. In addition to Perl he is also the author of the Artistic license and, thus, of the very elegant idea of dual licensing of open source software. That alone is a significant achievement, as we all know that GPL is the ultimate "incompatible" license: it is incompatible with most other popular licenses. The dual licensing approach invented by Wall undermined the radicalism of the GNU license, made it more acceptable to commercial developers and as such dramatically increased GPL popularity. In a way Larry Wall can be viewed as a hidden co-author of GPL, or at least a major contributor to its success. See Labyrinth of Software Freedom (BSD vs. GPL and social aspects of free licensing debate) for a discussion of this problem.
If you've read any of his writings, especially his "State of the Onion" addresses, you'll see that he manages to present his thoughts and beliefs in a very unorthodox, sometimes outright strange way. That does not mean that each and every "State of the Onion" address is an interesting read; on the contrary, some of them are really boring. But sometimes I really enjoy the way he manages to make analogies between completely disparate things, not only because that makes the subject matter more interesting, but also because in "programming in the large" you really should be able to jump between different levels of abstraction in viewing the system. Here is one example from The State of the Onion 9 - Perl.com (2005), which sounds in a really new light after the recent NSA revelations. Pretty visionary, as it is almost 15 years old:
Comrades, here in the People's Republic, the last five years have seen great progress in the science of computer programming. In the next five years, we will not starve nearly so many programmers, except for those we are starving on purpose, and those who will starve accidentally. Comrades, our new five-year mission is to boldly go where no man has gone before! Oh wait, wrong TV show.
You might say that Perl grew out of the Cold War. I've often told the story about how Perl was invented at a secret lab that was working on a secret NSA project, so I won't repeat that here, since it's no secret. Some of you have heard the part about my looking for a good name for Perl, and scanning through /usr/dict/words for every three- and four-letter word with positive connotations. Though offhand, I can't explain how I missed seeing Ruby. So anyway, I ended up with "Pearl" instead.
But it's a little known fact that one of the three-letter names I considered for quite a while was the word "spy." Now, those of you who took in Damian's session on Presentation Aikido are now wondering whether I'm just making this up to make this speech more interesting. And in this particular case, I'm not. You can ask my brother-in-law, who was there. On the other hand, please don't ask him to vouch for anything else in this speech.
But wouldn't "Spy" be a great name to give to a language whose purpose was pattern matching and reporting? Hmm. And spies are also called "agents of change." "Practical extractions are one of our specialties."
... ... ...As I was thinking about the intelligence community and its recent obvious failures, it kinda put a new spin onto the phrase, "Information wants to be free," or my own version of it, which is that "Information wants to be useful."
We often think that intelligence failures are caused by having too little information. But often, in retrospect, we find that the problem is too much information, and that in fact, we had the data available to us, if only it had been analyzed correctly.
So I'm just wondering if we're getting ourselves into a similar situation with open source software. More software is not always better software. Google notwithstanding, I think it's actually getting harder and harder over time to find that nugget you're looking for. This process of re-inventing the wheel makes better wheels, but we're running the risk of getting buried under a lot of half-built wheels.
And there are two take-home lessons from that. The first is that, as an open source author, you should be quick to try to make someone else's half-built wheel better, and slow to try to make your own. We're making progress in this realm in the Perl community, but I don't think any open source community ever gets good enough at harmonizing the dissimilar interests that sometimes lead to project forks. We can always improve there.
The second take-home lesson is this. Pity your poor intelligence analyst back at headquarters. He's not all that intelligent, after all. The intelligence of the intelligence community is distributed, and it's often the Tinas and the Wheelbarrows of the world that know when they've got a piece of hot information. But somehow that meta-information gets lost on transmission back to headquarters.
I think that there are several types of great programmers, and one such type doesn't seem to be focused on depth; in this sense it represents, within the spectrum of programming titans, the complete opposite of the approach of my hero Donald Knuth. Such programmers draw much of their power from a kind of "component vision" instead of the "digging to the bottom of it" approach. The more different systems you experience, the more you can abstract the particular things you happen to be working on, and actually transcend the implementation language and platform. In a way this is the hidden philosophy of Unix, which borrowed freely from both Multics and OS/360. I see Unix first of all as the first successful componentization scheme in programming, invented by accident (as a side product of developing an OS based on an extension of CTSS with Multics ideas), and it is only natural that the first generation of scripting languages that Perl represents flourished in the Unix environment.
This "programming in the large" approach is still in its infancy, and there will be major innovations in the decades ahead. One such innovation is the use of virtual machines with the application installed on a specially created version of the OS (with unnecessary parts and daemons removed). This approach is now called Virtual Software Appliances. In this role the OS becomes a development framework, a part of the application (and Unix was actually designed as a development framework, at least initially). For example, you can use it for logging, scheduling and other things necessary for your application without reinventing the wheel. Other innovations will definitely happen, because the virtual machine is such a powerful concept, and because this is one of the few places where programming is still a really interesting and innovative activity.
The spirit of exploration of "terra incognita" is now completely lost in general programming, and it was actually destroyed by OO orthodoxy more than by anything else. Exploration of new scripting languages is one of the very few ways to have fun with systems programming again. And as I mentioned before, the more languages you master, the better programmer you become in the language of your choice. So please don't miss the opportunity to learn Perl. It has some features that you never encounter in other major scripting languages, features that make solving some types of programming problems much easier. For example, it is the only major scripting language that has pointers (references) as an explicit language construct. That instantly means that this language is not for suckers but for real programmers, although everybody can use its subset without pointers. See A Slightly Skeptical View on Scripting Languages for a more elaborate discussion.
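A brief sketch of the pointer construct mentioned above. In Perl these are called references: the backslash takes the "address" of a value, and dereferencing is explicit, which is what makes nested data structures and code-as-data possible.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A reference is Perl's explicit pointer: \ takes a reference to a value.
my @list = (1, 2, 3);
my $aref = \@list;          # reference to the array

# Dereferencing is explicit too: @$aref, or the arrow syntax for elements.
push @$aref, 4;             # modifies @list through the reference
print "last: $aref->[-1]\n";

# References make nested data structures possible,
# which flat lists cannot express:
my %config = (
    hosts => ['alpha', 'beta'],
    ports => { http => 80, ssh => 22 },
);
print "ssh port: $config{ports}{ssh}\n";

# References to code let functions be passed around as values.
my $square = sub { return $_[0] ** 2 };
print "7 squared: ", $square->(7), "\n";
```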
Probably only PL/1 experienced the level of malignant and nasty attacks that Perl did: PL/1 from verification zealots, Perl from OO zealots and other adherents of "purity" in programming language design. BTW the language that falsely claims that "there should preferably be one and only one way of doing a particular thing" -- Python -- grew by version 3.8 into a completely Byzantine mess, which reminds me of the architecture of OS/360: "everything is a structure and you sucker better learn them all" was the worldview of the OS/360 designers and of IBM in general.
The Unix way of viewing the world, in contrast, is "everything is a string." BTW in Python there are now, for certain things like I/O, more ways to "do it" than in Perl. Even the set of standard functions, which is generally superior to Perl's, could easily be improved and "orthogonalized." Which, in a way, makes them "brothers in misery" ;-). The Perl idea is that as programming becomes more complex, more of it, including most of the Unix API, should be incorporated into the language. This is a viable idea, because you can't avoid the complexity of Unix by offloading it to a hundred or so ragtag utilities written in non-uniform styles and "glued" together by the shell. The simplicity of that environment is an illusion. The answer is the creation of a language that at least tries to aggregate a large part of this complexity in a more uniform way -- at the level of language features.
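A small illustration of that "Unix API in the language" idea (the scratch file path below is a hypothetical example): operations that a shell script would delegate to external utilities like test, chmod, rm or id are built-in Perl functions wrapping the corresponding system calls.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A large part of the Unix API is built directly into the language,
# so there is no need to spawn external utilities for common tasks.
my $file = "/tmp/perl_api_demo_$$";   # hypothetical scratch file

open my $fh, '>', $file or die "open: $!";
print $fh "hello\n";
close $fh;

# stat() wraps the Unix system call of the same name.
my @st = stat($file) or die "stat: $!";
print "size: $st[7] bytes\n";

# File test operators replace the shell's `test` / `[` utility.
print "readable\n" if -r $file;

# chmod() and unlink() wrap their system calls directly.
chmod 0600, $file or die "chmod: $!";
unlink $file     or die "unlink: $!";

# getpwuid() replaces parsing the output of `id` or /etc/passwd by hand.
my $user = getpwuid($<);
print "running as: $user\n" if defined $user;
```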
So in a major way it is a kick in the chin to the designers of languages with a small core (C++, Java, Python, and friends), and they can't forgive Perl its success as a complex non-orthogonal language, because it destroys their cherished but unrealistic dream. What is funny is that most of the languages initially designed with a small core stopped having one pretty soon (just look at the level of complexity of Python 3.8), and they developed a Byzantine set of modules/packages to implement the necessary functionality, which defeats their initial design goal. Python's treatment of regular expressions is one example here; it was a language design blunder. I/O is another, which adds insult to injury.
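To make the regular expression point concrete, here is a sketch of what "in the language core" means in Perl's case (the log line below is invented for illustration): matching and substitution are syntax, with no module import and no compiled-pattern object required.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# In Perl, pattern matching is part of the language syntax itself:
# no module import, no method calls on a pattern object.
my $line = "error: disk full on /dev/sda1 at 03:22";

if ($line =~ /^(\w+): (.+) at (\d\d:\d\d)$/) {
    # Capture groups land in the built-in variables $1, $2, $3.
    print "severity=$1 time=$3\n";
}

# Substitution is equally direct: copy, then edit in place.
(my $masked = $line) =~ s{/dev/\w+}{/dev/XXX};
print "$masked\n";
```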
A lot of resources have been pushed into Python in the past decade. It seems to me that this has come at the expense of traditional Linux/Unix sysadmins.
What is amazing is the level of ignorance and self-importance demonstrated by users of other programming languages when they critique Perl. I would immediately discard anyone who claims that the language is less transparent than, say, Python, because this is a clear sign of a person who never wrote any substantial program (say, over 1K lines of code excluding comments) in Perl. Perl does have its warts, and many of them, but paradoxically it fares really well in comparison with the alternatives, especially for system administrators, who need to use the shell as their primary language. And that list of alternatives includes Python and Ruby, because of the level of complexity they eventually acquired and their violation of the C-language tradition of programming language design. The designers of both Python and Ruby implicitly adhered to the slogan "in the next five years, we will not starve nearly so many programmers in C and Unix shell, except for those we are starving on purpose, and those who will starve accidentally," violating each and every principle of C-language design and continuity with the Unix shell ;-)
Perl 5 is much less "self-important" than those two languages, and is more understandable and more easily learnable for both Bash and C programmers.
You can read my thoughts on this topic in Perl as powerful and flexible tool for Unix system administrators and defensive programming in Perl. As of 2020 Perl has returned to its roots: it was displaced from its role as the major CGI programming language and returned to its initial role as the scripting language of choice for elite Unix/Linux administrators.
Many point out that the perceived decline of Perl in the industry is partially an artifact of the measures of language popularity used. For example, the number of programs written by novices always exceeds the number of large programs written by professionals and having a substantial user base. There is a big difference between writing toy programs and writing large programs, and languages in which you can write complex programs do not always win popularity contests.
As I already mentioned, Larry Wall is one of the most interesting figures in the open source/free software arena -- the only one with an implicit vow of poverty, a kind of open source software monk (or, more correctly, priest). Unlike most other major figures of the open source movement, he never tried to commercialize his creation. That actually did not serve him all too well when he was diagnosed with cancer. Wall's Christian values have also influenced some of the terminology of Perl (the function bless) as well as the organization of the Perl 6 design documents, with categories such as apocalypse and exegesis. In comparison with Linux, Perl itself was much less about money and never produced shady startups with greedy executives like Red Hat and VA Linux.
Wall says there have been a number of people who have inspired him or served as role models throughout his life. Among them are the 19th century writer George MacDonald, his grandmother who earned her Ph.D. in comparative literature when she was 77, his parents, and his wife. Unlike Linus Torvalds he does not seem to have a taste for expensive cars, and in 2001, at the peak of his fame and popularity, was still driving an old 1977 Honda Accord:
CL: Do you still drive Honda Accord?
LW: Yes, I still drive my Honda Accord. It's a 1977 Honda Accord. The paint is almost all worn off :-)
CL: Is it still running?
LW: It's still running :-)
And it looks like, unlike Linus Torvalds, he accepted pre-IPO stock only once ;-), and used it as a fund for the university education of his children. Here is how Andrew Leonard describes this event in his Salon 21st article The joy of Perl:
Larry Wall smiles when he recalls the message that Yahoo co-founder David Filo sent him several years ago, shortly before Yahoo was to go public. Yahoo, wrote Filo, could never have been started without Perl, the all-purpose programming language Wall invented. So would Larry like to buy some cheap, pre-IPO stock?
Back in early 1996, at the absolute height of Silicon Valley Internet IPO madness, such an offer was akin to asking if you would accept a dump truck delivery of solid gold ingots on your front lawn. But for Wall, money has never been a primary motivation. Though widely acclaimed as the author of one of the most valuable tools for hackers anywhere, Wall lives modestly in suburban Mountain View, Calif., tooling around town in a well-worn 1977 Honda Accord. Perl itself was never about money -- Wall created the language to solve a programming problem he faced during his day job, and from the get-go he made sure that the source code to Perl would be freely available. People are always allowed to tinker with Perl -- regardless of whether they use it to construct a multibillion-dollar Internet directory company or just to get a survey form working on their own home page.
Still, Wall may be frugal, but he's not stupid. He accepted the offer and bought some Yahoo stock for his 14-year-old daughter -- enough to pay for her college education. A better example of the Internet's old "gift economy" ethic could hardly be imagined -- give unto the Net, and you shall receive.
Larry Wall likes to call Perl a "humble" language. In his soft-spoken voice, he describes Perl as if it were a meek, obeisant servant, existing only to "let you bend it to your uses." The legions of Perl hackers who swarm the Web are less modest: Perl, they declare, is the indispensable duct tape, or glue, that holds the entire Web together -- not just Yahoo, but Amazon and a million other sites. Without Perl and Larry Wall, Perl's advocates argue, the Net would be but a pale shadow of its current self.
Wall has played an important role in spurring forward not only the Web's evolution but also the burgeoning free software/open source movement responsible for so much of the Internet's structure and plumbing. But even though his peers hail him as one of the "paramount chiefs and wise elders" of free-software culture, Wall's version of leadership is utterly self-effacing -- a character trait that sets him apart from some of the other leaders of the movement.
The creation of Perl and the writing of the first Perl interpreter was a really important step in the history of the development of scripting languages. This was also one of the few huge open source projects that was not a direct reimplementation of a preexisting system (after all, what is Linux other than a reimplementation of FreeBSD under the GNU license?). Yes, it was an integration of several pre-existing Unix components, but like any talented integration it created something new, not found in any of the components.
Unfortunately not that much is known about Larry as a person, and O'Reilly failed to publish a book on the subject ;-). Here is how he described himself in the poster for his lecture at Stanford University:
Larry Wall lives in Mountain View with his wife and four kids, two cats, and fifteen fish. He likes to teach his computers to make weird noises in response to everything that goes on in his house. He posts strange articles to Usenet occasionally. He also happens to be known for writing rn, patch, and Perl.
Larry Wall (his personal site is www.wall.org) was born on September 27, 1954 in Los Angeles, CA. He is the son and grandson of evangelical preachers. Growing up in Bremerton, Washington, he initially planned to be a missionary. "I was very much encouraged to go into some sort of full-time Christian work." As he mentioned in one of his interviews:
In terms of biographical beginnings, my father was a pastor, as were both my grandfathers, and many of my ancestors before that. My wife likes to say that preachers are bred for intelligence (though I suppose she might be saying that just to flatter me).
Unlike most programmers, Wall himself is an active evangelical. For many years he was a member (and webmaster) of the New Life Church's Cupertino Church of the Nazarene. He is the only such figure among leading open source programmers.
He is married with four children. His (now former?) wife Gloria Biggar Wall (born in 1958) is a writer and Bible class teacher; she has taught Bible classes for more than 35 years. A sample of her writing with her brother Mark can be found here: GlobalWarming. His older daughter Heidi went to Seattle Pacific in 2000.
Wall spent three of his eight years as an undergraduate at Seattle Pacific University working at the school's computer center. At this time he wrote one of his first programs, the warp space-war game (the first version of which was written in BASIC-PLUS). Although he started programming on a PDP-11 at Seattle Pacific, Wall learned programming much earlier, in high school, after he got a programmable calculator. "You could program 120 steps into [it] and I very nearly taught it to play tic-tac-toe -- but I just couldn't get that squeezed down to 120 steps."
While at Seattle Pacific, Wall says he "was vaguely acquainted with Bill Gates." Wall explains, "[Gates] was still programming for the experimental college at the University of Washington and they had been jacking up the computer rates on him. So he came over and was using our little PDP-11 because we'd give him cheaper computer time."
Wall got his bachelor's degree from Seattle Pacific University in 1976. With a degree in linguistics and science from Seattle Pacific, he joined Wycliffe Bible Translators with his wife, Gloria Biggar Wall, in 1979. "We took their training and went off to graduate school," he recalls. After that Wall attended graduate school at U.C. Berkeley and U.C.L.A., studying in the linguistics department at UC Berkeley:
... Before he committed to a lifetime of systems administration and associated hacking, he and his wife were graduate students in the linguistics department at UC-Berkeley. Their plan, says Wall, was to become field missionaries dedicated to assisting Bible translation. They would go live with a tribe that had no written language, learn it from scratch, write it down and then help translate the Bible into that language.
Despite his training as a linguist, Wall turned to computer science because of more lucrative job prospects and a natural talent for programming, demonstrated early on. Also, in graduate school Wall developed food allergies, including allergic reactions to wheat and eggs, which made him "incompatible" with missionary work for health reasons.
Here his training at the Seattle Pacific University computer center came in very handy. He worked at Unisys and the NASA Jet Propulsion Laboratory (JPL). In his spare time he developed several free Unix programs, including such famous, classic programs as the rn news reader, patch and metaconfig. Here is a slightly humorous critique of those programs from the Unix Haters Handbook (1994), which actually demonstrates their power and flexibility rather than their shortcomings ;-):
rn, trn: You Get What You Pay for
Like almost all of the Usenet software, the programs that people use to read (and post) news are available as freely redistributable source code. This policy is largely a matter of self-preservation on the part of the authors:
- It's much easier to let other people fix the bugs and port the code; you can even turn the reason around on its head and explain why this is a virtue of giving out the source.
- Unix isn't standard; the poor author doesn't stand a chance in hell of being able to write code that will "just work" on all modern Unices.
- Even if you got a single set of sources that worked everywhere, different Unix C compilers and libraries would ensure that compiled files won't work anywhere but the machine where they were built.

The early versions of Usenet software came with simple programs to read articles. These programs, called readnews and rna, were so simplistic that they don't bear further discussion.
The most popular newsreader may be rn, written by Larry Wall. rn's documentation claimed that "even if it's not faster, it feels like it is." rn shifted the paradigm of newsreaders by introducing killfiles. Each time rn reads a newsgroup, it also reads the killfile that you created for that group (if it existed), which contains lines with patterns and actions to take. The patterns are regular expressions. (Of course, they're sort of similar to shell patterns, and, unfortunately, visual inspection can't distinguish between the two.)
Killfiles let readers create their own mini-islands of Usenet within the babbling whole. For example, if someone wanted to read only announcements but not replies, they could put "/Re:.*/" in the killfile. This could cause problems if rn wasn't careful about "tricky" subjects.

Date: Thu, 09 Jan 1992 01:14:34 PST
From: Mark Lottor <[email protected]>
To: UNIX-HATERS
Subject: rn kill

I was just trying to catch up on a few hundred unread messages in a newsgroup using rn. I watch the header pop up, and if the subject isn't interesting I type "k" for the kill command. This says "marking subject <foo> as read" and marks all unread messages with the same subject as having been read.

rn commands are a single letter, which is a fundamental problem. Since there are many commands, some of the assignments make no sense. Why does "f" post a follow-up, and what does follow-up mean, anyway? One would like to use "r" to post a reply, but that means send a reply directly to the author by mail. You can't use "s" for mail because that means save to a file, and you can't use "m" for mail because that means "mark the article as unread." And who can decipher the jargon to really know what that means? Or, who can really remember the difference between "k", "K", "^K", ".^K", and so on?
So what happens... I see a message pop up with subject "*******", and type "k." Yep -- it marks ALL messages as being read. No way to undo it. Total lossage. Screwed again.
There is no verbose mode, the help information is never complete, and there is no scripting language. On the other hand, "it certainly seems faster."
Like all programs, rn has had its share of bugs. Larry introduced the idea of distributing fixes using a formalized message containing the "diff" output. This said: here's how my fixed code is different from your broken code. Larry also wrote patch, which massages the old file and the description of changes into the new file. Every time Larry put out an official patch (and there were various unofficial patches put out by "helpful" people at times), sites all over the world applied the patch and recompiled their copy of rn.

Remote rn, a variant of rn, read news articles over a network. It's interesting only because it required admins to keep two nearly identical programs around for a while, and because everyone sounded like a seal when they said the name, rrn.

trn, the latest version of rn, has merged in all the patches of rn and rrn and added the ability to group articles into threads. A thread is a collection of articles and responses, and trn shows the "tree" by putting a little diagram in the upper-right corner of the screen as it is reading. [The original shows a small ASCII thread diagram here.]
No, we don't know what it means either, but there are Unix weenies who swear by diagrams like this and the special non-alphabetic keystrokes that "manipulate" this information.
The rn family is highly customizable. On the other hand, only the true anal-compulsive Unix weenie really cares whether killfiles are stored as $HOME/News/news/group/name/KILL, ~/News.Group.Name, or $DOTDIR/K/news.group.name. There are times when this capability (which had to be shoehorned into an inflexible environment by means of "% strings" and "escape sequences") reaches up and bites you:

Date: Fri, 27 Sep 91 16:26:02 EDT
From: Robert E. Seastrom <[email protected]>
To: UNIX-HATERS
Subject: rn bites weenie
So there I was, wasting my time reading Usenet news, when I ran across an article that I thought I'd like to keep. RN has this handy little feature that lets you pipe the current article into any Unix program, so you could print the article by typing "| lpr" at the appropriate time. Moreover, you can mail it to yourself or some other lucky person by typing "| mail [email protected]"
at the same prompt.
Now, this article that I wanted to keep had direct relevance to what I do at work, so I wanted to mail it to myself there. We have a UUCP connection to uunet (a source of constant joy to me, but that's another flame...), but no domain name. Thus, I sent it to "rs%[email protected]". Apparently %d means something special to rn, because when I went to read my mail several hours later, I found this in my mailbox:

Date: Fri, 27 Sep 91 10:25:32 -0400
From: [email protected] (Mail Delivery Subsystem)

----- Transcript of session follows -----
>>> RCPT To:<rs/tmp/alt/sys/[email protected]>
<<< 550 <rs/tmp/alt/sys/[email protected]>... User unknown
550 <rs/tmp/alt/sys/[email protected]>... User unknown

- Rob
He also worked for JPL and Seagate, playing with everything from discrete event simulators to network-management systems.
The initial version of Perl was created in 1987, while Wall was working at Unisys, where he was trying to glue together a bicoastal configuration management system over a 1200-baud encrypted link using a hacked-over version of Netnews. That means that Perl was created ten years after the C shell and AWK were written. At this point only REXX and AWK existed, and neither of them was a full-fledged scripting language.
REXX was designed and first implemented between 1979 and mid-1982 by Mike Cowlishaw of IBM. REXX was closer to a fully fledged language than AWK and was used as a shell in VM/CMS, but it was little known outside the IBM mainframe community (which at that time was already past its heyday). The interpreter was well written, with a good debugger, but proprietary. One of the first network computer viruses was written in REXX.
A little known fact is that he created all his programs, including Perl, while almost completely blind in one eye. Recently he published a diary relating details of his cornea transplant surgery that improved his vision. It is really amazing that he managed to overcome such a severe handicap for a programmer.
The O'Reilly publishing house was quick to realize the huge commercial potential of Perl books, and from 1995 till 2002 Larry Wall worked for O'Reilly & Associates as a salaried employee. During this period he authored and co-authored several popular books about Perl, including the famous Programming Perl, the Perl bible. O'Reilly published Perl books during this period at tremendous speed, and they all sold like hot cakes, making nice revenue for the publisher, though not so much for Larry. As he mentioned in his 1998 DDJ interview:
DDJ: When did you join O'Reilly as a salaried employee? And how did that come about?
LW: A year-and-a-half ago. It was partly because my previous job was kind of winding down.
He left O'Reilly in 2002, as interest in Perl books started to wane and O'Reilly could no longer milk the Perl book publishing cow as intensively as before. Here is the relevant quote from Perl Foundation Funds Larry Wall:
Perl Foundation announces Larry Wall as a recipient of a 2002 Perl Development Grant
Holland, Michigan, February 5 2002 -- The Perl Foundation announces the awarding of a 2002 Perl Development Grant to Larry Wall.
Larry Wall joins Dr. Damian Conway and Dan Sugalski as 2002 grant recipients. Larry is the creator of the Perl programming language, and is currently drafting the specifications for Perl 6, the next major version of Perl.
With the addition of Larry Wall, the total amount the Foundation plans to raise and contribute for the Perl Development Grants comes to $240,000. The grants consist of $60,000 in stipend and $20,000 in travel allowance, and are partially funded for 2002. Partial disbursements are being given through the year as fundraising continues -- for more information, or to donate, visit http://donate.perl-foundation.org.
"We're pleased to be able to let Larry focus on Perl 6 without distraction," said Kevin Lenzo, President and Founder of the Perl Foundation. "Through these grants, the support of the community -- including sizeable amounts from individuals, as well as companies such as O'Reilly, BlackStar, Altec, DynDNS, Pair Networks, SAGE, Stonehenge Consulting, and Manning, can be put to use for the good of everyone."
The first Perl Development Grant was awarded to Dr. Conway by the Yet Another Society in 2001, when individual and corporate sponsors made it possible. Individuals and small companies accounted for nearly half the US$75,000 award. The list of contributors, as well as the work produced under the grant, is at http://yetanother.org/damian.
After being asked to leave by O'Reilly, he became semi-unemployed. Later the same news was mentioned at perl.com in This week on Perl 6, week ending 2003-01-19 [Jan. 19, 2003]:
Damian mentioned that "We should bear in mind that Larry has had some health issues. And that he's currently unemployed with four children to support. Other matters are taking precedence at the moment." Get well soon Larry.
This led to a discussion of whether the Perl Foundation would be continuing its grant to Larry in 2003 (apparently not). As of 2004 he held the position of Senior Scientist ("oldest hacker") at NetLabs and resided in Mountain View, California.
Since 2002 he has been only partly employed. In 2009 he recollected:
Essentially I have been officially unemployed for not quite five years now. There's never enough funding.
Unfortunately, during this difficult period he was diagnosed with a stomach tumor, and around 2007 Larry Wall underwent two major surgeries. Here is how he remembered the situation in 2009 (Larry Wall interview, Linux Format):
GW: Also, Larry wasn't expecting to be sick. That lasted a year out of the project.
LW: Yeah, a couple of years ago I ended up having two stomach surgeries. Two because the first one didn't work. Well, it worked in the sense that they chopped out the tumour that needed to be chopped out, but not in the sense that I couldn't actually eat or drink anything for a period of six weeks. During that time I was subsisting away on what could be pumped into my veins.
Back-to-back surgeries like that take a lot more out of you than you think they do. Two months after my second surgery I thought I was back to 100% but then half a year after that I looked back and said no, I was not at 100% then either. I don't know if it was that whole year, but it was a significant portion of the year.
On September 27, 2014 Larry turned 60, and in two years he can formally claim Social Security.
Perl is an acronym for "Practical Extraction and Report Language." The first version of Perl, Perl 1.000 was released by Larry Wall in 1987. See an excellent PerlTimeline for more information.
Unlike Linus Torvalds, Larry Wall did not have POSIX standards before him to solve architectural problems, although the Unix shell languages, AWK and Snobol provided some guidelines. Perl was developed as an "umbrella" language that combines the functionality of several preexisting Unix utilities, such as the C-shell, sed and AWK, into a single language. But the way this pre-existing functionality was integrated into the language was quite new and, I would say, innovative. In language design, as in cooking, you can have exactly the same ingredients and still get dishes of different quality from different cooks :-)
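To make the "umbrella" idea concrete, here is a hypothetical example (the log lines and field layout are made up for illustration): a shell pipeline gluing three separate tools -- grep, awk and sort -- collapses into a single Perl process.

```perl
use strict;
use warnings;

# Shell version, three tools glued by pipes:
#   grep 'ERROR' app.log | awk '{print $2}' | sort -u
#
# Perl version, all in one process:
my @lines = (
    "ERROR disk full",
    "INFO  all good",
    "ERROR disk full",
    "ERROR net down",
);

my %seen;
my @unique = sort                              # sort -u ...
             grep { !$seen{$_}++ }             # ... the -u part
             map  { (split ' ', $_)[1] }       # awk '{print $2}'
             grep { /^ERROR/ } @lines;         # grep 'ERROR'

print "$_\n" for @unique;
# disk
# net
```

The point is not that the Perl version is shorter -- it is that filtering, field extraction and deduplication all live in one language with shared data structures, instead of three processes passing flat text.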
His views on Perl are very interesting to me because, paradoxically, the idea that "easy things should be easy and hard things should be possible" was probably first implemented in PL/1 -- an almost forgotten mainframe language (PL/1 served as a prototype for C due to the close association of Unix developers with the Multics project, where PL/1 was used as the system programming language):
The strengths that come from UNIX are that it's a worldview where easy things are easy and hard things are possible, and Perl has taken to that idea in particular. There's a really old UNIX idea that if something could be represented as a simple flat text file it ought to be, because then you can edit with regular tools rather than going into a database or something [similarly cumbersome and proprietary].
It is interesting that, while at Berkeley, he had nothing to do with the UNIX development going on there. From the point of view of Larry Wall's philosophy, probably the most interesting was his LWN interview:
CL: Have you ever thought of starting a Perl support company and going IPO?
LW: :-) No, I am not a sort of person who wants to run a company. I think that would be even less fun than what I'm doing right now :-) There are other companies that are already working to support Perl. So if I did that, that would just be duplicated effort.
CL: Is it because you feel comfortable being at O'Reilly?
LW: Yes. Essentially, my position is what you call a patronage. It's a very old-fashioned idea which goes back to the time when there was an aristocracy and they would support artists and musicians. They would have a patron, Tim O'Reilly is my patron. He pays me to create things to, kind of be in charge of the Perl culture.
CL: So you consider yourself an artist.
LW: Yes in many ways. I'm certainly not a manager :-)
CL: And you don't consider yourself an engineer?
LW: Oh, a little bit of that, too. Some of modern engineering is necessary to good art. But what I think of myself is a cultural artist. Not just trying to write computer programs, but trying to change the culture around me for a better one, whether it's writing Perl or any of the other programs I've written, or trying to change the way people license their software into open source.
I think of myself as a hacker in that sense. Not in the sense that people would break into computer but who will be working on a problem until they solve it. And the problems that I really like to solve are our cultural problems.
CL: Would you give us an example of cultural problems?
LW: Ten years ago or so, we had Richard Stallman's GPL, and Perl was licensed under that. And I discovered that that worked fine for the hacker community, for the geeks, but it prevented Perl from being used in a commercial environment. So I wrote my own license. But I didn't want to offend the free software, the GPL people.
So, rather than switching licenses, I said "Well, let's have both licenses and you may distribute Perl under either of them at the same time." And that way, the computer crowd, they had their insurance that their rights would not be taken away, and the companies had some insurances that their rights would not be taken away, and everyone was happy. That's sort of cultural hack that I'm talking about.
... ... ...
The name Perl was not chosen accidentally; it was the result of much research and hard work. As he mentioned in his Linux Journal interview:
I wanted a short name with positive connotations. (I would never name a language ``Scheme'' or ``Python'', for instance.) I actually looked at every three- and four-letter word in the dictionary and rejected them all. I briefly toyed with the idea of naming it after my wife, Gloria, but that promised to be confusing on the domestic front. Eventually I came up with the name ``pearl'', with the gloss Practical Extraction and Report Language. The ``a'' was still in the name when I made that one up. But I heard rumors of some obscure graphics language named ``pearl'', so I shortened it to ``perl''. (The ``a'' had already disappeared by the time I gave Perl its alternate gloss, Pathologically Eclectic Rubbish Lister.)
Another interesting tidbit is that the name ``perl'' wasn't capitalized at first. UNIX was still very much a lower-case-only OS at the time. In fact, I think you could call it an anti-upper-case OS. It's a bit like the folks who start posting on the Net and affect not to capitalize anything. Eventually, most of them come back to the point where they realize occasional capitalization is useful for efficient communication. In Perl's case, we realized about the time of Perl 4 that it was useful to distinguish between ``perl'' the program and ``Perl'' the language. If you find a first edition of the Camel Book, you'll see that the title was Programming perl, with a small ``p''. Nowadays, the title is Programming Perl
The description from the original man page sums up this new language well. (Dec 18, 1987):
NAME
    perl - Practical Extraction and Report Language

SYNOPSIS
    perl [options] filename args

DESCRIPTION
    Perl is an interpreted language optimized for scanning arbitrary text files, extracting information from those text files, and printing reports based on that information. It's also a good language for many system management tasks. The language is intended to be practical (easy to use, efficient, complete) rather than beautiful (tiny, elegant, minimal). It combines (in the author's opinion, anyway) some of the best features of C, sed, awk, and sh, so people familiar with those languages should have little difficulty with it. (Language historians will also note some vestiges of csh, Pascal, and even BASIC-PLUS.) Expression syntax corresponds quite closely to C expression syntax. If you have a problem that would ordinarily use sed or awk or sh, but it exceeds their capabilities or must run a little faster, and you don't want to write the silly thing in C, then perl may be for you. There are also translators to turn your sed and awk scripts into perl scripts. OK, enough hype.
The very first version already contained a lot of the strong and weak points of the language:
I made one major, incompatible change to Perl just before it was born. From the start, one of my overriding design principles was to "optimize for the common case." I didn't coin this phase, of course. I learned it from people like Dennis Ritchie, who realized that computers tend to assign more values than they compare. This is why Dennis made = represent assignment and == represent comparison in his C programming language.
I'd made many such tradeoffs in designing Perl, but I realized that I'd violated the principle in Perl's regular expression syntax. It used grep's notion of backslashing ordinary characters to produce metacharacters, rather than egrep's notion of backslashing metacharacters to produce ordinary characters.
It turns out that you use the metacharacters much more frequently than you do the literal characters, so it made sense to change Perl so that /(.*)/ defined a substring that could be referenced later, while /\(.*\)/ matched a sequence inside literal parentheses.
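In today's Perl the tradeoff Wall describes looks like this (a minimal sketch, not from the original article; the sample string is made up):

```perl
use strict;
use warnings;

my $text = "error (disk full) at 10:42";

# Bare parentheses are metacharacters: they capture a substring
# into $1, following egrep's convention.
if ( $text =~ /error \((.*?)\)/ ) {
    print "captured: $1\n";          # captured: disk full
}

# Backslashed parentheses match literal '(' and ')' characters --
# the opposite of classic grep, where \( \) meant "capture".
if ( $text =~ /\(disk full\)/ ) {
    print "matched literal parens\n";
}
```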
The decision to release it as free open source software was not an easy one:
I knew that I didn't dare ask the company lawyers for permission, because they'd have thought about it for something like six months, and then told me "no." This is despite the fact that they wouldn't be interested in peddling it themselves. In the old days, a lot of free software was released under the principle that it's much easier to ask forgiveness than to seek permission. I'm glad things have changed -- at least to the extent that the counterculture is acknowledged these days, even if it's not quite accepted. Yet.
Here is how the early history of Perl is viewed by the author himself (the quote below is from the third edition of the Camel book):
Way back in 1986, Larry was a systems programmer on a project developing multi-level-secure wide-area networks. He was in charge of an installation consisting of three VAXen and three Suns on the West Coast, connected over an encrypted, 1200-baud serial line to a similar configuration on the East Coast. Since Larry's primary job was support (he wasn't a programmer on the project, just the system guru), he was able to exploit his three virtues (laziness, impatience, and hubris) to develop and enhance all sorts of useful tools--such as rn, patch, and warp.
 It was at about this time that Larry latched onto the phrase "feeping creaturism" in a desperate attempt to justify on the basis of biological necessity his overwhelming urge to add "just one more feature". After all, if Life Is Simply Too Complicated, why not programs too? Especially programs like rn that really ought to be treated as advanced Artificial Intelligence projects so that they can read your news for you. Of course, some people say that the patch program is already too smart.
One day, after Larry had just finished ripping rn to shreds, leaving it in pieces on the floor of his directory, the great Manager came to him and said, "Larry, we need a configuration management and control system for all six VAXen and all six Suns. We need it in a month. Go to it!"
So, Larry, never being one to shirk work, asked himself what was the best way to have a bicoastal CM system, without writing it from scratch, that would allow viewing of problem reports on both coasts, with approvals and control. The answer came to him in one word: B-news.
That is, the second implementation of Usenet transport software.
Larry went off and installed news on these machines and added two control commands: an "append" command to append to an existing article, and a "synchronize" command to keep the article numbers the same on both coasts. CM would be done using RCS (Revision Control System), and approvals and submissions would be done using news and rn. Fine so far.
Then the great Manager asked him to produce reports. News was maintained in separate files on a master machine, with lots of cross-references between files. Larry's first thought was "Let's use awk." Unfortunately, the awk of that day couldn't handle opening and closing of multiple files based on information in the files. Larry didn't want to have to code a special-purpose tool. As a result, a new language was born.
This new tool wasn't originally called Perl. Larry bandied about a number of names with his officemates and cohorts (Dan Faigin, who wrote this history, and Mark Biggar, his brother-in-law, who also helped greatly with the initial design). Larry actually considered and rejected every three- or four-letter word in the dictionary. One of the earliest names was "Gloria", after his sweetheart (and wife). He soon decided that this would cause too much domestic confusion.
The name then became "Pearl", which mutated into our present-day "Perl", partly because Larry saw a reference to another language called PEARL, but mostly because he's too lazy to type five letters all the time. And, of course, so that Perl could be used as a four-letter word. (You'll note, however, the vestiges of the former spelling in the acronym's gloss: "Practical Extraction And Report Language".)
This early Perl lacked many of the features of today's Perl. Pattern matching and filehandles were there, scalars were there, and formats were there, but there were very few functions, no associative arrays, and only a crippled implementation of regular expressions, borrowed from rn. The manpage was only 15 pages long. But Perl was faster than sed and awk and began to be used on other applications on the project.
But Larry was needed elsewhere. Another great Manager came over one day and said, "Larry, support R&D." And Larry said, okay. He took Perl with him and discovered that it was turning into a good tool for system administration. He borrowed Henry Spencer's beautiful regular expression package and butchered it into something Henry would prefer not to think about during dinner. Then Larry added most of the goodies he wanted, and a few goodies other people wanted. He released it on the network. The rest, as they say, is history.
 More astonishingly, he kept on releasing it as he went to work at Jet Propulsion Lab, then at NetLabs and Seagate. Nowadays, other people do most of the real work, and Larry pretends to work for O'Reilly & Associates (a small company that publishes pamphlets about computers and stuff).
 And this, so to speak, is a footnote to history. When Perl was started, rn had just been ripped to pieces in anticipation of a major rewrite. Since he started work on Perl, Larry hasn't touched rn. It is still in pieces. Occasionally, Larry threatens to rewrite rn in Perl, but never seriously.
In the article "Uncultured Perl" (Linux Magazine, October 1999, Features section) Larry Wall wrote:
Like the typical human, Perl was conceived in secret, and existed for roughly nine months before anyone in the world ever saw it. Its womb was a secret project for the National Security Agency known as the "Blacker" project, which has long since closed down. The goal of that sexy project was not to produce Perl. However, Perl may well have been the most useful thing to come from Blacker. Sex can fool you that way.
That means that, like the Internet itself, Perl's creation was subsidized by the military. In another interview Larry mentioned that he wrote Perl "while trying to glue together a bicoastal configuration management system over a 1200 baud encrypted link for some defense project using a hacked-over version of Netnews". The initial Perl had two co-authors:
At this point, I'm talking about Perl, version 0. Only a few people in my office ever used it. In fact, the early history of Perl recorded in O'Reilly's Camel Book (Programming Perl) was written by my officemate of the time, Daniel Faigin.
He, along with my brother in law, Mark Biggar, were most influential in the early design of Perl. They were also the only users at the time. Mark talked me out of using bc as a backend expression processor, and into using normal, built in floating point operations, since they were just being standardized by the IEEE (Institute of Electrical and Electronics Engineers). Relying on that standard was one of the better decisions I ever made. Earlier scripting languages such as REXX didn't have that option, and as a result they tend to run slower.
It is natural to think about Perl as an integration project, some kind of csh/AWK/sed superset -- or, more formally, an attempt to add AWK and sed features to the C-shell framework. The idea was to create a language more suitable for processing logs and generating reports over large quantities of data than the combination of shell, AWK and sed. The design contains a lot of strong solutions as well as some not-so-good ones that make learning Perl more complex than it should be. Here is how Larry Wall explains his decision:
I've always been smart enough to realize how stupid I am, and one of the things I'm stupid about is predicting how my programs will develop over time. So Perl was equipped to learn, and have a long childhood.
We value the maturing process in our own species, but for some reason we don't like it as much in computer programs. In the absence of a handy Zeus, we like to think that computer programs should spring fully formed from our own foreheads. We want to present the world with a fait accompli. In modern terms, we want to build a cathedral.
Now let me just say that I think cathedrals have gotten a bum rap lately. Open Source advocate Eric Raymond has likened the commercial software development model to a cathedral, while he compares free software development to a bazaar.
Eric's heart is in the right place, but I think his metaphors are a little off. Most cathedrals were built in plain view with lots of volunteer labor. And most of the bazaars I've seen have produced little of lasting architectural value. Eric should have written about artists who insist on having an unveiling when their sculpture or painting is finished. Somehow I can't imagine anyone pulling a shroud off of a cathedral and saying, "Voila!"
For historic reasons Perl's syntax resembles both C and the C-shell, and it has a superset of AWK's repertoire of built-in functions. That means that those who know shell programming feel they can adapt to Perl without major problems. That's why many UNIX users, and especially system administrators, find Perl (deceptively) easy to learn. In reality Perl is a very complex language with very convoluted semantics. Perl's slogan -- "There's always more than one way to do it" -- is essentially the same idea that inspired the designers of PL/1, and it would definitely find a home in the hearts of the designers of MS Office ;-). Different Perl programmers may use different approaches even for a simple problem. As with any sufficiently complex language, everybody uses some subset, never the full language. In this sense Perl can be considered an anti-Unix development ;-). And Larry Wall agrees with this:
But Perl was actually much more countercultural than you might think. It was intended to subvert the Unix philosophy. More specifically, it was intended to subvert that part of Unix philosophy that said that every tool should do only one thing and do that one thing well.
The problem with that philosophy is that many of the tools available under Unix did not, in fact, do things very well. They had arbitrary limits. They were slow. They were non-portable. They were difficult to integrate via the shell because they had different ideas of data formats. They worked okay as long as you did what was expected, but if you wanted to do something slightly different, you had to write your own tool from scratch.
So that's what I did. Perl is just another tool in the Unix toolbox. Perl does one thing, and it does it well: it gets out of your face.
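The "more than one way to do it" slogan is easy to demonstrate. Here are three equivalent ways to sum a list of numbers, all idiomatic; different Perl programmers will reach for different ones (a hypothetical illustration):

```perl
use strict;
use warnings;
use List::Util qw(sum);   # core module since Perl 5.8

my @nums = (1, 2, 3, 4);

# 1. C-style loop, for programmers coming from C
my $total1 = 0;
for (my $i = 0; $i < @nums; $i++) { $total1 += $nums[$i]; }

# 2. A statement modifier with the default variable $_
my $total2 = 0;
$total2 += $_ for @nums;

# 3. The standard List::Util module
my $total3 = sum(@nums);

print "$total1 $total2 $total3\n";   # 10 10 10
```

All three produce the same answer; the language deliberately refuses to bless one of them as "the" way.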
When I first encountered Perl I was surprised how many of the underlying ideas of Perl are close to PL/1 -- the language that served as one of the inspirations for C and, despite being a mainframe language, is historically related to the Unix culture via its Multics roots. PL/1 was an innovative language that was too far ahead of its time to survive. It was the first language that contained decent string handling, pointers, three types of storage allocation -- static, automatic (stack-based) and controlled (heap-based) -- exception handling, and rudimentary multitasking. It was, and probably still is, one of the most complex and interesting algorithmic languages in existence, although its popularity (like that of many other interesting IBM products, VM/CMS and OS/2 being examples) suffered blows from IBM itself and, in the 70s, from religious fanatics in the days of structured programming and verification. What is most interesting is that, despite its age, PL/1 probably has the best optimizing and debugging compilers of any language of similar complexity in existence.
Complex non-orthogonal programming languages rarely become hugely popular. The popularity of Cobol, Basic, Pascal and Java provides the primary examples here: all of them are dull, uninventive languages designed for novices (with Pascal explicitly designed for teaching programming at universities). Perl is one of the very few complex non-orthogonal programming languages that gained wide popularity. Actually, it was the second complex non-orthogonal language, after PL/1, to achieve worldwide popularity.
Probably the major killing factor for PL/1 was that a PL/1 compiler was very complex to write (similar to C++ in complexity) due to the built-in concept of exceptions, which was 20 years ahead of its time. Because of exceptions and some (maybe unnecessary, like the absence of reserved words) peculiarities of the syntax, PL/1 compilers were quite expensive to write and maintain. No free compiler existed (although Cornell University managed to implement PL/C -- a pretty full teaching subset of PL/1 -- and successfully used it for a number of years, it didn't receive widespread use; I wonder under which license it was released). It's unclear what would have happened with PL/1 if PL/1 compilers had been an open source development like Perl. If a PL/1 compiler had been open sourced, the destiny of the language might have been different. Currently the quality of the Perl interpreter is much lower than that of the PL/1 debugging compiler, but its open source status protects Perl from being pushed aside by "cheap and primitive" languages on one side and languages that sport a fashionable paradigm (OO was such a paradigm for the last decade) on the other.
Paradoxically, PL/1 played the role of freeware in at least one country ;-). Under those conditions it quickly became the dominant programming language on mainframes in the USSR, far outpacing Cobol and Fortran, which still dominated the mainframe arena in the USA and other Western countries. So here the analogy with Perl holds perfectly. PL/1 dominated despite the fact that the Soviet IBM 360/370 clones (called EC -- the Russian abbreviation of "Uniform System of Computers") were much less powerful (and far less reliable) than their Western counterparts. Now both books on and compilers for PL/1 have become a rarity, although the IBM optimizing and debugging compilers for System 360/370 remain an unsurpassed masterpiece of software engineering. But I would like to stress that PL/1 (as the system programming language for Multics) had a large influence on C -- one of the most widely used compiled programming languages -- and many of its ideas, directly or indirectly, found their way into other programming languages, Perl included (I have no information about Larry Wall's possible exposure to PL/1). IMHO, understanding if not PL/1 programming then PL/1 philosophy -- or its close relative, Perl philosophy -- can benefit the programming community much more than playing with languages based on some kind of religious doctrine, like pure strongly typed languages or OO languages ;-).
There were several versions of Perl, but historically the most important are two: version 4 (now obsolete) and version 5, which is now dominant.
Version 4 was the first widely used version of Perl. The timing was simply perfect: it was already widely available before the Web explosion in 1994. By that time Perl already had two books, "Programming Perl" and "Learning Perl", published by O'Reilly. As Larry Wall recollected:
Another thing that helped legitimize Perl was the addition of the Artistic License to stand beside the GPL. Perl 3 used only the GPL, but I found that this didn't do quite what I wanted. I wanted Perl to be used, and the GPL was preventing people from using Perl. Not that I dislike the GPL myself -- it provides a set of assurances that many hackers find comforting. But business people needed a different set of assurances, and so I wrote the Artistic License to reassure them.
The really brilliant part was that I didn't require people to state which license they were distributing under, so nobody had to publicly commit to one or the other. In sociological terms, nobody had to lose face, or cause anyone else to lose face. Most everyone chose to read whichever license they preferred, and to ignore the other. That's how Perl used psychology to subvert the license wars which, as you may or may not be aware, are still going on. Ho hum.
Yet another thing that helped legitimize Perl was that there was a long period of stability for Perl 4, patch level 36. The primary cause of this was that I abandoned Perl 4 to work on Perl 5.
As Tim O'Reilly later noted in The Importance of Perl [Apr. 01, 1998], with the advent of the World Wide Web, Perl usage exploded:
Despite all the press attention to Java and ActiveX, the real job of "activating the Internet" belongs to Perl, a language that is all but invisible to the world of professional technology analysts but looms large in the mind of anyone -- webmaster, system administrator or programmer -- whose daily work involves building custom web applications or gluing together programs for purposes their designers had not quite foreseen. As Hassan Schroeder, Sun's first webmaster, remarked: "Perl is the duct tape of the Internet."
Perl was originally developed by Larry Wall as a scripting language for UNIX, aiming to blend the ease of use of the UNIX shell with the power and flexibility of a system programming language like C. Perl quickly became the language of choice for UNIX system administrators. With the advent of the World Wide Web, Perl usage exploded. The Common Gateway Interface (CGI) provided a simple mechanism for passing data from a web server to another program, and returning the result of that program interaction as a web page. Perl quickly became the dominant language for CGI programming.
With the development of a powerful Win32 port, Perl has also made significant inroads as a scripting language for NT, especially in the areas of system administration and web site management and programming.
For a while, the prevailing wisdom among analysts was that CGI programs--and Perl along with them--would soon be replaced by Java, ActiveX and other new technologies designed specifically for the Internet. Surprisingly, though, Perl has continued to gain ground, with frameworks such as Microsoft's Active Server Pages (ASP) and the Apache web server's mod_perl allowing Perl programs to be run directly from the server, and interfaces such as DBI, the Perl DataBase Interface, providing a stable API for integration of back-end databases.
This paper explores some of the reasons why Perl will become increasingly important, not just for the web but as a general purpose computer language. These reasons include:
- fundamental differences in the tasks best performed by scripting languages like Perl versus traditional system programming languages like Java, C++ or C.
- Perl's ability to "glue together" other programs, or transform the output of one program so it can be used as input to another.
- Perl's unparalleled ability to process text, using powerful features like regular expressions. This is especially important because of the re-emergence via the web of text files (HTML) as a lingua-franca across all applications and systems.
- The ability of a distributed development community to keep up with rapidly changing demands, in an organic, evolutionary manner.
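The "glue" and text-processing points above can be illustrated with a short sketch. The log format and field names here are hypothetical, invented for the example; the point is the idiom: read lines, match with a regular expression, aggregate, and emit plain text that another program can consume.

```perl
use strict;
use warnings;

# Hypothetical access-log lines in the form "METHOD /path STATUS"
my @log = (
    "GET /index.html 200",
    "GET /missing 404",
    "POST /form 200",
);

my %hits;    # number of requests seen per HTTP status code
for my $line (@log) {
    # A regular expression both validates the line and extracts its fields
    if ($line =~ /^(\w+)\s+(\S+)\s+(\d{3})$/) {
        my ($method, $path, $status) = ($1, $2, $3);
        $hits{$status}++;
    }
}

# Emit "status count" pairs, one per line -- ready to pipe into
# sort, awk, or any other tool in a shell pipeline
print "$_ $hits{$_}\n" for sort keys %hits;
```

In real use the `@log` array would be replaced by reading from a file handle or a pipe, which is where the "transform the output of one program into input for another" role comes in.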
A good scripting language is a high-level software development language that allows for quick and easy development of trivial tools while having the process flow and data organization necessary to also develop complex applications. It must be fast while executing. It must be efficient when calling system resources such as file operations, interprocess communications, and process control. A great scripting language runs on every popular operating system, is tuned for information processing (free form text) and yet is excellent at data processing (numbers and raw, binary data). It is embeddable, and extensible. Perl fits all of these criteria.
Version 5 was released at the end of 1994:
The much-anticipated Perl 5.000 was unveiled on 18 October. It was a complete rewrite of Perl. A few of the features and pitfalls:
- The documentation is much more extensive and perldoc along with pod is introduced.
- Lexical scoping available via my. eval can see the current lexical variables.
- The preferred package delimiter is now :: rather than '.
- New functions include: abs(), chr(), uc(), ucfirst(), lc(), lcfirst(), chomp(), glob()
- There is now an English module that provides human readable translations for cryptic variable names.
- Several previously added features have been subsumed under the new keywords use and no.
- Pattern matches may now be followed by an m or s modifier to explicitly request multiline or singleline semantics. An s modifier makes . match newline.
- @ now always interpolates an array in double-quotish strings. Some programs may now need to use backslash to protect any @ that shouldn't interpolate.
- It is no longer syntactically legal to use whitespace as the name of a variable, or as a delimiter for any kind of quote construct. The -w switch is much more informative.
- => is now a synonym for comma. This is useful as documentation for arguments that come in pairs, such as initializers for associative arrays, or named arguments to a subroutine.
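Several of the listed 5.000 features can be seen together in a small sketch (a hypothetical snippet written for illustration, not taken from the release notes):

```perl
use strict;
use warnings;

# Lexical scoping via my() -- new in Perl 5
my $line = "Hello, Perl 5\n";
chomp($line);                     # chomp() was among the new functions

# => as a documenting synonym for comma in pair-style initializers
my %release = (
    version => '5.000',
    date    => '18 October 1994',
);

# The /m modifier explicitly requests multiline semantics:
# ^ matches at the start of every line, not just the string
my $text  = "alpha\nbeta\ngamma";
my @words = $text =~ /^(\w+)/mg;  # ('alpha', 'beta', 'gamma')

# uc() and ucfirst() were also introduced in 5.000
print join(" ", map { ucfirst } @words), "\n";
```

Under `use strict` (which pairs naturally with `my`) the interpreter rejects undeclared variables, which is part of why lexical scoping was such a significant addition.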
Perl 5 matured quickly thanks to O'Reilly sponsoring the project starting from 1996: O'Reilly published an avalanche of books on Perl, and each book contributed to the debugging and enhancement of the interpreter. Otherwise I doubt that such a complex version would ever have been accomplished: the project had definitely outgrown the volunteer stage. Tim O'Reilly probably understood that Perl books could be an important cash cow, so it made business sense to ensure that they were published by O'Reilly and nobody else. He was right on the money and later managed to get hold of Perl conferences as well. The first O'Reilly Perl Conference (TPC) was held in San Jose, California in 1997. The conference was attended by over a thousand people, making it a financial success and ensuring a second conference. Larry Wall recollected this development in 1999 in the following way:
But beyond that, I was looking around for someone with some business sense to cooperate with, when lo and behold I found out that Tim O'Reilly (as in O'Reilly & Associates, my publisher) was having the same ideas about establishing a more symbiotic relationship with the open source community.
Tim is a class act. He's also a bit of a rarity: a brilliant (but not greedy) entrepreneur. His slogan is "Interesting work for interesting people." We hit it off right away, and Tim offered to pay me to take care of Perl, because anything that is good for Perl is good for O'Reilly. And from my perspective, lots of what O'Reilly does happens to be good for Perl.
But it goes beyond even that.
Tim and I both felt that there was something larger than Perl afoot. Free software has been around in various forms as long as there has been software, but something new was beginning to happen, something countercultural to the counterculture.
The various open source projects were starting to realize that, hey, we aren't just a bunch of separate projects, but we have a lot in common. We don't have a bunch of separate open source movements here. We have a single Open Source movement -- albeit one with lots of diversity of opinion as to how best to move the bandwagon forward to where more people can hop on.
In short, our counterculture was beginning to count.
When Tim hired me on three years ago, that was very much on our minds. We were preaching it a year before Netscape released the Mozilla browser code under an open source license.
Perl 5 evolved very slowly with the major changes appearing only in Perl 5.10.
For Perl, the meltdown happened because I decided to follow the rule: "Plan to throw away your prototype, because you will anyway." Perl 5 was nearly a total reorganization. I have in times past claimed that it was a total rewrite, but that's a bit of a stretch, since I did, in fact, evolve Perl 4's runtime system into Perl 5's. (Though if you compared them, you'd see almost nothing in common.) The compiler, though, was a total rewrite.
All programming languages are essentially compilations of earlier ideas. There have always been important programming languages that differ from mainstream ALGOL-style languages. For example, Fortran was different from, and more suitable than, Algol-style languages for numeric computations.
Although the Perl implementation is open source, commercial companies actively participate in its development. The major commercial beneficiary of Perl's success was O'Reilly & Associates, which published the first books on the language (Programming Perl and Learning Perl in 1993) and now sells several million dollars' worth of Perl books each year. They made a largely unsuccessful attempt at distributing the Perl Resource Kit. At the same time they provided financial support for the Perl movement: from 1995 they employed Larry Wall, hosted the Perl web site, and sponsored a Perl conference. Around 2002 O'Reilly lost interest in Perl due to plunging book sales, and the same year Larry Wall left O'Reilly (see the 2009 Larry Wall interview with Linux Format):
LXF: Did you leave O'Reilly after the dotcom boom had ended, when people stopped buying books so much?
LW: O'Reilly had run into really tough times because of the plunge in book sales, which was already starting before 9/11 but very much accelerated at that point. I knew that I was one of their fluffier employees from a standpoint of their core business, so I was not at all surprised to get laid off.
People sometimes say, "Aren't you angry at Tim O'Reilly for laying you off?" and I say, "No, you don't understand." The years that he hired me he essentially paid me to do what I wanted to do. He essentially gave me a scholarship for those years, and I'm completely grateful for that, for what he was able to do, and so that's my feeling about Tim O'Reilly. I'm on very good terms with him.
LXF: So when this happened about five years ago you were just starting to kick off Perl 6, then?
One should see Perl as one of the important non-Algol-family languages. Among the important non-traditional precursors of Perl are:
Although this is an important part of the history, I do not want to delve into religious debates about the merits of a particular approach, or discuss in which language a particular feature was first introduced. Many things in languages were invented and reinvented in parallel. Generally, language development is high drama, the drama of ideas. And unfortunately, in many cases the timing and the wealth of the sponsor mean more than the quality of the ideas.
See Python Compared -- a very interesting page that provides a lot of relevant links. Perl is a compilation, but it is a useful one that extends some features in such a way that they can be considered innovations. At the same time, viewing Perl as one branch among several largely parallel streams of development of scripting languages helps in understanding the language, its strengths and its limitations. Being a PL/1-style language, Perl added to and developed further some interesting ideas that originated in Unix shell languages (mainly in the Bourne shell). Among them:
The Perl 5 interpreter is now so big and complex that adding features is a tricky exercise. Support for Unicode is probably the most important recent innovation, but other than that I expect the language to evolve very slowly.
Perl 6 development resulted in some backporting of features to Perl 5. On its own, Perl 6 never exited the experimental-language stage and now will probably be renamed.
Here is how Larry Wall positioned Perl among other programming languages:
CL: Once again about Ruby, until a few years ago, I would recommend Perl with no doubt because of its usability, its big enough development and user community base, many good books, etc. Now, I think Ruby and Python can also be good candidates. What should I do?
LW: Obviously, you should still recommend Perl :-) It really depends on the kind of the programmer you are talking to. Ruby and Python are languages that are designed more with the computer science mind-set, trying to be minimalistic. Some people prefer that kind of language. Perl was designed to work more like a natural language. It's a little more complicated but there are more shortcuts, and once you learned the language, it's more expressive.
So, it really depends on whether if you would just like to learn a smaller language and then you just fight with it all the time, or, learn a slightly larger language and have more fun. I think Perl is still more fun than the other languages.
CL: In general, there are some open source projects that compete with other projects that have the same goal. This may be good because, for example, it brings diversity and rivalry in a good way. But human resources have always been the most valuable resource in open source projects and from that point of view, it could be just duplication of effort. Do you see this as a problem?
LW: I don't think you can avoid that problem... In the area of computer languages, there are just a certain number of people who are interested in developing their own computer language. And you can't stop them, they're just going to do it anyway :-)
It's not duplication of effort in a sense that we copy all the good ideas from each other. So all of those languages get better. They just make their duplication of efforts in terms of implementing those ideas, because they have to implement the idea in one way over here and in a different way over there. But there are potentially some ways in which different camps could cooperate.
We see the same thing happening not just with free software. We have the same thing happening in terms of over all language architectures. You have the whole idea of compiling C or C++ down to machine code. Then you have the Java camp which duplicated an awful lot of stuff, then you have Microsoft coming out with C#, they're trying to do the same thing. That actually makes duplicated work for us, because we want to target all of those architectures :-)
On the other hand, it forces us to look at how we do our things and do them in our way or in a general way so that we can do that. That means when something else comes along later, we'll be able to do it easily too. So it has its pluses and minuses.
But to me the whole progress is really driven in a Darwinian sense, the way evolution works. You have variations of ideas, and then some of them work better than others, and then those survive and continue on. So it's actually important in the long run to have multiple languages, multiple operating systems, and if you don't have that competition, then we don't survive. We're dead. We're extinct :-)
... ... ...
CL: Perl development is said to be started with the power of laziness, but what power do you think make the development going right now?
LW: Oh, good question... Hmmm... I think it's still laziness. Laziness on a different level. When Perl first was developed, it was laziness to get particular small jobs done quickly. But now, people don't want to have to use Perl plus other things. If there is a job that really ought to be written in C++ or Java or Ruby or Python or something like that, but they like Perl, and Perl may not be the best tool yet for it, but they would like it to be.
So rather than learning a different language, they just want to extend Perl toward what is better for that. So I think it's still laziness :-)
CL: Do you think you will keep taking part in the Perl development for all the rest of your life?
LW: I believe so. Of course, someday I will become too stupid to participate :-)
When I announced the development of Perl 6 this summer, I said that it was going to be a community design. I designed Perl, myself. It's limited by my own brain power. So I wanted Perl 6 to be a community design. But one of the first thing that the community said was "We still want you to be the language designer," so I'm still the language designer :-) and I have to understand everything they've proposed, say "Yes" or "No" or "Change this." So that's my big job right now, to weigh all of those proposals they have made for Perl 6.
CL: If you become... as you said, too stupid (sorry for my lack of vocabulary) do you think the project will keep going?
LW: :-) I expect so. There are many people who love Perl dearly and would want to see it advanced whether or not I was in the part of it.
The main problem would probably be if I weren't there to say what was good or what was bad, they probably quarrel, they probably fight over what to do :-)
CL: Perl developers seem to have many meetings offline, including the Perl Conference we are having today. Do you think that helps development of Perl a lot?
LW: Yes, it does. There are things that are difficult to decide over the Internet. Things can be decided very rapidly in a meeting such as "We're going to write Perl 6!" Also, when you meet somebody across the net, you can be friends with them but until you meet them face to face, it's often difficult to really understand how other people think.
So just getting the people in one place and having them helps Perl development. People with similar interests find each other and be able to go out after the meetings for dinner and they can talk over those things. That's just a lot more efficient. So I think face to face meetings are important.
... ... ...
CL: The development of Perl 6 seems to be done in a planned, organized way. Have you been that way until now?
LW: Until now, the process of the design of Perl has been evolutionary. It's been done by prototype and modification over time. I talked about becoming stupid, but I've always been stupid. Fortunately I've been just smart enough to realize that I'm stupid. I'm not smart enough to design the whole thing in a planned way, so I've had to say "Well, let's add this and this looks good, so let's add that," so at each step, I've been able to extend a little bit more. That's been the way through the first five versions of Perl.
Now with Perl 6, we are taking a more organized approach. We're gathering all the proposals that people have made and there are three hundred and sixty one of them. I could read one everyday and it would take me all year and I could take four days off :-) Christmas, Easter, Memorial Day, Labor Day.
So in that sense it's more organized. The proposals themselves which we call RFCs, in a sense, are not very organized at all, because many of them contradict each other, they have all sorts of different ideas. So my job now is to bring them into an organization.
But to me it's actually easier to take someone else's proposal so I can say "Yes", "No", "...sort of." That's easier for me than if I had to come up with everything myself.
... ... ...
CL: Then, do you have any part which you don't like?
LW: Oh, I have a list of things I don't like. That's part of the reason we are doing Perl 6. There are a number of things that are snagging during the five versions, which we as a community are smarter now about and I probably wouldn't put in if I've known back then what I know now.
So there are a number of ways which we can make some simplifications, some of the funny looking global variables can become more object-oriented attached to the appropriate objects such as files or whatever the appropriate object is, some of them can become lexically scoped rather than global.
What we've realized was that although we kept backward compatibility through the first five versions, we now had the technology to do translation. We actually have a compiler back-end that will spit out several different things like C, Java, but particularly, Perl. You can compile it down to a syntax tree and decompile it back to Perl. So we thought if we can translate Perl to Perl, if we can translate Perl 5 to Perl 5, why not translate Perl 5 to Perl 6. So if we can do a translation step here, that frees us up to do a redesign.
And this is like the first chance we've had to do this, a major redesign. Maybe it's our last chance, so we should take it. It's scary, but that's what we've decided to do. It's another experiment, and it may succeed, it may fail, but we're going to do our best.
CL: You often mention about Post Modernism. It's definitely an important and useful idea, but it's an idea born before the 90's. In the next decade, people may have problems we haven't even imagined before. And in order to solve them, they may need a camel to fly or swim, too. I don't know what it is yet, but will Perl 6 have such a new paradigm (born in the 90's)?
LW: Post Modernism was a reaction against Modernism. It came to different realms at different times. It came quite early to music and to literature, and a little later to architecture. And I think it's still coming to computer science. I think computer science, by and large, is still stuck in the Modern age.
Anytime you have a Modern to a Post Modern transition in a particular kind of art or genre, they serve an over reaction to where Modernism is. It's kind of disliked. But really to me the essence of Post Modernism is not anti-Modern. It is sort of at right-angles to what the Modern is, so there has to be a period of time which is sort of deconstruction against the Modern. But it recovers, and comes back to where you can mix together the Modern, the Romantic, the Classical and the Baroque..., however you want to classify the history.
Now, Perl's been a little bit anti-Modern. I think Perl 6 is mixing a little more of the Modern back in, which is a healthier balance. There's people who right now prefer Python or Ruby because of its Modern aspects. We'll also feel it more common in Perl when we get to Perl 6.
CL: So in a sense, not being Post Modern could probably mean being very Post Modern, for languages that came after the 90's (after Perl) that are not designed in the Post Modern (TMTOWTDI) way are being successful, too right now...
LW: Well, one of the very basic ideas of Post Modernism is rejection of arbitrary power structures. Different people are sensitive to different kinds of power structures. Some people see the Post Modern as threat, a different kind of power structure. So, in trying to escape that, they're being Post Modern, but they don't realize it :-) It's so basic to the way we think nowadays. We are so Post Modern that we don't realize how Post Modern we are anymore.
So I think even the people who are still trying to be conservative and to be Modern in computer scientists, are actually signs of Post Modern sensitivities.
CL: Do you fear software patents?
LW: Yes, I worry about some. I think that software patents are a bad idea. Many patents are given for trivial inventions. I think the real problem with software patents is that they don't provide equal protection. If you're a large corporation, you can afford to pay the money to register patents, but if you're an individual like me, you can't.
So I think it really works against the open source movement. I'd rather see them be protective with copyrights and trade secrets, but not patents.
CL: What do you think the future of free software and proprietary software would be like?
LW: I hope that most of the infrastructure will be open source software. We have a word in English, "freeway," which is a road that is not a toll road (which you have to pay to go on). If you want to go to an attraction like Disneyland, you want to drive free roads and pay there. So I think that the things that are infrastructure like roads, electric lines, should stay free, and you pay for the electricity, you pay for the gas.
I think operating systems work best if they're free and open. I think computer languages work best if they are open source. Particular applications are more likely to be proprietary. So, Microsoft Word is maybe a little like Disneyland, you're willing to pay to use that. But the operating system is more like a public road which you should not have to pay to go on.
CL: You mentioned at the first LinuxWorld Expo that the business world and the open source world should have something like sex (that is acceptable and fun for both). What do you think happened to the two after that? Do you think they're having a baby now?
LW: I think things are very disorganized right now. It's hard to know how things are going to come out. There are a lot of people who are interested in open source and there are a lot of companies that are experimenting, IBM, in particular. But whether those experiments will be successful, we really don't know yet. The next a few years are going to be very, very interesting.
CL: You sound as if you're watching a soap opera.
LW: Yeah, it's kind of like a soap opera :-)
CL: Where would you position yourself among those three; free software movement (like Richard Stallman), or open source movement (like Eric Raymond), or Linus (interested in free beer :-)?
LW: I'm really interested in all of those. I suppose more than the other two, I'm probably with Linus. I'm interested in giving Perl away. I want people to use Perl. I want to be a positive ingredient of the world and make my American history. So, whatever it takes to give away my software and get it used, that's great.
That's why I did the dual licensing. One license was to agree with the free software people, and the other license was to agree with the open source people. But those are both means to an end. So, I lean slightly on Linus's direction there.
CL: Linus once said basically that he was a hard-boiled guy and he rejects whatever he thinks should be rejected, considering only the technical side of anything, even if that may make somebody weep or get hurt feelings. How do you manage people in this respect?
LW: I take more of the approach of letting people yell at each other :-) I find if people have enough discussion, people will point out why each other's idea is stupid :-) hopefully in a nice way :-) But it actually helps me because once they've discussed an idea thoroughly, then I may be seeing one or two other things they didn't think of. They would pretty cover all the issues, and it's usually left to me just to be the judge. The way I work is pretty like the Supreme Court. All the lawyers, they prepare for their defenses for one side or the other, and then they just present those and I say "Hmm...," maybe I say "This guy's right," maybe I say "That guy's right," maybe I just throw it to different court to decide again :-)
To me it's important to make the decisions, but also not to make too many decisions. Officially, I'm the dictator. I'm always the dictator because people want me to be, and the reason they want me to be is because I don't actually act like a dictator :-) If I acted like a dictator then they won't want me to be one.
That's my approach. Linus may be a little bit more dictatorial, or at least he would like to think of himself that way and people accept that.
CL: You mentioned before that you think differently from Linus, where Linus preferred to stay out of the Linux business because business brings troubles, but you would rather be close to it. But it turned out to be that Linus had been, in a sense, in the center of Linux business. What did you think when you found that out?
LW: Transmeta, yeah... :-) Well, they're not really in the operating system business, they're in the hardware business, so I think it's still true that Linus is not directly involved in commercializing Linux.
In a sense, I'm not, either, really. ActiveState are really the people who are commercializing Perl, and while O'Reilly makes a lot of money off of Perl by selling Perl books, giving conferences, it's really sort of a side-effect. So both of us have found places where we're sort of on the edge. We don't want to limit some marketplace in how it decides to make use of what we've written. On the other hand, we would like to be close enough to it, so we have some positive influences in how things develop.
CL: Hackers like Linus and Miguel are little younger than hackers at your age, probably including Richard and Eric? Do you see any difference between them?
LW: ...ummmm..... Of course... well, I think that the newer, younger hackers are... I don't know... they're hard to classify... I think they're probably just as diverse as the old hackers are. You know, we're all over the map and they're all over on the different map.
I think maybe, the older hackers, we grew up in a time when most of the software was produced in a corporate framework that assumed they would own their software. So we had to do a lot of the work, sort of on the side, sneakily. In a way, that tended to limit our vision. So we did a lot of small projects and gave them away because if you do a large project the company notices it and you're in trouble.
I think now the younger programmers can afford to have a larger vision. So, you know, something like Gnome or Linux, you can have a vision like that now, and not feel like you're going to get in trouble with it. I think I was lucky to have as large a vision as Perl when I did :-) At that time, you talked about little languages, like "I wrote a little language that does this", or "a little language that does that." For several reasons I said "I want a bigger language" :-)
CL: They sometimes mention "World Domination"...
LW: You know, "Laziness, Impatience and Hubris", that's Hubris :-) We say it jokingly, but there's some elements of truth to it. There's a way in which you have to have both hubris and humility, because hubris itself will not let you be an artist.
To be a good artist, you have to serve the work of art and allow it to be what it is supposed to be. Maybe that's less than what you would like it to be, if you were purely driven by world domination. Linus talks about world domination, but he's not going to turn Linux into Windows in order to do that. He's going to make sure that it stays Linux. So really, Linus is an artist also, Linux is his work of art, and that is more important to him than world domination. He would mind world domination, but that's not his first goal.
CL: Perl is not just a great program, but it also has great documentation base that is online and available for free. But not all developers love to write documentation. Do you see any problem around it?
LW: Well, the approach we took was to make it as easy as possible for the programmers to write documentation. Rather than enforcing them to write documentation in some fancy format, we came up with a very simple way to put the documentation right in a program. They can add the documentation to programs with their ordinary text editor. It's called POD, Plain Old Documentation.
That's been very successful for several reasons. One of them is because it's very easy for the programmers to write, and they can be lazy. And because it makes writing with program itself, it's sort of understood that the program is not complete unless documentation is also there. If you do not have the documentation out here over the side, people can kind of ignore that like "Oh, here's my complete program" and we won't talk about the documentation, but if the documentation is supposed to be right there with it, people will notice that it's not there. In a programming language, you have to clear things ahead of time and if you don't, you're in trouble. Well, if you don't put the documentation, you're sort of culturally in trouble. That works out very well to encourage people to put at least some documentation.
Then we have people who are interested in making sure that documentation is good. They'll take the documentation that other people write maybe is not so good, and they'll make it better. Those people are very valuable also. So a lot of things work together to make that possible. There are things that we could do better, though.
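What Wall describes can be seen in a minimal sketch: documentation lives in the same file as the code, in paragraphs introduced by `=` directives, and the interpreter skips it. The `greet` function and its text are hypothetical, written just to show the POD mechanism.

```perl
use strict;
use warnings;

=head1 NAME

greet - print a greeting (a hypothetical example script)

=head1 DESCRIPTION

Everything between a =directive and =cut is Plain Old Documentation:
readable with the perldoc tool, but ignored by the Perl interpreter,
so the docs travel in the same file as the code they describe.

=cut

sub greet {
    my ($who) = @_;
    return "Hello, $who!";
}

print greet('world'), "\n";
```

Running `perldoc` on such a file extracts and formats only the POD sections, which is why a missing documentation block is immediately conspicuous, exactly the cultural pressure Wall mentions.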
Larry Wall Approves Re-Naming Perl 6 To Raku - Slashdot
Perl's State of the Onion 10 - Slashdot
perl-fortunes-larry-wall at master · thibaultduponchelle/perl-fortunes · GitHub
The Last but not Least. Technology is dominated by two types of people: those who understand what they do not manage and those who manage what they do not understand ~ Archibald Putt, Ph.D.
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops, like a living tree...
|You can use PayPal to buy a cup of coffee for the authors of this site
Last modified: September 08, 2020