eval command

The internal eval command interprets and expands a command line before the shell interprets and expands it. Essentially, it lets you construct commands or statements dynamically and then execute them. This feature is typical of all scripting languages and is one of the most powerful. Among other things, it lets you perform variable indirection, build command lines (including pipes and I/O redirection) at run time, and simulate arrays and other data structures in shells that lack them.

The eval command is similar to the dot (.) command. The dot command is mainly used for executing static code stored in a file, while eval is used for executing dynamically generated code.
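
A minimal sketch of the difference (the file name settings.sh is purely illustrative):

. ./settings.sh      # dot: execute static code stored in a file
cmd="ls -l /tmp"     # eval: build the code as a string at run time ...
eval "$cmd"          # ... and execute it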

The general format of the eval command is very simple:

     eval [ command_to_be_interpreted... ]

It accepts no options whatsoever, which is pretty rare. The shell expands the arguments to eval using standard command processing rules, then joins them into a single space-separated string. That string is read as a command line, processed by the shell a second time, and executed.
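
As a minimal sketch of that second pass (the names are purely illustrative), a command separator stored in a variable is only recognized after eval forces the rescan:

cmds='date; hostname'
$cmds         # fails: the shell looks for a command literally named "date;"
eval $cmds    # works: the rescan lets the shell see ";" as a command separator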

The following example illustrates how you can use eval to turn the value of an expression into a reference to another variable:

     eval last_arg='$'{$#}    # assigns the last positional parameter to last_arg
Here is another example, somewhat artificial but still useful for understanding eval:

X=10
Y=X
echo '$'$Y        # prints $X
eval echo '$'$Y   # prints 10

Before most Unix shells got one-dimensional arrays, probably the most popular use of the eval command was imitating arrays in the shell:

The following loop:

a=0
while read line
do
   a=`expr $a + 1`
   eval "x$a=\$line"     # escaping $line defers its expansion to eval's second pass
done < /etc/group        # redirecting input avoids the subshell a pipeline would create in most shells

sets the variables x1, x2, x3, and so on to the lines read. Once this has been done, you can use $x1, $x2, and so on in any shell construct, or construct their names dynamically to get at the data. For example:

for a in 3 2 1
do
   eval line=\$x$a      # line gets the value of x3, x2, x1 in turn
   echo "$line"
done

prints the first three lines of the file in reverse order.

See also Example 15-12 below. Again, the key idea is dynamic, "on the fly" generation of code. Good practical examples of such code generation are usually fairly complex; the best simple example I have found comes from the O'Reilly book Learning the Korn Shell:

Earlier, we constructed a simple pipeline that sorts a file and prints out the first N lines, where N defaults to 10. The resulting pipeline was:

sort -nr $1 | head -${2:-10}

The first argument specifies the file to sort; $2 is the number of lines to print.

Now suppose we change the task just a bit so that the default is to print the entire file instead of 10 lines. This means that we don't want to use head at all in the default case. We could do this in the following way:

if [[ -n $2 ]]; then
    sort -nr $1 | head -$2
else
    sort -nr $1
fi

In other words, we decide which pipeline to run according to whether or not $2 is null. But here is a more compact solution:

eval sort -nr \$1 ${2:+"| head -\$2"}

The last expression in this line evaluates to the string | head -\$2 if $2 exists (is not null); if $2 is null, then the expression is null too. We backslash-escape dollar signs (\$) before variable names to prevent unpredictable results if the variables' values contain special characters like > or |. The backslash effectively puts off the variables' evaluation until the eval command itself runs. So the entire line is either:

eval sort -nr \$1 | head -\$2

if $2 is given or:

eval sort -nr \$1

if $2 is null. Once again, we can't just run this command without eval because the pipe is "uncovered" after the shell tries to break the line up into commands. eval causes the shell to run the correct pipeline when $2 is given.

Examples

Example 1: Performing Variable Indirection

This example uses the eval utility to perform variable indirection. The first part of this example shows what happens when the eval utility is not used. The value 10 is assigned to the variable named abc and the value abc is assigned to the variable named x. Next, the variable named y is assigned the value of the variable x prefixed by $ or, in this case, the value $abc.

abc=10 x=abc
y='$'$x
echo $y

The echo command produces the following output:

$abc

You can instead use eval to perform the assignment to the variable named y, as in:

eval y='$'$x
echo $y

In this case, the eval utility expands the assignment to y=$abc and passes that to the shell for further expansion, resulting in y having the value of the variable named abc. This means that the echo command produces the following output:

10


Checking the last parameter in the list

     set One Two Three Four
     if eval [ ! -f \${$#} ]
     then echo "$4: last argument must be a file!"
        # exit 1
     fi

This code checks that the last argument on the command line is the name of an existing file. The first line sets the positional parameters to One Two Three Four. Since Four is not the name of a file, the code prints the message about the last argument. The exit is commented out so that you are not logged off when you run this at an interactive shell prompt.

Tuesday's Tips for Unix Shell Scripts

The randstr function selects one of its arguments at random and puts it in the variable $_RETVAL:

randstr() {
    [ $# -eq 0 ] && return 1
    n=$(( ($RANDOM % $#) + 1 ))
    eval _RETVAL=\${$n}
}
   

For example, to pick a card at random:

randstr diamonds hearts clubs spades
suit=$_RETVAL
randstr Ace 2 3 4 5 6 7 8 9 10 Jack Queen King
card="$_RETVAL of $suit"
echo $card

Examples from Advanced Bash-Scripting Guide (Internal Commands and Builtins):

eval
eval arg1 [arg2] ... [argN]

Combines the arguments in an expression or list of expressions and evaluates them. Any variables within the expression are expanded. The net result is to convert a string into a command.

Tip: The eval command can be used for code generation from the command line or within a script.
bash$ command_string="ps ax"
bash$ process="ps ax"
bash$ eval "$command_string" | grep "$process"
26973 pts/3    R+     0:00 grep --color ps ax
26974 pts/3    R+     0:00 ps ax
 

Each invocation of eval forces a re-evaluation of its arguments.

a='$b'
b='$c'
c=d

echo $a             # $b
                    # First level.
eval echo $a        # $c
                    # Second level.
eval eval echo $a   # d
                    # Third level.

# Thank you, E. Choroba.
 

Example 15-11. Showing the effect of eval

#!/bin/bash
# Exercising "eval" ...

y=`eval ls -l`  #  Similar to y=`ls -l`
echo $y         #+ but linefeeds removed because "echoed" variable is unquoted.
echo
echo "$y"       #  Linefeeds preserved when variable is quoted.

echo; echo

y=`eval df`     #  Similar to y=`df`
echo $y         #+ but linefeeds removed.

#  When LF's not preserved, it may make it easier to parse output,
#+ using utilities such as "awk".

echo
echo "==========================================================="
echo

eval "`seq 3 | sed -e 's/.*/echo var&=ABCDEFGHIJ/'`"
# var1=ABCDEFGHIJ
# var2=ABCDEFGHIJ
# var3=ABCDEFGHIJ

echo
echo "==========================================================="
echo


# Now, showing how to do something useful with "eval" . . .
# (Thank you, E. Choroba!)

version=3.4     #  Can we split the version into major and minor
                #+ part in one command?
echo "version = $version"
eval major=${version/./;minor=}     #  Replaces '.' in version by ';minor='
                                    #  The substitution yields '3; minor=4'
                                    #+ so eval does minor=4, major=3
echo Major: $major, minor: $minor   #  Major: 3, minor: 4

Example 15-12. Using eval to select among variables

#!/bin/bash
# arr-choice.sh

#  Passing arguments to a function to select
#+ one particular variable out of a group.

arr0=( 10 11 12 13 14 15 )
arr1=( 20 21 22 23 24 25 )
arr2=( 30 31 32 33 34 35 )
#       0  1  2  3  4  5      Element number (zero-indexed)


choose_array ()
{
  eval array_member=\${arr${array_number}[element_number]}
  #                 ^       ^^^^^^^^^^^^
  #  Using eval to construct the name of a variable,
  #+ in this particular case, an array name.

  echo "Element $element_number of array $array_number is $array_member"
} #  Function can be rewritten to take parameters.

array_number=0    # First array.
element_number=3
choose_array      # 13

array_number=2    # Third array.
element_number=4
choose_array      # 34

array_number=3    # Null array (arr3 not allocated).
element_number=4
choose_array      # (null)

# Thank you, Antonio Macchi, for pointing this out.

Example 15-13. Echoing the command-line parameters

#!/bin/bash
# echo-params.sh

# Call this script with a few command-line parameters.
# For example:
#     sh echo-params.sh first second third fourth fifth

params=$#              # Number of command-line parameters.
param=1                # Start at first command-line param.

while [ "$param" -le "$params" ]
do
  echo -n "Command-line parameter "
  echo -n \$$param     #  Gives only the *name* of variable.
#         ^^^          #  $1, $2, $3, etc.
                       #  Why?
                       #  \$ escapes the first "$"
                       #+ so it echoes literally,
                       #+ and $param dereferences "$param" . . .
                       #+ . . . as expected.
  echo -n " = "
  eval echo \$$param   #  Gives the *value* of variable.
# ^^^^      ^^^        #  The "eval" forces the *evaluation*
                       #+ of \$$
                       #+ as an indirect variable reference.

(( param ++ ))         # On to the next.
done

exit $?

# =================================================

$ sh echo-params.sh first second third fourth fifth
Command-line parameter $1 = first
Command-line parameter $2 = second
Command-line parameter $3 = third
Command-line parameter $4 = fourth
Command-line parameter $5 = fifth

Example 15-14. Forcing a log-off

#!/bin/bash
# Killing ppp to force a log-off.
# For dialup connection, of course.

# Script should be run as root user.

SERPORT=ttyS3
#  Depending on the hardware and even the kernel version,
#+ the modem port on your machine may be different --
#+ /dev/ttyS1 or /dev/ttyS2.


killppp="eval kill -9 `ps ax | awk '/ppp/ { print $1 }'`"
#                     -------- process ID of ppp -------  

$killppp                     # This variable is now a command.


# The following operations must be done as root user.

chmod 666 /dev/$SERPORT      # Restore r+w permissions, or else what?
#  Since doing a SIGKILL on ppp changed the permissions on the serial port,
#+ we restore permissions to previous state.

rm /var/lock/LCK..$SERPORT   # Remove the serial port lock file. Why?

exit $?

# Exercises:
# ---------
# 1) Have script check whether root user is invoking it.
# 2) Do a check on whether the process to be killed
#+   is actually running before attempting to kill it.   
# 3) Write an alternate version of this script based on 'fuser':
#+      if [ fuser -s /dev/modem ]; then . . .

Example 15-15. A version of rot13

#!/bin/bash
# A version of "rot13" using 'eval'.
# Compare to "rot13.sh" example.

setvar_rot_13()              # "rot13" scrambling
{
  local varname=$1 varvalue=$2
  eval $varname='$(echo "$varvalue" | tr a-z n-za-m)'
}


setvar_rot_13 var "foobar"   # Run "foobar" through rot13.
echo $var                    # sbbone

setvar_rot_13 var "$var"     # Run "sbbone" through rot13.
                             # Back to original variable.
echo $var                    # foobar

# This example by Stephane Chazelas.
# Modified by document author.

exit 0

The eval command also appears in the older form of indirect variable referencing:

eval var=\$$var
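
A small sketch comparing the old eval form with the newer bash ${!name} indirect-reference syntax (the variable names are purely illustrative):

color=red
var=color
eval value=\$$var    # old style: value is now "red"
echo ${!var}         # newer bash syntax: also prints "red"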
 
Caution: The eval command can be risky and should normally be avoided when a reasonable alternative exists. eval $COMMANDS executes the contents of COMMANDS, which may contain such unpleasant surprises as rm -rf *. Running eval on unfamiliar code written by persons unknown is living dangerously.

Using the eval Builtin for Data Structures, Arrays, and Indirection

One of the more under-appreciated commands in shell scripting is the eval builtin. The eval builtin takes a series of arguments, concatenates them into a single command, then executes it.

For example, the following script assigns the value 3 to the variable X and then prints the value:

#!/bin/sh
eval X=3
echo $X

For such simple examples, the eval builtin is superfluous. However, the behavior of the eval builtin becomes much more interesting when you need to construct or choose variable names programmatically. For example, the next script also assigns the value 3 to the variable X:

#!/bin/sh
 
VARIABLE="X"
eval $VARIABLE=3
echo $X

When the eval builtin evaluates its arguments, it does so in two steps. In the first step, variables are replaced by their values. In the preceding example, the letter X is inserted in place of $VARIABLE. Thus, the result of the first step is the following string:

X=3

In the second step, the eval builtin executes the statement generated by the first step, thus assigning the value 3 to the variable X. As further proof, the echo statement at the end of the script prints the value 3.

The eval builtin can be particularly convenient as a substitute for arrays in shell script programming. It can also be used to provide a level of indirection, much like pointers in C. Some examples of the eval builtin are included in the sections that follow.

A Complex Example: Setting and Printing Values of Arbitrary Variables

The next example takes user input, constructs a variable based on the value entered using eval, then prints the value stored in the resulting variable.

#!/bin/sh
echo "Enter variable name and value separated by a space"
read VARIABLE VALUE
echo Assigning the value $VALUE to variable $VARIABLE
eval $VARIABLE=$VALUE
# print the value
eval echo "$"$VARIABLE
# export the value
eval export $VARIABLE
 
# print the exported variables.
export
Warning: This script executes arbitrary user input. It is intended only as an example of the usage of the eval builtin. In real-world code, you should never pass unsanitized user input directly to eval because doing so can provide a vector for arbitrary code execution.

 

Run this script and type something like MYVAR 33. The script assigns the value 33 to the variable MYVAR (or whatever variable name you entered).

You should notice that the echo command has an additional dollar sign ($) in quotes. The first time the eval builtin parses the string, the quoted dollar sign is simplified to merely a dollar sign. You could also surround this dollar sign with single quotes or quote it with a backslash, as described in “Quoting Special Characters.” The result is the same.

Thus, the statement:

eval echo "$"$VARIABLE

evaluates to:

echo $MYVAR
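
A small sketch of the equivalent quoting styles (assuming VARIABLE holds the name MYVAR, as above); each one defers the first dollar sign to eval's second pass:

eval echo "$"$VARIABLE
eval echo '$'$VARIABLE
eval echo \$$VARIABLE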
Note: If you forget to quote the first dollar sign, you get a very strange result. The variable $$ is a special shell variable that contains the process ID of the current shell. Thus, without quoting the first dollar sign, the two dollar signs are interpreted as that variable, and the statement evaluates to something like:

 

echo 1492MYVAR
This is probably not what you want.

 

A Practical Example: Using eval to Simulate an Array

In “Shell Variables and Printing,” you learned how to read variables from standard input. This was limited to some degree by the inability to read an unknown number of user-entered values.

The script below solves this problem using eval by creating a series of variables to hold the values of a simulated array.

#!/bin/sh
COUNTER=0
VALUE="-1"
echo "Enter a series of lines of test.  Enter a blank line to end."
while [ "x$VALUE" != "x" ] ; do
        read VALUE
        eval "ARRAY_$COUNTER=\$VALUE"   # \$VALUE defers expansion so lines containing spaces are stored intact
        eval export ARRAY_$COUNTER
        COUNTER=$(expr $COUNTER '+' 1) # More on this in Paint by Numbers
done
COUNTER=$(expr $COUNTER '-' 1) # Subtract one for the blank value at the end.
 
# print the exported variables.
COUNTERB=0;
 
echo "Printing values."
while [ $COUNTERB -lt $COUNTER ] ; do
        echo "ARRAY[$COUNTERB] = $(eval echo "$"ARRAY_$COUNTERB)"
        COUNTERB=$(expr $COUNTERB '+' 1) # More on this in Paint by Numbers
done

This same technique can be used for splitting an unknown number of input values in a single line as shown in the next listing:

#!/bin/sh
COUNTER=0
VALUE="-1"
echo "Enter a series of lines of numbers separated by spaces."
read LIST
IFS=" "
for VALUE in $LIST ; do
        eval ARRAY_$COUNTER=$VALUE
        eval export ARRAY_$COUNTER
        COUNTER=$(expr $COUNTER '+' 1) # More on this in Paint by Numbers
done
 
# print the exported variables.
COUNTERB=0;
 
echo "Printing values."
while [ $COUNTERB -lt $COUNTER ] ; do
        echo "ARRAY[$COUNTERB] = $(eval echo '$'ARRAY_$COUNTERB)"
        COUNTERB=$(expr $COUNTERB '+' 1) # More on this in Paint by Numbers
done

A Data Structure Example: Linked Lists

In a complex shell script, you may need to keep track of multiple pieces of data and treat them like a data structure. The eval builtin makes this easy. Your code needs to pass around only a single name from which you build other variable names to represent fields in the structure.

Similarly, you can use the eval builtin to provide a level of indirection similar to pointers in C.

For example, the following script manually constructs a linked list with three items, then walks the list:

#!/bin/sh
VAR1_VALUE="7"
VAR1_NEXT="VAR2"
 
VAR2_VALUE="11"
VAR2_NEXT="VAR3"
 
VAR3_VALUE="42"
 
HEAD="VAR1"
POS=$HEAD
while [ "x$POS" != "x" ] ; do
        echo "POS: $POS"
        VALUE="$(eval echo '$'$POS'_VALUE')"
        echo "VALUE: $VALUE"
        POS="$(eval echo '$'$POS'_NEXT')"
done

Using this technique, you could conceivably construct any data structure that you need (with the caveat that manipulating large data structures in shell scripts is generally not conducive to good performance).
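
In the same spirit, here is a small sketch of a record with named fields built from a single name that is passed around (the variable and field names are invented for illustration):

#!/bin/sh
record="EMP1"
eval ${record}_NAME="Alice"
eval ${record}_ROLE="admin"

print_record()
{
    eval echo "Name: \$${1}_NAME, Role: \$${1}_ROLE"
}

print_record "$record"       # Name: Alice, Role: admin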



[Jul 30, 2011] Bash: Preserving Whitespace Using set and eval (Linux Journal)

The important line is:
  eval set -- $items

The set command takes any arguments after the options (here "--" signals the end of the options) and assigns them to the positional parameters ($1..$n). The eval command executes its arguments as a bash command.
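
A minimal sketch of the idea (the variable name items and its contents are just an illustration): each item is wrapped in single quotes inside the string, and the eval rescan lets set see those quotes, so whitespace inside the items survives.

items="'first item' 'second item' 'third item'"
eval set -- $items
echo $#       # 3 -- three positional parameters
echo "$2"     # second item (internal whitespace preserved)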

eval: When You Need Another Chance, by Mike Loukides (O'Reilly Media)

02/10/2000 | LinuxDevCenter.com

Here's a more realistic example; you see code like this fairly often in Bourne shell scripts:

...
command='grep $grepopts $searchstring $file'
for opt
do
   case "$opt" in
      file) output=' > $ofile' ;;
      read) output=' | more'   ;;
      sort) postproc=' | sort $sortopts';;
   esac
done
...
eval $command $postproc $output

Do you see what's happening? We're constructing a command that will look something like:

grep $grepopts $searchstring $file | sort $sortopts > $ofile

But the entire command is "hidden" in shell variables, including the I/O redirectors and various options. If the eval isn't there, this command will blow up in all sorts of bizarre ways. You'll see messages like | not found, because variable expansion occurs after output redirection. The "nested" variables (like $ofile, which is used inside of $output) won't be expanded either, so you'll also see $ofile not found. Putting an eval in front of the command forces the shell to process the line again, guaranteeing that the variables will be expanded properly and that I/O redirection will take place.

eval is incredibly useful if you have shell variables that include other shell variables, shell variables that include aliases, shell variables that include I/O redirectors, or all sorts of other perversities. It's commonly used within shell scripts to "evaluate" commands that are built up during execution.

The eval Command

This section describes another of the more unusual commands in the shell: eval. Its format is as follows:

eval command-line

where command-line is a normal command line that you would type at the terminal. When you put eval in front of it, however, the net effect is that the shell scans the command line twice before executing it.[1] For the simple case, this really has no effect:

[1] Actually, what happens is that eval simply executes the command passed to it as arguments; so the shell processes the command line when passing the arguments to eval, and then once again when eval executes the command. The net result is that the command line is scanned twice by the shell.

$ eval echo hello
hello
$

But consider the following example without the use of eval:

$ pipe="|"
$ ls $pipe wc -l
|: No such file or directory
wc: No such file or directory
-l: No such file or directory
$

Those errors come from ls. The shell takes care of pipes and I/O redirection before variable substitution, so it never recognizes the pipe symbol inside pipe. The result is that the three arguments |, wc, and -l are passed to ls as arguments.

Putting eval in front of the command sequence gives the desired results:

$ eval ls $pipe wc -l
     16
$

The first time the shell scans the command line, it substitutes | as the value of pipe. Then eval causes it to rescan the line, at which point the | is recognized by the shell as the pipe symbol.

The eval command is frequently used in shell programs that build up command lines inside one or more variables. If the variables contain any characters that must be seen by the shell directly on the command line (that is, not as the result of substitution), eval can be useful. Command terminators (;, |, &), I/O redirection (<, >), and quote characters are among the characters that must appear directly on the command line to have any special meaning to the shell.
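
For instance, a small sketch with an I/O redirection kept in a variable (the file name is made up):

$ redir='> /tmp/listing'
$ ls $redir          # fails: ">" and "/tmp/listing" are passed to ls as ordinary arguments
$ eval ls $redir     # works: the second scan lets the shell treat ">" as redirection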

Sys Admin: Miscellaneous Unix Tips - Answering Novice Shell Questions

A common eval use is to build a dynamic string containing valid Unix commands and then use eval to execute the string. Why do we need eval? Often, you can build a command that doesn't require eval:
evalstr="myexecutable"
$evalstr   # execute the command string
However, chances are the above command won't work if "myexecutable" requires command-line arguments. That's where eval comes in.

Our man page says that the arguments to the eval command are "read as input to the shell and the resulting commands executed". What does that mean? Think of it as the eval command forcing a second pass so the string's arguments become the arguments of the spawned child shell.

In a previous column, we built a dynamic sed command that skipped 3 header lines, printed 5 lines, and skipped 3 more lines until the end of the file:

evalstr="sed -n '4,\${p;n;p;n;p;n;p;n;p;n;n;n;}' data.file"
eval $evalstr  # execute the command string
This command fails without eval. When the sed command executes in the child shell, eval forces the remainder of the string to become arguments to the child.

Possibly the coolest eval use is building dynamic Unix shell variables. The following stub script dynamically creates shell variables user1 and user2 setting them equal to the strings John and Ed, respectively:

COUNT=1
eval user${COUNT}=John
echo $user1

COUNT=2
eval user${COUNT}=Ed
echo $user2