Bash as a scripting language


Note: this page is partially based on Nikolai Bezroukov's lectures on this topic in 2013 and 2017, which in turn used Linux Shell Scripting with Bash by Ken O. Burtch (2004) as a textbook. See Best books about Bash and Korn Shell for book recommendations.


Introduction

It is important to separate Bash as a programming language from Bash's role as a command interpreter in Unix/Linux. Of course the two intersect, and the environment affects what you can do and how you can do it. But still...

Shell scripts remain a staple of the Linux world. Despite being one of the oldest languages in use, shell is not dying. Even Perl, which is definitely a better tool for complex scripts, failed to dislodge it. Many scripts automate processes that are simply a matter of following a series of steps, after which the results are written to disk. This is the very type of activity scripts are most convenient to handle.

Each script is a regular Unix file and as such is identified by a name. For practicality, script names should not exceed 32 characters and usually consist of lowercase characters, underscores, minus signs, and periods. Spaces and punctuation symbols are permitted, but it is bad practice to use them.

Filenames do not require a suffix to identify their contents but, following the tradition established by MS-DOS, suffixes are often used to help identify the type of a file. By convention the suffix for scripts is .sh. Other common suffixes include

.txt— A generic text file
.log— A log file
.html— An HTML Web page
.tgz (or .tar.gz)— Compressed file archive

Commands usually have no suffixes.

Shell scripts, text files, and executable commands are collectively referred to as regular files. They contain data that can be read or instructions that can be executed. There are also files that are not regular, such as directories or named pipes; they contain unique data or have special behaviors when they are accessed.
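You can see which kind of file you are dealing with from the first character of ls -l output: a dash marks a regular file, d a directory, c or b a device, p a named pipe. A sketch of such a session (the file names and permission details are illustrative):

$ ls -ld hello.sh /tmp /dev/null
-rwxr-xr-x ... hello.sh     # leading "-": regular file
drwxrwxrwt ... /tmp         # leading "d": directory
crw-rw-rw- ... /dev/null    # leading "c": character device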

PART 1

Creating a Script

By convention, Bash shell scripts have names ending with .sh. For example

cat hello.sh
#!/bin/bash
# hello.sh
# This is my first shell script
# Joe User 
# Jul 6, 2017

printf '%s\n' "Hello world!"
exit 0 

Lines beginning with number signs (#) are comments. They are notes to the reader and do not affect the execution of a script. Everything between the number sign and the end of the line is effectively ignored by Bash. Comments should always be informative, describing what the script is trying to accomplish, not a blow-by-blow recount of what each command is doing. Too many scripts have no comments at all, or bad, uninformative ones. Their chances to survive are much lower than those of well-commented scripts. Clear and informative comments help to troubleshoot and debug obscure problems.

The very first line of a script is the header line. This line begins with #! at the top of the script, flush with the left margin. This character combination identifies the kind of script. Linux uses this information to start the right program to run the script. For Bash scripts, this line is the absolute pathname indicating where the Bash interpreter resides. On most Linux distributions, the first header line is as follows

#!/bin/bash

If you don't know the location of the Bash shell, use the which or whereis command to find it:

which bash
/bin/bash

It is a good practice to provide the name of the author and the purpose of the script in the first two lines after the line #!/bin/bash

#!/bin/bash
# Compress old files if nobody is on the computer
# John A Doer
... ... ... 

If I do not see such lines in a script my first reaction is -- the script is written by a clueless amateur.

The Bash header line is followed by comments describing the purpose of the script and who wrote it. Some comments might be inserted by a version control system such as Git, which we will discuss later.

Next, after variable declarations, you might wish to specify some options via the shopt command; these are instructions to the Bash interpreter on how to process your script. For example:
shopt -s -o nounset 

This command detects some spelling mistakes by reporting undefined variables.

You can execute this script by invoking the bash command:

bash hello.sh

If you invoke bash without an argument specifying a script file, a new interactive shell is launched. To exit the new shell and return to your previous session, issue the exit command.

If the hello.sh file were in a directory other than the current working directory, you'd have to type an absolute path, for example:

bash /home/bill/hello.sh

You can make it a bit easier to execute the script by changing its access mode to include execute access. To do so, issue the following command:

chmod ugo+x hello.sh

This gives you, members of your group, and everyone else the ability to execute the file. To do so, simply type the absolute path of the file, for example:

/home/bill/hello.sh

If the file is in the current directory, you can issue the following command:

./hello.sh

You may wonder why you can't simply issue the command:

hello.sh

In fact, this still simpler form of the command will work, so long as hello.sh resides in a directory on your search path. You'll learn about the search path later.

Structural elements of bash script

A Bash script consists of several types of elements: comments, identifiers, literals, and keywords. These elements define the so-called "lexical level" of the language.

Comments exist purely to facilitate human understanding and are discarded by the interpreter. The only exception is the pseudo-comment at the beginning of the script that starts with "#!".

Identifiers are unquoted strings starting with a letter. They can contain underscores and digits as well; they cannot start with a digit, though.

Numeric literals can be either integer or real.

String literals are strings of characters enclosed in either single quotes or double quotes. A single-quoted string cannot contain a single quote (not even a backslash-escaped one); a double-quoted string can contain a double quote if it is escaped with a backslash.

Bash Keywords

A keyword is an identifier which has a special meaning in the Bash language: keywords represent directives to the interpreter. As such they are distinct from variables: they do not have any value and are treated "literally", much like punctuation signs such as . , : , +, etc. The following symbols and words provide some examples (strictly speaking, some of them, such as echo, printf, declare, and read, are built-in commands rather than reserved words):

if then else elif fi
for do in done break
while until function return exit
case esac echo printf
declare read time type

To keep scripts understandable, keywords should never be used for variable names.

Quoted Strings

By enclosing a command argument within single quotes, you can prevent the shell from expanding any special characters inside this string.

To see this in action, consider how you might cause the echo command to produce the output $PATH. If you simply issue the command:

echo $PATH

the echo command will print the value of the PATH shell variable. However, by enclosing the argument within single quotes, you obtain the desired result:

echo '$PATH'

Double quotes permit the expansion of shell variables.

Back quotes operate differently; they let you execute a command and use its output as an argument of another command. For example, the command:

dir_listing=`ls`

assigns the output of the ls command to the variable dir_listing. (In modern Bash the equivalent, nestable form dir_listing=$(ls) is preferred.)

Bash Variables

Shell variables are widely used within shell scripts, because they provide a convenient way of transferring values from one command to another. Programs can obtain the value of a shell variable and use the value to modify their operation, in much the same way they use the value of command-line arguments.

There are two major types of Bash variables: numeric (integers; Bash has no built-in floating-point arithmetic) and strings. If a variable is not previously declared as numeric, it is assumed to be a string. In other words, the default is string.

A Bash variable is written with a dollar sign ($) in front of its name where its value is used, for example on the right side of an assignment statement, and without it where it receives a value: on the left side of an assignment statement, or in a read statement.

Note: in Bash you should NOT use the dollar sign in front of the variable to the left of the assignment sign. That's an important difference from many other scripting languages.

You can declare integer variables. To declare that a variable should accept only numeric values (integers), use the following statement:
declare -i varname

If a variable is declared as integer, you can perform arithmetic operations on it:

#!/bin/bash
declare -i count
count=12
count=$count+1
printf "%d\n" $count

The declare statement has some other options and can be used to declare an array. All variables can be used as arrays without explicit definition. As a matter of fact, it appears that in a sense, all variables are arrays, and that assignment without a subscript is the same as assigning to "[0]". Consider the following script:

#!/bin/bash

a=12
echo ${a[0]}
b[0]=13
echo $b

When run it produces:

$ bash arr.sh
12
13

For further options, see the bash man page (search for "^SHELL BUILTINS", then search for "declare").

echo and print statements

The echo command simply prints text on the console. Its -n option causes omission of the trailing newline character normally written by echo, so two consecutive echo commands can write their text on a single line.

The built-in printf (print formatted) command prints a message to the screen. It is an upgraded version of the echo command, which has existed in the shell since the very beginning. printf provides better control over output than echo and should be used instead of it. echo can be emulated using an alias or a function.

For example

alias echo='printf "%s\n" '

Bash printf is very similar to the C standard I/O printf() function, but they are not identical. In particular, single- and double-quoted strings are treated differently in shell scripts than in C programs.

The first parameter is a format string describing how the items being printed will be represented. For example, the special formatting code "%d" represents an integer number, and the code "%f" represents a floating-point number.

$ printf "%d\n" 5
5
$ printf "%f\n" 5
5.000000

Include a format code for each item you want to print. Each format code is replaced with the appropriate value when printed. Any characters in the format string that are not part of a formatting instruction are treated as printable characters.

$ printf "There are %d customers with purchases over %d.\n" 50 20000
There are 50 customers with purchases over 20000.

printf is sometimes used to redirect a variable or some unchanging input to a command. For example, suppose all you want to do is pipe a variable to a command. Instead of using printf, Bash provides a shortcut <<< redirection operator. <<< redirects a string into a command as if it were piped using printf.

The tr command can convert text to uppercase. This example shows an error message being converted to uppercase with both printf and <<<.

$ printf "%s\n" "$ERRMSG" | tr [:lower:] [:upper:]
WARNING: THE FILES FROM THE OHIO OFFICE HAVEN'T ARRIVED.
$ tr [:lower:] [:upper:] <<< "$ERRMSG"
WARNING: THE FILES FROM THE OHIO OFFICE HAVEN'T ARRIVED.

The most commonly used format codes include %d (an integer), %s (a string), %f (a floating-point number), %x and %X (hexadecimal), %o (octal), %e (scientific notation), %c (a single character), and %% (a literal percent sign).

If a number is too large, Bash reports an out-of-range error.

$ printf "%d\n" 123456789123456789012
bash: printf: warning: 123456789123456789012: Numerical result out of range

For compatibility with C's printf, Bash also recognizes the %i format code and treats it the same as %d.

Also for C compatibility, you can preface the format codes with an l or L to indicate a long number.

The %q format is important in shell script programming and it is discussed in the quoting section, in the Chapter 5, "Variables."

To create reports with neat columns, a number can precede many of the formatting codes to indicate the width of a column. For example, "%10d" prints a signed number in a column 10 characters wide.

$ printf "%10d\n" 11
        11

Likewise, a negative number left-justifies the columns.

$ printf "%-10d %-10d\n" 11 12
11         12

A number with a decimal point represents a column width and a minimum number of digits (or decimal places with floating-point values). For example, "%10.5f" indicates a floating-point number in a 10-character column with a minimum of five decimal places.

$ printf "%10.5f\n" 17.2
  17.20000

Finally, an apostrophe flag (as in "%'d") displays the number with thousands groupings based on the current locale.

The \n in the format string is an example of a backslash code for representing unprintable characters; it indicates that a new line should be started. There are special backslash formatting codes for the representation of other unprintable characters as well.

$ printf "Two separate\nlines\n"
Two separate
lines

Any 8-bit byte or ASCII character can be represented by \0 followed by its octal value.

$ printf "ASCII 65 (octal 101) is the character \0101\n"
ASCII 65 (octal 101) is the character A

printf recognizes numbers beginning with a zero as octal notation, and numbers beginning with 0x as hexadecimal notation. As a result, printf can convert numbers between these different notations.

$ printf "%d\n" 010
8
$ printf "%d\n " 0xF
15
$ printf "0x%X\n " 15
0xF
$ printf "0%o\n " 8
010

Most Linux distributions also have a separate printf command (typically /usr/bin/printf), to comply with the POSIX standard.

More about shell variables

There are multiple system variables in the shell: variables that are set by the system. PATH is one example. Of course, you can also set it yourself:

PATH=/usr/bin:/usr/sbin:/usr/local/bin

By default, shell variables are typeless and can have both arithmetic and non-numeric values.

You can see a list of system variables in your environment by issuing the env command. Usually, the command produces more than a single screen of output. So, you can use a pipe redirector and the more command to view the output one screen at a time:

env | more

Press the Space bar to see each successive page of output. You'll probably see several common shell variables, such as HOME, PATH, PS1, PWD, and SHELL.

You can use the value of a shell variable in a command by preceding the name of the shell variable with a dollar sign ($). To avoid confusion with surrounding text, you can enclose the name of the shell variable within curly braces ({}). For example, you can change the current working directory to your home directory by issuing the command:

cd $HOME

An easy way to see the value of a shell variable is to specify the variable as the argument of the echo command. For example, to see the value of the PATH shell variable, issue the command:

echo $PATH

To make the value of a shell variable available not just to the shell, but to programs invoked by using the shell, you must export the shell variable. To do so, use the export command, which has the form:

export variable

where variable specifies the name of the variable to be exported. A shorthand form of the command lets you assign a value to a shell variable and export the variable in a single command:

export variable=value

You can remove the value associated with a shell variable by giving the variable an empty value:

variable=

However, a shell variable with an empty value remains a shell variable and appears in the output of the set command. To dispense with a shell variable, you can issue the unset command:

unset variable

Once you unset the value of a variable, the variable no longer appears in the output of the set command.
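A short session illustrating the whole lifecycle (BACKUP_DIR is an arbitrary example name):

$ BACKUP_DIR=/var/backups    # shell variable; programs started from the shell don't see it
$ export BACKUP_DIR          # now exported: visible to programs the shell starts
$ BACKUP_DIR=                # empty value, but the variable still appears in set's output
$ unset BACKUP_DIR           # now it is gone entirely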

The Search Path

The special shell variable PATH holds a series of paths known collectively as the search path. This is a very important variable and that's why we will study it separately. Whenever you issue an external command, the shell searches paths that comprise the search path, seeking the program file that corresponds to the command. The startup scripts establish the initial value of the PATH shell variable, but you can modify its value to include any desired series of paths. You must use a colon (:) to separate each path of the search path.

For example, suppose that PATH has the following value:

/usr/bin:/bin:/usr/local/bin:/usr/bin/X11:/usr/X11R6/bin

You can add a new search directory, say /opt/bin, with the following command:

PATH=$PATH:/opt/bin/

Now, the shell will look for external programs in /opt/bin/ as well as the default directories. However, it will look there last. If you prefer to check /opt/bin first, issue the following command instead:

PATH=/opt/bin:$PATH

The which command helps you work with the PATH shell variable. It checks the search path for the file specified as its argument and prints the name of the matching path, if any. For example, suppose you want to know where the program file for the wc command resides. Issuing the command:

which wc

will tell you that the program file is /usr/bin/wc, or whatever other path is correct for your system.

More on assignment statement

Variables can be assigned strings as well. The string literal assigned should be enclosed either in single or in double quotes. The difference is that if the string is in double quotes, all variables in it are expanded to their values (so-called macro substitution). For example:

LOGFILE='mylog.txt'
printf "%s\n" $LOGFILE
version=12
LOGFILE="mylog.$version.txt"
printf "%s\n" $LOGFILE

The value of variables can be printed using the printf command. In the simplest case printf has two arguments: a formatting code and the variable to display. For simple variables, the formatting code is "%s\n", and the variable name should appear in double quotes with a dollar sign in front of the name:

$ printf "%s\n" $LOGFILE
mylog.txt

printf also works without explicit formatting codes: the first argument is then treated as the format string itself (note that no newline is printed at the end). So the statements

printf $HOME
printf $PWD

produce the path to your home directory ($HOME is a system variable which is set by the OS when you log in to your account) concatenated with the path to your current directory, which might not be what you wanted.

The old output statement in Unix shells, echo, terminates the line with a newline. Note that in modern Bash printf has by and large replaced the old echo command and now plays an important role in shell scripting.

Assigning the result of execution of a command

You can assign to a Bash variable the result of the execution of any command or script. This is done via so-called backquoting (the modern, nestable equivalent is TIMESTAMP=$(date)):

TIMESTAMP=`date`
printf "%s\n" "$TIMESTAMP"

Will produce

Wed, Jul 05, 2017 8:50:41 AM

The date shown is the date when the variable TIMESTAMP is assigned its value. The value of the variable remains the same until a new value is assigned.

Example of a bash script

#!/bin/sh
#
#     tiger - A UN*X security checking system
#     Copyright (C) 1993 Douglas Lee Schales, David K. Hess, David R. Safford
#
#     Please see the file `COPYING' for the complete copyright notice.
#
# check_system - 06/15/93
#
#-----------------------------------------------------------------------------
#
TigerInstallDir='.'

#
# Set default base directory.
# Order or preference:
#      -B option
#      TIGERHOMEDIR environment variable
#      TigerInstallDir installed location
#
basedir=${TIGERHOMEDIR:=$TigerInstallDir}

for parm
do
   case $parm in
   -B) basedir=$2; break;;
   esac
done

#
# Verify that a config file exists there, and if it does
# source it.
#
[ ! -r $basedir/config ] && {
  echo "--ERROR-- [init002e] No 'config' file in \`$basedir'."
  exit 1
}

. $basedir/config

. $BASEDIR/initdefs

#
# If run in test mode (-t) this will verify that all required
# elements are set.
#
[ "$Tiger_TESTMODE" = 'Y' ] && {
  haveallcmds GREP || exit 1
  haveallfiles BASEDIR WORKDIR || exit 1
  
  echo "--CONFIG-- [init003c] $0: Configuration ok..."
  exit 0
}

#------------------------------------------------------------------------
echo
echo "# Performing system specific checks..."

haveallfiles BASEDIR || exit 1

runtable()
{
  haveallcmds GREP && {
    $GREP -v '^#' |
    while read script
    do
      case "$script" in
	/*)
	if [ $TESTEXEC $script ]; then
	  echo "# Running '$script'..."
	  $script
	else
	  echo "--ERROR-- [misc005w] Can't find $script'..."
	fi
        ;;
        *)
	if [ $TESTEXEC $CONFIG_DIR/$script ]; then
	  echo "# Running '$CONFIG_DIR/$script'..."
	  $CONFIG_DIR/$script
	elif [ $TESTEXEC $SCRIPTDIR/$script ]; then
	  echo "# Running '$SCRIPTDIR/$script'..."
	  $SCRIPTDIR/$script
	else
	  echo "--ERROR-- [misc005w] Can't find $script'..."
	fi
        ;;
      esac
    done
  }
}

for dir in $OS/$REL/$REV/$ARCH $OS/$REL/$REV $OS/$REL $OS
do
  [ $TESTEXEC $BASEDIR/systems/$dir/check ] && {
    echo "# Performing checks for $dir..."
    $BASEDIR/systems/$dir/check
  }
done

[ -r $BASEDIR/check.tbl ] && runtable < $BASEDIR/check.tbl

for dir in $OS/$REL/$REV/$ARCH $OS/$REL/$REV $OS/$REL $OS
do
  [ -r $BASEDIR/systems/$dir/check.tbl ] && runtable < $BASEDIR/systems/$dir/check.tbl
done

Two types of Unix utilities and Bash scripts: Filters vs. utilities

The commands that can be typed at the Bash shell prompt are usually Linux programs stored externally on your file system. Some commands are built into the shell for speed, standardization, or because they can function properly only when they are built-in.

No matter what their source, commands fall into a number of informal categories. Utilities are general-purpose commands useful in many applications, such as returning the date or counting the number of lines in a file.

Filters are commands that take the output of a previous command on standard input, modify it, and write the result to standard output. For example, you can extract the lines that contain a certain word from a text file. Many standard Unix utilities can be used as filters. You can also write your own filter in bash, as the sketch below shows.
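A minimal sketch of a hand-written filter (upcase.sh is a hypothetical name): it reads stdin line by line and writes the transformed lines to stdout, so it can be placed anywhere in a pipeline:

#!/bin/bash
# upcase.sh -- a trivial filter: copy stdin to stdout,
# converting each line to upper case
while IFS= read -r line; do
   printf "%s\n" "${line^^}"    # ${var^^} uppercases the value (bash 4+)
done

It can then be used like any standard filter, for example: who | ./upcase.sh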

Multiple Commands

Multiple commands can be combined on a single line. How they are executed depends on what symbols separate them.

If each command is separated by a semicolon, the commands are executed consecutively, one after another.

$ printf "%s\n" "This is executed" ; printf "%s\n" "And so is this"
This is executed
And so is this

If each command is separated by a double ampersand (&&), the commands are executed until one of them fails or until all the commands are executed.

$ date && printf "%s\n" "The date command was successful"
Wed Aug 15 14:36:32 EDT 2001
The date command was successful

If each command is separated by a double vertical bar (||), the next command is executed only if the previous one fails; the chain continues until a command succeeds or all the commands are executed.

$ date 'duck!' || printf "%s\n" "The date command failed"
date: bad conversion
The date command failed

Semicolons, double ampersands, and double vertical bars can be freely mixed in a single line.

$ date 'format-this!' || printf "%s\n" "The date command failed" && \
					printf "%s\n" "But the printf didn't!"
date: bad conversion
The date command failed
But the printf didn't!

These are primarily intended as command-line shortcuts: When mixed with redirection operators such as >, a long command chain is difficult to read and you should avoid it in scripts.

The second view on a typical Bash script structure

A well-structured Bash script can be divided into five sections:

  1. The header. The header defines what kind of script this is, who wrote it, what version it is, and what assumptions or shell options Bash uses. A script without a proper header is an unprofessional script.
  2. Declarations of global variables. It is a good practice to declare the variables that you use: countless hours have been spent hunting for errors that turned out to be the result of a misspelled variable in some rarely executed part of a script.
  3. Sanity checks. Verify that the supplied parameters and the environment correspond to the assumptions you made during the creation of the script. Environments tend to change.
  4. The main functionality of the script. This is where the real action is.
  5. Cleanup. Here you need to remove any temporary files created during execution of the main part of the script, write summary messages to the log if your script runs long, and so on. (A skeleton illustrating these five sections follows.)
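A minimal skeleton showing the five sections in order (the script name and the backup task are placeholders):

#!/bin/bash
# backup_home.sh -- example skeleton: header with author, date, purpose
shopt -s -o nounset                    # 1. header: script-wide shell options

# 2. Global declarations
declare -rx SCRIPT=${0##*/}
declare -x tempfile="/tmp/$SCRIPT.$$"

# 3. Sanity checks
if [ -z "${BASH:-}" ] ; then
   printf "%s\n" "$SCRIPT: please run this script under bash" >&2
   exit 1
fi

# 4. Main functionality
tar cf "$tempfile" "$HOME" 2>/dev/null

# 5. Cleanup
rm -f "$tempfile"
exit 0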

Declarations of global variables

All declarations that apply to the entirety of the script should occur at the top of the script, beneath the header.

By placing global declarations in one place, you make it easy for someone to refer to them while reading the script.

# Global Declarations

declare -rx SCRIPT=${0##*/}     # a useful Bash idiom that puts the name of the script into the variable

declare -rx who="/usr/bin/who"  # -r makes the variable read-only, -x exports it; the who command - man 1 who
declare -rx sync="/bin/sync"    # the sync command - man 1 sync
declare -rx wc="/usr/bin/wc"    # the wc command - man 1 wc

In bash 3.x and 4.x you can disallow the use of undefined variables with the following command:

shopt -s -o nounset

That makes sense and is highly recommended practice.
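A minimal demonstration of what nounset buys you (report.txt and the deliberately misspelled filname are illustrative):

#!/bin/bash
shopt -s -o nounset          # same effect as: set -u  or  set -o nounset
filename="report.txt"
printf "%s\n" "$filname"     # typo: bash aborts with "filname: unbound variable"
                             # instead of silently printing an empty line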

Sanity Checks

The next section, sanity checks, protects the script from unexpected changes in the environment in which it runs. Here you check the environment and, if it does not correspond to what is expected, exit the script so as not to do any damage. Such changes can include a different version of a command, a different location of a command, running as a non-privileged user (for scripts that are designed to run as root), etc.

Normally, when a command runs at the command prompt, Bash searches several directories for the command you want to run. If it can't find the command, perhaps because of a spelling mistake, Bash reports an error. This kind of behavior is good for working interactively with Bash because it saves time and any mistakes are easily corrected with a few keystrokes.

Scripts, on the other hand, run without any human supervision. Before a script executes any statements, it needs to verify that all the necessary files are accessible. All required commands should be executable and stored in the expected locations. These checks are sometimes called sanity checks because they do not let the script begin its main task unless the computer is in a known, or "sane," state. This is especially important with operating systems such as Linux that are highly customizable: What is true on one computer might not be true on another.

Another way of putting it is that Bash relies on runtime error checking. Most errors are caught only when the faulty statement executes. Checking for dangerous situations early in the script prevents it from failing in the middle of a task, which would otherwise make it difficult to determine where the script left off and what needs to be done to continue.

Sometimes system administrators unintentionally delete or change the accessibility of a file, making it unavailable to a script. Other times, changes in the environment can change which commands are executed. Malicious computer users have also been known to tamper with a person's login profile so that the commands you think you are running are not the ones you are actually using.

In the most primitive case you can check whether external commands are in the same places as on the computer where you wrote the script. Generally you need to verify the location of commands if you are using multiple flavors of Linux such as SUSE and Red Hat.

# Sanity checks

if [ -z "$BASH" ]  ; then
   printf "$SCRIPT:$LINENO: please run this script with the BASH shell\n" >&2
   exit 1
fi
if [ ! -x "$who" ] ; then
   printf "$SCRIPT:$LINENO: the command $who is not available aborting\n " >&2
   exit 1
fi

if [ ! -x "$wc" ] ; then
   printf "$SCRIPT:$LINENO: the command $wc is not available aborting\n " >&2
   exit 1
fi

There are a couple of elements in those statements that you may not understand yet; please ignore them for now. They will be explained later.

The actual functionality

When you have verified that the system is sane, the script can proceed to do its work.

# create a backup of my files at the beginning of the session
mybackup="/Scratch/users/$USER/myhome_backup"`date +"%y%m%d"`".tar" 
if [ ! -d  "/Scratch/users/$USER/" ] ; then
   mkdir -p "/Scratch/users/$USER/"
fi
tar cvf "$mybackup" "$HOME"

Here the system variable $HOME is set to your home directory by the system at the beginning of your login session.

Cleanup

Finally, the script needs to clean up after itself. Any temporary files should be deleted, and the script returns a status code to the person or program running the script. For example:

echo -n Deleting the temporary files... 
rm -f *.tmp
echo Done.

More complex scripts might use a variable to keep track of the status code returned by a failed command.

As seen previously, the exit command unconditionally stops a script. It can, and should, include a status code to return to the caller of the script. A return code of 0 indicates successful completion of the script (no errors).

If the status code is omitted, the status of the last command executed by the script is returned. As a result, it's always best to supply an exit status.

if [ -f $mybackup ] ; then 
   exit 0 # all is well
else 
   exit 1 # backup was not created
fi

A script automatically stops when it reaches its end as if there was an implicit exit typed there, but the exit status in this case is the status of the last command executed.

There is also a utility called sleep, which suspends the script execution for a specific number of seconds after which it wakes up and resumes at the next statement after the sleep command.

sleep 5 # wait for 5 seconds

Sleep is useful for placing pauses in the script, enabling the user to read what's been displayed on the screen. Sleep isn't suitable for synchronizing events, however, because how long a particular program runs on the computer often depends on the system load, number of users, hardware upgrades, and other factors outside of the script's control.

Reading Keyboard Input

The built-in read command stops the script and waits for the user to type something from the keyboard. The text typed is assigned to the variable that accompanies the read command.

printf "%s\n" "Enter the number of days from now you want to include into your backup"
read BACKUP_PERIOD

In this example, the variable BACKUP_PERIOD contains the number of days typed by the user.

There are a number of options for read. First, -p (prompt) is a shorthand feature that combines the printf and read statements: read displays a short message before waiting for the user to respond.

read -p "Enter the number of days from now you want to include into your backup?" BACKUP_PERIOD

The -r (raw input) option disables the backslash escaping of special characters. Normally, read understands escape sequences such as \n when they're typed by the user. Using raw input, read treats the backslash the same as any other character typed on the keyboard. Typically you need to use -r when you need to enter a Windows-style path where directories are separated with backslashes.

read -r -p "Enter a Windows backup path: " BACKUP_PATH

The -e option works only interactively, not in shell scripts. It enables you to use Bash's history features to select the line to return. You can use the Up and Down Arrow keys to move through recently typed commands, like the Ctrl-R command on the Bash command line.

A timeout can be set up using the -t (timeout) switch. If nothing is typed by the end of the timeout period, the shell continues with the next command and the value of the variable is unchanged. If the user starts typing after the timeout period ends, anything typed is lost. The timeout is measured in seconds.

read -t 5 FILENAME # wait up to 5 seconds to read a filename

If your script sets the variable TMOUT, Bash times out after the number of seconds in that variable even if -t is not used.

A limit can be placed on the number of characters to read using the -n (number of characters) switch. If the maximum number of characters is reached, the shell continues with the next command without waiting for the Enter/Return key to be pressed.

read -n 10 FILENAME # read no more than 10 characters

If you don't supply a variable, read puts the typed text into a variable named REPLY. Well-structured scripts should avoid this default behavior, to make it clear to a reader of the script where the value of REPLY is coming from.

While reading from the keyboard, read normally returns a status code of 0.
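Putting several of these options together (the prompt text and the variable name answer are arbitrary): read returns a nonzero status on timeout, so the result can be tested directly:

# ask for one character, wait at most 10 seconds, don't require Enter
if read -r -n 1 -t 10 -p "Proceed with backup? [y/n] " answer ; then
   printf "\n%s\n" "You typed: $answer"
else
   printf "\n%s\n" "No answer within 10 seconds; assuming no." >&2
fi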


Basic Redirection

You can divert messages from commands like printf to files or other commands. Bash refers to this as redirection. There are a large number of redirection operators.

The > operator redirects the messages of a command to a file. The redirection operator is followed by the name of the file the messages should be written to. For example, to write a timestamped message to a file named /tmp/nikolai.log, you use

timestamp=`date`
printf "%s\n" "The processing started at $timestamp" > /tmp/nikolai.log

The > operator always overwrites the named file. If a series of printf messages are redirected to the same file, only the last message appears.

To add messages to a file without overwriting the earlier ones, Bash has an append operator, >>. This operator redirects messages to the end of a file.

printf "%s\n" "The processing started at $timestamp" > /tmp/nikolai.log
... ... ... 
printf "There were no errors. Normal exit of the program" >>  /tmp/nikolai.log

In the same way, input can be redirected to a command from a file. The input redirection symbol is <. For example, the utility wc (word count) can count the number of lines in a file when given the option -l. That means that you can count the number of lines in a file using the command:

wc  -l <  $HOME/.bashrc

Again, wc -l counts the lines of its input; in this case, the number of lines in your .bashrc. Printing this information from your .bash_profile script might be a useful reminder that can alert you to the fact that you recently modified your environment, or, God forbid, that your .bashrc file disappeared without a trace :-)

There is also a way to imitate reading from a file inside the script by putting several lines directly into it. The operator <<MARKER treats the lines following it in a script as if they were typed from the keyboard, until it reaches a line consisting of just the word MARKER. In other words, the lines which are treated as an input file are terminated by a special line containing the delimiter you define yourself. For example, in the following example the delimiter word used is "EOF":

cat > /tmp/example <<EOF
this is a test demonstrating how you can 
write several lines of text into 
a file
EOF

If you use >> instead of > you can add lines to a file without using any editor:

cat >>/etc/resolv.conf <<EOF
search datacenter.firma.com headquarters.firma.com
nameserver 10.100.20.5
nameserver 10.100.20.6
EOF

In this example, Bash treats the lines up to the closing EOF marker as if they were being typed from the keyboard and writes them to the file specified after > (/tmp/example in the earlier case). There should be no spaces between << and the EOF marker. Again, the name EOF is arbitrary; you can choose, for example, LINES_END instead. The only important thing is that there should be no line in your text that starts with the same word; that's why using all caps makes sense in this case.

The data in the << list is known as a here file (or a here document) because the word HERE was often used in Bourne shell scripts as the marker of the end of the input lines.

Bash has another here-file redirection operator, <<<, which redirects a variable or a literal string.

cat > /tmp/example <<<  "this is another example of piping info into the file" 

Pipes

Instead of files, the results of a command can be redirected as input to another command. This process is called piping and uses the vertical bar (or pipe) operator |.

who | wc -l # count the number of users

Any number of commands can be strung together with vertical bar symbols. A group of such commands is called a pipeline.

If one command ends prematurely in a series of pipe commands, for example because you interrupted it with Ctrl-C, Bash displays the message "Broken Pipe" on the screen.

The shell provides three standard data streams: stdin (standard input, file descriptor 0), stdout (standard output, file descriptor 1), and stderr (standard error, file descriptor 2).

By default, most programs read their input from stdin and write their output to stdout. Because both streams are normally associated with a console, programs behave as you generally want, reading input data from the console keyboard and writing output to the console screen. When a well-behaved program writes an error message, it writes the message to the stderr stream, which is also associated with the console by default. Having separate streams for output and error messages presents an important opportunity, as you'll see in a moment.

Although the shell associates the three standard input/output streams with the console by default, you can specify input/output redirectors that, for example, associate an input or output stream with a file.

To see how redirection works, consider the wc command on the console.

Perhaps you can now see the reason for having the separate output streams stdout and stderr. If the shell provided a single output stream, error messages and output would be mingled. Therefore, if you redirected the output of a program to a file, any error messages would also be redirected to the file. This might make it difficult to notice an error that occurred during program execution. Instead, because the streams are separate, you can choose to redirect only stdout to a file. When you do so, error messages sent to stderr appear on the console in the usual way. Of course, if you prefer, you can redirect both stdout and stderr to the same file or redirect them to different files. As usual in the Unix world, you can have it your own way.

A simple way of avoiding annoying output is to redirect it to the null file, /dev/null. If you redirect the stderr stream of a command to /dev/null, you won't see any error messages the command produces.

Just as you can direct the standard output or error stream of a command to a file, you can also redirect a command's standard input stream to a file, so that the command reads from the file instead of the console. For example, if you issue the wc command without arguments, the command reads its input from stdin. Type some words and then type the end of file character (Ctrl-D) and wc will report the number of lines, words, and characters you entered. You can tell wc to read from a file, rather than the console, by issuing a command like:

wc </etc/passwd

Of course, this isn't the usual way of invoking wc. The author of wc helpfully provided a command-line argument that lets you specify the file from which wc reads. However, by using a redirector, you could read from any desired file even if the author had been less helpful.

Some programs are written to ignore redirectors. For example, the passwd command expects to read the new password only from the console, not from a file. You can compel such programs to read from a file, but doing so requires techniques more advanced than redirectors.

When you specify no command-line arguments, many Unix programs read their input from stdin and write their output to stdout. Such programs are called filters. Filters can be easily fitted together to perform a series of related operations. The tool for combining filters is the pipe, which connects the output of one program to the input of another. For example, consider this command:

ls -l ~ | wc -l

The command consists of two commands, joined by the pipe redirector (|). The first command lists the names of the files in the user's home directory, one file per line. The second command invokes wc by using the -l option, which causes wc to print only the total number of lines, rather than the total numbers of lines, words, and characters. The pipe redirector sends the output of the ls command to the wc command, which counts and prints the number of lines in its input, which happens to be the number of files in the user's home directory.

This is a simple example of the power and sophistication of the Unix shell. Unix doesn't include a command that counts the files in the user's home directory and doesn't need to do so. Should the need to count the files arise, a knowledgeable Unix user can prepare a simple script that computes the desired result by using general-purpose Unix commands.

By default Unix/Linux assumes that all output is going to STDOUT, which is assigned to the user's screen/console, called

/dev/tty

Try to execute

printf "%s\n" "Hello to myself" > /dev/tty

You will see that it is printed on your screen exactly the same way as if you had executed the command

printf "%s\n" "Hello to myself"

because those two commands are actually identical.

When messages aren't redirected in your program, the output goes through a special file called standard output. By default, standard output represents the screen: everything sent through standard output is redirected to the screen. Bash uses the symbol &1 to refer to standard output, and you can explicitly redirect messages to it. You can redirect the output of a whole script to a file:

bash myscript.sh > mylisting.txt

In this case any printf statement will write the information not to the screen, but to the file you've redirected the output to; here, the file mylisting.txt. (This is the same as bash myscript.sh 1> mylisting.txt.) Let's see another set of examples:

printf "Don't forget to backup your data" > results.txt   # sent to a file on disk
printf "Don't forget to backup your data" > /dev/tty      # sent explicitly to the screen
printf "Don't forget to backup your data"                 # sent to screen via standard output
printf "Don't forget to backup your data" >&1             # same as the last one
printf "Don't forget to backup your data" >/dev/stdout    # same as the last one

Using standard output is a way to send all the output from a script and any commands in it to a new destination.

A script doesn't usually need to know where the messages are going: There's always the possibility they were redirected. However, when errors occur and when warning messages are printed to the user, you don't want these messages to get redirected along with everything else.

Linux defines a second file especially for messages intended for the user called standard error. This file represents the destination for all error messages. Because standard error, like standard output, is a file, standard error can likewise be redirected. The symbol for standard error is &2. /dev/stderr can also be used. The default destination, like standard output, is the screen. For example,

printf "$SCRIPT:$LINENO: No files available for processing" >&2

This command appears to work the same as a printf without the >&2 redirection, but there is an important difference. It displays an error message to the screen, no matter where standard output has been previously redirected.

The redirection symbols for standard error are the same as standard output except they begin with the number 2. For example

bash myscript.sh 2> myscript_errors.txt
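You can also send both streams to one place or to different places; a few common combinations (the file names are arbitrary). Note that 2>&1 must come after the stdout redirection, because redirections are processed left to right:

bash myscript.sh > myscript.log 2>&1     # stdout and stderr into the same file
bash myscript.sh > out.log 2> err.log    # separate files for output and errors
bash myscript.sh 2> /dev/null            # discard error messages only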

type command

If you don't know whether a command is built-in, an external command, an alias, or a function the Bash type command can help. For example

type cd
cd is a shell builtin

type id
id is /usr/bin/id
NOTE: type is itself a shell builtin. Unlike the external which command, it also knows about aliases, shell functions, and builtins.

The shopt commands

Bash options can be enabled or disabled by commands or as a command switch when Bash is started. For example, to start a Bash session and disallow the use of undefined variables, use this:

$ bash -o nounset

In a script you can disallow the use of undefined variables by using the following command:

shopt -s -o nounset

Historically, the set command was used to turn options on and off. As the number of options grew, set became more difficult to use because options are represented by single letter codes. As a result, Bash provides the shopt (shell option) command to turn options on and off by name instead of a letter. You can set certain options only by letter. Others are available only under the shopt command. This makes finding and setting a particular option a confusing task.

shopt -s (set) turns on a particular shell option; shopt -u (unset) turns it off, for example:

shopt -u -o nounset

Without -s or -u, shopt prints the current setting of an option.

shopt by itself or with -p prints a list of options and shows whether they are on, excluding the set -o options; to see those, add the -o flag (shopt -o). A list of the single-letter option codes currently in effect is stored in the shell variable $-.

There are way too many options in Bash, but the good thing is that we care about only a few of them:

# shopt -p
shopt -u autocd
shopt -u cdable_vars
shopt -u cdspell
shopt -u checkhash
shopt -u checkjobs
shopt -u checkwinsize
shopt -s cmdhist
shopt -u compat31
shopt -u compat32
shopt -u compat40
shopt -u compat41
shopt -u compat42
shopt -u compat43
shopt -u completion_strip_exe
shopt -s complete_fullquote
shopt -u direxpand
shopt -u dirspell
shopt -u dotglob
shopt -u execfail
shopt -s expand_aliases
shopt -u extdebug
shopt -u extglob
shopt -s extquote
shopt -u failglob
shopt -s force_fignore
shopt -u globasciiranges
shopt -u globstar
shopt -u gnu_errfmt
shopt -s histappend
shopt -u histreedit
shopt -u histverify
shopt -s hostcomplete
shopt -u huponexit
shopt -u inherit_errexit
shopt -s interactive_comments
shopt -u lastpipe
shopt -u lithist
shopt -s login_shell
shopt -u mailwarn
shopt -u no_empty_cmd_completion
shopt -u nocaseglob
shopt -u nocasematch
shopt -u nullglob
shopt -s progcomp
shopt -s promptvars
shopt -u restricted_shell
shopt -u shift_verbose
shopt -s sourcepath
shopt -u xpg_echo

I recommend setting just a couple of them, which are not set up by default:

shopt -s lastpipe
shopt -u autocd

Regular expressions in filenames

Shell regular expressions are of two types: the primitive patterns used for filename matching (globbing), and the POSIX extended regular expressions available through the =~ operator.

The primitive (globbing) metacharacters are the following: * matches any string of characters, including the empty string; ? matches any single character; and [set] matches any single character from the set.

NOTE: Bash 3.x introduced "normal" (POSIX extended) regular expressions and the =~ operator to use them inside [[ ... ]] conditionals.
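A short example of the =~ operator (the file name and pattern are illustrative); captured groups land in the BASH_REMATCH array:

filename="backup-2017-07-06.tar.gz"
if [[ $filename =~ ^backup-([0-9]{4})-([0-9]{2})-([0-9]{2}) ]] ; then
   printf "Year: %s\n" "${BASH_REMATCH[1]}"
fi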

PART 2

During a normal day a sysadmin often writes several Bash or Perl scripts, so-called throwaway scripts. Often they perform a specific task related to the current problem the sysadmin is trying to solve, for example some information collection. If you face a problem, you often want to extract the information relevant to the problem from a log file, but the log file is "dirty" and needs to be filtered from junk before it becomes usable.

The main tool in such circumstances is a subclass of Unix utilities called filters, connected together via two mechanisms available in Unix, redirection and pipes, which allow one command's output to be processed as another's input.

In some specific roles, such as web server administrator, extracting relevant information from web server and proxy logs can approach a full-time job; a typical throwaway pipeline is sketched below.
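A sketch of such a throwaway pipeline (access.log and the Apache-style log format, with the client IP in the first field, are assumptions):

# which client IP addresses requested the login page most often
grep 'GET /login' access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head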

Standard files

Unix has three standard files: stdin (standard input, descriptor 0), stdout (standard output, descriptor 1), and stderr (standard error, descriptor 2).

Of course, that's mostly by convention. There's nothing stopping you from writing your error information to standard output if you wish. You can even close the three file handles totally and open your own files for I/O.

Redirection and pipes: two powerful mechanisms in shell programming

There are two major mechanisms that increase the flexibility of Unix utilities: redirection and pipes.

Before shell executes a command, it scans the command line for redirection characters. These special symbols instruct the shell to redirect input and output. Redirection characters can appear anywhere in a simple command or can precede or follow a command. They are not passed on to the invoked command.

Redirection of standard files

By default Unix/Linux assumes that all output is going to STDOUT, which is assigned to the user's screen/console, called /dev/tty. You can divert messages directed to standard output, for example from commands like printf, to files or other commands. Bash refers to this as redirection.

The most popular is the > operator, which redirects STDOUT to a file. The redirection operator is followed by the name of the file the messages should be written to. For example, to write a timestamped message to a file named /tmp/my.log, you use

timestamp=`date`
printf "%s\n" "The processing started at $timestamp" > /tmp/my.log


There are several classic types of redirection: > (redirect stdout to a file, overwriting it), >> (append stdout to a file), < (take stdin from a file), 2> (redirect stderr), and << (the here document described above).

The source and target can be expressions. In this case Bash performs command and parameter substitution before using the parameter. Filename substitution occurs only if the pattern matches a single file.

The Unix command cat is actually short for "catenate", i.e., link together. It accepts multiple filename arguments and copies them to the standard output. But let's pretend, for the moment, that cat and other utilities don't accept filename arguments and accept only standard input. The Unix shell lets you redirect standard input so that it comes from a file: the notation command < filename does the same as cat filename with less overhead.


Pipes as cascading redirection

As discussed above, the results of a command can be redirected as input to another command with the vertical bar (pipe) operator |, and any number of commands can be strung together into a pipeline.

Bash and the process tree [Bash Hackers Wiki]

Pipes are a very powerful tool. You can connect the output of one process to the input of another process. We won't delve into piping at this point; we just want to see how it looks in the process tree. Again, we execute some commands, this time ls and grep:

$ ls | grep myfile

It results in a tree like this:

                   +-- ls
xterm ----- bash --|
                   +-- grep

Note once again: ls can't influence the grep environment, grep can't influence the ls environment, and neither grep nor ls can influence the bash environment.

How is that related to shell programming?!?

Well, imagine some Bash code that reads data from a pipe. For example, the internal command read, which reads data from stdin and puts it into a variable. We run it in a loop here to count input lines:

counter=0

cat /etc/passwd | while read; do ((counter++)); done
echo "Lines: $counter"

What? It's 0? Yes! The number of lines might not be 0, but the variable $counter still is 0. Why? Remember the diagram from above? Rewriting it a bit, we have:

                   +-- cat /etc/passwd
xterm ----- bash --|
                   +-- bash (while read; do ((counter++)); done)

See the relationship? The forked Bash process will count the lines like a charm. It will also set the variable counter as directed. But when everything ends, this extra process is terminated and your counter variable is gone. You see a 0 because in the main shell it was 0 and was never changed by the child process!

So, how do we count the lines? Easy: Avoid the subshell. The details don't matter, the important thing is the shell that sets the counter must be the "main shell". For example:

counter=0

while read; do ((counter++)); done </etc/passwd
echo "Lines: $counter"

It's nearly self-explanatory. The while loop runs in the current shell and the counter is incremented in the current shell; everything vital happens in the current shell. The read command also sets the variable REPLY (the default if no name is given), though we don't use it here.

Bash creates subshells or subprocesses for various actions it performs.

As shown above, Bash will create a subprocess every time it executes a command. That's nothing new.

But if your command is a subprocess that sets variables you want to use in your main script, that won't work.

For exactly this purpose, there's the source command (also known as the dot . command). source doesn't run the script in a separate process; it reads and executes the other script's code in the current shell:

source ./myvariables.sh
# equivalent to:
. ./myvariables.sh
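For example, given a small myvariables.sh (a hypothetical file) that only sets variables:

# myvariables.sh (hypothetical example file)
BACKUP_DIR=/var/backups
RETENTION_DAYS=14

the sourcing script can use those variables directly:

#!/bin/bash
. ./myvariables.sh
echo "Keeping $RETENTION_DAYS days of backups in $BACKUP_DIR"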

Explicit subshell

If you group commands by enclosing them in parentheses, these commands are run inside a subshell:

(echo PASSWD follows; cat /etc/passwd; echo GROUP follows; cat /etc/group) >output.txt

Command substitution

With command substitution you reuse the output of another command as part of a command line, for example to set a variable. The other command is run in a subshell:

number_of_users=$(cat /etc/passwd | wc -l)
Note that, in this example, a second subshell was created by using a pipe in the command substitution:

                                            +-- cat /etc/passwd
xterm ----- bash ----- bash (cmd. subst.) --|
                                            +-- wc -l
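Incidentally, the cat in this pipeline is unnecessary; a sketch of the same assignment with one process fewer:

# wc -l reads /etc/passwd directly, so no pipe is required
number_of_users=$(wc -l < /etc/passwd)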

Arithmetic Expressions

The ((...)) Command

The ((...)) command is equivalent to the let command, except that all characters between the (( and )) are treated as quoted arithmetic expressions. This is more convenient than let, because many of the arithmetic operators have special meaning to the shell. The following commands are equivalent:

$ let "X=X + 1" 

and

$ ((X=X + 1)) 

Before the let and ((...)) commands were introduced in the Korn shell, the only way to perform arithmetic was with the external expr command. For example, to do the same increment of X using expr:
$ X=`expr $X + 1` 

In tests on a few systems, the let command performed the same operation 35-60 times faster! That is quite a difference.
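In modern Bash the related arithmetic expansion $((...)) is the most common idiom, since it yields the value of the expression directly. A minimal sketch:

X=5
((X = X + 1))      # arithmetic command: X is now 6
X=$((X * 2))       # arithmetic expansion: X is now 12
echo $X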

Processing Arguments

You can easily write scripts that process arguments, because a set of special shell variables holds the values of arguments specified when your script is invoked.

For example, here's a simple one-line script that prints the value of its second argument:

echo My second argument has the value $2.

Suppose you store this script in the file second, change its access mode to permit execution, and invoke it as follows:

./second a b c

The script will print the output:

My second argument has the value b.

$0 - The command name.
$1, $2, ..., $9 - The individual arguments of the command.
$* - The entire list of arguments, treated as a single word.
$@ - The entire list of arguments, treated as a series of words.
$? - The exit status of the previous command. The value 0 denotes successful completion.
$$ - The process ID of the current process.

Notice that the shell provides variables for accessing only nine arguments. Nevertheless, you can access more than nine arguments. The key to doing so is the shift command, which discards the value of the first argument and shifts the remaining values down one position. Thus, after executing the shift command, the shell variable $9 contains the value of the tenth argument. To access the eleventh and subsequent arguments, you simply execute the shift command the appropriate number of times.
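A minimal sketch that prints every argument, however many there are, by combining a loop with shift (the special variable $# holds the number of remaining arguments):

#!/bin/bash
while [ $# -gt 0 ]
do
    echo "$1"     # always the current first argument
    shift         # discard it and move the rest down
done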

Exit Codes

The shell variable $? holds the numeric exit status of the most recently completed command. By convention, an exit status of zero denotes successful completion; other values denote error conditions of various sorts.

You can set the error code in a script by issuing the exit command, which terminates the script and posts the specified exit status. The format of the command is:

exit status
where status is a non-negative integer that specifies the exit status.
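For example, a minimal sketch of a script that reports failure through its exit status (the file check is just for illustration):

#!/bin/bash
if [ ! -f "$1" ]; then
    echo "Usage: $0 existing-file" >&2
    exit 1        # non-zero status signals an error to the caller
fi
echo "Processing $1"
exit 0            # zero status signals success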

Conditional Logic

A shell script can employ conditional logic, which lets the script take different action based on the values of arguments, shell variables, or other conditions. The test command lets you specify a condition, which can be either true or false. Conditional commands (including the if, case, while, and until commands) use the test command to evaluate conditions.

The test command

The test command evaluates its arguments and sets the exit status to 0, which indicates that the specified condition was true, or a non-zero value, which indicates that the specified condition was false. Some commonly used argument forms of the test command are:

-d file - true if file exists and is a directory
-f file - true if file exists and is a regular file
-r file, -w file, -x file - true if file is readable, writable, or executable
-z string - true if string is empty; -n string - true if string is non-empty
string1 = string2 - true if the two strings are equal
n1 -eq n2 - true if the integers n1 and n2 are equal (also -ne, -lt, -le, -gt, -ge)
file1 -ot file2 - true if file1 is older than file2 (also -nt for newer than)

To see the test command in action, consider the following script:

test -d $1
echo $?

This script tests whether its first argument specifies a directory and displays the resulting exit status, a zero or a non-zero value that reflects the result of the test.

Suppose the script were stored in the file tester, which permitted read access. Executing the script might yield results similar to the following:

$ ./tester /
0
$ ./tester /missing
1

These results indicate that the / directory exists and that the /missing directory does not exist.

The if command

The test command is not of much use by itself, but combined with commands such as the if command, it is useful indeed. The if command has the following form:

if command
then
  commands
else
  commands
fi

Usually the command that immediately follows the word if is a test command. However, this need not be so. The if command merely executes the specified command and tests its exit status. If the exit status is 0, the first set of commands is executed; otherwise the second set of commands is executed. An abbreviated form of the if command does nothing if the specified condition is false:

if command
then
  commands
fi

When you type an if command, it occupies several lines; nevertheless it's considered a single command. To underscore this, the shell provides a special prompt (called the secondary prompt) after you enter each line. Often, scripts are entered by using a text editor; when you enter a script using a text editor you don't see the secondary prompt, or any other shell prompt for that matter.

As an example, suppose you want to delete a file file1 if it's older than another file file2. The following command would accomplish the desired result:

if test file1 -ot file2
then
  rm file1
fi

You could incorporate this command in a script that accepts arguments specifying the filenames:

if test "$1" -ot "$2"
then
  rm "$1"
  echo Deleted the old file.
fi

If you name the script riddance and invoke it as follows:

riddance thursday wednesday

the script will delete the file thursday if that file is older than the file wednesday.

The case command

The case command provides a more sophisticated form of conditional processing:

case value in
  pattern1) commands ;;
  pattern2) commands ;;
  ...
esac

The case command attempts to match the specified value against a series of patterns. The commands associated with the first matching pattern, if any, are executed. Patterns are built using characters and metacharacters, such as those used to specify command arguments. As an example, here's a case command that interprets the value of the first argument of its script:

case $1 in
  -r) echo Force deletion without confirmation ;;
  -i) echo Confirm before deleting ;;
   *) echo Unknown argument ;;
esac

The command echoes a different line of text, depending on the value of the script's first argument. As done here, it's good practice to include a final pattern that matches any value.

The while command

The while command lets you execute a series of commands iteratively (that is, repeatedly) so long as a condition tests true:

while command
do
  commands
done

Here's a script that uses a while command to print its arguments on successive lines:

echo $1
while shift 2> /dev/null
do
  echo $1
done

The commands that make up the do part of a while (or any other loop command) can include if commands, case commands, and even other while commands. However, scripts rapidly become difficult to understand when this occurs often. You should nest conditional commands within other conditional commands only with due consideration for the clarity of the result, and include comments (#) to clarify difficult constructs.

The until command

The until command lets you execute a series of commands iteratively (that is, repeatedly) so long as a condition tests false:

until command
do
  commands
done

Here's a script that uses an until command to print its arguments on successive lines, until it encounters an argument that has the value red:

until test $1 = red
do
  echo $1
  shift
done

For example, if the script were named stopandgo and stored in the current working directory, the command:

./stopandgo green yellow red blue

would print the lines:

green
yellow

The for command

The for command iterates over the elements of a specified list:

for variable in list
do  
	commands
done

Within the commands, you can reference the current element of the list by means of the shell variable $variable, where variable is the name specified following the for. The list typically takes the form of a series of arguments, which can incorporate metacharacters. For example, the following for command:

for i in 2 4 6 8
do
  echo $i
done

prints the numbers 2, 4, 6, and 8 on successive lines.

A special form of the for command iterates over the arguments of a script:

for variable
do  
	commands
done

For example, the following script prints its arguments on successive lines:

for i
do
  echo $i
done

The break and continue commands

The break and continue commands are simple commands that take no arguments. When the shell encounters a break command, it immediately exits the body of the enclosing loop ( while, until, or for) command. When the shell encounters a continue command, it immediately discontinues the current iteration of the loop. If the loop condition permits, other iterations may occur; otherwise the loop is exited.
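A minimal sketch showing both commands in a single loop:

for i in 1 2 3 4 5
do
    if [ "$i" -eq 2 ]; then
        continue      # skip the rest of this iteration
    fi
    if [ "$i" -eq 4 ]; then
        break         # leave the loop entirely
    fi
    echo "$i"
done
# prints 1 and 3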



Old News ;-)

[Jun 12, 2021] The use of PS4='$LINENO: ' in debugging bash scripts

Jun 10, 2021 | www.redhat.com

Exit status

In Bash scripting, $? prints the exit status. If it returns zero, it means there is no error. If it is non-zero, then you can conclude the earlier task has some issue.

A basic example is as follows:

$ cat myscript.sh
#!/bin/bash
mkdir learning
echo $?

If you run the above script once, it will print 0 because the directory does not exist yet, so the script creates it. If you run the script a second time, you will naturally get a non-zero value, as seen below:

$ sh myscript.sh
mkdir: cannot create directory 'learning': File exists
1
Best practices

It is always recommended to enable the debug mode by adding the -x option to your shell script, as below:

$ cat test3.sh
#!/bin/bash
set -x
echo "hello World"
mkdiir testing

$ ./test3.sh
+ echo 'hello World'
hello World
+ mkdiir testing
./test3.sh: line 4: mkdiir: command not found

You can also write a debug function, which lets you turn tracing on and off anywhere you call it, as in the example below:

$ cat debug.sh
#!/bin/bash
_DEBUG="on"
function DEBUG()
{
 [ "$_DEBUG" == "on" ] && $@
}
DEBUG echo 'Testing Debugging'
DEBUG set -x
a=2
b=3
c=$(( $a + $b ))
DEBUG set +x
echo "$a + $b = $c"

Which prints:

$ ./debug.sh
Testing Debugging
+ a=2
+ b=3
+ c=5
+ DEBUG set +x
+ '[' on == on ']'
+ set +x
2 + 3 = 5
Standard error redirection

You can redirect all errors to a custom file using standard error, which is denoted by the file descriptor 2. Use it in normal Bash commands, as demonstrated below:

$ mkdir users 2> errors.txt
$ cat errors.txt
mkdir: cannot create directory 'users': File exists
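The same redirection works for whole scripts, and stdout and stderr can go to different files or be merged. A minimal sketch (the file names are just placeholders):

# stdout to one log, errors appended to another
./myscript.sh >> output.log 2>> errors.log

# or merge both streams into a single file
./myscript.sh > all.log 2>&1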

Most of the time, it is difficult to find the exact line number in scripts. To print the line number with the error, use the PS4 variable with $LINENO (supported in Bash 4.1 or later). Example below:

$ cat test3.sh
#!/bin/bash
PS4='$LINENO: '

set -x
echo "hello World"
mkdiir testing

You can easily see the line number while reading the errors:

$ ./test3.sh
5: echo 'hello World'
hello World
6: mkdiir testing
./test3.sh: line 6: mkdiir: command not found

[Jun 08, 2021] Basic scripting on Unix and Linux by Sandra Henry-Stocker

Mar 10, 2021 | www.networkworld.com

... ... ...

Different ways to loop

There are a number of ways to loop within a script. Use for when you want to loop a preset number of times. For example:

#!/bin/bash

for day in Sun Mon Tue Wed Thu Fri Sat
do
    echo $day
done

or

#!/bin/bash

for letter in {a..z}
do
   echo $letter
done

Use while when you want to loop as long as some condition exists or doesn't exist.

#!/bin/bash

n=1

while [ $n -le 4 ]
do
    echo $n
    ((n++))
done
Using case statements

Case statements allow your scripts to react differently depending on what values are being examined. In the script below, we use different commands to extract the contents of the file provided as an argument by identifying the file type.

#!/bin/bash

if [ $# -eq 0 ]; then
    echo -n "filename> "
    read filename
else
    filename=$1
fi

if [ ! -f "$filename" ]; then
    echo "No such file: $filename"
    exit
fi

case $filename in
    *.tar)      tar xf $filename;;
    *.tar.bz2)  tar xjf $filename;;
    *.tbz)      tar xjf $filename;;
    *.tbz2)     tar xjf $filename;;
    *.tgz)      tar xzf $filename;;
    *.tar.gz)   tar xzf $filename;;
    *.gz)       gunzip $filename;;
    *.bz2)      bunzip2 $filename;;
    *.zip)      unzip $filename;;
    *.Z)        uncompress $filename;;
    *.rar)      rar x $filename ;;
    *)          echo "No extract option for $filename"
esac

Note that this script also prompts for a file name if none was provided and then checks to make sure that the file specified actually exists. Only after that does it bother with the extraction.

Reacting to errors

You can detect and react to errors within scripts and, in doing so, avoid other errors. The trick is to check the exit codes after commands are run. If an exit code has a value other than zero, an error occurred. In this script, we look to see if Apache is running, but send the output from the check to /dev/null . We then check to see if the exit code isn't equal to zero as this would indicate that the ps command did not get a response. If the exit code is not zero, the script informs the user that Apache isn't running.

#!/bin/bash

ps -ef | grep apache2 > /dev/null
if [ $? != 0 ]; then
    echo Apache is not running
    exit
fi

[Apr 01, 2021] How to use range and sequence expression in bash by Dan Nanni

Mar 29, 2021 | www.xmodulo.com

When you are writing a bash script, there are situations where you need to generate a sequence of numbers or strings. One common use of such sequence data is for loop iteration. When you iterate over a range of numbers, the range may be defined in many different ways (e.g., [0, 1, 2,..., 99, 100], [50, 55, 60,..., 75, 80], [10, 9, 8,..., 1, 0], etc.). Loop iteration may not be just over a range of numbers. You may need to iterate over a sequence of strings with particular patterns (e.g., incrementing filenames: img001.jpg, img002.jpg, img003.jpg). For this type of loop control, you need to be able to generate a sequence of numbers and/or strings flexibly.

While you can use a dedicated tool like seq to generate a range of numbers, it is really not necessary to add such an external dependency to your bash script when bash itself provides a powerful built-in range function called brace expansion. In this tutorial, let's find out how to generate a sequence of data in bash using brace expansion, and look at some useful brace expansion examples.

Brace Expansion

Bash's built-in range function is realized by so-called brace expansion. In a nutshell, brace expansion allows you to generate a sequence of strings based on supplied string and numeric input data. The syntax of brace expansion is the following.

{<string1>,<string2>,...,<stringN>}
{<start-number>..<end-number>}
{<start-number>..<end-number>..<increment>}
<prefix-string>{......}
{......}<suffix-string>
<prefix-string>{......}<suffix-string>

All these sequence expressions are iterable, meaning you can use them in while/for loops. In the rest of the tutorial, let's go over each of these expressions to clarify their use cases.


Use Case #1: List a Sequence of Strings

The first use case of brace expansion is a simple string list, which is a comma-separated list of string literals within braces. Here we are not generating a sequence of data, but simply listing a pre-defined sequence of string data.

{<string1>,<string2>,...,<stringN>}

You can use this brace expansion to iterate over the string list as follows.

for fruit in {apple,orange,lemon}; do
    echo $fruit
done
apple
orange
lemon

This expression is also useful to invoke a particular command multiple times with different parameters.

For example, you can create multiple subdirectories in one shot with:

$ mkdir -p /home/xmodulo/users/{dan,john,alex,michael,emma}

To create multiple empty files:

$ touch /tmp/{1,2,3,4}.log
Use Case #2: Define a Range of Numbers


The most common use case of brace expansion is to define a range of numbers for loop iteration. For that, you can use the following expressions, where you specify the start/end of the range, as well as an optional increment value.

{<start-number>..<end-number>}
{<start-number>..<end-number>..<increment>}

To define a sequence of integers between 10 and 20:

echo {10..20}
10 11 12 13 14 15 16 17 18 19 20

You can easily integrate this brace expansion in a loop:

for num in {10..20}; do
    echo $num
done

To generate a sequence of numbers with an increment of 2 between 0 and 20:

echo {0..20..2}
0 2 4 6 8 10 12 14 16 18 20

You can generate a sequence of decrementing numbers as well:

echo {20..10}
20 19 18 17 16 15 14 13 12 11 10
echo {20..10..-2}
20 18 16 14 12 10

You can also pad the numbers with leading zeros, in case you need to use the same number of digits. For example:

echo {00..20..2}
00 02 04 06 08 10 12 14 16 18 20
Use Case #3: Generate a Sequence of Characters


Brace expansion can be used to generate not just a sequence of numbers, but also a sequence of characters.

{<start-character>..<end-character>}

To generate a sequence of alphabet characters between 'd' and 'p':

echo {d..p}
d e f g h i j k l m n o p

You can generate a sequence of upper-case alphabets as well.

for char1 in {A..B}; do
    for char2 in {A..B}; do
        echo "${char1}${char2}"
    done
done
AA
AB
BA
BB
Use Case #4: Generate a Sequence of Strings with Prefix/Suffix

It's possible to add a prefix and/or a suffix to a given brace expression as follows.

<prefix-string>{......}
{......}<suffix-string>
<prefix-string>{......}<suffix-string>

Using this feature, you can easily generate a list of sequentially numbered filenames:

# create incrementing filenames
for filename in img_{00..5}.jpg; do
    echo $filename
done
img_00.jpg
img_01.jpg
img_02.jpg
img_03.jpg
img_04.jpg
img_05.jpg
Use Case #5: Combine Multiple Brace Expansions


Finally, it's possible to combine multiple brace expansions, in which case the combined expressions will generate all possible combinations of sequence data produced by each expression.

For example, we have the following script that prints all possible combinations of two-character alphabet strings using double-loop iteration.

for char1 in {A..Z}; do
    for char2 in {A..Z}; do
        echo "${char1}${char2}"
    done
done

By combining two brace expansions, the following single loop can produce the same output as above.

for str in {A..Z}{A..Z}; do
    echo $str
done
Conclusion

In this tutorial, I described bash's built-in mechanism called brace expansion, which allows you to easily generate a sequence of arbitrary strings in a single command line. Brace expansion is useful not just in bash scripts, but also in your command-line environment (e.g., when you need to run the same command multiple times with different arguments). If you know any useful brace expansion tips and use cases, feel free to share them in the comments.

If you find this tutorial helpful, I recommend you check out the series of bash shell scripting tutorials provided by Xmodulo.

[Mar 30, 2021] How to catch and handle errors in bash

Mar 30, 2021 | www.xmodulo.com

Last updated on March 28, 2021 by Dan Nanni


In an ideal world, things always work as expected, but you know that's hardly the case. The same goes in the world of bash scripting. Writing a robust, bug-free bash script is always challenging even for a seasoned system administrator. Even if you write a perfect bash script, the script may still go awry due to external factors such as invalid input or network problems. While you cannot prevent all errors in your bash script, at least you should try to handle possible error conditions in a more predictable and controlled fashion.

That is easier said than done, especially since error handling in bash is notoriously difficult. The bash shell does not have any fancy exception swallowing mechanism like try/catch constructs. Some bash errors may be silently ignored but may have consequences down the line. The bash shell does not even have a proper debugger.

In this tutorial, I'll introduce basic tips to catch and handle errors in bash . Although the presented error handling techniques are not as fancy as those available in other programming languages, hopefully by adopting the practice, you may be able to handle potential bash errors more gracefully.

Bash Error Handling Tip #1: Check the Exit Status


As the first line of defense, it is always recommended to check the exit status of a command, as a non-zero exit status typically indicates some type of error. For example:

if ! some_command; then
    echo "some_command returned an error"
fi

Another (more compact) way to trigger error handling based on an exit status is to use an OR list:

<command1> || <command2>

With this OR statement, <command2> is executed if and only if <command1> returns a non-zero exit status. So you can replace <command2> with your own error handling routine. For example:

error_exit()
{
    echo "Error: $1"
    exit 1
}

run-some-bad-command || error_exit "Some error occurred"

Bash provides a built-in variable called $?, which tells you the exit status of the last executed command. Note that when a bash function is called, $? reads the exit status of the last command called inside the function. Since some non-zero exit codes have special meanings, you can handle them selectively. For example:

# run some command
status=$?
if [ $status -eq 1 ]; then
    echo "General error"
elif [ $status -eq 2 ]; then
    echo "Misuse of shell builtins"
elif [ $status -eq 126 ]; then
    echo "Command invoked cannot execute"
elif [ $status -eq 128 ]; then
    echo "Invalid argument"
fi
Bash Error Handling Tip #2: Exit on Errors in Bash


When you encounter an error in a bash script, by default it writes an error message to stderr but continues executing the rest of the script. In fact you see the same behavior in a terminal window: even if you type a wrong command by accident, it will not kill your terminal. You will just see the "command not found" error, but your terminal/bash session will still remain.

This default shell behavior may not be desirable for some bash scripts. For example, if your script contains a critical code block where no error is allowed, you want your script to exit immediately upon encountering any error inside that code block. To activate this "exit-on-error" behavior in bash, you can use the set command as follows.

set -e
#
# some critical code block where no error is allowed
#
set +e

Once called with the -e option, the set command causes the bash shell to exit immediately if any subsequent command exits with a non-zero status (caused by an error condition). The +e option turns the shell back to the default mode. set -e is equivalent to set -o errexit. Likewise, set +e is shorthand for set +o errexit.

However, one special error condition not captured by set -e is when an error occurs somewhere inside a pipeline of commands. This is because a pipeline returns a non-zero status only if the last command in the pipeline fails. Any error produced by previous command(s) in the pipeline is not visible outside the pipeline, and so does not kill a bash script. For example:

set -e
true | false | true   
echo "This will be printed"  # "false" inside the pipeline not detected

If you want any failure in pipelines to also exit a bash script, you need to add -o pipefail option. For example:

set -o pipefail -e
true | false | true          # "false" inside the pipeline detected correctly
echo "This will not be printed"

Therefore, to protect a critical code block against any type of command errors or pipeline errors, use the following pair of set commands.

set -o pipefail -e
#
# some critical code block where no error or pipeline error is allowed
#
set +o pipefail +e
Bash Error Handling Tip #3: Try and Catch Statements in Bash


Although the set command allows you to terminate a bash script upon any error that you deem critical, this mechanism is often not sufficient in more complex bash scripts where different types of errors could happen.

To be able to detect and handle different types of errors/exceptions more flexibly, you will need try/catch statements, which however are missing in bash. At least we can mimic the behaviors of try/catch as shown in this trycatch.sh script:

function try()
{
    [[ $- = *e* ]]; SAVED_OPT_E=$?
    set +e
}

function throw()
{
    exit $1
}

function catch()
{
    export exception_code=$?
    (( $SAVED_OPT_E )) && set +e
    return $exception_code
}

Here we define several custom bash functions to mimic the semantics of try and catch statements. The throw() function is supposed to raise a custom (non-zero) exception. We need set +e, so that the non-zero value returned by throw() will not terminate the bash script. Inside catch(), we store the value of the exception raised by throw() in the bash variable exception_code, so that we can handle the exception in a user-defined fashion.

Perhaps an example bash script will make it clear how trycatch.sh works. See the example below that utilizes trycatch.sh .

# Include trycatch.sh as a library
source ./trycatch.sh

# Define custom exception types
export ERR_BAD=100
export ERR_WORSE=101
export ERR_CRITICAL=102

try
(
    echo "Start of the try block"

    # When a command returns a non-zero, a custom exception is raised.
    run-command || throw $ERR_BAD
    run-command2 || throw $ERR_WORSE
    run-command3 || throw $ERR_CRITICAL

    # This statement is not reached if there is any exception raised
    # inside the try block.
    echo "End of the try block"
)
catch || {
    case $exception_code in
        $ERR_BAD)
            echo "This error is bad"
        ;;
        $ERR_WORSE)
            echo "This error is worse"
        ;;
        $ERR_CRITICAL)
            echo "This error is critical"
        ;;
        *)
            echo "Unknown error: $exit_code"
            throw $exit_code    # re-throw an unhandled exception
        ;;
    esac
}

In this example script, we define three types of custom exceptions. We can choose to raise any of these exceptions depending on a given error condition. The OR list <command> || throw <exception> allows us to invoke the throw() function with a chosen <exception> value as a parameter, if <command> returns a non-zero exit status. If <command> completes successfully, the throw() function is skipped. Once an exception is raised, it can be handled accordingly inside the subsequent catch block. As you can see, this provides a more flexible way of handling different types of error conditions.


Granted, this is not a full-blown try/catch construct. One limitation of this approach is that the try block is executed in a sub-shell. As you may know, any variables defined in a sub-shell are not visible to its parent shell. Also, you cannot modify the variables that are defined in the parent shell inside the try block, as the parent shell and the sub-shell have separate scopes for variables.

Conclusion

In this bash tutorial, I presented basic error handling tips that may come in handy when you want to write a more robust bash script. As expected, these tips are not as sophisticated as the error handling constructs available in other programming languages. If the bash script you are writing requires more advanced error handling than this, perhaps bash is not the right language for your task. You probably want to turn to other languages such as Python.

Let me conclude the tutorial by mentioning one essential tool that every shell script writer should be familiar with. ShellCheck is a static analysis tool for shell scripts. It can detect and point out syntax errors, bad coding practice and possible semantic issues in a shell script with much clarity. Definitely check it out if you haven't tried it.

If you find this tutorial helpful, I recommend you check out the series of bash shell scripting tutorials provided by Xmodulo.

[Mar 24, 2021] How to read data from text files by Roberto Nozaki

Mar 24, 2021 | www.redhat.com

The following is the script I use to test the servers:

#!/bin/bash

input_file=hosts.csv
output_file=hosts_tested.csv

echo "ServerName,IP,PING,DNS,SSH" > "$output_file"

tail -n +2 "$input_file" | while IFS=, read -r host ip _
do
    if ping -c 3 "$ip" > /dev/null; then
        ping_status="OK"
    else
        ping_status="FAIL"
    fi

    if nslookup "$host" > /dev/null; then
        dns_status="OK"
    else
        dns_status="FAIL"
    fi

    if nc -z -w3 "$ip" 22 > /dev/null; then
        ssh_status="OK"
    else
        ssh_status="FAIL"
    fi

    echo "Host = $host IP = $ip PING_STATUS = $ping_status DNS_STATUS = $dns_status SSH_STATUS = $ssh_status"
    echo "$host,$ip,$ping_status,$dns_status,$ssh_status" >> "$output_file"
done

[Mar 14, 2021] while loops in Bash

Mar 14, 2021 | www.redhat.com
while true
do
  df -k | grep home
  sleep 1
done

In this case, you're running the loop with a true condition, which means it will run forever or until you hit Ctrl-C. Therefore, you need to keep an eye on it (otherwise, it will keep consuming the system's resources).

Note : If you use a loop like this, you need to include a command like sleep to give the system some time to breathe between executions. Running anything non-stop could become a performance issue, especially if the commands inside the loop involve I/O operations.

2. Waiting for a condition to become true

There are variations of this scenario. For example, you know that at some point, the process will create a directory, and you are just waiting for that moment to perform other validations.

You can have a while loop to keep checking for that directory's existence and only write a message while the directory does not exist.

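A one-line version of such a wait loop could look like this (a minimal sketch; directory_expected is just a placeholder name):

while [ ! -d directory_expected ]; do sleep 1; done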

If you want to do something more elaborate, you could create a script and show a clearer indication that the loop condition became true:

#!/bin/bash

while [ ! -d directory_expected ]
do
   echo "`date` - Still waiting" 
   sleep 1
done

echo "DIRECTORY IS THERE!!!"
3. Using a while loop to manipulate a file

Another useful application of a while loop is to combine it with the read command to quickly access columns (or fields) from a text file and perform some actions on them.

In the following example, you are simply picking the columns from a text file with a predictable format and printing the values that you want to use to populate an /etc/hosts file.


Here the assumption is that the file has columns delimited by spaces or tabs and that there are no spaces in the content of the columns. Spaces in the content would shift the fields and not give you what you need.

Notice that you're just doing a simple operation to extract and manipulate information and are not concerned about the command's reusability. I would classify this as one of those "quick and dirty tricks."

Of course, if this were something you would do repeatedly, you should run it from a script, use proper names for the variables, and follow all those good practices (including turning the filename into an argument and defining where to send the output), but today the topic is while loops.

#!/bin/bash

cat servers.txt | grep -v CPU | while read servername cpu ram ip
do
   echo $ip $servername
done

[Nov 22, 2020] Read a file line by line

Jul 07, 2020 | www.redhat.com

Assume I have a file with a lot of IP addresses and want to operate on those IP addresses. For example, I want to run dig to retrieve reverse-DNS information for the IP addresses listed in the file. I also want to skip IP addresses that start with a comment (# or hashtag).

I'll use fileA as an example. Its contents are:

10.10.12.13  some ip in dc1
10.10.12.14  another ip in dc2
#10.10.12.15 not used IP
10.10.12.16  another IP

I could copy and paste each IP address, and then run dig manually:

$> dig +short -x 10.10.12.13

Or I could do this:

$> while read -r ip _; do [[ $ip == \#* ]] && continue; dig +short -x "$ip"; done < fileA

What if I want to swap the columns in fileA? For example, I want to put IP addresses in the right-most column so that fileA looks like this:

some ip in dc1 10.10.12.13
another ip in dc2 10.10.12.14
not used IP #10.10.12.15
another IP 10.10.12.16

I run:

$> while  read -r ip rest; do printf '%s %s\n' "$rest" "$ip"; done < fileA

[Jul 04, 2020] Learn Bash Debugging Techniques the Hard Way by Ian Miell

Highly recommended!
Notable quotes:
"... NOTE: If you are on a Mac, then you might only get second-level granularity on the date! ..."
Jul 04, 2020 | zwischenzugs.com

... ... ... Managing Variables

Variables are a core part of most serious bash scripts (and even one-liners!), so managing them is another important way to reduce the possibility of your script breaking.

Change your script to add the 'set' line immediately after the first line and see what happens:

#!/bin/bash
set -o nounset
A="some value"
echo "${A}"
echo "${B}"

...I always set nounset on my scripts as a habit. It can catch many problems before they become serious.

Tracing Variables

If you are working with a particularly complex script, then you can get to the point where you are unsure what happened to a variable.

Try running this script and see what happens:

#!/bin/bash 
set -o nounset 
declare A="some value" 
function a { 
  echo "${BASH_SOURCE}>A A=${A} LINENO:${1}" 
} 
trap "a $LINENO" DEBUG 
B=value 
echo "${A}" 
A="another value" 
echo "${A}" 
echo "${B}"

There's a problem with this code. The output is slightly wrong. Can you work out what is going on? If so, try and fix it.

You may need to refer to the bash man page, and make sure you understand quoting in bash properly.

It's quite a tricky one to fix 'properly', so if you can't fix it, or work out what's wrong with it, then ask me directly and I will help.

Profiling Bash Scripts

Returning to xtrace (the set -x flag), we can exploit its use of the PS4 variable to implement profiling of a script:

#!/bin/bash
set -o nounset
set -o xtrace
declare A="some value"
PS4='$(date "+%s%N => ")'
B=
echo "${A}"
A="another value"
echo "${A}"
echo "${B}"
ls
pwd
curl -q bbc.co.uk

From this you should be able to tell what PS4 does. Have a play with it, and read up and experiment with the other PS variables to get familiar with what they do.

NOTE: If you are on a Mac, then you might only get second-level granularity on the date!

Linting with Shellcheck

Finally, here is a very useful tip for understanding bash more deeply and improving any bash scripts you come across.

Shellcheck is a website and a package available on most platforms that gives you advice to help fix and improve your shell scripts. Very often, its advice has prompted me to research more deeply and understand bash better.

Here is some example output from a script I found on my laptop:

$ shellcheck shrinkpdf.sh
In shrinkpdf.sh line 44:
          -dColorImageResolution=$3             \
                                 ^-- SC2086: Double quote to prevent globbing and word splitting.
In shrinkpdf.sh line 46:
          -dGrayImageResolution=$3              \
                                ^-- SC2086: Double quote to prevent globbing and word splitting.
In shrinkpdf.sh line 48:
          -dMonoImageResolution=$3              \
                                ^-- SC2086: Double quote to prevent globbing and word splitting.
In shrinkpdf.sh line 57:
        if [ ! -f "$1" -o ! -f "$2" ]; then
                      ^-- SC2166: Prefer [ p ] || [ q ] as [ p -o q ] is not well defined.
In shrinkpdf.sh line 60:
        ISIZE="$(echo $(wc -c "$1") | cut -f1 -d\ )"
                      ^-- SC2046: Quote this to prevent word splitting.
                      ^-- SC2005: Useless echo? Instead of 'echo $(cmd)', just use 'cmd'.
In shrinkpdf.sh line 61:
        OSIZE="$(echo $(wc -c "$2") | cut -f1 -d\ )"
                      ^-- SC2046: Quote this to prevent word splitting.
                      ^-- SC2005: Useless echo? Instead of 'echo $(cmd)', just use 'cmd'.

The most common reminders are regarding potential quoting issues, but you can see other useful tips in the above output, such as preferred arguments to the test construct, and advice on "useless" uses of echo.

Exercise

1) Find a large bash script on a social coding site such as GitHub, and run shellcheck over it. Contribute back any improvements you find.


[Jul 02, 2020] Associative arrays in Bash by Seth Kenlon

Apr 02, 2020 | opensource.com

Originally from: Get started with Bash scripting for sysadmins - Opensource.com

Most shells offer the ability to create, manipulate, and query indexed arrays. In plain English, an indexed array is a list of things prefixed with a number. This list of things, along with their assigned number, is conveniently wrapped up in a single variable, which makes it easy to "carry" it around in your code.

Bash, however, includes the ability to create associative arrays and treats these arrays the same as any other array. An associative array lets you create lists of key and value pairs, instead of just numbered values.

The nice thing about associative arrays is that keys can be arbitrary:

$ declare -A userdata
$ userdata[name]=seth
$ userdata[pass]=8eab07eb620533b083f241ec4e6b9724
$ userdata[login]=`date --utc +%s`

Query any key:

$ echo " ${userdata[name]} "
seth
$ echo " ${userdata[login]} "
1583362192

Most of the usual array operations you'd expect from an array are available.
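For example, a minimal sketch of iterating over all keys (the ${!userdata[@]} expansion lists the keys of the array):

for key in "${!userdata[@]}"; do
    printf '%s=%s\n' "$key" "${userdata[$key]}"
done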


[Mar 05, 2020] Debug your shell scripts with bashdb by Ben Martin

Nov 24, 2008 | www.linux.com

Author: Ben Martin

The Bash Debugger Project (bashdb) lets you set breakpoints, inspect variables, perform a backtrace, and step through a bash script line by line. In other words, it provides the features you expect in a C/C++ debugger to anyone programming a bash script.

To see if your standard bash executable has bashdb support, execute the command shown below; if you are not taken to a bashdb prompt then you'll have to install bashdb yourself.

$ bash --debugger -c "set|grep -i dbg" ... bashdb

The Ubuntu Intrepid repository contains a package for bashdb, but there is no special bashdb package in the openSUSE 11 or Fedora 9 repositories. I built from source using version 4.0-0.1 of bashdb on a 64-bit Fedora 9 machine, using the normal ./configure; make; sudo make install commands.

You can start the Bash Debugger using the bash --debugger foo.sh syntax or the bashdb foo.sh command. The former method is recommended except in cases where I/O redirection might cause issues, and it's what I used. You can also use bashdb through ddd or from an Emacs buffer.

The syntax for many of the commands in bashdb mimics that of gdb, the GNU debugger. You can step into functions, use next to execute the next line without stepping into any functions, generate a backtrace with bt , exit bashdb with quit or Ctrl-D, and examine a variable with print $foo . Aside from the prefixing of the variable with $ at the end of the last sentence, there are some other minor differences that you'll notice. For instance, pressing Enter on a blank line in bashdb executes the previous step or next command instead of whatever the previous command was.

The print command forces you to prefix shell variables with the dollar sign ( $foo ). A slightly shorter way of inspecting variables and functions is to use the x foo command, which uses declare to print variables and functions.

Both bashdb and your script run inside the same bash shell. Because bash lacks some namespace properties, bashdb will include some functions and symbols into the global namespace which your script can get at. bashdb prefixes its symbols with _Dbg_ , so you should avoid that prefix in your scripts to avoid potential clashes. bashdb also uses some environment variables; it uses the DBG_ prefix for its own, and relies on some standard bash ones that begin with BASH_ .


To illustrate the use of bashdb, I'll work on the small bash script below, which expects a numeric argument n and calculates the nth Fibonacci number.

#!/bin/bash version="0.01"; fibonacci() { n=${1:?If you want the nth fibonacci number, you must supply n as the first parameter.} if [ $n -le 1 ]; then echo $n else l=`fibonacci $((n-1))` r=`fibonacci $((n-2))` echo $((l + r)) fi } for i in `seq 1 10` do result=$(fibonacci $i) echo "i=$i result=$result" done

The below session shows bashdb in action, stepping over and then into the fibonacci function and inspecting variables. I've made my input text bold for ease of reading. An initial backtrace ( bt ) shows that the script begins at line 3, which is where the version variable is written. The next and list commands then progress to the next line of the script a few times and show the context of the current execution line. After one of the next commands I press Enter to execute next again. I invoke the examine command through the single letter shortcut x . Notice that the variables are printed out using declare as opposed to their display on the next line using print . Finally I set a breakpoint at the start of the fibonacci function and continue the execution of the shell script. The fibonacci function is called and I move to the next line a few times and inspect a variable.

$ bash --debugger ./fibonacci.sh
...
(/home/ben/testing/bashdb/fibonacci.sh:3):
3:      version="0.01";
bashdb bt
->0 in file `./fibonacci.sh' at line 3
##1 main() called from file `./fibonacci.sh' at line 0
bashdb next
(/home/ben/testing/bashdb/fibonacci.sh:16):
16:     for i in `seq 1 10`
bashdb list
16:==>for i in `seq 1 10`
17:   do
18:     result=$(fibonacci $i)
19:     echo "i=$i result=$result"
20:   done
bashdb next
(/home/ben/testing/bashdb/fibonacci.sh:18):
18:     result=$(fibonacci $i)
bashdb
(/home/ben/testing/bashdb/fibonacci.sh:19):
19:     echo "i=$i result=$result"
bashdb x i result
declare -- i="1"
declare -- result=""
bashdb print $i $result
1
bashdb break fibonacci
Breakpoint 1 set in file /home/ben/testing/bashdb/fibonacci.sh, line 5.
bashdb continue
Breakpoint 1 hit (1 times).
(/home/ben/testing/bashdb/fibonacci.sh:5):
5:      fibonacci() {
bashdb next
(/home/ben/testing/bashdb/fibonacci.sh:6):
6:      n=${1:?If you want the nth fibonacci number, you must supply n as the first parameter.}
bashdb next
(/home/ben/testing/bashdb/fibonacci.sh:7):
7:      if [ $n -le 1 ]; then
bashdb x n
declare -- n="2"
bashdb quit

Notice that the numbers in the bashdb prompt toward the end of the above example are enclosed in parentheses. Each set of parentheses indicates that you have entered a subshell. In this example the subshell is created by the command substitution result=$(fibonacci $i) that calls the fibonacci function.

In the example below I use a watchpoint to see if and where the result variable changes. Notice the initial next command; I found that if I didn't issue that next, my watch would fail to work. As you can see, after I issue c to continue execution, execution is stopped whenever the result variable is about to change, and the old and new values are displayed.

(/home/ben/testing/bashdb/fibonacci.sh:3):
3: version="0.01";
bashdb<0> next
(/home/ben/testing/bashdb/fibonacci.sh:16):
16: for i in `seq 1 10`
bashdb<1> watch result
0: ($result)==0 arith: 0
bashdb<2> c
Watchpoint 0: $result changed:
  old value: ''
  new value: '1'
(/home/ben/testing/bashdb/fibonacci.sh:19):
19: echo "i=$i result=$result"
bashdb<3> c
i=1 result=1
i=2 result=1
Watchpoint 0: $result changed:
  old value: '1'
  new value: '2'
(/home/ben/testing/bashdb/fibonacci.sh:19):
19: echo "i=$i result=$result"

To get around the strange initial next requirement I used the watche command in the below session, which lets you stop whenever an expression becomes true. In this case I'm not overly interested in the first few Fibonacci numbers so I set a watch to have execution stop when the result is greater than 4. You can also use a watche command without a condition; for example, watche result would stop execution whenever the result variable changed.

$ bash --debugger ./fibonacci.sh
(/home/ben/testing/bashdb/fibonacci.sh:3):
3: version="0.01";
bashdb<0> watche result > 4
0: (result > 4)==0 arith: 1
bashdb<1> continue
i=1 result=1
i=2 result=1
i=3 result=2
i=4 result=3
Watchpoint 0: result > 4 changed:
  old value: '0'
  new value: '1'
(/home/ben/testing/bashdb/fibonacci.sh:19):
19: echo "i=$i result=$result"

When a shell script goes wrong, many folks use the time-tested method of incrementally adding in echo or printf statements to look for invalid values or code paths that are never reached. With bashdb, you can save yourself time by just adding a few watches on variables or setting a few breakpoints.

[Nov 28, 2019] Beginner shell scripting: Is there a shell script to rename a text file from its first line?

Sep 30, 2010 | www.reddit.com

r/commandline • Posted by u/acksed 6 years ago

I had to use file recovery software when I accidentally formatted my backup. It worked, but I now have 37,000 text files with numbers where names used to be.

If I name each file with the first 20-30 characters, I can sort the text-wheat from the bit-chaff.

I have the vague idea of using whatever the equivalent of head is on Windows, but that's as far as I got. I'm not so hot on bash scripting either.


tatumc 6 points · 6 years ago

To rename each file with the first line of the file, you can do:

for i in *; do mv "$i" "$(head -1 "$i")"; done

You can use cp instead of mv or make a backup of the dir first to be sure you don't accidentally nuke anything.

acksed 2 points · 6 years ago
· edited 6 years ago

This is almost exactly what I wanted. Thanks! A quick tweak:

for i in *; do mv "$i" "$(head -c 30 "$i")"; done

Now, I know CygWin is a thing, wonder if it'll work for me.

tatumc 1 point · 6 years ago

Just keep in mind that 'head -c' will include newlines which will garble the new file names.

acksed 1 point · 6 years ago
· edited 6 years ago

Answer: not really. The environment and script's working, but whenever there's a forward slash or non-escaping character in the text, it chokes when it tries to set up a new directory, and it deletes the file suffix. :-/ Good thing I used a copy of the data.

Need something to strip out the characters and spaces, and add the file suffix, before it tries to rename. sed ? Also needs file to identify it as true text. I can do the suffix at least:

for i in *; do mv "$i" "$(head -c 30 "$i").txt"; done
tatumc 1 point · 6 years ago

I recommend you use 'head -1', which will make the first line of the file the filename and you won't have to worry about newlines. Then you can change the spaces to underscores with:

for i in *; do mv -v "$i" "$(echo "$i" | tr ' ' '_')"; done
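
Putting the suggestions in this thread together, a rough sketch (untested; the safe-character whitelist and the 30-character cap are assumptions) that takes the first line, squashes anything that could break a filename, and adds the .txt suffix:

for i in *; do
    # first line only; replace unsafe characters; cap the length
    new=$(head -1 "$i" | tr -c 'A-Za-z0-9._-' '_' | cut -c1-30)
    mv -vn "$i" "${new}.txt"   # -n avoids clobbering when first lines collide
done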
yeayoushookme 1 point · 6 years ago
· edited 6 years ago

There's the file program on *nix that'll tell you, in a verbose manner, the type of the file you give it as an argument, regardless of its file extension. Example:

$ file test.mp3 
test.mp3: , 48 kHz, JntStereo
$ file mbr.bin
mbr.bin: data
$ file CalendarExport.ics
CalendarExport.ics: HTML document, UTF-8 Unicode text, with very long lines, with CRLF, LF line terminators
$ file jmk.doc
jmk.doc: Composite Document File V2 Document, Little Endian, Os: Windows, Version 6.0, Code page: 1250, Title: xx, Author: xx, Template: Normal, Last Saved By: xx, Revision Number: 4, Name of Creating Application: Microsoft Office Word, Total Editing Time: 2d+03:32:00, Last Printed: Fri Feb 22 11:29:00 2008, Create Time/Date: Fri Jan  4 12:57:00 2013, Last Saved Time/Date: Sun Jan  6 16:30:00 2013, Number of Pages: 6, Number of Words: 1711, Number of Characters: 11808, Security: 0
level 2
acksed 1 point · 6 years ago
· edited 6 years ago

Thank you, but the software I used to recover (R-Undelete) sorted them already. I found another program, RenameMaestro, that renames according to metadata in zip, rar, pdf, doc and other files, but text files are too basic.

Edit: You were right, I did need it.

RonaldoNazario 1 point · 6 years ago

Not command line, but you could probably do this pretty easily in python, using "glob" to get filenames, and os read and move/rename functions to get the text and change filenames.

pfp-disciple 1 point · 6 years ago

So far, you're not getting many windows command line ideas :(. I don't have any either, but here's an idea:

Use one of the live Linux distributions (Porteus is pretty cool, but there are a slew of others). In that Linux environment, you can mount your Windows hard drive and use Linux tools, maybe something like /u/tatumc suggested.

[Sep 07, 2019] How to Debug Bash Scripts by Mike Ward

Sep 05, 2019 | linuxconfig.org

05 September 2019

... ... ... How to use other Bash options

The Bash options for debugging are turned off by default, but once they are turned on by using the set command, they stay on until explicitly turned off. If you are not sure which options are enabled, you can examine the $- variable to see the current state of the option flags.

$ echo $-
himBHs
$ set -xv && echo $-
himvxBHs

There is another useful switch we can use to help us find variables referenced without having any value set. This is the -u switch, and just like -x and -v it can also be used on the command line, as we see in the following example:

Setting the u option at the command line

We mistakenly assigned a value of 7 to the variable called "level", then tried to echo a variable named "score", which simply resulted in printing nothing at all to the screen. Absolutely no debug information was given. Setting our -u switch allows us to see a specific error message, "score: unbound variable", that indicates exactly what went wrong.
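
Since the screenshot does not reproduce here, the interaction goes roughly like this (reconstructed, not the article's literal output):

$ level=7
$ echo $score

$ set -u
$ echo $score
bash: score: unbound variable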

We can use those options in short Bash scripts to give us debug information to identify problems that do not otherwise trigger feedback from the Bash interpreter. Let's walk through a couple of examples.

#!/bin/bash

read -p "Path to be added: " $path

if [ "$path" = "/home/mike/bin" ]; then
        echo $path >> $PATH
        echo "new path: $PATH"
else
        echo "did not modify PATH"
fi
Using the x option when running your Bash script

In the example above we run the addpath script normally and it simply does not modify our PATH. It does not give us any indication of why, or clues to mistakes made. Running it again using the -x option clearly shows us that the left side of our comparison is an empty string. $path is an empty string because we accidentally put a dollar sign in front of "path" in our read statement. Sometimes we look right at a mistake like this and it doesn't look wrong until we get a clue and think, "Why is $path evaluated to an empty string?"
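
As a rough illustration (the script name addpath.sh is assumed), the -x trace makes the empty left-hand side of the comparison visible:

$ bash -x ./addpath.sh
+ read -p 'Path to be added: '
Path to be added: /home/mike/bin
+ '[' '' = /home/mike/bin ']'
+ echo 'did not modify PATH'
did not modify PATH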

Looking at this next example, we also get no indication of an error from the interpreter. We only get one value printed per line instead of two. This is not an error that will halt execution of the script, so we're left to simply wonder without being given any clues. Using the -u switch, we immediately get a notification that our variable j is not bound to a value. So these are real time savers when we make mistakes that do not result in actual errors from the Bash interpreter's point of view.

#!/bin/bash

for i in 1 2 3
do
        echo $i $j
done
Using the u option when running your script from the command line

Now surely you are thinking that sounds fine, but we seldom need help debugging mistakes made in one-liners at the command line or in short scripts like these. We typically struggle with debugging when we deal with longer and more complicated scripts, and we rarely need to set these options and leave them set while we run multiple scripts. Setting -xv options and then running a more complex script will often add confusion by doubling or tripling the amount of output generated.

Fortunately we can use these options in a more precise way by placing them inside our scripts. Instead of explicitly invoking a Bash shell with an option from the command line, we can set an option by adding it to the shebang line instead.

#!/bin/bash -x

This will set the -x option for the entire file or until it is unset during the script execution, allowing you to simply run the script by typing the filename instead of passing it to Bash as a parameter. A long script or one that has a lot of output will still become unwieldy using this technique however, so let's look at a more specific way to use options.




For a more targeted approach, surround only the suspicious blocks of code with the options you want. This approach is great for scripts that generate menus or detailed output, and it is accomplished by using the set keyword with plus or minus once again.

#!/bin/bash

read -p "Path to be added: " $path

set -xv
if [ "$path" = "/home/mike/bin" ]; then
        echo $path >> $PATH
        echo "new path: $PATH"
else
        echo "did not modify PATH"
fi
set +xv
Wrapping options around a block of code in your script

We surrounded only the blocks of code we suspect in order to reduce the output, making our task easier in the process. Notice we turn on our options only for the code block containing our if-then-else statement, then turn off the option(s) at the end of the suspect block. We can turn these options on and off multiple times in a single script if we can't narrow down the suspicious areas, or if we want to evaluate the state of variables at various points as we progress through the script. There is no need to turn off an option if we want it to continue for the remainder of the script execution.

For completeness' sake, we should also mention that there are debuggers written by third parties that allow us to step through code execution line by line. You might want to investigate these tools, but most people find that they are not actually needed.

As seasoned programmers will suggest, if your code is too complex to isolate suspicious blocks with these options then the real problem is that the code should be refactored. Overly complex code means bugs can be difficult to detect and maintenance can be time consuming and costly.

One final thing to mention regarding Bash debugging options is that a file globbing option also exists and is set with -f . Setting this option will turn off globbing (expansion of wildcards to generate file names) while it is enabled. This -f option can be a switch used at the command line with bash, after the shebang in a file or, as in this example, to surround a block of code.

#!/bin/bash

echo "ignore fileglobbing option turned off"
ls *

echo "ignore file globbing option set"
set -f
ls *
set +f
Using the f option to turn off file globbing

How to use trap to help debug

There are more involved techniques worth considering if your scripts are complicated, including using an assert function as mentioned earlier. One such method to keep in mind is the use of trap. Shell scripts allow us to trap signals and do something at that point.

A simple but useful example you can use in your Bash scripts is to trap on EXIT .

#!/bin/bash

trap 'echo score is $score, status is $status' EXIT

if [ -z "$1" ]; then
        status="default"
else
        status=$1
fi

score=0
if [ ${USER} = 'superman' ]; then
        score=99
elif [ $# -gt 1 ]; then
        score=$2
fi
Using trap EXIT to help debug your script



As you can see, just dumping the current values of variables to the screen can be useful for showing where your logic is failing. The EXIT signal obviously does not need an explicit exit statement to be generated; in this case the echo statement is executed when the end of the script is reached.

Another useful trap to use with Bash scripts is DEBUG. It fires before every simple command, so it can be used as a brute-force way to show the values of variables at each step in the script execution.

#!/bin/bash

trap 'echo "line ${LINENO}: score is $score"' DEBUG

score=0

if [ "${USER}" = "mike" ]; then
        let "score += 1"
fi

let "score += 1"

if [ "" = "7" ]; then
        score=7
fi
exit 0
Using trap DEBUG to help debug your script

Conclusion

When you notice your Bash script not behaving as expected and the reason is not clear to you, consider what information would be useful to help you identify the cause, then use the most comfortable tools available to help you pinpoint the issue. The xtrace option -x is easy to use and probably the most useful of the options presented here, so consider trying it out next time you're faced with a script that's not doing what you thought it would.

[Sep 06, 2019] Using Case Insensitive Matches with Bash Case Statements by Steven Vona

Jun 30, 2019 | www.putorius.net

If you want to match the pattern regardless of its case (capital or lowercase letters), you can set the nocasematch shell option with the shopt builtin. You can do this as the first line of your script. Since the script runs in its own shell, it won't affect your normal environment.

#!/bin/bash
 shopt -s nocasematch
 read -p "Name a Star Trek character: " CHAR
 case $CHAR in
   "Seven of Nine" | Neelix | Chokotay | Tuvok | Janeway )
       echo "$CHAR was in Star Trek Voyager"
       ;;&
   Archer | Phlox | Tpol | Tucker )
       echo "$CHAR was in Star Trek Enterprise"
       ;;&
   Odo | Sisko | Dax | Worf | Quark )
       echo "$CHAR was in Star Trek Deep Space Nine"
       ;;&
   Worf | Data | Riker | Picard )
       echo "$CHAR was in Star Trek The Next Generation" &&  echo "/etc/redhat-release"
       ;;
   *) echo "$CHAR is not in this script." 
       ;;
 esac
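
A sample run might look like this (the script name is hypothetical). Thanks to nocasematch a lowercase answer still matches the capitalized patterns, and the ;;& operator lets Worf match two shows:

$ ./trek.sh
Name a Star Trek character: worf
worf was in Star Trek Deep Space Nine
worf was in Star Trek The Next Generation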

[Sep 02, 2019] Switch statement for bash script

Sep 02, 2019 | www.linuxquestions.org
Switch statement for bash script
Hello, I am currently trying out the switch statement using a bash script.

CODE:
showmenu () {
echo "1. Number1"
echo "2. Number2"
echo "3. Number3"
echo "4. All"
echo "5. Quit"
}

while true
do
showmenu
read choice
echo "Enter a choice:"
case "$choice" in
"1")
echo "Number One"
;;
"2")
echo "Number Two"
;;
"3")
echo "Number Three"
;;
"4")
echo "Number One, Two, Three"
;;
"5")
echo "Program Exited"
exit 0
;;
*)
echo "Please enter number ONLY ranging from 1-5!"
;;
esac
done

OUTPUT:
1. Number1
2. Number2
3. Number3
4. All
5. Quit
Enter a choice:

So, when the code is run, a menu with options 1-5 will be shown, then the user will be asked to enter a choice and finally an output is shown. But is it possible for the user to enter multiple choices? For example, if the user enters choices "1" and "3", the output should be "Number One" and "Number Three". Any idea?

Just something to get you started.

Code:

#! /bin/bash
showmenu ()
{
    typeset ii
    typeset -i jj=1
    typeset -i kk
    typeset -i valid=0  # valid=1 if input is good

    while (( ! valid ))
    do
        for ii in "${options[@]}"
        do
            echo "$jj) $ii"
            let jj++
        done
        read -e -p 'Select a list of actions : ' -a answer
        jj=0
        valid=1
        for kk in "${answer[@]}"
        do
            if (( kk < 1 || kk > "${#options[@]}" ))
            then
                echo "Error Item $jj is out of bounds" 1>&2
                valid=0
                break
            fi
            let jj++
        done
    done
}

typeset -r c1=Number1
typeset -r c2=Number2
typeset -r c3=Number3
typeset -r c4=All
typeset -r c5=Quit
typeset -ra options=($c1 $c2 $c3 $c4 $c5)
typeset -a answer
typeset -i kk
while true
do
    showmenu
    for kk in "${answer[@]}"
    do
        case $kk in
        1)
            echo 'Number One'
            ;;
        2)
            echo 'Number Two'
            ;;
        3)
            echo 'Number Three'
            ;;
        4)
            echo 'Number One, Two, Three'
            ;;
        5)
            echo 'Program Exit'
            exit 0
            ;;
        esac
    done 
done
wjs1990 (Original Poster):
Ok will try it out first. Thanks.
evo2:
This can be done just by wrapping your case block in a for loop and changing one line.

Code:

#!/bin/bash
showmenu () {
    echo "1. Number1"
    echo "2. Number2"
    echo "3. Number3"
    echo "4. All"
    echo "5. Quit"
}

while true ; do
    showmenu
    read choices
    for choice in $choices ; do
        case "$choice" in
            1)
                echo "Number One" ;;
            2)
                echo "Number Two" ;;
            3)
                echo "Number Three" ;;
            4)
                echo "Numbers One, two, three" ;;
            5)
                echo "Exit"
                exit 0 ;;
            *)
                echo "Please enter number ONLY ranging from 1-5!"
                ;;
        esac
    done
done
You can now enter any number of numbers separated by white space.

Cheers,

EVo2.

[Aug 28, 2019] Echo Command in Linux with Examples

Notable quotes:
"... The -e parameter is used for the interpretation of backslashes ..."
"... The -n option is used for omitting trailing newline. ..."
Aug 28, 2019 | linoxide.com

The -e parameter is used for the interpretation of backslashes

... ... ...

To create a new line after each word in a string, use the -e option with the \n escape sequence, as shown:
$ echo -e "Linux \nis \nan \nopensource \noperating \nsystem"

... ... ...

Omit echoing trailing newline

The -n option is used for omitting trailing newline. This is shown in the example below

$ echo -n "Linux is an opensource operating system"

Sample Output

Linux is an opensource operating systemjames@buster:/$

[Aug 27, 2019] Bash Variables - Bash Reference Manual

Aug 27, 2019 | bash.cyberciti.biz

BASH_LINENO

An array variable whose members are the line numbers in source files corresponding to each member of FUNCNAME . ${BASH_LINENO[$i]} is the line number in the source file where ${FUNCNAME[$i]} was called. The corresponding source file name is ${BASH_SOURCE[$i]} . Use LINENO to obtain the current line number.
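
A minimal sketch of how these variables fit together (the file name and function are hypothetical):

#!/bin/bash
# trace.sh - report where the current function was called from
whereami() {
    echo "${BASH_SOURCE[1]}:${BASH_LINENO[0]}: ${FUNCNAME[0]} called from ${FUNCNAME[1]}"
}

whereami   # line 7, so this prints: ./trace.sh:7: whereami called from main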

[Aug 27, 2019] linux - How to show line number when executing bash script

Aug 27, 2019 | stackoverflow.com

How to show line number when executing bash script


dspjm, Jul 23, 2013 at 7:31

I have a test script which has a lot of commands and will generate lots of output. I use set -x or set -v and set -e, so the script stops when an error occurs. However, it's still rather difficult for me to locate at which line the execution stopped in order to find the problem. Is there a method which can output the line number of the script before each line is executed? Or output the line number before the command output generated by set -x? Any method which can deal with my script line location problem would be a great help. Thanks.

Suvarna Pattayil, Jul 28, 2017 at 17:25

You mention that you're already using -x. The variable PS4 holds the prompt that is printed before each command line is echoed when the -x option is set; it defaults to : followed by a space.

You can change PS4 to emit the LINENO (The line number in the script or shell function currently executing).

For example, if your script reads:

$ cat script
foo=10
echo ${foo}
echo $((2 + 2))

Executing it thus would print line numbers:

$ PS4='Line ${LINENO}: ' bash -x script
Line 1: foo=10
Line 2: echo 10
10
Line 3: echo 4
4

http://wiki.bash-hackers.org/scripting/debuggingtips gives the ultimate PS4 that would output everything you will possibly need for tracing:

export PS4='+(${BASH_SOURCE}:${LINENO}): ${FUNCNAME[0]:+${FUNCNAME[0]}(): }'

Deqing, Jul 23, 2013 at 8:16

In Bash, $LINENO contains the line number where the script is currently executing.

If you need to know the line number where the function was called, try $BASH_LINENO . Note that this variable is an array.

For example:

#!/bin/bash       

function log() {
    echo "LINENO: ${LINENO}"
    echo "BASH_LINENO: ${BASH_LINENO[*]}"
}

function foo() {
    log "$@"
}

foo "$@"

See here for details of Bash variables.

Eliran Malka, Apr 25, 2017 at 10:14

Simple (but powerful) solution: Place echo statements around the code you think causes the problem and move the echo line by line until the message no longer appears on screen, because the script has stopped due to an error before reaching it.

Even more powerful solution: Install bashdb, the bash debugger, and debug the script line by line.

kklepper, Apr 2, 2018 at 22:44

Workaround for shells without LINENO

In a fairly sophisticated script I wouldn't like to see all line numbers; rather I would like to be in control of the output.

Define a function

echo_line_no () {
    grep -n "$1" $0 |  sed "s/echo_line_no//" 
    # grep the line(s) containing input $1 with line numbers
    # replace the function name with nothing 
} # echo_line_no

Use it with quotes like

echo_line_no "this is a simple comment with a line number"

Output is

16   "this is a simple comment with a line number"

if the number of this line in the source file is 16.

This basically answers the question How to show line number when executing bash script for users of ash or other shells without LINENO .

Anything more to add?

Sure. Why do you need this? How do you work with this? What can you do with this? Is this simple approach really sufficient or useful? Why do you want to tinker with this at all?

Want to know more? Read reflections on debugging

[Oct 17, 2018] How to use arrays in bash script - LinuxConfig.org

Oct 17, 2018 | linuxconfig.org

Create indexed arrays on the fly

We can create indexed arrays with a more concise syntax, by simply assigning them some values:

$ my_array=(foo bar)
In this case we assigned multiple items at once to the array, but we can also insert one value at a time, specifying its index:
$ my_array[0]=foo
Array operations

Once an array is created, we can perform some useful operations on it, like displaying its keys and values or modifying it by appending or removing elements.

Print the values of an array

To display all the values of an array we can use the following shell expansion syntax:
${my_array[@]}
Or even:
${my_array[*]}
Both syntaxes let us access all the values of the array and produce the same results, unless the expansion is quoted. In that case a difference arises: in the first case, when using @ , the expansion will result in one word for each element of the array. This becomes immediately clear when performing a for loop . As an example, imagine we have an array with two elements, "foo" and "bar":
$ my_array=(foo bar)
Performing a for loop on it will produce the following result:
$ for i in "${my_array[@]}"; do echo "$i"; done
foo
bar
When using * , and the variable is quoted, a single "result" will instead be produced, containing all the elements of the array:
$ for i in "${my_array[*]}"; do echo "$i"; done
foo bar



Print the keys of an array

It's even possible to retrieve and print the keys used in an indexed or associative array, instead of their respective values. The syntax is almost identical, but relies on the use of the ! operator:
$ my_array=(foo bar baz)
$ for index in "${!my_array[@]}"; do echo "$index"; done
0
1
2
The same is valid for associative arrays:
$ declare -A my_array
$ my_array=([foo]=bar [baz]=foobar)
$ for key in "${!my_array[@]}"; do echo "$key"; done
baz
foo
As you can see, since the latter is an associative array, we can't count on the retrieved values being returned in the same order in which they were declared.

Getting the size of an array

We can retrieve the size of an array (the number of elements contained in it) by using a specific shell expansion:
$ my_array=(foo bar baz)
$ echo "the array contains ${#my_array[@]} elements"
the array contains 3 elements
We have created an array which contains three elements, "foo", "bar" and "baz"; then by using the syntax above, which differs from the one we saw before only by the # character before the array name, we retrieved the number of elements in the array instead of its content.

Adding elements to an array

As we saw, we can add elements to an indexed or associative array by specifying respectively their index or associative key. In the case of indexed arrays, we can also simply add an element by appending it to the end of the array, using the += operator:
$ my_array=(foo bar)
$ my_array+=(baz)
If we now print the content of the array we see that the element has been added successfully:
$ echo "${my_array[@]}"
foo bar baz
Multiple elements can be added at a time:
$ my_array=(foo bar)
$ my_array+=(baz foobar)
$ echo "${my_array[@]}"
foo bar baz foobar
To add elements to an associative array, we must also specify their associated keys:
$ declare -A my_array

# Add single element
$ my_array[foo]="bar"

# Add multiple elements at a time
$ my_array+=([baz]=foobar [foobarbaz]=baz)



Deleting an element from the array

To delete an element from the array we need to know its index, or its key in the case of an associative array, and use the unset command. Let's see an example:
$ my_array=(foo bar baz)
$ unset my_array[1]
$ echo ${my_array[@]}
foo baz
We have created a simple array containing three elements, "foo", "bar" and "baz", then we deleted "bar" from it by running unset and referencing the index of "bar" in the array: in this case we know it was 1, since bash arrays start at 0. If we check the indexes of the array, we can now see that 1 is missing:
$ echo ${!my_array[@]}
0 2
The same is valid for associative arrays:
$ declare -A my_array
$ my_array+=([foo]=bar [baz]=foobar)
$ unset my_array[foo]
$ echo ${my_array[@]}
foobar
In the example above, the value referenced by the "foo" key has been deleted, leaving only "foobar" in the array.

Deleting an entire array is even simpler: we just pass the array name as an argument to the unset command, without specifying any index or key:

$ unset my_array
$ echo ${!my_array[@]}

After executing unset against the entire array, when we try to print its content an empty result is returned: the array doesn't exist anymore.

Conclusions

In this tutorial we saw the difference between indexed and associative arrays in bash, how to initialize them, and how to perform fundamental operations, like displaying their keys and values and appending or removing items. Finally we saw how to unset them completely. Bash syntax can sometimes be pretty weird, but using arrays in scripts can be really useful. When a script starts to become more complex than expected, my advice is, however, to switch to a more capable scripting language such as Python.

[Jun 01, 2018] Introduction to Bash arrays by Robert Aboukhalil

Jun 01, 2018 | opensource.com

... ... ...

Looping through arrays

Although in the examples above we used integer indices in our arrays, let's consider two occasions when that won't be the case: First, if we wanted the $i -th element of the array, where $i is a variable containing the index of interest, we can retrieve that element using: echo ${allThreads[$i]} . Second, to output all the elements of an array, we replace the numeric index with the @ symbol (you can think of @ as standing for all ): echo ${allThreads[@]} .

Looping through array elements

With that in mind, let's loop through $allThreads and launch the pipeline for each value of --threads :

for t in ${allThreads[@]}; do
  ./pipeline --threads $t
done

Looping through array indices

Next, let's consider a slightly different approach. Rather than looping over array elements , we can loop over array indices :

for i in ${!allThreads[@]}; do
  ./pipeline --threads ${allThreads[$i]}
done

Let's break that down: As we saw above, ${allThreads[@]} represents all the elements in our array. Adding an exclamation mark to make it ${!allThreads[@]} will return the list of all array indices (in our case 0 to 7). In other words, the for loop is looping through all indices $i and reading the $i -th element from $allThreads to set the value of the --threads parameter.

This is much harsher on the eyes, so you may be wondering why I bother introducing it in the first place. That's because there are times where you need to know both the index and the value within a loop, e.g., if you want to ignore the first element of an array, using indices saves you from creating an additional variable that you then increment inside the loop.
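
For instance, here is a sketch of that trick, skipping the first element without an extra counter variable:

# skip the first element: the index gives us the position for free
for i in ${!allThreads[@]}; do
  (( i == 0 )) && continue
  ./pipeline --threads ${allThreads[$i]}
done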

Populating arrays

So far, we've been able to launch the pipeline for each --threads of interest. Now, let's assume the output to our pipeline is the runtime in seconds. We would like to capture that output at each iteration and save it in another array so we can do various manipulations with it at the end.

Some useful syntax

But before diving into the code, we need to introduce some more syntax. First, we need to be able to retrieve the output of a Bash command. To do so, use the following syntax: output=$( ./my_script.sh ) , which will store the output of our commands into the variable $output .

The second bit of syntax we need is how to append the value we just retrieved to an array. The syntax to do that will look familiar:

myArray+=( "newElement1" "newElement2" )
The parameter sweep

Putting everything together, here is our script for launching our parameter sweep:

allThreads=(1 2 4 8 16 32 64 128)
allRuntimes=()
for t in ${allThreads[@]}; do
  runtime=$(./pipeline --threads $t)
  allRuntimes+=($runtime)
done

And voilà!

What else you got?

In this article, we covered the scenario of using arrays for parameter sweeps. But I promise there are more reasons to use Bash arrays -- here are two more examples.

Log alerting

In this scenario, your app is divided into modules, each with its own log file. We can write a cron job script to email the right person when there are signs of trouble in certain modules:

# List of logs and who should be notified of issues
logPaths=("api.log" "auth.log" "jenkins.log" "data.log")
logEmails=("jay@email" "emma@email" "jon@email" "sophia@email")

# Look for signs of trouble in each log
for i in ${!logPaths[@]}; do
  log=${logPaths[$i]}
  stakeholder=${logEmails[$i]}
  numErrors=$(tail -n 100 "$log" | grep "ERROR" | wc -l)

  # Warn stakeholders if recently saw > 5 errors
  if [[ "$numErrors" -gt 5 ]]; then
    emailRecipient="$stakeholder"
    emailSubject="WARNING: ${log} showing unusual levels of errors"
    emailBody="${numErrors} errors found in log ${log}"
    echo "$emailBody" | mailx -s "$emailSubject" "$emailRecipient"
  fi
done

API queries

Say you want to generate some analytics about which users comment the most on your Medium posts. Since we don't have direct database access, SQL is out of the question, but we can use APIs!

To avoid getting into a long discussion about API authentication and tokens, we'll instead use JSONPlaceholder , a public-facing API testing service, as our endpoint. Once we query each post and retrieve the emails of everyone who commented, we can append those emails to our results array:

endpoint = "https://jsonplaceholder.typicode.com/comments"
allEmails = ()

# Query first 10 posts
for postId in { 1 .. 10 } ;
do
# Make API call to fetch emails of this posts's commenters
response =$ ( curl " ${endpoint} ?postId= ${postId} " )

# Use jq to parse the JSON response into an array
allEmails+= ( $ ( jq '.[].email' <<< " $response " ) )
done

Note here that I'm using the jq tool to parse JSON from the command line. The syntax of jq is beyond the scope of this article, but I highly recommend you look into it.

As you might imagine, there are countless other scenarios in which using Bash arrays can help, and I hope the examples outlined in this article have given you some food for thought. If you have other examples to share from your own work, please leave a comment below.

But wait, there's more!

Since we covered quite a bit of array syntax in this article, here's a summary of what we covered, along with some more advanced tricks we did not cover:

Syntax               Result
arr=()               Create an empty array
arr=(1 2 3)          Initialize array
${arr[2]}            Retrieve third element
${arr[@]}            Retrieve all elements
${!arr[@]}           Retrieve array indices
${#arr[@]}           Calculate array size
arr[0]=3             Overwrite 1st element
arr+=(4)             Append value(s)
str=$(ls)            Save ls output as a string
arr=( $(ls) )        Save ls output as an array of files
${arr[@]:s:n}        Retrieve n elements starting at index s
One last thought

As we've discovered, Bash arrays sure have strange syntax, but I hope this article convinced you that they are extremely powerful. Once you get the hang of the syntax, you'll find yourself using Bash arrays quite often.

... ... ...

Robert Aboukhalil is a Bioinformatics Software Engineer. In his work, he develops cloud applications for the analysis and interactive visualization of genomics data. Robert holds a Ph.D. in Bioinformatics from Cold Spring Harbor Laboratory and a B.Eng. in Computer Engineering from McGill.

[Apr 26, 2018] Bash Range How to iterate over sequences generated on the shell Linux Hint by Fahmida Yesmin

Notable quotes:
"... When only upper limit is used then the number will start from 1 and increment by one in each step. ..."
Apr 26, 2018 | linuxhint.com

Bash Range: How to iterate over sequences generated on the shell

You can iterate over a sequence of numbers in bash in two ways. One is by using the seq command, and the other is by specifying a range in a for loop. With the seq command, by default the sequence starts from one, the number increments by one in each step, and each number is printed on its own line, up to the upper limit. If the sequence starts from the upper limit then it decrements by one in each step. Normally all numbers are interpreted as floating point, but if the sequence starts from an integer then a list of decimal integers will be printed. If the seq command executes successfully it returns 0; otherwise it returns a non-zero number. You can also iterate over a sequence of numbers using a for loop with a range. Both the seq command and the for loop with a range are shown in this tutorial using examples.

The options of seq command:

You can use seq command by using the following options.

  • -w This option is used to pad the numbers with leading zeros to print all numbers with equal width.
  • -f format This option is used to print number with particular format. Floating number can be formatted by using %f, %g and %e as conversion characters. %g is used as default.
  • -s string This option is used to separate the numbers with string. The default value is newline ('\n').
Examples of seq command:

You can apply the seq command in three ways: with only the upper limit, with the upper and lower limits, or with the upper and lower limits plus an increment or decrement value for each step. Different uses of the seq command with options are shown in the following examples.

Example-1: seq command without option

When only the upper limit is used, the numbers start from 1 and increment by one in each step. The following command will print the numbers from 1 to 4.

$ seq 4

When two values are used with the seq command, the first value is used as the starting number and the second value as the ending number. The following command will print the numbers from 7 to 15.

$ seq 7 15

When you use three values with the seq command, the second value is used as the increment or decrement value for each step. For the following command, the starting number is 10, the ending number is 1, and each step decrements by 2.

$ seq 10 -2 1
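
The output counts down by two:

10
8
6
4
2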
Example-2: seq with –w option

The following command will print the numbers from 1 to 10, adding a leading zero to the single-digit numbers so that all numbers have equal width.

$ seq -w 10
Example-3: seq with –s option

The following command uses "-" as the separator for each sequence number; the numbers will be printed with "-" between them.

$ seq -s - 8

Example-4: seq with -f option

The following command will print 10 date values starting from 1. Here, the "%g" conversion is used to combine the sequence number with the other string value.

$ seq -f "%g/04/2018" 10

The following command generates a sequence of floating point numbers using "%f". Here, the numbers start from 3 and increment by 0.8 in each step, and the last number is less than or equal to 6.

$ seq -f "%f" 3 0.8 6

Example-5: Write the sequence in a file

If you want to save the sequence of numbers into a file without printing them to the console then you can use the following commands. The first command will print the numbers to a file named "seq.txt". The numbers will be generated from 5 to 20, incrementing by 10 in each step. The second command is used to view the content of the "seq.txt" file.

seq 5 10 20 | cat > seq.txt
cat seq.txt

Example-6: Using seq in for loop

Suppose you want to create files named fn1 to fn10 using a for loop with seq. Create a file named "sq1.bash" and add the following code. The for loop will iterate 10 times using the seq command and create 10 files named fn1, fn2, fn3 ... fn10.

#!/bin/bash
for i in `seq 10`; do
    touch fn$i
done

Run the following commands to execute the code of the bash file and check whether the files were created.

bash sq1.bash
ls

Examples of for loop with range:

Example-7: For loop with range

The alternative to the seq command is a range. You can use a range in a for loop to generate a sequence of numbers just like seq. Write the following code in a bash file named "sq2.bash". The loop will iterate 5 times and print the square of each number in each step.

#!/bin/bash
for n in {1..5}; do
    ((result = n * n))
    echo "$n square = $result"
done

Run the command to execute the script of the file.

bash sq2.bash

Example-8: For loop with range and increment value

By default, the number increments by one in each step in a range, just like seq. You can also change the increment value in a range. Write the following code in a bash file named "sq3.bash". The for loop in the script will iterate 5 times, each step incrementing by 2, printing all odd numbers between 1 and 10.

#!/bin/bash
echo "all odd numbers from 1 to 10 are"
for i in {1..10..2}; do echo $i; done

Run the command to execute the script of the file.

bash sq3.bash

If you want to work with sequences of numbers then you can use any of the options shown in this tutorial. After completing this tutorial, you will be able to use the seq command and the for loop with a range more efficiently in your bash scripts.

[Dec 09, 2017] linux - What does the line '#!/bin/sh -e' do

Dec 09, 2017 | stackoverflow.com


That line defines what program will execute the given script. For sh, that line should normally start with the #! characters, like so:
#!/bin/sh -e

The -e flag's long name is errexit , causing the script to immediately exit on the first error.
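
A minimal sketch of errexit in action (the failing command is just an illustration):

#!/bin/sh -e
echo "before the failure"
false   # returns a non-zero status, so the script exits here
echo "never printed"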

[Oct 25, 2017] How to modify scripts behavior on signals using bash traps - LinuxConfig.org

Oct 25, 2017 | linuxconfig.org

Trap syntax is very simple and easy to understand: first we must call the trap builtin, followed by the action(s) to be executed, then we must specify the signal(s) we want to react to:

trap [-lp] [[arg] sigspec]
Let's see what the possible trap options are.

When used with the -l flag, the trap command will just display a list of signals associated with their numbers. It's the same output you can obtain running the kill -l command:

$ trap -l
1) SIGHUP        2) SIGINT       3) SIGQUIT      4) SIGILL       5) SIGTRAP
6) SIGABRT       7) SIGBUS       8) SIGFPE       9) SIGKILL     10) SIGUSR1
11) SIGSEGV     12) SIGUSR2     13) SIGPIPE     14) SIGALRM     15) SIGTERM
16) SIGSTKFLT   17) SIGCHLD     18) SIGCONT     19) SIGSTOP     20) SIGTSTP
21) SIGTTIN     22) SIGTTOU     23) SIGURG      24) SIGXCPU     25) SIGXFSZ
26) SIGVTALRM   27) SIGPROF     28) SIGWINCH    29) SIGIO       30) SIGPWR
31) SIGSYS      34) SIGRTMIN    35) SIGRTMIN+1  36) SIGRTMIN+2  37) SIGRTMIN+3
38) SIGRTMIN+4  39) SIGRTMIN+5  40) SIGRTMIN+6  41) SIGRTMIN+7  42) SIGRTMIN+8
43) SIGRTMIN+9  44) SIGRTMIN+10 45) SIGRTMIN+11 46) SIGRTMIN+12 47) SIGRTMIN+13
48) SIGRTMIN+14 49) SIGRTMIN+15 50) SIGRTMAX-14 51) SIGRTMAX-13 52) SIGRTMAX-12
53) SIGRTMAX-11 54) SIGRTMAX-10 55) SIGRTMAX-9  56) SIGRTMAX-8  57) SIGRTMAX-7
58) SIGRTMAX-6  59) SIGRTMAX-5  60) SIGRTMAX-4  61) SIGRTMAX-3  62) SIGRTMAX-2
63) SIGRTMAX-1  64) SIGRTMAX
It's really important to note that it's possible to react only to signals that allow the script to respond: the SIGKILL and SIGSTOP signals cannot be caught, blocked or ignored.

Apart from signals, traps can also react to some pseudo-signal such as EXIT, ERR or DEBUG, but we will see them in detail later. For now just remember that a signal can be specified either by its number or by its name, even without the SIG prefix.

Now about the -p option. This option makes sense only when a command is not provided (otherwise it will produce an error). When trap is used with it, a list of the previously set traps will be displayed. If the signal name or number is specified, only the trap set for that specific signal will be displayed, otherwise no distinctions will be made, and all the traps will be displayed:

$ trap 'echo "SIGINT caught!"' SIGINT
We set a trap to catch the SIGINT signal: it will just display the "SIGINT caught" message onscreen when given signal will be received by the shell. If we now use trap with the -p option, it will display the trap we just defined:
$ trap -p
trap -- 'echo "SIGINT caught!"' SIGINT
By the way, the trap is now "active", so if we send a SIGINT signal, either using the kill command, or with the CTRL-c shortcut, the associated command in the trap will be executed (^C is just printed because of the key combination):
^CSIGINT caught!
Trap in action

We will now write a simple script to show trap in action; here it is:
#!/usr/bin/env bash
#
# A simple script to demonstrate how trap works
#
set -e
set -u
set -o pipefail

trap 'echo "signal caught, cleaning..."; rm -i linux_tarball.tar.xz' SIGINT SIGTERM

echo "Downloading tarball..."
wget -O linux_tarball.tar.xz https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.13.5.tar.xz &> /dev/null
The above script just tries to download the latest linux kernel tarball, using wget, into the directory from which it is launched. During the task, if the SIGINT or SIGTERM signals are received (notice how you can specify more than one signal on the same line), the partially downloaded file will be deleted.

In this case there are actually two commands: the first is echo, which prints the message onscreen, and the second is the actual rm command (we provided the -i option to it, so it will ask for user confirmation before removing); they are separated by a semicolon. Instead of specifying commands this way, you can also call functions: this gives you more re-usability. Notice that if you don't provide any command, the signal(s) will just be ignored!

This is the output of the script above when it receives a SIGINT signal:

$ ./fetchlinux.sh
Downloading tarball...
^Csignal caught, cleaning...
rm: remove regular file 'linux_tarball.tar.xz'?
A very important thing to remember is that when a script is terminated by a signal, like above, its exit status will be 128 + the signal number. As you can see, the script above, being terminated by a SIGINT, has an exit status of 130:
$ echo $?
130
Lastly, you can disable a trap just by calling trap followed by the - sign, followed by the signal(s) name or number:
trap - SIGINT SIGTERM
The signals will take back the value they had upon entrance to the shell.

Pseudo-signals

As already mentioned above, trap can be set not only for signals which allow the script to respond but also for what we can call "pseudo-signals". They are not technically signals, but they correspond to certain situations that can be specified:

EXIT - When EXIT is specified in a trap, the command of the trap will be executed on exit from the shell.

ERR - This will cause the argument of the trap to be executed when a command returns a non-zero exit status, with some exceptions (the same as for the shell errexit option): the command must not be part of a while or until loop; it must not be part of an if construct, nor part of a && or || list, and its value must not be inverted by using the ! operator.

DEBUG - This will cause the argument of the trap to be executed before every simple command, for, case or select command, and before the first command in shell functions.

RETURN - The argument of the trap is executed after a function, or after a script sourced by using source or the . command.
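
As a minimal sketch of the ERR pseudo-signal described above (the failing command is arbitrary):

#!/usr/bin/env bash

trap 'echo "command failed with status $? near line $LINENO"' ERR

ls /nonexistent_directory   # non-zero exit status fires the ERR trap
echo "execution continues after the trap runs"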

[Sep 01, 2017] linux - Looping through the content of a file in Bash - Stack Overflow

Notable quotes:
"... done <<< "$(...)" ..."
Sep 01, 2017 | stackoverflow.com

Peter Mortensen, asked Oct 5 '09 at 17:52

How do I iterate through each line of a text file with Bash?

With this script

echo "Start!"
for p in (peptides.txt)
do
    echo "${p}"
done

I get this output on the screen:

Start!
./runPep.sh: line 3: syntax error near unexpected token `('
./runPep.sh: line 3: `for p in (peptides.txt)'

(Later I want to do something more complicated with $p than just output to the screen.)


The environment variable SHELL is (from env):

SHELL=/bin/bash

/bin/bash --version output:

GNU bash, version 3.1.17(1)-release (x86_64-suse-linux-gnu)
Copyright (C) 2005 Free Software Foundation, Inc.

cat /proc/version output:

Linux version 2.6.18.2-34-default (geeko@buildhost) (gcc version 4.1.2 20061115 (prerelease) (SUSE Linux)) #1 SMP Mon Nov 27 11:46:27 UTC 2006

The file peptides.txt contains:

RKEKNVQ
IPKKLLQK
QYFHQLEKMNVK
IPKKLLQK
GDLSTALEVAIDCYEK
QYFHQLEKMNVKIPENIYR
RKEKNVQ
VLAKHGKLQDAIN
ILGFMK
LEDVALQILL

Bruno De Fraine, answered Oct 5 '09 at 18:00

One way to do it is:
while read p; do
  echo $p
done <peptides.txt

Exceptionally, if the loop body may read from standard input, you can open the file using a different file descriptor:

while read -u 10 p; do
  ...
done 10<peptides.txt

Here, 10 is just an arbitrary number (different from 0, 1, 2).

Warren Young, answered Oct 5 '09 at 17:54

cat peptides.txt | while read line
do
   # do something with $line here
done

Stan Graves, answered Oct 5 '09 at 18:18

Option 1a: While loop: Single line at a time: Input redirection
#!/bin/bash
filename='peptides.txt'
echo Start
while read p; do 
    echo $p
done < $filename

Option 1b: While loop: Single line at a time:
Open the file, read from a file descriptor (in this case file descriptor #4).

#!/bin/bash
filename='peptides.txt'
exec 4<$filename
echo Start
while read -u4 p ; do
    echo $p
done

Option 2: For loop: Read file into single variable and parse.
This syntax will parse "lines" based on any white space between the tokens. This still works because the given input file lines are single word tokens. If there were more than one token per line, then this method would not work as well. Also, reading the full file into a single variable is not a good strategy for large files.

#!/bin/bash
filename='peptides.txt'
filelines=`cat $filename`
echo Start
for line in $filelines ; do
    echo $line
done

mightypile, answered Oct 4 '13 at 13:30

This is no better than other answers, but is one more way to get the job done in a file without spaces (see comments). I find that I often need one-liners to dig through lists in text files without the extra step of using separate script files.
for word in $(cat peptides.txt); do echo $word; done

This format allows me to put it all in one command-line. Change the "echo $word" portion to whatever you want and you can issue multiple commands separated by semicolons. The following example uses the file's contents as arguments into two other scripts you may have written.

for word in $(cat peptides.txt); do cmd_a.sh $word; cmd_b.py $word; done

Or if you intend to use this like a stream editor (learn sed) you can dump the output to another file as follows.

for word in $(cat peptides.txt); do cmd_a.sh $word; cmd_b.py $word; done > outfile.txt

I've used these as written above because I have used text files where I've created them with one word per line. (See comments) If you have spaces that you don't want splitting your words/lines, it gets a little uglier, but the same command still works as follows:

OLDIFS=$IFS; IFS=$'\n'; for line in $(cat peptides.txt); do cmd_a.sh $line; cmd_b.py $line; done > outfile.txt; IFS=$OLDIFS

This just tells the shell to split on newlines only, not spaces, then returns the environment back to what it was previously. At this point, you may want to consider putting it all into a shell script rather than squeezing it all into a single line, though.

Best of luck!

Jahid, answered Jun 9 '15 at 15:09

Use a while loop, like this:
while IFS= read -r line; do
   echo "$line"
done <file

Notes:

  1. If you don't set the IFS properly, you will lose indentation.
  2. You should almost always use the -r option with read.
  3. Don't read lines with for

codeforester, answered Jan 14 at 3:30

A few more things not covered by other answers:

Reading from a delimited file
# ':' is the delimiter here, and there are three fields on each line in the file
# IFS set below is restricted to the context of `read`, it doesn't affect any other code
while IFS=: read -r field1 field2 field3; do
  # process the fields
  # if the line has less than three fields, the missing fields will be set to an empty string
  # if the line has more than three fields, `field3` will get all the values, including the third field plus the delimiter(s)
done < input.txt
Reading from more than one file at a time
while read -u 3 -r line1 && read -u 4 -r line2; do
  # process the lines
  # note that the loop will end when we reach EOF on either of the files, because of the `&&`
done 3< input1.txt 4< input2.txt
Reading a whole file into an array (Bash version 4+)
readarray -t my_array < my_file

or

mapfile -t my_array < my_file

And then

for line in "${my_array[@]}"; do
  # process the lines
done

Anjul Sharma, answered Mar 8 '16 at 16:10

If you don't want your read to be broken by the newline character, use:
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "$line"
done < "$1"

Then run the script with file name as parameter.

Sine, answered Nov 14 '13 at 14:23

#!/bin/bash
#
# Change the file name from "test" to desired input file 
# (The comments in bash are prefixed with #'s)
for x in $(cat test.txt)
do
    echo $x
done

dawg, answered Feb 3 '16 at 19:15

Suppose you have this file:
$ cat /tmp/test.txt
Line 1
    Line 2 has leading space
Line 3 followed by blank line

Line 5 (follows a blank line) and has trailing space    
Line 6 has no ending CR

There are four elements that will alter the meaning of the file output read by many Bash solutions:

  1. The blank line 4;
  2. Leading or trailing spaces on two lines;
  3. Maintaining the meaning of individual lines (i.e., each line is a record);
  4. The line 6 not terminated with a CR.

If you want the text file line by line including blank lines and terminating lines without CR, you must use a while loop and you must have an alternate test for the final line.

Here are the methods that may change the file (in comparison to what cat returns):

1) Lose the last line and leading and trailing spaces:

$ while read -r p; do printf "%s\n" "'$p'"; done </tmp/test.txt
'Line 1'
'Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space'

(If you do while IFS= read -r p; do printf "%s\n" "'$p'"; done </tmp/test.txt instead, you preserve the leading and trailing spaces but still lose the last line if it is not terminated with CR)

2) Using command substitution with cat reads the entire file in one gulp and loses the meaning of individual lines:

$ for p in "$(cat /tmp/test.txt)"; do printf "%s\n" "'$p'"; done
'Line 1
    Line 2 has leading space
Line 3 followed by blank line

Line 5 (follows a blank line) and has trailing space    
Line 6 has no ending CR'

(If you remove the " from $(cat /tmp/test.txt) you read the file word by word rather than one gulp. Also probably not what is intended...)


The most robust and simplest way to read a file line-by-line and preserve all spacing is:

$ while IFS= read -r line || [[ -n $line ]]; do printf "'%s'\n" "$line"; done </tmp/test.txt
'Line 1'
'    Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space    '
'Line 6 has no ending CR'

If you want to strip leading and trading spaces, remove the IFS= part:

$ while read -r line || [[ -n $line ]]; do printf "'%s'\n" "$line"; done </tmp/test.txt
'Line 1'
'Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space'
'Line 6 has no ending CR'

(A text file without a terminating \n, while fairly common, is considered broken under POSIX. If you can count on the trailing \n you do not need || [[ -n $line ]] in the while loop.)

More at the BASH FAQ

,

Here is my real-life example of how to loop over the lines of another program's output, check for substrings, drop double quotes from a variable, and use that variable outside of the loop. I guess quite a few people ask these questions sooner or later.
##Parse FPS from first video stream, drop quotes from fps variable
## streams.stream.0.codec_type="video"
## streams.stream.0.r_frame_rate="24000/1001"
## streams.stream.0.avg_frame_rate="24000/1001"
FPS=unknown
while read -r line; do
  if [[ $FPS == "unknown" ]] && [[ $line == *".codec_type=\"video\""* ]]; then
    echo ParseFPS $line
    FPS=parse
  fi
  if [[ $FPS == "parse" ]] && [[ $line == *".r_frame_rate="* ]]; then
    echo ParseFPS $line
    FPS=${line##*=}
    FPS="${FPS%\"}"
    FPS="${FPS#\"}"
  fi
done <<< "$(ffprobe -v quiet -print_format flat -show_format -show_streams -i "$input")"
if [ "$FPS" == "unknown" ] || [ "$FPS" == "parse" ]; then 
  echo ParseFPS Unknown frame rate
fi
echo Found $FPS

Declaring the variable outside of the loop, setting its value inside the loop, and using it after the loop requires the done <<< "$(...)" syntax. The here-string keeps the loop running in the current shell (a pipe would run it in a subshell and the variable would be lost), and the quotes around the command preserve the newlines of the output stream.

The loop matches for substrings, then reads the name=value pair, splits off everything to the right of the last = character, drops the surrounding double quotes, and we have a clean value to be used elsewhere.
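
A minimal sketch of the same quote-stripping, with no ffprobe needed; the sample line mimics the flat ffprobe output format quoted above:

line='streams.stream.0.r_frame_rate="24000/1001"'
FPS=${line##*=}    # keep only what follows the last '=' -> "24000/1001" (still quoted)
FPS=${FPS%\"}      # drop the trailing double quote
FPS=${FPS#\"}      # drop the leading double quote
echo "$FPS"        # prints: 24000/1001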

[Jul 26, 2017] I feel stupid declare not found in bash scripting

A single space can make a huge difference in bash :-)
www.linuxquestions.org

Mohtek

I feel stupid: declare not found in bash scripting? I was anxious to get my feet wet, and I'm only up to my toes before I'm stuck...this seems very very easy but I'm not sure what I've done wrong. Below is the script and its output. What the heck am I missing?

______________________________________________________
#!/bin/bash
declare -a PROD[0]="computers" PROD[1]="HomeAutomation"
printf "${ PROD[*]}"
_______________________________________________________

products.sh: 6: declare: not found
products.sh: 8: Syntax error: Bad substitution

(Both messages, in the script:line:error format shown, are typical of the script being run by a non-Bash /bin/sh such as dash, which has no declare builtin; the space inside ${ PROD[*]} causes the bad-substitution error in either shell.)


I ran what you posted (but at the command line, not in a script, though that should make no significant difference), and got this:

Code:

-bash: ${ PROD[*]}: bad substitution

In other words, I couldn't reproduce your first problem, the "declare: not found" error. Try the declare command by itself, on the command line.

And I got rid of the "bad substitution" problem when I removed the space which is between the ${ and the PROD on the printf line.

Hope this helps.

blackhole54

The previous poster identified your second problem.

As far as your first problem goes ... I am not a bash guru although I have written a number of bash scripts. So far I have found no need for declare statements. I suspect that you might not need it either. But if you do want to use it, the following does work:

Code:
#!/bin/bash

declare -a PROD
PROD[0]="computers"
PROD[1]="HomeAutomation"
printf "${PROD[*]}\n"

EDIT: My original post was based on an older version of bash. When I tried the declare statement you posted I got an error message, but one that was different from yours. I just tried it on a newer version of bash, and your declare statement worked fine. So it might depend on the version of bash you are running. What I posted above runs fine on both versions.
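
For completeness, a sketch printing the same array one element per line and showing its size (works on both older and newer Bash versions):

#!/bin/bash
declare -a PROD=(computers HomeAutomation)
printf '%s\n' "${PROD[@]}"              # one element per line
echo "Number of elements: ${#PROD[@]}"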

[Jul 26, 2017] Associative array declaration gotcha

Jul 26, 2017 | unix.stackexchange.com

bash silently does function return on (re-)declare of global associative read-only array - Unix & Linux Stack Exchange

Ron Burk:

Obviously cut out of a much more complex script that was more meaningful:

#!/bin/bash

function InitializeConfig(){
    declare -r -g -A SHCFG_INIT=( [a]=b )
    declare -r -g -A SHCFG_INIT=( [c]=d )
    echo "This statement never gets executed"
}

set -o xtrace

InitializeConfig
echo "Back from function"
The output looks like this:
ronburk@ubuntu:~/ubucfg$ bash bug.sh
+ InitializeConfig
+ SHCFG_INIT=([a]=b)
+ declare -r -g -A SHCFG_INIT
+ SHCFG_INIT=([c]=d)
+ echo 'Back from function'
Back from function
Bash seems to silently execute a function return upon the second declare statement. Starting to think this really is a new bug, but happy to learn otherwise.

Other details:

Machine: x86_64
OS: linux-gnu
Compiler: gcc
Compilation CFLAGS:  -DPROGRAM='bash' -DCONF_HOSTTYPE='x86_64' -DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='x86_64-pc-linux-gn$
uname output: Linux ubuntu 3.16.0-38-generic #52~14.04.1-Ubuntu SMP Fri May 8 09:43:57 UTC 2015 x86_64 x86_64 x86_64 GNU/Lin$
Machine Type: x86_64-pc-linux-gnu

Bash Version: 4.3
Patch Level: 11
Release Status: release

By gum, you're right! Then I get readonly warning on second declare, which is reasonable, and the function completes. The xtrace output is also interesting; implies declare without single quotes is really treated as two steps. Ready to become superstitious about always single-quoting the argument to declare . Hard to see how popping the function stack can be anything but a bug, though. – Ron Burk Jun 14 '15 at 23:58

Weird. Doesn't happen in bash 4.2.53(1). – choroba Jun 14 '15 at 7:22
I can reproduce this problem with bash version 4.3.11 (Ubuntu 14.04.1 LTS). It works fine with bash 4.2.8 (Ubuntu 11.04). – Cyrus Jun 14 '15 at 7:34
Maybe related: unix.stackexchange.com/q/56815/116972 I can get expected result with declare -r -g -A 'SHCFG_INIT=( [a]=b )' . – yaegashi Jun 14 '15 at 23:22

I found a thread on the bug-bash mailing list related to test -v on an assoc array. In short, bash implicitly did test -v SHCFG_INIT[0] in your script. I'm not sure whether this behavior was introduced in 4.3.

You might want to use declare -p to work around this...

if ! declare -p SHCFG_INIT >/dev/null 2>&1; then
    echo "looks like SHCFG_INIT not defined"
fi
====
Well, rats. I think your answer is correct, but also reveals I'm really asking two separate questions when I thought they were probably the same issue. Since the title better reflects what turns out to be the "other" question, I'll leave this up for a while and see if anybody knows what's up with the mysterious implicit function return... Thanks! – Ron Burk Jun 14 '15 at 17:01
Edited question to focus on the remaining issue. Thanks again for the answer on the "-v" issue with associative arrays. – Ron Burk Jun 14 '15 at 17:55
Accepting this answer. Complete answer is here plus your comments above plus (IMHO) there's a bug in this version of bash (can't see how there can be any excuse for popping the function stack without warning). Thanks for your excellent research on this! – Ron Burk Jun 21 '15 at 19:31
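
Building on the declare -p workaround from the thread, a sketch of one way to sidestep the gotcha entirely: probe before declaring, so the read-only declaration happens at most once (assumes Bash 4.2+ for declare -g):

#!/bin/bash
function InitializeConfig(){
    # Declare the read-only global only if it does not exist yet.
    if ! declare -p SHCFG_INIT >/dev/null 2>&1; then
        declare -r -g -A SHCFG_INIT=( [a]=b )
    fi
    echo "This statement now gets executed"
}
InitializeConfig
declare -p SHCFG_INIT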

[Jul 25, 2017] Beginner Mistakes

Highly recommended!
Jul 25, 2017 | wiki.bash-hackers.org

Script execution: your perfect Bash script executes with syntax errors. If you write Bash scripts with Bash-specific syntax and features, run them with Bash, and make sure they run with Bash in native mode.

Wrong

  • no shebang
    • the interpreter used depends on the OS implementation and current shell
    • can be run by calling bash with the script name as an argument, e.g. bash myscript
  • #!/bin/sh shebang
    • depends on what /bin/sh actually is; for Bash it means compatibility mode, not native mode


Your script named "test" doesn't execute? Give it another name: an executable named test already exists.

In Bash it's a builtin; with other shells, it might be an executable file. Either way, it's a bad name choice!

Workaround: You can call it using the pathname:

/home/user/bin/test
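
You can see the collision for yourself with type -a (the exact paths in the output vary by system):

$ type -a test
test is a shell builtin
test is /usr/bin/test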

Globbing: brace expansion is not globbing. The following command line is not related to globbing (filename expansion):

# YOU EXPECT
# -i1.vob -i2.vob -i3.vob ....

echo -i{*.vob,}

# YOU GET
# -i*.vob -i
Why? The brace expansion is simple text substitution. All possible text formed by the prefix, the postfix and the braces themselves are generated. In the example, these are only two: -i*.vob and -i . The filename expansion happens after that, so there is a chance that -i*.vob is expanded to a filename - if you have files like -ihello.vob . But it definitely doesn't do what you expected.
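
A sketch of what was probably intended: one -i per existing .vob file, built with real globbing plus an array:

args=()
for f in *.vob; do      # (with nullglob unset, a literal *.vob appears if nothing matches)
    args+=(-i "$f")
done
echo "${args[@]}"       # e.g.: -i 1.vob -i 2.vob -i 3.vob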


Test-command pitfalls:
  • if [ $foo ] (the unquoted expansion misbehaves when $foo is empty or contains whitespace)
  • if [-d $dir] (missing spaces: [ is a command name and must be a separate word)
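
A sketch of the corrected forms: quote the expansion and keep spaces around the brackets.

if [ -n "$foo" ]; then echo "foo is non-empty"; fi
if [ -d "$dir" ]; then echo "dir exists"; fi

# Or, in Bash, use the more forgiving [[ ]] keyword:
if [[ -d $dir ]]; then echo "dir exists"; fi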


Variables: setting variables. The dollar-sign: there is no $ (dollar-sign) when you reference the name of a variable! Bash is not PHP!
# THIS IS WRONG!
$myvar="Hello world!"

A variable name preceded with a dollar-sign always means that the variable gets expanded. In the example above, it might expand to nothing (because it wasn't set), effectively resulting in

="Hello world!"

which definitely is wrong!

When you need the name of a variable, you write only the name , for example

  • (as shown above) to set variables: picture=/usr/share/images/foo.png
  • to name variables to be used by the read builtin command: read picture
  • to name variables to be unset: unset picture

When you need the content of a variable, you prefix its name with a dollar-sign , like

  • echo "The used picture is: $picture"
Whitespace: putting spaces on either or both sides of the equal-sign (=) when assigning a value to a variable will fail.
# INCORRECT 1
example = Hello

# INCORRECT 2
example= Hello

# INCORRECT 3
example =Hello

The only valid form is no space between the variable name and the assigned value:

# CORRECT 1
example=Hello

# CORRECT 2
example=" Hello"

Expanding (using) variables: a typical beginner's trap is quoting.

As noted above, when you want to expand a variable i.e. "get the content", the variable name needs to be prefixed with a dollar-sign. But, since Bash knows various ways to quote and does word-splitting, the result isn't always the same.

Let's define an example variable containing text with spaces:

example="Hello world"
Used form      result         number of words
$example       Hello world    2
"$example"     Hello world    1
\$example      $example       1
'$example'     $example       1
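
You can verify the word counts in the table with a tiny helper function (count is my own, shown only for illustration):

count() { echo $#; }
example="Hello world"
count $example      # 2 - unquoted, word splitting applies
count "$example"    # 1 - quoted, expands to a single word
count \$example     # 1 - the literal string $example
count '$example'    # 1 - the literal string $example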

If you use parameter expansion, you must use the name of the referenced variable/parameter (PATH), i.e. not its expansion ($PATH):

# WRONG!
echo "The first character of PATH is ${$PATH:0:1}"

# CORRECT
echo "The first character of PATH is ${PATH:0:1}"

Note that if you are using variables in arithmetic expressions , then the bare name is allowed:

((a=$a+7))         # Add 7 to a
((a = a + 7))      # Add 7 to a.  Identical to the previous command.
((a += 7))         # Add 7 to a.  Identical to the previous command.

a=$((a+7))         # POSIX-compatible version of previous code.


Exporting: exporting a variable means giving newly created (child) processes a copy of that variable; it does not copy a variable created in a child process back to the parent process. The following example does not work, since the variable hello is set in a child process (the process you execute to start that script, ./script.sh ):
$ cat script.sh
export hello=world

$ ./script.sh
$ echo $hello
$

Exporting is one-way. The direction is parent process to child process, not the reverse. The above example will work when you don't execute the script but include ("source") it:

$ source ./script.sh
$ echo $hello
world
$
In this case, the export command is of no use.


Exit codes: reacting to exit codes. If you just want to react to an exit code, regardless of its specific value, you don't need to use $? in a test command like this:
grep ^root: /etc/passwd >/dev/null 2>&1

if [ $? -ne 0 ]; then
  echo "root was not found - check the pub at the corner"
fi

This can be simplified to:

if ! grep ^root: /etc/passwd >/dev/null 2>&1; then
  echo "root was not found - check the pub at the corner"
fi

Or, simpler yet:

grep ^root: /etc/passwd >/dev/null 2>&1 || echo "root was not found - check the pub at the corner"
If you need the specific value of $? , there's no other choice. But if you need only a "true/false" exit indication, there's no need for $? .


Output vs. return value: it's important to remember the different ways to run a child command, and whether you want the output, the return value, or neither.

When you want to run a command (or a pipeline) and save (or print) the output , whether as a string or an array, you use Bash's $(command) syntax:

$(ls -l /tmp)
newvariable=$(printf "foo")

When you want to use the return value of a command, just use the command, or add ( ) to run a command or pipeline in a subshell:

if grep someuser /etc/passwd ; then
    # do something
fi

if ( w | grep someuser | grep sqlplus ) ; then
    # someuser is logged in and running sqlplus
fi

Make sure you're using the form you intended:

# WRONG!
if $(grep ERROR /var/log/messages) ; then
    # send alerts
fi
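
If you need both the output and the success/failure indication, you can combine the assignment with the test; a sketch of this common idiom:

if output=$(grep ERROR /var/log/messages); then
    # The if tests grep's exit status; $output holds the matching lines.
    printf 'Matches found:\n%s\n' "$output"
fi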

[Jul 25, 2017] Arrays in bash 4.x

Jul 25, 2017 | wiki.bash-hackers.org

Purpose An array is a parameter that holds mappings from keys to values. Arrays are used to store a collection of parameters into a parameter. Arrays (in any programming language) are a useful and common composite data structure, and one of the most important scripting features in Bash and other shells.

Here is an abstract representation of an array named NAMES . The indexes go from 0 to 3.

NAMES
 0: Peter
 1: Anna
 2: Greg
 3: Jan

Instead of using 4 separate variables, multiple related variables are grouped together into elements of the array, accessible by their key. If you want the second name, ask for index 1 of the array NAMES.

Indexing: Bash supports two different types of ksh-like one-dimensional arrays. Multidimensional arrays are not implemented.

  • Indexed arrays use positive integer numbers as keys. Indexed arrays are always sparse , meaning indexes are not necessarily contiguous. All syntax used for both assigning and dereferencing indexed arrays is an arithmetic evaluation context (see Referencing ). As in C and many other languages, the numerical array indexes start at 0 (zero). Indexed arrays are the most common, useful, and portable type. Indexed arrays were first introduced to Bourne-like shells by ksh88. Similar, partially compatible syntax was inherited by many derivatives including Bash. Indexed arrays always carry the -a attribute.
  • Associative arrays (sometimes known as a "hash" or "dict") use arbitrary nonempty strings as keys. In other words, associative arrays allow you to look up a value from a table based upon its corresponding string label. Associative arrays are always unordered , they merely associate key-value pairs. If you retrieve multiple values from the array at once, you can't count on them coming out in the same order you put them in. Associative arrays always carry the -A attribute, and unlike indexed arrays, Bash requires that they always be declared explicitly (as indexed arrays are the default, see declaration ). Associative arrays were first introduced in ksh93, and similar mechanisms were later adopted by Zsh and Bash version 4. These three are currently the only POSIX-compatible shells with any associative array support.
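
A quick sketch of both kinds, with made-up data (associative arrays require Bash 4+):

NAMES=(Peter Anna Greg Jan)               # indexed: keys 0..3 assigned automatically
echo "${NAMES[1]}"                        # Anna - "the second name" from the text above

declare -A AGE=([Peter]=35 [Anna]=28)     # associative: must be declared with -A
echo "${AGE[Anna]}"                       # 28
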
Syntax. Referencing: to accommodate referring to array variables and their individual elements, Bash extends the parameter naming scheme with a subscript suffix. Any valid ordinary scalar parameter name is also a valid array name: [[:alpha:]_][[:alnum:]_]* . The parameter name may be followed by an optional subscript enclosed in square brackets to refer to a member of the array.

The overall syntax is arrname[subscript] - where for indexed arrays, subscript is any valid arithmetic expression, and for associative arrays, any nonempty string. Subscripts are first processed for parameter and arithmetic expansions, and command and process substitutions. When used within parameter expansions or as an argument to the unset builtin, the special subscripts * and @ are also accepted which act upon arrays analogously to the way the @ and * special parameters act upon the positional parameters. In parsing the subscript, bash ignores any text that follows the closing bracket up to the end of the parameter name.

With few exceptions, names of this form may be used anywhere ordinary parameter names are valid, such as within arithmetic expressions , parameter expansions , and as arguments to builtins that accept parameter names. An array is a Bash parameter that has been given the -a (for indexed) or -A (for associative) attributes . However, any regular (non-special or positional) parameter may be validly referenced using a subscript, because in most contexts, referring to the zeroth element of an array is synonymous with referring to the array name without a subscript.

# "x" is an ordinary non-array parameter.
$ x=hi; printf '%s ' "$x" "${x[0]}"; echo "${_[0]}"
hi hi hi

The only exceptions to this rule are in a few cases where the array variable's name refers to the array as a whole. This is the case for the unset builtin (see destruction) and when declaring an array without assigning any values (see declaration).

Declaration: the following explicitly give variables array attributes, making them arrays:

Syntax Description
ARRAY=() Declares an indexed array ARRAY and initializes it to be empty. This can also be used to empty an existing array.
ARRAY[0]= Generally sets the first element of an indexed array. If no array ARRAY existed before, it is created.
declare -a ARRAY Declares an indexed array ARRAY . An existing array is not initialized.
declare -A ARRAY Declares an associative array ARRAY . This is the one and only way to create associative arrays.
Storing values: storing values in arrays is just as simple as storing values in normal variables.
Syntax Description
ARRAY[N]=VALUE Sets the element N of the indexed array ARRAY to VALUE . N can be any valid arithmetic expression
ARRAY[STRING]=VALUE Sets the element indexed by STRING of the associative array ARRAY .
ARRAY=VALUE As above. If no index is given, as a default the zeroth element is set to VALUE . Careful, this is even true of associative arrays - there is no error if no key is specified, and the value is assigned to string index "0".
ARRAY=(E1 E2 ) Compound array assignment - sets the whole array ARRAY to the given list of elements indexed sequentially starting at zero. The array is unset before assignment unless the += operator is used. When the list is empty ( ARRAY=() ), the array will be set to an empty array. This method obviously does not use explicit indexes. An associative array can not be set like that! Clearing an associative array using ARRAY=() works.
ARRAY=([X]=E1 [Y]=E2 ) Compound assignment for indexed arrays with index-value pairs declared individually (here for example X and Y ). X and Y are arithmetic expressions. This syntax can be combined with the above - elements declared without an explicitly specified index are assigned sequentially starting at either the last element with an explicit index, or zero.
ARRAY=([S1]=E1 [S2]=E2 ) Individual mass-setting for associative arrays . The named indexes (here: S1 and S2 ) are strings.
ARRAY+=(E1 E2 ) Append to ARRAY.

As of now, arrays can't be exported.

Getting values: for completeness and details on several parameter expansion variants, see the article about parameter expansion and check the notes about arrays.

Syntax Description
${ARRAY[N]} Expands to the value of index N in the indexed array ARRAY . If N is a negative number, it's treated as an offset relative to one greater than the maximum assigned index, so -1 refers to the last element (negative indexes can't be used for assignment)
${ARRAY[S]} Expands to the value of the index S in the associative array ARRAY .
"${ARRAY[@]}"
${ARRAY[@]}
"${ARRAY[*]}"
${ARRAY[*]}
Similar to mass-expanding positional parameters , this expands to all elements. If unquoted, both subscripts * and @ expand to the same result, if quoted, @ expands to all elements individually quoted, * expands to all elements quoted as a whole.
"${ARRAY[@]:N:M}"
${ARRAY[@]:N:M}
"${ARRAY[*]:N:M}"
${ARRAY[*]:N:M}
Similar to what this syntax does for the characters of a single string when doing substring expansion , this expands to M elements starting with element N . This way you can mass-expand individual indexes. The rules for quoting and the subscripts * and @ are the same as above for the other mass-expansions.
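
To make the quoted @ vs * distinction concrete, a small sketch:

a=(one two three)
printf '<%s> ' "${a[@]}"; echo    # <one> <two> <three>  - one word per element
printf '<%s> ' "${a[*]}"; echo    # <one two three>      - a single word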

For clarification: When you use the subscripts @ or * for mass-expanding, then the behaviour is exactly what it is for $@ and $* when mass-expanding the positional parameters . You should read this article to understand what's going on. Metadata

Syntax Description
${#ARRAY[N]} Expands to the length of an individual array member at index N ( stringlength )
${#ARRAY[STRING]} Expands to the length of an individual associative array member at index STRING ( stringlength )
${#ARRAY[@]}
${#ARRAY[*]}
Expands to the number of elements in ARRAY
${!ARRAY[@]}
${!ARRAY[*]}
Expands to the indexes in ARRAY since BASH 3.0
Destruction: the unset builtin command is used to destroy (unset) arrays or individual elements of arrays.
Syntax Description
unset -v ARRAY
unset -v ARRAY[@]
unset -v ARRAY[*]
Destroys a complete array
unset -v ARRAY[N] Destroys the array element at index N
unset -v ARRAY[STRING] Destroys the array element of the associative array at index STRING

It is best to explicitly specify -v when unsetting variables with unset.

Unquoted subscripts such as ARRAY[*] or ARRAY[1] may cause pathname expansion to occur due to the presence of glob characters.

Example: You are in a directory with a file named x1 , and you want to destroy an array element x[1] , with

unset x[1]
then pathname expansion will expand to the filename x1 and break your processing!

Even worse, if nullglob is set, your array/index will disappear.

To avoid this, always quote the array name and index:

unset -v 'x[1]'

This applies generally to all commands which take variable names as arguments; single quotes are preferred.

Usage. Numerical index: numerically indexed arrays are easy to understand and easy to use. The Purpose and Indexing chapters above more or less explain all the needed background theory.

Now, some examples and comments for you.

Let's say we have an array sentence which is initialized as follows:

sentence=(Be liberal in what you accept, and conservative in what you send)

Since no special code is there to prevent word splitting (no quotes), every word there will be assigned to an individual array element. When you count the words you see, you should get 12. Now let's see if Bash has the same opinion:

$ echo ${#sentence[@]}
12

Yes, 12. Fine. You can take this number to walk through the array. Just subtract 1 from the number of elements, and start your walk at 0 (zero)

((n_elements=${#sentence[@]}, max_index=n_elements - 1))

for ((i = 0; i <= max_index; i++)); do
  echo "Element $i: '${sentence[i]}'"
done
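
The count-based walk above assumes every index is assigned. A sketch that also copes with gaps, by iterating over the assigned indexes instead of counting elements (continuing the same sentence example):

unset -v 'sentence[5]'            # punch a hole to make the array sparse
for i in "${!sentence[@]}"; do
    echo "Element $i: '${sentence[i]}'"
done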

You always have to remember that numerical array indexing begins at 0 (zero); it seems newbies sometimes have problems with this.

The method above, walking through an array by just knowing its number of elements, only works for arrays where all elements are set, of course. If one element in the middle is removed, then the calculation is nonsense, because the number of elements no longer corresponds to the highest used index (we call these " sparse arrays "; the sketch above shows a safe way to walk them).

Associative (Bash 4): associative arrays (or hash tables ) are not much more complicated than numerically indexed arrays. The numerical index value (in Bash a number starting at zero) is just replaced with an arbitrary string:

# declare -A, introduced with Bash 4 to declare an associative array
declare -A sentence

sentence[Begin]='Be liberal in what'
sentence[Middle]='you accept, and conservative'
sentence[End]='in what you send'
sentence['Very end']=...

Beware: don't rely on the fact that the elements are ordered in memory like they were declared, it could look like this:

# output from 'set' command
sentence=([End]="in what you send" [Middle]="you accept, and conservative " [Begin]="Be liberal in what " ["Very end"]="...")
This effectively means, you can get the data back with "${sentence[@]}" , of course (just like with numerical indexing), but you can't rely on a specific order. If you want to store ordered data, or re-order data, go with numerical indexes. For associative arrays, you usually query known index values:
for element in Begin Middle End "Very end"; do
    printf "%s" "${sentence[$element]}"
done
printf "\n"

A nice code example: Checking for duplicate files using an associative array indexed with the SHA sum of the files:

# Thanks to Tramp in #bash for the idea and the code

unset flist; declare -A flist;
while read -r sum fname; do 
    if [[ ${flist[$sum]} ]]; then
        printf 'rm -- "%s" # Same as >%s<\n' "$fname" "${flist[$sum]}" 
    else
        flist[$sum]="$fname"
    fi
done <  <(find . -type f -exec sha256sum {} +)  >rmdups

Integer arrays: any type attributes applied to an array apply to all elements of the array. If the integer attribute is set for either indexed or associative arrays, then values are considered as arithmetic for both compound and ordinary assignment, and the += operator is modified in the same way as for ordinary integer variables.

 ~ $ ( declare -ia 'a=(2+4 [2]=2+2 [a[2]]="a[2]")' 'a+=(42 [a[4]]+=3)'; declare -p a )
declare -ai a='([0]="6" [2]="4" [4]="7" [5]="42")'

a[0] is assigned the result of 2+4 . a[2] gets the result of 2+2 . The last index in the first assignment is the result of a[2] , which has already been assigned as 4 , and its value is also a[2] , i.e. 4 .

This shows that even though any existing arrays named a in the current scope have already been unset by using = instead of += to the compound assignment, arithmetic variables within keys can self-reference any elements already assigned within the same compound-assignment. With integer arrays this also applies to expressions to the right of the = . (See evaluation order , the right side of an arithmetic assignment is typically evaluated first in Bash.)

The second compound assignment argument to declare uses += , so it appends after the last element of the existing array rather than deleting it and creating a new array, so a[5] gets 42 .

Lastly, the element whose index is the value of a[4] ( 4 ), gets 3 added to its existing value, making a[4] == 7 . Note that having the integer attribute set this time causes += to add, rather than append a string, as it would for a non-integer array.

The single quotes force the assignments to be evaluated in the environment of declare . This is important because attributes are only applied to the assignment after assignment arguments are processed. Without them the += compound assignment would have been invalid, and strings would have been inserted into the integer array without evaluating the arithmetic. A special-case of this is shown in the next section.

(This is similar to eval , but there are differences.) Todo: discuss this in detail.

Indirection Arrays can be expanded indirectly using the indirect parameter expansion syntax. Parameters whose values are of the form: name[index] , name[@] , or name[*] when expanded indirectly produce the expected results. This is mainly useful for passing arrays (especially multiple arrays) by name to a function.

This example is an "isSubset"-like predicate which returns true if all key-value pairs of the array given as the first argument to isSubset correspond to a key-value of the array given as the second argument. It demonstrates both indirect array expansion and indirect key-passing without eval using the aforementioned special compound assignment expansion.

isSubset() {
    local -a 'xkeys=("${!'"$1"'[@]}")' 'ykeys=("${!'"$2"'[@]}")'
    set -- "${@/%/[key]}"

    (( ${#xkeys[@]} <= ${#ykeys[@]} )) || return 1

    local key
    for key in "${xkeys[@]}"; do
        [[ ${!2+_} && ${!1} == ${!2} ]] || return 1
    done
}

main() {
    # "a" is a subset of "b"
    local -a 'a=({0..5})' 'b=({0..10})'
    isSubset a b
    echo $? # true

    # "a" contains a key not in "b"
    local -a 'a=([5]=5 {6..11})' 'b=({0..10})'
    isSubset a b
    echo $? # false

    # "a" contains an element whose value != the corresponding member of "b"
    local -a 'a=([5]=5 6 8 9 10)' 'b=({0..10})'
    isSubset a b
    echo $? # false
}

main

This script is one way of implementing a crude multidimensional associative array by storing array definitions in an array and referencing them through indirection. The script takes two keys and dynamically calls a function whose name is resolved from the array.

callFuncs() {
    # Set up indirect references as positional parameters to minimize local name collisions.
    set -- "${@:1:3}" ${2+'a["$1"]' "$1"'["$2"]'}

    # The only way to test for set but null parameters is unfortunately to test each individually.
    local x
    for x; do
        [[ $x ]] || return 0
    done

    local -A a=(
        [foo]='([r]=f [s]=g [t]=h)'
        [bar]='([u]=i [v]=j [w]=k)'
        [baz]='([x]=l [y]=m [z]=n)'
        ) ${4+${a["$1"]+"${1}=${!3}"}} # For example, if "$1" is "bar" then define a new array: bar=([u]=i [v]=j [w]=k)

    ${4+${a["$1"]+"${!4-:}"}} # Now just lookup the new array. for inputs: "bar" "v", the function named "j" will be called, which prints "j" to stdout.
}

main() {
    # Define functions named {f..n} which just print their own names.
    local fun='() { echo "$FUNCNAME"; }' x

    for x in {f..n}; do
        eval "${x}${fun}"
    done

    callFuncs "$@"
}

main "$@"

Bugs and Portability Considerations

  • Arrays are not specified by POSIX. One-dimensional indexed arrays are supported using similar syntax and semantics by most Korn-like shells.
  • Associative arrays are supported via typeset -A in Bash 4, Zsh, and Ksh93.
  • In Ksh93, arrays whose types are not given explicitly are not necessarily indexed. Arrays defined using compound assignments which specify subscripts are associative by default. In Bash, associative arrays can only be created by explicitly declaring them as associative, otherwise they are always indexed. In addition, ksh93 has several other compound structures whose types can be determined by the compound assignment syntax used to create them.
  • In Ksh93, using the = compound assignment operator unsets the array, including any attributes that have been set on the array prior to assignment. In order to preserve attributes, you must use the += operator. However, declaring an associative array, then attempting an a=( ) style compound assignment without specifying indexes is an error. I can't explain this inconsistency.
     $ ksh -c 'function f { typeset -a a; a=([0]=foo [1]=bar); typeset -p a; }; f' # Attribute is lost, and since subscripts are given, we default to associative.
    typeset -A a=([0]=foo [1]=bar)
     $ ksh -c 'function f { typeset -a a; a+=([0]=foo [1]=bar); typeset -p a; }; f' # Now using += gives us the expected results.
    typeset -a a=(foo bar)
     $ ksh -c 'function f { typeset -A a; a=(foo bar); typeset -p a; }; f' # On top of that, the reverse does NOT unset the attribute. No idea why.
     ksh: f: line 1: cannot append index array to associative array a
    
  • Only Bash and mksh support compound assignment with mixed explicit subscripts and automatically incrementing subscripts. In ksh93, in order to specify individual subscripts within a compound assignment, all subscripts must be given (or none). Zsh doesn't support specifying individual subscripts at all.
  • Appending to a compound assignment is a fairly portable way to append elements after the last index of an array. In Bash, this also sets append mode for all individual assignments within the compound assignment, such that if a lower subscript is specified, subsequent elements will be appended to previous values. In ksh93, it causes subscripts to be ignored, forcing appending everything after the last element. (Appending has different meaning due to support for multi-dimensional arrays and nested compound datastructures.)
     $ ksh -c 'function f { typeset -a a; a+=(foo bar baz); a+=([3]=blah [0]=bork [1]=blarg [2]=zooj); typeset -p a; }; f' # ksh93 forces appending to the array, disregarding subscripts
    typeset -a a=(foo bar baz '[3]=blah' '[0]=bork' '[1]=blarg' '[2]=zooj')
     $ bash -c 'function f { typeset -a a; a+=(foo bar baz); a+=(blah [0]=bork blarg zooj); typeset -p a; }; f' # Bash applies += to every individual subscript.
    declare -a a='([0]="foobork" [1]="barblarg" [2]="bazzooj" [3]="blah")'
     $ mksh -c 'function f { typeset -a a; a+=(foo bar baz); a+=(blah [0]=bork blarg zooj); typeset -p a; }; f' # Mksh does like Bash, but clobbers previous values rather than appending.
    set -A a
    typeset a[0]=bork
    typeset a[1]=blarg
    typeset a[2]=zooj
    typeset a[3]=blah
    
  • In Bash and Zsh, the alternate value assignment parameter expansion ( ${arr[idx]:=foo} ) evaluates the subscript twice, first to determine whether to expand the alternate, and second to determine the index to assign the alternate to. See evaluation order .
     $ : ${_[$(echo $RANDOM >&2)1]:=$(echo hi >&2)}
    13574
    hi
    14485
    
  • In Zsh, arrays are indexed starting at 1 in its default mode. Emulation modes are required in order to get any kind of portability.
  • Zsh and mksh do not support compound assignment arguments to typeset .
  • Ksh88 didn't support modern compound array assignment syntax. The original (and most portable) way to assign multiple elements is to use the set -A name arg1 arg2 syntax. This is supported by almost all shells that support ksh-like arrays except for Bash. Additionally, these shells usually support an optional -s argument to set which performs lexicographic sorting on either array elements or the positional parameters. Bash has no built-in sorting ability other than the usual comparison operators.
     $ ksh -c 'set -A arr -- foo bar bork baz; typeset -p arr' # Classic array assignment syntax
    typeset -a arr=(foo bar bork baz)
     $ ksh -c 'set -sA arr -- foo bar bork baz; typeset -p arr' # Native sorting!
    typeset -a arr=(bar baz bork foo)
     $ mksh -c 'set -sA arr -- foo "[3]=bar" "[2]=baz" "[7]=bork"; typeset -p arr' # Probably a bug. I think the maintainer is aware of it.
    set -A arr
    typeset arr[2]=baz
    typeset arr[3]=bar
    typeset arr[7]=bork
    typeset arr[8]=foo
    
  • Evaluation order for assignments involving arrays varies significantly depending on context. Notably, the order of evaluating the subscript or the value first can change in almost every shell for both expansions and arithmetic variables. See evaluation order for details.
  • Bash 4.1.* and below cannot use negative subscripts to address array indexes relative to the highest-numbered index. You must use the subscript expansion, i.e. "${arr[@]:(-n):1}" , to expand the nth-last element (or the next-highest indexed after n if arr[n] is unset). In Bash 4.2, you may expand (but not assign to) a negative index. In Bash 4.3, ksh93, and zsh, you may both assign and expand negative offsets.
  • ksh93 also has an additional slice notation: "${arr[n..m]}" where n and m are arithmetic expressions. These are needed for use with multi-dimensional arrays.
  • Assigning or referencing negative indexes in mksh causes wrap-around. The max index appears to be UINT_MAX , which would be addressed by arr[-1] .
  • So far, Bash's -v var test doesn't support individual array subscripts. You may supply an array name to test whether an array is defined, but can't check an element. ksh93's -v supports both. Other shells lack a -v test.
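
A sketch of testing whether a single element is set without -v , using the ${parameter+word} expansion; this also works in Bash versions that predate subscript support in -v :

arr=([0]=a [2]=c)
if [[ ${arr[1]+_} ]]; then echo "arr[1] is set"; else echo "arr[1] is unset"; fi
if [[ ${arr[2]+_} ]]; then echo "arr[2] is set"; fi
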
Bugs
  • Fixed in 4.3 Bash 4.2.* and earlier considers each chunk of a compound assignment, including the subscript, for globbing. The subscript part is considered quoted, but any unquoted glob characters on the right-hand side of the [ ]= will be clumped with the subscript and counted as a glob. Therefore, you must quote anything on the right of the = sign. This is fixed in 4.3, so that each subscript assignment statement is expanded following the same rules as an ordinary assignment. This also works correctly in ksh93.
    $ touch '[1]=a'; bash -c 'a=([1]=*); echo "${a[@]}"'
    [1]=a
    
    mksh has a similar but even worse problem in that the entire subscript is considered a glob.
    $ touch 1=a; mksh -c 'a=([123]=*); print -r -- "${a[@]}"'
    1=a
    
  • Fixed in 4.3 In addition to the above globbing issue, assignments preceding "declare" have an additional effect on brace and pathname expansion.
    $ set -x; foo=bar declare arr=( {1..10} )
    + foo=bar
    + declare 'a=(1)' 'a=(2)' 'a=(3)' 'a=(4)' 'a=(5)'
    
    $ touch xy=foo
    $ declare x[y]=*
    + declare 'x[y]=*'
    $ foo=bar declare x[y]=*
    + foo=bar
    + declare xy=foo
    
    Each word (the entire assignment) is subject to globbing and brace expansion. This appears to trigger the same strange expansion mode as let , eval , other declaration commands, and maybe more.
  • Fixed in 4.3 Indirection combined with another modifier expands arrays to a single word.
    $ a=({a..c}) b=a[@]; printf '<%s> ' "${!b}"; echo; printf '<%s> ' "${!b/%/foo}"; echo
    <a> <b> <c>
    <a b cfoo>
    
  • Fixed in 4.3 Process substitutions are evaluated within array indexes. Zsh and ksh don't do this in any arithmetic context.
     
    # print "moo"
    dev=fd=1 _[1<(echo moo >&2)]=
    
    # Fork bomb
    ${dev[${dev='dev[1>(${dev[dev]})]'}]}
    
Evaluation order: here are some of the nasty details of array assignment evaluation order. You can use this testcase code to generate these results.
Each testcase prints evaluation order for indexed array assignment
contexts. Each context is tested for expansions (represented by digits) and
arithmetic (letters), ordered from left to right within the expression. The
output corresponds to the way evaluation is re-ordered for each shell:

a[ $1 a ]=${b[ $2 b ]:=${c[ $3 c ]}}               No attributes
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia a
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia b
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia a b
(( a[ $1 a ] = b[ $2 b ] ${c[ $3 c ]} ))           No attributes
(( a[ $1 a ] = ${b[ $2 b ]:=c[ $3 c ]} ))          typeset -ia b
a+=( [ $1 a ]=${b[ $2 b ]:=${c[ $3 c ]}} [ $4 d ]=$(( $5 e )) ) typeset -a a
a+=( [ $1 a ]=${b[ $2 b ]:=c[ $3 c ]} [ $4 d ]=${5}e ) typeset -ia a

bash: 4.2.42(1)-release
2 b 3 c 2 b 1 a
2 b 3 2 b 1 a c
2 b 3 2 b c 1 a
2 b 3 2 b c 1 a c
1 2 3 c b a
1 2 b 3 2 b c c a
1 2 b 3 c 2 b 4 5 e a d
1 2 b 3 2 b 4 5 a c d e

ksh93: Version AJM 93v- 2013-02-22
1 2 b b a
1 2 b b a
1 2 b b a
1 2 b b a
1 2 3 c b a
1 2 b b a
1 2 b b a 4 5 e d
1 2 b b a 4 5 d e

mksh: @(#)MIRBSD KSH R44 2013/02/24
2 b 3 c 1 a
2 b 3 1 a c
2 b 3 c 1 a
2 b 3 c 1 a
1 2 3 c a b
1 2 b 3 c a
1 2 b 3 c 4 5 e a d
1 2 b 3 4 5 a c d e

zsh: 5.0.2
2 b 3 c 2 b 1 a
2 b 3 2 b 1 a c
2 b 1 a
2 b 1 a
1 2 3 c b a
1 2 b a
1 2 b 3 c 2 b 4 5 e
1 2 b 3 2 b 4 5

Recommended Links

[Jul 04, 2020] Learn Bash Debugging Techniques the Hard Way by Ian Miell Published on Jul 04, 2020 | zwischenzugs.com

[Jul 25, 2017] Beginner Mistakes Published on Jul 25, 2017 | wiki.bash-hackers.org


