
Loops in Bash


Note: These lecture notes combine copyrighted materials from Learning the Korn Shell (A Nutshell Handbook) by Bill Rosenblatt (first edition, 1993) and examples from the Advanced Bash-Scripting Guide. They were created by the author for his Unix shell course at FDU. As they include copyrighted materials, they should be used strictly for educational purposes.


Introduction

A loop is a block of code that iterates (repeats) a list of commands as long as the loop control condition is true. Loop constructs were introduced in the Bourne shell around 1977, and they were not very well thought out; they are deficient even in comparison to C, to say nothing of AWK. But we have what we have, and there is no reason to dwell on complaints. We need to learn how to make lemonade from the lemon.

The Bourne shell introduced several types of loops. They were quite backward and brain-dead by the programming-language design standards of the time. But for shells they were pretty innovative and powerful (although they were inferior to the control structures of AWK, which was created at approximately the same time in the same organization and has a much higher level of design and implementation ;-). Stephen R. Bourne, despite his previous involvement with an Algol-68 compiler, proved to be a mediocre designer, or the constraints imposed on him were too tight, or both. He ended his career as a chief technology officer at Icon Venture Partners, a venture capital firm based in Menlo Park, California :-)

Now bash loops look completely archaic and the syntax looks like a perversion (the requirement to use a semicolon before do, if it is used on the same line as while or for, is especially annoying), but, again, we have what we have. If you can't overcome the allergy, please use Perl or another, more modern scripting language. The other annoying wart of the Bourne shell and its derivatives, including bash, is that the until loop has been implemented incorrectly in all shells since the Bourne shell: it is implemented as a while loop with a negated condition checked at the top; the test for the specified condition is positioned before the loop body, not after it, as in a "proper" until loop.

For compatibility reasons the control structures introduced by Steve Bourne in his Bourne shell were preserved in all derivatives such as ksh and bash in the form in which they were introduced, despite the fact that some of them looked very strange, or arbitrary, or both, even at the time of the Bourne shell's design. It was a very bad decision, even taking compatibility pressure into account. My impression is that the Bourne shell was never really designed; it was just a rather primitive hack of the existing Mashey shell, adding extensions in an ad hoc manner: something here, something there. It was the Mashey shell that first used the $ sign as the marker for dereferencing a variable.

Like any hack, it was done at a far lower level of architectural thinking than the design of AWK, which, paradoxically, was also a product of Bell Labs, created at exactly the same time. It is kind of strange that two teams working in the same lab on similar projects never cooperated; that happens often. But the fact that managers did not realize the potential of AWK and continued with the Bourne shell, instead of throwing it into the garbage can and extending AWK to the shell level, tells something about the quality of management. Truth be told, management in IT projects is always a problem, and the Unix team did not have the worst managers possible ;-). It was actually due to Doug McIlroy, then a department head in the Computing Science Research Center, that Unix got the pipe concept. Try to find another manager who made a contribution of similar magnitude to such a project.

In any case, you need to be patient and avoid crying "What crap are you teaching us!!!" at the lectures ;-). Humans are very adaptable and get used to things even if they look very unnatural and obscure at the beginning, especially if they are well paid for those skills :-).

The list of loops that modern shells such as bash 3.x and ksh93 support

Most current versions of shells (bash and ksh93) support the three original types of loop from the Bourne shell, with their unique Algol-68-style syntax. What is funny is that while Stephen Bourne borrowed Algol-68 syntax (and shell is the only language I know of where this syntax survived), he was never able to provide a distinct lexical level for the shell: the shell has no distinct lexical level and no separate lexical scanner.

Bash now also supports a "C-style for loop". Arithmetic conditional expressions ((...)) can (and for arithmetic comparisons should) now be used in the while loop. That means that old examples which use double square brackets (or worse, single square brackets) for arithmetic comparisons are now obsolete and need to be revised (see the short contrast sketch after the list below). In any case the situation has slightly improved, although the long development history and compatibility issues created a lot of warts.

  1. while loops (there is no "real" until loop in shell, with the test at the bottom, but there is an until keyword that reverses the test at the top ;-)

  2. "legacy" for loops statement (Bourne-style). Very limited and idiosyncratic. Still useful in certain circumstances.

  3. C-style for loop (introduced in bash 2.04). This is a for loop with classic C-style syntax. For example:
    for (( i=0 ; i < 10 ; i++ )) ; do
       echo $i ;
    done

    You can also define an infinite loop this way:

    for (( ; ; )) ; do 
       :   # the body cannot be empty, so use the no-op command
    done 
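
As a quick contrast (promised above): the older test-command style for arithmetic comparisons versus the arithmetic conditional. Both loops below print the numbers 0 to 2; this is just an illustrative sketch.

i=0
while [ "$i" -lt 3 ]      # older style: test command with -lt
do
    echo "old style: $i"
    i=$((i + 1))
done

i=0
while (( i < 3 ))         # preferred style: arithmetic conditional with C-like operators
do
    echo "new style: $i"
    (( i++ ))
done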

Loops can also be connected to a pipe. This type of loop is unique to the shell. They are very powerful, albeit little known and underappreciated, constructs.
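
A minimal sketch of both directions (the paths are just examples): a command feeding a while loop through a pipe, and a for loop feeding its output into a pipe:

ls -1 /etc | while read -r name ; do
    echo "entry: $name"
done

for f in /etc/*.conf ; do
    wc -l "$f"
done | sort -n

Keep in mind that when a loop is on the receiving end of a pipe it runs in a subshell, so variables set inside the loop body are not visible after the loop ends (the done <<< "$(...)" form shown near the end of this page is one way around that).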

while loop

Bash provides while loops (a loop which has the test of the condition at the top and terminates when the condition becomes false). The syntax of the while loop is:

while condition 
do
    statements...
done

If the do statement is positioned on the same line as the while statement, you need to separate them with a semicolon:

while condition ; do
    statements...
done

The condition is either a command (in this case its return code is checked on each iteration and the loop ends if the return code is non-zero) or a shell conditional operator (a conditional expression enclosed in either double square or double round brackets). There can be multiple commands in the condition (in this case the return code of the last one is used to check for termination).

Various types of conditions can be combined using the logical operators && and ||, for example [[ $version = '6.4' ]] && (( RAM > 64000000 ))
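
For instance, here is a hedged sketch of a loop whose header combines an arithmetic test with a command's return code ($host and the URL are hypothetical):

attempts=0
while (( attempts < 10 )) && ! curl -s -o /dev/null "http://$host/health" ; do
    (( attempts++ ))     # count the failed probe
    sleep 5
done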

At the same time, while and until loops are actually most useful (and least error-prone) when the "header" condition is simple. For example:

while ping -c 1 $ip ; do
   sleep 60;
done
echo "Connectivity to $ip lost"

Additional exits from the loop can be provided using the break statement (see below).

You can reverse the condition by using until instead of while (remember that an until loop in shell is just a while loop with the success condition reversed; the check is still at the top of the loop):

until ping -c 1 $ip ; do
   echo $ip still down
   sleep 60;
done
echo "Server $ip rebooted"

Please note that in the past the while loop was often used to imitate a C-style for loop, for example:

declare -i i=1
while (( i < 10 )); do
	echo i=$i
	let i++
done

But now this is unnecessary, as bash since version 2.04 (which means "for a long time", as the current version is 4.x) has a classic C-style for loop construct. It is just very rarely used and is virtually unknown. See C-style for loop below.

A while loop is often used to read a file line by line and perform some action on each line:

exec <$file # redirects stdin to a file
while read line # read one line from the file ( the return code is zero if read statement can read the line) 
do
   echo $line
done

or, using pipes

cat $file | while read line 
do
   echo $line
done
The trick with the exec command demonstrated above redirects the file to STDIN. See exec command for more details. In essence it is equivalent to invoking the script like this:
./somescript < inputfile

The script can accomplish the same thing internally by using exec < inputfile. After that line, the script's input is inputfile. You can do the same trick with output:

exec > somescript.log 2>&1

This sends stderr and stdout, combined, into the file. There is more to it, but these are the basics.
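
Putting the two redirections together, a minimal sketch (inputfile and somescript.log are just placeholder names):

#!/bin/bash
exec < inputfile            # from now on, read statements read from inputfile
exec > somescript.log 2>&1  # stdout and stderr both go to the log file

while read -r line ; do
    echo "processing: $line"
done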

As a more complex example, let's discuss an implementation of a simplified version of the shell's built-in whence command. By "simplified" we mean that we will implement only the part that checks all of the directories in your PATH for the command you give as the argument (we won't implement checking for aliases, built-in commands, etc.).

The simplest way to do it is to pick the directories in PATH one by one and check whether a file with the given name is present in each of them and whether you have permission to execute it. To extract each directory we will use the shell's pattern-matching operator %%. Here is the code:

mypath=$PATH:
while [[ -n $mypath ]]; do
    mydir=${mypath%%:*}
    if [[ -x $mydir/$1 && ! -d $mydir/$1 ]]; then
        print "$mydir/$1"
        break # exit the loop 
    fi
    mypath=${mypath#*:}  
done

Here is the trace of this script

bash -x  test.sh nonexistant
+ mypath=/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/usr/lib64/qt-3.3/bin
+ [[ -x /usr/lib64/qt-3.3/bin/nonexistant ]]
+ mypath=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/usr/local/sbin
+ [[ -x /usr/local/sbin/nonexistant ]]
+ mypath=/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/usr/local/bin
+ [[ -x /usr/local/bin/nonexistant ]]
+ mypath=/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /sbin:/bin:/usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/sbin
+ [[ -x /sbin/nonexistant ]]
+ mypath=/bin:/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /bin:/usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/bin
+ [[ -x /bin/nonexistant ]]
+ mypath=/usr/sbin:/usr/bin:/root/bin:
+ [[ -n /usr/sbin:/usr/bin:/root/bin: ]]
+ mydir=/usr/sbin
+ [[ -x /usr/sbin/nonexistant ]]
+ mypath=/usr/bin:/root/bin:
+ [[ -n /usr/bin:/root/bin: ]]
+ mydir=/usr/bin
+ [[ -x /usr/bin/nonexistant ]]
+ mypath=/root/bin:
+ [[ -n /root/bin: ]]
+ mydir=/root/bin
+ [[ -x /root/bin/nonexistant ]]
+ mypath=
+ [[ -n '' ]]

The only thing that might need commentary here is the somewhat obscure pattern-matching expression used in the statement:

mydir=${mypath%%:*}

This notation was introduced in the Korn shell; it allows you to perform string operations that previously required an external command. The notation is quite idiosyncratic: %% deletes the longest match of the given pattern from the end of the string. Here the pattern :* matches everything from the first colon to the end, so the statement keeps only the first directory of $mypath and assigns it to the variable mydir. The companion statement mypath=${mypath#*:} uses #, which deletes the shortest match of the pattern from the front of the string, removing that first directory (and its trailing colon) from $mypath.

Thus, the code loops through all of the directories in PATH, one by one, truncating the copy in mypath in the process. It exits when it finds a matching executable file or when it has "eaten up" the entire PATH. If no matching executable file is found, it prints nothing and exits with an error status.
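
To see the two operators in isolation, here is a quick check you can run at an interactive prompt (the path value is arbitrary):

mypath="/usr/local/bin:/usr/bin:/bin:"
echo ${mypath%%:*}    # /usr/local/bin  -- %% deletes the longest suffix matching ':*'
echo ${mypath#*:}     # /usr/bin:/bin:  -- #  deletes the shortest prefix matching '*:'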

We can enhance this script a bit by taking advantage of the UNIX utility file(1), using it instead of echo. The file command examines the files given as arguments and determines what type they are, based on the file's magic number and various heuristics (educated guesses). A magic number is a field in the header of an executable file that the linker sets to identify what type of executable it is. If the filename is not an executable program, file will examine the first few lines and try to guess what kind of information the file contains. If the file contains text (as opposed to binary data), file will look for indications that it is English, shell commands, C, FORTRAN, troff(1) input, and various other things. file guesses wrong sometimes, but it is mostly correct.

mypath=$PATH:
while [[ -n $mypath ]]; do
    mydir=${mypath%%:*}
    if [[ -x $mydir/$1 && ! -d $mydir/$1 ]]; then
        file "$mydir/$1"
        break # exit the loop 
    fi
    mypath=${mypath#*:}  
done

You can feed a file into a while loop; this is a typical way in shell to read a file line by line and process each line:

while read p; do
echo $p
done < myfile.txt

The loop body may read from an arbitrary file descriptor; you can open, for example, descriptor 10:
while read -u 10 p; do
  ...
done 10< myfile.txt

Here, 10 is just an arbitrary number (different from the standard descriptors 0, 1, and 2). We can also use multiple wildcards and even environment variables in the word list of a for loop:

for x in /etc/*.conf /var/mes* 
do
    cp $x /Backup
done

You can also use a while loop for processing options in bash scripts (Bash While Loop Example – nixCraft), and this is the typical way to program it in shell:

......
..
while getopts ae:f:hd:s:qx: option
do
        case "${option}"
        in
                a) ALARM="TRUE";;
                e) ADMIN=${OPTARG};;
                d) DOMAIN=${OPTARG};;
                f) SERVERFILE=$OPTARG;;
                s) WHOIS_SERVER=$OPTARG;;
                q) QUIET="TRUE";;
                x) WARNDAYS=$OPTARG;;
                \?) usage
                    exit 1;;
        esac
done
.......
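
One common follow-up right after such a getopts loop (not shown in the elided fragment above) is to discard the options that getopts has consumed, so that any remaining non-option arguments become $1, $2, and so on:

shift $((OPTIND-1))   # drop the processed options; "$@" now holds the remaining arguments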

until loop -- a botched-by-design loop in bash

To make things confusing, there is a construct called the until loop, but it is implemented incorrectly and has the same test at the top as the while loop, unlike the analogous constructs in Pascal (while/do and repeat/until) and C (while and do/while). This is a legacy of the Bourne shell (which had a really horrible design), and it is amazing that such an incorrect solution survived for so long (the notion of feature deprecation was unknown in the '70s and '80s; it is stranger that it was also unknown to the POSIX committee, the committee created to defeat the alliance of Sun and AT&T ;-). The sad fact is that feature deprecation is still not widely practiced in the shell-design world even now, 40 years after the shell was created.

In shell the until condition is checked at the top of the loop, not at the bottom as in the analogous constructs in C and Pascal. So it is just a while loop with the termination check reversed.

As a result of this idiosyncrasy you can convert any until loop into a while loop by simply negating the condition. Therefore we will ignore the existence of the until loop throughout the rest of the course.
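
For example, the ping loop shown earlier can be written either way; the two forms below behave identically:

until ping -c 1 $ip > /dev/null ; do
    sleep 60
done

while ! ping -c 1 $ip > /dev/null ; do
    sleep 60
done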

Breaking out of a Loop

To break out of a loop, the break command is used. Command execution will continue with the first command line found after the end of the loop (after the word "done").

If loops are nested, it is possible to break not only out of the current loop but out of any number of enclosing loops, by following the word break with the number of loop levels you need to exit:

while true
do
    for count in nine eight seven six five four three two one ; do
        echo "$count"
        if [[ "$count" = "four" ]] ; then
            break
        fi
    done

    echo Launch was aborted at $count # break statement passes control to this line

done

In this example we will break out of both the inner and the outer loop:

while true
do
    for count in nine eight seven six five four three two one ; do
        echo "$count"
        if [[ "$count" = "four" ]] ; then
            break 2
        fi
    done

    command_n

done
another_command_line # <<<< break 2 statement passes control to this line <<<<

Continuing to the next iteration

It is possible to force the loop to skip the rest of the loop body and continue with the next iteration. In other words, continue in bash means: proceed to the next iteration of the loop. While this statement is traditionally called continue (as in C), it could more accurately be called the next statement (as in Perl).

for level1_var in one two three four five ; do
    echo "$level1_var"
    for level2_var in a b c d
    do # <<<< continue statement passes control to this line

        echo "$level1_var $level2_var"
        if [[ "$level2_var" = "c" ]] ; then
            continue
        fi
        commands_in_nested_loop

    done
done

The continue statement can take an argument which can be used for escaping from nested loops. It specifies the number of enclosing loop levels at which to continue with the next iteration.

for level1_var in one two three four five ; do
    # <<<< continue 2 statement passes control to this line
    echo "$level1_var"
    for level2_var in a b c d ; do
        # <<<< continue statement passes control to this line

        echo "$level1_var $level2_var"
        if [[ "$level2_var" = "c" ]] ; then
            continue 2
        fi
        commands_in_nested_loop

    done

done

Examples of while Loops
from Advanced BASH Programming Guide

Slightly updated to reflect constructs available in bash 4.2:

var0=0
LIMIT=10

while (( var0 < LIMIT )) ; do
  echo -n "$var0 "        # -n suppresses newline.
  #             ^ Space is used to separate printed out numbers.

  var0=`expr $var0 + 1`   # var0=$(($var0+1))  also works.
                          # var0=$((var0 + 1)) also works.
                          # let "var0 += 1"    also works.
done                      # Various other methods also work.

echo

exit 0

Another while loop

while [[ "$var1" != "end" ]]     # while test "$var1" != "end"
do                             # also works.
  echo "Input variable #1 (end to exit) "
  read var1                    # Not 'read $var1' (shell idiosyncrasy).
  echo "variable #1 = $var1"   # Need quotes because of "#".
  # If input is 'end', echoes it here.
  # Does not test for termination condition until top of loop.
  echo
done  
exit 0

A while loop may employ C-like syntax by using the double parentheses construct :

# wh-loopc.sh: Count to 10 in a "while" loop.

LIMIT=10
a=1

while (( a < LIMIT )) ; do
  echo -n "$a "
  let "a+=1"
done           # No surprises, so far.

echo; 

# +=================================================================+

# Now, repeat with C-like syntax.

((a = 1))      # a=1
# Double parentheses permit space when setting a variable, as in C.

while (( a <= LIMIT ))   # Double parentheses, and no "$" preceding variables.
do
  echo -n "$a "
  ((a += 1))   # let "a+=1"
  # Yes, indeed.
  # Double parentheses permit incrementing a variable with C-like syntax.
done

echo

# Now, C programmers can feel right at home in Bash.

exit 0

Examples of until Loops from Advanced BASH Programming Guide

#!/bin/bash

END_CONDITION=end

until [[ "$var1" = "$END_CONDITION" ]] ; do
  echo "Input variable #1 "
  echo "($END_CONDITION to exit)"
  read var1
  echo "variable #1 = $var1"
  echo
done  

exit 0

For loops

In bash there are two types of for loops: the "old" (Bourne-style) one and the "new" C-style one. The latter is much better, and unless you need to do something for which the old for loop is especially suitable, you should avoid the old syntax. Forget the silly noise about compatibility: bash is now the de-facto standard and is available on most operating systems. It is still not installed by default on HP-UX 11, but a precompiled version exists and is easily installable. It is easier to install bash on all systems than to struggle with compatibility questions (ksh93 is an alternative, and it is more reliable).

"Old" for loop

This type of for loop is a bastard child of the initial versions of the shell, when the key requirement was a small executable size. Like many bad design decisions it survived into all subsequent versions. The for loop allows you to repeat a section of code a fixed number of times. During each pass through the code (known as an iteration), a special variable called the loop variable is set to a different value; this way each iteration can do something slightly different.

The for loop used in the Bourne shell is different from the for loop used in C. The main difference is that the shell's for loop doesn't let you specify a number of iterations and a counter. Instead, it iterates through a list, which can be supplied in several different ways.

The syntax of the for loop in shell is close to the foreach loop in Perl:

for name [in list ]
do
    statements that can use the variable $name, which changes on each iteration...
done
for example:
for p in `seq 2 10` ; do  ssh p$p "hostname && ls -l /etc/cron.hourly "; done

The list can also be produced by a glob (wildcard) pattern, for example:

for myfile in * ; do
     echo $myfile is residing in the current directory
done
In the example above, * is a glob pattern that is expanded against the current directory. The pattern can be more complex:
for myfile in /etc/*.conf ; do
     echo $myfile is residing in the /etc directory
done
This will list the config files in the /etc directory. In a typical RHEL /etc directory there are over 60 such files, which suggests that something is wrong with Linux ;-)

More about using lists in for loop

The list is a list of names. (If in list is omitted, the list defaults to "$@", i.e., the quoted list of command-line arguments, but you should always supply the in list for the sake of clarity.) For example:

for user in nick serg dave leo bob john
do
    grep $user /etc/passwd
done

The list of elements over which we iterate can also be supplied by a variable:

USER_NAMES="nick serg dave leo bob john" 
for user in $USER_NAMES ; do
    grep $user /etc/passwd 
done

The [list] in a for loop may contain filename globbing, that is, using wildcards for filename expansion.

The * in the for construct is not the same as $*. It's a wildcard, i.e., it expands to all files in the current directory.

CP=$(which cp)
for f in * ; do
   if [ -f ${f}.bak ] ; then
      echo "skiping $f file. Backup exists"
      continue # read the next file
   fi
   $CP $f $f.bak
done;

Another similar example:

for myfile in /etc/r* ; do
    if [ -d "$myfile" ] 
    then
      echo "directory $myfile"
    else
      echo "\t$myfile"
    fi
done

As an example, let's create a script that prints the type of each file passed as a parameter:

for filename in "$@" ; do
    finfo $filename
    echo
done

The loop calls subroutine finfo, which we now need to define:

function finfo 
{
    if [[ ! -a $1 ]]; then
        print "file $1 does not exist."
        return 1
    fi
    file $1
}

The complete script consists of the for loop code and the above function, in either order; good programming style dictates that the function definition should go first.

Using ranges in for loop

Bash version 3.0+ has built-in support for setting up ranges:

#!/bin/bash
for i in {1..5}
do
   echo "Welcome $i times"
done

Bash v4.0+ has built-in support for setting up a step value using {START..END..INCREMENT} syntax:

#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
  do 
     echo "Welcome $i times"
 done

An older way of doing the same thing is to use the seq command; with seq in a for loop you can generate simple sequences programmatically:

#!/bin/bash
for i in `seq 1 10`; do
   echo $i
done

This example is trivial to replicate with a while loop and a counter. But seq can also generate sequences of floating-point numbers, for example:

for flc in $(seq 1.0 .01 1.1)
do
   echo $flc; 
done

Here is a more complex example, borrowed from Learning the Korn Shell (A Nutshell Handbook):

Your UNIX system has the ability to transfer files from an MS-DOS system, but it leaves the DOS filenames intact. Write a script that translates the filenames in a given directory from DOS format to a more UNIX-friendly format.

DOS filenames have the format FILENAME.EXT. FILENAME can be up to eight characters long; EXT is an extension that can be up to three characters. The dot is required even if the extension is null; letters are all uppercase. We want to do the following:

  1. Translate letters from uppercase to lowercase.

  2. If the extension is null, remove the dot.

The first tool we will need for this job is the UNIX tr(1) utility, which translates characters on a one-to-one basis. Given the arguments charset1 and charset2, it will translate characters in the standard input that are members of charset1 into corresponding characters in charset2. The two sets are ranges of characters enclosed in square brackets ([] in standard regular-expression form in the manner of grep, awk, ed, etc.). More to the point, tr [A-Z] [a-z] takes its standard input, converts uppercase letters to lowercase, and writes the converted text to the standard output.

That takes care of the first step in the translation process. We can use a Korn shell string operator to handle the second. Here is the code for a script we'll call dosmv:

for filename in ${1:+$1/}* ; do
    newfilename=$(print $filename | tr [A-Z] [a-z])
    newfilename=${newfilename%.}
    print "$filename -> $newfilename"
    mv $filename $newfilename
done

This script accepts a directory name as argument, the default being the current directory. The expression ${1:+$1/} evaluates to the argument ($1) with a slash appended if the argument is supplied, or the null string if it isn't supplied. So the entire expression ${1:+$1/}* evaluates to all files in the given directory, or all files in the current directory if no argument is given.
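
A quick way to see how ${1:+$1/} behaves with and without an argument (/tmp is an arbitrary example):

set -- /tmp         # pretend the script received one argument, /tmp
echo "${1:+$1/}*"   # prints: /tmp/*
set --              # pretend the script received no arguments
echo "${1:+$1/}*"   # prints: *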

Therefore, filename takes on the value of each filename in the list. filename gets translated into newfilename in two steps. (We could have done it in one, but readability would have suffered.) The first step uses tr in a pipeline within a command substitution construct. Our old friend print makes the value of filename the standard input to tr. tr's output becomes the value of the command substitution expression, which is assigned to newfilename. Thus, if $filename were DOSFILE.TXT, newfilename would become dosfile.txt.

The second step uses one of the shell's pattern-matching operators, the one that deletes the shortest match it finds at the end of the string. The pattern here is ., which means a dot at the end of the string. This means that the expression ${newfilename%.} will delete a dot from $newfilename only if it's at the end of the string; otherwise the expression will leave $newfilename intact. For example, if $newfilename is dosfile.txt, it will be untouched, but if it's dosfile., the expression will change it to dosfile without the final dot. In either case, the new value is assigned back to newfilename.

UNIX regular expression mavens should remember that this is shell wildcard syntax, in which dots are not operators and therefore do not need to be backslash-escaped.
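
Again, the operator is easy to check in isolation:

newfilename="dosfile."
echo ${newfilename%.}     # dosfile      -- the trailing dot is removed
newfilename="dosfile.txt"
echo ${newfilename%.}     # dosfile.txt  -- no dot at the end, so the string is unchanged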

The last statement in the for loop body does the file renaming with the standard UNIX mv(1) command. Before that, a print command simply informs the user of what's happening.

There is one little problem with this solution: if there are any files in the given directory that aren't DOS files (in particular, if there are files whose names don't contain uppercase letters and don't contain a dot), then the conversion will do nothing to those filenames and mv will be called with two identical arguments. mv will complain with the message: mv: filename and filename are identical. We can solve this problem by letting grep determine whether each file has a DOS filename or not. The grep regular expression:

[^a-z]\{1,8\}\.[^a-z]\{0,3\}

is adequate (for these purposes) for matching DOS-format filenames. The character class [^a-z] means "any character except a lowercase letter." So the entire regular expression means: "Between 1 and 8 non-lowercase letters, followed by a dot, followed by 0 to 3 non-lowercase letters."

When grep runs, it normally prints all of the lines in its standard input that match the pattern you give it as argument. But we only need it to test whether or not the pattern is matched. Luckily, grep's exit status is "well-behaved": it's 0 if there is a match in the input, 1 if not. Therefore, we can use the exit status to test for a match. We also need to discard grep's output; to do this, we redirect it to the special file /dev/null, which is colloquially known as the "bit bucket." Any output directed to /dev/null effectively disappears. Thus, the command line:

print "$filename" | grep '[^a-z]\{1,8\}\.[^a-z]\{0,3\}' > /dev/null

prints nothing and returns exit status 0 if the filename is in DOS format, 1 if not.

(Some Berkeley-derived versions of UNIX have a -s ("silent") option to grep that suppresses standard output, thereby making redirection to /dev/null unnecessary.)
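
One way the test could be folded into the dosmv loop is sketched below. This is only a sketch, not the book's exact solution; note that when a directory argument is given the pattern is applied to the whole path, which is acceptable here because grep matches anywhere in the line:

for filename in ${1:+$1/}* ; do
    if print "$filename" | grep '[^a-z]\{1,8\}\.[^a-z]\{0,3\}' > /dev/null ; then
        newfilename=$(print $filename | tr [A-Z] [a-z])
        newfilename=${newfilename%.}
        print "$filename -> $newfilename"
        mv $filename $newfilename
    fi
done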

As an exercise, please modify the dosmv script so that the argument list may contain wildcards.

New (C-style) for loop

In bash (since version 2.04) a new form of for syntax, one that looks a lot like the C language but with double parentheses, was introduced:

for (( i=0 ; i < 10 ; i++ )) ; do echo $i ; done

Its more general form can be described as:

for (( expr1 ; expr2 ; expr3 )) ; do list ; done

The use of double parentheses indicates that the expressions inside use the syntax of the ((...)) arithmetic construct.

Here is an example of how to shut down a Windows XP machine after a certain time interval (10 minutes in the example below) using the shutdown command:

for (( i = 1; i <= 10; i++ )) ; do
    sleep 60
done
shutdown -s

Several iteration counters can be used. For example:

for (( i=0, j=10 ; i < 10 ; i++, j-- ))
do
    echo $i $j
done

This for loop initializes two variables (i and j), then tests i against the limit in the second expression; the comma operator in the third expression increments i and decrements j on each iteration.

Like "while true" C-style for loop can be used for creation of "infinite loops", loops which end (break) due to Ctrl-C entered or due to execution of some internal statement that passes control outside the loop, not because of the condition specified in loop header. For example

for (( ; ; ))
do
   sleep 60
   echo `date` "[Hit CTRL+C to stop]"
done

Examples of for loops
from Advanced BASH programming Guide

#!/bin/bash
# Listing the planets.

for planet in Mercury Venus Earth Mars Jupiter Saturn Uranus Neptune Pluto
do
  echo $planet  # Each planet on a separate line.
done

echo

for planet in "Mercury Venus Earth Mars Jupiter Saturn Uranus Neptune Pluto"
# All planets constitute a single argument and will be printed on same line.
do
  echo $planet
done

exit 0

Each element in double or single quotes can contain multiple words. Those words can be split inside the loop: the next example forces parsing of each [list] element and assignment of each component to the positional parameters.

for loop with two parameters in each [list] element

#!/bin/bash
# Planets revisited.

# Associate the name of each planet with its distance from the sun.

for planet in "Mercury 36" "Venus 67" "Earth 93"  "Mars 142" "Jupiter 483"
do
  set -- $planet  # Parses variable "planet" and sets positional parameters.
                  # the "--" prevents nasty surprises if $planet is null or begins with a dash.

  # May need to save original positional parameters, since they get overwritten.
  # One way of doing this is to use an array,
  #        original_params=("$@")

  echo "$1		$2,000,000 miles from the sun"
  #-------two  tabs---concatenate zeroes onto parameter $2
done

# (Thanks, S.C., for additional clarification.)

exit 0

A variable may supply the [list] in a for loop.

Fileinfo: operating on a file list contained in a variable

#!/bin/bash
# fileinfo.sh

FILES="/usr/sbin/privatepw
/usr/sbin/pwck
/usr/sbin/go500gw
/usr/bin/fakefile
/sbin/mkreiserfs
/sbin/ypbind"     # List of files you are curious about.
                  # Threw in a dummy file, /usr/bin/fakefile.

echo

for file in $FILES
do

  if [ ! -e "$file" ]       # Check if file exists.
  then
    echo "$file does not exist."; echo
    continue                # On to next.
   fi

  ls -l $file | awk '{ print $9 "         file size: " $5 }'  # Print 2 fields.
  whatis `basename $file`   # File info.
  echo
done  

exit 0

The [list] in a for loop may contain filename globbing, that is, using wildcards for filename expansion.

Operating on files with a for loop

#!/bin/bash
# list-glob.sh: Generating [list] in a for-loop using "globbing".

echo

for file in *
do
  ls -l "$file"  # Lists all files in $PWD (current directory).
  # Recall that the wild card character "*" matches every filename,
  # however, in "globbing", it doesn't match dot-files.

  # If the pattern matches no file, it is expanded to itself.
  # To prevent this, set the nullglob option
  # (shopt -s nullglob).
  # Thanks, S.C.
done

echo; echo

for file in [jx]*
do
  rm -f $file    # Removes only files beginning with "j" or "x" in $PWD.
  echo "Removed file \"$file\"".
done

echo

exit 0

Omitting the in [list] part of a for loop causes the loop to operate on $@, the list of arguments given on the command line to the script.

A particularly clever illustration of this is Example A-17 in the Advanced Bash-Scripting Guide.

Missing in [list] in a for loop

#!/bin/bash

#  Invoke this script both with and without arguments,
#+ and see what happens.

for a
do
 echo -n "$a "
done

#  The 'in list' missing, therefore the loop operates on '$@'
#+ (command-line argument list, including whitespace).

echo

exit 0

It is possible to use command substitution to generate the [list] in a for loop.

Generating the [list] in a for loop with command substitution

#!/bin/bash
#  for-loopcmd.sh: for-loop with [list]
#+ generated by command substitution.

NUMBERS="9 7 3 8 37.53"

for number in `echo $NUMBERS`  # for number in 9 7 3 8 37.53
do
  echo -n "$number "
done

echo 
exit 0

This is a somewhat more complex example of using command substitution to create the [list].

A grep replacement for binary files

#!/bin/bash
# bin-grep.sh: Locates matching strings in a binary file.

# A "grep" replacement for binary files.
# Similar effect to "grep -a"

E_BADARGS=65
E_NOFILE=66

if [ $# -ne 2 ]
then
  echo "Usage: `basename $0` search_string filename"
  exit $E_BADARGS
fi

if [ ! -f "$2" ]
then
  echo "File \"$2\" does not exist."
  exit $E_NOFILE
fi 

IFS="\n"         # Per suggestion of Paulo Marcel Coelho Aragao.
for word in $( strings "$2" | grep "$1" )
# The "strings" command lists strings in binary files.
# Output then piped to "grep", which tests for desired string.
do
  echo $word
done

# As S.C. points out, lines 23 - 29 could be replaced with the simpler
#    strings "$2" | grep "$1" | tr -s "$IFS" '[\n*]'


# Try something like  "./bin-grep.sh mem /bin/ls"  to exercise this script.

exit 0

More of the same.

Listing all users on the system

#!/bin/bash
# userlist.sh

PASSWORD_FILE=/etc/passwd
n=1           # User number

for name in $(awk 'BEGIN{FS=":"}{print $1}' < "$PASSWORD_FILE" )
# Field separator = :    ^^^^^^
# Print first field              ^^^^^^^^
# Get input from password file               ^^^^^^^^^^^^^^^^^
do
  echo "USER #$n = $name"
  let "n += 1"
done  


# USER #1 = root
# USER #2 = bin
# USER #3 = daemon
# ...
# USER #30 = bozo

exit 0

A final example of the [list] resulting from command substitution.

Checking all the binaries in a directory for authorship

#!/bin/bash
# findstring.sh:
# Find a particular string in binaries in a specified directory.

directory=/usr/bin/
fstring="Free Software Foundation"  # See which files come from the FSF.

for file in $( find $directory -type f -name '*' | sort )
do
  strings -f $file | grep "$fstring" | sed -e "s%$directory%%"
  #  In the "sed" expression,
  #+ it is necessary to substitute for the normal "/" delimiter
  #+ because "/" happens to be one of the characters filtered out.
  #  Failure to do so gives an error message (try it).
done  

exit 0

#  Exercise (easy):
#  ---------------
#  Convert this script to taking command-line parameters
#+ for $directory and $fstring.

The output of a for loop may be piped to a command or commands.

Listing the symbolic links in a directory

#!/bin/bash
# symlinks.sh: Lists symbolic links in a directory.


directory=${1-`pwd`}
#  Defaults to current working directory,
#+ if not otherwise specified.
#  Equivalent to code block below.
# ----------------------------------------------------------
# ARGS=1                 # Expect one command-line argument.
#
# if [ $# -ne "$ARGS" ]  # If not 1 arg...
# then
#   directory=`pwd`      # current working directory
# else
#   directory=$1
# fi
# ----------------------------------------------------------

echo "symbolic links in directory \"$directory\""

for file in "$( find $directory -type l )"   # -type l = symbolic links
do
  echo "$file"
done | sort                                  # Otherwise file list is unsorted.
#  Strictly speaking, a loop isn't really necessary here,
#+ since the output of the "find" command is expanded into a single word.
#  However, it's easy to understand and illustrative this way.

#  As Dominik 'Aeneas' Schnitzer points out,
#+ failing to quote  $( find $directory -type l )
#+ will choke on filenames with embedded whitespace.
#  Even this will only pick up the first field of each argument.

exit 0


# Jean Helou proposes the following alternative:

echo "symbolic links in directory \"$directory\""
# Backup of the current IFS. One can never be too cautious.
OLDIFS=$IFS
IFS=:

for file in $(find $directory -type l -printf "%p$IFS")
do     #                              ^^^^^^^^^^^^^^^^
       echo "$file"
       done|sort

The stdout of a loop may be redirected to a file, as this slight modification to the previous example shows.

Symbolic links in a directory, saved to a file

#!/bin/bash
# symlinks.sh: Lists symbolic links in a directory.

OUTFILE=symlinks.list                         # save file

directory=${1-`pwd`}
#  Defaults to current working directory,
#+ if not otherwise specified.


echo "symbolic links in directory \"$directory\"" > "$OUTFILE"
echo "---------------------------" >> "$OUTFILE"

for file in "$( find $directory -type l )"    # -type l = symbolic links
do
  echo "$file"
done | sort >> "$OUTFILE"                     # stdout of loop
#           ^^^^^^^^^^^^^                       redirected to save file.

exit 0

There is an alternative syntax to a for loop that will look very familiar to C programmers. This requires double parentheses.

A C-like for loop

#!/bin/bash
# Two ways to count up to 10.

echo

# Standard syntax.
for a in 1 2 3 4 5 6 7 8 9 10
do
  echo -n "$a "
done  

echo; echo

# +==========================================+

# Now, let's do the same, using C-like syntax.

LIMIT=10

for ((a=1; a <= LIMIT ; a++))  # Double parentheses, and "LIMIT" with no "$".
do
  echo -n "$a "
done                           # A construct borrowed from 'ksh93'.

echo; echo

# +=========================================================================+

# Let's use the C "comma operator" to increment two variables simultaneously.

for ((a=1, b=1; a <= LIMIT ; a++, b++))  # The comma chains together operations.
do
  echo -n "$a-$b "
done

echo; echo

exit 0

---




Old News ;-)

[Sep 01, 2017] linux - Looping through the content of a file in Bash - Stack Overflow

Notable quotes:
"... done <<< "$(...)" ..."
Sep 01, 2017 | stackoverflow.com

Peter Mortensen , asked Oct 5 '09 at 17:52

How do I iterate through each line of a text file with Bash ?

With this script

echo "Start!"
for p in (peptides.txt)
do
    echo "${p}"
done

I get this output on the screen:

Start!
./runPep.sh: line 3: syntax error near unexpected token `('
./runPep.sh: line 3: `for p in (peptides.txt)'

(Later I want to do something more complicated with $p than just output to the screen.)


The environment variable SHELL is (from env):

SHELL=/bin/bash

/bin/bash --version output:

GNU bash, version 3.1.17(1)-release (x86_64-suse-linux-gnu)
Copyright (C) 2005 Free Software Foundation, Inc.

cat /proc/version output:

Linux version 2.6.18.2-34-default (geeko@buildhost) (gcc version 4.1.2 20061115 (prerelease) (SUSE Linux)) #1 SMP Mon Nov 27 11:46:27 UTC 2006

The file peptides.txt contains:

RKEKNVQ
IPKKLLQK
QYFHQLEKMNVK
IPKKLLQK
GDLSTALEVAIDCYEK
QYFHQLEKMNVKIPENIYR
RKEKNVQ
VLAKHGKLQDAIN
ILGFMK
LEDVALQILL

Bruno De Fraine , answered Oct 5 '09 at 18:00

One way to do it is:
while read p; do
  echo $p
done <peptides.txt

Exceptionally, if the loop body may read from standard input , you can open the file using a different file descriptor:

while read -u 10 p; do
  ...
done 10<peptides.txt

Here, 10 is just an arbitrary number (different from 0, 1, 2).

Warren Young , answered Oct 5 '09 at 17:54

cat peptides.txt | while read line
do
   # do something with $line here
done

Stan Graves , answered Oct 5 '09 at 18:18

Option 1a: While loop: Single line at a time: Input redirection
#!/bin/bash
filename='peptides.txt'
echo Start
while read p; do 
    echo $p
done < $filename

Option 1b: While loop: Single line at a time:
Open the file, read from a file descriptor (in this case file descriptor #4).

#!/bin/bash
filename='peptides.txt'
exec 4<$filename
echo Start
while read -u4 p ; do
    echo $p
done

Option 2: For loop: Read file into single variable and parse.
This syntax will parse "lines" based on any white space between the tokens. This still works because the given input file lines are single-word tokens. If there were more than one token per line, then this method would not work as well. Also, reading the full file into a single variable is not a good strategy for large files.

#!/bin/bash
filename='peptides.txt'
filelines=`cat $filename`
echo Start
for line in $filelines ; do
    echo $line
done

mightypile , answered Oct 4 '13 at 13:30

This is no better than other answers, but is one more way to get the job done in a file without spaces (see comments). I find that I often need one-liners to dig through lists in text files without the extra step of using separate script files.
for word in $(cat peptides.txt); do echo $word; done

This format allows me to put it all in one command-line. Change the "echo $word" portion to whatever you want and you can issue multiple commands separated by semicolons. The following example uses the file's contents as arguments into two other scripts you may have written.

for word in $(cat peptides.txt); do cmd_a.sh $word; cmd_b.py $word; done

Or if you intend to use this like a stream editor (learn sed) you can dump the output to another file as follows.

for word in $(cat peptides.txt); do cmd_a.sh $word; cmd_b.py $word; done > outfile.txt

I've used these as written above because I have used text files where I've created them with one word per line. (See comments) If you have spaces that you don't want splitting your words/lines, it gets a little uglier, but the same command still works as follows:

OLDIFS=$IFS; IFS=$'\n'; for line in $(cat peptides.txt); do cmd_a.sh $line; cmd_b.py $line; done > outfile.txt; IFS=$OLDIFS

This just tells the shell to split on newlines only, not spaces, then returns the environment back to what it was previously. At this point, you may want to consider putting it all into a shell script rather than squeezing it all into a single line, though.

Best of luck!

Jahid , answered Jun 9 '15 at 15:09

Use a while loop, like this:
while IFS= read -r line; do
   echo "$line"
done <file

Notes:

  1. If you don't set the IFS properly, you will lose indentation.
  2. You should almost always use the -r option with read.
  3. Don't read lines with for

codeforester , answered Jan 14 at 3:30

A few more things not covered by other answers: Reading from a delimited file
# ':' is the delimiter here, and there are three fields on each line in the file
# IFS set below is restricted to the context of `read`, it doesn't affect any other code
while IFS=: read -r field1 field2 field3; do
  # process the fields
  # if the line has less than three fields, the missing fields will be set to an empty string
  # if the line has more than three fields, `field3` will get all the values, including the third field plus the delimiter(s)
done < input.txt
Reading from more than one file at a time
while read -u 3 -r line1 && read -u 4 -r line2; do
  # process the lines
  # note that the loop will end when we reach EOF on either of the files, because of the `&&`
done 3< input1.txt 4< input2.txt
Reading a whole file into an array (Bash version 4+)
readarray -t my_array < my_file

or

mapfile -t my_array < my_file

And then

for line in "${my_array[@]}"; do
  # process the lines
done

Anjul Sharma , answered Mar 8 '16 at 16:10

If you don't want your read to be broken by newline character, use -
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "$line"
done < "$1"

Then run the script with file name as parameter.

Sine , answered Nov 14 '13 at 14:23

#!/bin/bash
#
# Change the file name from "test" to desired input file 
# (The comments in bash are prefixed with #'s)
for x in $(cat test.txt)
do
    echo $x
done

dawg , answered Feb 3 '16 at 19:15

Suppose you have this file:
$ cat /tmp/test.txt
Line 1
    Line 2 has leading space
Line 3 followed by blank line

Line 5 (follows a blank line) and has trailing space    
Line 6 has no ending CR

There are four elements that will alter the meaning of the file output read by many Bash solutions:

  1. The blank line 4;
  2. Leading or trailing spaces on two lines;
  3. Maintaining the meaning of individual lines (i.e., each line is a record);
  4. The line 6 not terminated with a CR.

If you want the text file line by line including blank lines and terminating lines without CR, you must use a while loop and you must have an alternate test for the final line.

Here are the methods that may change the file (in comparison to what cat returns):

1) Lose the last line and leading and trailing spaces:

$ while read -r p; do printf "%s\n" "'$p'"; done </tmp/test.txt
'Line 1'
'Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space'

(If you do while IFS= read -r p; do printf "%s\n" "'$p'"; done </tmp/test.txt instead, you preserve the leading and trailing spaces but still lose the last line if it is not terminated with CR)

2) Using command substitution with cat reads the entire file in one gulp and loses the meaning of individual lines:

$ for p in "$(cat /tmp/test.txt)"; do printf "%s\n" "'$p'"; done
'Line 1
    Line 2 has leading space
Line 3 followed by blank line

Line 5 (follows a blank line) and has trailing space    
Line 6 has no ending CR'

(If you remove the " from $(cat /tmp/test.txt) you read the file word by word rather than one gulp. Also probably not what is intended...)


The most robust and simplest way to read a file line-by-line and preserve all spacing is:

$ while IFS= read -r line || [[ -n $line ]]; do printf "'%s'\n" "$line"; done </tmp/test.txt
'Line 1'
'    Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space    '
'Line 6 has no ending CR'

If you want to strip leading and trading spaces, remove the IFS= part:

$ while read -r line || [[ -n $line ]]; do printf "'%s'\n" "$line"; done </tmp/test.txt
'Line 1'
'Line 2 has leading space'
'Line 3 followed by blank line'
''
'Line 5 (follows a blank line) and has trailing space'
'Line 6 has no ending CR'

(A text file without a terminating \n , while fairly common, is considered broken under POSIX. If you can count on the trailing \n you do not need || [[ -n $line ]] in the while loop.)

More at the BASH FAQ


Here is my real-life example of how to loop over the lines of another program's output, check for substrings, drop double quotes from a variable, and use that variable outside of the loop. I guess quite a few people ask these questions sooner or later.
##Parse FPS from first video stream, drop quotes from fps variable
## streams.stream.0.codec_type="video"
## streams.stream.0.r_frame_rate="24000/1001"
## streams.stream.0.avg_frame_rate="24000/1001"
FPS=unknown
while read -r line; do
  if [[ $FPS == "unknown" ]] && [[ $line == *".codec_type=\"video\""* ]]; then
    echo ParseFPS $line
    FPS=parse
  fi
  if [[ $FPS == "parse" ]] && [[ $line == *".r_frame_rate="* ]]; then
    echo ParseFPS $line
    FPS=${line##*=}
    FPS="${FPS%\"}"
    FPS="${FPS#\"}"
  fi
done <<< "$(ffprobe -v quiet -print_format flat -show_format -show_streams -i "$input")"
if [ "$FPS" == "unknown" ] || [ "$FPS" == "parse" ]; then 
  echo ParseFPS Unknown frame rate
fi
echo Found $FPS

Declaring a variable outside of the loop, setting its value inside the loop and using it outside of the loop requires the done <<< "$(...)" syntax. The application needs to be run within the context of the current console. The quotes around the command keep the newlines of the output stream.

The loop matches for substrings, then reads the name=value pair, splits off the right-side part after the last = character, drops the first quote, drops the last quote, and we have a clean value to be used elsewhere.

[Jan 14, 2017] Bash For Loop Examples by Vivek Gite

October 31, 2008 (last updated November 13, 2016)
How do I use bash for loop to repeat certain task under Linux / UNIX operating system? How do I set infinite loops using for statement? How do I use three-parameter for loop control expression?

A 'for loop' is a bash programming language statement which allows code to be repeatedly executed. A for loop is classified as an iteration statement i.e. it is the repetition of a process within a bash script.

For example, you can run UNIX command or task 5 times or read and process list of files using a for loop. A for loop can be used at a shell prompt or within a shell script itself.

for loop syntax

The numeric-range for loop syntax is as follows:

for VARIABLE in 1 2 3 4 5 .. N
do
	command1
	command2
	commandN
done


OR

for VARIABLE in file1 file2 file3
do
	command1 on $VARIABLE
	command2
	commandN
done


OR

for OUTPUT in $(Linux-Or-Unix-Command-Here)
do
	command1 on $OUTPUT
	command2 on $OUTPUT
	commandN
done


Examples

This type of for loop is characterized by counting. The range is specified by a beginning (#1) and ending number (#5). The for loop executes a sequence of commands for each member in a list of items. A representative example in BASH is as follows to display welcome message 5 times with for loop:

#!/bin/bash
for i in 1 2 3 4 5
do
   echo "Welcome $i times"
done


Sometimes you may need to set a step value (allowing one to count by two's or to count backwards for instance). Latest bash version 3.0+ has inbuilt support for setting up ranges:

#!/bin/bash
for i in {1..5}
do
   echo "Welcome $i times"
done


Bash v4.0+ has built-in support for setting up a step value using {START..END..INCREMENT} syntax:

#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
  do 
     echo "Welcome $i times"
 done


Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

The seq command (outdated)

WARNING! The seq command prints a sequence of numbers and is covered here for historical reasons only. The following example is only recommended for older bash versions. All users (bash v3.x+) are recommended to use the above syntax.

The seq command can be used as follows. A representative example in seq is as follows:

#!/bin/bash
for i in $(seq 1 2 20)
do
   echo "Welcome $i times"
done


There is no good reason to use an external command such as seq to count and increment numbers in the for loop, hence it is recommended that you avoid using seq. The builtin commands are fast.

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))
do
	command1
	command2
	command3
done


A representative three-expression example in bash as follows:

#!/bin/bash
for (( c=1; c<=5; c++ ))
do  
   echo "Welcome $c times"
done


Sample output:

Welcome 1 times
Welcome 2 times
Welcome 3 times
Welcome 4 times
Welcome 5 times

How do I use for as infinite loops?

Infinite for loop can be created with empty expressions, such as:

#!/bin/bash
for (( ; ; ))
do
   echo "infinite loops [ hit CTRL+C to stop]"
done


Conditional exit with break

You can do early exit with break statement inside the for loop. You can exit from within a FOR, WHILE or UNTIL loop using break. General break statement inside the for loop:

for I in 1 2 3 4 5
do
  statements1      #Executed for all values of ''I'', up to a disaster-condition if any.
  statements2
  if (disaster-condition)
  then
	break       	   #Abandon the loop.
  fi
  statements3          #While good and, no disaster-condition.
done


The following shell script will go through all files stored in the /etc directory. The for loop will be abandoned when the /etc/resolv.conf file is found.

#!/bin/bash
for file in /etc/*
do
	if [ "${file}" == "/etc/resolv.conf" ]
	then
		countNameservers=$(grep -c nameserver /etc/resolv.conf)
		echo "Total  ${countNameservers} nameservers defined in ${file}"
		break
	fi
done


Early continuation with continue statement

To resume the next iteration of the enclosing FOR, WHILE or UNTIL loop use continue statement.

for I in 1 2 3 4 5
do
  statements1      #Executed for all values of ''I'', up to a disaster-condition if any.
  statements2
  if (condition)
  then
	continue   #Go to next iteration of I in the loop and skip statements3
  fi
  statements3
done


This script makes a backup of all file names specified on the command line. If a .bak file exists, it will skip the cp command.

#!/bin/bash
FILES="$@"
for f in $FILES
do
        # if .bak backup file exists, read next file
	if [ -f ${f}.bak ]
	then
		echo "Skiping $f file..."
		continue  # read next file and skip the cp command
	fi
        # we are here means no backup file exists, just use cp command to copy file
	/bin/cp $f $f.bak
done

[Dec 06, 2015] Bash For Loop Examples

A very nice tutorial by Vivek Gite (created October 31, 2008, last updated June 24, 2015). His mistake is putting the new for loop too deep inside the tutorial. It should be emphasized, not hidden.
June 24, 2015 | cyberciti.biz

... ... ...

Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:

#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
  do
     echo "Welcome $i times"
 done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

... ... ...

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))
do
	command1
	command2
	command3
done

A representative three-expression example in bash is as follows:

#!/bin/bash
for (( c=1; c<=5; c++ ))
do
   echo "Welcome $c times"
done
... ... ...

Jadu Saikia, November 2, 2008, 3:37 pm

Nice one. All the examples are explained well, thanks Vivek.

seq 1 2 20
output can also be produced using jot

jot - 1 20 2

The infinite loops as everyone knows have the following alternatives.

while(true)
or
while :

//Jadu
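
For reference, here is a minimal sketch of both alternatives Jadu mentions (hit CTRL+C to stop; the sleep is only there to slow the output down):

#!/bin/bash
# ':' is a builtin that always succeeds, so the loop never terminates on its own
while :
do
   echo "infinite loop using while : [ hit CTRL+C to stop ]"
   sleep 1
done

# the same thing spelled with the 'true' command:
# while true; do echo "infinite loop using while true"; sleep 1; done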

Andi Reinbrech, November 18, 2010, 7:42 pm
I know this is an ancient thread, but thought this trick might be helpful to someone:

For the above example with all the cuts, simply do

set `echo $line`

This will split line into positional parameters and you can after the set simply say

F1=$1; F2=$2; F3=$3

I used this a lot many years ago on solaris with "set `date`", it neatly splits the whole date string into variables and saves lots of messy cutting :-)

… no, you can't change the FS, if it's not space, you can't use this method
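
A small sketch of the trick Andi describes; the sample line and the F1..F3 names are just placeholders. Note that set -- is the safer spelling in case the first word happens to start with a dash:

line="alpha beta gamma"
set -- `echo $line`           # word-splits $line into the positional parameters $1, $2, $3
F1=$1; F2=$2; F3=$3
echo "F1=$F1 F2=$F2 F3=$F3"   # prints: F1=alpha F2=beta F3=gamma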

Peko, July 16, 2009, 6:11 pm
Hi Vivek,
Thanks for this a useful topic.

IMNSHO, there may be something to modify here
=======================
Latest bash version 3.0+ has inbuilt support for setting up a step value:

#!/bin/bash
for i in {1..5}
=======================
1) The increment feature seems to belong to version 4 of bash.
Reference: http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
Accordingly, my bash v3.2 does not include this feature.

BTW, where did you read that it was 3.0+ ?
(I ask because you may know some good website of interest on the subject).

2) The syntax is {from..to..step} where from, to, step are 3 integers.
Your code is missing the increment.

Note that the GNU Bash documentation may be buggy at this time, because in the GNU Bash manual
you will find the syntax {x..y[incr]},
which may be a typo (missing the second ".." between y and the increment).

see http://www.gnu.org/software/bash/manual/bashref.html#Brace-Expansion

The Bash Hackers page
again, see http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
seems to be more accurate,
but who knows ? Anyway, at least one of them may be right… ;-)

Keep on the good work of your own,
Thanks a million.

- Peko

Michal Kaut July 22, 2009, 6:12 am
Hello,

is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that
for x in $(seq 0 0.1 1) gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others.
Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the keyboard to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding x=`echo $x | sed s/,/./` as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)

Thanks,
Michal

Peko July 22, 2009, 7:27 am

To Michal Kaut:

Hi Michal,

Such output format is configured through LOCALE settings.

I tried :

export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1

and it works as desired.

You just have to find the exact value for LC_CTYPE that fits to your systems and your needs.

Peko

Peko July 22, 2009, 2:29 pm

To Michal Kaus [2]

Ooops – ;-)
Instead of LC_CTYPE,
LC_NUMERIC should be more appropriate
(although LC_CTYPE actually yields the same result; I tested both)

By the way, Vivek has already documented the matter : http://www.cyberciti.biz/tips/linux-find-supportable-character-sets.html
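
A short sketch of the workaround Peko suggests. The C locale is always available and always uses a dot as the decimal separator, so it is a safe choice if you do not care which language messages appear in:

#!/bin/bash
# force a dot as the decimal separator for seq (and other number-formatting commands)
export LC_NUMERIC=C
for x in $(seq 0 0.1 1)
do
    echo "$x"        # prints 0 0.1 0.2 ... 1 regardless of the system locale
done

# or set it for a single command only:  LC_NUMERIC=C seq 0 0.1 1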

Philippe Petrinko October 30, 2009, 8:35 am

To Vivek:
Regarding your last example, that is, running a loop through arguments given to the script on the command line, there is a simpler way of doing this:
# instead of:
# FILES="$@"
# for f in $FILES

# use the following syntax
for arg
do
# whatever you need here – try : echo "$arg"
done

Of course, you can use any variable name, not only "arg".

Philippe Petrinko November 11, 2009, 11:25 am

To tdurden:

Why wouldn't you use

1) either a [for] loop
for old in * ; do mv ${old} ${old}.new; done

2) Either the [rename] command ?
excerpt from "man rename":

RENAME(1) Perl Programmers Reference Guide RENAME(1)

NAME
rename – renames multiple files

SYNOPSIS
rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

DESCRIPTION
"rename" renames the filenames supplied according to the rule specified
as the first argument. The perlexpr argument is a Perl expression
which is expected to modify the $_ string in Perl for at least some of
the filenames specified. If a given filename is not modified by the
expression, it will not be renamed. If no filenames are given on the
command line, filenames will be read via standard input.

For example, to rename all files matching "*.bak" to strip the
extension, you might say

rename 's/\.bak$//' *.bak

To translate uppercase names to lower, you'd use

rename 'y/A-Z/a-z/' *

- Philippe

Philippe Petrinko November 11, 2009, 9:27 pm

If you set the shell option extglob, Bash understands some more powerful patterns. In each form below, the pattern-list is one or more patterns separated by the pipe symbol (|); a short example follows the list.

?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns

source: http://www.bash-hackers.org/wiki/doku.php/syntax/pattern
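
A small sketch of extglob used inside a for loop; the directory and extensions are hypothetical. The @(…) form matches exactly one of the listed patterns:

#!/bin/bash
shopt -s extglob                    # turn on extended globbing
for image in ~/pictures/*.@(jpg|png)
do
    echo "found image: $image"
done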

Philippe Petrinko November 12, 2009, 3:44 pm

To Sean:
Right, the sharper a knife is, the easier it cuts your fingers…

I mean: there are side effects to the use of file globbing (as in [ for f in * ]) when the globbing expression matches nothing: the globbing expression is not substituted.

Then you might want to consider using [ nullglob ] shell extension,
to prevent this.
see: http://www.bash-hackers.org/wiki/doku.php/syntax/expansion/globs#customization

Devil hides in detail ;-)
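
A minimal sketch of the problem and the nullglob fix (the directory name is hypothetical):

#!/bin/bash
# without nullglob, an unmatched pattern is left as-is, so the body would run
# once with f set to the literal string '/tmp/empty-dir/*'
shopt -s nullglob                   # unmatched globs now expand to nothing
for f in /tmp/empty-dir/*
do
    echo "processing $f"            # never executed if the directory is empty
done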

Dominic January 14, 2010, 10:04 am

There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )) do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"

You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, then you can subsequently test the value of $c to see if the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am

sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:
inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2
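
A small sketch of the pattern Dominic has in mind: because the (( )) form performs a final increment, the counter's value after the loop tells you whether a break happened (the break condition here is an arbitrary placeholder):

#!/bin/bash
for (( c=1; c<=5; c++ ))
do
    [ -e "/tmp/stop.$c" ] && break       # placeholder break condition
done
if (( c <= 5 )); then
    echo "loop was broken at c=$c"
else
    echo "loop ran to completion (c=$c)" # c is 6 here
fi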

Philippe Petrinko March 9, 2010, 2:34 pm

@Dmitry

And, again, as stated many times above, using [seq] is counterproductive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internal functions:


for ((c=1; c<21; c+=2)); do echo "Welcome $c times" ; done

(And I wonder why Vivek sticks to that old solution, which should be presented only for historical reasons, from the time when there was no way of using bash internals.
By the way, this historical recall should be placed only at the end of the topic, not at the top, which makes newbies stick to the outdated technique ;-) )

Sean March 9, 2010, 11:15 pm

I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:

builtin_count.sh:

#!/bin/bash
for ((i=1;i<=1000000;i++))
do
echo "Output $i"
done

seq_count.sh:

#!/bin/bash
for i in $(seq 1 1000000)
do
echo "Output $i"
done

And here were the results that I got:
time ./builtin_count.sh
real 0m22.122s
user 0m18.329s
sys 0m3.166s

time ./seq_count.sh
real 0m19.590s
user 0m15.326s
sys 0m2.503s

The performance difference isn't too significant, especially when you are probably going to be doing something a little more interesting inside the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm

The reason why the external seq is faster is that it is executed only once and returns a huge splurge of space-separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.

The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow etc. etc.

Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluated.

I.e. bottom line, the internal one has more overhead; the "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading them once from a text file with cat. The point is that it gets executed only once and becomes static.

OK, blah blah fishpaste, past my bed time :-)

Cheers,
Andi

Anthony Thyssen June 4, 2010, 6:53 am

The {1..10} syntax would be pretty useful if you could use a variable with it, but plain substitution does not work:

limit=10
echo {1..${limit}}
{1..10}

You need to eval it to get it to work!

limit=10
eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example)
and BASH is not available on all systems either.

You are better off either using the old while-expr method for maximum compatibility!

   limit=10; n=1;
   while [ $n -le 10 ]; do
     echo $n;
     n=`expr $n + 1`;
   done

Alternatively, use a seq() function replacement…

 # seq_count 10
seq_count() {
  i=1; while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}
# simple_seq 1 2 10
simple_seq() {
  i=$1; while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}
seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
    1) i=1 inc=1 end=$1 ;;
    2) i=$1 inc=1 end=$2 ;;
    *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]; do
      printf "$format\n" $i;
      i=`expr $i + $inc`;
    done
  }

Edited: by Admin – added code tags.

TheBonsai June 4, 2010, 9:57 am

The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable towards Korn and Z.

The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.
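
Along those lines, a sketch of seq_count rewritten with POSIX arithmetic expansion instead of expr (same interface as the function above):

# print 1..$1, one per line, without calling the external expr
seq_count() {
    i=1
    while [ "$i" -le "$1" ]; do
        echo "$i"
        i=$((i + 1))        # POSIX arithmetic expansion
    done
}
# usage: seq_count 10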

Philippe Petrinko June 4, 2010, 10:15 am

Right Bonsai,
( http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_06_04 )

But FOR C-style does not seem to be POSIXLY-correct…

Read on-line reference issue 6/2004,
Top is here, http://www.opengroup.org/onlinepubs/009695399/mindex.html

and the Shell and Utilities volume (XCU) T.OC. is here
http://www.opengroup.org/onlinepubs/009695399/utilities/toc.html
doc is:
http://www.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap01.html

and FOR command:
http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_09_04_03

Anthony Thyssen June 6, 2010, 7:18 am

TheBonsai wrote…. "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."

I am not certain it is in Posix. It was NOT part of the original Bourne Shell, and on some machines, I deal with Bourne Shell. Not Ksh, Bash, or anything else.

Bourne Shell syntax works everywhere! But as 'expr' is a builtin in more modern shells, then it is not a big loss or slow down.

This is especially important if writing a replacement command, such as for "seq" where you want your "just-paste-it-in" function to work as widely as possible.

I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me.

MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. Two years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that need to also work on this system.

TheBonsai June 6, 2010, 12:35 pm

Yea, the question was if it's POSIX, not if it's 100% portable (which is a difference). The POSIX base more or less is a subset of the Korn features (88, 93), pure Bourne is something "else", I know. Real portability, which means a program can go wherever UNIX went, only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am

And if you want to get rid of double-quotes, use:

one-liner code:
while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data

script code, added of some text to better see record and field breakdown:

#!/bin/bash
while read
do
    echo "New record"
    record=${REPLY}
    echo ${record} | while read -d ","
    do
        field="${REPLY#\"}"
        field="${field%\"}"
        echo "Field is :${field}:"
    done
done < data

Does it work with your data?

- PP

Philippe Petrinko November 22, 2010, 9:01 am

Of course, all the above code was assuming that your CSV file is named "data".

If you want to use any file name with the script, replace:

done<data

With:

done

And then use your script file (named for instance "myScript") with standard input redirection:

myScript < anyFileNameYouWant

Enjoy!

Philippe Petrinko November 22, 2010, 11:28 am

Well, no, there is a bug: the last field of each record is not read. It needs a workaround, maybe an IFS modification! After all, that's what IFS was built for… :O)

Anthony Thyssen November 22, 2010, 11:31 pm

Another bug is that the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo.

But this does not help when you have commas within the quotes! Which is why you needed quotes in the first place.

In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.
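
To make Anthony's point concrete, here is a hedged sketch using a here string ('<<<') instead of the echo pipeline. Because neither loop runs in a subshell, the counter assigned inside survives the loops; it assumes the same double-quoted file named data as above and does not try to strip the quotes:

#!/bin/bash
count=0
while read -r record
do
    while read -r -d ',' field
    do
        count=$((count + 1))      # survives the loop: no pipeline, no subshell
        echo "Field is :${field}:"
    done <<< "${record},"         # trailing comma so the last field is read too
done < data
echo "total fields seen: $count"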

Philippe Petrinko November 24, 2010, 6:29 pm

Anthony,
Would you try this one-liner script on your CSV file?

This one-liner assumes that CSV file named [data] has __every__ field double-quoted.


while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data

Here is the same code, but for a script file, not a one-liner tweak.


#!/bin/bash
# script csv01.sh
#
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
#
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
# (not the first field of course, no comma before the first field)
#
while read
do
    echo "New record"                 # this is not mandatory, just for explanation
    #
    # store REPLY and remove opening double quote
    record="${REPLY#\"}"
    #
    # replace every "," by a single double quote
    record=${record//\",\"/\"}
    #
    echo ${record} | while read -d \"
    do
        # store REPLY into variable "field"
        field="${REPLY}"
        #
        echo "Field is :${field}:"    # just for explanation
    done
done

This script, named here [csv01.sh], must be used like so:

csv01.sh < my-csv-file-with-doublequotes

Philippe Petrinko November 24, 2010, 6:35 pm

@Anthony,

By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug.
As long as you know what you are doing, it is not a problem; you just have to save the [REPLY] value in time, as this script shows.

TheBonsai March 8, 2011, 6:26 am
for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am

+1 for printf due to portability, but you can use bashy .. syntax too

for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am

Well, it isn't portable per se, it makes it portable to pre-4 Bash versions.

I think a more or less "portable" (in terms of POSIX, at least) code would be

i=0
while [ "$((i >= 20))" -eq 0 ]; do
  printf "%02d\n" "$i"
  i=$((i+1))
done

Philip Ratzsch April 20, 2011, 5:53 am

I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:

for num in {{1..10},{15..20}};do echo $num;done

Great reference article!

Philippe Petrinko April 20, 2011, 8:23 am

@Philip
Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this.
Anyway, Keep It Short and Simple: (KISS) here is a simple solution I already gave above:

xstart=1;xend=10;xstep=1
for (( x = $xstart; x <= $xend; x += $xstep)); do echo $x;done

Actually, POSIX arithmetic syntax allows you to omit the $ inside for (( )); as said before, you could also write:

xstart=1;xend=10;xstep=1
for (( x = xstart; x <= xend; x += xstep)); do echo $x;done

Philippe Petrinko May 6, 2011, 10:48 am

Sanya,

Actually brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.

Nevertheless, you could work around this as follows:

max=10; for i in $(eval echo {1..$max}); do echo $i; done

Sanya May 6, 2011, 11:42 am

Hello, Philippe

Thanks for your suggestions
You basically confirmed my findings, that bash constructions are not as simple as zsh ones.
But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh where my simple for-loop works

Cheers, Sanya

Philippe Petrinko May 6, 2011, 12:07 pm

Sanya,

First, you got it wrong: solutions I gave are not related to POSIX, I just pointed out that POSIX allows not to use $ in for (( )), which is just a little bit more readable – sort of.

Second, why do you see this less readable than your [zsh] [for loop]?

for (( x = start; x <= end; x += step)) do
echo "Loop number ${x}"
done

It is clear that it is a loop, loop increments and limits are clear.

IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D

BFN

Anthony Thyssen May 8, 2011, 11:30 pm

If you are going to do… $(eval echo {1..$max});
You may as well use "seq" or one of the many other forms.
See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm

I am trying to use the variable I set in the for line to set another variable with a different extension. Couldn't get this to work and couldn't find it anywhere on the web… Can someone help?

Example:

FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`

My goal is to take the values from the All_Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…
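
No answer was posted, but for what it is worth, here is a hedged sketch of one way to do this kind of indirect assignment in bash: a plain A1_$token=... is parsed as a command, while declare (or printf -v) accepts a computed variable name. The file paths and the A1_ prefix are taken from Tom's example; the grep pattern is a guess at what he intended:

#!/bin/bash
for token in $(cat /tmp/All_Tokens.txt)
do
    value=$(grep "$token" /file/path/file.txt | cut -d ":" -f2)
    declare "A1_${token}=${value}"      # creates a variable literally named A1_<token>
done

# reading such a variable back later requires indirect expansion:
# name="A1_sometoken"; echo "${!name}"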

[May 02, 2015] ArithmeticExpression

2014-02-13 | Greg's Wiki

Arithmetic in BASH is integer math only. You can't do floating point math in Bash; if you need that capability, see Bash FAQ #22.

Also see the Bash Hackers article about the full syntax theory. Note: the $[ ] syntax is deprecated.

There are several ways to tell Bash to treat numbers as integers instead of strings, and to do basic arithmetic operations on them. The first is to use the let command:

let a=17+23
echo "a = $a"      # Prints a = 40

Note that each arithmetic expression has to be passed as a single argument to the let command, so you need quotes if there are spaces or globbing characters, thus:

let a=17 + 23      # WRONG
let a="17 + 23"    # Right
let 'a = 17 + 23'  # Right
let a=17 a+=23     # Right (2 arithmetic expressions)

let a[1]=1+1       # Wrong (try after touch a1=1+1 or with shopt -s failglob)
let 'a[1]=1+1'     # Right
let a\[1]=1+1      # Right

Division in Bash is integer division, and it truncates the results, just as in C:

let a=28/6
echo "a = $a"      # Prints a = 4

In addition to the let command, one may use the (( )) syntax to enforce an arithmetic context. If there is a $ (dollar sign) before the parentheses, then a substitution is performed (more on this below). White space is allowed inside (( )) with much greater leniency than with let, and variables inside (( )) don't require $ (because string literals aren't allowed). Examples:

((a=$a+7))         # Add 7 to a
((a = a + 7))      # Add 7 to a.  Identical to the previous command.
((a += 7))         # Add 7 to a.  Identical to the previous command.

((a = RANDOM % 10 + 1))     # Choose a random number from 1 to 10.
                            # % is modulus, as in C.

# (( )) may also be used as a command.  > or < inside (( )) means
# greater/less than, not output/input redirection.
if ((a > 5)); then echo "a is more than 5"; fi

(( )) without the leading $ is not a standard sh feature. It comes from ksh and is only available in ksh, Bash and zsh. $(( )) substitution is allowed in the POSIX shell. As one would expect, the result of the arithmetic expression inside the $(( )) is substituted into the original command. Like for parameter substitution, arithmetic substitution is subject to word splitting so should be quoted to prevent it when in list contexts. Here are some examples of the use of the arithmetic substitution syntax:

a=$((a+7))         # POSIX-compatible version of previous code.
if test "$((a%4))" = 0; then ...
lvcreate -L "$((4*1096))" -n lvname vgname   # Actual HP-UX example.

Variables may be declared as integers so that any subsequent assignments to them will always assume a numeric context. Essentially any variable that's declared as an integer acts as if you had a let command in front of it when you assign to it. For example:

unset b             # Forget any previous declarations
b=7+5; echo "$b"    # Prints 7+5
declare -i b        # Declare b as an integer
b=7+5; echo "$b"    # Prints 12

Also, array indices are a numeric context:

n=0
while read line; do
   array[n++]=$line      # array[] forces a numeric context
done

There is one common pitfall with arithmetic expressions in Bash: numbers with leading zeroes are treated as octal. For example,

# Suppose today is September 19th.
month=$(date +%m)
next_month=$(( (month == 12) ? 1 : month+1 ))
# bash: 09: value too great for base (error token is "09")

This causes great confusion among people who are extracting zero-padded numbers from various sources (although dates are by far the most common) and then doing math on them without sanitizing them first. (It's especially bad if you write a program like this in March, test it, roll it out... and then it doesn't blow up until August 1.)

If you have leading-zero problems with Bash's built-in arithmetic, there are two possible solutions. The first is, obviously, to remove the leading zeroes from the numbers before doing math with them. This is not trivial in Bash, unfortunately, because Bash has no ability to perform substitutions on a variable using regular expressions (it can only do it with "glob" patterns). But you could use a loop:

# This removes leading zeroes from a, one at a time.
while [[ $a = 0* ]]; do a=${a#0}; done

You can do the above without using a loop, by using extended globs; see FAQ #67 for more information. Or, you could use sed; that may be more efficient if you're reading many numbers from a stream, and can arrange to sanitize them all in one command, rather than one by one.

Without a loop:

# This removes leading zeroes from a, all at once.
a=${a##+(0)}
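
And a small sketch of the sed variant mentioned above; the capture group keeps one digit so that an all-zero value such as 000 becomes 0 instead of an empty string (the sample numbers are arbitrary):

# strip leading zeroes from a stream of zero-padded numbers
printf '%s\n' 008 023 000 7 | sed 's/^0*\([0-9]\)/\1/'
# prints: 8, 23, 0, 7 (one per line)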

Another solution is to force Bash to treat all numbers as base 10 by prefixing them with 10#. This might be more efficient, but may also be less elegant to read.

a=008
let b=a+1       # Generates an error because 008 is not valid in octal.
let b=10#$a+1   # Force a to be treated as base 10.  Note: the $ is required.

Finally, a note on the exit status of commands, and the notions of "true" and "false", is in order. When bash runs a command, that command will return an exit status from 0 to 255. 0 is considered "success" (which is "true" when used in the context of an if or while command). However, in an arithmetic context, there are places where the C language rules (0 is false, anything else is true) apply.

Some examples:

true; echo "$?"       # Writes 0, because a successful command returns 0.
((10 > 6)); echo "$?" # Also 0.  An arithmetic command returns 0 for true.
echo "$((10 > 6))"    # Writes 1.  An arithmetic expression returns 1 for true.

In addition to a comparison returning 1 for true, an arithmetic expression that evaluates to a non-zero value is also true in the sense of a command.

if ((1)); then echo true; fi     # Writes true.

This also lets you use "flag" variables, just like in a C program:

found=0
while ...; do
   ...
  if something; then found=1; fi    # Found one!  Keep going.
   ...
done
if ((found)); then ...

Here is a function to convert numbers in other bases to decimal (base 10):

todec() {
    echo "$(( $1#$2 ))"
}

Examples:

todec 16 ffe    # -> 4094
todec 2 100100  # -> 36

[Sep 10, 2010] bash iterator trick

The UNIX Blog

A neat little feature I never knew existed in bash is being able to iterate over a sequence of numbers in a more or less C-esque manner. Coming from a Bourne/Korn shell background, creating an elegant iterator is always a slight nuisance, since you would come up with something like this to iterate over a sequence of numbers:

i=1;
while [ $i -lt 10 ]; do
    i=`expr $i + 1`;
    .....
done

Well, not exactly the most elegant solution. With bash on the other hand it can be done as simple as:

for((i=1; $i<10; i++)); do
    ....
done

Simple and to the point.

Bash by example, Part 2

The standard "for" loop in bash is pretty idiosyncratic
OK, we've covered conditionals, now it's time to explore bash looping constructs. We'll start with the standard "for" loop. Here's a basic example:
#!/usr/bin/env bash

for x in one two three four
do
    echo number $x
done

output:
number one
number two 
number three 
number four

What exactly happened? The "for x" part of our "for" loop defined a new environment variable (also called a loop control variable) called "$x", which was successively set to the values "one", "two", "three", and "four". After each assignment, the body of the loop (the code between the "do" ... "done") was executed once. In the body, we referred to the loop control variable "$x" using standard variable expansion syntax, like any other environment variable. Also notice that "for" loops always accept some kind of word list after the "in" statement. In this case we specified four English words, but the word list can also refer to file(s) on disk or even file wildcards. Look at the following example, which demonstrates how to use standard shell wildcards:

#!/usr/bin/env bash

for myfile in /etc/r*
do
    if [ -d "$myfile" ] 
    then
      echo "$myfile (dir)"
    else
      echo "$myfile"
    fi
done

output:

/etc/rc.d (dir)
/etc/resolv.conf
/etc/resolv.conf~
/etc/rpc  

The above code looped over each file in /etc that began with an "r". To do this, bash first took our wildcard /etc/r* and expanded it, replacing it with the string /etc/rc.d /etc/resolv.conf /etc/resolv.conf~ /etc/rpc before executing the loop. Once inside the loop, the "-d" conditional operator was used to perform two different actions, depending on whether myfile was a directory or not. If it was, a " (dir)" was appended to the output line.

We can also use multiple wildcards and even environment variables in the word list:

for x in /etc/r??? /var/lo* /home/drobbins/mystuff/* /tmp/${MYPATH}/*
do
    cp $x /mnt/mydir
done

Bash will perform wildcard and variable expansion in all the right places, and potentially create a very long word list.

While all of our wildcard expansion examples have used absolute paths, you can also use relative paths, as follows:

for x in ../* mystuff/*
do
    echo $x is a silly file
done

In the above example, bash performs wildcard expansion relative to the current working directory, just like when you use relative paths on the command line. Play around with wildcard expansion a bit. You'll notice that if you use absolute paths in your wildcard, bash will expand the wildcard to a list of absolute paths. Otherwise, bash will use relative paths in the subsequent word list. If you simply refer to files in the current working directory (for example, if you type "for x in *"), the resultant list of files will not be prefixed with any path information. Remember that preceding path information can be stripped using the "basename" executable, as follows:

for x in /var/log/*
do
    echo `basename $x` is a file living in /var/log
done

Of course, it's often handy to perform loops that operate on a script's command-line arguments. Here's an example of how to use the "$@" variable, introduced at the beginning of this article:

#!/usr/bin/env bash

for thing in "$@"
do
    echo you typed ${thing}.
done

output:

$ allargs hello there you silly
you typed hello.
you typed there.
you typed you.
you typed silly.

Recommended Links

Sites

Please visit Heiner Steven's SHELLdorado, the best shell scripting site on the Internet
Please visit nixCraft, a blog by Vivek Gite

