May the source be with you, but remember the KISS principle ;-)
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and  bastardization of classic Unix

Control Structures

The pseudoscientific, pseudo-religious junk propagated under the name of structured programming had one positive effect: it attracted attention and researchers to this field.

The most productive approach is to study real programs (or their control graphs) and try to distill the set of control structures that is fundamental to programming. Unfortunately this is a difficult task, as control structures do not exist independently of the language.

Study of program control grpohs is more productive approach.  In this domain the problem can be reformulated as the problem iof decomposition of a given contol flow grpah into primary components.  There are several classic approaches to recreation of  high level control structures from an assembler code based on program graph decomposition 

Decomposition of the program graph into "single entry-single exit" subgraphs (hammocks). All control entering a hammock must pass through the entry node, called the head, and all control leaving the hammock must pass through the exit node, called the focus. This idea was suggested by Kasyanov in: V. N. Kas'janov. Distinguishing hammocks in a directed graph. Soviet Math. Doklady, 16(5):448-450, 1975. Kasyanov's decomposition algorithm was weak, and later he completely lost track of his own idea. Essentially the same idea was independently reinvented by Gordon Oulsnam, who proposed the first practically usable decomposition algorithm, based on direct and reverse dominators (Oulsnam's work was received in August 1984 but published in 1987). Later Oulsnam applied this decomposition to a complexity metric for programs. Unfortunately I lost my copy of his papers.

The idea of prime hammocks, or control quants. A control quant is a non-decomposable subgraph of the program graph with just one entry and one exit node. That means that all control quants are hammocks, but not vice versa. Control quants can also be called "prime control structures," by analogy with prime numbers. This idea was invented by R. A. Maddux in his Ph.D. dissertation "Study of Program Structure," University of Waterloo (July 1975). Note that this was the same year Kas'janov's paper was published. Maddux called his control structures "prime programs."

A similar approach was independently reinvented by myself much later (I was not aware of Maddux's work, but was aware of Kasyanov's and Oulsnam's works). I called such structures "prime hammocks," or control quants, as they are not further decomposable into simpler structures. See:

Nikolai Bezroukov. "Quantum Decomposition of Program Schemata on Prime Hammocks," in: Software Maintenance for Real Time Software Systems Based on Micro and Mini Computers [in Russian]. Kiev Institute of Civil Aviation Engineers, Kiev (1988), pp. 3-17.

A short summary of Maddux's work and an enumeration of control quants can be found in Terrence W. Pratt and Marvin V. Zelkowitz's book "Programming Languages: Design and Implementation."

The problem with this approach is that multilevel jumps (exception handling) strongly affect program decomposition. This can be mitigated by "normalization" of the program graph before decomposition, or by backtracking-based approaches that minimize the number of components.

I created a pilot system for the analysis of air traffic control programs based on this approach, but my grant ended before I managed to get practically important results. Later I switched to malware analysis and disassembly and never returned to this topic.



Old News ;-)

[Sep 06, 2019] Using Case Insensitive Matches with Bash Case Statements by Steven Vona

Jun 30, 2019

If you want to match the pattern regardless of its case (capital or lowercase letters), you can set the nocasematch shell option with the shopt builtin. You can do this on the first line of your script. Since the script runs in its own process, it won't affect your normal environment.

 shopt -s nocasematch
 read -p "Name a Star Trek character: " CHAR
 case $CHAR in
   "Seven of Nine" | Neelix | Chokotay | Tuvok | Janeway )
       echo "$CHAR was in Star Trek Voyager" ;;
   Archer | Phlox | Tpol | Tucker )
       echo "$CHAR was in Star Trek Enterprise" ;;
   Odo | Sisko | Dax | Worf | Quark )
       echo "$CHAR was in Star Trek Deep Space Nine" ;;
   Worf | Data | Riker | Picard )
       echo "$CHAR was in Star Trek The Next Generation" ;;
   *) echo "$CHAR is not in this script." ;;
 esac
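A minimal, self-contained sketch of the same idea (with a hard-coded input in place of read, chosen here for illustration) shows that nocasematch makes the pattern match regardless of the input's case:

```shell
shopt -s nocasematch
CHAR="TUVOK"                 # hard-coded stand-in for the read above
case $CHAR in
    Tuvok) echo "matched despite case" ;;
    *)     echo "no match" ;;
esac
```

Note that nocasematch affects case and [[ ]] pattern matching only, not filename globbing.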

[Sep 02, 2019] Switch statement for bash script

Sep 02, 2019

Hello, I am currently trying out the switch statement using a bash script.

showmenu () {
echo "1. Number1"
echo "2. Number2"
echo "3. Number3"
echo "4. All"
echo "5. Quit"
}

while true
do
    showmenu
    echo "Enter a choice:"
    read choice
    case "$choice" in
        1) echo "Number One" ;;
        2) echo "Number Two" ;;
        3) echo "Number Three" ;;
        4) echo "Number One, Two, Three" ;;
        5) echo "Program Exited"
           exit 0 ;;
        *) echo "Please enter number ONLY ranging from 1-5!" ;;
    esac
done

1. Number1
2. Number2
3. Number3
4. All
5. Quit
Enter a choice:

So, when the code is run, a menu with options 1-5 will be shown, then the user will be asked to enter a choice, and finally an output is shown. But is it possible for the user to enter multiple choices? For example, the user enters choices "1" and "3", so the output will be "Number One" and "Number Three". Any idea?

Just something to get you started.


#! /bin/bash
showmenu ()
{
    typeset ii
    typeset -i jj=1
    typeset -i kk
    typeset -i valid=0  # valid=1 if input is good

    while (( ! valid ))
    do
        jj=1
        for ii in "${options[@]}"
        do
            echo "$jj) $ii"
            let jj++
        done
        read -e -p 'Select a list of actions : ' -a answer
        valid=1
        jj=1
        for kk in "${answer[@]}"
        do
            if (( kk < 1 || kk > "${#options[@]}" ))
            then
                echo "Error: item $jj is out of bounds" 1>&2
                valid=0
            fi
            let jj++
        done
    done
}

typeset -r c1=Number1
typeset -r c2=Number2
typeset -r c3=Number3
typeset -r c4=All
typeset -r c5=Quit
typeset -ra options=($c1 $c2 $c3 $c4 $c5)
typeset -a answer
typeset -i kk
while true
do
    showmenu
    for kk in "${answer[@]}"
    do
        case $kk in
            1) echo 'Number One' ;;
            2) echo 'Number Two' ;;
            3) echo 'Number Three' ;;
            4) echo 'Number One, Two, Three' ;;
            5) echo 'Program Exit'
               exit 0 ;;
        esac
    done
done
-- stevenworr
wjs1990 (Original Poster) replied:
Ok will try it out first. Thanks.
evo2 replied:
This can be done just by wrapping your case block in a for loop and changing one line.


#!/bin/bash
showmenu () {
    echo "1. Number1"
    echo "2. Number2"
    echo "3. Number3"
    echo "4. All"
    echo "5. Quit"
}

while true ; do
    showmenu
    read choices
    for choice in $choices ; do
        case "$choice" in
            1) echo "Number One" ;;
            2) echo "Number Two" ;;
            3) echo "Number Three" ;;
            4) echo "Numbers One, two, three" ;;
            5) echo "Exit"
               exit 0 ;;
            *) echo "Please enter number ONLY ranging from 1-5!" ;;
        esac
    done
done
You can now enter any number of numbers separated by white space.



[Dec 06, 2015] Bash For Loop Examples

A very nice tutorial by Vivek Gite (created October 31, 2008; last updated June 24, 2015). His mistake is putting the new for loop syntax too far inside the tutorial. It should be emphasized, not hidden.
June 24, 2015

... ... ...

Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:

echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
do
     echo "Welcome $i times"
done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

... ... ...

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))

A representative three-expression example in bash is as follows:

for (( c=1; c<=5; c++ ))
do
   echo "Welcome $c times"
done
... ... ...

Jadu Saikia, November 2, 2008, 3:37 pm

Nice one. All the examples are explained well, thanks Vivek.

seq 1 2 20
output can also be produced using jot

jot - 1 20 2

The infinite loops, as everyone knows, have the following alternatives:

while :
do
    ...
done


Andi Reinbrech, November 18, 2010, 7:42 pm
I know this is an ancient thread, but thought this trick might be helpful to someone:

For the above example with all the cuts, simply do

set `echo $line`

This will split line into positional parameters, and after the set you can simply say

F1=$1; F2=$2; F3=$3

I used this a lot many years ago on solaris with "set `date`", it neatly splits the whole date string into variables and saves lots of messy cutting :-)

… no, you can't change the FS; if it's not space, you can't use this method.
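A self-contained sketch of the trick (the sample $line value is made up): `set` replaces the positional parameters with the whitespace-split words of its arguments, so the fields land in $1, $2, $3:

```shell
line="alpha beta gamma"
set -- $line            # unquoted on purpose: word splitting fills $1 $2 $3
F1=$1; F2=$2; F3=$3
echo "$F1|$F2|$F3"      # alpha|beta|gamma
```

The `--` guards against a first word starting with a dash; the backtick form `set \`echo $line\`` behaves the same for simple space-separated data.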

Peko, July 16, 2009, 6:11 pm
Hi Vivek,
Thanks for this useful topic.

IMNSHO, there may be something to modify here
Latest bash version 3.0+ has inbuilt support for setting up a step value:

for i in {1..5}
1) The increment feature seems to belong to version 4 of bash.
Accordingly, my bash v3.2 does not include this feature.

BTW, where did you read that it was 3.0+ ?
(I ask because you may know some good website of interest on the subject).

2) The syntax is {from..to..step} where from, to, step are 3 integers.
Your code is missing the increment.

Note that the GNU Bash documentation may be bugged at this time, because in the GNU Bash manual you will find the syntax {x..y[incr]}, which may be a typo (missing the second ".." between y and the increment).


The Bash Hackers page seems to be more accurate, but who knows? Anyway, at least one of them may be right… ;-)

Keep on the good work of your own,
Thanks a million.

- Peko

Michal Kaut July 22, 2009, 6:12 am

is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that
for x in $(seq 0 0.1 1) gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others.
Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the keyboard to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding x=`echo $x | sed s/,/./` as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)


Peko July 22, 2009, 7:27 am

To Michal Kaut:

Hi Michal,

Such output format is configured through LOCALE settings.

I tried :

export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1

and it works as desired.

You just have to find the exact value for LC_CTYPE that fits to your systems and your needs.


Peko July 22, 2009, 2:29 pm

To Michal Kaut [2]

Ooops – ;-)
Instead of LC_CTYPE,
LC_NUMERIC should be more appropriate
(Although LC_CTYPE is actually yielding to the same result – I tested both)

By the way, Vivek has already documented the matter :
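A sketch of the per-command override, assuming GNU seq is installed: prefixing the command with the assignment forces the C locale for that one invocation only, so the decimal separator is always a point regardless of the user's settings:

```shell
# LC_ALL outranks LC_NUMERIC and LANG, so this wins whatever the user locale is
LC_ALL=C seq 0 0.5 1
```

The prefix assignment affects only that single command; the shell's own locale settings are untouched.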

Philippe Petrinko October 30, 2009, 8:35 am

To Vivek:
Regarding your last example, that is, running a loop through arguments given to the script on the command line, there is a simpler way of doing this:
# instead of:
# FILES="$@"
# for f in $FILES

# use the following syntax
for arg
do
    # whatever you need here – try : echo "$arg"
done
Of course, you can use any variable name, not only "arg".
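A runnable sketch of the implicit form (the sample arguments are made up): `for arg` with no `in ...` list iterates over "$@", keeping quoted arguments intact:

```shell
set -- one "two three" four    # simulate the script's command-line arguments
for arg
do
    echo "arg=<$arg>"          # three lines; "two three" stays one word
done
```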

Philippe Petrinko November 11, 2009, 11:25 am

To tdurden:

Why wouldn't you use

1) either a [for] loop
for old in * ; do mv ${old} ${old}.new; done

2) or the [rename] command?
Excerpt from "man rename":

RENAME(1) Perl Programmers Reference Guide RENAME(1)

rename – renames multiple files

rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

"rename" renames the filenames supplied according to the rule specified
as the first argument. The perlexpr argument is a Perl expression
which is expected to modify the $_ string in Perl for at least some of
the filenames specified. If a given filename is not modified by the
expression, it will not be renamed. If no filenames are given on the
command line, filenames will be read via standard input.

For example, to rename all files matching "*.bak" to strip the
extension, you might say

rename 's/\.bak$//' *.bak

To translate uppercase names to lower, you'd use

rename 'y/A-Z/a-z/' *

- Philippe

Philippe Petrinko November 11, 2009, 9:27 pm

If you set the shell option extglob, Bash understands some more powerful patterns. Here, a pattern-list is one or more patterns, separated by the pipe symbol (|).

?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns
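A small sketch exercising one of these operators (the filename is made up); note that extglob must be enabled before the pattern is parsed:

```shell
shopt -s extglob
f=backup.tar.gz
case $f in
    *.@(tgz|tar.gz)) echo "compressed tarball" ;;
    *)               echo "something else" ;;
esac
```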


Philippe Petrinko November 12, 2009, 3:44 pm

To Sean:
Right, the more sharp a knife is, the easier it can cut your fingers…

I mean: there are side-effects to the use of file globbing (like in [ for f in * ]) when the globbing expression matches nothing: the globbing expression is not substituted.

Then you might want to consider using [ nullglob ] shell extension,
to prevent this.
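A sketch of the side-effect and the cure (the temporary directory is created just for the demo): without nullglob an unmatched glob is passed through literally; with it, the loop simply gets zero iterations:

```shell
d=$(mktemp -d)                 # an empty directory, so * matches nothing
shopt -u nullglob
for f in "$d"/*; do echo "literal: $f"; done        # one iteration: the pattern itself
shopt -s nullglob
for f in "$d"/*; do echo "never printed: $f"; done  # zero iterations
rmdir "$d"
```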

Devil hides in detail ;-)

Dominic January 14, 2010, 10:04 am

There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )) do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"

You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, then you can subsequently test the value of $c to see if the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am

sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:
inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2

Philippe Petrinko March 9, 2010, 2:34 pm


And, again, as stated many times up there, using [seq] is counter productive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internals functions:

for ((c=1; c<21; c+=2)); do echo "Welcome $c times" ; done

(And I wonder why Vivek is sticking to that old solution, which should be presented only for historical reasons, from the time when there was no way of using bash internals.
By the way, this historical note should be placed only at the end of the topic, not on top of it, which makes newbies stick to the out-of-date technique ;-) )

Sean March 9, 2010, 11:15 pm

I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:

for ((i=1;i<=1000000;i++))
do
    echo "Output $i"
done

for i in $(seq 1 1000000)
do
    echo "Output $i"
done

And here were the results that I got:
time ./
real 0m22.122s
user 0m18.329s
sys 0m3.166s

time ./
real 0m19.590s
user 0m15.326s
sys 0m2.503s

The performance increase isn't too significant, especially when you are probably going to be doing something a little more interesting inside of the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm

The reason why the external seq is faster, is because it is executed only once, and returns a huge splurb of space separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.

The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow etc. etc.

Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluated.

I.e., bottom line: the internal one has more overhead. The "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading once from a text file with 1000000 integers with a cat. The point being that it gets executed only once and becomes static.
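The re-evaluation is easy to demonstrate; in this sketch (variable names made up) the loop bound shrinks while the loop runs, and the condition notices:

```shell
n=5
for (( i=0; i<n; i++ ))
do
    (( i == 1 )) && n=3    # shrink the bound mid-loop
done
echo "stopped at i=$i, n=$n"   # stopped at i=3, n=3
```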

OK, blah blah fishpaste, past my bed time :-)


Anthony Thyssen June 4, 2010, 6:53 am

The {1..10} syntax is pretty useful as you can use a variable with it!

echo {1..${limit}}

You need to eval it to get it to work!

eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example), and BASH is not available on all systems either.

You are better off using the old while-expr method for wider compatibility!

   limit=10; n=1;
   while [ $n -le $limit ]; do
     echo $n;
     n=`expr $n + 1`;
   done

Alternatively, use a seq() function replacement…

 # seq_count 10
seq_count() {
  i=1; while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}
# simple_seq 1 2 10
simple_seq() {
  i=$1; while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}
seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
    1) i=1 inc=1 end=$1 ;;
    2) i=$1 inc=1 end=$2 ;;
    *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]; do
      printf "$format\n" $i;
      i=`expr $i + $inc`;
    done
}
Edited: by Admin – added code tags.

TheBonsai June 4, 2010, 9:57 am

The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable to the Korn and Z shells.

The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.
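A side-by-side sketch of the two idioms; $(( )) is handled by the shell itself, while expr is an external (or at best builtin) command:

```shell
i=1; inc=2
i=`expr $i + $inc`   # obsolete style: invokes expr
echo "i=$i"          # i=3
i=$((i + inc))       # POSIX arithmetic expansion: no command involved
echo "i=$i"          # i=5
```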

Philippe Petrinko June 4, 2010, 10:15 am

Right, Bonsai.

But FOR C-style does not seem to be POSIXLY-correct…

Read on-line reference issue 6/2004,
Top is here,

and the Shell and Utilities volume (XCU) T.O.C. is here
doc is:

and FOR command:

Anthony Thyssen June 6, 2010, 7:18 am

TheBonsai wrote…. "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."

I am not certain it is in Posix. It was NOT part of the original Bourne Shell, and on some machines, I deal with Bourne Shell. Not Ksh, Bash, or anything else.

Bourne Shell syntax works everywhere! But as 'expr' is a builtin in more modern shells, then it is not a big loss or slow down.

This is especially important if writing a replacement command, such as for "seq" where you want your "just-paste-it-in" function to work as widely as possible.

I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me.

MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. Two years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that need to also work on this system.

TheBonsai June 6, 2010, 12:35 pm

Yea, the question was if it's POSIX, not if it's 100% portable (which is a difference). The POSIX base more or less is a subset of the Korn features (88, 93), pure Bourne is something "else", I know. Real portability, which means a program can go wherever UNIX went, only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am

And if you want to get rid of double-quotes, use:

one-liner code:
while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data

script code, with some text added to better see the record and field breakdown:

while read
do
    record=${REPLY}
    echo "New record"
    echo ${record}|while read -d ,
    do
        field="${REPLY#\"}"
        field="${field%\"}"
        echo "Field is :${field}:"
    done
done<data
Does it work with your data?

- PP

Philippe Petrinko November 22, 2010, 9:01 am

Of course, all the above code was assuming that your CSV file is named "data".

If you want to use any name with the script, replace the final:

done<data

with:

done

And then use your script file (named for instance "myScript") with standard input redirection:

myScript < anyFileNameYouWant


Philippe Petrinko November 22, 2010, 11:28 am

Well, no, there is a bug: the last field of each record is not read. It needs a workout, and maybe an IFS modification! After all, that's what IFS was built for… :O)

Anthony Thyssen November 22, 2010, 11:31 pm

Another bug is that the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo.

But this does not help when you have commas within the quotes! Which is why you needed quotes in the first place.

In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.
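A sketch of the pipeline pitfall and the '<<<' cure: in a pipeline the while loop runs in a subshell, so its assignments vanish; a here-string keeps the loop in the current shell:

```shell
total=0
printf '1\n2\n3\n' | while read n; do total=$((total + n)); done
echo "after pipeline:    total=$total"   # still 0: loop ran in a subshell

total=0
while read n; do total=$((total + n)); done <<< "$(printf '1\n2\n3')"
echo "after here-string: total=$total"   # 6: assignments survive
```

This assumes bash's default behavior; with `shopt -s lastpipe` the pipeline case would also keep the total.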

Philippe Petrinko November 24, 2010, 6:29 pm

Would you try this one-liner script on your CSV file?

This one-liner assumes that CSV file named [data] has __every__ field double-quoted.

while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data

Here is the same code, but for a script file, not a one-liner tweak.

# script
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
# (not the first field of course, no comma before the first field)
while read
do
    echo "New record" # this is not mandatory - just for explanation
    # store REPLY and remove opening double quote
    record="${REPLY#\"}"
    # replace every "," by a single double quote
    record="${record//\",\"/\"}"
    echo ${record}|while read -d \"
    do
        # store REPLY into variable "field"
        field="${REPLY}"
        echo "Field is :${field}:" # just for explanation
    done
done
This script, named here [], must be used so: < my-csv-file-with-doublequotes

Philippe Petrinko November 24, 2010, 6:35 pm


By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug.
As long as you know what you are doing, this is not a problem; you just have to store the [REPLY] value conveniently, as this script shows.

TheBonsai March 8, 2011, 6:26 am
for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am

+1 for printf due to portability, but you can use bashy .. syntax too

for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am

Well, it isn't portable per se, it makes it portable to pre-4 Bash versions.

I think a more or less "portable" (in terms of POSIX, at least) code would be

i=1
while [ "$((i >= 20))" -eq 0 ]; do
  printf "%02d\n" "$i"
  i=$((i + 1))
done
Philip Ratzsch April 20, 2011, 5:53 am

I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:

for num in {{1..10},{15..20}};do echo $num;done

Great reference article!

Philippe Petrinko April 20, 2011, 8:23 am

Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this.
Anyway, Keep It Short and Simple: (KISS) here is a simple solution I already gave above:

for (( x = $xstart; x <= $xend; x += $xstep)); do echo $x;done

Actually, arithmetic-context rules allow you to omit the $ inside for (( )); as said before, you could also write:

for (( x = xstart; x <= xend; x += xstep)); do echo $x;done

Philippe Petrinko May 6, 2011, 10:48 am


Actually, brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.

Nevertheless, you could overcome this this way:

max=10; for i in $(eval echo {1..$max}); do echo $i; done

Sanya May 6, 2011, 11:42 am

Hello, Philippe

Thanks for your suggestions
You basically confirmed my findings that bash constructions are not as simple as zsh ones.
But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh, where my simple for-loop works.

Cheers, Sanya

Philippe Petrinko May 6, 2011, 12:07 pm


First, you got it wrong: the solutions I gave are not related to POSIX. I just pointed out that the arithmetic context allows not using $ in for (( )), which is just a little bit more readable – sort of.

Second, why do you see this less readable than your [zsh] [for loop]?

for (( x = start; x <= end; x += step )) do
    echo "Loop number ${x}"
done

It is clear that it is a loop, loop increments and limits are clear.

IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D


Anthony Thyssen May 8, 2011, 11:30 pm

If you are going to do… $(eval echo {1..$max});
You may as well use "seq" or one of the many other forms.
See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm

I am trying to use the variable I set in the for line to set another variable with a different extension. I couldn't get this to work and couldn't find it anywhere on the web… Can someone help?


FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
    A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`
done

My goal is to take the values from the All_Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…

[Jun 06, 2011] Measuring Prime Program Complexity by Marvin V. Zelkowitz & Jianhui Tian

Jan 1, 1994 | University of Maryland

This paper uses the prime program decomposition of a program as the basis for a measure that closely correlates with our intuitive notion of program complexity. This measure is based upon the information theory ideas of randomness and entropy such that results about structured programming, data abstractions, and other programming paradigms can be stated in quantitative terms, and empirical means can be used to validate the assumptions used to develop the model.

As a graph-based model, it can be applied to several graphical examples as extensions not otherwise available to source-code-based models. This paper introduces the measure, derives several properties for it, and gives some simple examples to demonstrate that the measure is a plausible approximation of our notions concerning structured programming.

[Jun 06, 2011] Global data flow analysis by decomposition into primes

Ira R. Forman. ICSE '82: Proceedings of the 6th International Conference on Software Engineering. IEEE Computer Society Press, Los Alamitos, CA, USA.

sufficient precisions to direct the algorithm control flows correctly. These primitive geometric
and efficiency dynamically depending on the control flow of algorithm, i.e.use low precision

Henry Chang - Top-Down Constraint-Driven Design (Correct)
criteria set by the mapping function, then the flow control is returned to the mapping function. On a
As before, if this step fails, the flow control is returned to the mapping function. If it is
If all of the specifications are met, the flow control is returned to the upper node with the

Demonstration of an Automated Integrated Test.. - Margaria, Niese, Steffen (2001) (Correct)
correctness and consistency of the test cases' control flow logic [3]During the test case design, vital

Unification of Finite Failure Non-Homogeneous Poisson - Process Models Through (Correct)
types of coverage definitions in literature: control-flow and data-flow coverage[2, 6, 7, 16]Each
on the Effectiveness of Dataflow-and Control-flowbased Test Adequacy Criteria,Proc. of Intl.

Terrence Pratt




Copyright © 1996-2021 by Softpanorama Society. This site was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belongs to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.

FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.

This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...

You can use PayPal to buy a cup of coffee for the authors of this site.


The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the Softpanorama society. We do not warrant the correctness of the information provided or its fitness for any purpose. The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable Javascript for this site. The site is perfectly usable without Javascript.

Last modified: September 10, 2019