
Bash Tips and Tricks


See also Unix Shell Tips and Tricks

The introduction below was adapted from the article "Unix Scripting: some Traps, Pitfalls and Recommendations" by Marc Dobson.


Introduction

Bash has several gotchas; the recommendations below help to avoid the most common of them.

Recommendation 1: Create a special Bash-related log file or notebook where you write down your findings. The environment is now so complex that you will definitely forget some of the most useful findings if you do not write them down and periodically browse the content.

For the same purpose, create and maintain a separate file with aliases (say .aliases) and a file with functions (say .functions), where you can record all the best ideas you have found or invented yourself, ideas which might never visit you a second time unless you write them down the first time. Just don't overdo it: too many aliases are as bad as too few, and excessive zeal here is really destructive. But even if you do not use them, revising your .aliases and .functions files is a very useful exercise that refreshes some long-forgotten skills that at one point you used to have ;-)
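
For example, a minimal sketch of wiring these files into your .bashrc (the names .aliases and .functions are just a convention):

# In ~/.bashrc: pick up personal alias and function collections, if present
[ -r ~/.aliases ] && . ~/.aliases
[ -r ~/.functions ] && . ~/.functions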

Recommendation 2: To force Bash to append history lines to the history file on exit, instead of overwriting it, you need to put the line

shopt -s histappend

into your .bash_profile or a similar file (e.g. .profile) that is executed only for login sessions.

Recommendation 3: Add history -a to your PROMPT_COMMAND to preserve history from multiple terminals. This is a very neat trick !!!

Bash history handling with multiple terminals

The bash session that is saved is the one for the terminal that is closed last. If you want to save the commands for every session, you could use the trick explained below.

export PROMPT_COMMAND='history -a'

To quote the manpage: “If set, the value is executed as a command prior to issuing each primary prompt.”

So every time my command has finished, it appends the unwritten history items to ~/.bash_history.

ATTENTION: If you use multiple shell sessions and do not use this trick, you need to write the history manually to preserve it, using the command history -a


Recommendation 4: The ls command has the option -h which, as in df, produces "human readable" file sizes. So the most famous shell alias

alias ll='ls -la'

might be better written as

alias ll='ls -halF'

Recommendation 5: If you prefer a light color for your terminal, you are generally screwed: it is very difficult to select proper colors for a light background. Default colors work well on a black or dark blue background, but that's it. For a light background you need to limit yourself to three basic colors (black, red and blue) and forget about all other nuances. Actually, they do not matter much anyway; too many colors is just another sign of the overcomplexity of the Linux environment, as people simply stop paying attention to them. To disable or simplify the color scheme, create your own DIR_COLORS file, or use a no-color option where the tool provides one (e.g. --color=never for GNU ls).
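
For example, one blunt way to switch colors off for GNU ls:

alias ls='ls --color=never'   # plain output regardless of terminal background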

Recommendation 6: When sourcing a script, always use a path name for the file, or at least the "./" prefix. By default Bash searches PATH for plain names given to the source command. You can disable this behaviour with shopt -u sourcepath, but if you work on multiple boxes where you are not the primary administrator, you can't just put this option into /etc/bashrc. Sourcing a script from the wrong directory might lead to disasters/horror stories, especially if you are working as root.
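
A quick illustration of the trap (the script name setup here is hypothetical):

# RISKY: with the default sourcepath option on, this may pick up an
# unrelated "setup" found somewhere in PATH instead of the one nearby
. setup

# SAFE: the ./ prefix pins the file to the current directory
. ./setup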

Recommendation 7: Always choose a unique script name. There is nothing wrong with long names if they help to prevent a SNAFU. Unique script names can easily be obtained by prefixing the script name with the project name (e.g. gpfs_setup). Bad generic names, where multiple scripts with the same name might exist in multiple directories, include setup, configure, install, etc.

Recommendation 8: While this page is about clever, ingenious tips and tricks, you should never try to be too clever or too bold. Always play safe and test your commands, such as find with the -exec option, by printing the set of files they operate on before applying them to a production server (especially if this is a remote server). System administration is a very conservative profession, and the absence of SNAFUs is more important than a demonstration of excessive cleverness or boldness.
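
For example, with find it pays to preview the file set first (the path and age threshold below are illustrative):

# Step 1: dry run -- just print what would be affected
find /var/log -name '*.log' -mtime +30 -print

# Step 2: only after inspecting the list, run the destructive variant
find /var/log -name '*.log' -mtime +30 -exec rm {} \;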

Sourcing versus Executing

When a file is sourced, an exit command in it terminates the shell that did the sourcing, i.e. the whole invoking script, not just the sourced sub-script in which the exit command was executed.

In contrast, a standalone script is executed in a subshell, so an exit command there exits only the shell/interpreter that was started to execute that "main" script. The executed script simply stops, and control returns to the shell which called it.

As an example take the following two scripts. Script 1 is:

#!/bin/bash

echo "Executing script2"
./script2
if [ $? -eq 0 ]; then
   echo "Executing ls in /tmp/md"
   ls -l /tmp/md
else
   echo "Exiting"
   exit 1
fi

And script 2 is:

#!/bin/bash

echo "In script 2"
if [ -e "/tmp/md" ]; then
   echo "/tmp/md exists"
else
   echo "/tmp/md does not exist"
   exit 1
fi

Both scripts should have the execute bit set. Start a BASH shell by typing bash, and at the next prompt execute script 1. The following output is produced:

If directory /tmp/md exists:

Executing script2
In script 2
/tmp/md exists
Executing ls in /tmp/md
total 0

If directory /tmp/md does not exist:

Executing script2
In script 2
/tmp/md does not exist
Exiting

Now change script 1 to source script 2 instead of executing it (source ./script2 instead of ./script2). When script 1 is executed, the following output will be produced:

If directory /tmp/md exists:

Executing script2
In script 2
/tmp/md exists
Executing ls in /tmp/md
total 0

If directory /tmp/md does not exist:

Executing script2
In script 2
/tmp/md does not exist

If the directory /tmp/md exists, then the output is the same and exactly the same commands are executed. If, however, the directory /tmp/md does not exist, then script 2 hits an exit, and as it was sourced from script 1, it is actually script 1 which exits, without the desired effect, i.e. printing "Exiting". In this case it is not very important, but it could have very profound consequences in complex scripts.

The ambiguity in this case is compounded by the difference in coding between the two branches of the if statement of script 2. For the case when the directory exists, the exit is implicit (the script runs to the end and exits normally), whereas for the case when the directory does not exist, the exit command is explicit (this is the one which causes the exit from script 1).

If the programmer wishes to exit from a sourced script file (as he would with the EXIT command in an executed script), he may do so with:

return [n]
where "[n]" is the return value that can be tested for in the script/shell which sourced the script file (as with the EXIT command). Beware though that the RETURN command is also used to exit a function, therefore make sure that the RETURN command is placed in the appropriate place for the desired effect.

Do not use sourcing as a poor man's subroutine

If the same functionality (i.e. the same set of commands) needs to be executed multiple times, it is better to use shell functions or standalone scripts than to source the same fragment repeatedly. If you do use such "multiple sourcing" as a poor man's subroutine, always put in a banner to remind yourself what is happening:

#!/bin/bash

echo "We have been executed"
echo "Sourcing the external commands from the file /root/bin/standard_gpfs_setup_actions..."
. /root/bin/standard_gpfs_setup_actions
echo "Exiting"

If the set of commands is written as a file that needs to be sourced, use the full path, or at least the "./" (dot-slash) prefix if the file resides in the current directory. Never use "naked", non-qualified names. For example:

. ./sourced_script


Old News ;-)

[Oct 31, 2017] High-speed Bash by Tom Ryder

Jan 24, 2012 | sanctum.geek.nz

One of my favourite technical presentations I've read online has been Hal Pomeranz's Unix Command-Line Kung Fu , a catalogue of shortcuts and efficient methods of doing very clever things with the Bash shell. None of these are grand arcane secrets, but they're things that are often forgotten in the course of daily admin work, when you find yourself typing something you needn't, or pressing up repeatedly to find something you wrote for which you could simply search your command history.

I highly recommend reading the whole thing, as I think even the most experienced shell users will find there are useful tidbits in there that would make their lives easier and their time with the shell more productive, beyond simpler things like tab completion.

Here, I'll recap two of the things I thought were the most simple and useful items in the presentation for general shell usage, and see if I can add a little value to them with reference to the Bash manual.

History with Ctrl+R

For many shell users, finding a command in history means either pressing the up arrow key repeatedly, or perhaps piping a history call through grep . It turns out there's a much nicer way to do this, using Bash's built-in history searching functionality; if you press Ctrl+R and start typing a search pattern, the most recent command matching that pattern will automatically be inserted on your current line, at which point you can adapt it as you need, or simply press Enter to run it again. You can keep pressing Ctrl+R to move further back in your history to the next-most recent match. On my shell, if I search through my history for git , I can pull up what I typed for a previous commit:

(reverse-i-search)`git': git commit -am "Pulled up-to-date colors."

This functionality isn't actually exclusive to Bash; you can establish a history search function in quite a few tools that use GNU Readline, including the MySQL client command line.

You can search forward through history in the same way with Ctrl+S, but it's likely you'll have to fix up a couple of terminal annoyances first.
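
The usual annoyance is that many terminals interpret Ctrl+S as XOFF flow control and freeze; one common fix is to disable flow control in your .bashrc:

stty -ixon    # let Ctrl+S reach Readline instead of freezing the terminal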

Additionally, if like me you're a Vim user and you don't really like having to reach for the arrow keys, or if you're on a terminal where those keys are broken for whatever reason, you can browse back and forth within your command history with Ctrl+P (previous) and Ctrl+N (next). These are just a few of the Emacs-style shortcuts that GNU Readline provides; check here for a more complete list .

Repeating commands with !!

The last command you ran in Bash can be abbreviated on the next line with two exclamation marks:

$ echo "Testing."
Testing.
$ !!
Testing.

You can use this to simply repeat a command over and over again, although for that you really should be using watch , but more interestingly it turns out this is very handy for building complex pipes in stages. Suppose you were building a pipeline to digest some data generated from a program like netstat , perhaps to determine the top 10 IP addresses that are holding open the most connections to a server. You might be able to build a pipeline like this:

# netstat -ant
# !! | awk '{print $5}'
# !! | sort
# !! | uniq -c
# !! | sort -rn
# !! | sed 10q

Similarly, you can repeat the last argument from the previous command line using !$ , which is useful if you're doing a set of operations on one file, such as checking it out via RCS, editing it, and checking it back in:

$ co -l file.txt
$ vim !$
$ ci -u !$

Or if you happen to want to work on a set of arguments, you can repeat all of the arguments from the previous command using !* :

$ touch a.txt b.txt c.txt
$ rm !*

When you remember to use these three together, they can save you a lot of typing, and will really increase your accuracy because you won't be at risk of mistyping any of the commands or arguments. Naturally, however, it pays to be careful what you're running through rm !

[Oct 31, 2017] Learning the content of /bin and /usr/bin by Tom Ryder

Mar 16, 2012 | sanctum.geek.nz

When you have some spare time, something instructive to do that can help fill gaps in your Unix knowledge and to get a better idea of the programs installed on your system and what they can do is a simple whatis call, run over all the executable files in your /bin and /usr/bin directories.

This will give you a one-line summary of the file's function if available from man pages.

tom@conan:/bin$ whatis *
bash (1) - GNU Bourne-Again SHell
bunzip2 (1) - a block-sorting file compressor, v1.0.4
busybox (1) - The Swiss Army Knife of Embedded Linux
bzcat (1) - decompresses files to stdout
...

tom@conan:/usr/bin$ whatis *
[ (1)                - check file types and compare values
2to3 (1)             - Python2 to Python3 converter
2to3-2.7 (1)         - Python2 to Python3 converter
411toppm (1)         - convert Sony Mavica .411 image to ppm
...

It also works on many of the files in other directories, such as /etc :

tom@conan:/etc$ whatis *
acpi (1)             - Shows battery status and other ACPI information
adduser.conf (5)     - configuration file for adduser(8) and addgroup(8)
adjtime (3)          - correct the time to synchronize the system clock
aliases (5)          - Postfix local alias database format
...

Because packages often install more than one binary and you're only in the habit of using one or two of them, this process can tell you about programs on your system that you may have missed, particularly standard tools that solve common problems. As an example, I first learned about watch this way, having used a clunky solution with for loops with sleep calls to do the same thing many times before.
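
For instance, instead of a hand-rolled loop, watch reruns a command at a fixed interval (the 2-second interval here is arbitrary):

# old habit: while true; do df -h /; sleep 2; done
watch -n 2 df -h /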

[Oct 31, 2017] Shell config subfiles by Tom Ryder

Jan 30, 2015 | sanctum.geek.nz

Large shell startup scripts ( .bashrc , .profile ) over about fifty lines or so with a lot of options, aliases, custom functions, and similar tweaks can get cumbersome to manage over time, and if you keep your dotfiles under version control it's not terribly helpful to see large sets of commits just editing the one file when it could be more instructive if broken up into files by section.

Given that shell configuration is just shell code, we can apply the source builtin (or the . builtin for POSIX sh ) to load several files at the end of a .bashrc , for example:

source ~/.bashrc.options
source ~/.bashrc.aliases
source ~/.bashrc.functions

This is a better approach, but it still binds us into using those filenames; we still have to edit the ~/.bashrc file if we want to rename them, or remove them, or add new ones.

Fortunately, UNIX-like systems have a common convention for this, the .d directory suffix, in which sections of configuration can be stored to be read by a main configuration file dynamically. In our case, we can create a new directory ~/.bashrc.d :

$ ls ~/.bashrc.d
options.bash
aliases.bash
functions.bash

With a slightly more advanced snippet at the end of ~/.bashrc , we can then load every file with the suffix .bash in this directory:

# Load any supplementary scripts
for config in "$HOME"/.bashrc.d/*.bash ; do
    source "$config"
done
unset -v config

Note that we unset the config variable after we're done, otherwise it'll be in the namespace of our shell where we don't need it. You may also wish to check for the existence of the ~/.bashrc.d directory, check there's at least one matching file inside it, or check that the file is readable before attempting to source it, depending on your preference.
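
A more defensive variant of the loop, sketching those suggestions:

# Load any supplementary scripts, defensively
if [ -d "$HOME"/.bashrc.d ]; then
    for config in "$HOME"/.bashrc.d/*.bash ; do
        # an unmatched glob stays literal and fails -r, so an empty
        # directory is harmless; -r also skips unreadable files
        [ -r "$config" ] && source "$config"
    done
    unset -v config
fi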

The same method can be applied with .profile to load all scripts with the suffix .sh in ~/.profile.d , if we want to write in POSIX sh , with some slightly different syntax:

# Load any supplementary scripts
for config in "$HOME"/.profile.d/*.sh ; do
    . "$config"
done
unset -v config

Another advantage of this method is that if you have your dotfiles under version control, you can arrange to add extra snippets on a per-machine basis unversioned, without having to update your .bashrc file.

Here's my implementation of the above method, for both .bashrc and .profile :

Thanks to commenter oylenshpeegul for correcting the syntax of the loops.

[Oct 31, 2017] Better Bash history by Tom Ryder

Feb 21, 2012 | sanctum.geek.nz

By default, the Bash shell keeps the history of your most recent session in the .bash_history file, and the commands you've issued in your current session are also available with a history call. These defaults are useful for keeping track of what you've been up to in the shell on any given machine, but with disks much larger and faster than they were when Bash was designed, a little tweaking in your .bashrc file can record history more permanently, consistently, and usefully.

Append history instead of rewriting it

You should start by setting the histappend option, which will mean that when you close a session, your history will be appended to the .bash_history file rather than overwriting what's in there.

shopt -s histappend
Allow a larger history file

The default maximum number of commands saved into the .bash_history file is a rather meager 500. If you want to keep history further back than a few weeks or so, you may as well bump this up by explicitly setting $HISTSIZE to a much larger number in your .bashrc . We can do the same thing with the $HISTFILESIZE variable.

HISTFILESIZE=1000000
HISTSIZE=1000000

The man page for Bash says that HISTFILESIZE can be unset to stop truncation entirely, but unfortunately this doesn't work in .bashrc files due to the order in which variables are set; it's therefore more straightforward to simply set it to a very large number.

If you're on a machine with resource constraints, it might be a good idea to occasionally archive old .bash_history files to speed up login and reduce memory footprint.

Don't store specific lines

You can prevent commands that start with a space from going into history by setting $HISTCONTROL to ignorespace . You can also ignore duplicate commands, for example repeated du calls to watch a file grow, by adding ignoredups . There's a shorthand to set both in ignoreboth .

HISTCONTROL=ignoreboth

You might also want to remove the use of certain commands from your history, whether for privacy or readability reasons. This can be done with the $HISTIGNORE variable. It's common to use this to exclude ls calls, job control builtins like bg and fg , and calls to history itself:

HISTIGNORE='ls:bg:fg:history'
Record timestamps

If you set $HISTTIMEFORMAT to something useful, Bash will record the timestamp of each command in its history. In this variable you can specify the format in which you want this timestamp displayed when viewed with history . I find the full date and time to be useful, because it can be sorted easily and works well with tools like cut and awk .

HISTTIMEFORMAT='%F %T '
Use one command per line

To make your .bash_history file a little easier to parse, you can force commands that you entered on more than one line to be adjusted to fit on only one with the cmdhist option:

shopt -s cmdhist
Store history immediately

By default, Bash only records a session to the .bash_history file on disk when the session terminates. This means that if you crash or your session terminates improperly, you lose the history up to that point. You can fix this by recording each line of history as you issue it, through the $PROMPT_COMMAND variable:

PROMPT_COMMAND='history -a'

[Oct 31, 2017] Bash history expansion by Tom Ryder

Aug 16, 2012 | sanctum.geek.nz

Setting the Bash option histexpand allows some convenient typing shortcuts using Bash history expansion . The option can be set with either of these:

$ set -H
$ set -o histexpand

It's likely that this option is already set for all interactive shells, as it's on by default. The manual, man bash , describes these features as follows:

-H  Enable ! style history substitution. This option is on
    by default when the shell is interactive.

You may have come across this before, perhaps to your annoyance, in the following error message that comes up whenever ! is used in a double-quoted string, or without being escaped with a backslash:

$ echo "Hi, this is Tom!"
bash: !": event not found

If you don't want the feature, and want to make ! a normal character again, it can be disabled with either of these:

$ set +H
$ set +o histexpand

History expansion is actually a very old feature of shells, having been available in csh before Bash usage became common.

This article is a good followup to Better Bash history , which among other things explains how to include dates and times in history output, as these examples do.

Basic history expansion

Perhaps the best known and most useful of these expansions is using !! to refer to the previous command. This allows repeating commands quickly, perhaps to monitor the progress of a long process, such as disk space being freed while deleting a large file:

$ rm big_file &
[1] 23608
$ du -sh .
3.9G    .
$ !!
du -sh .
3.3G    .

It can also be useful to specify the full filesystem path to programs that aren't in your $PATH :

$ hdparm
-bash: hdparm: command not found
$ /sbin/!!
/sbin/hdparm

In each case, note that the command itself is printed as expanded, and then run to print the output on the following line.

History by absolute index

However, !! is actually a specific example of a more general form of history expansion. For example, you can supply the history item number of a specific command to repeat it, after looking it up with history :

$ history | grep expand
 3951  2012-08-16 15:58:53  set -o histexpand
$ !3951
set -o histexpand

You needn't enter the !3951 on a line by itself; it can be included as any part of the command, for example to add a prefix like sudo :

$ sudo !3850

If you include the escape string \! as part of your Bash prompt , you can include the current command number in the prompt before the command, making repeating commands by index a lot easier as long as they're still visible on the screen.
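
For example (the rest of the prompt string here is arbitrary):

PS1='[\!] \u@\h:\w\$ '    # shows e.g. [3952] tom@sanctum:~$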

History by relative index

It's also possible to refer to commands relative to the current command. To substitute the second-to-last command, we can type !-2 . For example, to check whether truncating a file with ed worked correctly:

$ wc -l bigfile.txt
267 bigfile.txt
$ printf '%s\n' '11,$d' w | ed -s bigfile.txt
$ !-2
wc -l bigfile.txt
10 bigfile.txt

This works further back into history, with !-3 , !-4 , and so on.

Expanding for historical arguments

In each of the above cases, we're substituting for the whole command line. There are also ways to get specific tokens, or words , from the command if we want that. To get the first argument of a particular command in the history, use the !^ token:

$ touch a.txt b.txt c.txt
$ ls !^
ls a.txt
a.txt

To get the last argument, use !$ :

$ touch a.txt b.txt c.txt
$ ls !$
ls c.txt
c.txt

To get all arguments (but not the command itself), use !* :

$ touch a.txt b.txt c.txt
$ ls !*
ls a.txt b.txt c.txt
a.txt  b.txt  c.txt

This last one is particularly handy when performing several operations on a group of files; we could run du and wc over them to get their size and character count, and then perhaps decide to delete them based on the output:

$ du a.txt b.txt c.txt
4164    a.txt
5184    b.txt
8356    c.txt
$ wc !*
wc a.txt b.txt c.txt
16689    94038  4250112 a.txt
20749   117100  5294592 b.txt
33190   188557  8539136 c.txt
70628   399695 18083840 total
$ rm !*
rm a.txt b.txt c.txt

These work not just for the preceding command in history, but also absolute and relative command numbers:

$ history 3
 3989  2012-08-16 16:30:59  wc -l b.txt
 3990  2012-08-16 16:31:05  du -sh c.txt
 3991  2012-08-16 16:31:12  history 3
$ echo !3989^
echo -l
-l
$ echo !3990$
echo c.txt
c.txt
$ echo !-1*
echo c.txt
c.txt

More generally, you can use the syntax !n:w to refer to any specific argument in a history item by number. In this case, the first word, usually a command or builtin, is word 0 :

$ history | grep bash
 4073  2012-08-16 20:24:53  man bash
$ !4073:0
man
What manual page do you want?
$ !4073:1
bash

You can even select ranges of words by separating their indices with a hyphen:

$ history | grep apt-get
 3663  2012-08-15 17:01:30  sudo apt-get install gnome
$ !3663:0-1 purge !3663:3
sudo apt-get purge gnome

You can include ^ and $ as start and endpoints for these ranges, too. 3* is a shorthand for 3-$ , meaning "all arguments from the third to the last."
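
For example, !!:3* picks up everything from the third argument onward:

$ echo a b c d e
a b c d e
$ echo !!:3*
echo c d e
c d e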

Expanding history by string

You can also refer to a previous command in the history that starts with a specific string with the syntax !string :

$ !echo
echo c.txt
c.txt
$ !history
history 3
 4011  2012-08-16 16:38:28  rm a.txt b.txt c.txt
 4012  2012-08-16 16:42:48  echo c.txt
 4013  2012-08-16 16:42:51  history 3

If you want to match any part of the command line, not just the start, you can use !?string? :

$ !?bash?
man bash

Be careful when using these, if you use them at all. By default it will run the most recent command matching the string immediately , with no prompting, so it might be a problem if it doesn't match the command you expect.

Checking history expansions before running

If you're paranoid about this, Bash allows you to audit the command as expanded before you enter it, with the histverify option:

$ shopt -s histverify
$ !rm
$ rm a.txt b.txt c.txt

This option works for any history expansion, and may be a good choice for more cautious administrators. It's a good thing to add to one's .bashrc if so.

If you don't need this set all the time, but you do have reservations at some point about running a history command, you can arrange to print the command without running it by adding a :p suffix:

$ !rm:p
rm important-file

In this instance, the command was expanded, but thankfully not actually run.

Substituting strings in history expansions

To get really in-depth, you can also perform substitutions on arbitrary commands from the history with !!:gs/pattern/replacement/ . This is getting pretty baroque even for Bash, but it's possible you may find it useful at some point:

$ !!:gs/txt/mp3/
rm a.mp3 b.mp3 c.mp3

If you only want to replace the first occurrence, you can omit the g :

$ !!:s/txt/mp3/
rm a.mp3 b.txt c.txt
Stripping leading directories or trailing files

If you want to chop a filename off a long argument to work with the directory, you can do this by adding an :h suffix, kind of like a dirname call in Perl:

$ du -sh /home/tom/work/doc.txt
$ cd !$:h
cd /home/tom/work

To do the opposite, like a basename call in Perl, use :t :

$ ls /home/tom/work/doc.txt
$ document=!$:t
document=doc.txt
Stripping extensions or base names

A bit more esoteric, but still possibly useful; to strip a file's extension, use :r :

$ vi /home/tom/work/doc.txt
$ stripext=!$:r
stripext=/home/tom/work/doc

To do the opposite, to get only the extension, use :e :

$ vi /home/tom/work/doc.txt
$ extonly=!$:e
extonly=.txt
Quoting history

If you're performing substitution not to execute a command or fragment but to use it as a string, it's likely you'll want to quote it. For example, if you've just found through experiment and trial and error an ideal ffmpeg command line to accomplish some task, you might want to save it for later use by writing it to a script:

$ ffmpeg -f alsa -ac 2 -i hw:0,0 -f x11grab -r 30 -s 1600x900 \
> -i :0.0+1600,0 -acodec pcm_s16le -vcodec libx264 -preset ultrafast \
> -crf 0 -threads 0 "$(date +%Y%m%d%H%M%S)".mkv

To make sure all the escaping is done correctly, you can write the command into the file with the :q modifier:

$ echo '#!/usr/bin/env bash' >ffmpeg.sh
$ echo !ffmpeg:q >>ffmpeg.sh

In this case, this will prevent Bash from executing the command expansion "$(date ... )" , instead writing it literally to the file as desired. If you build a lot of complex commands interactively that you later write to scripts once completed, this feature is really helpful and saves a lot of cutting and pasting.

Thanks to commenter Mihai Maruseac for pointing out a bug in the examples.

[Oct 31, 2017] Prompt directory shortening by Tom Ryder

Nov 07, 2014 | sanctum.geek.nz

The common default of some variant of \h:\w\$ for a Bash prompt PS1 string includes the \w escape character, so that the user's current working directory appears in the prompt, but with $HOME shortened to a tilde:

tom@sanctum:~$
tom@sanctum:~/Documents$
tom@sanctum:/usr/local/nagios$

This is normally very helpful, particularly if you leave your shell for a time and forget where you are, though of course you can always call the pwd shell builtin. However it can get annoying for very deep directory hierarchies, particularly if you're using a smaller terminal window:

tom@sanctum:/chroot/apache/usr/local/perl/app-library/lib/App/Library/Class:~$

If you're using Bash version 4.0 or above ( bash --version ), you can save a bit of terminal space by setting the PROMPT_DIRTRIM variable for the shell. This limits the length of the tail end of the \w and \W expansions to that number of path elements:

tom@sanctum:/chroot/apache/usr/local/app-library/lib/App/Library/Class$ PROMPT_DIRTRIM=3
tom@sanctum:.../App/Library/Class$

This is a good thing to include in your ~/.bashrc file if you often find yourself deep in directory trees where the upper end of the hierarchy isn't of immediate interest to you. You can remove the effect again by unsetting the variable:

tom@sanctum:.../App/Library/Class$ unset PROMPT_DIRTRIM
tom@sanctum:/chroot/apache/usr/local/app-library/lib/App/Library/Class$

[Oct 20, 2017] Simple logical operators in Bash - Stack Overflow

Oct 20, 2017 | stackoverflow.com

Amit , Jun 7, 2011 at 19:18

I have a couple of variables and I want to check the following condition (written out in words, then my failed attempt at bash scripting):
if varA EQUALS 1 AND ( varB EQUALS "t1" OR varB EQUALS "t2" ) then 

do something

done.

And in my failed attempt, I came up with:

if (($varA == 1)) && ( (($varB == "t1")) || (($varC == "t2")) ); 
  then
    scale=0.05
  fi

Best answer, by Gilles:

What you've written actually almost works (it would work if all the variables were numbers), but it's not an idiomatic way at all.

This is the idiomatic way to write your test in bash:

if [[ $varA = 1 && ($varB = "t1" || $varC = "t2") ]]; then

If you need portability to other shells, this would be the way (note the additional quoting and the separate sets of brackets around each individual test):

if [ "$varA" = 1 ] && { [ "$varB" = "t1" ] || [ "$varC" = "t2" ]; }; then

Will Sheppard , Jun 19, 2014 at 11:07

It's better to use == to differentiate the comparison from assigning a variable (which is also = ) – Will Sheppard Jun 19 '14 at 11:07

Cbhihe , Apr 3, 2016 at 8:05

+1 @WillSheppard for yr reminder of proper style. Gilles, don't you need a semicolon after yr closing curly bracket and before "then" ? I always thought if , then , else and fi could not be on the same line... As in:

if [ "$varA" = 1 ] && { [ "$varB" = "t1" ] || [ "$varC" = "t2" ]; }; then

Cbhihe Apr 3 '16 at 8:05

Rockallite , Jan 19 at 2:41

Backquotes ( ` ` ) are old-style form of command substitution, with some differences: in this form, backslash retains its literal meaning except when followed by $ , ` , or \ , and the first backquote not preceded by a backslash terminates the command substitution; whereas in the $( ) form, all characters between the parentheses make up the command, none are treated specially.

Rockallite Jan 19 at 2:41

Peter A. Schneider , Aug 28 at 13:16

You could emphasize that single brackets have completely different semantics inside and outside of double brackets. (Because you start with explicitly pointing out the subshell semantics but then only as an aside mention the grouping semantics as part of conditional expressions. Was confusing to me for a second when I looked at your idiomatic example.) – Peter A. Schneider Aug 28 at 13:16

matchew , Jun 7, 2011 at 19:29

very close
if (( $varA == 1 )) && [[ $varB == 't1' || $varC == 't2' ]]; 
  then 
    scale=0.05
  fi

should work.

breaking it down

(( $varA == 1 ))

is an integer comparison where as

$varB == 't1'

is a string comparison. otherwise, I am just grouping the comparisons correctly.

Double square brackets delimit a Conditional Expression. And, I find the following to be a good reading on the subject: "(IBM) Demystify test, [, [[, ((, and if-then-else"

Peter A. Schneider , Aug 28 at 13:21

Just to be sure: The quoting in 't1' is unnecessary, right? Because as opposed to arithmetic instructions in double parentheses, where t1 would be a variable, t1 in a conditional expression in double brackets is just a literal string.

I.e., [[ $varB == 't1' ]] is exactly the same as [[ $varB == t1 ]] , right? – Peter A. Schneider Aug 28 at 13:21

[Oct 19, 2017] Bash One-Liners bashoneliners.com

Oct 19, 2017 | www.bashoneliners.com
Kill a process running on port 8080
 $ lsof -i :8080 | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill

-- by Janos on Sept. 1, 2017, 8:31 p.m.

Make a new folder and cd into it.
 $ mkcd(){ NAME=$1; mkdir -p "$NAME"; cd "$NAME"; }

-- by PrasannaNatarajan on Aug. 3, 2017, 6:49 a.m.

Go up to a particular folder
 $ alias ph='cd ${PWD%/public_html*}/public_html'

-- by Jab2870 on July 18, 2017, 6:07 p.m.

Explanation

I work on a lot of websites and often need to go up to the public_html folder.

This command creates an alias so that however many folders deep I am, I will be taken up to the correct folder.

alias ph='....' : This creates a shortcut so that when command ph is typed, the part between the quotes is executed

cd ... : This changes directory to the directory specified

PWD : This is a global bash variable that contains the current directory

${...%/public_html*} : This removes /public_html and anything after it from the specified string

Finally, /public_html at the end is appended onto the string.

So, to sum up, when ph is run, we ask bash to change the directory to the current working directory with anything after public_html removed.

Open another terminal at current location
 $ $TERMINAL & disown

-- by Jab2870 on July 18, 2017, 3:04 p.m.

Explanation

Opens another terminal window at the current location.

Use Case

I often cd into a directory and decide it would be useful to open another terminal in the same folder, maybe for an editor or something. Previously, I would open the terminal and repeat the CD command.

I have aliased this command to open so I just type open and I get a new terminal already in my desired folder.

The & disown part of the command stops the new terminal from being dependant on the first meaning that you can still use the first and if you close the first, the second will remain open. Limitations

It relies on you having the $TERMINAL global variable set. If you don't have this set you could easily change it to something like the following:

gnome-terminal & disown or konsole & disown

Preserve your fingers from cd ..; cd ..; cd..; cd..;
 $ up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

-- by alireza6677 on June 28, 2017, 5:40 p.m.

Generate a sequence of numbers
 $ echo {01..10}

-- by Elkku on March 1, 2015, 12:04 a.m.

Explanation

This example will print:

01 02 03 04 05 06 07 08 09 10

While the original one-liner is indeed IMHO the canonical way to loop over numbers, the brace expansion syntax of Bash 4.x has some kick-ass features such as correct padding of the number with leading zeros.

Limitations

The zero-padding feature works only in Bash >=4.


Related one-liners
Generate a sequence of numbers
 $ for ((i=1; i<=10; ++i)); do echo $i; done

-- by Janos on Nov. 4, 2014, 12:29 p.m.

Explanation

This is similar to seq , but portable. seq does not exist in all systems and is no longer recommended. Other variations to emulate various uses of seq :

# seq 1 2 10
for ((i=1; i<=10; i+=2)); do echo $i; done

# seq -w 5 10
for ((i=5; i<=10; ++i)); do printf '%02d\n' $i; done
Find recent logs that contain the string "Exception"
 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

-- by Janos on July 19, 2014, 7:53 a.m.

Explanation

The find :

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep :

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count . Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com
Remove offending key from known_hosts file with one swift move
 $ sed -i 18d .ssh/known_hosts

-- by EvaggelosBalaskas on Jan. 16, 2013, 2:29 p.m.

Explanation

Using sed to remove a specific line.

The -i parameter is to edit the file in-place.

Limitations

This works as posted in GNU sed . In BSD sed , the -i flag requires a parameter to use as the suffix of a backup file. You can set it to empty to not use a backup file:
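
For example, the BSD form with an empty backup suffix would be:

sed -i '' 18d .ssh/known_hosts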

[Oct 16, 2017] Indenting Here-Documents - bash Cookbook

Oct 16, 2017 | www.safaribooksonline.com

Indenting Here-Documents

Problem

The here-document is great, but it's messing up your shell script's formatting. You want to be able to indent for readability.

Solution

Use <<- and then you can use tab characters (only!) at the beginning of lines to indent this portion of your shell script.

   $ cat myscript.sh
        ...
             grep $1 <<-'EOF'
                lots of data
                can go here
                it's indented with tabs
                to match the script's indenting
                but the leading tabs are
                discarded when read
                EOF
            ls
        ...
        $
Discussion

The hyphen just after the << is enough to tell bash to ignore the leading tab characters. This is for tab characters only and not arbitrary white space. This is especially important with the EOF or any other marker designation. If you have spaces there, it will not recognize the EOF as your ending marker, and the "here" data will continue through to the end of the file (swallowing the rest of your script). Therefore, you may want to always left-justify the EOF (or other marker) just to be safe, and let the formatting go on this one line.

[Oct 16, 2017] Indenting bourne shell here documents

Oct 16, 2017 | prefetch.net

The Bourne shell provides here documents to allow block of data to be passed to a process through STDIN. The typical format for a here document is something similar to this:

command <<ARBITRARY_TAG
data to pass 1
data to pass 2
ARBITRARY_TAG

This will send the data between the ARBITRARY_TAG statements to the standard input of the process. In order for this to work, you need to make sure that the data is not indented. If you indent it for readability, you will get a syntax error similar to the following:

./test: line 12: syntax error: unexpected end of file

To allow your here documents to be indented, you can append a "-" to the end of the redirection strings like so:

if [ "${STRING}" = "SOMETHING" ]
then
        somecommand <<-EOF
        this is a string1
        this is a string2
        this is a string3
        EOF
fi

You will need to use tabs to indent the data, but that is a small price to pay for added readability. Nice!

[Oct 09, 2017] TMOUT - Auto Logout Linux Shell When There Isn't Any Activity by Aaron Kili

Oct 07, 2017 | www.tecmint.com
... ... ..

To enable automatic user logout, we will be using the TMOUT shell variable, which terminates a user's login shell in case there is no activity for a given number of seconds that you can specify.

To enable this globally (system-wide for all users), set the above variable in the /etc/profile shell initialization file.
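
A minimal sketch (the 300-second timeout is arbitrary; readonly stops users from simply unsetting it):

# In /etc/profile (system-wide) or ~/.bashrc (per user)
TMOUT=300         # log out after 5 minutes of inactivity
readonly TMOUT
export TMOUT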

[Sep 27, 2017] Arithmetic Evaluation

Sep 27, 2017 | mywiki.wooledge.org

Bash has several different ways to say we want to do arithmetic instead of string operations. Let's look at them one by one.

The first way is the let command:

$ unset a; a=4+5
$ echo $a
4+5
$ let a=4+5
$ echo $a
9

You may use spaces, parentheses and so forth, if you quote the expression:

$ let a='(5+2)*3'

For a full list of operators available, see help let or the manual.

Next, the actual arithmetic evaluation compound command syntax:

$ ((a=(5+2)*3))

This is equivalent to let , but we can also use it as a command , for example in an if statement:

$ if (($a == 21)); then echo 'Blackjack!'; fi

Operators such as == , < , > and so on cause a comparison to be performed, inside an arithmetic evaluation. If the comparison is "true" (for example, 10 > 2 is true in arithmetic -- but not in strings!) then the compound command exits with status 0. If the comparison is false, it exits with status 1. This makes it suitable for testing things in a script.
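
For example:

$ ((10 > 2)); echo $?
0
$ ((10 < 2)); echo $?
1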

Although not a compound command, an arithmetic substitution (or arithmetic expression ) syntax is also available:

$ echo "There are $(($rows * $columns)) cells"

Inside $((...)) is an arithmetic context , just like with ((...)) , meaning we do arithmetic (multiplying things) instead of string manipulations (concatenating $rows , space, asterisk, space, $columns ). $((...)) is also portable to the POSIX shell, while ((...)) is not.

Readers who are familiar with the C programming language might wish to know that ((...)) has many C-like features. Among them are the ternary operator:

$ ((abs = (a >= 0) ? a : -a))

and the use of an integer value as a truth value:

$ if ((flag)); then echo "uh oh, our flag is up"; fi

Note that we used variables inside ((...)) without prefixing them with $ -signs. This is a special syntactic shortcut that Bash allows inside arithmetic evaluations and arithmetic expressions.

There is one final thing we must mention about ((flag)) . Because the inside of ((...)) is C-like, a variable (or expression) that evaluates to zero will be considered false for the purposes of the arithmetic evaluation. Then, because the evaluation is false, it will exit with a status of 1. Likewise, if the expression inside ((...)) is non-zero , it will be considered true ; and since the evaluation is true, it will exit with status 0. This is potentially very confusing, even to experts, so you should take some time to think about this. Nevertheless, when things are used the way they're intended, it makes sense in the end:

$ flag=0      # no error
$ while read line; do
>   if [[ $line = *err* ]]; then flag=1; fi
> done < inputfile
$ if ((flag)); then echo "oh no"; fi

[Sep 27, 2017] Integer ASCII value to character in BASH using printf

Sep 27, 2017 | stackoverflow.com

user14070 , asked May 20 '09 at 21:07

Character to value works:
$ printf "%d\n" \'A
65
$

I have two questions, the first one being the most important: how do I do the opposite conversion (integer value to character), and where is the leading-quote syntax documented?

broaden , answered Nov 18 '09 at 10:10

One line
printf "\x$(printf %x 65)"

Two lines

set $(printf %x 65)
printf "\x$1"

Here is one if you do not mind using awk

awk 'BEGIN{printf "%c", 65}'

mouviciel , answered May 20 '09 at 21:12

For this kind of conversion, I use perl:
perl -e 'printf "%c\n", 65;'

user2350426 , answered Sep 22 '15 at 23:16

This works (with the value in octal):
$ printf '%b' '\101'
A

even for some sequences (octal digits only, so don't go over 7):

$ printf '%b' '\'{101..107}
ABCDEFG

A general construct that allows (decimal) values in any range is:

$ printf '%b' $(printf '\\%03o' {65..122})
ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz

Or you could use the hex values of the characters:

$ printf '%b' $(printf '\\x%x' {65..122})
ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz

You also could get the character back with xxd (use hexadecimal values):

$ echo "41" | xxd -p -r
A

That is, one action is the reverse of the other:

$ printf "%x" "'A" | xxd -p -r
A

And also works with several hex values at once:

$ echo "41 42 43 44 45 46 47 48 49 4a" | xxd -p -r
ABCDEFGHIJ

or sequences (printf is used here to get hex values):

$ printf '%x' {65..90} | xxd -r -p 
ABCDEFGHIJKLMNOPQRSTUVWXYZ

Or even use awk:

$ echo 65 | awk '{printf("%c",$1)}'
A

even for sequences:

$ seq 65 90 | awk '{printf("%c",$1)}'
ABCDEFGHIJKLMNOPQRSTUVWXYZ

David Hu , answered Dec 1 '11 at 9:43

For your second question, it seems the leading-quote syntax ( \'A ) is specific to printf :

If the leading character is a single-quote or double-quote, the value shall be the numeric value in the underlying codeset of the character following the single-quote or double-quote.

From http://pubs.opengroup.org/onlinepubs/009695399/utilities/printf.html

Naaff , answered May 20 '09 at 21:21

One option is to directly input the character you're interested in using hex or octal notation:
printf "\x41\n"
printf "\101\n"

MagicMercury86 , answered Feb 21 '12 at 22:49

If you want to save the ASCII value of the character (I did this in BASH and it worked):

char="A"
testing=$( printf "%d" "'${char}" )
echo $testing

output: 65

chand , answered Nov 20 '14 at 10:05

Here's yet another way to convert 65 into A (via octal):
help printf  # in Bash
man bash | less -Ip '^[[:blank:]]*printf'

printf "%d\n" '"A'
printf "%d\n" "'A"

printf '%b\n' "$(printf '\%03o' 65)"

To search in man bash for \' use (though futile in this case):

man bash | less -Ip "\\\'"  # press <n> to go through the matches


If you convert 65 to hexadecimal it's 0x41 :

$ echo -e "\x41"
A

[Sep 27, 2017] linux - How to convert DOS-Windows newline (CRLF) to Unix newline in a Bash script


Koran Molovik , asked Apr 10 '10 at 15:03

How can I programmatically (i.e., not using vi ) convert DOS/Windows newlines to Unix?

The dos2unix and unix2dos commands are not available on certain systems. How can I emulate these with commands like sed / awk / tr ?

Jonathan Leffler , answered Apr 10 '10 at 15:13

You can use tr to convert from DOS to Unix; however, you can only do this safely if CR appears in your file only as the first byte of a CRLF byte pair. This is usually the case. You then use:
tr -d '\015' <DOS-file >UNIX-file

Note that the name DOS-file is different from the name UNIX-file ; if you try to use the same name twice, you will end up with no data in the file.

You can't do it the other way round (with standard 'tr').

If you know how to enter carriage return into a script ( control-V , control-M to enter control-M), then:

sed 's/^M$//'     # DOS to Unix
sed 's/$/^M/'     # Unix to DOS

where the '^M' is the control-M character. You can also use the bash ANSI-C Quoting mechanism to specify the carriage return:

sed $'s/\r$//'     # DOS to Unix
sed $'s/$/\r/'     # Unix to DOS

However, if you're going to have to do this very often (more than once, roughly speaking), it is far more sensible to install the conversion programs (e.g. dos2unix and unix2dos , or perhaps dtou and utod ) and use them.

ghostdog74 , answered Apr 10 '10 at 15:21

tr -d "\r" < file

take a look here for examples using sed :

# IN UNIX ENVIRONMENT: convert DOS newlines (CR/LF) to Unix format.
sed 's/.$//'               # assumes that all lines end with CR/LF
sed 's/^M$//'              # in bash/tcsh, press Ctrl-V then Ctrl-M
sed 's/\x0D$//'            # works on ssed, gsed 3.02.80 or higher

# IN UNIX ENVIRONMENT: convert Unix newlines (LF) to DOS format.
sed "s/$/`echo -e \\\r`/"            # command line under ksh
sed 's/$'"/`echo \\\r`/"             # command line under bash
sed "s/$/`echo \\\r`/"               # command line under zsh
sed 's/$/\r/'                        # gsed 3.02.80 or higher

Use sed -i for in-place conversion e.g. sed -i 's/..../' file .

Steven Penny , answered Apr 30 '14 at 10:02

Doing this with POSIX is tricky:

To remove carriage returns:

ex -bsc '%!awk "{sub(/\r/,\"\")}1"' -cx file

To add carriage returns:

ex -bsc '%!awk "{sub(/$/,\"\r\")}1"' -cx file

Norman Ramsey , answered Apr 10 '10 at 22:32

This problem can be solved with standard tools, but there are sufficiently many traps for the unwary that I recommend you install the flip command, which was written over 20 years ago by Rahul Dhesi, the author of zoo . It does an excellent job converting file formats while, for example, avoiding the inadvertent destruction of binary files, which is a little too easy if you just race around altering every CRLF you see...

Gordon Davisson , answered Apr 10 '10 at 17:50

The solutions posted so far only deal with part of the problem, converting DOS/Windows' CRLF into Unix's LF; the part they're missing is that DOS use CRLF as a line separator , while Unix uses LF as a line terminator . The difference is that a DOS file (usually) won't have anything after the last line in the file, while Unix will. To do the conversion properly, you need to add that final LF (unless the file is zero-length, i.e. has no lines in it at all). My favorite incantation for this (with a little added logic to handle Mac-style CR-separated files, and not molest files that're already in unix format) is a bit of perl:
perl -pe 'if ( s/\r\n?/\n/g ) { $f=1 }; if ( $f || ! $m ) { s/([^\n])\z/$1\n/ }; $m=1' PCfile.txt

Note that this sends the Unixified version of the file to stdout. If you want to replace the file with a Unixified version, add perl's -i flag.

codaddict , answered Apr 10 '10 at 15:09

Using AWK you can do:
awk '{ sub("\r$", ""); print }' dos.txt > unix.txt

Using Perl you can do:

perl -pe 's/\r$//' < dos.txt > unix.txt

anatoly techtonik , answered Oct 31 '13 at 9:40

If you don't have access to dos2unix , but can read this page, then you can copy/paste dos2unix.py from here.
#!/usr/bin/env python
"""\
convert dos linefeeds (crlf) to unix (lf)
usage: dos2unix.py <input> <output>
"""
import sys

if len(sys.argv[1:]) != 2:
  sys.exit(__doc__)

content = b''
outsize = 0
with open(sys.argv[1], 'rb') as infile:
  content = infile.read()
with open(sys.argv[2], 'wb') as output:
  for line in content.splitlines():
    outsize += len(line) + 1
    output.write(line + b'\n')  # bytes literal, so this works on both Python 2 and 3

print("Done. Saved %s bytes." % (len(content)-outsize))

Cross-posted from superuser .

nawK , answered Sep 4 '14 at 0:16

An even simpler awk solution w/o a program:
awk -v ORS='\r\n' '1' unix.txt > dos.txt

Technically '1' is your program, b/c awk requires one when given an option.

UPDATE : After revisiting this page for the first time in a long time I realized that no one has yet posted an internal solution, so here is one:

while IFS= read -r line;
do printf '%s\n' "${line%$'\r'}";
done < dos.txt > unix.txt

Santosh , answered Mar 12 '15 at 22:36

This worked for me
tr "\r" "\n" < sampledata.csv > sampledata2.csv

ThorSummoner , answered Jul 30 '15 at 17:38

Super duper easy with PCRE;

As a script, or replace $@ with your files.

#!/usr/bin/env bash
perl -pi -e 's/\r\n/\n/g' -- "$@"

This will overwrite your files in place!

I recommend only doing this with a backup (version control or otherwise)

Ashley Raiteri , answered May 19 '14 at 23:25

For Mac OS X, if you have Homebrew installed ( http://brew.sh/ ):
brew install dos2unix

for csv in *.csv; do dos2unix -c mac ${csv}; done;

Make sure you have made copies of the files, as this command will modify the files in place. The -c mac option makes the switch to be compatible with osx.

lzc , answered May 31 '16 at 17:15

TIMTOWTDI!
perl -pe 's/\r\n/\n/; s/([^\n])\z/$1\n/ if eof' PCfile.txt

Based on @GordonDavisson

One must consider the possibility of [noeol] ...

kazmer , answered Nov 6 '16 at 23:30

You can use awk. Set the record separator ( RS ) to a regexp that matches all possible newline character, or characters. And set the output record separator ( ORS ) to the unix-style newline character.
awk 'BEGIN{RS="\r|\n|\r\n|\n\r";ORS="\n"}{print}' windows_or_macos.txt > unix.txt

user829755 , answered Jul 21 at 9:21

interestingly in my git-bash on windows sed "" did the trick already:
$ echo -e "abc\r" >tst.txt
$ file tst.txt
tst.txt: ASCII text, with CRLF line terminators
$ sed -i "" tst.txt
$ file tst.txt
tst.txt: ASCII text

My guess is that sed ignores them when reading lines from input and always writes unix line endings on output.

Gannet , answered Jan 24 at 8:38

As an extension to Jonathan Leffler's Unix to DOS solution, to safely convert to DOS when you're unsure of the file's current line endings:
sed '/^M$/! s/$/^M/'

This checks that the line does not already end in CRLF before converting to CRLF.

vmsnomad , answered Jun 23 at 18:37

I just had to ponder that same question (on the Windows side, but equally applicable to Linux). Surprisingly nobody has mentioned a very automated way of doing CRLF<->LF conversion for text files using the good old zip -ll option (Info-ZIP):
zip -ll textfiles-lf.zip files-with-crlf-eol.*
unzip textfiles-lf.zip

NOTE: this would create a zip file preserving the original file names but converting the line endings to LF. Then unzip would extract the files as zip'ed, that is with their original names (but with LF-endings), thus prompting to overwrite the local original files if any.

Relevant excerpt from the zip --help :

zip --help
...
-l   convert LF to CR LF (-ll CR LF to LF)

I tried sed 's/^M$//' file.txt on OSX as well as several other methods ( http://www.thingy-ma-jig.co.uk/blog/25-11-2010/fixing-dos-line-endings or http://hintsforums.macworld.com/archive/index.php/t-125.html ). None worked; the file remained unchanged (btw Ctrl-V Enter was needed to reproduce ^M). In the end I used TextWrangler. It's not strictly command line, but it works and it doesn't complain.

[Aug 29, 2017] How to view the `.bash_history` file via command line

Aug 29, 2017 | askubuntu.com

If you actually need the output of the .bash_history file, replace history with cat ~/.bash_history in the commands below.

If you actually want the commands without numbers in front, use this command instead of history :

history | cut -d' ' -f 4-
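
A common variant strips a fixed number of leading columns instead (a sketch; the exact count depends on how wide the history numbers get):

history | cut -c 8-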

[Jul 29, 2017] Preserve bash history in multiple terminal windows - Unix Linux Stack Exchange

Jul 29, 2017 | unix.stackexchange.com

Oli , asked Aug 26 '10 at 13:04

I consistently have more than one terminal open. Anywhere from two to ten, doing various bits and bobs. Now let's say I restart and open up another set of terminals. Some remember certain things, some forget.

I want a history that remembers everything and is shared between all terminals. Anything I can do to make bash work more like that?

Pablo R. , answered Aug 26 '10 at 14:37

# Avoid duplicates
export HISTCONTROL=ignoredups:erasedups  
# When the shell exits, append to the history file instead of overwriting it
shopt -s histappend

# After each command, append to the history file and reread it
export PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND$'\n'}history -a; history -c; history -r"

kch , answered Sep 19 '08 at 17:49

So, this is all my history-related .bashrc thing:
export HISTCONTROL=ignoredups:erasedups  # no duplicate entries
export HISTSIZE=100000                   # big big history
export HISTFILESIZE=100000               # big big history
shopt -s histappend                      # append to history, don't overwrite it

# Save and reload the history after each command finishes
export PROMPT_COMMAND="history -a; history -c; history -r; $PROMPT_COMMAND"

Tested with bash 3.2.17 on Mac OS X 10.5, bash 4.1.7 on 10.6.

lesmana , answered Jun 16 '10 at 16:11

Here is my attempt at Bash session history sharing. This will enable history sharing between bash sessions in a way that the history counter does not get mixed up and history expansion like !number will work (with some constraints).

Using Bash version 4.1.5 under Ubuntu 10.04 LTS (Lucid Lynx).

HISTSIZE=9000
HISTFILESIZE=$HISTSIZE
HISTCONTROL=ignorespace:ignoredups

_bash_history_sync() {
  builtin history -a         #1
  HISTFILESIZE=$HISTSIZE     #2
  builtin history -c         #3
  builtin history -r         #4
}

history() {                  #5
  _bash_history_sync
  builtin history "$@"
}

PROMPT_COMMAND=_bash_history_sync
Explanation:
  1. Append the just entered line to the $HISTFILE (default is .bash_history ). This will cause $HISTFILE to grow by one line.
  2. Setting the special variable $HISTFILESIZE to some value will cause Bash to truncate $HISTFILE to be no longer than $HISTFILESIZE lines by removing the oldest entries.
  3. Clear the history of the running session. This will reduce the history counter by the amount of $HISTSIZE .
  4. Read the contents of $HISTFILE and insert them into the current running session history. This will raise the history counter by the number of lines in $HISTFILE . Note that the line count of $HISTFILE is not necessarily $HISTFILESIZE .
  5. The history() function overrides the builtin history to make sure that the history is synchronised before it is displayed. This is necessary for the history expansion by number (more about this later).
More explanation: About the constraints of the history expansion:

When using history expansion by number, you should always look up the number immediately before using it. That means no bash prompt display between looking up the number and using it. That usually means no enter and no ctrl+c.

Generally, once you have more than one Bash session, there is no guarantee whatsoever that a history expansion by number will retain its value between two Bash prompt displays, because when PROMPT_COMMAND is executed the history from all other Bash sessions is integrated into the history of the current session. If any other bash session has a new command, the history numbers of the current session will be different.

I find this constraint reasonable. I have to look the number up every time anyway because I can't remember arbitrary history numbers.

Usually I use the history expansion by number like this

$ history | grep something #note number
$ !number

I recommend using the following Bash options.

## reedit a history substitution line if it failed
shopt -s histreedit
## edit a recalled history line before executing
shopt -s histverify
Strange bugs:

Running the history command piped to anything will result in that command being listed in the history twice. For example:

$ history | head
$ history | tail
$ history | grep foo
$ history | true
$ history | false

All will be listed in the history twice. I have no idea why.


Maciej Piechotka , answered Aug 26 '10 at 13:20

I'm not aware of any way to do this in bash, but it's one of the most popular features of zsh .
Personally I prefer zsh over bash, so I recommend trying it.

Here's the part of my .zshrc that deals with history:

SAVEHIST=10000 # Number of entries
HISTSIZE=10000
HISTFILE=~/.zsh/history # File
setopt APPEND_HISTORY # Don't erase history
setopt EXTENDED_HISTORY # Add additional data to history like timestamp
setopt INC_APPEND_HISTORY # Add immediately
setopt HIST_FIND_NO_DUPS # Don't show duplicates in search
setopt HIST_IGNORE_SPACE # Don't preserve spaces. You may want to turn it off
setopt NO_HIST_BEEP # Don't beep
setopt SHARE_HISTORY # Share history between session/terminals

Chris Down , answered Nov 25 '11 at 15:46

To do this, you'll need to add two lines to your ~/.bashrc :
shopt -s histappend
PROMPT_COMMAND="history -a;history -c;history -r;"
$PROMPT_COMMAND

From man bash :

If the histappend shell option is enabled (see the description of shopt under SHELL BUILTIN COMMANDS below), the lines are appended to the history file, otherwise the history file is over-written.

Schof , answered Sep 19 '08 at 19:38

You can edit your BASH prompt to run the "history -a" and "history -r" that Muerr suggested:
savePS1=$PS1

(in case you mess something up, which is almost guaranteed)

PS1=$savePS1`history -a;history -r`

(note that these are back-ticks; they'll run history -a and history -r on every prompt. Since they don't output any text, your prompt will be unchanged.)

Once you've got your PS1 variable set up the way you want, set it permanently in your ~/.bashrc file.

If you want to go back to your original prompt while testing, do:

PS1=$savePS1

I've done basic testing on this to ensure that it sort of works, but can't speak to any side-effects from running history -a;history -r on every prompt.

pts , answered Mar 25 '11 at 17:40

If you need a bash or zsh history synchronizing solution which also solves the problem below, then see it at http://ptspts.blogspot.com/2011/03/how-to-automatically-synchronize-shell.html

The problem is the following: I have two shell windows A and B. In shell window A, I run sleep 9999 , and (without waiting for the sleep to finish) in shell window B, I want to be able to see sleep 9999 in the bash history.

The reason why most other solutions here won't solve this problem is that they write their history changes to the history file using PROMPT_COMMAND or PS1 , both of which execute too late, only after the sleep 9999 command has finished.

jtimberman , answered Sep 19 '08 at 17:38

You can use history -a to append the current session's history to the histfile, then use history -r on the other terminals to read the histfile.
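
A minimal two-terminal walk-through of that approach (the echoed command is hypothetical):

# in terminal A
echo "typed in A"   # some command worth sharing
history -a          # flush A's new entries to the history file

# in terminal B
history -r          # read the file into B's in-memory history
history | tail -3   # B now also sees the command from A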

jmanning2k , answered Aug 26 '10 at 13:59

I can offer a fix for that last one: make sure the env variable HISTCONTROL does not specify "ignorespace" (or "ignoreboth").

But I feel your pain with multiple concurrent sessions. It simply isn't handled well in bash.

Toby , answered Nov 20 '14 at 14:53

Here's an alternative that I use. It's cumbersome but it addresses the issue that @axel_c mentioned where sometimes you may want to have a separate history instance in each terminal (one for make, one for monitoring, one for vim, etc).

I keep a separate appended history file that I constantly update. I have the following mapped to a hotkey:

history | grep -v history >> ~/master_history.txt

This appends all history from the current terminal to a file called master_history.txt in your home dir.

I also have a separate hotkey to search through the master history file:

cat /home/toby/master_history.txt | grep -i

I use cat | grep because it leaves the cursor at the end to enter my regex. A less ugly way to do this would be to add a couple of scripts to your path to accomplish these tasks, but hotkeys work for my purposes. I also periodically will pull history down from other hosts I've worked on and append that history to my master_history.txt file.

It's always nice to be able to quickly search and find that tricky regex you used or that weird perl one-liner you came up with 7 months ago.

Yarek T , answered Jul 23 '15 at 9:05

Right, so finally this annoyed me enough to find a decent solution:
# Write history after each command
_bash_history_append() {
    builtin history -a
}
PROMPT_COMMAND="_bash_history_append; $PROMPT_COMMAND"

What this does is a sort of amalgamation of what was said in this thread, except that I don't understand why you would reload the global history after every command. I very rarely care about what happens in other terminals, but I always run a series of commands, say in one terminal:

make
ls -lh target/*.foo
scp target/artifact.foo vm:~/

(Simplified example)

And in another:

pv ~/test.data | nc vm:5000 >> output
less output
mv output output.backup1

No way I'd want those commands shared between the two terminals.

rouble , answered Apr 15 at 17:43

Here is my enhancement to @lesmana's answer . The main difference is that concurrent windows don't share history. This means you can keep working in your windows, without having context from other windows getting loaded into your current windows.

If you explicitly type 'history', OR if you open a new window then you get the history from all previous windows.

Also, I use this strategy to archive every command ever typed on my machine.

# Consistent and forever bash history
HISTSIZE=100000
HISTFILESIZE=$HISTSIZE
HISTCONTROL=ignorespace:ignoredups

_bash_history_sync() {
  builtin history -a         #1
  HISTFILESIZE=$HISTSIZE     #2
}

_bash_history_sync_and_reload() {
  builtin history -a         #1
  HISTFILESIZE=$HISTSIZE     #2
  builtin history -c         #3
  builtin history -r         #4
}

history() {                  #5
  _bash_history_sync_and_reload
  builtin history "$@"
}

export HISTTIMEFORMAT="%y/%m/%d %H:%M:%S   "
PROMPT_COMMAND='history 1 >> ${HOME}/.bash_eternal_history'
PROMPT_COMMAND="_bash_history_sync;$PROMPT_COMMAND"

simotek , answered Jun 1 '14 at 6:02

I have written a script for setting a history file per session or task; it's based on the following.
        # write existing history to the old file
        history -a

        # set new historyfile
        export HISTFILE="$1"
        export HISET=$1

        # touch the new file to make sure it exists
        touch $HISTFILE
        # load new history file
        history -r $HISTFILE

It doesn't necessarily save every history command, but it saves the ones that I care about, and it's easier to retrieve them than to go through every command. My version also lists all history files and provides the ability to search through them all.

Full source: https://github.com/simotek/scripts-config/blob/master/hiset.sh

Litch , answered Aug 11 '15 at 0:15

I chose to put history in a file-per-tty, as multiple people can be working on the same server - separating each session's commands makes it easier to audit.
# Convert /dev/nnn/X or /dev/nnnX to "nnnX"
HISTSUFFIX=`tty | sed 's/\///g;s/^dev//g'`
# History file is now .bash_history_pts0
HISTFILE=".bash_history_$HISTSUFFIX"
HISTTIMEFORMAT="%y-%m-%d %H:%M:%S "
HISTCONTROL=ignoredups:ignorespace
shopt -s histappend
HISTSIZE=1000
HISTFILESIZE=5000

History now looks like:

user@host:~# test 123
user@host:~# test 5451
user@host:~# history
1  15-08-11 10:09:58 test 123
2  15-08-11 10:10:00 test 5451
3  15-08-11 10:10:02 history

With the files looking like:

user@host:~# ls -la .bash*
-rw------- 1 root root  4275 Aug 11 09:42 .bash_history_pts0
-rw------- 1 root root    75 Aug 11 09:49 .bash_history_pts1
-rw-r--r-- 1 root root  3120 Aug 11 10:09 .bashrc

fstang , answered Sep 10 '16 at 19:30

Here I will point out one problem with
export PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND$'\n'}history -a; history -c; history -r"

and

PROMPT_COMMAND="$PROMPT_COMMAND;history -a; history -n"

If you run source ~/.bashrc, the $PROMPT_COMMAND will be like

"history -a; history -c; history -r history -a; history -c; history -r"

and

"history -a; history -n history -a; history -n"

This repetition occurs each time you run 'source ~/.bashrc'. You can check PROMPT_COMMAND after each time you run 'source ~/.bashrc' by running 'echo $PROMPT_COMMAND'.

You can see that some commands are apparently broken, e.g. "history -n history -a". But the good news is that it still works, because the other parts still form a valid command sequence (it just incurs some extra cost from executing some commands repeatedly, and it is not so clean).
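
A minimal sketch of one way to avoid that duplication when re-sourcing, using the history commands themselves as the marker (an assumption; any unique substring of your PROMPT_COMMAND would do):

case ";$PROMPT_COMMAND;" in
  *";history -a; history -c; history -r;"*) ;;  # already installed, do nothing
  *) PROMPT_COMMAND="history -a; history -c; history -r${PROMPT_COMMAND:+; $PROMPT_COMMAND}" ;;
esac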

Personally I use the following simple version:

shopt -s histappend
PROMPT_COMMAND="history -a; history -c; history -r"

which has most of the functionalities while no such issue as mentioned above.

Another point to make: there is really nothing magic here. PROMPT_COMMAND is just a plain bash variable. The commands in it get executed before you get the bash prompt (the $ sign). For example, if your PROMPT_COMMAND is "echo 123" and you run "ls" in your terminal, the effect is like running "ls; echo 123".

$ PROMPT_COMMAND="echo 123"

output (Just like running 'PROMPT_COMMAND="echo 123"; $PROMPT_COMMAND'):

123

Run the following:

$ echo 3

output:

3
123

"history -a" is used to write the history commands in memory to ~/.bash_history

"history -c" is used to clear the history commands in memory

"history -r" is used to read history commands from ~/.bash_history to memory

See history command explanation here: http://ss64.com/bash/history.html

PS: As other users have pointed out, export is unnecessary. See: using export in .bashrc

Hopping Bunny , answered May 13 '15 at 4:48

Here is the snippet from my .bashrc and short explanations wherever needed:
# The following line ensures that history logs screen commands as well
shopt -s histappend

# This line makes the history file be rewritten and reread at each bash prompt
PROMPT_COMMAND="$PROMPT_COMMAND;history -a; history -n"
# Have lots of history
HISTSIZE=100000         # remember the last 100000 commands
HISTFILESIZE=100000     # start truncating commands after 100000 lines
HISTCONTROL=ignoreboth  # ignoreboth is shorthand for ignorespace and ignoredups

The HISTFILESIZE and HISTSIZE are personal preferences and you can change them as per your tastes.

Mulki , answered Jul 24 at 20:49

This works for ZSH
##############################################################################
# History Configuration for ZSH
##############################################################################
HISTSIZE=10000               #How many lines of history to keep in memory
HISTFILE=~/.zsh_history     #Where to save history to disk
SAVEHIST=10000               #Number of history entries to save to disk
#HISTDUP=erase               #Erase duplicates in the history file
setopt    appendhistory     #Append history to the history file (no overwriting)
setopt    sharehistory      #Share history across terminals
setopt    incappendhistory  #Immediately append to the history file, not just when a term is killed

[Jul 29, 2017] shell - How does this bash code detect an interactive session - Stack Overflow

Notable quotes:
"... If parameter is '@' or '*', the pattern removal operation is applied to each positional parameter in turn, and the expansion is the resultant list. If parameter is an array variable subscripted with '@' or '*' ..."
Jul 29, 2017 | stackoverflow.com

user1284631 , asked Jun 5 '13 at 8:44

Following some issues with scp (it did not like the presence of the bash bind command in my .bashrc file, apparently), I followed the advice of a clever guy on the Internet (I just cannot find that post right now) who put this at the top of his .bashrc file:
[[ ${-#*i} != ${-} ]] || return

in order to make sure that the bash initialization is NOT executed unless in interactive session.

Now, that works. However, I am not able to figure how it works. Could you enlighten me?

According to this answer , the $- is the current options set for the shell and I know that the ${} is the so-called "substring" syntax for expanding variables.

However, I do not understand the ${-#*i} part. And why $-#*i is not the same as ${-#*i} .

blue , answered Jun 5 '13 at 8:49

${parameter#word}

${parameter##word}

The word is expanded to produce a pattern just as in filename expansion. If the pattern matches the beginning of the expanded value of parameter, then the result of the expansion is the expanded value of parameter with the shortest matching pattern (the '#' case) or the longest matching pattern (the '##' case) deleted.

If parameter is '@' or '*', the pattern removal operation is applied to each positional parameter in turn, and the expansion is the resultant list. If parameter is an array variable subscripted with '@' or '*', the pattern removal operation is applied to each member of the array in turn, and the expansion is the resultant list.

Source: http://www.gnu.org/software/bash/manual/html_node/Shell-Parameter-Expansion.html

So basically what happens in ${-#*i} is that *i is expanded as a pattern, and if it matches the beginning of the value of $- , then the result of the whole expansion is $- with the shortest matching prefix deleted.

Example

VAR "baioasd" 
echo ${VAR#*i};

outputs oasd .

In your case

If the shell is interactive, $- will contain the letter 'i', so when you strip the pattern *i from $- you get a string that is different from the original $- ( [[ ${-#*i} != ${-} ]] yields true). If the shell is not interactive, $- does not contain the letter 'i', so the pattern *i does not match anything in $- and [[ ${-#*i} != $- ]] yields false, and the return statement is executed.

perreal , answered Jun 5 '13 at 8:53

See this :

To determine within a startup script whether or not Bash is running interactively, test the value of the '-' special parameter. It contains i when the shell is interactive

Your substitution removes the string up to, and including, the i and tests whether the substituted version is equal to the original string. They will be different if there is an i in ${-} .
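
A common equivalent, for comparison (a minimal sketch doing the same interactivity test without the pattern-removal trick):

case $- in
  *i*) ;;      # interactive: keep sourcing the rest of the file
  *) return ;; # non-interactive: stop here
esac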

[Jul 26, 2017] I feel stupid declare not found in bash scripting

A single space can make a huge difference in bash :-) (Note also that the error format "products.sh: 6: declare: not found" is typical of dash, which suggests the script was run with sh rather than bash; declare is a bash builtin.)
www.linuxquestions.org

Mohtek

I feel stupid: declare not found in bash scripting? I was anxious to get my feet wet, and I'm only up to my toes before I'm stuck...this seems very very easy but I'm not sure what I've done wrong. Below is the script and its output. What the heck am I missing?

______________________________________________________
#!/bin/bash
declare -a PROD[0]="computers" PROD[1]="HomeAutomation"
printf "${ PROD[*]}"
_______________________________________________________

products.sh: 6: declare: not found
products.sh: 8: Syntax error: Bad substitution

wjevans_7d1@yahoo.co

I ran what you posted (but at the command line, not in a script, though that should make no significant difference), and got this:

Code:

-bash: ${ PROD[*]}: bad substitution

In other words, I couldn't reproduce your first problem, the "declare: not found" error. Try the declare command by itself, on the command line.

And I got rid of the "bad substitution" problem when I removed the space which is between the ${ and the PROD on the printf line.

Hope this helps.

blackhole54

The previous poster identified your second problem.

As far as your first problem goes ... I am not a bash guru although I have written a number of bash scripts. So far I have found no need for declare statements. I suspect that you might not need it either. But if you do want to use it, the following does work:

Code:
#!/bin/bash

declare -a PROD
PROD[0]="computers"
PROD[1]="HomeAutomation"
printf "${PROD[*]}\n"

EDIT: My original post was based on an older version of bash. When I tried the declare statement you posted I got an error message, but one that was different from yours. I just tried it on a newer version of bash, and your declare statement worked fine. So it might depend on the version of bash you are running. What I posted above runs fine on both versions.

[Jul 26, 2017] Associative array declaration gotcha

Jul 26, 2017 | unix.stackexchange.com

bash silently does function return on (re-)declare of global associative read-only array - Unix & Linux Stack Exchange

Ron Burk :

Obviously cut out of a much more complex script that was more meaningful:

#!/bin/bash

function InitializeConfig(){
    declare -r -g -A SHCFG_INIT=( [a]=b )
    declare -r -g -A SHCFG_INIT=( [c]=d )
    echo "This statement never gets executed"
}

set -o xtrace

InitializeConfig
echo "Back from function"
The output looks like this:
ronburk@ubuntu:~/ubucfg$ bash bug.sh
+ InitializeConfig
+ SHCFG_INIT=([a]=b)
+ declare -r -g -A SHCFG_INIT
+ SHCFG_INIT=([c]=d)
+ echo 'Back from function'
Back from function
Bash seems to silently execute a function return upon the second declare statement. Starting to think this really is a new bug, but happy to learn otherwise.

Other details:

Machine: x86_64
OS: linux-gnu
Compiler: gcc
Compilation CFLAGS:  -DPROGRAM='bash' -DCONF_HOSTTYPE='x86_64' -DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='x86_64-pc-linux-gn$
uname output: Linux ubuntu 3.16.0-38-generic #52~14.04.1-Ubuntu SMP Fri May 8 09:43:57 UTC 2015 x86_64 x86_64 x86_64 GNU/Lin$
Machine Type: x86_64-pc-linux-gnu

Bash Version: 4.3
Patch Level: 11
Release Status: release

By gum, you're right! Then I get readonly warning on second declare, which is reasonable, and the function completes. The xtrace output is also interesting; implies declare without single quotes is really treated as two steps. Ready to become superstitious about always single-quoting the argument to declare . Hard to see how popping the function stack can be anything but a bug, though. – Ron Burk Jun 14 '15 at 23:58

Weird. Doesn't happen in bash 4.2.53(1). – choroba Jun 14 '15 at 7:22
I can reproduce this problem with bash version 4.3.11 (Ubuntu 14.04.1 LTS). It works fine with bash 4.2.8 (Ubuntu 11.04). – Cyrus Jun 14 '15 at 7:34
Maybe related: unix.stackexchange.com/q/56815/116972 I can get expected result with declare -r -g -A 'SHCFG_INIT=( [a]=b )' . – yaegashi Jun 14 '15 at 23:22

I found this thread in bug-bash@gnu.org related to test -v on an assoc array. In short, bash implicitly did test -v SHCFG_INIT[0] in your script. I'm not sure this behavior got introduced in 4.3.

You might want to use declare -p to workaround this...

if ! declare -p SHCFG_INIT >/dev/null 2>&1; then
    echo "looks like SHCFG_INIT not defined"
fi

Well, rats. I think your answer is correct, but also reveals I'm really asking two separate questions when I thought they were probably the same issue. Since the title better reflects what turns out to be the "other" question, I'll leave this up for a while and see if anybody knows what's up with the mysterious implicit function return... Thanks! – Ron Burk Jun 14 '15 at 17:01
Edited question to focus on the remaining issue. Thanks again for the answer on the "-v" issue with associative arrays. – Ron Burk Jun 14 '15 at 17:55
Accepting this answer. Complete answer is here plus your comments above plus (IMHO) there's a bug in this version of bash (can't see how there can be any excuse for popping the function stack without warning). Thanks for your excellent research on this! – Ron Burk Jun 21 '15 at 19:31

[Jul 26, 2017] Typing variables: declare or typeset

Jul 26, 2017 | www.tldp.org

The declare or typeset builtins , which are exact synonyms, permit modifying the properties of variables. This is a very weak form of the typing [1] available in certain programming languages. The declare command is specific to version 2 or later of Bash. The typeset command also works in ksh scripts.

declare/typeset options
-r readonly
( declare -r var1 works the same as readonly var1 )

This is the rough equivalent of the C const type qualifier. An attempt to change the value of a readonly variable fails with an error message.

declare -r var1=1
echo "var1 = $var1"   # var1 = 1

(( var1++ ))          # x.sh: line 4: var1: readonly variable
-i integer
declare -i number
# The script will treat subsequent occurrences of "number" as an integer.             

number=3
echo "Number = $number"     # Number = 3

number=three
echo "Number = $number"     # Number = 0
# Tries to evaluate the string "three" as an integer.

Certain arithmetic operations are permitted for declared integer variables without the need for expr or let .

n=6/3
echo "n = $n"       # n = 6/3

declare -i n
n=6/3
echo "n = $n"       # n = 2
-a array
declare -a indices

The variable indices will be treated as an array .

-f function(s)
declare -f

A declare -f line with no arguments in a script causes a listing of all the functions previously defined in that script.

declare -f function_name

A declare -f function_name in a script lists just the function named.

-x export
declare -x var3

This declares a variable as available for exporting outside the environment of the script itself.

-x var=$value
declare -x var3=373

The declare command permits assigning a value to a variable in the same statement as setting its properties.

Example 9-10. Using declare to type variables
#!/bin/bash

func1 ()
{
  echo This is a function.
}

declare -f        # Lists the function above.

echo

declare -i var1   # var1 is an integer.
var1=2367
echo "var1 declared as $var1"
var1=var1+1       # Integer declaration eliminates the need for 'let'.
echo "var1 incremented by 1 is $var1."
# Attempt to change variable declared as integer.
echo "Attempting to change var1 to floating point value, 2367.1."
var1=2367.1       # Results in error message, with no change to variable.
echo "var1 is still $var1"

echo

declare -r var2=13.36         # 'declare' permits setting a variable property
                              #+ and simultaneously assigning it a value.
echo "var2 declared as $var2" # Attempt to change readonly variable.
var2=13.37                    # Generates error message, and exit from script.

echo "var2 is still $var2"    # This line will not execute.

exit 0                        # Script will not exit here.
Caution: using the declare builtin inside a function restricts the scope of the variable to that function.
foo ()
{
FOO="bar"
}

bar ()
{
foo
echo $FOO
}

bar   # Prints bar.

However . . .

foo (){
declare FOO="bar"
}

bar ()
{
foo
echo $FOO
}

bar  # Prints nothing.


# Thank you, Michael Iatrou, for pointing this out.
9.2.1. Another use for declare

The declare command can be helpful in identifying variables, environmental or otherwise. This can be especially useful with arrays .

bash$ declare | grep HOME
HOME=/home/bozo

bash$ zzy=68
bash$ declare | grep zzy
zzy=68

bash$ Colors=([0]="purple" [1]="reddish-orange" [2]="light green")
bash$ echo ${Colors[@]}
purple reddish-orange light green

bash$ declare | grep Colors
Colors=([0]="purple" [1]="reddish-orange" [2]="light green")

Notes
[1] In this context, typing a variable means to classify it and restrict its properties. For example, a variable declared or typed as an integer is no longer available for string operations .
declare -i intvar

intvar=23
echo "$intvar"   # 23
intvar=stringval
echo "$intvar"   # 0

[Jul 25, 2017] Arrays in bash 4.x

Jul 25, 2017 | wiki.bash-hackers.org

Purpose An array is a parameter that holds mappings from keys to values. Arrays are used to store a collection of parameters into a parameter. Arrays (in any programming language) are a useful and common composite data structure, and one of the most important scripting features in Bash and other shells.

Here is an abstract representation of an array named NAMES . The indexes go from 0 to 3.

NAMES
 0: Peter
 1: Anna
 2: Greg
 3: Jan

Instead of using 4 separate variables, multiple related variables are grouped together into elements of the array, accessible by their key . If you want the second name, ask for index 1 of the array NAMES . Indexing Bash supports two different types of ksh-like one-dimensional arrays. Multidimensional arrays are not implemented .

Syntax Referencing To accommodate referring to array variables and their individual elements, Bash extends the parameter naming scheme with a subscript suffix. Any valid ordinary scalar parameter name is also a valid array name: [[:alpha:]_][[:alnum:]_]* . The parameter name may be followed by an optional subscript enclosed in square brackets to refer to a member of the array.

The overall syntax is arrname[subscript] - where for indexed arrays, subscript is any valid arithmetic expression, and for associative arrays, any nonempty string. Subscripts are first processed for parameter and arithmetic expansions, and command and process substitutions. When used within parameter expansions or as an argument to the unset builtin, the special subscripts * and @ are also accepted which act upon arrays analogously to the way the @ and * special parameters act upon the positional parameters. In parsing the subscript, bash ignores any text that follows the closing bracket up to the end of the parameter name.

With few exceptions, names of this form may be used anywhere ordinary parameter names are valid, such as within arithmetic expressions , parameter expansions , and as arguments to builtins that accept parameter names. An array is a Bash parameter that has been given the -a (for indexed) or -A (for associative) attributes . However, any regular (non-special or positional) parameter may be validly referenced using a subscript, because in most contexts, referring to the zeroth element of an array is synonymous with referring to the array name without a subscript.

# "x" is an ordinary non-array parameter.
$ x=hi; printf '%s ' "$x" "${x[0]}"; echo "${_[0]}"
hi hi hi

The only exceptions to this rule are in a few cases where the array variable's name refers to the array as a whole. This is the case for the unset builtin (see destruction ) and when declaring an array without assigning any values (see declaration ). Declaration The following explicitly give variables array attributes, making them arrays:

Syntax Description
ARRAY=() Declares an indexed array ARRAY and initializes it to be empty. This can also be used to empty an existing array.
ARRAY[0]= Generally sets the first element of an indexed array. If no array ARRAY existed before, it is created.
declare -a ARRAY Declares an indexed array ARRAY . An existing array is not initialized.
declare -A ARRAY Declares an associative array ARRAY . This is the one and only way to create associative arrays.
Storing values Storing values in arrays is quite as simple as storing values in normal variables.
Syntax Description
ARRAY[N]=VALUE Sets the element N of the indexed array ARRAY to VALUE . N can be any valid arithmetic expression
ARRAY[STRING]=VALUE Sets the element indexed by STRING of the associative array ARRAY .
ARRAY=VALUE As above. If no index is given, as a default the zeroth element is set to VALUE . Careful, this is even true of associative arrays - there is no error if no key is specified, and the value is assigned to string index "0".
ARRAY=(E1 E2 ) Compound array assignment - sets the whole array ARRAY to the given list of elements indexed sequentially starting at zero. The array is unset before assignment unless the += operator is used. When the list is empty ( ARRAY=() ), the array will be set to an empty array. This method obviously does not use explicit indexes. An associative array can not be set like that! Clearing an associative array using ARRAY=() works.
ARRAY=([X]=E1 [Y]=E2 ) Compound assignment for indexed arrays with index-value pairs declared individually (here for example X and Y ). X and Y are arithmetic expressions. This syntax can be combined with the above - elements declared without an explicitly specified index are assigned sequentially starting at either the last element with an explicit index, or zero.
ARRAY=([S1]=E1 [S2]=E2 ) Individual mass-setting for associative arrays . The named indexes (here: S1 and S2 ) are strings.
ARRAY+=(E1 E2 ) Append to ARRAY.

As of now, arrays can't be exported. Getting values For the details of expanding array values, see the article about parameter expansion and check the notes about arrays there.

Syntax Description
${ARRAY[N]} Expands to the value of the index N in the indexed array ARRAY . If N is a negative number, it's treated as the offset from the maximum assigned index (can't be used for assignment).
${ARRAY[S]} Expands to the value of the index S in the associative array ARRAY .
"${ARRAY[@]}"
${ARRAY[@]}
"${ARRAY[*]}"
${ARRAY[*]}
Similar to mass-expanding positional parameters , this expands to all elements. If unquoted, both subscripts * and @ expand to the same result, if quoted, @ expands to all elements individually quoted, * expands to all elements quoted as a whole.
"${ARRAY[@]:N:M}"
${ARRAY[@]:N:M}
"${ARRAY[*]:N:M}"
${ARRAY[*]:N:M}
Similar to what this syntax does for the characters of a single string when doing substring expansion , this expands to M elements starting with element N . This way you can mass-expand individual indexes. The rules for quoting and the subscripts * and @ are the same as above for the other mass-expansions.

For clarification: When you use the subscripts @ or * for mass-expanding, then the behaviour is exactly what it is for $@ and $* when mass-expanding the positional parameters . You should read this article to understand what's going on. Metadata

Syntax Description
${#ARRAY[N]} Expands to the length of an individual array member at index N ( stringlength )
${#ARRAY[STRING]} Expands to the length of an individual associative array member at index STRING ( stringlength )
${#ARRAY[@]}
${#ARRAY[*]}
Expands to the number of elements in ARRAY
${!ARRAY[@]}
${!ARRAY[*]}
Expands to the indexes in ARRAY since BASH 3.0
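A compact sketch exercising these metadata expansions:

arr=(alpha beta gamma)
unset -v 'arr[1]'
echo "${#arr[@]}"   # 2 - number of elements still set
echo "${!arr[@]}"   # 0 2 - the indexes actually in use
echo "${#arr[0]}"   # 5 - string length of the element at index 0
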
Destruction The unset builtin command is used to destroy (unset) arrays or individual elements of arrays.
Syntax Description
unset -v ARRAY
unset -v ARRAY[@]
unset -v ARRAY[*]
Destroys a complete array
unset -v ARRAY[N] Destroys the array element at index N
unset -v ARRAY[STRING] Destroys the array element of the associative array at index STRING

It is best to explicitly specify -v when unsetting variables with unset.

Specifying unquoted array elements as arguments to any command, such as with the syntax above, may cause pathname expansion to occur due to the presence of glob characters.

Example: You are in a directory with a file named x1 , and you want to destroy an array element x[1] , with

unset x[1]
then pathname expansion will expand x[1] to the filename x1 and break your processing!

Even worse, if nullglob is set, your array/index will disappear.

To avoid this, always quote the array name and index:

unset -v 'x[1]'

This applies generally to all commands which take variable names as arguments. Single quotes preferred.

Usage Numerical Index Numerical indexed arrays are easy to understand and easy to use. The Purpose and Indexing chapters above more or less explain all the needed background theory.

Now, some examples and comments for you.

Let's say we have an array sentence which is initialized as follows:

sentence=(Be liberal in what you accept, and conservative in what you send)

Since no special code is there to prevent word splitting (no quotes), every word there will be assigned to an individual array element. When you count the words you see, you should get 12. Now let's see if Bash has the same opinion:

$ echo ${#sentence[@]}
12

Yes, 12. Fine. You can take this number to walk through the array. Just subtract 1 from the number of elements, and start your walk at 0 (zero)

((n_elements=${#sentence[@]}, max_index=n_elements - 1))

for ((i = 0; i <= max_index; i++)); do
  echo "Element $i: '${sentence[i]}'"
done
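
If some elements may have been unset (the "sparse array" case discussed below), iterating over the actual indexes is more robust; a minimal sketch:

for i in "${!sentence[@]}"; do
  echo "Element $i: '${sentence[i]}'"
done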

You always have to remember that; it seems newbies have problems with it sometimes. Please understand that numerical array indexing begins at 0 (zero).

The method above, walking through an array by just knowing its number of elements, only works for arrays where all elements are set, of course. If one element in the middle is removed, then the calculation is nonsense, because the number of elements doesn't correspond to the highest used index anymore (we call them " sparse arrays "); the index-based loop shown above handles that case. Associative (Bash 4) Associative arrays (or hash tables ) are not much more complicated than numerical indexed arrays. The numerical index value (in Bash a number starting at zero) just is replaced with an arbitrary string:

# declare -A, introduced with Bash 4 to declare an associative array
declare -A sentence

sentence[Begin]='Be liberal in what'
sentence[Middle]='you accept, and conservative'
sentence[End]='in what you send'
sentence['Very end']=...

Beware: don't rely on the fact that the elements are ordered in memory like they were declared, it could look like this:

# output from 'set' command
sentence=([End]="in what you send" [Middle]="you accept, and conservative " [Begin]="Be liberal in what " ["Very end"]="...")
This effectively means, you can get the data back with "${sentence[@]}" , of course (just like with numerical indexing), but you can't rely on a specific order. If you want to store ordered data, or re-order data, go with numerical indexes. For associative arrays, you usually query known index values:
for element in Begin Middle End "Very end"; do
    printf "%s" "${sentence[$element]}"
done
printf "\n"

A nice code example: Checking for duplicate files using an associative array indexed with the SHA sum of the files:

# Thanks to Tramp in #bash for the idea and the code

unset flist; declare -A flist;
while read -r sum fname; do 
    if [[ ${flist[$sum]} ]]; then
        printf 'rm -- "%s" # Same as >%s<\n' "$fname" "${flist[$sum]}" 
    else
        flist[$sum]="$fname"
    fi
done <  <(find . -type f -exec sha256sum {} +)  >rmdups

Integer arrays Any type attributes applied to an array apply to all elements of the array. If the integer attribute is set for either indexed or associative arrays, then values are considered as arithmetic for both compound and ordinary assignment, and the += operator is modified in the same way as for ordinary integer variables.

 ~ $ ( declare -ia 'a=(2+4 [2]=2+2 [a[2]]="a[2]")' 'a+=(42 [a[4]]+=3)'; declare -p a )
declare -ai a='([0]="6" [2]="4" [4]="7" [5]="42")'

a[0] is assigned the result of 2+4 . a[2] gets the result of 2+2 . The last index in the first assignment is the result of a[2] , which has already been assigned as 4 , and its value ( "a[2]" evaluated arithmetically) is also 4 , so a[4] is set to 4 .

This shows that even though any existing arrays named a in the current scope have already been unset by using = instead of += to the compound assignment, arithmetic variables within keys can self-reference any elements already assigned within the same compound-assignment. With integer arrays this also applies to expressions to the right of the = . (See evaluation order , the right side of an arithmetic assignment is typically evaluated first in Bash.)

The second compound assignment argument to declare uses += , so it appends after the last element of the existing array rather than deleting it and creating a new array, so a[5] gets 42 .

Lastly, the element whose index is the value of a[4] ( 4 ), gets 3 added to its existing value, making a[4] == 7 . Note that having the integer attribute set this time causes += to add, rather than append a string, as it would for a non-integer array.

The single quotes force the assignments to be evaluated in the environment of declare . This is important because attributes are only applied to the assignment after assignment arguments are processed. Without them the += compound assignment would have been invalid, and strings would have been inserted into the integer array without evaluating the arithmetic. A special-case of this is shown in the next section.

(Assignment arguments to declare are processed similarly to eval , but there are differences.) Todo: discuss this in detail.

Indirection Arrays can be expanded indirectly using the indirect parameter expansion syntax. Parameters whose values are of the form: name[index] , name[@] , or name[*] when expanded indirectly produce the expected results. This is mainly useful for passing arrays (especially multiple arrays) by name to a function.

This example is an "isSubset"-like predicate which returns true if all key-value pairs of the array given as the first argument to isSubset correspond to a key-value of the array given as the second argument. It demonstrates both indirect array expansion and indirect key-passing without eval using the aforementioned special compound assignment expansion.

isSubset() {
    local -a 'xkeys=("${!'"$1"'[@]}")' 'ykeys=("${!'"$2"'[@]}")'
    set -- "${@/%/[key]}"

    (( ${#xkeys[@]} <= ${#ykeys[@]} )) || return 1

    local key
    for key in "${xkeys[@]}"; do
        [[ ${!2+_} && ${!1} == ${!2} ]] || return 1
    done
}

main() {
    # "a" is a subset of "b"
    local -a 'a=({0..5})' 'b=({0..10})'
    isSubset a b
    echo $? # true

    # "a" contains a key not in "b"
    local -a 'a=([5]=5 {6..11})' 'b=({0..10})'
    isSubset a b
    echo $? # false

    # "a" contains an element whose value != the corresponding member of "b"
    local -a 'a=([5]=5 6 8 9 10)' 'b=({0..10})'
    isSubset a b
    echo $? # false
}

main

This script is one way of implementing a crude multidimensional associative array by storing array definitions in an array and referencing them through indirection. The script takes two keys and dynamically calls a function whose name is resolved from the array.

callFuncs() {
    # Set up indirect references as positional parameters to minimize local name collisions.
    set -- "${@:1:3}" ${2+'a["$1"]' "$1"'["$2"]'}

    # The only way to test for set but null parameters is unfortunately to test each individually.
    local x
    for x; do
        [[ $x ]] || return 0
    done

    local -A a=(
        [foo]='([r]=f [s]=g [t]=h)'
        [bar]='([u]=i [v]=j [w]=k)'
        [baz]='([x]=l [y]=m [z]=n)'
        ) ${4+${a["$1"]+"${1}=${!3}"}} # For example, if "$1" is "bar" then define a new array: bar=([u]=i [v]=j [w]=k)

    ${4+${a["$1"]+"${!4-:}"}} # Now just lookup the new array. for inputs: "bar" "v", the function named "j" will be called, which prints "j" to stdout.
}

main() {
    # Define functions named {f..n} which just print their own names.
    local fun='() { echo "$FUNCNAME"; }' x

    for x in {f..n}; do
        eval "${x}${fun}"
    done

    callFuncs "$@"
}

main "$@"

Bugs and Portability Considerations

Bugs Evaluation order Here are some of the nasty details of array assignment evaluation order. You can use this testcase code to generate these results.
Each testcase prints evaluation order for indexed array assignment
contexts. Each context is tested for expansions (represented by digits) and
arithmetic (letters), ordered from left to right within the expression. The
output corresponds to the way evaluation is re-ordered for each shell:

a[ $1 a ]=${b[ $2 b ]:=${c[ $3 c ]}}               No attributes
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia a
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia b
a[ $1 a ]=${b[ $2 b ]:=c[ $3 c ]}                  typeset -ia a b
(( a[ $1 a ] = b[ $2 b ] ${c[ $3 c ]} ))           No attributes
(( a[ $1 a ] = ${b[ $2 b ]:=c[ $3 c ]} ))          typeset -ia b
a+=( [ $1 a ]=${b[ $2 b ]:=${c[ $3 c ]}} [ $4 d ]=$(( $5 e )) ) typeset -a a
a+=( [ $1 a ]=${b[ $2 b ]:=c[ $3 c ]} [ $4 d ]=${5}e ) typeset -ia a

bash: 4.2.42(1)-release
2 b 3 c 2 b 1 a
2 b 3 2 b 1 a c
2 b 3 2 b c 1 a
2 b 3 2 b c 1 a c
1 2 3 c b a
1 2 b 3 2 b c c a
1 2 b 3 c 2 b 4 5 e a d
1 2 b 3 2 b 4 5 a c d e

ksh93: Version AJM 93v- 2013-02-22
1 2 b b a
1 2 b b a
1 2 b b a
1 2 b b a
1 2 3 c b a
1 2 b b a
1 2 b b a 4 5 e d
1 2 b b a 4 5 d e

mksh: @(#)MIRBSD KSH R44 2013/02/24
2 b 3 c 1 a
2 b 3 1 a c
2 b 3 c 1 a
2 b 3 c 1 a
1 2 3 c a b
1 2 b 3 c a
1 2 b 3 c 4 5 e a d
1 2 b 3 4 5 a c d e

zsh: 5.0.2
2 b 3 c 2 b 1 a
2 b 3 2 b 1 a c
2 b 1 a
2 b 1 a
1 2 3 c b a
1 2 b a
1 2 b 3 c 2 b 4 5 e
1 2 b 3 2 b 4 5


[Jul 25, 2017] Local variables

Notable quotes:
"... completely local and separate ..."
Jul 25, 2017 | wiki.bash-hackers.org

local to a function:

myfunc() {
  local var=VALUE

  # alternative, only when used INSIDE a function
  declare var=VALUE

  ...
}

The local keyword (or declaring a variable using the declare command) tags a variable to be treated completely local and separate inside the function where it was declared:

foo=external

printvalue() {
  local foo=internal
  echo $foo
}

# this will print "external"
echo $foo

# this will print "internal"
printvalue

# this will print - again - "external"
echo $foo

[Jul 25, 2017] Environment variables

Notable quotes:
"... environment variables ..."
"... including the environment variables ..."
Jul 25, 2017 | wiki.bash-hackers.org

The environment space is not directly related to the topic about scope, but it's worth mentioning.

Every UNIX® process has a so-called environment . Other items, in addition to variables, are saved there, the so-called environment variables . When a child process is created (in Bash e.g. by simply executing another program, say ls to list files), the whole environment including the environment variables is copied to the new process. Reading that from the other side means: Only variables that are part of the environment are available in the child process.

A variable can be tagged to be part of the environment using the export command:

# create a new variable and set it:
# -> This is a normal shell variable, not an environment variable!
myvariable="Hello world."

# make the variable visible to all child processes:
# -> Make it an environment variable: "export" it
export myvariable

Remember that the exported variable is a copy . There is no provision to "copy it back to the parent." See the article about Bash in the process tree !
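
A minimal sketch illustrating that the child only gets a copy:

export myvariable="Hello world."
bash -c 'myvariable="changed in child"'  # the child modifies its own copy
echo "$myvariable"                       # still prints: Hello world.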


1) under specific circumstances, also by the shell itself

[Jul 25, 2017] Block commenting

Jul 25, 2017 | wiki.bash-hackers.org

: (colon) and input redirection. The : does nothing, it's a pseudo command, so it does not care about standard input. In the following code example, you want to test mail and logging, but not dump the database, or execute a shutdown:

#!/bin/bash
# Write info mails, do some tasks and bring down the system in a safe way
echo "System halt requested"
mail -s "System halt" netadmin@example.com
logger -t SYSHALT "System halt requested"

##### The following "code block" is effectively ignored
: <<"SOMEWORD"
/etc/init.d/mydatabase clean_stop
mydatabase_dump /var/db/db1 /mnt/fsrv0/backups/db1
logger -t SYSHALT "System halt: pre-shutdown actions done, now shutting down the system"
shutdown -h NOW
SOMEWORD
##### The ignored codeblock ends here
What happened? The : pseudo command was given some input by redirection (a here-document); the pseudo command didn't care about it, so effectively the entire block was ignored.

The here-document tag was quoted here to avoid substitutions in the "commented" text! Check redirection with here-documents for more.


[Jul 25, 2017] Keeping persistent history in bash

Jul 25, 2017 | eli.thegreenplace.net

June 11, 2013 at 19:27

Update (Jan 26, 2016): I posted a short update about my usage of persistent history.

For someone spending most of his time in front of a Linux terminal, history is very important. But traditional bash history has a number of limitations, especially when multiple terminals are involved (I sometimes have dozens open). Also it's not very good at preserving just the history you're interested in across reboots.

There are many approaches to improve the situation; here I want to discuss one I've been using very successfully in the past few months - a simple "persistent history" that keeps track of history across terminal instances, saving it into a dot-file in my home directory ( ~/.persistent_history ). All commands, from all terminal instances, are saved there, forever. I found this tremendously useful in my work - it saves me time almost every day.

Why does it go into a separate history and not the main one which is accessible by all the existing history manipulation tools? Because IMHO the latter is still worthwhile to be kept separate for the simple need of bringing up recent commands in a single terminal, without mixing up commands from other terminals. While the terminal is open, I want to press "Up" and get the previous command, even if I've executed a 1000 other commands in other terminal instances in the meantime.

Persistent history is very easy to set up. Here's the relevant portion of my ~/.bashrc :

log_bash_persistent_history()
{
  [[
    $(history 1) =~ ^\ *[0-9]+\ +([^\ ]+\ [^\ ]+)\ +(.*)$
  ]]
  local date_part="${BASH_REMATCH[1]}"
  local command_part="${BASH_REMATCH[2]}"
  if [ "$command_part" != "$PERSISTENT_HISTORY_LAST" ]
  then
    echo $date_part "|" "$command_part" >> ~/.persistent_history
    export PERSISTENT_HISTORY_LAST="$command_part"
  fi
}

# Stuff to do on PROMPT_COMMAND
run_on_prompt_command()
{
    log_bash_persistent_history
}

PROMPT_COMMAND="run_on_prompt_command"

The format of the history file created by this is:

2013-06-09 17:48:11 | cat ~/.persistent_history
2013-06-09 17:49:17 | vi /home/eliben/.bashrc
2013-06-09 17:49:23 | ls

Note that an environment variable is used to avoid useless duplication (i.e. if I run ls twenty times in a row, it will only be recorded once).

OK, so we have ~/.persistent_history , how do we use it? First, I should say that it's not used very often, which kind of connects to the point I made earlier about separating it from the much higher-use regular command history. Sometimes I just look into the file with vi or tail , but mostly this alias does the trick for me:

alias phgrep='cat ~/.persistent_history|grep --color'

The alias name mirrors another alias I've been using for ages:

alias hgrep='history|grep --color'

Another tool for managing persistent history is a trimmer. I said earlier this file keeps the history "forever", which is a scary word - what if it grows too large? Well, first of all - worry not. At work my history file grew to about 2 MB after 3 months of heavy usage, and 2 MB is pretty small these days. Appending to the end of a file is very, very quick (I'm pretty sure it's a constant-time operation) so the size doesn't matter much. But trimming is easy:

tail -20000 ~/.persistent_history | tee ~/.persistent_history

Trims to the last 20000 lines. (Note that piping a file back into itself this way can be racy for large files, since tee truncates the file while tail may still be reading it; writing to a temporary file and renaming it, or moreutils' sponge , avoids that.) This should be sufficient for at least a couple of months of history, and your workflow should not really rely on more than that :-)

Finally, what's the use of having a tool like this without employing it to collect some useless statistics. Here's a histogram of the 15 most common commands I've used on my home machine's terminal over the past 3 months:

ls        : 865
vi        : 863
hg        : 741
cd        : 512
ll        : 289
pss       : 245
hst       : 200
python    : 168
make      : 167
git       : 148
time      : 94
python3   : 88
./python  : 88
hpu       : 82
cat       : 80

Some explanation: hst is an alias for hg st . hpu is an alias for hg pull -u . pss is my awesome pss tool , and is the reason why you don't see any calls to grep and find in the list. The proportion of Mercurial vs. git commands is likely to change in the very near future.


[Mar 13, 2017] 6.3 Arrays

Notable quotes:
"... aname val1 val2 val3 ..."
"... aname ..."
"... type ..."
"... ad hoc ..."
Mar 13, 2017

So far we have seen two types of variables: character strings and integers. The third type of variable the Korn shell supports is an array . As you may know, an array is like a list of things; you can refer to specific elements in an array with integer indices , so that a[i] refers to the i th element of array a .

The Korn shell provides an array facility that, while useful, is much more limited than analogous features in conventional programming languages. In particular, arrays can be only one-dimensional (i.e., no arrays of arrays), and they are limited to 1024 elements. Indices can start at 0.

There are two ways to assign values to elements of an array. The first is the most intuitive: you can use the standard shell variable assignment syntax with the array index in brackets ( [] ). For example:

nicknames[2]=bob
nicknames[3]=ed

puts the values bob and ed into the elements of the array nicknames with indices 2 and 3, respectively. As with regular shell variables, values assigned to array elements are treated as character strings unless the assignment is preceded by let .

The second way to assign values to an array is with a variant of the set statement, which we saw in Chapter 3, Customizing Your Environment . The statement:

set -A aname val1 val2 val3 ...

creates the array aname (if it doesn't already exist) and assigns val1 to aname[0] , val2 to aname[1] , etc. As you would guess, this is more convenient for loading up an array with an initial set of values.

To extract a value from an array, use the syntax ${aname[i]} . For example, ${nicknames[2]} has the value "bob". The index i can be an arithmetic expression (see above). If you use * in place of the index, the value will be all elements, separated by spaces. Omitting the index is the same as specifying index 0.

Now we come to the somewhat unusual aspect of Korn shell arrays. Assume that the only values assigned to nicknames are the two we saw above. If you type print "${nicknames[*]}" , you will see the output:

bob ed

In other words, nicknames[0] and nicknames[1] don't exist. Furthermore, if you were to type:

nicknames[9]=pete
nicknames[31]=ralph

and then type print "${nicknames[*]}" , the output would look like this:

bob ed pete ralph

This is why we said "the elements of nicknames with indices 2 and 3" earlier, instead of "the 2nd and 3rd elements of nicknames ". Any array elements with unassigned values just don't exist; if you try to access their values, you will get null strings.

You can preserve whatever whitespace you put in your array elements by using "${aname[@]}" (with the double quotes) instead of "${aname[*]}" , just as you can with "$@" instead of $* .

The shell provides an operator that tells you how many elements an array has defined: ${#aname[*]} . Thus ${#nicknames[*]} has the value 4. Note that you need the [*] because the name of the array alone is interpreted as the 0th element. This means, for example, that ${#nicknames} equals the length of nicknames[0] (see Chapter 4 ). Since nicknames[0] doesn't exist, the value of ${#nicknames} is 0, the length of the null string.

To be quite frank, we feel that the Korn shell's array facility is of little use to shell programmers. This is partially because it is so limited, but mainly because shell programming tasks are much more often oriented toward character strings and text than toward numbers. If you think of an array as a mapping from integers to values (i.e., put in a number, get out a value), then you can see why arrays are "number-dominated" data structures.

Nevertheless, we can find useful things to do with arrays. For example, here is a cleaner solution to Task 5-4, in which a user can select his or her terminal type ( TERM environment variable) at login time. Recall that the "user-friendly" version of this code used select and a case statement:

print 'Select your terminal type:'
PS3='terminal? '
select term in \
    'Givalt GL35a' \
    'Tsoris T-2000' \
    'Shande 531' \
    'Vey VT99'
do
    case $REPLY in
        1 ) TERM=gl35a ;;
        2 ) TERM=t2000 ;;
        3 ) TERM=s531 ;;
        4 ) TERM=vt99 ;;
        * ) print "invalid." ;;
    esac
    if [[ -n $term ]]; then
        print "TERM is $TERM"
        break
    fi
done

We can eliminate the entire case construct by taking advantage of the fact that the select construct stores the user's number choice in the variable REPLY . We just need a line of code that stores all of the possibilities for TERM in an array, in an order that corresponds to the items in the select menu. Then we can use $REPLY to index the array. The resulting code is:

set -A termnames gl35a t2000 s531 vt99
print 'Select your terminal type:'
PS3='terminal? '
select term in \
    'Givalt GL35a' \
    'Tsoris T-2000' \
    'Shande 531' \
    'Vey VT99'
do
    if [[ -n $term ]]; then
        TERM=${termnames[REPLY-1]}
        print "TERM is $TERM"
        break
    fi
done

This code sets up the array termnames so that ${termnames[0]} is "gl35a", ${termnames[1]} is "t2000", etc. The line TERM=${termnames[REPLY-1]} essentially replaces the entire case construct by using REPLY to index the array.

Notice that the shell knows to interpret the text in an array index as an arithmetic expression, as if it were enclosed in (( and )), which in turn means that a variable need not be preceded by a dollar sign ($). We have to subtract 1 from the value of REPLY because array indices start at 0, while select menu item numbers start at 1.

6.3.1 typeset

The final Korn shell feature that relates to the kinds of values that variables can hold is the typeset command. If you are a programmer, you might guess that typeset is used to specify the type of a variable (integer, string, etc.); you'd be partially right.

typeset is a rather ad hoc collection of things that you can do to variables that restrict the kinds of values they can take. Operations are specified by options to typeset ; the basic syntax is:

typeset -o varname[=value]

Options can be combined; multiple varname s can be used. If you leave out varname , the shell prints a list of variables for which the given option is turned on.

The options available break down into two basic categories:

  1. String formatting operations, such as right- and left-justification, truncation, and letter case control.

  2. Type and attribute functions that are of primary interest to advanced programmers.
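For example, here is a small sketch of a few commonly used typeset options (the exact set varies slightly between Korn shell versions):

typeset -i num=10        # integer variable
typeset -u greet=hello   # map letters to uppercase: $greet is HELLO
typeset -l shout=BYE     # map letters to lowercase: $shout is bye
typeset -L3 code=zebra   # left-justify and truncate to width 3: $code is "zeb"
typeset -R5 pad=42       # right-justify within a field of width 5: $pad is "   42"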

6.3.2 Local Variables in Functions

typeset without options has an important meaning: if a typeset statement is inside a function definition, then the variables involved all become local to that function (in addition to any properties they may take on as a result of typeset options). The ability to define variables that are local to "subprogram" units (procedures, functions, subroutines, etc.) is necessary for writing large programs, because it helps keep subprograms independent of the main program and of each other.

If you just want to declare a variable local to a function, use typeset without any options. For example:

function afunc {
    typeset diffvar
    samevar=funcvalue
    diffvar=funcvalue
    print "samevar is $samevar"
    print "diffvar is $diffvar"
}

samevar=globvalue
diffvar=globvalue
print "samevar is $samevar"
print "diffvar is $diffvar"
afunc
print "samevar is $samevar"
print "diffvar is $diffvar"

This code will print the following:

samevar is globvalue
diffvar is globvalue
samevar is funcvalue
diffvar is funcvalue
samevar is funcvalue
diffvar is globvalue

Figure 6.1 shows this graphically.

Figure 6.1: Local variables in functions

[Mar 13, 2017] Learning the Korn shell: Chapter 6 Integer Variables and Arithmetic

Mar 13, 2017 | docstore.mik.ua
6.2 Integer Variables and Arithmetic

The expression $(($OPTIND - 1)) in the last example gives a clue as to how the shell can do integer arithmetic. As you might guess, the shell interprets words surrounded by $(( and )) as arithmetic expressions. Variables in arithmetic expressions do not need to be preceded by dollar signs, though it is not wrong to do so.

Arithmetic expressions are evaluated inside double quotes, like tildes, variables, and command substitutions. We're finally in a position to state the definitive rule about quoting strings: When in doubt, enclose a string in single quotes, unless it contains tildes or any expression involving a dollar sign, in which case you should use double quotes.

The date(1) command on System V-derived versions of UNIX accepts arguments that tell it how to format its output. The argument +%j tells it to print the day of the year, i.e., the number of days since December 31 of the previous year.

We can use +%j to print a little holiday anticipation message:

print "Only $(( (365-$(date +%j)) / 7 )) weeks until the New Year!"

We'll show where this fits in the overall scheme of command-line processing in Chapter 7, Input/Output and Command-line Processing .

The arithmetic expression feature is built in to the Korn shell's syntax, and was available in the Bourne shell (most versions) only through the external command expr (1). Thus it is yet another example of a desirable feature provided by an external command (i.e., a syntactic kludge) being better integrated into the shell. [[ / ]] and getopts are also examples of this design trend.
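To see the difference, compare the old external-command idiom with the built-in syntax (a minimal sketch):

# Bourne shell style: arithmetic through the external expr(1) command
x=`expr 2 + 3`       # forks a separate process

# Korn shell style: built-in arithmetic, no extra process
x=$((2 + 3))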

Korn shell arithmetic expressions are equivalent to their counterparts in the C language. [5] Precedence and associativity are the same as in C. Table 6.2 shows the arithmetic operators that are supported. Although some of these are (or contain) special characters, there is no need to backslash-escape them, because they are within the $(( ... )) syntax.

[5] The assignment forms of these operators are also permitted. For example, $((x += 2)) adds 2 to x and stores the result back in x .

Table 6.2: Arithmetic Operators
Operator Meaning
+ Plus
- Minus
* Times
/ Division (with truncation)
% Remainder
<< Bit-shift left
>> Bit-shift right
& Bitwise and
| Bitwise or
~ Bitwise not
^ Bitwise exclusive or

Parentheses can be used to group subexpressions. The arithmetic expression syntax also (like C) supports relational operators as "truth values" of 1 for true and 0 for false. Table 6.3 shows the relational operators and the logical operators that can be used to combine relational expressions.

Table 6.3: Relational Operators
Operator Meaning
< Less than
> Greater than
<= Less than or equal
>= Greater than or equal
== Equal
!= Not equal
&& Logical and
|| Logical or

For example, $((3 > 2)) has the value 1; $(( (3 > 2) || (4 <= 1) )) also has the value 1, since at least one of the two subexpressions is true.

The shell also supports base N numbers, where N can be up to 36. The notation B#N means "N base B". Of course, if you omit the B#, the base defaults to 10.
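For example, here are a few base-N literals inside arithmetic expressions (a quick sketch):

print $((2#1010))    # 10: 1010 in base 2
print $((8#777))     # 511: 777 in base 8
print $((16#ff))     # 255: ff in base 16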

6.2.1 Arithmetic Conditionals

Another construct, closely related to $((...)) , is ((...)) (without the leading dollar sign). We use this for evaluating arithmetic condition tests, just as [[...]] is used for string, file attribute, and other types of tests.

((...)) evaluates relational operators differently from $((...)) so that you can use it in if and while constructs. Instead of producing a textual result, it just sets its exit status according to the truth of the expression: 0 if true, 1 otherwise. So, for example, ((3 > 2)) produces exit status 0, as does (( (3 > 2) || (4 <= 1) )) , but (( (3 > 2) && (4 <= 1) )) has exit status 1 since the second subexpression isn't true.

You can also use numerical values for truth values within this construct. It's like the analogous concept in C, which means that it's somewhat counterintuitive to non-C programmers: a value of 0 means false (i.e., returns exit status 1), and a non-0 value means true (returns exit status 0), e.g., (( 14 )) is true. See the code for the kshdb debugger in Chapter 9 for two more examples of this.
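For instance, here is a minimal sketch of ((...)) used as a condition in if and while:

x=14
if (( x % 2 == 0 )); then
    print "$x is even"
fi

while (( x > 0 )); do
    let x=x-5
done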

6.2.2 Arithmetic Variables and Assignment

The (( ... )) construct can also be used to define integer variables and assign values to them. The statement:

(( intvar=expression ))

creates the integer variable intvar (if it doesn't already exist) and assigns to it the result of expression .

That syntax isn't intuitive, so the shell provides a better equivalent: the built-in command let . The syntax is:

let intvar=expression 

It is not necessary (because it's actually redundant) to surround the expression with $(( and )) in a let statement. As with any variable assignment, there must not be any space on either side of the equal sign ( = ). It is good practice to surround expressions with quotes, since many characters are treated as special by the shell (e.g., * , # , and parentheses); furthermore, you must quote expressions that include whitespace (spaces or TABs). See Table 6.4 for examples.

Table 6.4: Sample Integer Expression Assignments

Assignment (let x=...)   Value of $x
1+4                      5
'1 + 4'                  5
'(2+3) * 5'              25
'2 + 3 * 5'              17
'17 / 3'                 5
'17 % 3'                 2
'1<<4'                   16
'48>>3'                  6
'17 & 3'                 1
'17 | 3'                 19
'17 ^ 3'                 18

Here is a small task that makes use of integer arithmetic.

Task 6.1

Write a script called pages that, given the name of a text file, tells how many pages of output it contains. Assume that there are 66 lines to a page but provide an option allowing the user to override that.

We'll make our option -N, a la head. The syntax for this single option is so simple that we need not bother with getopts. Here is the code:

if [[ $1 = -+([0-9]) ]]; then
    let page_lines=${1#-}
    shift
else
    let page_lines=66
fi
let file_lines="$(wc -l < $1)"

let pages=file_lines/page_lines
if (( file_lines % page_lines > 0 )); then
    let pages=pages+1
fi

print "$1 has $pages pages of text."

Notice that we use the integer conditional (( file_lines % page_lines > 0 )) rather than the [[ ... ]] form.

At the heart of this code is the UNIX utility wc(1) , which counts the number of lines, words, and characters (bytes) in its input. By default, its output looks something like this:

8      34     161  bob

wc 's output means that the file bob has 8 lines, 34 words, and 161 characters. wc recognizes the options -l , -w , and -c , which tell it to print only the number of lines, words, or characters, respectively.

wc normally prints the name of its input file (given as argument). Since we want only the number of lines, we have to do two things. First, we give it input from file redirection instead, as in wc -l < bob instead of wc -l bob . This produces the number of lines preceded by a single space (which would normally separate the filename from the number).

Unfortunately, that space complicates matters: the statement let file_lines=$(wc -l < $1) becomes "let file_lines= N" after command substitution; the space after the equal sign is an error. That leads to the second modification, the quotes around the command substitution expression. The statement let file_lines=" N" is perfectly legal, and let knows how to remove the leading space.

The first if clause in the pages script checks for an option and, if it was given, strips the dash ( - ) off and assigns it to the variable page_lines . wc in the command substitution expression returns the number of lines in the file whose name is given as argument.

The next group of lines calculates the number of pages and, if there is a remainder after the division, adds 1. Finally, the appropriate message is printed.
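Assuming the script is saved as pages and given a hypothetical report.txt of 781 lines, a session might look like this:

$ pages report.txt
report.txt has 12 pages of text.
$ pages -60 report.txt
report.txt has 14 pages of text.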

As a bigger example of integer arithmetic, we will complete our emulation of the C shell's pushd and popd functions (Task 4-8). Remember that these functions operate on DIRSTACK, a stack of directories represented as a string with the directory names separated by spaces. The C shell's pushd and popd take additional types of arguments, which are:

  1. pushd +n rotates the nth directory in the stack to the top (and changes to that directory).

  2. pushd without arguments, instead of complaining, swaps the two top directories on the stack.

  3. popd +n deletes the nth directory from the stack.

The most useful of these features is the ability to get at the nth directory in the stack. Here are the latest versions of both functions:

function pushd { # push current directory onto stack
    dirname=$1
    if [[ -d $dirname && -x $dirname ]]; then
        cd $dirname
        DIRSTACK="$dirname ${DIRSTACK:-$PWD}"
        print "$DIRSTACK"
    else
        print "still in $PWD."
    fi
}

function popd {  # pop directory off the stack, cd to new top
    if [[ -n $DIRSTACK ]]; then
        DIRSTACK=${DIRSTACK#* }
        cd ${DIRSTACK%% *}
        print "$PWD"
    else
        print "stack empty, still in $PWD."
    fi
}

To get at the n th directory, we use a while loop that transfers the top directory to a temporary copy of the stack n times. We'll put the loop into a function called getNdirs that looks like this:

function getNdirs {
    stackfront=''
    let count=0
    while (( count < $1 )); do
        stackfront="$stackfront ${DIRSTACK%% *}"
        DIRSTACK=${DIRSTACK#* }
        let count=count+1
    done
}

The argument passed to getNdirs is the n in question. The variable stackfront is the temporary copy that will contain the first n directories when the loop is done. stackfront starts as null; count , which counts the number of loop iterations, starts as 0.

The first line of the loop body appends the top of the stack (${DIRSTACK%% *}) to stackfront; the second line deletes the top from the stack. The last line increments the counter for the next iteration. The entire loop executes N times, for values of count from 0 to N-1.

When the loop finishes, the last directory in $stackfront is the Nth directory. The expression ${stackfront##* } extracts this directory. Furthermore, DIRSTACK now contains the "back" of the stack, i.e., the stack without the first n directories. With this in mind, we can now write the code for the improved versions of pushd and popd:

function pushd {
    if [[ $1 = ++([0-9]) ]]; then
        # case of pushd +n: rotate n-th directory to top
        let num=${1#+}
        getNdirs $num

        newtop=${stackfront##* }
        stackfront=${stackfront%$newtop}

        DIRSTACK="$newtop $stackfront $DIRSTACK"
        cd $newtop

    elif [[ -z $1 ]]; then
        # case of pushd without args; swap top two directories
        firstdir=${DIRSTACK%% *}
        DIRSTACK=${DIRSTACK#* }
        seconddir=${DIRSTACK%% *}
        DIRSTACK=${DIRSTACK#* } 
        DIRSTACK="$seconddir $firstdir $DIRSTACK"
        cd $seconddir

    else
        # normal case of pushd dirname
        dirname=$1
        if [[ -d $dirname && -x $dirname ]]; then
            cd $dirname
            DIRSTACK="$dirname ${DIRSTACK:-$PWD}"
            print "$DIRSTACK"
        else
            print "still in $PWD."
        fi
    fi
}

function popd {      # pop directory off the stack, cd to new top
    if [[ $1 = ++([0-9]) ]]; then
        # case of popd +n: delete n-th directory from stack
        let num=${1#+}
        getNdirs $num
        stackfront=${stackfront% *}
        DIRSTACK="$stackfront $DIRSTACK"

    else
        # normal case of popd without argument
        if [[ -n $DIRSTACK ]]; then
            DIRSTACK=${DIRSTACK#* }
            cd ${DIRSTACK%% *}
            print "$PWD"
        else
            print "stack empty, still in $PWD."
        fi
    fi
}

These functions have grown rather large; let's look at them in turn. The if at the beginning of pushd checks if the first argument is an option of the form + N . If so, the first body of code is run. The first let simply strips the plus sign (+) from the argument and assigns the result - as an integer - to the variable num . This, in turn, is passed to the getNdirs function.

The next two assignment statements set newtop to the N th directory - i.e., the last directory in $stackfront - and delete that directory from stackfront . The final two lines in this part of pushd put the stack back together again in the appropriate order and cd to the new top directory.

The elif clause tests for no argument, in which case pushd should swap the top two directories on the stack. The first four lines of this clause assign the top two directories to firstdir and seconddir , and delete these from the stack. Then, as above, the code puts the stack back together in the new order and cd s to the new top directory.

The else clause corresponds to the usual case, where the user supplies a directory name as argument.

popd works similarly. The if clause checks for the + N option, which in this case means delete the N th directory. A let extracts the N as an integer; the getNdirs function puts the first n directories into stackfront . Then the line stackfront=${stackfront% *} deletes the last directory (the N th directory) from stackfront . Finally, the stack is put back together with the N th directory missing.

The else clause covers the usual case, where the user doesn't supply an argument.

Before we leave this subject, here are a few exercises that should test your understanding of this code:

  1. Add code to pushd that exits with an error message if the user supplies no argument and the stack contains fewer than two directories.

  2. Verify that when the user specifies + N and N exceeds the number of directories in the stack, both pushd and popd use the last directory as the N th directory.

  3. Modify the getNdirs function so that it checks for the above condition and exits with an appropriate error message if true.

  4. Change getNdirs so that it uses cut (with command substitution), instead of the while loop, to extract the first N directories. This uses less code but runs more slowly because of the extra processes generated.

[Feb 14, 2017] Ms Dos style aliases for linux

I think alias ipconfig='ifconfig' is really useful for people who work with Linux from a Windows PC desktop/laptop.
Feb 14, 2017 | bash.cyberciti.biz
# MS-DOS / XP cmd like stuff
   alias edit=$VISUAL
   alias copy='cp'
   alias cls='clear'
   alias del='rm'
   alias dir='ls'
   alias md='mkdir'
   alias move='mv'
   alias rd='rmdir'
   alias ren='mv'
   alias ipconfig='ifconfig'

[Feb 04, 2017] Quickly find differences between two directories

You will be surprised, but GNU diff as used in Linux understands the situation when both arguments are directories and behaves accordingly
Feb 04, 2017 | www.cyberciti.biz

The diff command compares files line by line. It can also compare two directories:

# Compare two folders using diff ##
diff /etc /tmp/etc_old  
Rafal Matczak September 29, 2015, 7:36 am
§ Quickly find differences between two directories
And quicker:
 diff -y <(ls -l ${DIR1}) <(ls -l ${DIR2})  

[Feb 04, 2017] Restoring deleted /tmp folder

Jan 13, 2015 | cyberciti.biz

As my journey continues with Linux and the Unix shell, I made a few mistakes. I accidentally deleted the /tmp folder. To restore it, all you have to do is:

mkdir /tmp
chmod 1777 /tmp
chown root:root /tmp
ls -ld /tmp
 

[Feb 04, 2017] Use CDPATH to access frequent directories in bash - Mac OS X Hints

Feb 04, 2017 | hints.macworld.com
The variable CDPATH defines the search path for the cd command's destination directories, so it serves much like a "home for directories". The danger is in creating too complex a CDPATH; often a single directory works best. For example, after export CDPATH=/srv/www/public_html, instead of typing cd /srv/www/public_html/CSS I can simply type: cd CSS
Use CDPATH to access frequent directories in bash UNIX
Mar 21, '05 10:01:00AM • Contributed by: jonbauman

I often find myself wanting to cd to the various directories beneath my home directory (i.e., ~/Library, ~/Music, etc.), but being lazy, I find it painful to have to type the ~/ if I'm not in my home directory already. Enter CDPATH, as described in man bash:

The search path for the cd command. This is a colon-separated list of directories in which the shell looks for destination directories specified by the cd command. A sample value is ".:~:/usr".
Personally, I use the following command (either on the command line for use in just that session, or in .bash_profile for permanent use):
CDPATH=".:~:~/Library"

This way, no matter where I am in the directory tree, I can just cd dirname , and it will take me to the directory that is a subdirectory of any of the ones in the list. For example:
$ cd
$ cd Documents 
/Users/baumanj/Documents
$ cd Pictures
/Users/username/Pictures
$ cd Preferences
/Users/username/Library/Preferences
etc...

[ robg adds: No, this isn't some deeply buried treasure of OS X, but I'd never heard of the CDPATH variable, so I'm assuming it will be of interest to some other readers as well.]

cdable_vars is also nice
Authored by: clh on Mar 21, '05 08:16:26PM

Check out the bash command shopt -s cdable_vars

From the man bash page:

cdable_vars

If set, an argument to the cd builtin command that is not a directory is assumed to be the name of a variable whose value is the directory to change to.

With this set, if I give the following bash command:

export d="/Users/chap/Desktop"

I can then simply type

cd d

to change to my Desktop directory.

I put the shopt command and the various export commands in my .bashrc file.

[Feb 04, 2017] Copy file into multiple directories

Feb 04, 2017 | www.cyberciti.biz
Instead of running:
cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy file into multiple dirs:

echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file
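If you use this often, you can wrap it in a small shell function; here is a minimal sketch (the name cpm is made up for this example):

cpm() {  # copy one file into every directory given after it
    local file=$1
    shift
    local dir
    for dir in "$@"; do
        cp -v "$file" "$dir"
    done
}

# usage: cpm /path/to/file /usr/dir1 /var/dir2 /nas/dir3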

[Feb 04, 2017] 20 Unix Command Line Tricks – Part I

Feb 04, 2017 | www.cyberciti.biz
Locking a directory

For privacy of my data I wanted to lock down /downloads on my file server. So I ran:

chmod 0000 /downloads

The root user still has access, but the ls and cd commands will not work for other users. To go back:

chmod 0755 /downloads

Clear gibberish all over the screen

Just type:

reset

Becoming human

Pass the -h or -H (and other options) command line option to GNU or BSD utilities to get the output of commands like ls, df, and du in human-understandable formats:

ls -lh              # print sizes in human readable format (e.g., 1K 234M 2G)
df -h               # show free disk space in human readable format
df -k               # show output in KB
free -b             # show memory in bytes
free -k             # show memory in KB
free -m             # show memory in MB
free -g             # show memory in GB
du -h               # show disk usage in human readable format
stat -c %A /boot    # get file system perms in human readable format
sort -h file        # compare/sort human readable numbers
lscpu               # display the CPU information in human readable format on Linux
lscpu -e
lscpu -e=cpu,node
tree -h /boot       # show the size of each file in a more human readable way

Show information about known users in the Linux based system

Just type:

## linux version ##
lslogins

## BSD version ##
logins

Sample outputs:

UID USER      PWD-LOCK PWD-DENY LAST-LOGIN GECOS
  0 root             0        0   22:37:59 root
  1 bin              0        1            bin
  2 daemon           0        1            daemon
  3 adm              0        1            adm
  4 lp               0        1            lp
  5 sync             0        1            sync
  6 shutdown         0        1 2014-Dec17 shutdown
  7 halt             0        1            halt
  8 mail             0        1            mail
 10 uucp             0        1            uucp
 11 operator         0        1            operator
 12 games            0        1            games
 13 gopher           0        1            gopher
 14 ftp              0        1            FTP User
 27 mysql            0        1            MySQL Server
 38 ntp              0        1            
 48 apache           0        1            Apache
 68 haldaemon        0        1            HAL daemon
 69 vcsa             0        1            virtual console memory owner
 72 tcpdump          0        1            
 74 sshd             0        1            Privilege-separated SSH
 81 dbus             0        1            System message bus
 89 postfix          0        1            
 99 nobody           0        1            Nobody
173 abrt             0        1            
497 vnstat           0        1            vnStat user
498 nginx            0        1            nginx user
499 saslauth         0        1            "Saslauthd user"
Confused on a top command output?

Seriously, you need to try out htop instead of top:

sudo htop

Want to run the same command again?

Just type !!. For example:

/myhome/dir/script/name arg1 arg2

# To run the same command again
!!

## To run the last command again as root user
sudo !!

The !! repeats the most recent command. To run the most recent command beginning with "foo":

!foo

# Run the most recent command beginning with "service" as root
sudo !service

Use !$ to run a command with the last argument of the most recent command:

# Edit nginx.conf
sudo vi /etc/nginx/nginx.conf

# Test nginx.conf for errors
/sbin/nginx -t -c /etc/nginx/nginx.conf

# After testing a file with "/sbin/nginx -t -c /etc/nginx/nginx.conf", you
# can edit the file again with vi
sudo vi !$

Get a reminder when you have to leave

If you need a reminder to leave your terminal, type the following command:

leave +hhmm

Where hhmm is the time of day at which you want the reminder, given as hours (hh) and minutes (mm).

Home sweet home

Want to go back to the directory you were just in? Run:
cd -
Need to quickly return to your home directory? Enter:
cd
The variable CDPATH defines the search path for the directory containing directories:

export CDPATH=/var/www:/nas10

Now, instead of typing cd /var/www/html/, I can simply type the following to cd into /var/www/html:

cd html

Editing a file being viewed with less pager

To edit a file being viewed with the less pager, press v. The file opens for editing in $EDITOR:

less *.c
less foo.html
## Press v to edit file ##
## Quit from the editor and you will return to the less pager again ##

List all files or directories on your system

To see all of the directories on your system, run:

find / -type d | less

# List all directories in your $HOME
find $HOME -type d -ls | less

To see all of the files, run:

find / -type f | less

# List all files in your $HOME
find $HOME -type f -ls | less

Build directory trees in a single command

You can create an entire directory tree in a single command by passing the -p option to mkdir:

mkdir -p /jail/{dev,bin,sbin,etc,usr,lib,lib64}
ls -l /jail/

Copy file into multiple directories

Instead of running:

cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy file into multiple dirs:

echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file

Creating a shell function is left as an exercise for the reader

Quickly find differences between two directories

The diff command compares files line by line. It can also compare two directories:

ls -l /tmp/r
ls -l /tmp/s

# Compare two folders using diff ##
diff /tmp/r/ /tmp/s/
basic ~/.bashrc ~/.bash_profile tips thread

Arch Linux Forums

I added some comments explaining each piece.

Misc stuff:

# My prompt, quite basic, decent coloring, shows the value of $?
# (exit value of last command, useful sometimes):
C_DEFAULT="\[\033[0m\]"
C_BLUE="\[\033[0;34m\]"
export PS1="$C_BLUE($C_DEFAULT\$?$C_BLUE)[$C_DEFAULT\u$C_BLUE@$C_DEFAULT\h$C_BLUE:$C_DEFAULT\w$C_BLUE]\\$ $C_DEFAULT"
export PS2="$C_BLUE> $C_DEFAULT"

# If you allow Ctrl+Alt+Backspace to kill the X server but are paranoid,
# then this alias will ensure that there will be no shell open afterwards.
alias startx="exec startx"

# Let grep colorize the search results
alias g="egrep --color=always"
alias gi="egrep -i --color=always"

# Hostname appended to bash history filename
export HISTFILE="$HOME/.bash_history_`hostname -s`"

# Don't save repeated commands in bash history
export HISTCONTROL="ignoredups"

# Confirm before overwriting something
alias cp="cp -i"

# Disable ^S/^Q flow control (does anyone like/use this at all?)
stty -ixon

# If your resolution gets fucked up, use this to reset (requires XRandR)
alias resreset="xrandr --size 1280x1024"

And some small but handy functions:

# mkmv - creates a new directory and moves the file into it, in 1 step
# Usage: mkmv <file> <directory>
mkmv() {
    mkdir "$2"
    mv "$1" "$2"
}

# sanitize - set file/directory owner and permissions to normal values (644/755)
# Usage: sanitize <file>
sanitize() {
    chmod -R u=rwX,go=rX "$@"
    chown -R ${USER}.users "$@"
}

# nh - run command detached from terminal and without output
# Usage: nh <command>
nh() {
    nohup "$@" &>/dev/null &
}

# run - compile a simple c or cpp file, run the program, afterwards delete it
# Usage: run <file> [params]
run() {
    filename="${1%%.*}"
    extension="${1##*.}"
    file="$1"
    shift
    params="$@"
    command=""

    if [ $extension = "cc" -o $extension = "cpp" -o $extension = "c++" ]; then
        command="g++"
    elif [ $extension = "c" ]; then
        command="gcc"
    else
        echo "Invalid file extension!"
        return 1
    fi 

    $command -Wall -o $filename $file
    chmod a+x $filename
    ./$filename $params
    rm -f $filename 2>/dev/null
}

Offline

... ... ...

function mktar() { tar czf "${1%%/}.tar.gz" "${1%%/}/"; }

function mkmine() { sudo chown -R ${USER} ${1:-.}; }

alias svim='sudo vim'


alias un='tar -zxvf'
alias mountedinfo='df -hT'
alias ping='ping -c 10'
alias openports='netstat -nape --inet'
alias ns='netstat -alnp --protocol=inet | grep -v CLOSE_WAIT | cut -c-6,21-94 | tail -n +2'
alias du1='du -h --max-depth=1'
alias da='date "+%Y-%m-%d %A %T %Z"'
alias ebrc='pico ~/.bashrc'

# Alias to multiple ls commands
alias la='ls -Al' # show hidden files
alias ls='ls -aF --color=always' # add colors and file type extensions
alias lx='ls -lXB' # sort by extension
alias lk='ls -lSr' # sort by size
alias lc='ls -lcr' # sort by change time
alias lu='ls -lur' # sort by access time
alias lr='ls -lR' # recursive ls
alias lt='ls -ltr' # sort by date
alias lm='ls -al |more' # pipe through 'more'

# Alias chmod commands
alias mx='chmod a+x'
alias 000='chmod 000'
alias 644='chmod 644'
alias 755='chmod 755'

What are some useful Bash tricks - Quora

Gaurav Gada, Master's Information Management, University of Washington Information School (2018)

Written Dec 19, 2011

sudo !!
For when you forget to add sudo to your commands.

I got to know about this, among others at this beautiful website:
http://www.commandlinefu.com/com...

Mattias Jansson, I like cats

Written Jan 6, 2013

Some things I used to use often... and not so often:

o Comment the line you're currently on (Esc-#).

o Send stuff to a host/port using bash builtins- echo foo > /dev/tcp/host/port. For example, quick and dirty file transfer from a minimal linux install to some place with nc installed:
On destination machine: nc -l 7070 > newfile
On source machine: cat somefile > /dev/tcp/somehostname/7070

o C-x C-e to edit current line in your $EDITOR (all readline-enabled programs have this- I really needed this often when writing an SQL query which ended up being very long)

o I've got this simple shell function to take a config file (which uses the hash as a comment initiator) and dump all the contents which do not start with a comment or whitespace:
unc ()
{
    grep -vE "^[[:space:]]*#" "$1" | grep .
}

o A tiny no-nonsense webserver to share the directory you're standing in:
alias webshare='python -c "import SimpleHTTPServer;SimpleHTTPServer.test()"'

Fred Cirera, works at Twitter

Written Jul 30, 2014

I wouldn't classify the following as tricks but things that every developer writing a bash script should know.

Every shell script should start with set -o nounset and set -o errexit

nounset means that using a variable that is not set will raise an error. In the following example, without nounset, calling the script with no argument would delete all the files in /var/log; with nounset, the script aborts at the unset $1 instead.

#!/bin/bash
set -o nounset
CONTAINER_ROOT=$1
...
rm $CONTAINER_ROOT/var/log/*
 
errexit: when this option is set, the bash script will exit if a command fails. In the following example, if the directory /bigdisk/temp doesn't exist, mktemp will fail, but without errexit the script would continue and call generate_big_data with an empty $TEMPFILE.
#!/bin/bash
set -o errexit
TEMPDIR=/bigdisk/temp
TEMPFILE=$(mktemp /$TEMPDIR/app.XXXXXX)
generate_big_data -out $TEMPFILE
 

In the previous case you could write TEMPFILE=$(mktemp /$TEMPDIR/app.XXXXXX) || exit 1, but it is always good to exit on error, to be sure there will be no surprising side effects when a command called in the middle of your script fails.

Variables substitution

When you do string manipulation, use variable substitution. This is faster and saves useless forks.

Stop writing things like FILENAME=`basename $1`; instead write FILENAME=${1##*/}. Likewise, use DIRNAME=${1%/*} instead of DIRNAME=`dirname $1`. You'll find more information on variable substitution in the bash manual, in the paragraph Manipulating Strings.
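For instance, a quick sketch with a sample path:

path=/usr/local/bin/script.sh
echo "${path##*/}"   # script.sh        (same result as: basename "$path")
echo "${path%/*}"    # /usr/local/bin   (same result as: dirname "$path")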

Nick Shelly, Stanford CS PhD candidate, Apple, Air Force capt, Rhodes scholar

Written Aug 20, 2012

Ctrl+R to reverse search through your Bash history. Ctrl+R again keeps searching, Ctrl+G cancels the search.

Though GNU's Readline package is not unique to Bash (Python's interactive shell has this as well), reverse search is one of the most useful aspects of command line shells over GUIs.

Dan Fango, www.danfango.co.uk

Written Apr 21, 2015

A couple I haven't seen (or missed) in the previous answers:

Alt+. : brings back the last word from the previous line. If your previous line was "ls somefile.txt", then "vi Alt+." will translate to "vi somefile.txt". Hitting Alt+. multiple times will cycle back through your history.

Alt+# : translates to adding # (comment) to the start of your current command line and hitting return

Jianing Yang, Linux system administrator

Written Dec 19, 2011

How about C-x C-e to open your favorite editor for editing the current command line.

Steven Lehar

Written Feb 27, 2014

When I cd to some/long/path, then I type
$ here=`pwd`
then I cd back/to/some/other/path, then for example
$ there=`pwd`
Now I can do stuff like...
$ cd $here
$ cp file.txt $there

Chris Rutherford, 20 years of unix admin

Written Aug 15, 2012

Check out this guy's stuff, the best bash script tricks I've seen in my 20 years of scripting: http://www.catonmat.net/blog/bas...

Michael Rinus, every day basher

Written Jun 5, 2015

I strongly recommend spending some time at http://www.commandlinefu.com/com...

There is some quite awesome and helpful stuff out there :)

What are some useful .bash_profile and .bashrc tips - Quora

function cl(){ cd "$@" && la; }

function cdn(){ for i in `seq $1`; do cd ..; done;}

PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND ; }"'echo `dt` `pwd` $$ $USER "$(history 1)" >> ~/.bash_eternal_history'

if [ -f /etc/bash_completion ]; then
. /etc/bash_completion
fi
alias 'dus=du -sckx * | sort -nr' #directories sorted by size
alias lsdirs="ls -l | grep '^d'"


Gaurav Gada, Master's Information Management, University of Washington Information School (2018)

Written Dec 17, 2011

I have these lines:
shopt -s histappend
PROMPT_COMMAND="history -n; history -a"
unset HISTFILESIZE
HISTSIZE=2000


The first 2 lines keep the history between multiple bash sessions synced and the last two increase the history size from the default 500.

Yaniv Ng, researcher, atheist, ex-gamer, terminalist vimmer

Written Oct 17, 2014

This is something I found very useful when working with multiple terminals on different directories. Sometimes the new terminal opens in the home directory instead of the current working directory (depending on the terminal program).

Use gg in the terminal where you want to go. Then go to the new terminal and use hh.

 
gg() { pwd > /tmp/last_path; }
hh() { cd "$(cat /tmp/last_path)"; }
 
 
# Easy extract
extract () {
    if [ -f "$1" ]; then
        case "$1" in
            *.tar.bz2) tar xvjf "$1" ;;
            *.tar.gz)  tar xvzf "$1" ;;
            *.bz2)     bunzip2 "$1" ;;
            *.rar)     rar x "$1" ;;
            *.gz)      gunzip "$1" ;;
            *.tar)     tar xvf "$1" ;;
            *.tbz2)    tar xvjf "$1" ;;
            *.tgz)     tar xvzf "$1" ;;
            *.zip)     unzip "$1" ;;
            *.Z)       uncompress "$1" ;;
            *.7z)      7z x "$1" ;;
            *)         echo "don't know how to extract '$1'..." ;;
        esac
    else
        echo "'$1' is not a valid file!"
    fi
}

alias top-commands="history | awk '{print \$2}' | sort | uniq -c | sort -rn | head"

Ch Huang

Written Feb 8, 2011

When bash is invoked as an interactive login shell, or as a non-interactive shell with the --login option, it first reads and executes commands from the file /etc/profile, if that file exists. After reading that file, it looks for ~/.bash_profile, ~/.bash_login, and ~/.profile, in that order, and reads and executes commands from the first one that exists and is readable.

#in case you rm a file by mistake
alias rm=safe_rm

safe_rm () {
    local d t f s

    # in a non-interactive shell, do a real rm
    [ -z "$PS1" ] && { /bin/rm "$@"; return; }

    d="${TRASH_DIR:=$HOME/.__trash}/`date +%W`"
    t=`date +%F_%H-%M-%S`
    [ -e "$d" ] || mkdir -p "$d" || return

    for f do
        [ -e "$f" ] || continue
        s=`basename "$f"`
        /bin/mv "$f" "$d/${t}_$s" || break
    done

    echo -e "[$? $t `whoami` `pwd`] $@\n" >> "$d/00rmlog.txt"
}

Akhil Ravidas

Written Feb 10, 2013

 
alias Cd='cd -'
 

My Favorite bash Tips and Tricks Linux Journal

However, you can use spaces if they're enclosed in quotes outside the braces or within an item in the comma-separated list:

$ echo {"one ","two ","red ","blue "}fish
one fish two fish red fish blue fish

$ echo {one,two,red,blue}" fish"
one fish two fish red fish blue fish

You also can nest braces, but you must use some caution here too:

$ echo {{1,2,3},1,2,3}
1 2 3 1 2 3

$ echo {{1,2,3}1,2,3}
11 21 31 2 3

[Dec 19, 2016] Unknown Bash Tips and Tricks For Linux Linux.com The source for Linux information

The type command looks a lot like the command builtin, but it does more:

$ type ll
ll is aliased to `ls -alF'

$ type -t grep
alias

Bash Functions

Run declare -F to see a list of the names of all currently defined functions. declare -f prints out the complete functions, and declare -f [function-name] prints the named function. type won't list functions for you, but once you know a function name, it will also print it:

$ type quote
quote is a function
quote () 
{ 
    echo \'${1//\'/\'\\\'\'}\'
}

This even works for your own functions that you create, like this simple example testfunc that does one thing: changes to the /etc directory:

$ function testfunc
> {
> cd /etc
> }

Now you can use declare and type to list and view your new function just like the builtins.

[Dec 19, 2016] Bash Tricks " Linux Magazine

Graham Nicholls • 2 years ago

Oh FGS alias rm="rm -i" what a crock. I have _never_ needed this. Unix/Linux is expert friendly, not fool friendly. Possibly useful if you're root, otherwise just an incredible irritant.

OTOH, I think that history time-stamping should be the default. So useful for auditing, and for "I know I did something the other day" stuff. I use '%c' for my HISTTIMEFORMAT.
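In a .bashrc that setting looks like this (a minimal sketch):

export HISTTIMEFORMAT='%c '   # prefix each history entry with a full timestamp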

marnixava > Graham Nicholls • 2 years ago

I fully agree that the 'rm="rm -i"' alias and similar aliases are irritating. I think it might also lull newcomers into a false sense of security that it's pretty safe to do that command. One day they might be on a system without such an alias. It's good to learn early on that "rm" means it's going to be removed, no ifs or buts. One needs to make a habit of reviewing the command line before hitting enter.

Graham Nicholls > marnixava • 2 years ago

That's a really good point, which I'd not considered.

John Lockard • 2 years ago

The "HISTIGNORE" is interesting for other purposes, but the option for ignoring commands which start with space is actually a setting in bash using "export HISTCONTROL=ignorespace". If you want to eliminate duplicate entries you can use "ignoredups" or "erasedups". "ignoreboth" does both "ignoredups" and "ignorespace".

Ryan • 2 years ago

I like that using chattr +a is mentioned as a possible security fix for .bash_history, when the next talked-about item is HISTIGNORE: someone could just export HISTIGNORE="*" and it doesn't matter if .bash_history is append-only. The commands are not logged in the first place to be deleted later.

edit: But good post overall. enjoyed it :)

marnixava > Ryan • 2 years ago

Even if the history file is chattr'ed to append-only mode, wouldn't the user still be able to simply remove that history file? IMHO there are too many workarounds for a determined user to make it worthwhile except perhaps if used only as a gentle reminder that we'd like not to alter the history file.

Linux secrets most users don't know about ITworld

J1r1k: "Alt + . (dot) in bash. Last argument of previous command. It took me few years to discover this."

[Dec 06, 2015] Bash For Loop Examples

A very nice tutorial by Vivek Gite (created October 31, 2008, last updated June 24, 2015). His mistake is putting the new for loop syntax too deep inside the tutorial. It should be emphasized, not hidden.
June 24, 2015 | cyberciti.biz

... ... ...

Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:

#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
do
    echo "Welcome $i times"
done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

... ... ...

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))
do
	command1
	command2
	command3
done

A representative three-expression example in bash as follows:

#!/bin/bash
for (( c=1; c<=5; c++ ))
do
   echo "Welcome $c times"
done
... ... ...

Jadu Saikia, November 2, 2008, 3:37 pm

Nice one. All the examples are explained well, thanks Vivek.

seq 1 2 20
output can also be produced using jot

jot - 1 20 2

The infinite loops as everyone knows have the following alternatives.

while(true)
or
while :

//Jadu

Andi Reinbrech, November 18, 2010, 7:42 pm
I know this is an ancient thread, but thought this trick might be helpful to someone:

For the above example with all the cuts, simply do

set `echo $line`

This will split line into positional parameters and you can after the set simply say

F1=$1; F2=$2; F3=$3

I used this a lot many years ago on solaris with "set `date`", it neatly splits the whole date string into variables and saves lots of messy cutting :-)

… no, you can't change the FS, if it's not space, you can't use this method

Peko, July 16, 2009, 6:11 pm
Hi Vivek,
Thanks for this a useful topic.

IMNSHO, there may be something to modify here
=======================
Latest bash version 3.0+ has inbuilt support for setting up a step value:

#!/bin/bash
for i in {1..5}
=======================
1) The increment feature seems to belong to the version 4 of bash.
Reference: http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
Accordingly, my bash v3.2 does not include this feature.

BTW, where did you read that it was 3.0+ ?
(I ask because you may know some good website of interest on the subject).

2) The syntax is {from..to..step} where from, to, step are 3 integers.
You code is missing the increment.

Note that GNU Bash documentation may be bugged at this time,
because on GNU Bash manual, you will find the syntax {x..y[incr]}
which may be a typo. (missing the second ".." between y and increment).

see http://www.gnu.org/software/bash/manual/bashref.html#Brace-Expansion

The Bash Hackers page
again, see http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
seeems to be more accurate,
but who knows ? Anyway, at least one of them may be right… ;-)

Keep on the good work of your own,
Thanks a million.

- Peko

Michal Kaut July 22, 2009, 6:12 am
Hello,

is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that
for x in $(seq 0 0.1 1) gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others.
Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the keyboard to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding x=`echo $x | sed s/,/./` as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)

Thanks,
Michal

Peko July 22, 2009, 7:27 am

To Michal Kaut:

Hi Michal,

Such output format is configured through LOCALE settings.

I tried :

export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1

and it works as desired.

You just have to find the exact value for LC_CTYPE that fits to your systems and your needs.

Peko

Peko July 22, 2009, 2:29 pm

To Michal Kaus [2]

Ooops – ;-)
Instead of LC_CTYPE,
LC_NUMERIC should be more appropriate
(Although LC_CTYPE is actually yielding to the same result – I tested both)

By the way, Vivek has already documented the matter : http://www.cyberciti.biz/tips/linux-find-supportable-character-sets.html

Philippe Petrinko October 30, 2009, 8:35 am

To Vivek:
Regarding your last example, that is : running a loop through arguments given to the script on the command line, there is a simplier way of doing this:
# instead of:
# FILES="$@"
# for f in $FILES

# use the following syntax
for arg
do
# whatever you need here – try : echo "$arg"
done

Of course, you can use any variable name, not only "arg".

Philippe Petrinko November 11, 2009, 11:25 am

To tdurden:

Why would'nt you use

1) either a [for] loop
for old in * ; do mv ${old} ${old}.new; done

2) Either the [rename] command ?
excerpt form "man rename" :

RENAME(1) Perl Programmers Reference Guide RENAME(1)

NAME
rename – renames multiple files

SYNOPSIS
rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

DESCRIPTION
"rename" renames the filenames supplied according to the rule specified
as the first argument. The perlexpr argument is a Perl expression
which is expected to modify the $_ string in Perl for at least some of
the filenames specified. If a given filename is not modified by the
expression, it will not be renamed. If no filenames are given on the
command line, filenames will be read via standard input.

For example, to rename all files matching "*.bak" to strip the
extension, you might say

rename 's/\.bak$//' *.bak

To translate uppercase names to lower, you'd use

rename 'y/A-Z/a-z/' *

- Philippe

Philippe Petrinko November 11, 2009, 9:27 pm

If you set the shell option extglob, Bash understands some more powerful patterns. Here, a pattern-list is one or more patterns, separated by the pipe symbol (|).

?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns

source: http://www.bash-hackers.org/wiki/doku.php/syntax/pattern

Philippe Petrinko November 12, 2009, 3:44 pm

To Sean:
Right, the more sharp a knife is, the easier it can cut your fingers…

I mean: there are side-effects to the use of file globbing (like in [ for f in * ]): when the globbing expression matches nothing, the globbing expression is not substituted.

Then you might want to consider using [ nullglob ] shell extension,
to prevent this.
see: http://www.bash-hackers.org/wiki/doku.php/syntax/expansion/globs#customization

Devil hides in detail ;-)

Dominic January 14, 2010, 10:04 am

There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )) do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"

You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, then you can subsequently test the value of $c to see if the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am

sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:
inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2

Philippe Petrinko March 9, 2010, 2:34 pm

@Dmitry

And, again, as stated many times up there, using [seq] is counter productive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internals functions:


for ((c=1; c<21; c+=2)); do echo "Welcome $c times" ; done

(and I wonder why Vivek is sticking to that old solution which should be presented only for historical reasons when there was no way of using bash internals.
By the way, this historical recall should be placed only at topic end, and not on top of the topic, which makes newbies sticking to the not-up-to-date technique ;-) )

Sean March 9, 2010, 11:15 pm

I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:

builtin_count.sh:

#!/bin/bash
for ((i=1;i<=1000000;i++))
do
echo "Output $i"
done

seq_count.sh:

#!/bin/bash
for i in $(seq 1 1000000)
do
echo "Output $i"
done

And here were the results that I got:
time ./builtin_count.sh
real 0m22.122s
user 0m18.329s
sys 0m3.166s

time ./seq_count.sh
real 0m19.590s
user 0m15.326s
sys 0m2.503s

The performance increase isn't too significant, especially when you are probably going to be doing something a little more interesting inside of the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm

The reason why the external seq is faster, is because it is executed only once, and returns a huge splurb of space separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.

The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow etc. etc.

Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluted.

I.e. botom line, the internal one has more overhead, the "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading once from a text file with 1000000 integers with a cat. Point being that it gets executed only once and becomes static.

OK, blah blah fishpaste, past my bed time :-)

Cheers,
Andi

Anthony Thyssen June 4, 2010, 6:53 am

The {1..10} syntax is pretty useful as you can use a variable with it!

limit=10
echo {1..${limit}}
{1..10}

You need to eval it to get it to work!

limit=10
eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example),
and BASH is not available on all systems either.

You are better off using the old while-expr method for maximum compatibility!

   limit=10; n=1;
   while [ $n -le $limit ]; do
     echo $n;
     n=`expr $n + 1`;
   done

Alternativally use a seq() function replacement…

 # seq_count 10
seq_count() {
  i=1; while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}
# simple_seq 1 2 10
simple_seq() {
  i=$1; while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}
seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
    1) i=1 inc=1 end=$1 ;;
    2) i=$1 inc=1 end=$2 ;;
    *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]; do
      printf "$format\n" $i;
      i=`expr $i + $inc`;
    done
  }

Edited: by Admin – added code tags.

TheBonsai June 4, 2010, 9:57 am

The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable towards Korn and Z.

The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.

Philippe Petrinko June 4, 2010, 10:15 am

Right Bonsai,
( http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_06_04 )

But FOR C-style does not seem to be POSIXLY-correct…

Read on-line reference issue 6/2004,
Top is here, http://www.opengroup.org/onlinepubs/009695399/mindex.html

and the Shell and Utilities volume (XCU) T.OC. is here
http://www.opengroup.org/onlinepubs/009695399/utilities/toc.html
doc is:
http://www.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap01.html

and FOR command:
http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_09_04_03

Anthony Thyssen June 6, 2010, 7:18 am

TheBonsai wrote…. "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."

I am not certain it is in Posix. It was NOT part of the original Bourne Shell, and on some machines, I deal with Bourne Shell. Not Ksh, Bash, or anything else.

Bourne Shell syntax works everywhere! But as 'expr' is a builtin in more modern shells, then it is not a big loss or slow down.

This is especially important if writing a replacement command, such as for "seq" where you want your "just-paste-it-in" function to work as widely as possible.

I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me.

MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. Two years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that need to also work on this system.

TheBonsai June 6, 2010, 12:35 pm

Yea, the question was whether it's POSIX, not whether it's 100% portable (which is a difference). The POSIX base is more or less a subset of the Korn features (88, 93); pure Bourne is something "else", I know. Real portability, meaning a program can go wherever UNIX went, exists only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am

And if you want to get rid of double-quotes, use:

one-liner code:
while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data

script code, added of some text to better see record and field breakdown:

#!/bin/bash
while read
do
echo "New record"
record=${REPLY}
echo ${record}|while read -d ,
do
field="${REPLY#\"}"
field="${field%\"}"
echo "Field is :${field}:"
done
done<data

Does it work with your data?

- PP

Philippe Petrinko November 22, 2010, 9:01 am

Of course, all the above code was assuming that your CSV file is named "data".

If you want to use anyname with the script, replace:

done<data

With:

done

And then use your script file (named for instance "myScript") with standard input redirection:

myScript < anyFileNameYouWant

Enjoy!

Philippe Petrinko November 22, 2010, 11:28 am

Well, no: there is a bug. The last field of each record is not read. It needs a workaround, maybe an IFS modification! After all, that's what IFS was built for… :O)

Anthony Thyssen November 22, 2010, 11:31 pm

Another bug is that the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo.

But this does not help when you have commas within the quotes! Which is why you needed quotes in the first place.

In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.
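
For what it's worth, here is a minimal sketch of the '<<<' approach Anthony mentions, assuming every field is double-quoted and contains no embedded commas; appending a comma to the record also works around the lost-last-field bug noted above:

record='"one","two","three"'
count=0
while read -r -d , field; do
  field=${field#\"}; field=${field%\"}   # strip the surrounding quotes
  count=$((count + 1))
  echo "Field $count is :${field}:"
done <<< "${record},"                    # trailing comma delimits the last field
echo "Total fields: ${count}"            # still set: no pipeline, no subshell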

Philippe Petrinko November 24, 2010, 6:29 pm

Anthony,
Would you try this one-liner script on your CSV file?

This one-liner assumes that CSV file named [data] has __every__ field double-quoted.


while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data

Here is the same code, but for a script file, not a one-liner tweak.


#!/bin/bash
# script csv01.sh
#
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
#
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
# (not the first field of course, no comma before the first field)
#
while read
do
echo "New record" # this is not mandatory-just for explanation
#
#
# store REPLY and remove opening double quote
record="${REPLY#\"}"
#
#
# replace every "," by a single double quote
record=${record//\",\"/\"}
#
#
echo ${record}|while read -d \"
do
# store REPLY into variable "field"
field="${REPLY}"
#
#
echo "Field is :${field}:" # just for explanation
done
done

This script, named here [csv01.sh], must be used like so:

csv01.sh < my-csv-file-with-doublequotes

Philippe Petrinko November 24, 2010, 6:35 pm

@Anthony,

By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug.
As long as you know what you are doing, this is not a problem; you just have to store the [REPLY] value conveniently, as this script shows.

TheBonsai March 8, 2011, 6:26 am
for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am

+1 for printf due to portability, but you can use bashy .. syntax too

for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am

Well, it isn't portable per se: it makes it work on pre-4 Bash versions, since zero-padded brace expansion like {01..20} requires Bash 4.

I think a more or less "portable" (in terms of POSIX, at least) code would be

i=0
while [ "$((i >= 20))" -eq 0 ]; do
  printf "%02d\n" "$i"
  i=$((i+1))
done

Philip Ratzsch April 20, 2011, 5:53 am

I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:

for num in {{1..10},{15..20}};do echo $num;done

Great reference article!

Philippe Petrinko April 20, 2011, 8:23 am

@Philip
Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this.
Anyway, Keep It Short and Simple (KISS); here is a simple solution I already gave above:

xstart=1;xend=10;xstep=1
for (( x = $xstart; x <= $xend; x += $xstep)); do echo $x;done

Actually, POSIX arithmetic allows you to omit the $ inside (( )); as said before, you could also write:

xstart=1;xend=10;xstep=1
for (( x = xstart; x <= xend; x += xstep)); do echo $x;done

Philippe Petrinko May 6, 2011, 10:48 am

Sanya,

Actually brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.

Nevertheless, you can work around this like so:

max=10; for i in $(eval echo {1..$max}); do echo $i; done

Sanya May 6, 2011, 11:42 am

Hello, Philippe

Thanks for your suggestions
You basically confirmed my findings, that bash constructions are not as simple as zsh ones.
But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh where my simple for-loop works

Cheers, Sanya

Philippe Petrinko May 6, 2011, 12:07 pm

Sanya,

First, you got it wrong: the solutions I gave are not related to POSIX; I just pointed out that POSIX allows you to omit the $ inside (( )), which is just a little bit more readable – sort of.

Second, why do you see this less readable than your [zsh] [for loop]?

for (( x = start; x <= end; x += step)) do
echo "Loop number ${x}"
done

It is clearly a loop, and its increments and limits are plain to see.

IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D

BFN

Anthony Thyssen May 8, 2011, 11:30 pm

If you are going to do… $(eval echo {1..$max});
You may as well use "seq" or one of the many other forms.
See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm

I am trying to use the variable I set in the for line to set another variable with a different prefix. I couldn't get this to work and couldn't find it anywhere on the web… Can someone help?

Example:

FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`

My goal is to take the values from the All_Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…
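
One way to get this effect (a hedged sketch, not tested against the asker's data; the file paths are the ones from the question): bash does not accept A1_$token=... on the left-hand side of an assignment, but the declare builtin can construct the variable name dynamically:

#!/bin/bash
while read -r token; do
  value=$(grep "$token" /file/path/file.txt | cut -d ":" -f2)
  declare "A1_${token}=${value}"       # creates e.g. A1_foo
done < /tmp/All_Tokens.txt

# read one back with indirect expansion:
# name="A1_foo"; echo "${!name}"

An associative array (declare -A, bash 4 and later) would be the more idiomatic solution to the same problem.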

[Nov 08, 2015] Get timestamps on Bash's History

nickgeoghegan.net
One of the annoyances of Bash is that searching through your history has no context. When did I last run that command? What commands were run at 3am, while on the clock?

The following, single line, run in the shell, will provide date and time stamping for your Bash History the next time you login, or run bash.

echo  'export HISTTIMEFORMAT="%h/%d - %H:%M:%S "' >>  ~/.bashrc
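Once it takes effect, each history entry carries its timestamp, along these lines (illustrative output):

$ history 2
  501  Nov/08 - 14:03:19 tail -f /var/log/syslog
  502  Nov/08 - 14:03:27 history 2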

[May 08, 2014] 25 Even More – Sick Linux Commands UrFix's Blog

6) Display a cool clock on your terminal

watch -t -n1 "date +%T|figlet"

This command displays a clock on your terminal which updates the time every second. Press Ctrl-C to exit.

A couple of variants:

A little bit bigger text:

watch -t -n1 "date +%T|figlet -f big"You can try other figlet fonts, too.

Big sideways characters:

watch -n 1 -t '/usr/games/banner -w 30 $(date +%M:%S)'

This requires a particular version of banner and a 40-line terminal, or you can adjust the width ("30" here).

7) intercept stdout/stderr of another process
strace -ff -e trace=write -e write=1,2 -p SOME_PID
8) Remove duplicate entries in a file without sorting.
awk '!x[$0]++' <file>

Unlike sort-based approaches, which reorder the contents, awk removes the duplicates while preserving the original order: x[$0]++ evaluates to zero (false) the first time a line is seen, so !x[$0]++ prints only first occurrences. You can then redirect the output into another file.

9) Record a screencast and convert it to an mpeg
ffmpeg -f x11grab -r 25 -s 800x600 -i :0.0 /tmp/outputFile.mpg

Grab X11 input and create an MPEG at 25 fps with the resolution 800×600

10) Mount a .iso file in UNIX/Linux
mount /path/to/file.iso /mnt/cdrom -oloop

"-o loop" lets you use a file as a block device

11) Insert the last command without the last argument (bash)
!:-

/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.google.com/

then

!:- http://www.urfix.com/

is the same as

/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.urfix.com/

12) Convert seconds to human-readable format

date -d@1234567890

For example, this produces the output "Fri Feb 13 15:26:30 EST 2009".

13) Job Control
^Z, then bg, then disown

You're running a script, command, whatever… You don't expect it to take long; now 5pm has rolled around and you're ready to go home… Wait, it's still running… You forgot to nohup it before running it… Suspend it, send it to the background, then disown it… The output won't go anywhere, but at least the command will still run…

14) Edit a file on a remote host using vim
vim scp://username@host//path/to/somefile
15) Monitor the queries being run by MySQL
watch -n 1 mysqladmin --user=<user> --password=<password> processlist

Watch is a very useful command for periodically running another command – in this case using mysqladmin to display the processlist. This is useful for monitoring which queries are causing your server to clog up.

More info here: http://codeinthehole.com/archives/2-Monitoring-MySQL-processes.html

16) escape any command aliases
\[command]

e.g. if rm is aliased for 'rm -i', you can escape the alias by prepending a backslash:

rm [file] # WILL prompt for confirmation per the alias

\rm [file] # will NOT prompt for confirmation per the default behavior of the command

17) Show apps that use internet connection at the moment. (Multi-Language)
ss -p

for one line per process:

ss -p | cat

for established sockets only:

ss -p | grep STA

for just process names:

ss -p | cut -f2 -sd\"

or

ss -p | grep STA | cut -f2 -d\"

18) Send pop-up notifications on Gnome

notify-send ["<title>"] "<body>"

The title is optional.

Options:

-t: expire time in milliseconds.

-u: urgency (low, normal, critical).

-i: icon path.

On Debian-based systems you may need to install the 'libnotify-bin' package.

Useful for getting notified when a wget download or a simulation ends. Example:

wget URL ; notify-send "Done"

19) quickly rename a file

mv filename.{old,new}
20) Remove all but one specific file (the !() pattern requires extended globbing: shopt -s extglob)
rm -f !(survivor.txt)
21) Generate a random password 30 characters long
strings /dev/urandom | grep -o '[[:alnum:]]' | head -n 30 | tr -d '\n'; echo

Find printable strings within /dev/urandom, use grep to filter to just alphanumeric characters, then print the first 30 and remove the line feeds.
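
A common alternative idiom for the same job, if you would rather let tr do the filtering in one pass (also 30 alphanumeric characters):

tr -dc '[:alnum:]' < /dev/urandom | head -c 30; echo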

22) Run a command only when load average is below a certain threshold
echo "rm -rf /unwanted-but-large/folder" | batch

Good for one-off jobs that you want to run at a quiet time. The default threshold is a load average of 0.8, but this can be set using atrun.

23) Binary Clock
watch -n 1 'echo "obase=2;`date +%s`" | bc'

Create a binary clock.

24) Processor / memory bandwidth in GB/s
dd if=/dev/zero of=/dev/null bs=1M count=32768

Read 32GB of zeros and throw them away.

How fast is your system?

25) Backup all MySQL Databases to individual files
for I in $(mysql -e 'show databases' -s --skip-column-names); 
do mysqldump $I | gzip > "$I.sql.gz"; done

[May 08, 2014] 25 Best Linux Commands UrFix's Blog

25) sshfs name@server:/path/to/folder /path/to/mount/point
Mount folder/filesystem through SSH
Install SSHFS from http://fuse.sourceforge.net/sshfs.html
Will allow you to mount a folder securely over a network.

24) !!:gs/foo/bar
Runs previous command replacing foo by bar every time that foo appears
Very useful for rerunning a long command changing some arguments globally.
As opposed to ^foo^bar, which only replaces the first occurrence of foo, this one changes every occurrence.

23) mount | column -t
currently mounted filesystems in nice layout
Particularly useful if you're mounting different drives, using the following command will allow you to see all the filesystems currently mounted on your computer and their respective specs with the added benefit of nice formatting.

22) <space>command
Execute a command without saving it in the history
A command prefixed with one or more spaces won't be saved in the history (this relies on HISTCONTROL containing ignorespace or ignoreboth).
Useful for pr0n or passwords on the commandline.

21) ssh user@host cat /path/to/remotefile | diff /path/to/localfile -
Compare a remote file with a local file
Useful for checking if there are differences between local and remote files.

20) mount -t tmpfs tmpfs /mnt -o size=1024m
Mount a temporary ram partition
Makes a partition in ram which is useful if you need a temporary working space as read/write access is fast.
Be aware that anything saved in this partition will be gone after your computer is turned off.

19) dig +short txt <keyword>.wp.dg.cx
Query Wikipedia via console over DNS
Query Wikipedia by issuing a DNS query for a TXT record. The TXT record will also include a short URL to the complete corresponding Wikipedia entry.

18) netstat -tlnp
Lists all listening ports together with the PID of the associated process
The PID will only be printed if you're holding a root equivalent ID.

17) dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp
output your microphone to a remote computer's speaker
This will output the sound from your microphone port to the ssh target computer's speaker port. The sound quality is very bad, so you will hear a lot of hissing.

16) echo "ls -l" | at midnight
Execute a command at a given time
This is an alternative to cron which allows a one-off task to be scheduled for a certain time.

15) curl -u user:pass -d status="Tweeting from the shell" http://twitter.com/statuses/update.xml
Update twitter via curl

14) ssh -N -L2001:localhost:80 somemachine
start a tunnel from some machine's port 80 to your local port 2001
Now you can access the website by going to http://localhost:2001/

13) reset
Salvage a borked terminal
If you bork your terminal by sending binary data to STDOUT or similar, you can get your terminal back using this command rather than killing and restarting the session. Note that you often won't be able to see the characters as you type them.

12) ffmpeg -f x11grab -s wxga -r 25 -i :0.0 -sameq /tmp/out.mpg
Capture video of a linux desktop

11) > file.txt
Empty a file
For when you want to flush all content from a file without removing it (hat-tip to Marc Kilgus).

10) ssh-copy-id user@host
Copy ssh keys to user@host to enable password-less ssh logins.
To generate the keys use the command ssh-keygen

9) ctrl-x e
Rapidly invoke an editor to write a long, complex, or tricky command
Next time you are using your shell, try typing ctrl-x e (that is, while holding the Control key, press x and then e). The shell will take what you've written on the command line thus far and paste it into the editor specified by $EDITOR. Then you can edit at leisure using all the powerful macros and commands of vi, emacs, nano, or whatever.

8 ) !whatever:p
Check command history, but avoid running it
!whatever will search your command history and execute the first command that matches 'whatever'. If you don't feel safe doing this put :p on the end to print without executing. Recommended when running as superuser.

7) mtr google.com
mtr, better than traceroute and ping combined
mtr combines the functionality of the traceroute and ping programs in a single network diagnostic tool.
As mtr starts, it investigates the network connection between the host mtr runs on and HOSTNAME by sending packets with purposely low TTLs. It continues to send packets with low TTL, noting the response time of the intervening routers. This allows mtr to print the response percentage and response times of the internet route to HOSTNAME. A sudden increase in packet loss or response time is often an indication of a bad (or simply overloaded) link.

6 ) cp filename{,.bak}
quickly backup or copy a file with bash

5) ^foo^bar
Runs the previous command, replacing the first occurrence of foo with bar
Really useful when you have a typo in a previous command. Also, the replacement defaults to empty, so if you accidentally run: echo "no typozs"
you can correct it with ^z

4) cd -
change to the previous working directory

3):w !sudo tee %
Save a file you edited in vim without the needed permissions
I often forget to sudo before editing a file I don't have write permissions on. When you come to save that file and get the infamous "E212: Can't open file for writing", just issue that vim command in order to save the file without the need to save it to a temp file and then copy it back again.

2) python -m SimpleHTTPServer
Serve current directory tree at http://$HOSTNAME:8000/ (with Python 3, the equivalent is python3 -m http.server)

1) sudo !!
Run the last command as root
Useful when you forget to use sudo for a command. "!!" grabs the last run command.

[Dec 16, 2012] bash - how do I list the functions defined in my shell - Stack Overflow

Function names and definitions may be listed with the -f option to the declare or typeset builtin commands (see Bash Builtins). The -F option to declare or typeset will list the function names only (and optionally the source file and line number, if the extdebug shell option is enabled).
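
A quick interactive illustration (the function name is made up; declare -F will also list any other functions already defined in your shell):

$ hello() { echo "hi"; }
$ declare -F
declare -f hello
$ declare -f hello
hello () 
{ 
    echo "hi"
}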

Unknown Bash Tips and Tricks For Linux Linux.com

Bash Builtins

Bash has a bunch of built-in commands, and some of them are stripped-down versions of their external GNU coreutils cousins. So why use them? You probably already do, because of the order of command execution in Bash:

  1. Bash aliases
  2. Bash keywords
  3. Bash functions
  4. Bash builtins
  5. Scripts and executable programs that are in your PATH

So when you run echo, kill, printf, pwd, or test most likely you're using the Bash builtins rather than the GNU coreutils commands. How do you know? By using one of the Bash builtins to tell you, the command command:

$ command -V echo
echo is a shell builtin

$ command -V ping
ping is /bin/ping

The Bash builtins do not have man pages, but they do have a backwards help builtin command that displays syntax and options:

$ help echo
echo: echo [-neE] [arg ...]
    Write arguments to the standard output.
    
    Display the ARGs on the standard output followed by a newline.
    
    Options:
      -n        do not append a newline
      -e        enable interpretation of the following backslash escapes
[...]

I call it backwards because most Linux commands use a syntax of commandname --help, where help is a command option instead of a command.

The type command looks a lot like the command builtin, but it does more:

$ type -a cat
cat is /bin/cat

$ type -t cat
file

$ type ll
ll is aliased to `ls -alF'

$ type -a echo
echo is a shell builtin
echo is /bin/echo

$ type -t grep
alias

The type utility identifies builtin commands, functions, aliases, keywords (also called reserved words), and also binary executables and scripts, which it calls file. At this point, if you are like me, you are grumbling "How about showing me a LIST of the darned things." I hear and obey, for you can find these delightfully documented in the GNU Bash Reference Manual indexes. Don't be afraid, because unlike most software documentation this isn't a scary mythical creature like Sasquatch, but a real live complete command reference.

The point of this little exercise is so you know what you're really using when you type a command into the Bash shell, and so you know how it looks to Bash. There is one more overlapping Bash builtin, and that is the time keyword:

$ type -t time
keyword

So why would you want to use Bash builtins instead of their GNU cousins? Builtins may execute a little faster than the external commands, because external commands have to fork an extra process. I doubt this is much of an issue on modern computers because we have horsepower to burn, unlike the olden days when all we had were tiny little nanohertzes, but when you're tweaking performance it's one thing to look at. When you want to use the GNU command instead of the Bash builtin use its whole path, which you can find with command, type, or the good old not-Bash command which:

$ which echo
/bin/echo

$ which which
/usr/bin/which 

Bash Functions

Run declare -F to see a list of the function names currently defined in your shell. declare -f prints out the complete functions, and declare -f [function-name] prints the named function. type won't list functions, but once you know a function name it will also print it:

$ type quote
quote is a function
quote () 
{ 
    echo \'${1//\'/\'\\\'\'}\'
}

This even works for your own functions that you create, like this simple example testfunc that does one thing: changes to the /etc directory:

$ function testfunc
> {
> cd /etc
> }

Now you can use declare and type to list and view your new function just like the builtins.

Bash's Violent Side

Don't be fooled by Bash's calm, obedient exterior, because it is capable of killing. There have been a lot of changes to how Linux manages processes, in some cases making them more difficult to stop, so knowing how to kill runaway processes is still an important bit of knowledge. Fortunately, despite all this newfangled "progress" the reliable old killers still work.

I've had some troubles with bleeding-edge releases of KMail; it hangs and doesn't want to close by normal means. It spawns a single process, which we can see with the ps command:

ps axf|grep kmail
 2489 ?     Sl  1:44 /usr/bin/kmail -caption KMail

You can start out gently and try this:

$ kill 2489

This sends the default SIGTERM (signal terminate) signal, which is similar to the SIGINT (signal interrupt) sent from the keyboard with Ctrl+c. So what if this doesn't work? Then you amp up your stopping power and use SIGKILL, like this:

$ kill -9 2489

This is the nuclear option and it will work. As the relevant section of the GNU C manual says: "The SIGKILL signal is used to cause immediate program termination. It cannot be handled or ignored, and is therefore always fatal. It is also not possible to block this signal." This is different from SIGTERM and SIGINT and other signals that politely ask processes to terminate. They can be trapped and handled in different ways, and even blocked, so the response you get to a SIGTERM depends on how the program you're trying to kill has been programmed to handle signals. In an ideal world a program responds to SIGTERM by tidying up before exiting, like finishing disk writes and deleting temporary files. SIGKILL knocks it out and doesn't give it a chance to do any cleanup. (See man 7 signal for a complete description of all signals.)
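
To see the difference for yourself, here is a small throwaway script (the messages and file name are made up) that traps SIGTERM but, by design, cannot trap SIGKILL:

#!/bin/bash
tmpfile=$(mktemp)
trap 'echo "caught SIGTERM, cleaning up"; rm -f "$tmpfile"; exit 0' TERM
echo "PID $$ running; try: kill $$   and then: kill -9 $$"
while :; do sleep 1; done

A plain kill lets the trap run, so the temp file is removed; kill -9 terminates the loop immediately and the cleanup never happens.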

So what's special about Bash kill over GNU /bin/kill? My favorite is how it looks when you invoke the online help summary:

$ help kill

Another advantage is it can use job control numbers in addition to PIDs. In this modern era of tabbed terminal emulators job control isn't the big deal it used to be, but the option is there if you want it. The biggest advantage is you can kill processes even if they have gone berserk and maxed out your system's process number limit, which would prevent you from launching /bin/kill. Yes, there is a limit, and you can see what it is by querying /proc:

$ cat /proc/sys/kernel/threads-max
61985

With Bash kill there are several ways to specify which signal you want to use. These are all the same:

$ kill 2489
$ kill -s TERM 2489
$ kill -s SIGTERM 2489
$ kill -n 15 2489

kill -l lists all supported signals.

If you spend a little quality time with man bash and the GNU Bash Manual I daresay you will learn more valuable tasks that Bash can do for you.

My Favorite Bash Substitution Tricks Drastic Code

My Favorite Bash Substitution Tricks

August 01, 2009

Here are a few tricks that I often use on the command line to save time. They take advantage of some variables that the bash shell uses to store various aspects of your history.

Repeating the last command with !!

Sometimes I run a command that requires sudo access, but forget the sudo. This is a great opportunity to use !! which holds the last command you ran.

 
$ tail /var/log/mail.log
tail: cannot open `/var/log/mail.log' for reading: Permission denied
$ sudo !!
sudo tail /var/log/mail.log
# output of command

The last argument of the last command using !$

Sometimes it's handy to be able to reference the last argument of your last command. This can make certain operations safer, by preventing a fat-fingered typo from deleting important files.

 
$ ls *.log
a.log b.log
$ rm -v !$
removed `a.log'
removed `b.log'

Similarly, you can use !* to reference all of the last command's arguments.

 
$ touch a.log b.log
$ rm -v !*
rm -v a.log b.log
removed `a.log'
removed `b.log'

Correcting mistakes with ^^

This is a nifty trick that performs a substitution on your last command. It's great for correcting typos, or running similar commands back to back. It looks for a match with whatever is after the first caret, and replaces it with whatever is after the second one.

 
$ cmhod a+x my_script.sh
-bash: cmhod: command not found
$ ^mh^hm
chmod a+x my_script.sh

I use this one all the time doing rails development if I make a mistake on a script/generate command.


$ script/generate model Animal species:string sex:string birthday:date
exists app/models/
exists test/unit/
exists test/fixtures/
create app/models/animal.rb
create test/unit/animal_test.rb
create test/fixtures/animals.yml
create db/migrate
create db/migrate/20090801180754_create_animals.rb


$ ^generate^destroy
script/destroy model Animal species:string sex:string birthday:date
notempty db/migrate
notempty db
rm db/migrate/20090801180754_create_animals.rb
rm test/fixtures/animals.yml
rm test/unit/animal_test.rb
rm app/models/animal.rb
rmdir test/fixtures
notempty test
rmdir test/unit
notempty test
rmdir app/models
notempty app


$ ^destroy ^generate rspec_
script/generate rspec_model Animal species:string sex:string birthday:date
create app/models/
create spec/models/
create spec/fixtures/
create app/models/animal.rb
create spec/models/animal_spec.rb
create spec/fixtures/animals.yml
create db/migrate
create db/migrate/20090801180937_create_animals.rb


 

Hope someone else finds these as handy as I do.


Comments

  1. Sam August 02, 2009 @ 11:20 AM

    One more tip:

    You can also echo a specific number of arguments off the end of the last command using !:n*, where n is the number of the first argument to echo. For example:

    $ touch 1.log 2.log 3.log 4.log 5.log
    
    $ rm -v !:3*
    rm -v 3.log 4.log 5.log
    3.log
    4.log
    5.log
    

    I don't use this one too much in practice but it could come in handy in certain situations.

  2. Kirsten August 03, 2009 @ 05:15 PM

    Thanks Sam, I didn't know about !* and the ^^ substitution, those will be useful!

Re New line in bash variables pain

Maxim Vexler
Tue, 14 Nov 2006 12:40:28 -0800
On 11/14/06, Oded Arbel <[EMAIL PROTECTED]> wrote:
[snip]
(IFS="$(echo)"; \
for pair in `awk '/^[^[].+[^\n]$/ {print $1,$3}' passwd.fake`; do
echo "$pair"; done)

In the second example, I force the record separator to be only the newline
character (the output from 'echo'; I could probably use \n, but I wanted to
play it safe). Do mind the wrapping of the second form in parentheses,
otherwise you clobber your global IFS, which is something you want to avoid.

--
Oded
::..
We make a living by what we get, but we make a life by what we give.
    -- Winston Churchill

Thanks to everyone for the help, all solutions worked.
To sum up the tips:

By Oded Arbel:
a. Use a subshell to avoid mistakenly overriding your shell variables.
b. Use "$(echo)" as a portable(?) newline variable scripting style.

By Ehud Karni:
a. Piping into a bash subshell can be consumed inside the shell with read.
b. Using a "while read VAR1 VAR2 VAR3..." loop is a convenient method of
accepting stdin data.
c. awk has system() !!

By Amos Shapira:
a. A general workaround is to construct the whole command as text, then
use either piping to sh or the bash builtin "expr".

By Omer Shapira:
a. The xargs -n switch can be used to "collect" variables separated by
any of [\n\t ].

By Valery Reznic:
a. set -- "space delimited word list" can be used as a quick method
for assigning value to number variables ($1..$9). [question: Really?
this does not seem to work for me].
b. bash while loop can get stdin from file IO redirection.

Ariel Biener doesn't understand the need for voodoo in modern life... ;)


Thanks guys for an educational thread.

[Mar 17, 2010] Power Shell Usage Bash Tips & Tricks

Searching the Past

[Aug 9, 2009] My Favorite bash Tips and Tricks

One last tip I'd like to offer is using loops from the command line. The command line is not the place to write complicated scripts that include multiple loops or branching. For small loops, though, it can be a great time saver. Unfortunately, I don't see many people taking advantage of this. Instead, I frequently see people use the up arrow key to go back in the command history and modify the previous command for each iteration.

If you are not familiar with creating for loops or other types of loops, many good books on shell scripting discuss this topic. A discussion on for loops in general is an article in itself.

You can write loops interactively in two ways. The first way, and the method I prefer, is to separate each line with a semicolon. A simple loop to make a backup copy of all the files in a directory would look like this:

$ for file in * ; do cp $file $file.bak; done

Another way to write loops is to press Enter after each line instead of inserting a semicolon. bash recognizes that you are creating a loop from the use of the for keyword, and it prompts you for the next line with a secondary prompt. It knows you are done when you enter the keyword done, signifying that your loop is complete:

$ for file in *
> do cp $file $file.bak
> done
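
One caveat worth adding: the unquoted $file breaks on filenames containing spaces. A slightly safer version quotes the expansions (and -- guards against names starting with a dash):

$ for file in * ; do cp -- "$file" "$file.bak"; done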

[Aug 4, 2009] Tech Tip View Config Files Without Comments Linux Journal

I've been using this grep invocation for years to trim comments out of config files. Comments are great but can get in your way if you just want to see the currently running configuration. I've found files hundreds of lines long which had fewer than ten active configuration lines, it's really hard to get an overview of what's going on when you have to wade through hundreds of lines of comments.

$ grep ^[^#] /etc/ntp.conf

The regex ^[^#] matches the first character of any line, as long as that character is not a #. Because blank lines don't have a first character, they're not matched either, resulting in a nice compact output of just the active configuration lines.

The Various bash Prompts by Juliet Kemp


It's fairly likely that you already have a personalized setting for PS1, the default bash interaction prompt. But what about the others available: PS2, PS3, and PS4?

PS1 is the default interaction prompt. To set it to give you

username@host:directory$
use
export PS1="\u@\h:\w\$ "
in your ~/.bashrc. \u is the current username, \h the current host, and \w the working directory. There's a list of escape codes you can use in the bash man page, or in the Bash Prompt HOWTO.

PS2 is the prompt you get when you extend a command over multiple lines by putting a backslash (\) at the end of a line and hitting return. By default it's just >, but you can make this a little more obvious with:

export PS2="more -> "
so it looks like:
juliet@glade:~ $ very-long-command-here
more -> -with -lots -of -options

PS3 governs the prompt that shows up if you use the select statement in a shell script. The default is #?, so if you do nothing to change that, the select statement will print out the options and then just leave that prompt. Alternatively, use this:

PS3="Choose an option: "
select i in yes maybe no
do
	# code to handle reply
done
which will output:
1) yes
2) maybe
3) no
Choose an option: 
Far more readable for the user!
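
Put together, a runnable version of the example (the handler body here just echoes the choice):

#!/bin/bash
PS3="Choose an option: "
select i in yes maybe no
do
	echo "You chose: ${i}"
	break
done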

Finally, PS4 is the prompt shown when you set the debug mode on a shell script using set -x at the top of the script. This echoes each line of the script to STDOUT before executing it. The default prompt is ++. More usefully, you can set it to display the line number, with:

export PS4='$LINENO+ '

All of these can be made to be permanent changes by setting them in your ~/.bash_profile or ~/.bashrc file. (Note that this probably makes little sense to do for PS3, which is better to set per-script.)

Recovering Deleted Files With lsof By Juliet Kemp

One of the more neat things you can do with the versatile utility lsof is use it to recover a file you've just accidentally deleted.

A file in Linux is a pointer to an inode, which contains the file data (permissions, owner and where its actual content lives on the disk). Deleting the file removes the link, but not the inode itself – if another process has it open, the inode isn't released for writing until that process is done with it.

To try this out, create a test text file called testing.txt, save it and then type less testing.txt. Open another terminal window, and type rm testing.txt. If you try ls testing.txt you'll get an error message. But! less still has a reference to the file. So:

> lsof | grep testing.txt
less	4607	juliet  4r  REG 254,4   21 8880214 /home/juliet/testing.txt (deleted)
The important columns are the second one, which gives you the PID of the process that has the file open (4607), and the fourth one, which gives you the file descriptor (4). Now, we go look in /proc, where there will still be a reference to the inode, from which you can copy the file back out:
> ls -l /proc/4607/fd/4
lr-x------ 1 juliet juliet 64 Apr  7 03:19
             /proc/4607/fd/4 -> /home/juliet/testing.txt (deleted)
> cp /proc/4607/fd/4 testing.txt.bk
Note: don't use the -a flag with cp, as this will copy the (broken) symbolic link, rather than the actual file contents.

[Jul 7, 2009] xclip Command-Line Clipboard

xclip (available as a package for Debian and Ubuntu) enables you to interact with the X clipboard directly from the command-line - without having to use the mouse to cut and paste.

This is particularly useful if you're trying to get command-line output over to an e-mail or web page. Instead of scrolling around in the terminal to cut and paste with the mouse, screen by screen, you can use this:

command --arg | xclip
Then go to whichever graphical program you want to paste the input into, and paste with the middle mouse button or the appropriate menu item.

You can also enter the contents of a file straight into xclip:

xclip /path/to/file
and again, can then paste that directly wherever you want it.

The -o option enables you to operate it the other way around: output the contents of the clipboard straight onto the command line. So, you could, for example, copy a command line from a web page, then use

xclip -o
to output it. To output to a file, use
xclip -o > /path/to/file

Use the -selection switch to use the buffer-cut or one of the other selection options, rather than the clipboard default. You can also hook it up to an X display other than the default one (e.g., if you're logged on as a different user on :1) with

xclip -d localhost:1

[Jun 29, 2009] !! provides the ability to rerun long commands which cannot be executed on your current account without prefixing them with sudo.

$ whoami
$ sudo !!

[Mar 14, 2009] How to Be Faster at the Linux Command Line

02/05/2009 | hacktux.com

Want to be faster at the Linux command line interface? Since most Linux distributions provide Bash as the default CLI, here are some Bash tricks that will help cut down the amount of typing needed to execute commands. Feel free to comment and share your own speed tricks.

Control-R Through Your History

This is my most used shortcut. Hit Control-R and begin to type a string. You immediately get the last command in your Bash history with that string. Hit Control-R again to cycle further backwards in your history.

For instance, type the following and hit Enter.

grep root /etc/passwd

Then hit Control-R and begin to type 'grep'.

Control-R
(reverse-i-search)`gre': grep root /etc/passwd

When you see the original command listed, hit Enter to execute it. Alternatively, you can also hit the Right-Arrow to edit the command before running it.

Use History Expansion

Bash's command history can be referenced using the exclamation mark. For instance, typing two exclamation marks (!!) will re-execute the last command. The next example executes date twice:

date
!!

If you are interested in more than just the last command executed, type history to see a numbered listing of your Bash's history.

history
39 grep root /etc/passwd
40 date
41 date
42 history

Since grep root /etc/passwd is command number 39, you can re-execute it like so:

!39

You can also reference Bash's history using a search string. For instance, the following will run the last command that started with 'grep'.

!grep

Note, you can set the number of commands stored in your history by setting HISTSIZE.

export HISTSIZE=1000

You can also wipe your history clear with the -c switch.

history -c

Use History Quick Substitution

Historical commands can be edited and reused with quick substitution. Let's say you grep for 'root' in /etc/passwd:

grep root /etc/passwd

Now, you need to grep for 'root' in /etc/group. Substitute 'passwd' for 'group' in the last command using the caret (^).

^passwd^group

The above command will run:

grep root /etc/group

Comments

Sun, 02/08/2009 - 2:25pm - Anonymous (not verified)


For my backup function, I pass %F-%R to my date command. This allows me to make multiple backup copies of a file in one day and have them ordered by date/time.

Keith
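
Presumably the commenter's function uses something along these lines (the variable name is illustrative):

timestamp=$(date '+%F-%R')   # e.g. 2009-02-08-14:25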

Thu, 02/05/2009 - 2:58pm - Anonymous (not verified)


Thank you for Ctrl-R. I have been using the command line for two years and this was one of my biggest gripes. I am now flying around the command line, thanks.

Wed, 02/04/2009 - 5:59pm - Max (not verified)


Nice set of tricks. I knew most of them already but it refreshed my memory. Thanks.

I find even more handy to have this in ~/.inputrc :
# -------- Bind page up/down wih history search ---------
"\e[5~": history-search-backward
"\e[6~": history-search-forward

I'll take the same example: at the bash prompt, type "gre" and Page up; this will give you "grep root /etc/passwd", the last command that started with "gre". Hit Page up again and it'll show you the previous one. Page down is obviously used to show the next one.

I just noticed that the "set -o vi" trick is messing with this one ^_^ Can't tell you why.

Thu, 02/05/2009 - 5:43am - MaximB (not verified)


Nice stuff...

Some GNU/Linux distributions already ship with "built-in" aliases, like rm, which is aliased to "rm -i" in RHEL5. If you want to bypass the alias for a known command like rm, just type:

command rm

It will ignore the alias for the command.

[Feb 22, 2009] 10 shortcuts to master bash - Program - Linux - Builder AU By Guest Contributor, TechRepublic | 2007/06/25 18:30:02

If you've ever typed a command at the Linux shell prompt, you've probably already used bash -- after all, it's the default command shell on most modern GNU/Linux distributions.

The bash shell is the primary interface to the Linux operating system -- it accepts, interprets and executes your commands, and provides you with the building blocks for shell scripting and automated task execution.

Bash's unassuming exterior hides some very powerful tools and shortcuts. If you're a heavy user of the command line, these can save you a fair bit of typing. This document outlines 10 of the most useful tools:

  1. Easily recall previous commands

    Bash keeps track of the commands you execute in a history buffer, and allows you to recall previous commands by cycling through them with the Up and Down cursor keys. For even faster recall, "speed search" previously-executed commands by typing the first few letters of the command followed by the key combination Ctrl-R; bash will then scan the command history for matching commands and display them on the console. Type Ctrl-R repeatedly to cycle through the entire list of matching commands.

  2. Use command aliases

    If you always run a command with the same set of options, you can have bash create an alias for it. This alias will incorporate the required options, so that you don't need to remember them or manually type them every time. For example, if you always run ls with the -l option to obtain a detailed directory listing, you can use this command:

    bash> alias ls='ls -l' 

    To create an alias that automatically includes the -l option. Once this alias has been created, typing ls at the bash prompt will invoke the alias and produce the ls -l output.

    You can obtain a list of available aliases by invoking alias without any arguments, and you can delete an alias with unalias.

  3. Use filename auto-completion

    Bash supports filename auto-completion at the command prompt. To use this feature, type the first few letters of the file name, followed by Tab. bash will scan the current directory, as well as all other directories in the search path, for matches to that name. If a single match is found, bash will automatically complete the filename for you. If multiple matches are found, you will be prompted to choose one.

  4. Use key shortcuts to efficiently edit the command line

    Bash supports a number of keyboard shortcuts for command-line navigation and editing. The Ctrl-A key shortcut moves the cursor to the beginning of the command line, while the Ctrl-E shortcut moves the cursor to the end of the command line. The Ctrl-W shortcut deletes the word immediately before the cursor, while the Ctrl-K shortcut deletes everything immediately after the cursor. You can undo a deletion with Ctrl-Y.

  5. Get automatic notification of new mail

    You can configure bash to automatically notify you of new mail, by setting the $MAILPATH variable to point to your local mail spool. For example, the command:

    bash> MAILPATH='/var/spool/mail/john'
    bash> export MAILPATH 

    Causes bash to print a notification on john's console every time a new message is appended to John's mail spool.

  6. Run tasks in the background

    Bash lets you run one or more tasks in the background, and selectively suspend or resume any of the current tasks (or "jobs"). To run a task in the background, add an ampersand (&) to the end of its command line. Here's an example:

    bash> tail -f /var/log/messages &
    [1] 614

    Each task backgrounded in this manner is assigned a job ID, which is printed to the console. A task can be brought back to the foreground with the command fg jobnumber, where jobnumber is the job ID of the task you wish to bring to the foreground. Here's an example:

    bash> fg 1

    A list of active jobs can be obtained at any time by typing jobs at the bash prompt.

  7. Quickly jump to frequently-used directories

    You probably already know that the $PATH variable lists bash's "search path" -- the directories it will search when it can't find the requested file in the current directory. However, bash also supports the $CDPATH variable, which lists the directories the cd command will look in when attempting to change directories. To use this feature, assign a directory list to the $CDPATH variable, as shown in the example below:

    bash> CDPATH='.:~:/usr/local/apache/htdocs:/disk1/backups'
    bash> export CDPATH

    Now, whenever you use the cd command, bash will check all the directories in the $CDPATH list for matches to the directory name.

  8. Perform calculations

    Bash can perform simple arithmetic operations at the command prompt. To use this feature, simply type in the arithmetic expression you wish to evaluate at the prompt within double parentheses, as illustrated below. Bash will attempt to perform the calculation and return the answer.

    bash> echo $((16/2))
    8
  9. Customise the shell prompt

    You can customise the bash shell prompt to display -- among other things -- the current username and host name, the current time, the load average and/or the current working directory. To do this, alter the $PS1 variable, as below:

    bash> PS1='\u@\h:\w \@> '
    
    bash> export PS1
    root@medusa:/tmp 03:01 PM>

    This will display the name of the currently logged-in user, the host name, the current working directory and the current time at the shell prompt. You can obtain a list of symbols understood by bash from its manual page.

  10. Get context-specific help

    Bash comes with help for all built-in commands. To see a list of all built-in commands, type help. To obtain help on a specific command, type help command, where command is the command you need help on. Here's an example:

    bash> help alias
    ...some help text...

    Obviously, you can obtain detailed help on the bash shell by typing man bash at your command prompt at any time.

How to Be Faster at the Linux Command Line


Use Vi or Emacs Editing Mode

You can further enhance your abilities to edit previous commands using Vi or Emacs keystrokes. For example, the following sets Vi style command line editing:

set -o vi

After setting Vi mode, try it out by typing a command and hitting Enter.

grep root /etc/passwd

Then, Up-Arrow once to the same command:

Up-Arrow
grep root /etc/passwd

Now, move the cursor to the 'p' in 'passwd' and hit Esc.

grep root /etc/passwd
^

Now, use the Vi cw command to change the word 'passwd' to 'group'.

grep root /etc/group

For more Vi mode options, see this list of commands available in Vi mode. Alternatively, If you prefer Emacs, use Bash's Emacs mode:

set -o emacs

Emacs mode provides shortcuts that are available through the Control and Alt key. For example, Control-A takes you to the beginning of the line and Control-E takes you to the end of the line. Here is a list of commands available in Bash's Emacs mode.

Use Aliases and Functions

Bash allows for commands, or sets of commands, to be aliased into a single instruction. Your interactive Bash shell should already load some useful aliases from /etc/profile.d/. For one, you probably have ll aliased to ls -l.

If you want to see all aliases loaded, run the alias Bash builtin.

alias

To create an alias, use the alias command:

alias ll='ls -l'

Here are some other common aliases:

alias ls='ls --color=tty'
alias l.='ls -d .* --color=auto'
alias cp='cp -i'
alias mv='mv -i'

Note that you can also string commands together. The following will alias gohome as cd, then ls. Note that running cd without any arguments will change directory to your $HOME directory.

alias gohome='cd; ls'

Better yet, only run ls if the cd is successful:

alias gohome='cd && ls || echo "error($?) with cd to $HOME"'

More complex commands can be written into a Bash function. Functions allow you to provide input parameters for a block of code. For instance, let's say you want to create a backup function that puts a user-supplied file into ~/backups.

backup() {
file=${1:?"error: I need a file to backup"}

timestamp=$(date '+%m%d%y')
backupdir=~/backups

[ -d ${backupdir} ] || mkdir -p ${backupdir}
cp -a ${file} ${backupdir}/$(basename ${file}).${timestamp}
return $?
}

Like the example above, use functions to automate small, daily tasks. Here is one I use to set my xterm title.

xtitle() {
unset PROMPT_COMMAND
echo -ne "\033]0;${@}\007"
}

Of course, you can use functions together with aliases. Here is one I use to set my xterm title to 'MAIL' and then run Mutt.

alias mutt='xtitle "MAIL" && /usr/bin/mutt'

Finally, to ensure that your custom aliases and functions are available each login, add them to your .bashrc.

vim ~/.bashrc


[Mar 30, 2008] Bash tips and tricks " Richard's linux, web design and e-learning collection

# Bash tips and tricks for History related preferences
# see http://richbradshaw.wordpress.com/2007/11/25/bash-tips-and-tricks/

# == 1 Lost bash history ==
# the bash history is only saved when you close the terminal, not after each command. fix it..
shopt -s histappend
PROMPT_COMMAND='history -a'

# == 2. Stupid spelling mistakes ==
# This will make sure that spelling mistakes such as ect instead of etc are ignored.
shopt -s cdspell

# == 3. Duplicate entries in bash history ==
# This will ignore duplicates, as well as ls, bg, fg and exit as well, making for a cleaner bash history.
export HISTIGNORE="&:ls:[bf]g:exit"

# == 4 Multiple line commands split up in history ==
# this will change multiple line commands into single lines for easy editing.
shopt -s cmdhist

My Favorite bash Tips and Tricks

One thing you can do is redirect your output to a file. Basic output redirection should be nothing new to anyone who has spent a reasonable amount of time using any UNIX or Linux shell, so I won't go into detail regarding the basics of output redirection. To save the useful output from the find command, you can redirect the output to a file:
$ find /  -name foo > output.txt

You still see the error messages on the screen but not the path of the file you're looking for. Instead, that is placed in the file output.txt. When the find command completes, you can cat the file output.txt to get the location(s) of the file(s) you want.

That's an acceptable solution, but there's a better way. Instead of redirecting the standard output to a file, you can redirect the error messages to a file. This can be done by placing a 2 directly in front of the redirection angle bracket. If you are not interested in the error messages, you simply can send them to /dev/null:

$ find / -name foo 2> /dev/null

This shows you the location of file foo, if it exists, without those pesky permission denied error messages. I almost always invoke the find command in this way.

The number 2 represents the standard error output stream. Standard error is where most commands send their error messages. Normal (non-error) output is sent to standard output, which can be represented by the number 1. Because most redirected output is the standard output, output redirection works only on the standard output stream by default. This makes the following two commands equivalent:

$ find / -name foo > output.txt
$ find / -name foo 1> output.txt

Sometimes you might want to save both the error messages and the standard output to file. This often is done with cron jobs, when you want to save all the output to a log file. This also can be done by directing both output streams to the same file:

$ find / -name foo > output.txt 2> output.txt

This appears to work, but because each redirection opens the file independently, the two streams can overwrite each other; there's a better way to do it. You can tie the standard error stream to the standard output stream using an ampersand. Once you do this, the error messages go to wherever you redirect the standard output:

$ find / -name foo > output.txt 2>&1

One caveat about doing this is that the tying operation must come after the redirection of standard output, because redirections are processed left to right. This is important if piping the output to another command. This line works as expected:

find -name test.sh 2>&1 | tee /tmp/output2.txt

but this line doesn't:

find -name test.sh | tee /tmp/output2.txt 2>&1

and neither does this one:

find -name test.sh 2>&1 > /tmp/output.txt
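
The reason is that redirections are processed from left to right: 2>&1 makes standard error a copy of whatever standard output points to at that moment. A minimal sketch (the file names are just placeholders):

$ ls /nonexistent > out.txt 2>&1   # stdout goes to out.txt first, then stderr follows it there
$ ls /nonexistent 2>&1 > out.txt   # stderr copies the terminal first; only stdout lands in out.txt

Modern bash also accepts &> out.txt as a shorthand for > out.txt 2>&1.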

I started this discussion on output redirection using the find command as an example, and all the examples used the find command. This discussion isn't limited to the output of find, however. Many other commands can generate enough error messages to obscure the one or two lines of output you need.

Output redirection isn't limited to bash, either. All Bourne-compatible shells (sh, ksh, zsh and others) use the same syntax; csh-family shells handle it differently (there, >& redirects both streams at once).

Bash Tip #2: subprocess

Bash bang commands can be used for shortcuts too.

I really only use !$ with the cd command, but here's an example of another bang shortcut, just to give you an idea of what it does:

  1. which php (suppose it outputs /usr/local/bin/php)
  2. `!!` /path/to/php_script.php (the backquotes re-run the previous command, so its output becomes the command: php runs on the script)
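
A few more history-expansion shortcuts, all standard bash behavior:

$ mkdir /tmp/newdir
$ cd !$          # !$ expands to the last argument of the previous command: cd /tmp/newdir
$ sudo !!        # !! expands to the entire previous command, here re-run under sudo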

Bash Tips and Tricks 'cd' with style

Something you may have seen before in other systems (the much maligned SCO OSes, for example) is this handy option:

shopt -s cdspell

"This will correct minor spelling errors in a 'cd' command, so that instances of transposed characters, missing characters and extra characters are corrected without the need for retyping."

[Mar 20, 2008] bash Tricks From the Developers of the O'Reilly Network - O'Reilly ONLamp Blog

No more worrying about cases

The best bash tip I can share is very helpful when working on systems that don't allow filenames to differ only in case (like OSX and Windows):

create a file called .inputrc in your home directory and put this line in it:

set completion-ignore-case on

Now bash tab-completion won't worry about case in filenames. Thus 'cd sit[tab]' would complete to 'cd Sites/'
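
You can also try the setting in your current session without editing any file, since the bind builtin passes the line straight to readline:

$ bind "set completion-ignore-case on"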

Last argument

You can also use Esc-period to get the last parameter of the previous line, and press Esc-period repeatedly to scroll back through the last arguments of earlier commands. That turns out to be even better than $! because you can edit the argument once it shows up on your command line.

should be !$
Instead of $!, use !$, it works much better. :)

$ echo asdf
asdf
$ echo !$
echo asdf
asdf
$ echo $!

$

So $! is something else entirely (it holds the PID of the most recent background job, and is empty if there is none), while !$ brings back the last argument from the last command.
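
To see what $! really holds, start a background job (the job number and PID below are just examples):

$ sleep 60 &
[1] 12345
$ echo $!
12345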

Command substitution
$ for s in `cat server.list`; do ssh $s uptime; done;

Command substitution is also done using the $(command) notation, which I prefer to the backquotes. It allows commands to be nested (backquotes allow that too, but the inner backquotes must be escaped with backslashes, which gets messy).

For example:

$ for s in $(cat server.list); do echo "$s: $(ssh $s uptime)"; done;

or:

# get the uptime for just the first server
$ echo "$(date): $(ssh $(head -1 server.list) uptime)"

=====

More key bindings and tricks
Bash will keep a history of the directories you visit; you just have to ask.

You can also always go back to the previous directory you were in by typing cd - without the need to pushd the current directory. Using it more than once cycles between the current and previous directory.

CTRL-A takes you to the beginning of the line and CTRL-E takes you to the end of the line. This is probably basic shell knowledge.

I think it's actually common readline/emacs knowledge, and it works in many more programs than just Bash or a terminal. For instance, you can enable these bindings in Gnome applications by adding the line gtk-key-theme-name = "Emacs" to the ~/.gtkrc-2.0 file.

There are plenty of other handy key bindings as well.

There's so much useful knowledge hidden in Bash that, if you spend any time at the command line, you should really get acquainted with it. It saves incredible amounts of time.

Take for example something I wanted to do yesterday: I wanted to know the number of hits on a certain website. I could have installed a tool to parse the Apache access.log, but this was much easier:

$ cat access.log | cut -d"[" -f2 | cut -d"]" -f1 | cut -d"/" -f2 | uniq -c
28905 Mar
16554 Apr

Takes no more than a couple of seconds to write, but saves so much time.
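
One caveat worth adding: uniq -c only merges adjacent duplicate lines, which works here because a log file is chronological. For unsorted input, sort first:

$ cut -d"[" -f2 access.log | cut -d"]" -f1 | cut -d"/" -f2 | sort | uniq -c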

Try reading through the Bash man page. It's huge, but think of all the stuff you'll learn! Or read some online Bash scripting tutorials. Everything from gathering statistics from files to creating thumbnails of images (off the top of my head: for A in *; do convert "$A" -resize 140x140 "th_$A"; done) becomes a cinch.

BASH Help - A Bash Tutorial

Flip the Last Two Characters

If you type like me, your fingers spit out characters in the wrong order on occasion. ctrl-t swaps the order of the last two characters typed.

Searching Bash History

As you enter commands at the CLI they are saved in the file ~/.bash_history. From the bash prompt you can browse from the most recently used command back to the least recently used one by pressing the up arrow. Pressing the down arrow does the opposite.

If you entered a command a long time ago and need to execute it again, you can search for it: press ctrl-r and type the text you want to search for.
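
The search looks roughly like this (the command shown is just an example):

(reverse-i-search)`upt': ssh web1 uptime

Pressing ctrl-r again cycles to the next older match; Enter runs the recalled command, while the arrow keys drop you back into normal editing so you can modify it first.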

[Dec 9, 2007] Cool Solutions Bash - Making use of your .bashrc file

Good sample bashrc file



I was playing with my .bashrc file again, and was once again impressed by how easily you can tweak Linux to do what YOU want it to do. I am sure there are tons of other tweaks you can make to your .bashrc file, but I really like some of mine, and thought I would share them. Some of the aliases I created, some I found on the net, and some things in my .bashrc file are just there for fun, like the "# WELCOME SCREEN"; it serves a purpose for me, but it might not be something everyone would want or need.

For those who don't know what a .bashrc file does: "The ~/.bashrc file determines the behavior of interactive shells." (quoted from the Advanced Bash Scripting Guide)

Basically, it allows you to create shortcuts (aliases) and interactive programs (functions) that run on the startup of the bash shell or that are used when running an interactive shell. For example, it's much easier to type ebrc instead of pico ~/.bashrc (I use the alias ebrc, which stands for "Edit Bash RC file"; I could have aliased it to just one letter, making it a VERY fast shortcut). The bashrc file allows you to create aliases (shortcuts) to almost anything you want. My list is pretty long, but I'm sure there is someone with a longer list ;)

I have my .bashrc file set up in sections. The following is the breakdown, by section, of how I keep my aliases and functions separated. This is just how I do it; your .bashrc file can be modified to suit YOUR needs. That's the interesting part about the .bashrc file: it's VERY customizable and very easy to change.

Header (so I know when I modified it last and what I was running it on)
Exports (so I can set history size, paths, editors, define colors, etc.)
Sourced aliases (so I can find those hidden aliases faster)
Workstation aliases (so I can ssh to local machines quickly)
Remote server aliases (so I can ssh to remote servers easily)
Script aliases (quick links to some of my bash scripts)
Hardware control aliases (so I can control cd/dvd/scanners/audio/etc.)
Modified commands (aliases to normal linux commands with special flags)
Chmod aliases (to make changing permissions faster)
Aliases for GUI programs (start firefox, etc. from the command line)
Aliases for xterm and others (open xterm with special settings)
Aliases for Lynx (open lynx with urls - kind of a bash bookmark ;) )
Unused aliases (aliases that aren't in use on the system, but that I might use later)
Special functions (anything that is more of a function than just an alias goes here)
Notes (that should be self-explanatory ;) )
Welcome Screen (code to make my bash shell display some stuff as it starts up)

That's how I lay out my .bashrc files. It may not be perfect, but it works well for me. I like making changes in just my .bashrc file and not the global files. I like the .bashrc file because you don't need root permissions to make changes that make your life easier at the bash shell.

The following is my .bashrc file (with some things obviously commented out for security... but most of it should be self-explanatory). Anyone with comments/suggestions/ideas, feel free to let me know. I'm always looking for new and interesting things to do with the .bashrc file.

Want to know what aliases your bash shell has? Simply type the word alias at the command line, and the shell will print the list of active aliases to standard output (normally your screen).

#######################################################
# Dave Crouse's .bashrc file
# www.bashscripts.org
# www.usalug.org
#
# Last Modified 04-08-2006
# Running on OpenSUSE 10
#######################################################


# EXPORTS
#######################################################

PATH=$PATH:/usr/lib/festival/ ;export PATH
export PS1="[\[\033[1;34m\w\[\033[0m]\n[\t \u]$ "
export EDITOR=/usr/bin/pico
export HISTFILESIZE=3000 # the bash history should save 3000 commands
export HISTCONTROL=ignoredups #don't put duplicate lines in the history.
alias hist='history | grep' # requires an argument; the pattern you type lands after grep

# Define a few Color's
BLACK='\e[0;30m'
BLUE='\e[0;34m'
GREEN='\e[0;32m'
CYAN='\e[0;36m'
RED='\e[0;31m'
PURPLE='\e[0;35m'
BROWN='\e[0;33m'
LIGHTGRAY='\e[0;37m'
DARKGRAY='\e[1;30m'
LIGHTBLUE='\e[1;34m'
LIGHTGREEN='\e[1;32m'
LIGHTCYAN='\e[1;36m'
LIGHTRED='\e[1;31m'
LIGHTPURPLE='\e[1;35m'
YELLOW='\e[1;33m'
WHITE='\e[1;37m'
NC='\e[0m'              # No Color
# Sample command using color: echo -e "${CYAN}This is BASH ${RED}${BASH_VERSION%.*}${CYAN} - DISPLAY on ${RED}$DISPLAY${NC}\n"


# SOURCED ALIAS'S AND SCRIPTS
#######################################################

### Begin insertion of bbips alias's ###
source ~/.bbips/commandline/bbipsbashrc
### END bbips alias's ###

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

# enable programmable completion features
if [ -f /etc/bash_completion ]; then
    . /etc/bash_completion
fi


# ALIAS'S OF ALL TYPES SHAPES AND FORMS ;)
#######################################################

# Alias's to local workstations
alias tom='ssh 192.168.2.102 -l root'
alias jason='ssh 192.168.2.103 -l root'
alias randy='ssh 192.168.2.104 -l root'
alias bob='ssh 192.168.2.105 -l root'
alias don='ssh 192.168.2.106 -l root'
alias counter='ssh 192.168.2.107 -l root'

# ALIAS TO REMOTE SERVERS
alias ANYNAMEHERE='ssh YOURWEBSITE.com -l USERNAME -p PORTNUMBERHERE'
# My server info removed from above for obvious reasons ;)

# Alias's to TN5250 programs. AS400 access commands.
alias d1='xt5250 env.TERM = IBM-3477-FC env.DEVNAME=D1 192.168.2.5 &'
alias d2='xt5250 env.TERM = IBM-3477-FC env.DEVNAME=D2 192.168.2.5 &'
alias tn5250j='nohup java -jar /home/crouse/tn5250j/lib/tn5250j.jar 2>>error.log &'

# Alias's to some of my BashScripts
alias bics='sh /home/crouse/scripts/bics/bics.sh'
alias backup='sh /home/crouse/scripts/usalugbackup.sh'
alias calc='sh /home/crouse/scripts/bashcalc.sh'
alias makepdf='sh /home/crouse/scripts/makepdf.sh'
alias phonebook='sh /home/crouse/scripts/PHONEBOOK/baps.sh'
alias pb='sh /home/crouse/scripts/PHONEBOOK/baps.sh'
alias ppe='/home/crouse/scripts/passphraseencryption.sh'
alias scripts='cd /home/crouse/scripts'

# Alias's to control hardware
alias cdo='eject /dev/cdrecorder'
alias cdc='eject -t /dev/cdrecorder'
alias dvdo='eject /dev/dvd'
alias dvdc='eject -t /dev/dvd'
alias scan='scanimage -L'
alias playw='for i in *.wav; do play $i; done'
alias playo='for i in *.ogg; do play $i; done'
alias playm='for i in *.mp3; do play $i; done'
alias copydisk='dd if=/dev/dvd of=/dev/cdrecorder' # copies bit by bit from dvd to cdrecorder drives
alias dvdrip='vobcopy -i /dev/dvd/ -o ~/DVDs/ -l'

# Alias's to modified commands
alias ps='ps auxf'
alias home='cd ~'
alias pg='ps aux | grep'  #requires an argument
alias un='tar -zxvf'
alias mountedinfo='df -hT'
alias ping='ping -c 10'
alias openports='netstat -nape --inet'
alias ns='netstat -alnp --protocol=inet | grep -v CLOSE_WAIT | cut -c-6,21-94 | tail +2'
alias du1='du -h --max-depth=1'
alias da='date "+%Y-%m-%d %A    %T %Z"'
alias ebrc='pico ~/.bashrc'

# Alias to multiple ls commands
alias la='ls -Al'               # show hidden files
alias ls='ls -aF --color=always' # add colors and file type extensions
alias lx='ls -lXB'              # sort by extension
alias lk='ls -lSr'              # sort by size
alias lc='ls -lcr'      # sort by change time
alias lu='ls -lur'      # sort by access time
alias lr='ls -lR'               # recursive ls
alias lt='ls -ltr'              # sort by date
alias lm='ls -al |more'         # pipe through 'more'

# Alias chmod commands
alias mx='chmod a+x'
alias 000='chmod 000'
alias 644='chmod 644'
alias 755='chmod 755'

# Alias Shortcuts to graphical programs.
alias kwrite='kwrite 2>/dev/null &'
alias firefox='firefox 2>/dev/null &'
alias gaim='gaim 2>/dev/null &'
alias kate='kate 2>/dev/null &'
alias suk='kdesu konqueror 2>/dev/null &'

# Alias xterm and aterm
alias term='xterm -bg AntiqueWhite -fg Black &'
alias termb='xterm -bg AntiqueWhite -fg NavyBlue &'
alias termg='xterm -bg AntiqueWhite -fg OliveDrab &'
alias termr='xterm -bg AntiqueWhite -fg DarkRed &'
alias aterm='aterm -ls -fg gray -bg black'
alias xtop='xterm -fn 6x13 -bg LightSlateGray -fg black -e top &'
alias xsu='xterm -fn 7x14 -bg DarkOrange4 -fg white -e su &'

# Alias for lynx web browser
alias bbc='lynx -term=vt100 http://news.bbc.co.uk/text_only.stm'
alias nytimes='lynx -term=vt100 http://nytimes.com'
alias dmregister='lynx -term=vt100 http://desmoinesregister.com'


# SOME OF MY UNUSED ALIAS's
#######################################################

# alias d=`echo "Good Morning Dave. today's date is" | festival --tts;
date +'%A %B %e' | festival --tts`
# alias shrink84='/home/crouse/shrink84/shrink84.sh'
# alias tl='tail -f /var/log/apache/access.log'
# alias te='tail -f /var/log/apache/error.log'


# SPECIAL FUNCTIONS
#######################################################

netinfo ()
{
echo "--------------- Network Information ---------------"
/sbin/ifconfig | awk /'inet addr/ {print $2}'
echo ""
/sbin/ifconfig | awk /'Bcast/ {print $3}'
echo ""
/sbin/ifconfig | awk /'inet addr/ {print $4}'

# /sbin/ifconfig | awk /'HWaddr/ {print $4,$5}'
echo "---------------------------------------------------"
}

spin ()
{
echo -ne "${RED}-"
echo -ne "${WHITE}\b|"
echo -ne "${BLUE}\bx"
sleep .02
echo -ne "${RED}\b+${NC}"
}

scpsend ()
{
scp -P PORTNUMBERHERE "$@" USERNAME@YOURWEBSITE.com:/var/www/html/pathtodirectoryonremoteserver/
}


# NOTES
#######################################################

# To temporarily bypass an alias, precede the command with a \
# e.g. the ls command is aliased, but to use the normal ls command you would
# type \ls

# mount -o loop /home/crouse/NAMEOFISO.iso /home/crouse/ISOMOUNTDIR/
# umount /home/crouse/NAMEOFISO.iso
# Both commands done as root only.


# WELCOME SCREEN
#######################################################

clear
for i in `seq 1 15` ; do spin; done ; echo -ne "${WHITE} USA Linux Users Group ${NC}"; for i in `seq 1 15` ; do spin; done ; echo "";
echo -e ${LIGHTBLUE}`cat /etc/SUSE-release` ;
echo -e "Kernel Information: " `uname -smr`;
echo -e ${LIGHTBLUE}`bash --version`;echo ""
echo -ne "Hello $USER today is "; date
echo -e "${WHITE}"; cal ; echo "";
echo -ne "${CYAN}";netinfo;
mountedinfo ; echo ""
echo -ne "${LIGHTBLUE}Uptime for this computer is ";uptime | awk /'up/ {print $3,$4}'
for i in `seq 1 15` ; do spin; done ; echo -ne "${WHITE} http://usalug.org ${NC}"; for i in `seq 1 15` ; do spin; done ; echo "";
echo ""; echo ""

The following belongs under the "function" section in my .bashrc. Usable as separate programs, I've integrated them simply as functions in order to make them quick to use and easy to modify and find. These are functions used to symmetrically encrypt and decrypt files and messages. Some are completely command line, and the last two create GUI dialogs to locate the files to encrypt/decrypt. If you made a standalone program out of the functions, a shortcut/icon on the desktop would give a completely GUI-based way to locate and encrypt/decrypt files. Either way, it's an easy way to use gpg.

Requires: zenity, gpg

################### Begin gpg functions ##################
encrypt ()
{
# Use ascii armor
gpg -ac --no-options "$1"
}

bencrypt ()
{
# No ascii armor
# Encrypt binary data. jpegs/gifs/vobs/etc.
gpg -c --no-options "$1"
}

decrypt ()
{
gpg --no-options "$1"
}

pe ()
{
# Passphrase encryption program
# Created by Dave Crouse 01-13-2006
# Reads input from text editor and encrypts to screen.
clear
echo "         Passphrase Encryption Program";
echo "--------------------------------------------------"; echo "";
which $EDITOR &>/dev/null
 if [ $? != "0" ];
     then
     echo "It appears that you do not have a text editor set in your
.bashrc file.";
     echo "What editor would you like to use ? " ;
     read EDITOR ; echo "";
 fi
echo "Enter the name/comment for this message :"
read comment
$EDITOR passphraseencryption
gpg --armor --comment "$comment" --no-options --output passphraseencryption.gpg --symmetric passphraseencryption
shred -u passphraseencryption ; clear
echo "Outputting passphrase encrypted message"; echo "" ; echo "" ;
cat passphraseencryption.gpg ; echo "" ; echo "" ;
shred -u passphraseencryption.gpg ;
read -p "Hit enter to exit" temp; clear
}

keys ()
{
# Opens up kgpg keymanager
kgpg -k
}

encryptfile ()
{
zenity --title="zcrypt: Select a file to encrypt" --file-selection > zcrypt
encryptthisfile=`cat zcrypt`;rm zcrypt
# Use ascii armor
#  --no-options (for NO gui usage)
gpg -acq --yes ${encryptthisfile}
zenity --info --title "File Encrypted" --text "$encryptthisfile has been
encrypted"
}

decryptfile ()
{
zenity --title="zcrypt: Select a file to decrypt" --file-selection > zcrypt
decryptthisfile=`cat zcrypt`;rm zcrypt
# NOTE: This will OVERWRITE existing files with the same name !!!
gpg --yes -q ${decryptthisfile}
zenity --info --title "File Decrypted" --text "$encryptthisfile has been
decrypted"
}

################### End gpg functions ##################

xargs, find and several useful shortcuts

See also Unix Xargs and Unix Find Command pages.

Re: pushd and popd (and other tricks) (Score:2)
by Ramses0 (63476) on Wednesday March 10, @07:39PM (#8527252)

My favorite "nifty" was when I spent the time to learn about xargs (I pronounce it "zargs") and brush up on for syntax.

ls | xargs -n 1 echo "ZZZ> "

Basically indents (prefixes) everything with a "ZZZ" string. Not really useful, right? But since it invokes the echo command (or whatever command you specify) $n times (where $n is the number of lines passed to it) this saves me from having to write a lot of crappy little shell scripts sometimes.
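
With GNU xargs you can also name the substitution point explicitly, which reads more clearly (the {} placeholder is conventional):

ls | xargs -I {} echo "ZZZ> {}"

-I also switches xargs to one invocation per input line, so file names containing spaces survive intact.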

A more serious example is:

find -name \*.jsp | sed 's|^|http://127.0.0.1/server|' | xargs -n 1 wget

...will find all your JSPs, map them to your localhost webserver, and invoke a wget (fetch) on each of them. Voilà, precompiled JSPs.

Another:

for f in `find -name \*.jsp` ; do echo "==> $f" >> out.txt ; grep "TODO" $f >> out.txt ; done

...this searches JSP's for "TODO" lines and appends them all to a file with a header showing what file they came from (yeah, I know grep can do this, but it's an example. What if grep couldn't?) ...and finally...

( echo "These were the command line params"
echo "---------"
for f in $@ ; do
echo "Param: $f"
done ) | mail -s "List" you@you.com ...the parenthesis let your build up lists of things (like interestingly formatted text) and it gets returned as a chunk, ready to be passed on to some other shell processing function.

Shell scripting has saved me a lot of time in my life, which I am grateful for. :^)

[May 7, 2007] To strip file extensions in bash, like this.rbl --> this


fname=${file%.rbl}
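
The same parameter-expansion machinery covers the other common cases too:

ext=${file##*.}      # extension: this.rbl --> rbl
name=${file%.*}      # strip whatever the extension is, not just .rbl
dir=${path%/*}       # rough dirname
base=${path##*/}     # rough basename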

Last argument reuse

tail -f /tmp/foo
rm !$  # !$ is the last argument to the previous command.

Correction sed style

grep 'wibble' afile | lwss  #typo: meant to type less
!!:s/lw/le #!! is last command string, :s does sed-style  modification. :gs does a global replace

# or for simpler corrections
cat .bash_profilx #typo - meant the x to be an e
^x^e # repeat last command, substituting x for e

touch a{1,2,3,4}b # brace gets expanded to a1b a2b a3b a4b so 4 files get touched

cp file{,.old} # brace gets expanded to file file.old , thus creating a backup.
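
Brace expansion also nests and accepts ranges, which makes it handy for scaffolding:

mkdir -p project/{src,doc/{html,pdf},tests}   # creates the whole tree in one command
echo img{1..3}.png                            # img1.png img2.png img3.png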

readline Tips and Tricks

The readline library is used by bash and many other programs to read a line from the terminal, allowing the user to edit the line with standard Emacs editing keys.

Useful Commands and Features

The commands in this section work regardless of editing mode.

Aliasing Commands

Once again I like how this topic is covered on freeunix.dyndns.org:8088 in "Customizing your Bash environment"; see the section entitled "Aliases".

Altering the Command Prompt Look and Information

Bash can change what information the command prompt displays, as well as its colour. This is done by setting the PS1 variable. There is also a PS2 variable: it controls the prompt displayed when a command continues onto another line, and is usually '> ' by default. The PS1 variable is usually set to show some useful information by the Linux distribution you are running, but you may want to earn style points by making your own modifications.

Here are the backslash-escape special characters that have meaning to bash:
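
(The table itself was lost in formatting; the most commonly used escapes, straight from the bash manual, are:)

\u   the username of the current user
\h   the hostname up to the first '.'
\w   the current working directory
\W   the basename of the current working directory
\t   the current time in HH:MM:SS format
\d   the date in "Weekday Month Date" format
\$   a '#' if the effective UID is 0, a '$' otherwise
\n   a newline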

Colours In Bash:
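
(The colour table was also lost; the sequences follow the same '\e[STYLE;COLORm' pattern as the colour definitions in the .bashrc quoted earlier: style 0 is normal and 1 is bold, and the foreground colours run from 30 (black) through 31 (red), 32 (green), 34 (blue) and so on up to 37 (light gray), with \e[0m resetting everything to the default.)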

Here is an example borrowed from the Bash-Prompt-HOWTO:
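
(The example itself did not survive formatting; a PS1 matching the description that follows, reconstructed rather than the HOWTO's exact line, would be:)

PS1="\[\033[0;34m\][\t] [\u@\h \w]\$\[\033[0m\] "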

This turns the text blue, displays the time in brackets (very useful for not losing track of time while working), and displays the user name, host, and current directory enclosed in brackets. The "\[\033[0m\]" following the $ returns the colour to the previous foreground colour.

How about a command prompt modification that's a bit more "pretty":

This one sets up a prompt like this: [user@host] directory $
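
(This example was lost as well; a reconstructed PS1 that produces such a prompt is:)

PS1="[\u@\h] \w \$ "

Here \u is the user name, \h the host, \w the current directory, and \$ prints # for root and $ for everyone else.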

Each user on a system can have their own customized prompt by setting the PS1 variable in either the .bashrc or .profile files located in their home directories.

Basic and Extended Bash Completion

Basic Bash Completion will work in any bash shell. It allows for completion of:

  1. File Names
  2. Directory Names
  3. Executable Names
  4. User Names (when they are prefixed with a ~)
  5. Host Names (when they are prefixed with a @)
  6. Variable Names (when they are prefixed with a $)

This is done simply by pressing the tab key after typing enough of the word you are trying to complete. If the word is not completed when you hit tab, there are probably multiple possibilities for the completion; press tab again and bash will list them. Sometimes on my machine I have to hit it a third time.

Extended Programmable Bash Completion is a program that you can install to complete much more than the names of the things listed above. With extended bash completion you can, for example, complete the name of a computer you are trying to connect to with ssh or scp. It achieves this by looking through the known_hosts file and using the hosts listed there for the completion. This is greatly customizable and the package and more information can be found here.

Configuration of programmable bash completion is done in /etc/bash_completion.
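
Even without the extended package you can register simple completions yourself with the complete builtin; for example (the host names are placeholders):

complete -W "web1 web2 db1.example.com" ssh

After this, typing ssh w<tab> offers web1 and web2.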


Links
  1. Bash Prompt HOWTO
  2. Bash Reference Manual
  3. Customizing your Bash environment
  4. Working more productively with bash 2.x
  5. Advancing in the Bash Shell
  6. Bash - Bourne Again SHell
  7. What's GNU: Bash - The GNU Shell
  8. Bash Tips in Gentoo Forums
  9. bash(1) - Linux man page
  10. BASHISH

Learn About Bash Scripting:

  1. Bash by example, Part 1
  2. Bash by example, Part 2
  3. Bash by example, Part 3
  4. Advanced Bash-Scripting Guide
  5. A quick guide to writing scripts using the bash shell

unixtips.org bash tips

bash Nicolas Lidzborski at 19 February, 23:54:09
If you want your xterm or rxvt title bar to show the username, hostname and current directory, and you use bash, you can set the PROMPT_COMMAND shell variable. Personally, I use the following command in my /etc/profile:
if [ $TERM = "xterm" ]; then
export PROMPT_COMMAND='echo -ne \
"\033]0;${USER}@${HOSTNAME}: ${PWD}\007"'
fi

The test around the export command is done to avoid causing problems on text terminals.
bash sRp at 19 February, 05:23:23
You can execute a bash command a certain number of times by using something similar to the following:
n=0;while test -$n -gt -10; do echo n=$n; n=$[$n+1]; done

That code will print "n=0", "n=1", and so on 10 times.
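
In modern bash the same loop is usually written with the (( )) arithmetic syntax; the $[ ] form still works but has long been deprecated in favour of $(( )):

n=0; while (( n < 10 )); do echo "n=$n"; ((n++)); done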
bash sRp at 30 January, 07:18:30
You can use CTRL-_ or CTRL-X CTRL-U to undo at the bash prompt.
bash Ian Eure at 29 January, 12:55:02
Bash supports tab-completion: you type the first few characters of a command (or file / directory) and hit tab, and bash automagically completes it for you. For example, if you wanted to run the program WPrefs (the Window Maker preferences util), all you have to do is type WP<tab> and bash will fill in the rest, plus a trailing space.
bash sRp at 28 January, 01:06:05
Hitting META-P in bash will allow you to search through the bash history.
bash sRp at 27 January, 13:24:42
If you find yourself having to cd back and forth between long directory names, bash's pushd is the perfect solution. Start in one of the directories, then type pushd directory2 to go to the second directory. Now if you type dirs you should see both directories listed. To switch between them, just type pushd +1.
bash sRp at 27 January, 13:16:39
While using bash, if you have typed a long command and then realize you don't want to execute it yet, don't delete it. Simply put a # at the beginning of the line and hit enter. Bash will not execute the command, but will store it in history, so later you can go back, remove the # from the front, and execute it.
bash sRp at 27 January, 13:10:05
In the bash shell, CTRL-U will delete everything to the left of the cursor.
bash sRp at 27 January, 13:08:22
CTRL-T in bash will transpose two characters; great for typos.
bash sRp at 21 January, 05:39:18
Hitting CTRL-W in bash will delete the word just before your cursor. CTRL-Y will yank back the last deleted word (or words, if they were deleted consecutively). If you deleted more words after the one you wanted and have already pressed CTRL-Y, you can use ALT-Y to cycle back through the earlier deletions.
bash Mike Lowrie at 19 January, 07:17:36
Here's another way to change into long directory names in bash. Take, for example, the directory samba-2.0.0beta2: you can type cd samb* and it will change to the directory that matches the wildcard.
bash Sid Boyce at 19 January, 05:00:05
In the bash shell, you can use shortcuts. If your last command starting with l was less xxx, then !l will re-execute it. However, if you had been using lpr and ln as well and you wanted to run less again, then !le would execute it.
bash sRp at 18 January, 21:25:17
In bash, hitting ALT-b will move you back a word, and hitting ALT-f will move you forward a word.
bash sRp at 18 January, 21:25:10
Typing CTRL-l at a bash prompt will clear the screen and put the current line at the top of the screen.
bash schvin at 17 January, 12:46:44
Turning on Scroll Lock in a console will pause the output of the command currently in progress in bash, such as ls, du or mpg123.
bash mulo at 30 September, 21:43:22
To lowercase all files in the current directory:
#!/bin/sh
for x in * ; do
    newx=`echo $x | tr "[:upper:]" "[:lower:]"`
    mv "$x" "$newx"
    echo "$x --> $newx"
done
bash Jose at 30 September, 21:43:33
For one fast and effective `clear', use echo -e '\ec'. It resets the terminal, which does more than `clear'.
bash nexz at 30 September, 21:44:50
Want to find out all the commands installed on your box? At an empty prompt, press tab twice and bash will ask if you want to see all the commands. Say y and it will show every command available on your box, including shell keywords. It's a very easy way to discover and familiarize yourself with commands you don't know (note that this only searches the PATH set in your bash login files). But be careful if you are root: try --help or the man page first before blindly running an unfamiliar command. If the commands are listed in a single column and you can't see the top, edit .bash_profile or .bashrc to include the alias ls="ls -C"; then you should be able to see them all. Another alternative is to increase the terminal's scrollback buffer so that it holds more lines. Hope this helps!
bash Daniel Giribet at 30 September, 21:46:07
Would you like to list only directories (without a long -l listing)?
dirs () {
ls -F $1 | grep \/ | sed -e 's/\/$//g'
}

Use 'dirs' in your bash shell and enjoy! (Note that this function shadows the dirs builtin used by pushd/popd; pick another name if you rely on those.)

bash sRp at 31 July, 19:23:44
The readline support in the bash shell defaults to emacs editing mode. You can easily switch that to vi mode by issuing the following command: set -o vi.
bash Antonio at 8 February, 12:42:20
If you use bash, you can search backwards through its history: hit CTRL-R and start typing what you want to search for (it works exactly as in Emacs). If there are lots of similar lines in your history, pressing CTRL-R repeatedly will browse through them.
bash irfan ahmed at 23 December, 19:36:34
bash allows you to move between the current directory and the previous one by giving a hyphen to the cd command. Say you were in /home/john/pies/american and you gave the command cd /home/jack/steak/grilled. You can now get back to the ../../american directory with cd -.
bash hictio at 18 January, 02:30:17
You can clear the screen when you log out of bash by adding this to the ~/.bash_logout file:

setterm -clear

If you don't have a .bash_logout file, just make one.
bash johnnycal at 18 January, 02:31:14
I use cd bla; ls -l bla so much that I made a function for it, called see:

function see () { cd "$1"; ls . ; }
 
bash Nate Fox at 27 December, 04:32:07
In bash, if you add this:
 
complete -d cd
 
into your ~/.bash_profile or /etc/profile file, then tab completion after cd will only match directories. So if you have a file called "jiggy" and a directory called "joogy", and those are the only things in the directory, typing cd and pressing tab will complete straight to "joogy".
bash sRp at 5 September, 17:02:58
Under bash or zsh, if you would like to edit a previous command in a text editor instead of on the command line, use the fc command.
bash frodo at 10 April, 04:15:04
Aliasing dir to list just directories can be useful. To do so, do the following:
alias dir='ls -l | grep ^d'
grep in this case searches for a d in the first column of each line.
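
A shorter alternative that needs no grep at all: a glob ending in / matches only directories, so

alias dir='ls -d */'

lists them directly (with a trailing slash shown on each name).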
bash HellHound at 14 January, 04:31:20
Another search-in-bash thingy: CTRL-R. This one is more "realtime": as you type a character or string, it immediately shows a matching history entry.
bash Joerg Tretter at
If you want to switch off the "beep" during command line-completion you should add an entry either in your ~/.inputrc or system wide in your /etc/inputrc:

for a visual signal: set bell-style visible
for absolutely no signal: set bell-style none
bash Jason P. Stanford at 20 May, 05:21:59
This is a variation for the "colorful directory listing" hint users, that works "better" under bash. Put the following in $HOME/.bashrc or $HOME/.bash_profile:
function v () { ls -l --color=auto "$@"; }
function d () { ls --color=auto "$@"; }

HINT: Think of 'v' as "verbose" and 'd' as "directory". And they're much quicker to type (only a single char), so this should satisfy most unix junkies.

