Reimplementation of cut in AWK


AWK programs automatically split each input line into fields and put them into the variables $1, $2, and so on. In the simplest case, a reimplementation of cut in AWK is therefore just a single print statement, for example:

awk 'BEGIN { FS = ":" } { print $1 | "sort" }' /etc/passwd
This program prints a sorted list of the login names of all users.

If no input file is specified, awk reads from stdin; if no output file is specified, it writes to stdout.
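As a sketch of the idea in the page title (this exact command is not from any of the quoted sources), a common cut invocation such as cut -d: -f1,3 /etc/passwd can be approximated by naming the wanted fields in a print statement; the field numbers and the ":" output separator are illustrative assumptions:

   # rough equivalent of: cut -d: -f1,3 /etc/passwd
   awk 'BEGIN { FS = ":"; OFS = ":" } { print $1, $3 }' /etc/passwd

Unlike the real cut, this sketch does not reproduce corner cases such as lines that do not contain the delimiter at all.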

AWK is very flexible about matching patterns. A pattern can be a regular expression, a comparison or other expression, a compound expression built from simpler ones with &&, || and !, a range of the form pattern1,pattern2, or one of the special patterns BEGIN and END.

(A compound pattern can, for example, select lines where the two characters starting in the fifth column are xx and the third field matches nasty, plus lines beginning with The, plus lines ending with mean, plus lines in which the fourth field is greater than two.)
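A reconstruction of that compound pattern might look like the following one-liner; the grouping with parentheses is an assumption, since the original example is not reproduced here in full:

   awk '(substr($0, 5, 2) == "xx" && $3 ~ /nasty/) || /^The/ || /mean$/ || $4 > 2'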

HANDY ONE-LINERS FOR AWK 22 July 2003 compiled by Eric Pement ...


 # print the number of fields in each line, followed by the line
 awk '{ print NF ":" $0 } '

 # print the last field of each line
 awk '{ print $NF }'

SELECTIVE PRINTING OF CERTAIN LINES:

 # print first 10 lines of file (emulates behavior of "head")
 awk 'NR < 11'

 # print first line of file (emulates "head -1")
 awk 'NR>1{exit};1'

 # print the last 2 lines of a file (emulates "tail -2";
 # a generalized "tail -n" version is sketched after this excerpt)
 awk '{y=x "\n" $0; x=$0};END{print y}'

 # print the last line of a file (emulates "tail -1")
 awk 'END{print}'

 # print only lines which match regular expression (emulates "grep")
 awk '/regex/'

 # print only lines which do NOT match regex (emulates "grep -v")
 awk '!/regex/'

 # print the line immediately before a regex, but not the line
 # containing the regex
 awk '/regex/{print x};{x=$0}'
 awk '/regex/{print (x=="" ? "match on line 1" : x)};{x=$0}'

 # print the line immediately after a regex, but not the line
 # containing the regex
 awk '/regex/{getline;print}'

 # grep for AAA and BBB and CCC (in any order)
 awk '/AAA/ && /BBB/ && /CCC/'

 # grep for AAA and BBB and CCC (in that order)
 awk '/AAA.*BBB.*CCC/'

 # print only lines of 65 characters or longer
 awk 'length > 64'

 # print only lines of less than 65 characters
 awk 'length < 65'

 # print section of file from regular expression to end of file
 awk '/regex/,0'
 awk '/regex/,EOF'

 # print section of file based on line numbers (lines 8-12, inclusive)
 awk 'NR==8,NR==12'

 # print line number 52
 awk 'NR==52'
 awk 'NR==52 {print;exit}'          # more efficient on large files

 # print section of file between two regular expressions (inclusive)
 awk '/Iowa/,/Montana/'             # case sensitive


SELECTIVE DELETION OF CERTAIN LINES:

 # delete ALL blank lines from a file (same as "grep '.' ")
 awk NF
 awk '/./'


CREDITS AND THANKS:

Special thanks to Peter S. Tillier for helping me with the first release
of this FAQ file.

For additional syntax instructions, including the way to apply editing
commands from a disk file instead of the command line, consult:

"sed & awk, 2nd Edition," by Dale Dougherty and Arnold Robbins
  O'Reilly, 1997
"UNIX Text Processing," by Dale Dougherty and Tim O'Reilly
  Hayden Books, 1987
"Effective awk Programming, 3rd Edition." by Arnold Robbins
  O'Reilly, 2001

To fully exploit the power of awk, one must understand "regular
expressions." For detailed discussion of regular expressions, see
"Mastering Regular Expressions, 2d edition" by Jeffrey Friedl
   (O'Reilly, 2002).

The manual ("man") pages on Unix systems may be helpful (try "man awk",
"man nawk", "man regexp", or the section on regular expressions in "man
ed"), but man pages are notoriously difficult. They are not written to
teach awk use or regexps to first-time users, but as a reference text
for those already acquainted with these tools.

USE OF '\t' IN awk SCRIPTS: For clarity in documentation, we have used
the expression '\t' to indicate a tab character (0x09) in the scripts.
All versions of awk, even the old Version 7 UNIX awk, should recognize
the '\t' abbreviation.

#---end of file---
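The "tail -2" recipe above generalizes to the last n lines by keeping a small ring buffer of recent input lines. The following is a sketch, not part of Pement's file; the count is passed in through the variable n, whose value here is an arbitrary example:

 # print the last n lines of a file (generalizes the "tail -2" one-liner)
 awk -v n=5 '{ buf[NR % n] = $0 }
     END { for (i = NR - n + 1; i <= NR; i++) if (i > 0) print buf[i % n] }'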

AWK One-Liners

Although awk can be used to write programs of some complexity, many useful programs are not complicated. Here is a collection of short programs that you might find handy and/or instructive:
  1. Print the total number of input lines:
    END { print NR }
  2. Print the tenth input line:
    NR == 10
  3. Print the last field of every input line:
    { print $NF }
  4. Print the last field of the last input line:
    { field = $NF}
    END { print field } 
  5. Print every input line with more than 4 fields:
    NF > 4
  6. Print every input line in which the last field is more than 4:
    $NF > 4
  7. Print the total number of fields in all input lines:
    { nf = nf + NF }
    END { print nf }      
  8. Print the total number of lines that contain Beth:
    /Beth/ { nlines = nlines + 1 }
    END { print nlines }      
  9. Print the largest first field and the line that contains it (assumes some $1 is positive; a variant without this assumption is sketched after the list):
    $1 > max { max = $1 ; maxline = $0 }
    END { print max, maxline }
           
  10. Print every line that has at least one field:
    NF > 0
  11. Print every line longer than 80 characters:
    length($0) > 80
  12. Print the number of fields in every line, followed by the line itself:
    { print NF, $0 }
  13. Print the first two fields, in opposite order, of every line:
    { print $2, $1 }
  14. Exchange the first two fields of every line and then print the line:
    { temp = $1 ; $1 = $2 ; $2 = temp ; print }
  15. Print every line with the first field replaced by the line number:
    { $1 = NR ; print }
  16. Print every line after erasing the second field:
    { $2 = ""; print }
  17. Print in reverse order the fields of every line:
    { for (i=NF ; i>0 ; i=i-1) printf("%s ", $i)
      printf("\n")
    }
  18. Print the sum of the fields of every line:
    { sum = 0
      for (i=1 ; i<=NF ; i=i+1) sum = sum + $i
      print sum
    }
  19. Add up all fields in all lines and print the sum:
    { for (i=1 ; i<=NF ; i=i+1) sum = sum + $i }
    END { print sum }       
           
  20. Print every line after replacing each field by its absolute value:
    { for (i=1 ; i<=NF ; i=i+1) if ($i<0) $i=-$i
      print
    }
Source: The AWK Programming Language
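Item 9 above assumes that at least one $1 is positive, because max implicitly starts out as zero (or the empty string). A common way to drop that assumption, shown here as a sketch rather than as the book's own wording, is to seed max and maxline from the first input line:

    NR == 1  { max = $1 ; maxline = $0 }
    $1 > max { max = $1 ; maxline = $0 }
    END { if (NR > 0) print max, maxline }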


Old News ;-)

[3.0] Awk Examples, Nawk, & Awk Quick Reference

For example, suppose I want to turn a document with single-spacing into a document with double-spacing. I could easily do that with the following Awk program:

   awk '{print ; print ""}' infile > outfile
Notice how single-quotes (' ') are used to allow using double-quotes (" ") within the Awk expression. This "hides" special characters from the shell you are using. You could also do this as follows:
   awk "{print ; print \"\"}" infile > outfile 
-- but the single-quote method is simpler.

This program does what it is supposed to, but it also doubles every blank line in the input file, which leaves a lot of empty space in the output. That's easy to fix: just tell Awk to print the extra blank line only if the current line is not blank:

   awk '{print ; if (NF != 0) print ""}' infile > outfile
* One of the problems with Awk is that it is ingenious enough to make a user want to tinker with it, and use it for tasks for which it isn't really appropriate. For example, you could use Awk to count the number of lines in a file:
   awk 'END {print NR}' infile
-- but this is dumb, because the "wc (word count)" utility gives the same answer with less bother. "Use the right tool for the job."

Awk is the right tool for slightly more complicated tasks. Once I had a file containing an email distribution list. The email addresses of various different groups were placed on consecutive lines in the file, with the different groups separated by blank lines. If I wanted to quickly and reliably determine how many people were on the distribution list, I couldn't use "wc", since it counts blank lines, but Awk handled it easily:

   awk 'NF != 0 {++count} END {print count}' list
* Another problem I ran into was determining the average size of a number of files. I was creating a set of bitmaps with a scanner and storing them on a floppy disk. The disk started getting full and I was curious to know just how many more bitmaps I could store on the disk.

I could obtain the file sizes in bytes using "wc -c" or the "list" utility ("ls -l" or "ll"). A few tests showed that "ll" was faster. Since "ll" lists the file size in the fifth field, all I had to do was sum up the fifth field and divide by NR. There was one slight problem, however: the first line of the output of "ll" listed the total number of sectors used, and had to be skipped.

No problem. I simply entered:

   ll | awk 'NR!=1 {s+=$5} END {print "Average: " s/(NR-1)}'
This gave me the average as about 40 KB per file.

* Awk is useful for performing simple iterative computations for which a more sophisticated language like C might prove overkill. Consider the Fibonacci sequence:

   1 1 2 3 5 8 13 21 34 ...
Each element in the sequence is constructed by adding the two previous elements together, with the first two elements defined as both "1". It's a discrete formula for exponential growth. It is very easy to use Awk to generate this sequence:
   awk 'BEGIN {a=1;b=1; while(++x<=10){print a; t=a;a=a+b;b=t}; exit}'
This generates the following output data:
   1
   2
   3
   5
   8
   13
   21
   34
   55
   89

Recommended Links

The AWK Programming Language
