The uniq command eliminates or counts duplicate lines in a presorted file. It reads lines and compares each line to the previous one. Depending on the options specified on the command line, it may display only unique lines, one occurrence of repeated lines, or both. The common idioms are
sort | uniq
and
sort | uniq -c
For example:

grep 'hysteresis' * | awk -F: '{print $1}' | sort | uniq | wc -l

or, in a more complex pipe:

cut -d '"' -f 2 $1 | cut -d '/' -f 3 | tr '[:upper:]' '[:lower:]' | sort | uniq -c | sort -r > ${1}_sites
By default, uniq displays lines that appear only once together with one copy of lines that appear more than once. It is also useful for filtering multiple blank lines out of the unsorted output of other commands. For example, the dircmp command displays its output using pr, so the output usually scrolls off your screen before you can read it. But if you pipe the output of dircmp through uniq, runs of blank lines are collapsed and the output is more compact.
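The blank-line squeezing mentioned above can be sketched with a minimal pipeline (the sample text here is illustrative):

```shell
# Consecutive blank lines are adjacent duplicates, so uniq collapses
# each run of them to a single blank line:
printf 'header\n\n\n\nbody\n\n\nfooter\n' | uniq
# header
#
# body
#
# footer
```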
Syntax
uniq [options] [input [output]]
Options

The most commonly used options are -c (precede each line with its occurrence count), -d (print only one copy of repeated lines), and -u (print only lines that are not repeated); they are discussed in detail below.

Arguments

The following arguments may be passed to the uniq command:
input — The name of the file containing the input data. If no input file is specified, uniq reads from the standard input and writes to the standard output.

output — The name of the file to hold the output data. If no output file is specified, the output is displayed on the standard output.

The input and output files must not have the same name; if they do, the contents of the file are destroyed. You cannot specify an output file without specifying an input file.
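As a sketch of this argument behavior (the file paths below are illustrative):

```shell
# Create a sample input file containing adjacent duplicates
printf 'aa\naa\nbb\n' > /tmp/uniq_in.txt

# With one argument, uniq reads the file and writes to standard output
uniq /tmp/uniq_in.txt
# aa
# bb

# With two arguments, the second names the output file
uniq /tmp/uniq_in.txt /tmp/uniq_out.txt
cat /tmp/uniq_out.txt
# aa
# bb
```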
uniq(1) takes a stream of lines and collapses adjacent duplicate lines into one copy of the lines. So if you had a file called foo that looked like:
davel
davel
davel
jeffy
jones
jeffy
mark
mark
mark
chuck
bonnie
chuck
You could run uniq on it like this:
% uniq foo
davel
jeffy
jones
jeffy
mark
chuck
bonnie
chuck
Notice that there are still two jeffy lines and two chuck lines. This is because the duplicates were not adjacent. To get a true unique list you have to make sure the stream is sorted:
% sort foo | uniq
bonnie
chuck
davel
jeffy
jones
mark
That gives you a truly unique list. However, it's also a useless use of uniq, since sort(1) has an argument, -u, for this very common operation:

% sort -u foo
bonnie
chuck
davel
jeffy
jones
mark
That does exactly the same thing as "sort | uniq", but takes only one process instead of two. uniq has other arguments that let it do more interesting mutilations on its input:
- -d tells uniq to eliminate all lines with only a single occurrence (delete unique lines), and print just one copy of repeated lines:
% sort foo | uniq -d
chuck
davel
jeffy
mark

- -u tells uniq to eliminate all duplicated lines and show only those which appear once (only the unique lines):
% sort foo | uniq -u
bonnie
jones

- -c tells uniq to count the occurrences of each line:
% sort foo | uniq -c
   1 bonnie
   2 chuck
   3 davel
   2 jeffy
   1 jones
   3 mark
I often pipe the output of "uniq -c" to "sort -n" (sort in numeric order) to get the list in order of frequency:
% sort foo | uniq -c | sort -n
   1 bonnie
   1 jones
   2 chuck
   2 jeffy
   3 davel
   3 mark

- Finally, there are arguments to make uniq ignore leading characters and fields. See the man page for details.
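A minimal sketch of the field-skipping behavior (the sample data is illustrative): with -f N, uniq ignores the first N blank-separated fields when comparing lines:

```shell
# The first field (a sequence number) differs on every line, so plain
# uniq would keep all three lines; -f 1 compares only what follows it:
printf '1 apple\n2 apple\n3 banana\n' | uniq -f 1
# 1 apple
# 3 banana
```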
Tuesday Tiny Techie Tip -- 03 December 1996
#!/usr/bin/ksh93
################################################################
####
#### Program: uniq_k93
####
#### Description: Emulation of the Unix "uniq" command
####
#### Author: Dana French ([email protected])
####         Copyright 2004
####
#### Date: 07/22/2004
####
################################################################
function uniq_k93
{
    typeset TRUE="0"
    typeset FALSE="1"
    typeset OLDPREV=""
    typeset FNAME=""
    # STDIN is FALSE (1) when file arguments were given, TRUE (0) otherwise
    typeset STDIN="${1:+${FALSE}}"
    STDIN="${STDIN:-${TRUE}}"

    if (( STDIN == TRUE ))
    then
        getUniqLine_k93 OLDPREV
    else
        for FNAME in "${@}"
        do
            if [[ -f "${FNAME}" ]]
            then
                exec 0<"${FNAME}"
                getUniqLine_k93 OLDPREV
                exec 0<&-
            else
                print -u2 "ERROR: \"${FNAME}\" does not exist or is not a regular file."
            fi
        done
    fi
    return 0
}

################################################################

function getUniqLine_k93
{
    nameref OLDPREV="${1}"
    typeset PREV="${OLDPREV}"
    typeset CURR
    IFS=""

    # If there is no saved previous line yet, prime with the first line
    # of input and print it
    [[ "_${OLDPREV}" = "_" ]] && read -r -- PREV && print -r -- "${PREV}"

    while read -r -- CURR
    do
        [[ "_${PREV}" = "_${CURR}" ]] && continue
        print -r -- "${CURR}"
        PREV="${CURR}"
    done
    OLDPREV="${PREV}"
    return 0
}

################################################################

uniq_k93 "${@}"
#!/usr/bin/perl -w
# uniq - report or filter out repeated lines in a file
use strict;

my $VERSION = '1.0';

END {
    close STDOUT || die "$0: can't close stdout: $!\n";
    $? = 1 if $? == 255;    # from die
}

sub help {
    print "$0 [-c | -d | -u] [-f fields] [-s chars] [input files]\n";
    exit 0;
}

sub version {
    print "$0 (Perl Power Tools) $VERSION\n";
    exit 0;
}

# options
my ($optc, $optd, $optf, $opts, $optu);

sub get_numeric_arg {    # $_ contains current arg
    my ($argname, $desc, $opt) = @_;
    if (length)   { $opt = $_ }
    elsif (@ARGV) { $opt = shift @ARGV }
    else          { die "$0: option requires an argument -- $argname\n" }
    $opt =~ /\D/ && die "$0: invalid number of $desc: `$opt'\n";
    $opt;
}

while (@ARGV && $ARGV[0] =~ /^[-+]/) {
    local $_ = shift;
    /^-[h?]$/ && help();       # terminates
    /^-v$/    && version();    # terminates
    /^-c$/    && ($optc++, next);
    /^-d$/    && ($optd++, next);
    /^-u$/    && ($optu++, next);
    /^-(\d+)$/  && (($optf = $1), next);
    /^\+(\d+)$/ && (($opts = $1), next);
    s/^-f// && (($optf = get_numeric_arg('f', 'fields to skip')), next);
    s/^-s// && (($opts = get_numeric_arg('s', 'bytes to skip')), next);
    die "$0: invalid option -- $_\n";
}

my ($comp, $save_comp, $line, $save_line, $count, $eof);

# prime the pump
$comp = $line = <>;
exit 0 unless defined $line;
if ($optf) { ($comp) = (split ' ', $comp, $optf + 1)[$optf] }
if ($opts) { $comp = substr($comp, $opts) }

LINES: while (!$eof) {
    $save_line = $line;
    $save_comp = $comp;
    $count = 1;
    DUPS: while (!($eof = eof())) {
        $comp = $line = <>;
        if ($optf) { ($comp) = (split ' ', $comp, $optf + 1)[$optf] }
        if ($opts) { $comp = substr($comp, $opts) }
        last DUPS if $comp ne $save_comp;
        ++$count;
    }
    # when we get here, $save_line is the first occurrence of a sequence
    # of duplicate lines, $count is the number of times it appears
    if    ($optc) { printf "%7d $save_line", $count }
    elsif ($optd) { print $save_line if $count > 1 }
    elsif ($optu) { print $save_line if $count == 1 }
    else          { print $save_line }
}
exit 0;

__END__

=head1 NAME

uniq - report or filter out repeated lines in a file

=head1 SYNOPSIS

uniq [B<-c> | B<-d> | B<-u>] [B<-f> I<fields>] [B<-s> I<chars>] [I<input files>]

=head1 DESCRIPTION

The uniq utility reads the standard input comparing adjacent lines and
writes a copy of each unique input line to the standard output. The
second and succeeding copies of identical adjacent input lines are not
written. Repeated lines in the input will not be detected if they are
not adjacent, so it may be necessary to sort the files first.

The following options are available:

=over

=item B<-c>

Precede each output line with the count of the number of times the
line occurred in the input, followed by a single space.

=item B<-d>

Don't output lines that are not repeated in the input.

=item B<-f> I<fields>

Ignore the first I<fields> fields in each input line when doing
comparisons. A field is a string of non-blank characters separated
from adjacent fields by blanks. Field numbers are one based, i.e. the
first field is field one.

=item B<-s> I<chars>

Ignore the first I<chars> characters in each input line when doing
comparisons. If specified in conjunction with the B<-f> option, the
first I<chars> characters after the first I<fields> fields will be
ignored. Character numbers are one based, i.e. the first character is
character one.

=item B<-u>

Don't output lines that are repeated in the input.

=back

If additional arguments are specified on the command line, they are
used as the names of input files.

The uniq utility exits 0 on success or >0 if an error occurred.

=head1 COMPATIBILITY

The historic B<->I<fields> and B<+>I<chars> options are supported as
synonyms for B<-f> I<fields> and B<-s> I<chars>, respectively. This
version accepts 0 as a valid argument for the B<-f> and B<-s>
switches; some implementations of uniq do not.

=head1 SEE ALSO

sort(1)

=head1 BUGS

I<uniq> has no known bugs.

=head1 AUTHOR

The Perl implementation of I<uniq> was written by Jonathan Feinberg.

=head1 COPYRIGHT and LICENSE

This program is copyright (c) Jonathan Feinberg 1999.

This program is free and open software. You may use, modify,
distribute, and sell this program (and any modified variants) in any
way you wish, provided you do not restrict others from doing the same.
Recommended Links
uniq - Wikipedia, the free encyclopedia
Emulation of uniq Unix command in ksh93 - AIX Expert
Ad Hoc Data Analysis From The Unix Command Line - Wikibooks, collection of open-content textbooks
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...
Disclaimer:
The statements, views, and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the Softpanorama society. We do not warrant the correctness of the information provided or its fitness for any purpose. The site uses AdSense, so you need to be aware of the Google privacy policy. If you do not want to be tracked by Google, please disable JavaScript for this site. This site is perfectly usable without JavaScript.
Created: May 16, 1997; Last modified: March 12, 2019