
Unix find tutorial

Prev | Contents | Next

Part 8: Finding files based on size: largest, empty, and within a certain range


Among categories of files with specific sizes, the most popular are probably zero-length files and "huge" files. You may also be interested in files above a certain size within specific categories, such as tar files, TGZ files, ISO files, etc.

Do not try to delete the files found in the same statement using -exec. Write the output of find into a file, analyze it, and only then run xargs against this file. If you are operating on system directories, do a backup first and run xargs second. You can do a lot of damage using -exec unless you understand what you are doing, and if you use -exec you typically realize that you missed something only when it is too late.
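
For example, a minimal sketch of this two-step workflow (the path and size threshold are purely illustrative, and GNU xargs is assumed):

# Step 1: collect candidates into a file
find /home -type f -size +100M > /tmp/candidates.txt
# Step 2: inspect /tmp/candidates.txt by hand and remove anything questionable
# Step 3: only then act on the verified list (-d '\n' copes with spaces in names)
xargs -d '\n' rm -- < /tmp/candidates.txt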

The -size predicate has the following syntax:

-size [+|-]n[bckwMG]

True if the file uses n units of space, rounding up. The units are 512-byte blocks by default, but they can be changed by adding a one-character suffix to n:

b - 512-byte blocks (the default)
c - bytes
w - two-byte words
k - kilobytes (1024 bytes)
M - megabytes (1048576 bytes)
G - gigabytes (1073741824 bytes)

+ and - mean above and below the given size, respectively. If you use two -size predicates in the same expression, you can find files within a certain range of sizes. This is useful for finding "lost" files: when you know that a file exists but forgot its exact name and path.

Zero-length files

Zero-length files might formally be considered a subclass of abnormal files, but in reality they are often used legitimately in software installations. Another common task is finding the largest files in a filesystem; in this case you can also sort the results in descending order to make it easier to decide which files to delete and which to move in order to free space.
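
For instance, a quick way to list the 20 largest files under /home in descending order (a sketch assuming GNU find, which supports -printf):

# %s prints the size in bytes, %p the path; sort -rn orders largest first
find /home -type f -printf '%s\t%p\n' | sort -rn | head -20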

To find all zero length files:

find /home -type f -size 0 -ls 

You can also use the special predicate -empty:

find /home -empty -exec ls {} \;
After finding empty files, you might choose to delete them by replacing the ls command with rm. But it is better to verify the list instead of jumping the gun... Never do it from the root directory, only from second-level directories that you know are safe for deleting such files. Some zero-length files are legitimate entries in /etc and other system directories.

Finding files within a certain size range

Another unfortunate but rather common situation is when you destroyed certain files due to a bug in a script and they all have a certain size. In this case searching for this size can give you a list that defines the scope of the damage and allows you to restore the files from backup.

You can combine several -size predicates to get a "range", which is useful if you forgot the name of a file but know its approximate size and some other parameters (for example, the age of the file):
find /home -type f -size +12k -size -15k -mtime -3 -ls

You can specify the size in bytes, not only in larger units (see the suffix list above). For example:

find /home -type f -size +1024c -size -2048c

Finding "huge" files

Excluding certain parts of the tree from searching

As with the abnormal files discussed in the previous section, it is important to exclude certain parts of the directory tree from searching. For example, some zero-length files are legitimate entries in /etc and other system directories. You can use all the approaches discussed in section 6, but it is often more practical and simpler to use a small Perl script to weed out "junk" directories. For example:

# Weed out zero-length files in directories where they are legitimate
my @pattern = ('/etc/', '/var/lock/', '/var/run/');   # prefixes to skip -- adjust to your system
open(FINDOUT, "find / -type f -size 0 -ls |") or die "cannot run find: $!";
while (my $line = <FINDOUT>) {
   my $found = grep { index($line, $_) > -1 } @pattern;
   print $line if $found == 0;
}
close(FINDOUT);

Cleaning the filesystem

The most typical situation for a professional system administrator is an overflowing disk filesystem. Typically users put too many files into one of the filesystems, and such a filesystem needs a cleanup, an extension of its size, or both. This is especially typical for servers that serve research groups.

If you have spare disk capacity, extend the size and think about cleanup later. The problem arises when you no longer have spare disk capacity. Then you need to resort to a cleanup, which is a dangerous operation that needs to be planned and executed very carefully. You should be extremely careful not to create a "man-made disaster" instead of a cleanup ;-). Here are a few steps that might help:

First you need to find the largest files and directories in the particular filesystem. For example:

du -a /home | sort -n -r | head -n 50 | tee /tmp/largest

To find the largest files in a particular directory you can use ls (the -S option sorts by size):

ls -lSh . | head -12

Typically a cleanup starts with finding tar and TGZ files downloaded by users. If such files are packages downloadable from the net, they can be deleted first, as this is clearly a safe thing to do:

find . -name "*\,tar" -ls

As users download the same packages again and again, you can create for yourself a list of well-known software packages and automate the process.

find . -name "*\.tgz"   | xargs ls -l | tee /tmp/current_tgz.txt
for f in /root/Config/known_tgz.txt ; do  
   if grep $f  /tmp/current_tgz.txt ; then  
      echo "Known software package $f is present and should be deleted"
      known=`grep $f  /tmp/current_tgz.txt`
      if [ -f $known ] ;  then
         rm known

After that you need to find the "oldest" large files on the filesystem. Files that are more than a year old are often good candidates for moving to backup storage.

If "huge" files found are compressible, they can be compressed with gzip or bzip2, but such command should run only in particular directory, not globally.

find . -name "*\.tar" -exec gzip {} \;
You can also use du on some directories:
du --max-depth=1 -x -h /

Finding and eliminating duplicate files can also free a noticeable amount of space, especially if such files are huge. And to return to the "oldest" large files mentioned above, the following command lists files over 500MB that have not been modified for more than a year:

find /home -size +500M -mtime +365 -ls
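
One possible sketch for spotting duplicates among such large files is to checksum them and print the groups with repeated hashes (assumes GNU coreutils; the size threshold is illustrative):

# md5 hashes are 32 hex characters, so uniq -w32 compares only the hash field
find /home -type f -size +500M -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate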

Cleaning filesystems without a backup can lead to a SNAFU

WARNING: Using find for cleaning filesystems can lead to a SNAFU if files in different directories have identical names: when two such files are selected by find, only the one moved last survives. In other words, the command

find /Apps -size +500M -mtime +365 -exec mv {} /Backup \;

will destroy files in different directories that have identical names, preserving only the one that was moved last. It flattens the directory structure into a single directory, so files with identical names overwrite each other. You can use the dirname utility to recreate the source path under /Backup and avoid that. Note that backticks around dirname {} would be expanded by the shell once, before find runs, so dirname has to be invoked in a per-file subshell:

find /Apps -size +500M -mtime +365 -exec sh -c 'mkdir -p "/Backup$(dirname "$1")" && mv "$1" "/Backup$(dirname "$1")"' _ {} \;

If there are not too many such directories, it is much better and safer to use Midnight Commander to move older directories to another location; it automatically prevents this SNAFU.

In any case, if you use a find command like the one above, you first need to ensure, after detecting the files to move, that there are no files with identical names in different directories. The simplest way of avoiding collisions is to add a suffix with the file's creation date to each name.
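
A quick pre-flight check for such collisions, assuming GNU find (its -printf '%f' prints just the basename), is to look for basenames that occur more than once in the selection:

# uniq -d prints only names that appear more than once
find /Apps -size +500M -mtime +365 -printf '%f\n' | sort | uniq -d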

You can also use rsync to copy selected directories to a remote location.
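
For example, rsync's -R (--relative) option recreates the full source path under the destination, which sidesteps the name-collision problem (backuphost and the directory names here are hypothetical):

# produces /Backup/Apps/oldproject and /Backup/Apps/archive2017 on the remote side
rsync -aR /Apps/oldproject /Apps/archive2017 backuphost:/Backup/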

Or you can postprocess the list, select the directories, and then feed the list via xargs to a copy command, as in the sketch below.
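
A sketch of that approach, assuming GNU cp (its --parents option recreates the source directories under the target):

find /Apps -size +500M -mtime +365 > /tmp/tomove.txt
# review and edit /tmp/tomove.txt by hand, then:
xargs -d '\n' cp --parents -t /Backup < /tmp/tomove.txt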


Old News ;-)

Command Line Magic @climagic Feb 23

find music -name '*.mp3' -mtime +365 -a -size +10M -ls # Find and long list mp3 files in Music dir older than a year and larger than 10MB.

find . -empty -type d # List of empty subdirectories of current directory.


Prev | Contents | Next

