[TriLUG] Hunting Down Large Files
Jeff Groves
jgroves at krenim.org
Thu Mar 17 08:37:03 EST 2005
Aaron S. Joyner wrote:
> And the icing on the cake, if you want one command which tells you the
> 20 largest files on your filesystem, you can use something like this:
> find / -type d -exec ls -sS \{\} \; > /tmp/asjout.tmp; cat
> /tmp/asjout.tmp | sort -rn | head -20
>
> It's only sort of one command as I had to cheat and use a temporary
> file, because of the way find generates its output. :( Also, beware
> that this command can take a while to run. It also doesn't do any
> throwing-away of error output, so you'll want to run it as root mostly
> to get accurate counting and also to minimize trash on the screen.
> Running across my 250G of disk space in one box, which is mostly full,
> it took about 69 seconds. That wasn't my first time running find
> though, so it was probably all cached at that point. Your results may
> vary. :)
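For anyone who wants to try the quoted pipeline without touching a whole
filesystem, here's a sketch that exercises the same idea on a small throwaway
tree (directory and file names are made up for the example; the temp file is
replaced by a direct pipe, which works fine in most shells):

```shell
# Build a tiny test tree with files of known sizes.
dir=$(mktemp -d)
mkdir "$dir/sub"
dd if=/dev/urandom of="$dir/a.dat" bs=1k count=100 2>/dev/null
dd if=/dev/urandom of="$dir/sub/b.dat" bs=1k count=300 2>/dev/null

# Same shape as the quoted command: ls -sS each directory, then sort
# numerically in reverse so the largest files float to the top.
out=$(find "$dir" -type d -exec ls -sS {} \; | sort -rn | head -20)
echo "$out"

rm -rf "$dir"
```

The largest file (b.dat here) should come out on the first line; the "total"
lines ls emits sort as zero and sink to the bottom.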
>
One more iteration: the command below assumes the file system that is
full is /. If some other file system is full, substitute its mount
point for /:
find / -size +300k -type f -xdev -ls 2> /dev/null | sort -k 7n
This finds all files (-type f) on the / file system _only_ (-xdev) that
are greater than 300 kilobytes (-size +300k) in size. Using "2>
/dev/null" gets any error messages out of your hair, and "sort -k 7n"
sorts the listing numerically by the seventh column, which is the
file size.
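To see the command's behavior without running it against /, here's a sketch
on a small throwaway tree (file names and sizes are invented for the
example):

```shell
# Build a test tree with one file above and one below the 300k threshold.
dir=$(mktemp -d)
dd if=/dev/urandom of="$dir/big.dat" bs=1k count=400 2>/dev/null   # ~400 KB
dd if=/dev/urandom of="$dir/small.dat" bs=1k count=10 2>/dev/null  # ~10 KB

# Same flags as above: files only, larger than 300k, don't cross mount
# points, long listing, errors discarded, sorted by the size column (7).
out=$(find "$dir" -size +300k -type f -xdev -ls 2>/dev/null | sort -k 7n)
echo "$out"

rm -rf "$dir"
```

Only big.dat should appear in the output; small.dat falls under the -size
threshold and is filtered out before the listing is ever produced.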
Jeff G.
--
Law of Procrastination:
Procrastination avoids boredom; one never has
the feeling that there is nothing important to do.