[TriLUG] Hunting Down Large Files

Chip Turner cturner at pattern.net
Thu Mar 17 02:58:13 EST 2005


Who needs tools when the OS comes with all you need!

# top five largest files/dirs
du -s * | sort -n | tail -5
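
If your sort is new enough to support -h (recent GNU coreutils), the same
thing with human-readable sizes:

# top five largest files/dirs, human-readable sizes
du -sh * | sort -h | tail -5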

# largest 5 files
ls -lSr | tail -5
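
ls only looks at the current directory; if you want the biggest individual
files anywhere under the tree, GNU find can print byte sizes for sorting:

# largest 5 files anywhere below the current directory (GNU find)
find . -type f -printf '%s %p\n' | sort -n | tail -5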

# largest 5 space consumers, either dirs or files, recursively
du | sort -n | tail -5

Add a -x to du if you don't want to walk across filesystem boundaries.
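
For example, to size up / without descending into other mounted filesystems:

# largest 5 space consumers on the root filesystem only
du -x / | sort -n | tail -5

And for the "report files over some limit" part of your question, find can do
that directly; the 100M threshold here is just an example, and -maxdepth plus
the M size suffix are GNU find extensions:

# files over 100MB in the current directory (drop -maxdepth 1 to recurse)
find . -maxdepth 1 -type f -size +100M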

HTH,
Chip

Aaron Bockover <abockover.trilug at aaronbock.net> writes:

> Nautilus gave me a shocking message a few moments ago: not enough disk
> space.
>
> A quick df -h showed that my primary volume was indeed 100% used. Sad
> really. I am a data hoarder. It's compulsive.
>
> My question is this: is there a shell tool or script in existence that
> can be run on a given base directory and calculate the total size of the
> *files* in a directory (not recursively), and report those larger than
> some limit? Also, something to report individual files larger than some
> limit?
>
> If a tool like this exists, it might help me reduce some of my clutter.
> I know I should probably get rid of those massive ISOs from five years
> ago, but what if I need RH 6 next week?! I'm trying to avoid that route.
>
> If something like this doesn't exist, I think I may have to write one.
> I'd love to hear thoughts on how others manage their heaps of data.
> Fortunately most of mine is somewhat organized, but organization comes
> in phases... dig through data -- organize data -- collect data --
> realize data is unorganized -- repeat.
>
> Regards,
> Aaron Bockover
>
>
> -- 
> TriLUG mailing list        : http://www.trilug.org/mailman/listinfo/trilug
> TriLUG Organizational FAQ  : http://trilug.org/faq/
> TriLUG Member Services FAQ : http://members.trilug.org/services_faq/
> TriLUG PGP Keyring         : http://trilug.org/~chrish/trilug.asc
>

-- 
Chip Turner                   cturner at pattern.net
