Cron jobs to clean up old logs

Filed under: TechNotes — lars @ 11:55:15 pm

I often find that I want to periodically delete files older than a certain age from a given directory, so that disks don't slowly fill up with logs or other temporary data. The approach I've used to do this under Linux is a little shell scripting, something like the following:

find /path/to/logs -type f -mtime +28 | xargs --delimiter='\n' rm

This will look for files (-type f) in the directory /path/to/logs that haven't been modified in more than 28 days (-mtime +28), and then execute the command 'rm', passing each file found as an argument.
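If you want to see what would be deleted before hooking this up to anything, the find part can be run on its own as a dry run:

find /path/to/logs -type f -mtime +28

And if the filenames might contain spaces or other awkward characters, a null-delimited variant of the same pipeline (using find's -print0 together with xargs -0) is a bit more robust:

find /path/to/logs -type f -mtime +28 -print0 | xargs -0 rm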

You can substitute any command for 'rm', for example another script/command that will gzip your logs and move them to an archive directory:

find /path/to/logs/ -type f -mtime +28 | xargs --delimiter='\n' /home/build/archive_log.sh

Note that the above will call archive_log.sh with an argument for each result of find (so archive_log.sh would need to iterate through its arguments). If instead you want archive_log.sh to be called once per file, you can use find's -exec option:

find /path/to/logs/ -type f -mtime +28 -exec /home/build/archive_log.sh {} \;
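I haven't shown archive_log.sh itself; as a rough sketch (with /path/to/archive as a made-up destination), a version that loops over its arguments, and so works with both the xargs form and the -exec form above, could look like this:

#!/bin/sh
# Sketch of archive_log.sh: gzip each file given as an argument and
# move the compressed copy into an archive directory.
ARCHIVE_DIR=/path/to/archive    # hypothetical location, adjust to taste
for f in "$@"; do
    gzip "$f"                   # replaces "$f" with "$f.gz"
    mv "$f.gz" "$ARCHIVE_DIR"/
done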

Another variation is as follows:

find /path/to/logs -mindepth 1 -maxdepth 1 -type d -mtime +28 | xargs --delimiter='\n' rm -r

This searches for directories (-type d) exactly one level under /path/to/logs instead of files (-mindepth 1 keeps find from matching /path/to/logs itself), and removes each one recursively with 'rm -r', since plain 'rm' won't delete directories.

Then it's just a matter of scheduling the command as a cron job to run every day/week/month. I'm not going to explain cron itself here; you can read about it on Wikipedia if you want.
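(For reference only, since cron syntax isn't the point of this post: a crontab entry along the following lines would run the cleanup once a day, with the 2:30 am time chosen arbitrarily.)

30 2 * * * find /path/to/logs -type f -mtime +28 | xargs --delimiter='\n' rm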
