We basically have a nasty issue with backup copies of random stuff being created and then left lying around. I want to implement a standardized naming convention and have a cron task go through some specific directories and find junk older than so many days that matches a specific pattern. Now, this might be something that's already been done and I'm reinventing the wheel; if so, please do enlighten me.
Step one is to implement a small bash function in users' profiles that they will use instead of plain cp -pr.
It would cp the file, appending the backup date, user, and retention information like so:

    filename.backupdate_username_retentioninmonths

e.g. httpd.conf.20250101_jsmith_6mo. It needs to use $LOGNAME, but check for $SUDO_USER and use that instead when it's set. By default it would use a 6-month retention, but if the -r <months> flag is passed it would use that (in months).
Here is the start of that bash function that I slammed together (it doesn't have the optional retention stuff in yet):
    backup() {
        # prefer the real user when running under sudo
        local user retention=6 file new
        if [ -n "$SUDO_USER" ]; then
            user=$SUDO_USER
        else
            user=$LOGNAME
        fi
        # @TODO: accept -r <mo> argument for dynamic retention
        for file in "$@"; do
            new=${file}.$(date '+%Y%m%d')_${user}_${retention}mo
            # -e rather than -f so existing directory backups are detected too
            while [[ -e $new ]]; do new+="~"; done
            printf "copying '%s' to '%s'\n" "$file" "$new"
            \cp -ipr "$file" "$new"
        done
    }
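For the @TODO, here's an untested sketch of the -r handling using bash's built-in getopts; note OPTIND has to be made local inside a function, otherwise a second call starts parsing where the first left off:

    backup() {
        local user retention=6 opt OPTIND file new
        while getopts 'r:' opt; do
            case $opt in
                r) retention=$OPTARG ;;
                *) printf 'usage: backup [-r months] file...\n' >&2; return 1 ;;
            esac
        done
        shift $((OPTIND - 1))   # leave only the file arguments in "$@"
        if [ -n "$SUDO_USER" ]; then user=$SUDO_USER; else user=$LOGNAME; fi
        for file in "$@"; do
            new=${file}.$(date '+%Y%m%d')_${user}_${retention}mo
            while [[ -e $new ]]; do new+="~"; done
            printf "copying '%s' to '%s'\n" "$file" "$new"
            \cp -ipr "$file" "$new"
        done
    }

So backup -r 12 httpd.conf would produce something like httpd.conf.20250101_jsmith_12mo (date and user depending on who runs it, of course).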
edit: it would also be cool if the function could accept a source AND a target dir, basically mirroring what cp does, just appending the username, backup date, and retention info.
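A rough sketch of that, extending the version above: like cp, if the last of two or more arguments is a directory, treat it as the target. (The check is naive; backup dir1 dir2 will always treat dir2 as the destination, even if you meant it as a second source.)

    backup() {
        local user retention=6 opt OPTIND target='' file new
        while getopts 'r:' opt; do
            case $opt in
                r) retention=$OPTARG ;;
                *) printf 'usage: backup [-r months] file... [dir]\n' >&2; return 1 ;;
            esac
        done
        shift $((OPTIND - 1))
        # cp-style: if the last of 2+ args is a directory, copy into it
        if [ "$#" -gt 1 ] && [ -d "${!#}" ]; then
            target=${!#}
            set -- "${@:1:$#-1}"    # drop the target from the arg list
        fi
        if [ -n "$SUDO_USER" ]; then user=$SUDO_USER; else user=$LOGNAME; fi
        for file in "$@"; do
            if [ -n "$target" ]; then
                new=$target/$(basename "$file")
            else
                new=$file
            fi
            new=$new.$(date '+%Y%m%d')_${user}_${retention}mo
            while [[ -e $new ]]; do new+="~"; done
            printf "copying '%s' to '%s'\n" "$file" "$new"
            \cp -ipr "$file" "$new"
        done
    }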
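For the step-two cleanup, a rough cron-driven sketch; the directory list is hypothetical, it assumes GNU date for the expiry math, and I'd run it with echo instead of rm for a while before trusting it:

    #!/bin/bash
    # prune-backups.sh -- remove backups (made by backup()) past their retention
    dirs=(/etc /usr/local/etc)      # hypothetical; list the dirs to sweep
    today=$(date '+%Y%m%d')

    # -prune keeps find from descending into directory backups it just matched
    find "${dirs[@]}" -name '*.[0-9]*_*_[0-9]*mo*' -prune -print0 |
    while IFS= read -r -d '' f; do
        # pull the date stamp and retention months back out of the suffix
        if [[ $f =~ \.([0-9]{8})_[^_]+_([0-9]+)mo ]]; then
            stamp=${BASH_REMATCH[1]}
            months=${BASH_REMATCH[2]}
            # expiry = backup date + retention (GNU date syntax)
            expires=$(date -d "$stamp +$months months" '+%Y%m%d')
            if (( today > expires )); then
                echo "expiring '$f'"
                rm -rf -- "$f"
            fi
        fi
    done

Then something like this in root's crontab to run it nightly:

    30 3 * * * /usr/local/sbin/prune-backups.sh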