7 day file backup script?


Forum: Apps
Topic: 7 day file backup script?
started by: Big_Pc_Man

Posted by Big_Pc_Man on Jan. 21 2008,18:51
Hello All,
Can anyone provide a 7-day round-robin file backup script? I'm trying to back up a file once a day so that I have a weekly archive of the file. It doesn't need to be an incremental backup, and the file can be renamed xyz1-10. Brute force will do. I already have cron running, so the script can be run from cron once a day.

Posted by lucky13 on Jan. 21 2008,22:46
Not sure I understand your goal. What kind of file? Are you just trying to write like a daily log into a weekly file so you, in effect, end up with 52 weekly logs at the end of the year?

If that's all it is, you can do it with a couple of simple lines in cron. The first would run weekly (depending on which day you want to start), either by itself or via a script, and touch a file named using date so you end up with a file called "file-year-weeknumber.log" or whatever. The next would run daily and cat whatever file you want logged onto the file opened by the first job. A third would run on whichever day you choose, to clear out the previous daily entries.

Is that what you mean?

Posted by lucky13 on Jan. 22 2008,00:25
I saw you were on after I replied but you didn't answer. If what I wrote is what you want to do, you can use a script like this to start the weekly log:
Code Sample
#!/bin/bash
lognumber=$(date +%Y%W)
exec touch filename.$lognumber

Give it a name (logmaker), chmod +x it, put it in $PATH (such as /opt/bin), and add a weekly cron job for it. Run this week, it gives you a file named "filename.200803"; the next week would be the same but 04. That will give you sequential weekly logs, if that's what you want. This step is also optional -- the next script should create the file if it doesn't already exist, but my preference is to use touch even if it's an extra step.
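For reference, the crontab entries for this scheme might look something like the following sketch (the times, and the name "dailylog" for the daily script below, are placeholders I made up, not anything fixed):

```shell
# hypothetical crontab entries -- adjust times and paths to taste
# open the new weekly file every Monday just after midnight
5 0 * * 1   /opt/bin/logmaker
# append to it daily at 23:55 (the daily script below)
55 23 * * * /opt/bin/dailylog
```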

EDIT: Cleaned up everything and am leaving just this part to reduce possible confusion. This would be the daily script to set in your crontab...
Code Sample
#!/bin/bash

# set variables
lognumber=$(date +%Y%W)
logday=$(date [whatever you want -- standard date, day of week, etc.])

# if you want a timestamp:
echo $logday >> filename.$lognumber

# this line does what you seem to want:
cat thefileyouwantlogged >> filename.$lognumber

# the next two entries are to clear and restart the daily log file:
rm thefileyouwantlogged
touch thefileyouwantlogged


The last two entries are redundant if the file already resets itself daily anyway. You can also leave out the logday and echo $logday entries if you don't care about timestamping each entry. Set the logday to whatever variables you want (Google: linux man date). Use full paths for "thefileyouwantlogged" and "filename.$lognumber" so you won't have any drama.
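Filling in the placeholders, here's a self-contained version you can run as-is to see the mechanics -- it uses throwaway files under mktemp, and all the names are made up:

```shell
#!/bin/sh
# demo of the daily append-and-reset scheme, with throwaway files
workdir=$(mktemp -d)
daily="$workdir/thefileyouwantlogged"   # stands in for your real daily file
echo "today's data" > "$daily"

lognumber=$(date +%Y%W)                 # year + week number, e.g. 200803
logday=$(date +%a)                      # short day name as a timestamp

echo "$logday" >> "$workdir/filename.$lognumber"
cat "$daily"   >> "$workdir/filename.$lognumber"

# clear and restart the daily file
rm "$daily"
touch "$daily"
```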

I don't know what you meant by brute force. Or even if this is what you wanted. Many ways to skin a cat. Like I said, I would do a weekly job to touch the weekly log file.

Posted by chaostic on Jan. 22 2008,06:18
If you just want to back it up, this is how Apple does it for the OSX daily/weekly/monthly maintenance scripts (which are based on the Berkeley BSD scripts):

Code Sample

#!/bin/sh -
echo ""
printf %s "Rotating log files:"
cd /var/log
for i in system.log; do
   if [ -f "${i}" ]; then
       printf %s " ${i}"
       if [ -x /usr/bin/gzip ]; then gzext=".gz"; else gzext=""; fi
       if [ -f "${i}.5${gzext}" ]; then mv -f "${i}.5${gzext}" "${i}.6${gzext}"; fi
       if [ -f "${i}.4${gzext}" ]; then mv -f "${i}.4${gzext}" "${i}.5${gzext}"; fi
       if [ -f "${i}.3${gzext}" ]; then mv -f "${i}.3${gzext}" "${i}.4${gzext}"; fi
       if [ -f "${i}.2${gzext}" ]; then mv -f "${i}.2${gzext}" "${i}.3${gzext}"; fi
       if [ -f "${i}.1${gzext}" ]; then mv -f "${i}.1${gzext}" "${i}.2${gzext}"; fi
       if [ -f "${i}.0${gzext}" ]; then mv -f "${i}.0${gzext}" "${i}.1${gzext}"; fi
       if [ -f "${i}" ]; then
             touch "${i}.$$" && chmod 640 "${i}.$$" && chown root:admin "${i}.$$"
             mv -f "${i}" "${i}.0" && mv "${i}.$$" "${i}" && if [ -x /usr/bin/gzip ]; then
               gzip -9 "${i}.0"; fi
       fi
   fi
done



In sh or bash, the script checks which numbered copies of "file-name" exist, moves each one up a number, compresses the newly rotated copy, and creates a fresh empty file.

This script keeps a total of 8 files (the original plus original.0-6.gz). Add/remove "if [ -f "${i}.5${gzext}" ]; then mv -f "${i}.5${gzext}" "${i}.6${gzext}"; fi" statements as needed. Just keep in mind that this script would be run daily and will always keep those 8 files around, force-overwriting the oldest one each day -- so once the chain is full, original.6.gz gets overwritten every day.
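For what it's worth, the ladder of mv lines can be collapsed into a loop. Here's a sketch (my own, not Apple's code) of the same numbered rotation, demonstrated on throwaway files under mktemp:

```shell
#!/bin/sh
# numbered rotation written as a loop: file.(N-1) -> file.N, ..., file -> file.0
dir=$(mktemp -d)
file="$dir/system.log"
keep=6                       # highest numbered copy to keep around

rotate() {
    i=$keep
    while [ "$i" -gt 0 ]; do
        prev=$((i - 1))
        [ -f "$file.$prev" ] && mv -f "$file.$prev" "$file.$i"
        i=$prev
    done
    [ -f "$file" ] && mv -f "$file" "$file.0"
    touch "$file"
}

echo "day one" > "$file"
rotate                       # system.log.0 now holds "day one"
echo "day two" > "$file"
rotate                       # .0 moves to .1, new .0 holds "day two"
```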

A weekly script to back up the files (if you actually want a copy of every file for every week, and not just a backup of the last 7 days) would simply be:

Code Sample

#!/bin/sh -
mkdir /path/to/weekly/backup/$(date +%W-%y)
cp /path/to/files/filename.[0-6].gz /path/to/weekly/backup/$(date +%W-%y)/

Run that on either day 1 of the week before the daily script or day 7 of the week after the daily script (After the file has been created and is done being used).

Also, if you don't need to gzip the file, just remove the "&& if [ -x /usr/bin/gzip ]; then gzip -9 "${i}.0"; fi" part of the daily script.

Posted by mikshaw on Jan. 22 2008,12:37
That OSX script is terrible =o)

Posted by Big_Pc_Man on Jan. 22 2008,19:39
Thanks to all for the ideas.

Specifically what I'm doing is once a day creating a zip file of a critical directory and then copying it over to a dedicated usb stick. I simply want to automate this process and maintain a week long archive.

It's been a long time since I wrote scripts so I'm quite rusty at interpreting what the offered examples do.  I'll start studying up but would appreciate a little more explanation as to how each example works. For instance I have no idea why the OSX script would be bad.

Posted by ^thehatsrule^ on Jan. 22 2008,19:53
If I get what you mean, it can be something like (using easy-to-read/change variables to see what it does):
Code Sample
folder_to_backup="/path/to/myfolder"
backup_name="mybackupname"
backup_location="/path/to/mybackuplocation"

backup_name="${backup_name}-`date +%a`.tar.gz"
cd "$backup_location"
rm -f "$backup_name"
tar zcvf "$backup_name" "$folder_to_backup"
Basically this will use filenames based on the day of the week... and will remove last week's if it exists before making a .tar.gz out of your specified folder.
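Since `date +%a` only ever yields seven values (locale permitting), the archive names cycle weekly. With GNU date's -d option you can preview the whole set; "mybackupname" here is just the example name from the script above:

```shell
#!/bin/sh
# preview the seven archive names the day-of-week scheme cycles through
# (the -d option requires GNU date)
for d in 0 1 2 3 4 5 6; do
    date -d "last Sunday + $d day" +"mybackupname-%a.tar.gz"
done
```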

Quote
For instance I have no idea why the OSX script would be bad.
At first glance, I think it was a reference to the if-statements.

fyi: this seems more of a scripting thread than a mydsl one...?

Posted by lucky13 on Jan. 22 2008,20:43
Quote
I'm trying to backup a file once a day such that I have a weekly archive of the file...

is different from...
Quote
creating a zip file of a critical directory and then copying it over to a dedicated usb stick.


anyway...
Quote
I'm quite rusty at interpreting what the offered examples do.

And I thought I was too pedantic with my explanation of a rather simple script. I was under the impression from the first quote above that you wanted to create something like a weekly log of a particular file. Let me know what's unclear about either script I offered and I'll help knock some of your rust off.

^thehatsrule^ is right about the if statements and that this is in the wrong section and belongs in scripting.

Posted by Big_Pc_Man on Jan. 22 2008,20:47
Thanks for the script ^thehatsrule^.

OK, I deleted my previous response because it finally sank in how the script works. It just makes the name of the backup file whatever it is plus the day of the week and if it already exists deletes it. The backups will roll around the week with each day in the archive until the same day arrives a week later. Nice and simple. I must be a little thick today.

Posted by Big_Pc_Man on Jan. 22 2008,20:54
I do understand lucky13's scripts; I was referring to the OSX one. Thank you, lucky13, for the examples.

Posted by ^thehatsrule^ on Jan. 22 2008,21:48
EDIT: heh, I just made a nice post with quotes, and then just saw what happened :P  I'll leave this part in:

Obviously, this method only works if you run this daily (as you indicated earlier).

On "find":
Quote
I found this little piece of code that locates a back in time date. Would this be a better approach to delete the week old file?
Sure, you could adapt that and use it instead... which would remedy the above exception.  Make sure you don't use that directory for anything else though.  If you decide to do this, you may as well use a full date format too.

Posted by Big_Pc_Man on Jan. 22 2008,23:36
Alright, it looks like a simple and interval-flexible approach would be:

Code Sample
tar zcvf /archive_folder/"my_backup_file_name-`date`"  /folder_to_be_backed_up
find /archive_folder/* -mtime +7 -exec rm {} \;


Where 7 could be changed to any number of days, and the cron job could be run on any multi-day interval, including daily. I'm not sure what the date % options need to be to get an acceptable file name format with a complete date.

Posted by lucky13 on Jan. 22 2008,23:51
Quote
. I'm not sure what the date % options need to be to get an acceptable file name format with a complete date.

Go back and look at my scripts. And what I wrote beneath them.

Posted by Big_Pc_Man on Jan. 23 2008,00:18
Looks like %Y%j will give me 2008022 where the 022 will be the day of the year. That will work fine for my needs. So I end up with:

Code Sample
tar zcf /archive_folder/"my_backup_file_name-`date +%Y%j`".tar.gz  /folder_to_be_backed_up
find /archive_folder/* -mtime +7 -exec rm {} \;
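If you want to sanity-check the find cleanup before pointing it at real backups, here's a throwaway demo (touch -d needs GNU touch; note that `find "$dir" -type f` is a bit safer than the shell-glob form, since it skips subdirectories and can't hit an argument-list limit):

```shell
#!/bin/sh
# demo: files older than 7 days get removed, fresh ones survive
dir=$(mktemp -d)
touch "$dir/fresh.tar.gz"
touch -d "10 days ago" "$dir/stale.tar.gz"   # GNU touch: backdate the mtime
find "$dir" -type f -mtime +7 -exec rm {} \;
```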

Posted by chaostic on Jan. 25 2008,07:19
Quote (mikshaw @ Jan. 22 2008,07:37)
That OSX script is terrible =o)

Pray tell?
Posted by mikshaw on Jan. 25 2008,15:23
Quote
I'm not sure what the date % options need to be to get an acceptable file name format with a complete date.
< http://www.linuxmanpages.com/man3/strftime.3.php >

Quote
Pray tell?
It's just very sloppy. Not that it doesn't work, but its code is redundant, there are unnecessary commands and unnecessarily repeated commands. I wasn't being completely serious, just thought it was funny for a script included with something as big and shiny as OSX.

Posted by lucky13 on Jan. 25 2008,15:35
Quote
just thought it was funny for a script included with something as big and shiny as OSX.

Apple hires more design people than programmers. Not that there's anything wrong with design people, but OSX needs some serious fixes. Badly written scripts are just the tip of their iceberg...
< http://lucky13linux.wordpress.com/2007....n-vista >
< http://lucky13linux.wordpress.com/2007....m-apple >
< http://lucky13linux.wordpress.com/2007....x-users >
Etc...

Posted by chaostic on Jan. 26 2008,13:37
Quote (lucky13 @ Jan. 25 2008,10:35)
Quote
just thought it was funny for a script included with something as big and shiny as OSX.

Apple hires more design people than programmers. not that there's anything wrong with design people, but OSX needs some serious fixes. Badly written scripts are just the tip of their iceberg...
< http://lucky13linux.wordpress.com/2007....n-vista >
< http://lucky13linux.wordpress.com/2007....m-apple >
< http://lucky13linux.wordpress.com/2007....x-users >
Etc...

One, the scripts were written by Berkeley, not Apple.

Two, that OSX to Windows bug comparison is bullshit. It includes programs NOT installed by default on OSX or created, supported, or used by Apple, other than the Mac being the system they run on (a php file for SquirrelMail, CVE-2007-3944); it includes duplicates (CVE-2007-0229 and CVE-2007-0236, in Jan and March); and it includes non-OS programs (iChat -- how many times are MSN Messenger bugs included in the comparison on the Windows side? Not once). It also includes bugs in BETA software that does not come with OSX by default and is not an automatic update (Safari 3 beta, CVE-2007-3944, CVE-2007-3742).

Three, those QuickTime bugs you mention are all on Windows, not in OSX.

So keep spreading biased, baseless fud.

Posted by Big_Pc_Man on Jan. 26 2008,15:15
Hello Again,
I'm using the tar command as stated in the previous post to back up and compress a USB stick (200MB) photo folder. I'm now using [nice -n 18 tar..... &] thinking this would reduce the CPU usage and power. I hate hearing the CPU fan running, and the computer gets a little flaky running full out for a long time. However, this hasn't accomplished much, since the CPU still runs the task full bore when not much else is going on.

Is there a way to force the tar task to only use some percentage of the cpu cycles?

I realize this is getting a bit far afield from the thread subject.
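(For what it's worth: nice only lowers scheduling priority, so on an otherwise idle box the job still gets close to 100% of the CPU -- nice -n 19 is as polite as it gets, and third-party tools like cpulimit exist to cap a process at a percentage, if one is packaged for your system. What actually reduces the total CPU work is a lighter gzip level; gzip honors a GZIP environment variable, so something like the sketch below, with made-up throwaway paths, trades a bigger archive for less CPU per byte.)

```shell
#!/bin/sh
# lighter compression plus maximum niceness; paths are throwaway examples
src=$(mktemp -d)
echo "pretend photo data" > "$src/photo.jpg"
GZIP=-1 nice -n 19 tar czf "$src/backup.tar.gz" -C "$src" photo.jpg
```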

Posted by lucky13 on Jan. 26 2008,15:18
1. Apple included them. That tells you either (a) they're ignorant or (b) incompetent. Which is it?
2. It's a very fair comparison -- apples to apples, if you like. Users generally install a lot more than what comes on the base system. MS exploits are measured in the same manner, not strictly what comes in a base install of Windows. Apple can't spin its way out of this.
3. Those Quicktime bugs are from Apple's handiwork, not Microsoft's. It's APPLE who's written sloppy code that allows all kinds of malicious content to come in via their product. And their own products are insecure for their own OS. For example:
< http://www.gnucitizen.org/blog/backdooring-mp3-files/ >
< http://lucky13linux.wordpress.com/2007/12/17/mac-idisk-flaw/ >

Want a fair comparison from a security expert who actually uses a Mac? (And pwns them.)
< http://lucky13linux.wordpress.com/2007....-mac-os >

I'm not biased. I'm not the one with money tied up in Apple hardware aside from an old early PowerPC and a few old parts. I was offered an early iMac, but I had no use for something with flowers on its case. I also won't use their software for the aforementioned reasons -- namely it's unsatisfactorily coded and, more often than not, very insecure. It's not FUD to relate the gravity of OS and software insecurity, especially when it relates to a company spending millions of dollars to convince people that their software is inherently safer than something else. When it isn't. Maybe it's another form of insecurity you need to deal with. But please do it on your own time, not mine.

Powered by Ikonboard 3.1.2a
Ikonboard © 2001 Jarvis Entertainment Group, Inc.