stunix*com
Group: Members
Posts: 64
Joined: July 2006
Posted: Aug. 27 2008, 22:42
taking into account that you are using backup.tar.gz rather than, say, a home partition, how big is the average file?
i have a number of thinkpad 600e lappys running a few different versions of dsl, plus a dsl file server which is automatically mounted at boot, so i habitually keep all my docs and stuff there. but when it comes to prefs for mozilla, beaver, axyftp, emelfm etc, it all gets squished and resurrected each time i boot.
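for anyone curious, the auto-mount is nothing fancy, just a line in /opt/bootlocal.sh. a rough sketch, with a made-up server/share name and mount point rather than my real ones:

Code Sample:
# runs once at boot from /opt/bootlocal.sh
# //dslserver/docs and /mnt/dslserver are example names, not my real ones
mkdir -p /mnt/dslserver
mount -t smbfs -o username=guest //dslserver/docs /mnt/dslserver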
now, i have found that the larger the file, the longer this takes and unfortunately the greater the risk of error or corruption, so i try to keep it as small as possible. a firefox profile without saved history but with forecastfox weighs in at 14mb.
the average backup.tar.gz for a "well used" machine of mine is around 6 - 10mb and can take a good 30 secs to squish/unsquish. so on my server i have only backed up exactly what i needed to, ie smb.conf, shadow, opt stuff etc, and added ~/.mozilla, ~/.thumbnails etc to xfiletool.lst, so a fresh folder gets written each boot/app start (in fact one of my install jobs is to make a link from ~/.thumbnails to /tmp; there's a sketch of that below). i've currently got it at 186kb, or about 2 seconds, and i monitor it by adding the following to .torsmorc:
Code Sample:
${execi 5 du -h /mnt/hda2/backup.tar.gz}
then you can watch a 10Mb file fill up after issuing a reboot.
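if anyone wants to copy the exclusion trick, it boils down to something like this. paths assume the default dsl user and the usual /opt/.xfiletool.lst location, so check where yours lives and adjust to taste:

Code Sample:
# keep the fat, rebuildable dirs out of backup.tar.gz
# entries follow the same style as the stock exclude list
echo 'home/dsl/.mozilla' >> /opt/.xfiletool.lst
echo 'home/dsl/.thumbnails' >> /opt/.xfiletool.lst

# the one-off install job: point .thumbnails at /tmp so it never hits the backup
rm -rf /home/dsl/.thumbnails
ln -s /tmp /home/dsl/.thumbnails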
any thoughts or ideas?