how large is backup.tar.gz
Forum: User Feedback
Topic: how large is backup.tar.gz
started by: stunix*com
Posted by stunix*com on Aug. 27 2008,22:42
Taking into account that you are using backup.tar.gz rather than, say, a home partition, how big is the average file?
I have a number of ThinkPad 600e laptops running a few different versions of DSL, and a DSL file server which gets mounted automatically on boot, so I habitually keep all docs and stuff there; but when it comes to prefs for Mozilla, Beaver, AxyFTP, emelFM etc., it all gets squished and resurrected each time I boot.
Now, I have found that the larger the file, the longer this takes and, unfortunately, the greater the risk of error or corruption, so I try to keep it as small as possible. Firefox without saving a history, but with Forecastfox, weighs in at 14 MB.
The average size of backup.tar.gz for a "well used" machine of mine is around 6-10 MB and can take a good 30 seconds to squish/unsquish, so on my server I have backed up only exactly what I needed, i.e. smb.conf, shadow, opt stuff etc., adding ~/.mozilla and ~/.thumbnails etc. to .xfiletool.lst so that a new folder gets written on each boot/app start. (In fact, one of my install jobs is to make a link from ~/.thumbnails to /tmp.) I currently have it at 186 KB, or 2 seconds, and monitor it by adding the following in .torsmorc:
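The .torsmorc snippet itself did not survive in the post. As a rough, unverified guess at the kind of line meant, assuming torsmo's ${exec} variable and a backup stored at /mnt/hda1/backup.tar.gz (the path is hypothetical and will differ per setup):

```
backup: ${exec du -h /mnt/hda1/backup.tar.gz | cut -f1}
```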
Then you can watch a 10 MB file fill up after issuing a reboot.
any thoughts or ideas?
Posted by roberts on Aug. 28 2008,15:31
Keep the size down by factoring out anything static, i.e., non-changing content, into separate tar.gz files or a single combined one. It is up to you.
For example, I use a single file named myconf.tar.gz and place it with the other DSL extensions; it will autoload alongside your existing extensions.
Also be sure not to include cache files in your backup. With each new release of many browsers their cache files often have new or different names. Look at /opt/.xfiletool.lst for examples.
If you don't do some of this "housecleaning", then as your backup grows, so will the time you spend waiting.
As DSL was made to be nomadic, the backup is paramount and flexible. With a little housecleaning you can keep your DSL backup times down.
DSL also supports both a traditional hard drive install and a hybrid install, often called persistent home and/or persistent opt.
Posted by stunix*com on Sep. 01 2008,23:15
Robert, I'm sorry to trail this one, but I've been playing with this and getting some odd results, so could you give an example to clarify?
First I tried restoring a jpg to the home dir, so I created /home/dsl/home/dsl/image.jpg and, from within the primary home dir, issued ">tar czf ./home ." Does that make sense? I was thinking the recreated dir would be created from root. The result was that on reboot I didn't get as far as X before errors won.
I then created just /home/dsl/hello/image.jpg and, again from ~/, ran ">tar czf hello", put it in mydsl, and rebooted to find /hello/image.jpg in my root. Partial success.
So, trying to apply this after installing F-Prot into /opt/f-prot,
I thought I could save the 30 MB definitions file, i.e. /opt/f-prot/antivir.dev, so I created this and got some very odd partial reconstruction of what I had in home. X started, but bootlocal was empty and other stuff was missing: samba and other things in opt.
I'm guessing at this point I should read up on how to make MyDSL apps, as I didn't think F-Prot should be missing from the repository if Samba was there.
Posted by roberts on Sep. 02 2008,03:42
It is really more about using the backup/restore capabilities of the tar command. There are many ways to do it, but personally I prefer to use lists. I first prepare the list of files, either manually using an editor or via an ls or find command.
Create a list. Using your example of a simple jpg, let's name the list myjpg.lst. The list would contain:

/home/dsl/image.jpg
Then create the tar archive using the following:
tar -C / -T myjpg.lst -czvf myjpg.tar.gz
You should see the file listed as the tar archive is created.
Once created, you may then check it by using the following tar command:
tar -ztf myjpg.tar.gz
You should see the file listed on the screen.
Next move your myjpg.tar.gz into the directory containing all your extensions.
Remove the image.jpg and remove myjpg.lst.
image.jpg will then be loaded like any other extension into /home/dsl.
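The steps above can be sketched end to end. This is a minimal run-through against a scratch directory standing in for /, so nothing real is touched; the paths mirror the thread's example (home/dsl/image.jpg), and the scratch root/ stands in for roberts' -C /:

```shell
cd "$(mktemp -d)"                          # work somewhere disposable
mkdir -p root/home/dsl
echo "fake jpeg data" > root/home/dsl/image.jpg

# 1. The list: one path per line, relative to the directory given to -C.
echo "home/dsl/image.jpg" > myjpg.lst

# 2. Create the archive. "$PWD/myjpg.lst" is absolute so tar finds the
#    list regardless of how -C is processed; roberts' "-C /" form is the
#    same idea with / in place of our scratch root/.
tar -C root -T "$PWD/myjpg.lst" -czvf myjpg.tar.gz

# 3. Verify the contents before deleting the original.
tar -ztf myjpg.tar.gz
```

The final command should list home/dsl/image.jpg, confirming the archive is good to move into the extension directory.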
For that large /opt/f-prot/antivir.dev, the procedure would be:
Create a list, say f-prot_dev.lst, containing:

/opt/f-prot/antivir.dev
Then create the archive with:
tar -C / -T f-prot_dev.lst -czvf f-prot_dev.tar.gz
Double-check with tar -ztf before you delete the original. Move it to the extension directory and you're done.
For a large directory of files, manually entering them into a list is too much work. Use ls or find to create the list. Again, once the list is created, the tar command is always the same; just change the references to the list file and the tar.gz.
For example, if the f-prot directory is all static content, then make the list with:
find /opt/f-prot/ > f-prot.lst
You can check the contents of the list file with:

cat f-prot.lst

Make the archive with:
tar -C / -T f-prot.lst -czvf f-prot.tar.gz
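A sketch of this find-based variant, again run against a scratch directory standing in for / (the two f-prot file names are stand-ins taken from the thread):

```shell
cd "$(mktemp -d)"
mkdir -p root/opt/f-prot
echo "definitions" > root/opt/f-prot/antivir.dev
echo "scanner"     > root/opt/f-prot/f-prot.sh

# Build the list with find, using paths relative to the scratch root.
# (GNU tar strips a leading "/" from member names, so an absolute-path
# list like the one from "find /opt/f-prot/" works too.)
( cd root && find opt/f-prot/ -type f ) > f-prot.lst
cat f-prot.lst

# Same tar invocation as before; only the list and archive names change.
tar -C root -T "$PWD/f-prot.lst" -czvf f-prot.tar.gz
tar -ztf f-prot.tar.gz
```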
Once you get the hang of it, the procedure is always the same whether it is one file or many.
Now the rule is: use .tar.gz for files in /opt and /home; use .dsl for all others.
The second thing to note is that if you load stuff into home, it will still be in the backup of home. So, if you have factored out some large files into "personal extensions", you should add those files to .xfiletool.lst to prevent them from being included in the next backup.
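That exclusion step is a one-liner. A hedged sketch, appending to a local copy of the list rather than the real file (on DSL the file is /opt/.xfiletool.lst, per the earlier post, and the entry format should match the existing entries there):

```shell
# Append a factored-out file to the exclusion list so the next backup
# skips it. XFILETOOL points at a local copy for this demo; on a real
# DSL system it would be /opt/.xfiletool.lst.
XFILETOOL=./xfiletool.lst
echo "home/dsl/image.jpg" >> "$XFILETOOL"
grep "image.jpg" "$XFILETOOL"        # confirm the entry landed
```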
Posted by stunix*com on Sep. 03 2008,22:21
Fantastic. From here I have a 280 KB f-prot.tar.gz linking to a definitions file in /opt/avdef, which is now a UCI (15 MB), minimizing the RAM footprint. (I read up in the wiki.)
I know it'll be a pain to recreate an avdef.uci every week, but it's better than not using any AV on my file server. Many thanks for the tip.
Also, I'm guessing that using an htdocs.tar.gz or UCI is more secure for static web stuff.