
Re: new idea ( ??)



Gurus,
please don't address mails to a certain elite class of people, leaving out the others... it's rude!

> I have a 2 GB file on unix/linux.
Firstly, it's bad design to have a program produce a 2 GB output file, so IMHO you should throw that program out! In effect, it drags down the performance of the whole system. This is an old programming riddle, but most programmers avoid getting into the situation by writing a number of small files instead; then it's not so painful to back up a file of, say, 1 MB. Anyway... let's continue.

> I want to take a backup (or replicate) of this file as soon as there is any change in it.
> If we use incremental backup, it will back up the whole 2 GB for a single word change.
You're right there!

> Q1 - Is there any solution in your mind which will collect only the changed bytes (or blocks of, say, 1 KB) and duplicate them on the backup server?
Actually... no!
This is really a filesystem problem. In present-day filesystems (FAT, ext2) the file is the smallest unit that backup tools get to see; the bytes inside it are treated like a black box. So, in effect, if we want to back up just the byte that changed, we would need the smallest trackable unit to be a byte (or a small block). The only real solution is to make a filesystem of your own just for backing up stuff!
You could call it (b)fs.
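
Just to illustrate the block idea (a rough sketch in Python, not a real product -- the file names, the 1 KB block size and the manifest format are all made up by me): keep a checksum per block, and copy across only the blocks whose checksum changed since the last run.

    import hashlib
    import json
    import os

    BLOCK_SIZE = 1024  # 1 KB blocks, as in the question

    def block_checksums(path):
        # one hex digest per BLOCK_SIZE-sized block of the file
        sums = []
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                sums.append(hashlib.md5(block).hexdigest())
        return sums

    def backup_changed_blocks(src, dst, manifest):
        # copy only the blocks whose checksum differs from the last run
        new_sums = block_checksums(src)
        old_sums = []
        if os.path.exists(manifest):
            with open(manifest) as m:
                old_sums = json.load(m)

        mode = "r+b" if os.path.exists(dst) else "w+b"
        with open(src, "rb") as fin, open(dst, mode) as fout:
            for i, digest in enumerate(new_sums):
                if i >= len(old_sums) or old_sums[i] != digest:
                    fin.seek(i * BLOCK_SIZE)
                    fout.seek(i * BLOCK_SIZE)
                    fout.write(fin.read(BLOCK_SIZE))
            fout.truncate(os.path.getsize(src))  # in case the source shrank

        with open(manifest, "w") as m:
            json.dump(new_sums, m)

    if __name__ == "__main__":
        # example names only
        backup_changed_blocks("bigfile.dat", "bigfile.dat.bak", "bigfile.manifest")

Of course this still has to read the whole 2 GB to compute the checksums; only the amount copied to the backup server goes down, which is what the question was about. Doing it properly, below the application, is exactly the custom-filesystem idea above.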

> Q2 - What products (commercial and non-commercial, like unix utilities) support backup of open (in-use) files?
Now that's a question for the gurus!

> I hope I didn't ask a tough question.
Not at all!

Neil