faster rsync of huge directories
Tom Rosenfeld
trosen at bezeqint.net
Mon Apr 12 14:24:22 IDT 2010
On Mon, Apr 12, 2010 at 9:41 AM, Tom Rosenfeld <trosen at bezeqint.net> wrote:
> Hi,
>
> I am a great fan of rsync for copying filesystems. However I now have a
> filesystem which is several hundred gigabytes and apparently has a lot of
> small files. I have been running rsync all night and it still did not start
> copying as it is still building the file list.
> Is there any way to get it to start copying as it goes? Or do any of you
> have a better tool?
>
> Thanks,
> -tom
>
Thanks for all the suggestions!
I realized that in my case I did not really need rsync since it is a local
disk to disk copy. I could have used a tar and pipe, but I like cpio:
find "$FROMDIR" -depth -print | cpio -pdma "$TODIR"
By default, cpio in pass-through mode will also not overwrite a destination
file unless the source is newer.
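For the record, the tar-and-pipe alternative mentioned above would look
something like the following. This is a minimal, self-contained sketch
(the temporary directories and sample files are illustrative, not from the
original setup):

```shell
#!/bin/sh
# Local disk-to-disk copy using tar on both ends of a pipe,
# equivalent in spirit to the find | cpio command above.
FROMDIR=$(mktemp -d)   # illustrative source directory
TODIR=$(mktemp -d)     # illustrative destination directory

# Create a couple of sample files, including one in a subdirectory.
mkdir -p "$FROMDIR/sub"
echo "hello" > "$FROMDIR/file1"
echo "world" > "$FROMDIR/sub/file2"

# -C changes into the directory so archive paths stay relative;
# the first tar writes to stdout (-f -), the second extracts from
# stdin, with -p preserving permissions.
tar -C "$FROMDIR" -cf - . | tar -C "$TODIR" -xpf -
```

Unlike rsync, neither the cpio nor the tar pipeline spends time building a
full file list up front; both stream entries as find (or tar) encounters them.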
It was also pointed out that rsync version 3 starts copying while it is
still building the file list (incremental recursion). Unfortunately, it is
not available on CentOS 5.
-tom