Hey everyone, I'm looking to sync a large number of files (~5M) across the internet for geo-replication. I'd like it to sync daily, but only transfer changes in order to keep the traffic down as much as possible.
I've looked at using lsyncd, which we've used internally in the past with some success, but the sheer number of files causes lsyncd to crash. I've tried breaking the sync up so it only handles smaller portions at a time, as well as using MaxConnections to limit it, but it still crashes. Rsync is able to handle the transfer, but the amount of traffic is kind of prohibitive.
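For context, this is roughly what I mean by "breaking the sync up into smaller portions": a minimal sketch that runs one delta-only rsync per top-level directory instead of one giant run over the whole tree. The paths, remote host, and rsync flags here are placeholders/assumptions, not my actual setup.

```python
#!/usr/bin/env python3
# Hypothetical sketch: split one huge daily sync into per-directory rsync
# runs so no single process has to walk all ~5M files at once.
# SOURCE_ROOT and TARGET are placeholder values, not the real environment.
import subprocess
from pathlib import Path

SOURCE_ROOT = Path("/data")           # assumed local root
TARGET = "replica.example.com:/data"  # assumed remote target

def sync_shard(shard: Path) -> int:
    """Run one delta-only rsync for a single top-level directory."""
    cmd = [
        "rsync", "-a", "--delete", "--compress",
        f"{shard}/",
        f"{TARGET}/{shard.name}/",
    ]
    return subprocess.call(cmd)

if __name__ == "__main__":
    # One rsync per top-level directory keeps each file list small,
    # and a failure in one shard doesn't abort the whole daily run.
    for shard in sorted(p for p in SOURCE_ROOT.iterdir() if p.is_dir()):
        rc = sync_shard(shard)
        if rc != 0:
            print(f"rsync exited {rc} for {shard}")
```

Even split up like this, the daily traffic from rsync walking everything is what I'm trying to avoid.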
Any ideas would be welcome, thanks!