Folder Synchronization?

What do you guys use for this? I want something I can set up on my home server and synchronize folders to all of my other computers (desktops and laptops). Obviously I’m not looking for a cloud-based solution here, or I’d go with SpiderOak.

Everything I’ve found so far either hasn’t been updated in years or simply won’t work for syncing home folders.

run a daemon that looks for changes in selected directories and scp them over to wherever?
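A minimal sketch of that idea, assuming inotify-tools (`inotifywait`) is installed and key-based ssh to the server already works; `WATCH_DIR`, `homeserver`, and the destination path are placeholders, not anything from the thread:

```shell
#!/bin/sh
# watch a directory and push each changed file to the server over scp
# (sketch only -- homeserver and the paths below are hypothetical)
WATCH_DIR="$HOME/Documents"
DEST="user@homeserver:backup/Documents"

# build the scp command for one changed file; kept in a function so it
# can be inspected or dry-run without actually copying anything
sync_cmd() {
	printf 'scp -q %s %s/' "$1" "$DEST"
}

# pass --watch to actually start the loop: inotifywait -m prints one
# line per completed write, and we push each file as it changes
if [ "${1:-}" = "--watch" ]; then
	inotifywait -m -r -e close_write --format '%w%f' "$WATCH_DIR" |
	while read -r f; do
		sh -c "$(sync_cmd "$f")"
	done
fi
```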

dropbox lan sync

Won’t really do the job with a large amount of data. Dropbox still needs to upload everything to its servers so that each computer can download the list of files and their changes for LAN sync to work, which also means the data counts against your size quota.

That’s fine if I never accidentally change the same file on two different computers. It would be much easier to just use rsync, but then it will clobber files that changed on both sides instead of merging them or asking me what I want to do. It wouldn’t make a good system.
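To make the clobbering concrete, here is a toy demo: two “replicas” (directories standing in for the laptop and the desktop) edit the same file while disconnected, then a naive “newest wins” sync (`cp -u` here, standing in for a timestamp-based rsync) silently throws one side’s edit away:

```shell
#!/bin/sh
# toy demonstration of the clobbering problem with naive two-way sync
set -e
work=$(mktemp -d)
mkdir "$work/laptop" "$work/desktop"

echo "original" > "$work/laptop/notes.txt"
cp "$work/laptop/notes.txt" "$work/desktop/notes.txt"

# both sides edit while offline; the laptop edit happens last
echo "desktop edit" > "$work/desktop/notes.txt"
sleep 1
echo "laptop edit" > "$work/laptop/notes.txt"

# naive two-way sync: the newer file wins in both directions
cp -u "$work/laptop/notes.txt" "$work/desktop/notes.txt"
cp -u "$work/desktop/notes.txt" "$work/laptop/notes.txt"

# the desktop edit is now gone on both sides; nothing merged, nobody asked
cat "$work/laptop/notes.txt" "$work/desktop/notes.txt"
```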

I’ve never heard of it, but I looked it up, and it’s like tL said: it uploads everything to Dropbox’s servers as well, and since I’m talking about a substantial amount of data (probably around 30 GB to start with), I can’t go that route.

iFolder sounds like a great option, but it sounds VERY hard to get running on Ubuntu or Debian. :frowning:

[quote=“Moparisthebest, post:5, topic:437714”]
That’s fine if I never accidentally change the same file on two different computers. It would be much easier to just use rsync, but then it will clobber files that changed on both sides instead of merging them or asking me what I want to do. It wouldn’t make a good system.
[/quote]
you could set the daemon to cycle every 30s or something - i don’t really see a situation where you’d be merging the same file within 30s of each other.

another alternative is setting up a shared folder over Samba. if you’ve got a gigE network, this should work well.
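For reference, a minimal Samba export for such a shared folder might look like this in smb.conf; the share name, path, and user are placeholders, not anything from the thread:

```
# minimal smb.conf share -- /srv/shared and mopar are hypothetical
[shared]
   path = /srv/shared
   read only = no
   valid users = mopar
```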

[quote=“t4, post:6, topic:437714”]
you could set the daemon to cycle every 30s or something - i don’t really see a situation where you’d be merging the same file within 30s of each other.

another alternative is setting up a shared folder over Samba. if you’ve got a gigE network, this should work well.
[/quote]

The situation would arise when I take the laptop away from the house, so it can’t synchronize with the server, and then edit some files.

NFS would be better for that, and I actually do have it set up. If these were both wired desktops, the solution would be easy: an NFS share mounted as /home. Unfortunately, that won’t work when the laptop is away from the house.
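For the wired-desktop case, that mount is a single /etc/fstab line on the client; the server name and export path here are placeholders:

```
# /etc/fstab on the desktop -- homeserver and /export/home are hypothetical
homeserver:/export/home  /home  nfs  rw,hard,intr  0  0
```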

Use your router as some kind of Samba share and make it automount on your Linux PCs. Open ports -> works outside of home too.

And how do you think the performance would be with a laptop mounting a 30+ GB /home partition over a slow, public wifi connection? It wouldn’t work at all. Also, I may need to use the laptop without an internet connection every now and then.

if you’re out of your own network, i’d suggest going to the cloud…

http://sparkleshare.org/

I don’t need it to synchronize when outside of my own network, just be usable and able to synchronize when I get back.

I installed it and gave it a try before I made this thread. It’s just a fancy front-end to git, and it will only sync folders inside the ‘SparkleShare’ folder, so I can’t use it to sync my home folder, for instance.

[quote=“Moparisthebest, post:12, topic:437714”]
I don’t need it to synchronize when outside of my own network, just be usable and able to synchronize when I get back.
[/quote]
so why not make the daemon check whether the machine’s internal address matches what it’s supposed to be assigned? let’s say you place your laptop on 192.168.5.200 (which would be a very uncommon address to come across outside your network). then, as soon as you joined your home network, received the static IP for that machine, and the daemon completed its next cycle, you’d be synced.
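That check could look something like the sketch below; 192.168.5.200 comes from the suggestion above, but `mycsync.sh` and the interface-output format are assumptions:

```shell
#!/bin/sh
# only sync when this machine holds its home-network address
HOME_IP="192.168.5.200"

# true when the given `ip -4 -o addr show` output contains the home address;
# taking the text as an argument keeps the check easy to test
at_home() {
	printf '%s\n' "$1" | grep -qF "inet $HOME_IP/"
}

if at_home "$(ip -4 -o addr show 2>/dev/null)"; then
	echo "on home network, running sync"
	# /path/to/mycsync.sh  (hypothetical sync script)
else
	echo "not at home, skipping this cycle"
fi
```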

I kind of gave up on this, then stumbled upon ownCloud yesterday:

It’s free and open source, and the ‘server’ portion is just some PHP scripts, so it runs just about anywhere. It provides plenty of ways to sync your files, including WebDAV and a local sync client, which I am currently using. It seems to work pretty well.

The downside is that it currently doesn’t support client-side encryption, which is fine if you have your own server on-site, but not so much if you rent a server somewhere and don’t trust them with your files.

http://cubby.com

Does P2P file sync (and you can make a ‘cubby’ out of any folder) and will soon support custom encryption keys. Though it is still in beta, I have it (no invites left :() and it works rather well; you can choose individually which computers to sync each folder to!

[quote=“tL, post:15, topic:437714”]http://cubby.com

Does P2P file sync (and you can make a ‘cubby’ out of any folder) and will soon support custom encryption keys. Though it is still in beta, I have it (no invites left :() and it works rather well; you can choose individually which computers to sync each folder to![/quote]

It looks like it keeps all of your files on a central server as well, in which case I’d use something proven like SpiderOak (also, Cubby doesn’t run on Linux).

When you make a folder a ‘cubby’, you can untick ‘sync to cloud’ and have it not store your files anywhere but your own computers, giving a pure peer-to-peer file sync between them.

And Linux. I knew there was a reason I didn’t post this in the first place: I completely forgot about the lack of Linux support.

Ah, alright, so is it 100% free to use if you don’t choose to sync anything to ‘the cloud’?

Also, I don’t see anything about encryption; that’s important if you don’t control where your data is going.

I got all my wife’s files synced across two computers using ownCloud. My files are still syncing, but I haven’t run into any problems. It goes much, much faster with a few large files than with many, many small files (like git and svn repos), which is why the sync of all my code is taking a while.

Yep. And the only time you are forced to upload something to ‘the cloud’ is if you want to make a public link to a file (like Dropbox). Otherwise, I have used it to sync folders of over 100 GB to multiple computers and it has been great.

More info (including encryption):

http://b.logme.in/2012/04/18/introducing-cubby/

I think it’s something to look out for once it gets out of beta (and if/when they do a Linux client).

Quick update on ownCloud: I wouldn’t use the syncing client yet, as it started to eat my files (inserting random error strings throughout them). Luckily I had backups, and since it’s still in early beta I have a lot of hope. :slight_smile:

edit:
I’m now using the csync binary from the project, but over the (for now) more stable sftp protocol, and I now ensure that only one copy of the script is backing up to the repository at a time. I don’t see a really good way to guarantee that only one copy of a shell script runs across multiple computers, and the people on freenode’s ##bash didn’t have better ideas either. So I use a kind of lock based on creating a directory, which is deleted at the end of the script. However, if the script is killed with SIGKILL (kill -9), the directory will remain and no copies will run until it is manually deleted. If you have a better option, please let me know. Here is the preliminary version of the script:

#!/bin/sh
args="csync -c"
#args="$args --dry-run"

server="user@host"
folder="/path/to/your/files"

locallockfile="/tmp/mycsync.lock"
remotelockdir="$folder/mycsync.lock"
host="sftp://$server:$folder"

# make sure this script isn't already running on this computer
touch "$locallockfile"
exec 9>"$locallockfile"
if ! flock -n 9 ; then
	echo "cannot acquire local lock $locallockfile, exiting..."
	exit 1
fi

# make sure this script isn't running on any other hosts either
# adapted from http://mywiki.wooledge.org/BashFAQ/045
while true; do
	if ssh "$server" mkdir "$remotelockdir" 2> /dev/null
	then
		echo "successfully acquired remote lock $remotelockdir"
		# remove the lock directory when the script exits
		trap 'ssh "$server" rm -rf "$remotelockdir"' 0
		# route common signals through the exit trap (SIGKILL still can't be caught)
		trap 'exit 1' 1 2 15
		break
	else
		echo "cannot acquire remote lock $remotelockdir, waiting..."
		sleep 5
	fi
done

$args /home/mopar/bin/ $host/bin/
$args /home/mopar/Documents/ $host/Documents/
$args /home/mopar/apps/ $host/apps/
$args /home/mopar/IdeaProjects/ $host/IdeaProjects/
$args /home/mopar/Music/ $host/Music/
$args /home/mopar/projects/ $host/projects/
$args /home/mopar/workspace/ $host/workspace/

So, there you go. You don’t even need ownCloud for this, just any place you have sftp access to.