When you maintain a Web site, it is good practice to keep a mirror of the Web site’s contents on your PC. When you want to modify the contents, you edit the files on your PC and then synchronize the Web site’s contents with the copy on your PC.
To actually synchronize the files, you have several possibilities:
- Individually FTP modified files to the Web site, and manually delete from the Web site any files you deleted on your PC.
- Create a tarball of the modified files, FTP it to the Web site, and untar it remotely. Then manually delete any files you deleted on your PC. You also need to make sure that no modified file is missing from the tarball.
- If the Web hosting service provides an rsync server, you can use rsync to synchronize (see the example after this list).
- If you have shell access to the remote Web host, you can set up an rsync server on your PC and invoke rsync from the host. In this case, you also need to poke a hole in your PC’s firewall.
- Use the sitecopy command, the topic of this blog post.
- There are also other commands with substantially the same functionality as sitecopy.
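As a concrete sketch of the rsync variant, a one-liner like the following mirrors the local copy to an rsync server. The module name yoursitemodule is a placeholder I made up; -avz --delete are standard rsync flags, and --delete removes from the server any files you deleted locally:

rsync -avz --delete /home/yourlocalusername/websites/yourwebsite/ yourremoteusername@yourwebdomain.com::yoursitemodule/

Note the trailing slash on the local path: it makes rsync copy the directory’s contents rather than the directory itself.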
Homepage for the sitecopy project: http://www.lyra.org/sitecopy/
Should I use sitecopy? http://www.lyra.org/sitecopy/why.html
Freshmeat project information: http://freshmeat.net/projects/sitecopy/
To use sitecopy under Linux, you need to create the file .sitecopyrc in your home directory. The following works for me (replace the placeholders containing ‘your’ with names relevant to your situation):
site yoursitename
  server yourwebdomain.com
  username yourremoteusername
  # password ________________   # sitecopy will prompt you for your password
  local /home/yourlocalusername/websites/yourwebsite   # Your Web site's mirror
  remote /                      # FTP home directory for your Web site
  ftp usecwd                    # FTP will upload files only to the current working directory
  permissions all               # Set permissions of files after uploading
  permissions dir               # Set permissions of directories after uploading
  safe                          # Block uploading of files which were updated on the Web host
  exclude *~                    # Exclude backup versions of files modified by you
  exclude /.bash*               # Do not delete dot-bash files on the Web host
  exclude /.svn                 # or /CVS if you use CVS rather than Subversion
  exclude /*/.svn
  exclude /*/*/.svn
  exclude /*/*/*/.svn
  exclude /*/*/*/*/.svn
  exclude /*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/*/*/*/*/.svn
  exclude /*/*/*/*/*/*/*/*/*/*/*/*/*/*/*/.svn
(The above also illustrates a problem I had in systematically excluding directories by name: each directory depth seems to need its own exclude line.)
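Two more things are worth doing before the first upload. Since .sitecopyrc may hold a stored password, tighten its permissions; sitecopy may refuse to run if the file is readable by other users. Also, sitecopy has to learn the initial state of the remote site; if I read the man page correctly, --fetch and --init are the standard actions for this:

chmod 600 ~/.sitecopyrc
sitecopy --fetch yoursitename   # ask the server which files are already there
# or: sitecopy --init yoursitename   # if the remote site is still empty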
To actually run sitecopy, create a shell script containing the following command:
sitecopy -u --debug=ftp,files,socket --logfile=/home/yourlocalusername/sitecopy.log yoursitename
The logfile will be a few MB long for a Web site with a few hundred files, but if you encounter any problems, it will help you diagnose them.
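If you would rather preview a run before any bytes move, sitecopy can list the files it considers out of date; as far as I know -l (--list) is a standard action, but check your version’s man page:

sitecopy -l yoursitename   # show what an update would upload or delete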
I found that sometimes I need to run
sitecopy --catchup yoursitename
before uploading some files, because the safe option seems to be overzealous at times.
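As I understand --catchup, it tells sitecopy to treat the remote copies as current, so a session where safe has wrongly blocked an upload looks roughly like this sketch:

sitecopy --catchup yoursitename   # pretend local and remote are in sync
# ... edit the files you want to publish ...
sitecopy -u yoursitename          # upload only what changed since the catchup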