Get the fabfile.py
I have a small number of machines I administer, including a couple of VPS machines, a small local server, my kids' computers, etc. I use this to centralize all the various scripts and configurations, so if I change one of my standard scripts I don't have to make the same change on five different machines, and so I can easily keep it all in a VCS. Fabric with this fabfile fills that need perfectly.
It is used with fabric, and typically a VCS, to maintain and deploy the configuration and control scripts for a small number of systems (possibly including itself). Its fabric commands are modeled after a simple subset of common distributed VCS commands.
It is organized as a set of directories with this fabfile living at the root. Each system gets a subdirectory that contains all the files for it. The subdirectory is named for the host name of the system which is used in all the remote commands. The files within this directory are laid out like a skeleton of that machine's root filesystem.
All commands are recursive and operate relative to the current directory, except status, which either restricts itself to one machine or checks all machines. For example, running 'fab diff' in the directory '[machine]/etc/postfix/' diffs only the files in that directory and its subdirectories. Note that files on the remote host not present locally are ignored, since the local files are meant to be a skeleton of just the files you want to manage.
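The cwd-to-remote mapping can be sketched roughly like this (a minimal illustration, not the fabfile's actual code; remote_context and machines_root are hypothetical names, assuming the machines/[hostname]/[filesystem skeleton] layout described here):

```python
# Hypothetical sketch: derive the target host and remote path from the
# current working directory, given the root of the machines/ tree.
import os

def remote_context(cwd, machines_root):
    """Return (hostname, remote_path) for a cwd inside machines_root."""
    rel = os.path.relpath(cwd, machines_root)
    parts = rel.split(os.sep)
    host = parts[0]                            # first component names the host
    remote_path = "/" + "/".join(parts[1:])   # the rest mirrors the remote root
    return host, remote_path
```

So a cwd of machines/foo/etc/postfix would map to host "foo" and remote path "/etc/postfix", which is how a command run there scopes itself to that subtree.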
It supports using the sticky bit on files/dirs as a way to indicate you want to skip that file (for temporary or in-progress things). It handles symlinks intelligently: links that point inside the machine's filesystem hierarchy are preserved, while links that point outside the tree have the linked file copied instead (rsync --copy-unsafe-links).
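The sticky-bit "skip" convention amounts to a check like the following (a minimal sketch of the assumed behavior, not the fabfile's actual code; is_skipped is a hypothetical name):

```python
# Sketch of the sticky-bit skip check: anything carrying the sticky bit
# is treated as temporary / in progress and excluded from sync.
import os
import stat

def is_skipped(path):
    """True if path has the sticky bit (S_ISVTX) set."""
    return bool(os.lstat(path).st_mode & stat.S_ISVTX)
```

To mark a file or directory as skipped, run `chmod +t path`; `chmod -t path` clears it again.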
Files and directories that are unreadable or unwritable by the user will automatically use sudo to gain access. No attempt is made to preserve ownership; all files pushed using sudo will be owned by root.
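The sudo fallback boils down to an access check along these lines (an illustrative sketch; maybe_sudo and the command strings are hypothetical, not the fabfile's actual implementation):

```python
# Sketch of the sudo fallback: commands touching paths the current user
# cannot read or write get a sudo prefix, which is why files pushed
# this way end up owned by root.
import os

def maybe_sudo(cmd, path):
    """Prefix cmd with sudo when path exists but is not read/writable."""
    if os.path.exists(path) and not os.access(path, os.R_OK | os.W_OK):
        return "sudo " + cmd
    return cmd
```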
I'll use a personal example to help illustrate. My kids both have Ubuntu machines I've cobbled together from spare parts. I have a standard backup script I put on all my machines along with a cron job to run it. I keep the script and the cron job in a separate area and link it into the machine hierarchy.
So I have the root of my VCS in a directory admin/ with the machines in admin/machines/. The backup script and cron are in admin/backup/. So for this area what I have looks like:
admin/
    backup/
        backup
        backup-cron
    machines/
        foo/
            etc/cron.daily/backup-cron
            usr/local/sbin/backup
        bar/
            etc/cron.daily/backup-cron
            usr/local/sbin/backup
With this setup, any time I make a change to admin/backup/backup I can just go into either of the two machine directories and push to update the machines. I can run diff to see the changes, and status will show me a summary of the files changed, added, or removed.
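Wiring a shared script into a machine's skeleton tree is just a relative symlink; a helper for it might look like this (link_shared is my name for illustration, not part of the fabfile; since such links point outside the machine tree, rsync --copy-unsafe-links pushes the file contents rather than the link):

```python
# Hypothetical helper: create machine_dir/link_rel as a relative
# symlink to target, creating intermediate directories as needed.
import os

def link_shared(machine_dir, target, link_rel):
    """Link the shared file `target` into the machine tree at link_rel."""
    link = os.path.join(machine_dir, link_rel)
    os.makedirs(os.path.dirname(link), exist_ok=True)
    os.symlink(os.path.relpath(target, os.path.dirname(link)), link)
```

For example, link_shared("machines/foo", "backup/backup", "usr/local/sbin/backup") creates machines/foo/usr/local/sbin/backup pointing back at the shared admin/backup/backup.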
Commands:
- status - Shows a list of new or modified files. New files exist only locally, while modified files could have been changed locally or remotely.
- diff - Shows the diff between local and remote file(s).
- push - Push local changes to remote file(s).
- pull - Pull remote changes to local file(s).
Requirements:
- fabric 0.9.2 (porting to 1.0 is planned)
- rsync (3.x recommended)
- ssh access as the running user to all machines
TODO:
- fabric 1.x compatibility evaluation (waiting on fabric 1.x entering debian)
- a way to restart remote processes
- command line wrapper (for better argument handling mostly)
- fix push to work better with Mac OS X (OS X's cp supports neither -d nor -u -- maybe use tar, cpio or pax)
(rules for myself when writing new rules)
- context for all commands should be cwd (faking if necessary)