Backing up locally modified files in a CVS repo

When working with a CVS repository, you depend completely on access to the server for almost any command you want to execute. That is, in part, why distributed version control systems (DVCS) like Git or Mercurial have been so tremendously successful lately. Nevertheless, sometimes you cannot choose the VCS you work with, and CVS was commonplace more than ten years ago, so chances are you are "lucky" enough to have to work with it ;-)

The following script is just a hack that recurses over all directories and manually checks whether file dates have changed. It might come in handy when you don't have access to the repository:


#!/bin/bash
# Print the path of every file whose timestamp no longer matches the
# one recorded in CVS/Entries (i.e. locally modified files).

# Function to get the nth value of a cvs entry (format x/y/z/...)
# Param 1: the string
# Param 2: field number to return
get_entry_value ()
{
    local subentry=${1%%/*}
    if [ "$2" -eq 1 ]; then
        echo "$subentry"
    else
        get_entry_value "${1#$subentry/}" $(($2 - 1))
    fi
}

# For each directory with a CVS subdirectory...
for cvs_dir in $(find . -type d -name 'CVS'); do
    dir=${cvs_dir%/CVS}
    if [ ! -r "$cvs_dir/Entries" ]; then continue; fi
    # For each file listed in CVS/Entries...
    grep '^/.*' "$cvs_dir/Entries" | while read -r file_entry; do
        filename=$(get_entry_value "$file_entry" "2")
        filedate=$(get_entry_value "$file_entry" "4")
        # Force the C locale so date's output matches the English
        # timestamps that CVS stores in Entries
        filerealdate=$(LC_ALL=C date --utc --reference="$dir/$filename" 2> /dev/null)
        if [[ -z $filerealdate ]]; then continue; fi
        filerealdate=${filerealdate/ UTC/}
        filerealdate=${filerealdate/  / }
        if [ "$filedate" != "$filerealdate" ]; then
            echo "$dir/$filename"
        fi
    done
done
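Each line in CVS/Entries has the form /name/revision/timestamp/options/tagdate, so splitting on '/' yields the filename as field 2 and the timestamp as field 4. Here is a quick check of the field extraction in isolation, using a made-up entry line (the filename and revision are invented for the example):

```shell
# Parse one slash-separated field out of a CVS/Entries line
get_entry_value () {
    local subentry=${1%%/*}
    if [ "$2" -eq 1 ]; then
        echo "$subentry"
    else
        get_entry_value "${1#$subentry/}" $(($2 - 1))
    fi
}

# Made-up sample entry: /name/revision/timestamp/options/tagdate
entry='/main.c/1.4/Thu Apr  1 12:00:00 2021//'
get_entry_value "$entry" 2   # -> main.c
get_entry_value "$entry" 4   # -> Thu Apr  1 12:00:00 2021
```

Note that the entry starts with a '/', so field 1 is the empty string in front of it; that is why the filename is field 2 and not field 1.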

Now, if you wanted to back up all modified files, you would just have to feed the script's output to tar. Assuming you saved the script above as, say, cvs-modified.sh:

$ tar -cvzf "backup-$(date +%Y%m%d%H%M).tgz" $(./cvs-modified.sh)
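One caveat: the $(...) substitution word-splits, so any tracked file with a space in its name would be mangled. A safer variant is to feed the list to tar on standard input via -T - (a GNU tar option), piping the script's output instead of substituting it. A self-contained sketch of the idea, with throwaway files standing in for the script's output:

```shell
# Demo with throwaway files; in practice the list would come from
# the script above (e.g. ./cvs-modified.sh | tar ... -T -)
workdir=$(mktemp -d)
cd "$workdir"
touch "my file.txt" "other.txt"

# One path per line on stdin, so spaces in names survive intact
printf '%s\n' "./my file.txt" "./other.txt" | tar -czf backup.tgz -T -

tar -tzf backup.tgz   # both files are listed, spaces intact
```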