Cool Solutions

[Script] Back up Novell Data Synchronizer



By:

September 8, 2011 12:58 pm


License: none

Download backup

Introduction

——————————————————————————————–
The Novell documentation offers some recommendations on how to back up a Novell Data Synchronizer system. This entry shows an example of how to do a backup with a crontab-triggered script. I’m not a programming guy, so this script may look terrible to a bash expert. Do not hesitate to send me some feedback: joe@sz.ch

First of all, the script shuts down the Data Synchronizer services and the PostgreSQL database. After that, all the relevant directories are archived and compressed. At the end, an email is sent with all the information about the backup run.

Script

——————————————————————————————–

#!/bin/sh

#
# VARIABLES
#
# root backup directory
backupdir="/var/dsbackup"

# txt mail content + tar verbose output attachment
mailfile="$backupdir/backup-mail.txt"
tarfile="$backupdir/backup-tar.txt"

# current date and time in sortable format (24-hour clock keeps the names sortable)
timestamp="$(date +'%y-%m-%d-%H:%M')"

#
# PREPARATION
#
# make sure the backup directory exists
mkdir -p $backupdir

# delete all mail text related files - from the run before
echo "[DataSync] Removing mail file $mailfile & tar file $tarfile..."
rm -f $mailfile
rm -f $tarfile

# touch empty new mail txt files - to be filled
echo "[DataSync] Creating new mail file $mailfile & tar file $tarfile..." >> $mailfile
touch $mailfile
touch $tarfile

# record the timestamp in the mail content
echo "[DataSync] Generated Timestamp $timestamp." >> $mailfile

#
# SHUTDOWN DataSync & PostgreSQL SERVICES
#
# starting announcement of script in mail content
echo "[DataSync] Started backup script..." >> $mailfile

# shutdown datasync
echo "[DataSync] Shutting down Novell DataSynchronizer..." >> $mailfile
/usr/sbin/rcdatasync stop >> $mailfile

# shutdown postgre sql db
echo "[DataSync] Shutting down PostgreSQL Database..." >> $mailfile
/usr/sbin/rcpostgresql stop >> $mailfile

#
# DO BACKUP
#
# starting announcement
echo "[DataSync] Running Backup..." >> $mailfile

# create tar from all the necessary directories
tar -czvpf $backupdir/$timestamp-pgsql.tgz /var/lib/pgsql >> $tarfile
tar -czvpf $backupdir/$timestamp-vardatasync.tgz /var/lib/datasync >> $tarfile
tar -czvpf $backupdir/$timestamp-optdatasync.tgz /opt/novell/datasync >> $tarfile
tar -czvpf $backupdir/$timestamp-etcdatasync.tgz /etc/datasync >> $tarfile
echo "[DataSync] Backup done - everything saved under $backupdir..." >> $mailfile

#
# START DataSync & PostgreSQL SERVICES
#
# start postgre sql db - should be started first
echo "[DataSync] Starting PostgreSQL Database..." >> $mailfile
/usr/sbin/rcpostgresql start >> $mailfile

# start datasync
echo "[DataSync] Starting Novell DataSynchronizer..." >> $mailfile
/usr/sbin/rcdatasync start >> $mailfile

#
# CLEANUP
#
# clean up backup archives older than 20 days
echo "[DataSync] Purging any backups older than 20 days from the directory $backupdir..." >> $mailfile
find $backupdir -type f -name "*.tgz" -mtime +20 -exec rm -f {} \;
echo "[DataSync] Purging for $backupdir done." >> $mailfile

#
# SEND STATUS MAIL
#
# send mail to a given recipient - requires working mail settings (configured e.g. in yast2)
echo "[DataSync] Sending mail to xyz@company.ch..." >> $mailfile
mail -r server@company.ch -s "[SERVER] DataSync Backup done" -a $tarfile xyz@company.ch < $mailfile

- Change the variables according to your system configuration
- Adjust the “-mtime +20” value in the find command to keep backups for more or fewer than 20 days (see the example below)
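
For example, a retention of 30 days instead of 20 would look like this (the *.tgz filter assumes only the archives created by this script live in $backupdir):

# keep the compressed backups for 30 days
find $backupdir -type f -name "*.tgz" -mtime +30 -exec rm -f {} \;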

Crontab

As mentioned above, the script is triggered by cron.

Add the following line to /etc/crontab (the sixth field is the user the job runs as):

30 23   * * *   root /opt/novell/datasync/backup.sh

——————————————————————————————–
- Change the time
- Change the days

The example line runs the script every day at 11:30 PM.
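
If you prefer root’s personal crontab (edited with crontab -e) over /etc/crontab, the equivalent entry simply drops the user field; a sketch, assuming the script was saved as /opt/novell/datasync/backup.sh:

# crontab -e as root - note there is no user column here
30 23 * * * /opt/novell/datasync/backup.sh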



Categories: Uncategorized

Disclaimer: This content is not supported by Novell. It was contributed by a community member and is published "as is." It seems to have worked for at least one person, and might work for you. But please be sure to test it thoroughly before using it in a production environment.

1 Comment

  1. By: jmarton

    …that downtime is ok. You’d think at some point during the middle of the night it would be ok, but if it’s not allowed, then you’ll have to do things a bit differently. You can actually use pg_dump to do a database dump while all services are running. The backup may not be completely consistent, but if there’s little activity going on when the backup runs then it may be “good enough.”

    pg_dump -U datasync_user mobility > path/filename

    You can then compress the backup to shrink the size (I recommend bzip2 for pretty tight compression).

    This assumes that a file called .pgpass containing the password for datasync_user is in the home directory of the user running the script. For instance, if you run it as root, then the file should be in root’s home directory. You can secure it a bit by changing the permissions to 600, which prevents anyone other than root from seeing the contents.
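
    Putting those pieces together, a minimal sketch could look like this (the localhost:5432 connection values, the placeholder password and the /var/dsbackup output path are just examples to adapt):

    # ~/.pgpass format is hostname:port:database:username:password (placeholder values)
    echo "localhost:5432:mobility:datasync_user:yourpassword" > ~/.pgpass
    chmod 600 ~/.pgpass

    # dump the mobility database while the services keep running, then compress it
    pg_dump -U datasync_user mobility | bzip2 > /var/dsbackup/mobility-dump.sql.bz2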

    Joe
