PerlMonks  

Create Daily Backups of Scripts and Datafiles Inside a Web Site

by jpfarmer (Pilgrim)
on Dec 29, 2002 at 08:14 UTC ( [id://222885] )
Category: Utility Scripts
Author/Contact Info jpfarmer
Description: I work on a website with several other programmers, all of whom have vastly different styles of writing and testing code. Every so often a script or data file gets clobbered, and the most recent whole-site backup may be several weeks old at best (mainly because the full site backup is so large).

In response to that problem, I wrote this script to back up the files that would normally get changed during development. I run it nightly via a cron job.
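For reference, a nightly run like the one described can be scheduled with a crontab entry along these lines (the script path and run time here are just illustrative assumptions, not part of the script above):

```shell
# Hypothetical crontab entry: run the backup script every night at 2:30 AM
30 2 * * * /usr/local/bin/backup_site.pl
```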

This is my first code submission, so please give me any feedback you may have.
#!/usr/bin/perl

use warnings;
use strict;
use File::Find;
use Archive::Tar;

# Directory in which to begin searching for files
our $directoriesToSearch = '/internal/';

# Directory to hold backups
our $backupLocation = '/ipbackups/';

# Calculate current date for use in archive filename
my ($day,$month,$year) = (localtime(time))[3,4,5];
$year += 1900;
$month++;

# Archive filename
our $backupFilename = "backup_$month-$day-$year.tar.gz";

# Array to hold files to be backed up
our @filesToBackup;

# Array holding directories to be skipped
our @directoriesToSkip = qw{
    /internal/scripts/chatlog
    /internal/scripts/schedule
};

# Array holding extensions to backup
our @extensionsToBackup = qw{cgi pl dat txt log};

# Let the user know what we're doing and begin finding files.
print "Finding files for backup...";
find(\&actOnFiles,($directoriesToSearch));

# Notify the user that the find process is done and that we're
# creating the archive.  Then, create the archive.
print "Done!\nCreating archive $backupFilename...";
Archive::Tar->create_archive("$backupLocation$backupFilename", 9,
                             @filesToBackup);

# Notify the user and exit.
print "Done!\n";


sub actOnFiles
{
    # Check if the file is in a skipped directory and, if so,
    # return without action
    foreach my $dir (@directoriesToSkip){
        return if ($File::Find::dir =~ m/^\Q$dir\E/);
    }

    # Check if the file has an appropriate extension.  If not,
    # return without action
    my $test = join('|',@extensionsToBackup);
    return unless ($_ =~ m/\.($test)$/i);

    # Fail-safe check to make sure a directory hasn't slipped through
    return if (-d $File::Find::name);

    # Assuming the file has survived this long, push it into the
    # list of files to backup.
    push(@filesToBackup,$File::Find::name);
}
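
For anyone adapting the extension filter, the alternation it builds can be exercised on its own; here is a minimal sketch (the sample file names are made up purely for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Same alternation the actOnFiles() sub builds from its extension list
my @extensionsToBackup = qw{cgi pl dat txt log};
my $test = join('|', @extensionsToBackup);

# Hypothetical candidate names; grep keeps only matching extensions
my @candidates = qw{index.cgi notes.txt archive.tar.gz data.dat core};
my @kept = grep { m/\.($test)$/i } @candidates;

print "would back up: @kept\n";
```

Note that the leading dot must be escaped (`\.`), otherwise it matches any character and names like `archive.tar.gz` can slip through via a bare `log`-style suffix match on other lists.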
Replies are listed 'Best First'.
Re: Create Daily Backups of Scripts and Datafiles Inside a Web Site
by ehdonhon (Curate) on Dec 29, 2002 at 14:46 UTC
    For a less home-cooked idea, you might want to take a look at CVS.
