use strict;
use warnings;
use Net::FTP;
use Digest::MD5 qw(md5_hex);
my ($ftp, $host, $user, $pass, $handle, $dir, $fname, $data, $hash);
$host = "ftp.mysite.com";
$user = "user";
$pass = "pass";
$dir = "/www/htdocs";
$fname = "robots.txt";
$ftp = Net::FTP->new($host, Debug => 0)
    or die "Unable to connect to $host: $@";
$ftp->login($user, $pass) or die "Bad login: ", $ftp->message;
$ftp->cwd($dir) or die "Unable to change directories: ", $ftp->message;
$ftp->binary;   # binary mode, so the downloaded bytes (and the hash) match the server's copy
$ftp->get($fname) or die "Unable to download file: ", $ftp->message;
$ftp->quit;
open($handle, '<', $fname) or die "Unable to open $fname: $!";
binmode($handle);                # hash the raw bytes
$data = join '', <$handle>;
close($handle);
$hash = md5_hex($data);
print "$hash\n";
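If the files get large, note that Digest::MD5 can read the filehandle itself instead of you slurping the whole thing into memory. A drop-in replacement for the open/hash lines above:

open($handle, '<', $fname) or die "Unable to open $fname: $!";
binmode($handle);
$hash = Digest::MD5->new->addfile($handle)->hexdigest;
close($handle);
print "$hash\n";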
You might want something more advanced that gets a directory listing, queues all the files, downloads and hashes each one, and requeues any file that fails to download (up to some maximum number of tries per file), but this should get you started. There's a rough sketch of that below; I don't feel like writing a whole application right now :)
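For what it's worth, here's a rough, untested sketch of that queue-and-retry idea. It reuses the host/credentials from above; MAX_TRIES is an arbitrary value I picked, and it assumes ls() returns only plain files (you'd want to filter out subdirectory names in practice):

use strict;
use warnings;
use Net::FTP;
use Digest::MD5;

use constant MAX_TRIES => 3;    # hypothetical retry limit; pick whatever suits you

my ($host, $user, $pass, $dir) = ("ftp.mysite.com", "user", "pass", "/www/htdocs");

my $ftp = Net::FTP->new($host, Debug => 0)
    or die "Unable to connect to $host: $@";
$ftp->login($user, $pass) or die "Bad login: ", $ftp->message;
$ftp->cwd($dir) or die "Unable to change directories: ", $ftp->message;
$ftp->binary;

# Queue every name from the directory listing with a retry counter.
# Caveat: ls() can also return subdirectory names; a real script would
# filter those out (e.g. by checking that $ftp->size($_) is defined).
my @queue = map { { name => $_, tries => 0 } } $ftp->ls;

my %hash_for;
while (my $job = shift @queue) {
    $job->{tries}++;
    if ($ftp->get($job->{name})) {
        open(my $fh, '<', $job->{name})
            or die "Downloaded but cannot open $job->{name}: $!";
        binmode($fh);
        $hash_for{ $job->{name} } = Digest::MD5->new->addfile($fh)->hexdigest;
        close($fh);
    }
    elsif ($job->{tries} < MAX_TRIES) {
        push @queue, $job;      # failed; send it to the back of the queue
    }
    else {
        warn "Giving up on $job->{name} after " . MAX_TRIES . " tries\n";
    }
}
$ftp->quit;

print "$hash_for{$_}  $_\n" for sort keys %hash_for;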