http://qs321.pair.com?node_id=1193068


in reply to Re^2: Automating Backup of a Google Map
in thread Automating Backup of a Google Map

There is a problem with storing too many files in a single directory: the directory itself gets fragmented and performance starts to suffer. There are indications that this begins at lower counts, but I start to see it once a directory holds more than about 1000 files. If you store one file every hour, you reach that point in roughly 42 days (1000 files / 24 files per day).

A second problem with your method is that "duplicate" files are stored. There is really no need to save the newest file if it is identical to the last one.

To solve the first problem I create a folder tree to store the files in, such as /users/NAME/documents/FOLDER/y2017/m06 or /users/NAME/documents/FOLDER/y2017/m06/d18. (I have a program that may store a new file every 3 minutes, which can mean 480 files a day, so the extra day level keeps each directory small.) A path like that can be built as in the sketch below.
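For illustration only, here is a minimal sketch of building such a dated subdirectory with POSIX strftime; the y/m/d naming is taken from the examples above, while the full program further down builds the same kind of path with sprintf and gmtime instead:

use POSIX qw(strftime);
# Build e.g. 'y2017/m06/d18' from the current UTC time.
my $subdir = strftime('y%Y/m%m/d%d', gmtime(time));
my $path   = "/users/NAME/documents/FOLDER/$subdir";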

To solve the second problem I compare the last and newest files, and do not put the new file into the history tree if they are the same. In your case the files inside the zip get a new datetime on every pull, so the zips themselves always differ byte for byte; you have to extract the relevant 'doc.kml' member from each archive and compare those.
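An equivalent check (not what the program below does, which compares the members line by line) would be to slurp each extracted member and compare digests. This is only a sketch; the archive filenames are placeholders, and the member name 'doc.kml' is assumed from above:

use IO::Uncompress::Unzip qw($UnzipError);
use Digest::MD5 qw(md5_hex);

# Return the MD5 of one named member inside a zip archive.
sub member_md5 {
    my ($zipfile, $member) = @_;
    my $fh = IO::Uncompress::Unzip->new($zipfile, Name => $member)
        or die "unzip failed for $zipfile: $UnzipError\n";
    local $/;                 # slurp the whole member in one read
    return md5_hex(<$fh>);
}

# The member's datetime lives in the zip index, not in its data,
# so matching digests mean the content really is unchanged.
my $aresame = member_md5('last.zip', 'doc.kml')
           eq member_md5('next.zip', 'doc.kml');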

The following program should do both of these tasks:

use strict;
use warnings;
use Getopt::Long qw/GetOptions/;
use LWP;
use File::Basename qw/dirname basename/;
use File::Copy qw/copy/;
use IO::Uncompress::Unzip qw(unzip $UnzipError);
use IO::File;

my $url   = 'http://www.google.com/maps/d/u/1/kml?mid=1Oa_egVdStSJBF5C7mpS6MXrkces';
#my $topdir ='/users/NAME/documents/FOLDER';
my $dir   = 'D:/goodies/pdhuck/down1/perl/monks/kmlbackup';
my $ftype = 'zip';
my $debug = 0;
my %optdef = ( "debug=i" => \$debug
             , "url=s"   => \$url
             , "dir=s"   => \$dir    # dir is a string, so =s, not =i
             );
GetOptions(%optdef) or die("Error in command line arguments\n");
die $dir.' must exist' unless (-d $dir);

my $lastdir = $dir.'/last';
# unless (-d $lastdir && -w $lastdir)
unless (-d $lastdir) { mustdir($lastdir); }

my $now = time;
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = gmtime($now);
my $nicefmt = '/y%04u/m%02u/d%02u-%02u-%02u-%02u-Z';   # last dir is month
# my $nicefmt = '/y%04u/m%02u/d%02u/%02u-%02u-%02u-Z'; # last dir is day
my $nice = sprintf($nicefmt,$year+1900,$mon+1,$mday,$hour,$min,$sec);

my $lastfn = $lastdir.'/lastbackup.'.$ftype;   # copy of the last saved backup
my $nextfn = $lastdir.'/nextbackup.'.$ftype;   # the freshly downloaded candidate

# Download the current kmz/zip file.
my $ua  = LWP::UserAgent->new(agent => "libwww-perl-kmzbackup");
my $req = HTTP::Request->new(GET => $url);
my $request = $ua->request($req);
unless ($request->is_success) { die 'get failed for '.$url.' '.$request->status_line; }

open(my $nextout,'>',$nextfn) or die 'cant open '.$nextfn;
binmode $nextout;
print $nextout $request->decoded_content;
close $nextout;

# Compare the doc.kml inside the new and the previous zip, line by line.
my $aresame  = 1;
my $compfile = 'doc.kml';
if (-f $lastfn) {
    my $nextmember = IO::Uncompress::Unzip->new($nextfn, Name => $compfile)
        or die "IO::Uncompress::Unzip failed: $UnzipError\n";
    my $lastmember = IO::Uncompress::Unzip->new($lastfn, Name => $compfile)
        or die "IO::Uncompress::Unzip failed: $UnzipError\n";
    while ($aresame && (my $nextline = <$nextmember>)
                    && (my $lastline = <$lastmember>)) {
        unless ($nextline eq $lastline) { $aresame = 0 }
    }
    # If either member still has lines left, the files differ in length.
    if ($aresame && (my $nextline = <$nextmember>)) { $aresame = 0 }
    if ($aresame && (my $lastline = <$lastmember>)) { $aresame = 0 }
    close($nextmember);
    close($lastmember);
} else {
    $aresame = 0;   # no previous backup, so always keep the new file
}

if ($aresame) {
    print "No change to file $compfile \n";
    unlink $nextfn;
    exit;
}

# New content: file it into the date-based history tree.
my $endfn  = $dir.$nice.'.'.$ftype;
my $endfn0 = basename($endfn);
my $enddir = dirname($endfn);
unless (-d $enddir) { mustdir($enddir); }
copy($nextfn,$endfn) or die "Copy failed: $!";
print 'new backup:'.$endfn."\n";
copy($nextfn,$lastfn) or die "Copy failed: $!";
unlink $nextfn;
exit;

# Create a directory, creating any missing parent directories first.
sub mustdir {
    my $dir = shift;
    return if (-d $dir);
    my $updir = dirname($dir);
    mustdir($updir);
    mkdir $dir;
    unless (-d $dir) { die 'cant make dir:'.$dir; }
} # mustdir
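You would then run it on a schedule, e.g. from cron (or Task Scheduler on Windows). The script name below is a placeholder, the options are the ones defined above, and the map id is the one from the hard-coded URL:

# hourly cron entry (script name and dir are placeholders)
0 * * * * perl kmzbackup.pl --dir /users/NAME/documents/FOLDER --url 'http://www.google.com/maps/d/u/1/kml?mid=1Oa_egVdStSJBF5C7mpS6MXrkces'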