http://qs321.pair.com?node_id=1192892

jmlynesjr has asked for the wisdom of the Perl Monks concerning the following question:

My Experience: I have done zero web programming.

Background: The Bitx40 is a 40 meter Ham transceiver kit. Designed by Ashar Farhan as a very economical entry point for Indian Hams into Single Sideband (SSB) amateur radio, the Bitx40 has now become popular with Hams worldwide. This low-power (QRP) radio is an excellent platform for experimentation with both hardware and Arduino software (the VFO and display). A very active discussion group is located at https://groups.io/g/BITX20/topics. A group member has created a Google map (http://BITXmap.com) that members may add their location to.

The Problem: The map creator would prefer not to be the map Czar and would like to leave the map open for members to maintain their own information. Unfortunately, bungled updates have trashed portions of the map several times. The map can be backed up by manually downloading a .kml file, a royal pain...

The Map: The map has two panes. The left pane is information/configuration oriented. The right pane is the map. One of the left pane pull-down menus leads to a "Download KML File" choice. Clicking through several pop-up boxes will download the .kml file.

Possible Solution: It would seem that a Perl script in a cron job could automate these backups.

The Question: Is this a reasonable approach and where are the snakes?

Hey, it's a start.

#! /usr/bin/perl
# Script to experiment with the WWW::Mechanize::Firefox module
# Must install WWW::Mechanize::Firefox from CPAN (sudo cpanm WWW::Mechanize::Firefox)
# Must install MozRepl Firefox plug-in Tools->Add-ons
# (MozRepl is in the Firefox Web Development Add-ons section)
# Must start MozRepl from Firefox Tools->MozRepl->Start

use strict;
use warnings;
use WWW::Mechanize::Firefox;

# Bring up the Bitx Map, Firefox needs to be running in this version
my $mech = WWW::Mechanize::Firefox->new( activate => 1 );
$mech->get('http://BITXmap.com');

# ???Click through to download .kml file to backup the map (add to cron job???)
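
The cron half seems simple enough. Something like this crontab entry is what I had in mind (the script path and schedule are made up, just to show the idea):

# Hypothetical crontab entry: run the backup once a day at 03:15 and keep a log
15 3 * * * /usr/bin/perl /home/user/bin/bitxmap_backup.pl >> /home/user/bitxmap_backup.log 2>&1

One snag I can already see from the module notes above: WWW::Mechanize::Firefox needs a live Firefox session with MozRepl running, which may be awkward from an unattended cron job.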

James

There's never enough time to do it right, but always enough time to do it over...

Replies are listed 'Best First'.
Re: Automating Backup of a Google Map
by SuicideJunkie (Vicar) on Jun 15, 2017 at 22:16 UTC

    Perhaps, instead of simply writing code to help clean up the mess, what about preventing messes in the first place? I.e., have a script to validate map update requests and apply them safely.

      I also agree that this would be the best long term approach.

      James

      There's never enough time to do it right, but always enough time to do it over...

Re: Automating Backup of a Google Map
by RonW (Parson) on Jun 15, 2017 at 23:15 UTC

    I agree with SuicideJunkie.

    Looks like it would be easy to create a simple webpage to collect the operator information, validate it, format it into KML, then queue it somewhere (maybe email) for a moderator to approve and append to the master KML file.

    Wikipedia has a sample of a basic KML entry.
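
    As a rough sketch of the validate-and-format step (the field names and values below are made up purely for illustration), the KML side is just one Placemark per operator, remembering that KML puts longitude before latitude:

    use strict;
    use warnings;

    # Hypothetical operator data as it might arrive from a web form
    my %op = ( call => 'VU2XYZ', grid => 'MK68', lat => 17.4, lon => 78.5 );

    # Minimal validation before the entry is queued for a moderator
    die "bad latitude\n"
        unless $op{lat} =~ /^-?\d+(?:\.\d+)?$/ && $op{lat} >= -90  && $op{lat} <= 90;
    die "bad longitude\n"
        unless $op{lon} =~ /^-?\d+(?:\.\d+)?$/ && $op{lon} >= -180 && $op{lon} <= 180;

    # A basic KML Placemark; coordinates are lon,lat,altitude
    my $placemark = sprintf
        "<Placemark>\n" .
        "  <name>%s</name>\n" .
        "  <description>Grid %s</description>\n" .
        "  <Point><coordinates>%.6f,%.6f,0</coordinates></Point>\n" .
        "</Placemark>\n",
        $op{call}, $op{grid}, $op{lon}, $op{lat};

    print $placemark;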

Re: Automating Backup of a Google Map
by marto (Cardinal) on Jun 16, 2017 at 08:38 UTC

    As a side note, OpenStreetMap may be worth considering for more control and easier maintenance. With regards to automating a backup of the existing Google map via WWW::Mechanize::Firefox: use the developer tools to inspect and examine the page, note down what you have to click to do it manually, then recreate this programmatically.
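
    Once the developer tools have shown you what gets clicked, a rough sketch of recreating it on top of the starter script above might look like this (the CSS selectors are pure placeholders, and driving Firefox's download dialog itself will need more work):

    use strict;
    use warnings;
    use WWW::Mechanize::Firefox;

    # Firefox must already be running with the MozRepl add-on started
    my $mech = WWW::Mechanize::Firefox->new( activate => 1 );
    $mech->get('http://BITXmap.com');

    # Placeholder selectors - substitute whatever the developer tools show
    # for the left-pane menu and the "Download KML File" item
    $mech->click({ selector => '#map-menu-button', synchronize => 0 });
    sleep 2;    # give the menu time to open
    $mech->click({ selector => '.download-kml-item', synchronize => 0 });
    # any confirmation pop-up boxes would need to be clicked through the same way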

      marto, this would have been my next path to research. Thanks for the confirmation.

      James

      There's never enough time to do it right, but always enough time to do it over...

Re: Automating Backup of a Google Map
by RonW (Parson) on Jun 15, 2017 at 21:06 UTC

    Perhaps one of the Google related modules on CPAN might be helpful.

      RonW thanks for the clue.

      I found Geo::Google::MyMap::KMLURL. It seems to implement the first part of Sinistral's suggestion below.

      James

      There's never enough time to do it right, but always enough time to do it over...

Re: Automating Backup of a Google Map
by Sinistral (Monsignor) on Jun 16, 2017 at 14:40 UTC

    You can automate the web interface of Google Maps "My Maps", but there is an easier way to get to the raw KML that you might not have known about.

    The current owner of the map within Google My Maps can use a web browser to choose "Export to KML", choose "Entire Map" in the dropdown, and tick the checkbox for "Keep data up to date with network link KML (only usable online)". That export will give you a KMZ file. Use an unzip program (7zip, unzip, WinZip, anything that understands zip format) to see the contents. Within the zip file will be a doc.kml file. Examine the contents of doc.kml in a text editor. There is a <link> element which has within it an <href> element. That is the URL you can use with wget/curl/LWP to do your backup file retrieval.
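
    Once you have that URL, the retrieval itself is only a few lines of LWP. A minimal sketch (the mid value and output file name are placeholders):

    use strict;
    use warnings;
    use LWP::UserAgent;
    use POSIX qw(strftime);

    # Placeholder URL - substitute the href found inside doc.kml
    my $url = 'http://www.google.com/maps/d/u/1/kml?mid=YOUR_MAP_ID';
    my $out = strftime('bitxmap-%Y%m%d-%H%M%S.kmz', gmtime);

    my $ua  = LWP::UserAgent->new( agent => 'bitxmap-backup' );
    my $res = $ua->get($url, ':content_file' => $out);
    die 'download failed: ' . $res->status_line unless $res->is_success;
    print "saved $out\n";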

      Sinistral, I saw this approach in Geo::Google::MyMap::KMLURL, but I wasn't sure what it was telling me.

      This looks like the best short term approach. Thank you very much!

      James

      There's never enough time to do it right, but always enough time to do it over...

      Update:

      Feedback from the map Czar, Doug.

      "Below is the contents of the batch file with the wget. the italicized all caps words are things I took out to anonymize the information.

      cd /users/NAME/documents/FOLDER
      /Windows/GnuWin32/bin/wget http://www.google.com/maps/d/u/1/kml?mid=FILE ID --no-check-certificate

      This saves it to the folder in the batch file with the name kml@mid=FILE ID.NUMBER. Then all you have to do is rename it NAME.kmz and it works perfectly. There might be a better way with the file names, but this works fine for our purpose. It only has to be renamed if I need to restore the map. So far it is churning away as planned. I appreciate the help. You probably have more time in finding this solution than I do in setting up the map."

      Doug is running the .bat file hourly using Windows Scheduler. Thanks again to all that contributed.

      Edit: Callsign removed by request.

      James

      There's never enough time to do it right, but always enough time to do it over...

        There is a problem with storing too many files in a single directory, as the directory itself gets fragmented and performance starts to suffer. There are indications that this starts at lower counts, but I start to see it when directories have more than about 1000 files. If you were to store one file every hour, this would start somewhere around 40 days in.

        A second problem with your method is that "duplicate" files are stored. There really is no need to save the newest file if it is the same as the last one.

        To solve the first problem, I create a folder tree to store the files in, such as /users/NAME/documents/FOLDER/y2017/m06 or /users/NAME/documents/FOLDER/y2017/m06/d18 (I have a program that may store a new file every 3 minutes, which can mean 480 files a day).

        To solve the second problem, I tend to compare the last and newest files and not put the new file into the history tree if they are the same. In your case the files inside the zip get a new datetime every pull, so you have to extract the relevant 'doc.kml' files and compare those.

        The following program should do both of these tasks.

        use strict;
        use warnings;
        use Getopt::Long qw/GetOptions/;
        use LWP;
        use File::Basename qw/dirname basename/;
        use File::Copy qw/copy/;

        my $url   = 'http://www.google.com/maps/d/u/1/kml?mid=1Oa_egVdStSJBF5C7mpS6MXrkces';
        #my $topdir ='/users/NAME/documents/FOLDER';
        my $dir   = 'D:/goodies/pdhuck/down1/perl/monks/kmlbackup';
        my $ftype = 'zip';
        my $debug = 0;
        my %optdef = ( "debug=i" => \$debug,
                       "url=s"   => \$url,
                       "dir=s"   => \$dir );
        GetOptions( %optdef ) or die("Error in command line arguments\n");
        die $dir . ' must exist' unless (-d $dir);

        my $lastdir = $dir . '/last';
        # unless (-d $lastdir && -w $lastdir)
        unless (-d $lastdir) { mustdir($lastdir); }

        # Build a dated path such as /y2017/m06/d18-... for this pull
        my $now = time;
        my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = gmtime($now);
        my $nicefmt = '/y%04u/m%02u/d%02u-%02u-%02u-%02u-Z';   # last dir is month
        # my $nicefmt = '/y%04u/m%02u/d%02u/%02u-%02u-%02u-Z'; # last dir is day
        my $nice = sprintf($nicefmt, $year+1900, $mon+1, $mday, $hour, $min, $sec);

        my $lastfn = $lastdir . '/lastbackup.' . $ftype;
        my $nextfn = $lastdir . '/nextbackup.' . $ftype;

        # Fetch the current KMZ into the "next" holding file
        my $ua  = LWP::UserAgent->new( agent => "libwww-perl-kmzbackup" );
        my $req = HTTP::Request->new( GET => $url );
        my $request = $ua->request($req);
        unless ($request->is_success) { die 'get failed for ' . $url . ' ' . $request->status_line; }
        open(my $nextout, '>', $nextfn) or die 'cant open ' . $nextfn;
        binmode $nextout;
        print $nextout $request->decoded_content;
        close $nextout;

        # Compare the doc.kml inside the new and previous KMZ files
        my $aresame  = 1;
        my $compfile = 'doc.kml';
        if (-f $lastfn) {
            use IO::Uncompress::Unzip qw(unzip $UnzipError);
            use IO::File;
            my $nextmember = IO::Uncompress::Unzip->new($nextfn, Name => $compfile)
                or die "IO::Uncompress::Unzip failed: $UnzipError\n";
            my $lastmember = IO::Uncompress::Unzip->new($lastfn, Name => $compfile)
                or die "IO::Uncompress::Unzip failed: $UnzipError\n";
            while ($aresame && (my $nextline = <$nextmember>) && (my $lastline = <$lastmember>)) {
                unless ($nextline eq $lastline) { $aresame = 0 }
            }
            if ($aresame && (my $nextline = <$nextmember>)) { $aresame = 0 }
            if ($aresame && (my $lastline = <$lastmember>)) { $aresame = 0 }
            close($nextmember);
            close($lastmember);
        }
        else { $aresame = 0; }

        if ($aresame) {
            print "No change to file $compfile \n";
            unlink $nextfn;
            exit;
        }

        # Changed: file the new KMZ into the dated tree and make it the new "last"
        my $endfn  = $dir . $nice . '.' . $ftype;
        my $endfn0 = basename($endfn);
        my $enddir = dirname($endfn);
        unless (-d $enddir) { mustdir($enddir); }
        copy($nextfn, $endfn) or die "Copy failed: $!";
        print 'new backup:' . $endfn . "\n";
        copy($nextfn, $lastfn) or die "Copy failed: $!";
        unlink $nextfn;
        exit;

        sub mustdir {
            # Create a directory, creating any missing parents first
            my $dir = shift;
            return if (-d $dir);
            my $updir = dirname($dir);
            mustdir($updir);
            mkdir $dir;
            unless (-d $dir) { die 'cant make dir:' . $dir; }
        } # mustdir
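
        For unattended use the script can be driven entirely from its options, for example (file name and paths are examples only, assuming the code above is saved as kmlbackup.pl):

        perl kmlbackup.pl --url "http://www.google.com/maps/d/u/1/kml?mid=YOUR_MAP_ID" --dir /users/NAME/documents/FOLDER

        Run that hourly from cron or the Windows Task Scheduler and only pulls that actually changed will accumulate in the dated tree.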