Generating static HTML pages to view a dynamic website locally

by malaga (Pilgrim)
on Jun 14, 2002 at 07:36 UTC

malaga has asked for the wisdom of the Perl Monks concerning the following question:

I have a client who wants me to build something to automate showing off her 1200 (and counting) pictures with accompanying text. That's easy enough, but she wants to be able to download the website and view it locally without connecting to the web, the way she can now with her static HTML site.

The only thing I could think of was to use Perl to create the HTML pages. The script would go out and find the pictures and text files according to their names (alpha and dates), then spit out and save an HTML file with the appropriate text and links to pictures. It would also need a process for linking all the pages together. Then she would download the resulting HTML pages and image files and have her local copy.

It can't be an original problem. Is there a Perl module that does anything like this? She might be willing to load Apache on her system, but then I'd have to adjust the script to run locally, and I don't really think that's the answer either.

Any thoughts are appreciated.
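
For concreteness, here is a rough sketch of the sort of generator I have in mind. The directory layout (images under ./pictures, captions in matching .txt files) and the page-naming scheme are just guesses for illustration, not the actual site:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $src = './pictures';   # images plus matching .txt caption files
    my $out = './html';       # generated pages land here
    mkdir $out unless -d $out;

    opendir my $dh, $src or die "Can't open $src: $!";
    my @images = sort grep { /\.jpe?g$/i } readdir $dh;
    closedir $dh;

    for my $i (0 .. $#images) {
        my $img = $images[$i];
        (my $base = $img) =~ s/\.jpe?g$//i;

        # Pull in the caption, if a matching text file exists.
        my $caption = '';
        if (open my $txt, '<', "$src/$base.txt") {
            local $/;
            $caption = <$txt>;
            close $txt;
        }

        # Relative prev/next links, so the pages also work from local disk.
        my $n    = $i + 1;
        my $prev = $i > 0        ? qq{<a href="@{[$n-1]}.html">prev</a>} : '';
        my $next = $i < $#images ? qq{<a href="@{[$n+1]}.html">next</a>} : '';

        open my $page, '>', "$out/$n.html" or die "Can't write $n.html: $!";
        print $page join "\n",
            "<html><head><title>$base</title></head><body>",
            qq{<img src="../pictures/$img" alt="$base">},
            "<p>$caption</p>",
            "$prev $next",
            "</body></html>\n";
        close $page;
    }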

Edited 2002-06-14 by mirod: changed the title (was: Viewing website locally).


Replies are listed 'Best First'.
Re: Viewing website locally
by Abigail-II (Bishop) on Jun 14, 2002 at 09:04 UTC
    Some things you can do:
    • Don't have any dynamic pages. Make the entire website static, and just regenerate it on a regular basis, or whenever a new image is uploaded. This has many advantages for the viewers: things can be cached and performance should be better (less strain on the server). It also means you can tar up the site and view it locally.
    • Install a webserver locally. I fail to see what you need to do to adjust the script to "run locally". It's still run from the webserver.
    • Make a crawler that extracts all the pages from the server. You'd have to change all the links and turn them into static pages.
    None of this is Perl related. You would face the same problems, with the same solutions, had you written the site using Ada, Haskell or vi-macros.

    Abigail

Re: Viewing website locally
by Aristotle (Chancellor) on Jun 14, 2002 at 08:58 UTC
    Maybe HTML::WebMake fits your bill? There's also Engelschall's all-encompassing WML suite, which uses Perl as well as a range of other text-processing tools.

    Makeshifts last the longest.

Re: Viewing website locally
by merlyn (Sage) on Jun 14, 2002 at 12:05 UTC
    In addition to the other solutions, there's nothing stopping you from running a local webserver to handle the dynamic content. I run websites "locally" on my laptop all the time: I have the same configuration on my laptop as I have on http://www.stonehenge.com for editing and testing new ideas at 30,000 feet while I'm travelling.

    Apache would be a good choice for this, or you could construct a mini web-server from HTTP::Daemon that understands the CGI protocol with a few dozen lines of code.
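
    A bare-bones server along those lines might look like the sketch below. It serves static files only; the ./htdocs document root and the port are arbitrary choices, and real CGI dispatch would take a little more work:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use HTTP::Daemon;
        use HTTP::Status;

        # Serve files out of ./htdocs on localhost:8080 -- a sketch, not Apache.
        my $d = HTTP::Daemon->new(LocalAddr => 'localhost', LocalPort => 8080)
            or die "Can't start daemon: $!";
        print 'Browse to ', $d->url, "\n";

        while (my $c = $d->accept) {
            while (my $r = $c->get_request) {
                if ($r->method eq 'GET') {
                    my $path = $r->uri->path;
                    $path .= 'index.html' if $path =~ m{/$};
                    $c->send_file_response("./htdocs$path");
                }
                else {
                    $c->send_error(RC_FORBIDDEN);
                }
            }
            $c->close;
            undef $c;
        }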

    -- Randal L. Schwartz, Perl hacker

Re: Viewing website locally
by malaga (Pilgrim) on Jun 14, 2002 at 11:22 UTC
    Thank you!

    When you say regenerate the HTML pages, what do you mean? What would be the difference between that and a crawler that creates the static pages? What I was thinking of was to organize and name the files well, and have the script go through each directory and create the pages and links accordingly.

    I would just have to change the IP number in the script, I guess, to use the server-on-her-PC solution. I don't think that's the best solution, but I can't back up that opinion right now.

      "When you say regenerate the html pages, what do you mean?

      Consider the useful Album script, which recursively scans a directory and builds thumbnails and HTML pages. You run it once, move the contents to your web server, and when you need to update, repeat. (example)

      Conversely, I use a crawler for the DBIx::XHTML_Table site. I run mod_perl locally on my machine and when I make updates, I run a mirror on unlocalhost.com that contacts my local machine and downloads the HTML files.

      If you install a server on your client's PC, you shouldn't have to worry about changing IP numbers as long as all of the links are relative and not absolute - for example, don't use links in your HTML pages such as "http://somedomain.com/index.html" - use "/index.html" instead.
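
      For the curious, a toy same-site crawler in that spirit might look like this. The start URL and output directory are placeholders, and a real mirror would also want politeness delays and better error handling:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use LWP::UserAgent;
        use HTML::LinkExtor;
        use URI;
        use File::Path qw(mkpath);
        use File::Basename qw(dirname);

        my $start = URI->new('http://localhost/index.html');
        my $out   = './mirror';
        my $ua    = LWP::UserAgent->new;

        my %seen;
        my @queue = ($start);

        while (my $url = shift @queue) {
            next if $seen{$url}++;
            my $res = $ua->get($url);
            next unless $res->is_success;

            # Save under a path that mirrors the URL's path.
            my $file = $out . $url->path;
            $file .= 'index.html' if $file =~ m{/$};
            mkpath(dirname($file));
            open my $fh, '>', $file or die "Can't write $file: $!";
            print $fh $res->content;
            close $fh;

            # Follow only http links that stay on the same host.
            next unless $res->content_type eq 'text/html';
            my $extor = HTML::LinkExtor->new(undef, $url);
            $extor->parse($res->content);
            for my $link ($extor->links) {
                my ($tag, %attr) = @$link;
                for my $u (values %attr) {
                    my $uri = URI->new($u);
                    next unless $uri->scheme && $uri->scheme eq 'http';
                    next unless $uri->host eq $start->host;
                    $uri->fragment(undef);    # ignore #anchors
                    push @queue, $uri;
                }
            }
        }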

      jeffa

      L-LL-L--L-LL-L--L-LL-L--
      -R--R-RR-R--R-RR-R--R-RR
      B--B--B--B--B--B--B--B--
      H---H---H---H---H---H---
      (the triplet paradiddle with high-hat)
      
Re: Generating static HTML pages to view a dynamic website locally
by malaga (Pilgrim) on Jun 14, 2002 at 18:40 UTC
    Got it. Thanks a bunch. I love what the crawler does with hardly any code. I'll give her the choice of that or installing a webserver. Or both.
