PerlMonks
Re: Viewing website locally

by malaga (Pilgrim)
on Jun 14, 2002 at 11:22 UTC ( [id://174455] )


in reply to Generating static HTML pages to view a dynamic website locally

Thank you!

When you say regenerate the HTML pages, what do you mean? What would be the difference between that and a crawler that creates the static pages? What I was thinking of was to organize and name the files well, and have the script go through each directory and create the pages and links accordingly.

I would just have to change the IP number in the script, I guess, to use the server-on-her-PC solution. I don't think that's the best solution, but I can't back up that opinion right now.

(jeffa) 2Re: Viewing website locally
by jeffa (Bishop) on Jun 14, 2002 at 15:14 UTC
    "When you say regenerate the html pages, what do you mean?"

    Consider the useful Album script, which recursively scans a directory and builds thumbnails and HTML pages. You run it once, move the contents to your web server, and when you need to update, repeat. (example)
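    A minimal sketch of that regenerate-from-disk idea (this is not the Album script itself, which also builds thumbnails; the demo/ directory names are made up for illustration): walk a directory tree and write a static index.html in each directory linking to its contents.

```shell
# Build a small demo tree to run against (hypothetical names).
mkdir -p demo/trip1 demo/trip2
touch demo/trip1/a.jpg demo/trip2/b.jpg

# For every directory, emit an index.html listing that directory's entries.
find demo -type d | while read -r dir; do
  {
    echo "<html><body><ul>"
    for f in "$dir"/*; do
      [ -e "$f" ] || continue               # skip empty directories
      name=$(basename "$f")
      [ "$name" = "index.html" ] && continue  # don't link the index to itself
      echo "<li><a href=\"$name\">$name</a></li>"
    done
    echo "</ul></body></html>"
  } > "$dir/index.html"
done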

    Conversely, I use a crawler for the DBIx::XHTML_Table site. I run mod_perl locally on my machine, and when I make updates, I run a mirror on unlocalhost.com that contacts my local machine and downloads the HTML files.
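    The crawler/mirror approach can be sketched with wget (a stand-in for whatever mirror script actually runs; the localhost URL and port are assumptions, so point it at wherever your local dynamic server listens):

```shell
# Crawl the locally served dynamic site and save the rendered pages as
# static files; --convert-links rewrites the saved pages so they link to
# each other locally instead of back to the live server.
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent http://localhost:8080/
```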

    If you install a server on your client's PC, you shouldn't have to worry about changing IP numbers as long as all of the links are relative and not absolute - for example, don't use links in your HTML pages such as "http://somedomain.com/index.html" - use "/index.html" instead.
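    To illustrate with a hypothetical one-liner (not from the thread): if pages were already generated with hard-coded absolute URLs, a quick rewrite makes them root-relative so they work from any host. "somedomain.com" is just the example domain from above.

```shell
# Demo page containing a hard-coded absolute link.
printf '<a href="http://somedomain.com/index.html">home</a>\n' > page.html

# Rewrite the absolute prefix to root-relative (GNU sed's in-place -i).
sed -i 's|http://somedomain\.com/|/|g' page.html

cat page.html   # → <a href="/index.html">home</a>
```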

    jeffa

    L-LL-L--L-LL-L--L-LL-L--
    -R--R-RR-R--R-RR-R--R-RR
    B--B--B--B--B--B--B--B--
    H---H---H---H---H---H---
    (the triplet paradiddle with high-hat)