
Transferring a Catalyst installation

by dwm042 (Priest)
on Nov 02, 2010 at 13:59 UTC ( [id://868993] )

dwm042 has asked for the wisdom of the Perl Monks concerning the following question:

In preparation for a talk I'm to give in December, I've been building Catalyst installations in all the various ways described in Diment and Trout. I have three virtual servers running four different Catalyst configurations (from distro packages, from system Perl using CPAN and local::lib, from custom Perl using CPAN). Trying to move some of my original Catalyst work from one of these installations to another has proved to be challenging.

Fedora's packages lag behind Ubuntu's apt-get repositories. Suse is on Perl 5.12, the others on 5.10, and CPAN is more up to date than any of the distros. Worse, moving from one Catalyst 5.8 subvariation to another via a simple tarball results in Catalyst "guts" that don't sync up. I'm not 100% certain, but I think a successful transfer is a bit like baking a pie: you need to start with a fresh "crust" (new components created using the helper script) each time, then fill the crust with your own code (your "filling", in this analogy) and go.
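A minimal sketch of that "fresh crust" workflow, assuming the app is called MyApp and the old working copy lives in ~/old (both names are illustrative; catalyst.pl is the helper script shipped with Catalyst::Devel):

```shell
# Generate a brand-new skeleton with the *target* machine's Catalyst,
# so the generated scripts and Makefile.PL match its installed version:
catalyst.pl MyApp

# Then copy only your own code (the "filling") over the fresh skeleton,
# leaving the newly generated script/ directory and Makefile.PL alone:
cp -r ~/old/MyApp/lib/MyApp/Controller MyApp/lib/MyApp/
cp -r ~/old/MyApp/lib/MyApp/Model      MyApp/lib/MyApp/
cp -r ~/old/MyApp/lib/MyApp/View       MyApp/lib/MyApp/
cp -r ~/old/MyApp/root                 MyApp/
cp    ~/old/MyApp/myapp.conf           MyApp/
```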

Does anyone else have any experience with this, and what were their results?


Replies are listed 'Best First'.
Re: Transferring a Catalyst installation
by Your Mother (Archbishop) on Nov 02, 2010 at 17:10 UTC

    What CountZero said, plus make sure your Moose and Class::MOP versions jibe (there have also been issues with Mech and LWP causing spurious test failures). I love Moose, but it has been a sore spot in keeping things stable for quite a while.

    If all the requirements are in, you should definitely be able to run any given Cat app in various environments. Also make sure your PERL5LIB, etc., isn't causing problems. If it's still not working, maybe come back with some examples of the errors / problems?

Re: Transferring a Catalyst installation
by jethro (Monsignor) on Nov 02, 2010 at 15:03 UTC
    This "crust" method of supplying templates for your own code is one of two reasons why I'm not comfortable with Catalyst, and I'm seriously considering CGI::Application for my upcoming web project.

    Do you have any info on how the crust has changed over time? It should be easy to find out: generate an app with an old Catalyst version and with a new one, then diff the results. If you see functional differences, Catalyst has a design weakness; otherwise it's probably not the "crust's" fault. It also depends on what else the helper is doing; if it modifies internal configs or lists, it might be even worse. But that's a question best put to someone on the Catalyst team.
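Such a diff could be produced along these lines (the two install prefixes are hypothetical; each would hold a perl with a different Catalyst::Devel version):

```shell
# Generate the same skeleton app with an old and a new Catalyst helper,
# then compare the two generated trees:
/opt/cat-old/bin/catalyst.pl DemoApp && mv DemoApp DemoApp.old
/opt/cat-new/bin/catalyst.pl DemoApp && mv DemoApp DemoApp.new

# Recursive unified diff shows every change in the generated "crust":
diff -ru DemoApp.old DemoApp.new | less
```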

      I have to agree with you about the crust. I've had to patch Catalyst::Helper's templates (site-specific; don't ask) to get Catalyst to generate working scripts. It'd be nice to have a standard way for a site to hook site-specific behavior in here without having to maintain that kind of patch.

      You said two reasons. What's the other one?

        The other reason is that I think the strict MVC model is not consistent with good object-oriented design. Not my own idea; there was a nice article that convinced me (it may have been the paper discussed here. UPDATE: No, it was a different one, but it made a similar point).

        Now, I don't have much practice in OO programming and none in web programming, so don't take my word for it. But the paper makes some very good points.

        Also, the data of my web project is very "departmentalized" (i.e., one web page would usually read two small parts of the whole information and write to one), which makes flat files rather than a database possible; and quite frankly, I loathe databases when it comes to maintenance and debugging.

        Catalyst seems to adhere more strictly to the MVC design, with its strict separation and its reliance on a database, so CGI::Application may be a better fit for me, even though I spotted some things I don't like there too.

        I really haven't decided yet. For example, Catalyst seems to assume database use; only the older of the two books about Catalyst mentions how to avoid a database, but it still seems possible. The same goes for object handling. Maybe I need to build a prototype first to help me decide.

Re: Transferring a Catalyst installation
by CountZero (Bishop) on Nov 02, 2010 at 15:39 UTC
    The actual Perl version should not matter much, provided you have the same Catalyst version on all your systems.


    "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

Re: Transferring a Catalyst installation
by juster (Friar) on Nov 03, 2010 at 15:22 UTC

    If you have virtual servers, you could always create a separate prefix for the webapp, then compile and install perl into that prefix. You can usually get better performance from compiling your own perl (e.g., without threads). Another plus is that you don't have to worry about an upgrade of the system perl harming your webapp.

    You could then tarball up the whole thing and not worry about the "crust" and "filling". Or make the whole "crust" from scratch on the next machine and tarball only the filling: your Catalyst app. This takes a little longer, since everything has to compile, but it is very straightforward and, like I said, you have less to worry about in the future.
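A sketch of that private-prefix build, with the prefix and perl version purely illustrative:

```shell
# Build a self-contained perl under one prefix, owned by the webapp:
PREFIX=/opt/myapp

cd perl-5.12.2
./Configure -des -Dprefix=$PREFIX -Uusethreads  # unthreaded perl is usually faster
make && make test && make install

# Install the app's dependencies into the same prefix, using that
# perl's own cpan client so nothing touches the system perl:
$PREFIX/bin/cpan Catalyst::Runtime Catalyst::Devel

# The whole stack can now be tarred up and moved as one unit:
tar czf myapp-stack.tar.gz $PREFIX
```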

    I think this is analogous to what CountZero and Your Mother have said. I like compiling perl as well, because XS modules (e.g., Class::MOP) won't break on a major system perl upgrade.