Bootstrapping and cleaning after remote execution of method in framework consisting of multiple modules

by mman (Novice)
on Aug 09, 2007 at 12:52 UTC ( [id://631538] )

We have a framework consisting of 40+ modules. It is placed as a tar'ed archive on a remote Linux host in '~/.folder'. After one or two methods of one or two classes have been executed, it must be purged. The problem is: when you are dealing with a single file, you can remove it right after execution. Here we have many files that are dynamically 'use'd, so if the connection is lost we cannot guarantee that all files have been removed. Extracting the files to a temporary directory in temp is no panacea either. Loading all modules up front for each short execution is the right way to kill all performance. Is there any way to keep all modules in one file without performance penalties? Any ideas?

Replies are listed 'Best First'.
Re: Bootstrapping and cleaning after remote execution of method in framework consisting of multiple modules
by moritz (Cardinal) on Aug 09, 2007 at 14:13 UTC
    If these modules are not too big, you might consider extracting them to a ramdisk temporarily. Maybe you could even implement some kind of dependency handling to extract only the files you need.

    Somehow it sounds like your setup could be improved. What does your framework do, and why can't it be installed permanently? Do you have root access (if only once) on your machine?
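
    A rough sketch of the ramdisk idea, assuming a tmpfs mount at /dev/shm (standard on most Linux distributions) and an illustrative archive path; note that CLEANUP only runs on normal exit, but tmpfs contents don't survive a reboot anyway:

    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Scratch directory in RAM, removed automatically when the process exits.
    my $dir = tempdir( 'framework-XXXXXX',
                       DIR     => '/dev/shm',
                       CLEANUP => 1 );

    system('tar', '-xzf', "$ENV{HOME}/.folder/framework.tar.gz", '-C', $dir) == 0
        or die "extraction failed: $?";

    unshift @INC, $dir;    # make the extracted modules loadable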

      The tar'ed framework is currently about 300K in size. Dependency handling would be too difficult to implement (which modules we need is determined before the connection is made, but on a side where no Perl is available and it's too damn hard to build a dependency tree of the files). The framework executes some remote Unix binaries, parses the output and formats it as XML. It can't be installed permanently because each connection (SSH) can use a different version of the framework. Root access is possible, but not always allowed (a user can log in with root or generic user credentials). The only two solutions I blindly see are: 1) compile the framework into one huge file and delete it immediately after interpretation, with possible performance degradation (since we'd be interpreting all modules at once, not just what we need); 2) on each connection, create a 'cron' entry executing a script that monitors the connection and, if it is stale or disconnected, cleans up the framework files and itself (rough sketch below).
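
      A rough sketch of that watchdog (option 2), assuming cron passes it the PID of the SSH session and the directory to purge; every name here is made up for illustration:

      use strict;
      use warnings;
      use File::Path qw(remove_tree);

      my ($pid, $dir) = @ARGV;

      # kill 0 sends no signal; it only tests whether the process exists.
      exit 0 if kill 0, $pid;      # session still alive, nothing to do

      remove_tree($dir);           # connection gone: purge the framework
      unlink $0;                   # ...and remove the watchdog itself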
        What about deleting the temporary directory while the script is still running? On Debian GNU/Linux 4.0 "etch" the following works for me:
        # file test/script.pl
        use Foo;
        system('cd ..; rm -rf test/');
        print Foo::bar();

        # file test/Foo.pm
        package Foo;
        sub bar { return "I'm happy\n"; }
        1;

        If I run script.pl from its directory, it loads package Foo correctly and deletes both itself and the module. Afterwards the script and the already-loaded module keep working within the running process.

        If you implement that (very carefully; make sure you delete the right directory), you can clean up after yourself very neatly.

        A big warning: you can cause much damage with such a script...

        "It [the framework] can't be installed pemanently cause each connection (SSH) can use different version of framework."

        How many different versions are there? Can't you just store them permanently in a version-specific directory? (Copy over if needed, otherwise reuse the existing one.) Copying nothing is a lot faster than copying something.

        If the number of different ones is going to grow without bound, then use 'find -ctime +3' or something to prune out the old ones. Then you'll have a time-bounded LRU cache. Whee!
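
        A sketch of that cache, with assumed paths and a Perl stand-in for the 'find -ctime +3' pruning:

        use strict;
        use warnings;
        use File::Path qw(make_path remove_tree);

        my $cache = "$ENV{HOME}/.framework-cache";
        my $want  = "$cache/v1.2.3";      # version negotiated by the client

        unless (-d $want) {               # copy only when the version is missing
            make_path($want);
            system('tar', '-xzf', "$ENV{HOME}/.folder/framework.tar.gz",
                   '-C', $want) == 0 or die "unpack failed: $?";
        }

        # Prune versions whose inode change time is older than three days,
        # roughly what 'find -ctime +3' would match.
        for my $dir (glob "$cache/v*") {
            remove_tree($dir) if $dir ne $want && -C $dir > 3;
        }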

Re: Bootstrapping and cleaning after remote execution of method in framework consisting of multiple modules
by shmem (Chancellor) on Aug 09, 2007 at 17:45 UTC
    It is possible to keep many modules in a single file. To shorten startup time and delay compilation of methods until they are actually used, I'd do something like SelfLoader does.

    I'd put all delayable code into the __DATA__ section, and provide each package with its start and end offsets into the __DATA__ section in a hash table (initially written with enough whitespace in place of the values, as placeholders for the offsets, which aren't known until the file is assembled).

    That way an AUTOLOAD subroutine would seek to that point, read the appropriate number of bytes, eval them, and goto the freshly compiled sub or method.
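
    A minimal, self-contained sketch of that scheme (the package name, the demo sub, and the offset values are all illustrative; a real build step would patch the placeholder offsets after concatenating the modules into __DATA__):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Record where __DATA__ starts before anything reads from it.
    my $data_start = tell DATA;

    # "Package::sub" => [ byte offset into __DATA__, length in bytes ]
    my %offset = (
        'Demo::greet' => [ 0, 38 ],        # patched at build time
    );

    package Demo;

    sub AUTOLOAD {
        my $name = our $AUTOLOAD;
        return if $name =~ /::DESTROY$/;   # never fault in destructors
        my $entry = $offset{$name}
            or die "Undefined subroutine &$name called";
        my ($start, $len) = @$entry;
        seek DATA, $data_start + $start, 0 or die "seek failed: $!";
        read DATA, my $code, $len;
        eval $code;                        # compile the delayed sub now
        die $@ if $@;
        no strict 'refs';
        goto &$name;                       # re-dispatch to the real sub
    }

    package main;

    Demo::greet();                         # compiled on first call

    __DATA__
    sub Demo::greet { print "delayed!\n" }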

    --shmem

      Good solution for having all methods in one file, but we have tested another one, less effective performance-wise, but one that cleans up after itself: we simply 'use' all modules in the framework (which is not as slow as we expected, about 0.2 s), and each module is deleted right after it is 'use'd. First of all, starter.pl is run by 'perl' with the '-d' option; as soon as it starts, it deletes itself. So we get all the necessary modules loaded and no garbage left in the directory.
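
      Roughly like this (the file layout is assumed, and this sketch ignores inter-module 'use' dependencies, which would be pulled in through @INC before their files are unlinked):

      # starter.pl -- load everything, then leave nothing behind
      use strict;
      use warnings;
      use FindBin qw($Bin);
      use lib $Bin;

      unlink $0;                   # a running script may delete its own file

      for my $pm (glob "$Bin/*.pm") {
          require $pm;             # compile the module into memory...
          unlink $pm;              # ...then remove it from disk
      }

      # All classes are now resident in memory and the directory is clean.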
