PerlMonks  

Should we have PerlC and PRE?

by dingus (Friar)
on Dec 07, 2002 at 17:29 UTC ( [id://218268]=perlmeditation )

The Is perl scalable? discussion got me thinking about this. Java has a "compiler" and a run-time engine, and so does .Net. Perhaps Perl should have the same tools as a well-publicised option.

Why?

Mainly for marketing reasons, as I don't see a technical benefit - and I can see a number of technical advantages in the current solution. As I see it, there are two or three ways that detractors criticise Perl. The first is that Perl is an interpreted language and therefore "slow"; the second (alluded to in Rot13 Source Jumbling) is that your code can be "stolen" because the source is always visible. Thirdly (and in some ways it's a part of #2), there is no easy way to bundle up the modules you need into one lump for easy distribution.

Technically it is reasonable to argue that these are in fact strengths (and I seem to recall some recent posts by merlyn doing just that), and I am not proposing that Perl be required to have only a two-stage process. However, I believe it could be advantageous to have the option of a Perl "compiler" and a Perl "runtime engine" available as an alternative for those who wish to ship code in a bytecode format.

In some ways this seems like a logical extension of the PAR module and/or perlcc program. What I am proposing is that the intermediate format, something like 'perlcc -B', become supported as a method of code distribution / execution.

Ok so what are the benefits?

Well, the big one is that it allows you to produce Perl code that is run on regular users' machines - a program like popfile, which has been discussed on this site in Perl for the Masses, is potentially useful to millions of people but is messy when distributed in source form. For a commercial product, which you do not wish to release under an open source license, the messiness is even worse: the source code of your program, comments and all, is visible to all your customers unless you deliberately run it through an obfuscator. If you want to do that, why not get some speed advantage as well by shipping in bytecode format?

Another advantage is that it eliminates version mismatches. Now I have to admit that perl programs do not seem to suffer too much from version issues, but even the gods are not infallible and it is definitely useful - for mass distribution to non technical users at least - to be able to remove weird version glitches as a possible reason why the program doesn't work on a customer's machine.

Finally, there are situations where you want to run a Perl script many times but need to be sure that the script starts from a consistent spot. The problem with the mod_perl approach is that you can get unexpected results because some data turns out to be persistent. Of course this sort of thing is a bug, but it is easier to make the scalability argument if you say that you can take the source code, precompile it, and your service now goes N times faster.

Thoughts?

Dingus


Enter any 47-digit prime number to continue.

Re: Should we have PerlC and PRE?
by belg4mit (Prior) on Dec 07, 2002 at 19:20 UTC
    Here are some other resources you missed: App::Packer, Perl2EXE, PerlApp

    I don't believe in hiding code in some unnecessary binary format, although I have done something similar to App::Packer in one of my own distributions.

    --
    I'm not belgian but I play one on TV.

Re: Should we have PerlC and PRE?
by gjb (Vicar) on Dec 07, 2002 at 17:44 UTC

    I think this approach definitely has merits. At the company I'm currently working for, I've pioneered the use of Perl, so before Perl got "generally accepted" (this took about a year), my colleagues didn't use the software I wrote "because it was too much trouble to install" (not too difficult, mind you - too much trouble: modules and whatnot).

    Life would have been much easier with a tool to bundle software à la PAR or a PRE as suggested in this meditation. At some point a -- gasp -- Visual Basic program was used rather than my Perl code, since the former was a simple executable and the Perl version required installing a few modules.

    To conclude: I second this suggestion wholeheartedly.

    Just my 2 cents, -gjb-

    Update: my thanks to grinder for suggesting the correct HTML entity for à.
      That's the kind of situation where a network installation is probably the best solution. Any way you slice it there's no good reason for 30 people in the same office to have 30 copies of HTML::Parser.

      --
      I'm not belgian but I play one on TV.

Re: Should we have PerlC and PRE?
by grantm (Parson) on Dec 07, 2002 at 20:34 UTC

    Don't forget that this will only work for pure Perl modules. XS modules get compiled to native object code at build/install time and this is stored in a platform specific dynamically loadable object (DLL or .so).

    One of the other posters mentioned HTML::Parser as an example - in fact, HTML::Parser includes an XS component, so only a very small part of it gets compiled to bytecode. Many useful modules are Perl wrappers for C libraries (GUI libraries, DBI/DBD, compression, encryption, etc) and also wouldn't be compatible with your proposal.

    But let's go back to the original problem - Perl doesn't allow you to use components without installing them first. Well hello! Write a VB app and you'll have to install both the VB runtime and any ActiveX controls you need on target machines. Java is the same - sure you can package stuff in a JAR file but any class which wraps a native library will need that library installed too. The typical Java response to this problem is to re-implement a library in Java. Even a C app requires you to have the required libraries installed (unless you squander resources and link everything statically).

      Write a VB app and you'll have to install both the VB runtime and any ActiveX controls you need on target machines.
      Now it is so, sadly, thanks to OLE and ActiveX. But with VB3, one of the last versions of VB for 16-bit Windows, you didn't need it: just put your executable, the VB runtime engine, and any VBX files you used (plus the DLLs used by the VBXes behind the scenes) all in one directory on a CD-ROM, and you could run it on any PC that had a CD-ROM drive, without installation. It was a major advantage.

      And with a Perl application "compiled" into one executable with perl2exe (or likely PerlApp, though I have not tried that), you can now (still) do the same with a Perl script - even if it includes XS modules, as the compiled DLL files are included as well.

Re: Should we have PerlC and PRE?
by BrowserUk (Patriarch) on Dec 08, 2002 at 03:08 UTC

    This subject was discussed recently at Pre-position musing on "standalone executables".

    As I said there, I too think that the ability to package Perl scripts in a bytecode form would be extremely useful for a variety of reasons.

    • Ease of deployment.
    • Reduction in startup time.
    • A certain amount of protection for the author of commercial work.

    I think that with Perl 6 and the Parrot engine, many of the reasons that make this non-viable for Perl 5 tend to dissipate.

    Once you can drop into Parrot to achieve things not possible (or too slow) in native perl, the need to use XS, I think, will likely go away. Of course there will still be modules that consist of interface wrappers around C-libraries and they would still need to be deployed (in binary form) along with bytecode applications and the perl runtime, but that is standard practice for many other languages; on the windows platform at least.

    If the parrot engine turns out to be as efficient as the promise suggests, maybe many of the current modules that use C-libraries could be re-built using a C to parrot translator?

    Whilst notionally it may seem wasteful to package modules 'statically linked' with each application, in reality the size of the bytecode will in almost every case be far less than that of the same modules' source code, so concerns about squandered resources don't really hold up. This is especially true when you consider that each application would only need to carry with it the bytecode for the modules that it actually uses. For a very high percentage of applications, not having to have all the core modules' source code on the target machine (or available via a share) means that the static linking would result in a reduction in resource usage rather than an increase - especially as the application itself could then be much more easily deployed across a network using an NFS-type solution than Perl itself can.

    Whilst it is true that any non-Perl/Parrot components of modules would still need to be compiled on each target system type - which, in the *nix world of multiple variations, means that CPAN would still need to serve source rather than bytecode - most corporate environments have a limited number of platforms (usually one or two at most, for sanity's sake), while the number of machines of each type can be in the thousands. Being able to bytecode-compile and package internal applications targeted at those (internally) standard platforms would be a huge benefit.

    Whilst it can be argued that a similar effect can be achieved now by deploying Perl and the source of the application on a centralised server, in many corporate environments this is not possible. Take the example of the large retail chains. They need to deploy their internal applications not to several thousand workstations across one huge heterogeneous networked environment, but rather to a few dozen workstations at each of several hundred sites. In this environment, the need to set up a Perl installation at each of those hundreds of sites means that Perl does not get a look-in.

    The ability to build and deploy stand-alone applications would go a long way towards easing Perl's entry into this kind of application. I can say from experience that in one very large project, Perl was mooted as an option (which I personally voted against at the time for entirely different reasons*), but it was eventually squashed as an option purely on the basis of the cost of installing and maintaining Perl installations at 2000 sites!

    * I should add that were I now to be making the same decision, it would be different. Having seen the problems that arose from the alternative that was chosen, and with the benefit of my new-found knowledge of Perl, I truly wish that Perl had been chosen for that project. I firmly believe that it would have saved an enormous amount of time, cost, trouble and resources.


    Okay you lot, get your wings on the left, halos on the right. It's one size fits all, and "No!", you can't have a different color.
    Pick up your cloud down the end, and "Yes", if you get allocated a grey one they are a bit damp underfoot, but someone has to take them.
    Get used to the wings fast 'cos it's an 8-hour day... unless the Governor calls for a cyclone or hurricane, in which case 16-hour shifts are mandatory.
    Just be grateful that you arrived just as the tornado season finished. Them buggers are real work.

Re: Should we have PerlC and PRE?
by Anonymous Monk on Dec 07, 2002 at 21:00 UTC
    A big part of the problem is: Who makes these decisions? The community? Larry Wall? ORA???

    It would be nice if there were an organization or company that was not only involved with such decisions, but also had some financial muscle to put behind them.

    Personally, I look at the PAR project as a very exciting thing, but even if it is the right solution to the current problems, it will take longer to develop because Perl doesn't have anything to put behind it.

    Some of you may view this as a good thing. I think you are wrong. Having such a company or organization would help Perl tremendously. Just look what IBM's and even Sun's adoption of Linux has done for Linux. Look what Zend has done for PHP. Shall I go on?

    It is going to take a lot more than Perl 6 to change the prevailing opinion of Perl as a good tool, but yesterday's solution.

      I find your comments intriguing. I think you are correct on one level, but I wonder on another. For instance, Perl is now bundled with just about every major *nix OS, MS put a bunch of money into Perl (see perlfork's credits), and a lot of major companies donate to YAS so that various people can have their Perl development financially supported. So the picture isn't so straightforward, although I think it's hard not to come to the conclusion that YAS doesn't do a very good job of publicizing Perl. Then again, this could be because there are enough hype engines out there and Perl just wants to get the job done.

      --- demerphq
      my friends call me, usually because I'm late....

      A big part of the problem is: Who makes these decisions?

      Whoever steps up to the plate. I don't see any reason why it would require Larry's or anyone else's official support. This is how open source works.

      Although, I'm still not sure how this differs from what Parrot promises to deliver. Unless I'm missing something, it would seem better to work with them rather than starting your own project.

      Update: Changed link from parrotcode.com (which is unrelated) to parrotcode.org (the correct site). Thanks to grantm for pointing this out.

Re: Should we have PerlC and PRE?
by cjf-II (Monk) on Dec 08, 2002 at 00:52 UTC
    What I am proposing is that the intermediate format, something like 'perlcc -B' ....

    I've probably overlooked something here, but how will this 'intermediate format' differ from Parrot bytecode?

Re: Should we have PerlC and PRE?
by Abigail-II (Bishop) on Dec 09, 2002 at 18:06 UTC
    Perl isn't slow (compared to, say, C) because it's interpreted. It isn't interpreted in the way we normally think about interpreted languages: Perl code gets compiled before it can be run - Perl just doesn't have a separate compilation phase. But the speed gain from caching the compilation is small. Perl is slow because it's so flexible. It allows for things like:
    push @array => splice @list, $foo, $bar, @new;
    $str .= "whatever";

    And throw in memory management, dynamic scoping, AUTOLOADING, evalling and lots of other things that make it fun to program in Perl, and you will have a significant speed penalty to pay. Compiling will not solve that - you still will need to do the same steps Perl is already doing (and it's doing them in C code).

    If for whatever reason you'd rather ship a binary instead of Perl code, you could always write a small C program, embed perl in the program, and pass your Perl program as a C string to the embedded perl interpreter. You might want to do some munging of the code to defend against a strings attack.

    Of course, the real defence against source-code "stealing" is using proper contracts. I used to work for a company that sold software, often for prices over $1,000,000. We threw in the source code (C) for free (many dozens of MBs) for any customer requesting it. For some customers, being able to get the source code was a must - it needed to be audited. We never ran into any problems of code being "stolen". But we did get bugfixes this way.

    Abigail

Re: Should we have PerlC and PRE?
by Aristotle (Chancellor) on Dec 10, 2002 at 15:20 UTC
    One problem you overlook is licensing. If you compile all those CPAN modules you're using into a package of your own, then you will have to conform to their licenses. I'm not sure that's what you wanted to end up with.

    Makeshifts last the longest.

Node Type: perlmeditation [id://218268]
Front-paged by gjb