http://qs321.pair.com?node_id=1106252


in reply to Re: The future of Perl?
in thread The future of Perl?

Did you answer "Yes" to my first question?

Yes. With caveats. I would be more than willing to expend some, or even a lot, of my time contributing to giving Perl a future.

OK. Release to CPAN. Report bugs. Fork projects. Spread knowledge.

All completely pointless. Just business as usual. More of the same.

The caveats:

  1. I have to believe that what I am expending my time on will achieve that goal.

    Pretty much anything less than a full fork of the code base -- one that ditched the existing revision history, all the out-of-date OS support, and the huge swaths of other historical gunk -- wouldn't interest me.

  2. The future being aimed for has to have sufficient support from enough others, and significant others, in the community to allow it to be seen and announced as a community goal.

    The bottom line here is that unless it garnered the support and active involvement of at least some of the more active and less entrenched guys from p5p, there would be no point in starting it.

  3. The time frame for the goal has to be such that it can be reasonably predicted to be achievable before it is too late.

    I know many would say that it is already too late; but I think that if the right goal were chosen, and it could be achieved within 2 to 3 years, I would be prepared to try and help.

What should that goal be? I have my ideas; but it would be pointless to lay them out; my ideas would be a magnet for widespread, cursory dismissal.

It would require widespread and public consultation -- no hidden enclaves behind closed doors by small groups of yesterday's in-crowd -- and wide(ish) agreement by a sufficiently capable and influential group of proven contributors and, if not totally new blood, at least enough occasional contributors and (perhaps) returning disillusioned people, to give a core of willing people to make it happen.

And *ALL* of them would have to have an equal voice in the discussion of what gets done and why; even if not in the final decisions.

And it would have to happen fast. And that means no high horses, entrenched positions, or appeals to higher, prior (historical) authority.

It's not going to happen; but it could with sufficient good will.


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.

Re^3: The future of Perl?
by choroba (Cardinal) on Nov 05, 2014 at 21:45 UTC
    How do other languages get by with so few modules available?
    Which ones do you mean?
    لսႽ† ᥲᥒ⚪⟊Ⴙᘓᖇ Ꮅᘓᖇ⎱ Ⴙᥲ𝇋ƙᘓᖇ

      So, things have moved on from when I last looked at Python & Ruby -- it seems they've also embraced the 'never mind the quality, feel the width' attitude -- but they are the wrong targets.

      See Re^4: Would you stay with Perl if there were no CPAN?. In particular note:

      "only 2% of them seem to be downloaded/released with any regularity, and indeed about the same 2% look to be the only ones I could imagine more than a handful of people ever finding useful, ever, just based on their problem space."

      and

      "Reading between the lines, they're [Haskell developers] trying to optimize for minimalism, efficiency, and elegance long-term, even in the published libraries, in exchange for some of the "benefits" of more "flood algorithm"-y approaches... As a result, the vast majority of Hackage packages implement thousands of "known" algorithms and standardized protocols/interfaces, making them very useful to scientists and other users of "hard" comp-sci. While not preaching "one way to do it", in most cases there is only one choice because it is so definitively/obviously optimum, there's no reason to ask the question if you really understand the problem space."

      and

      "don't compare Perl to Python or Ruby (same case with PHP) anymore, the three of those are so far behind the .NET ecosystem, the Java monster/monstrosity, and the less visible but ubiquitous JavaScript juggernaut, that if you want to talk about growing the Perl userbase by embracing and extending the other language communities, you should try to target the 90% of the the "trained" professional programmers who use the plurality languages/systems, not the other 10%."

      I can't say it better. More packages won't help, unless those packages are authoritatively written and used by experts in vertical markets that are in current demand and growth. If nobody is using Perl, there is nobody to write those packages.


Re^3: The future of Perl?
by salva (Canon) on Nov 10, 2014 at 22:50 UTC
    all the out-of-date OS support and huge swaths of other historical gunk

    Unfortunately, in relative terms, that accounts for almost nothing.

    99% of the complexity of the perl interpreter/compiler/runtime is due to its initial design. It is a clear example of premature and abusive optimization... maybe it made sense twenty years ago, but nowadays it is just a heavy burden stopping perl 5 development in anything but trivial matters, and especially in getting new blood into it.

    Anybody wanting to advance perl 5 seriously should consider starting from scratch!

      Anybody wanting to advance perl 5 seriously should consider starting from scratch!

      It's called Perl 6.

      Unfortunately, in relative terms, that accounts for almost nothing. 99% of the complexity of the perl interpreter/compiler/runtime is due to its initial design.

      That's why you reduce & refactor.

      Thirty-something years ago I was contracted by IBM to work a maintenance gig on DB2. It was at that point written in COBOL, of which I had minimal experience: I had done a 4-week course on it at college.

      I (typically) went into the APAR database and picked the longest-outstanding bugs to tackle. One example was a sev.4 that had been outstanding for 4 years. I spent 3 days trying to understand the bug report in the context of the code, and two more badgering a user to reproduce the problem.

      Once I understood the problem, I tracked the bug to a 2500-line procedure; upon inspection, although I understood what it was meant to do, I couldn't work out from reading the code how it did it!

      So, I threw away the entire body of the procedure, retaining only the inputs and outputs, and set about rewriting it to perform the task it was documented as performing. The result was the reduction of a 783-line chunk of code (those who know binary will understand why the figure sticks in my brain) to around 20 lines.

      1. It fixed the problem.
      2. It functioned correctly. I.e., it passed all existing tests plus a bunch I added.
      3. It was rejected.

        I could not identify the specific line or lines of code that cured the APAR.

      It was the only contract I arranged to leave early -- I bought my way out of it.

      Two conclusions:

      1. Set-in-stone rules and procedures are the single biggest barrier to progress.

        p5p has lots of them.

      2. Any piece of code greater than 20 lines can be improved; especially if freed of the constraints of history.

        Some of Perl's functions are 1000s of lines long.

      You may be right -- you usually are -- but, given sufficient will, I'd be prepared to expend some time trying to prove you wrong.

      C'est la vie.



        It was called Perl 6 and then it went insane.

        A complete rewrite of the guts, with the cleanup and change of a few warts and with a few additions, would make sense and would rightfully be called Perl 6. A bit akin to what Topaz planned to be, but without the decision to use C++. It might even make sense, if the project started a little later, to target the Java or .NET runtime (think Mono).

        It might even have been released about ten years ago, while "the inner circle" was busy writing the synopses of a new megalanguage with a cool codename, intended to become the Perl successor sometime in the future or a little later.

        Jenda
        Enoch was right!
        Enjoy the last years of Rome.

      Anybody wanting to advance perl 5 seriously should consider starting from scratch!

      Though I too would love to see that, I doubt it is realistic. Remember, both Topaz and Ponie were attempted and abandoned. The best we can hope for is probably a slow, relentless refactoring.

      Of course, the "old, huge, tangled code base problem" is not specific to the Perl 5 internals. It is a chronic problem in the software industry, even afflicting the blessed Perl Monks code base. :)

      The problem isn't an infrastructure issue, however -- and speaking as one of the handful of people who've had a hand in developing the site's software: It's our own gosh-darn fault. Perlmonks is WAY more complex than when it originally launched. It does a crapload of perl evals and sql queries per page. It's vulnerable to resource hogs. Searching can cripple the database. And right now, I don't think we're gonna fix these problems any time soon. ... It's not a matter of computer resources, as much as human engineering resources.

      -- Re: perlmonks too slow by nate (original co-author of the Everything Engine)

      Some relevant quotes from Nobody Expects the Agile Imposition (Part VI): Architecture follow:

      Have you ever played a game called Jenga? The idea behind Jenga is that you start by making a tower of blocks. Each player removes a block from somewhere in the tower, and moves it to the top of the tower. The top of the tower looks tidy, but it's very heavy and the bottom of the tower is growing more and more unstable. Eventually, someone's going to take away a block from the bottom and it'll all fall down.

      I came into Perl development quite late, and I saw a very intricate, delicate interplay of ideas inside the Perl sources. It amazed me how people could create a structure so complex and so clever, but which worked so well. It was only much later that I realised that what I was seeing was not a delicate and intricate structure but the bottom end of a tower of Jenga. For example, fields in structures that ostensibly meant one thing were reused for completely unrelated purposes, the equivalent of taking blocks from the bottom and putting them on the top.

      -- The Tower of Perl by Simon Cozens

      The perl5 internals are a complete mess. It's like Jenga - to get the perl5 tower taller and do something new you select a block somewhere in the middle, with trepidation pull it out slowly, and then carefully balance it somewhere new, hoping the whole edifice won't collapse as a result.

      -- Nicholas Clark

      I remember one company that has about 120 engineers, developers of all kinds of whom 10 are still able to work on the core functionality. The other 110 are working on new stuff. We brought all the engineers into the room. We said, okay, the product manager for the first area and the lead engineer for the first area come on up here. Now select the people you need to do this work over the next month, including, of course, the core engineers. And they did and we said, okay, now leave, get out of here and start working. ... when we got to the fifth product manager and the lead engineer and they said we can't do anything. There's no core engineers left. We looked around the room and there were 60 engineers left. They were thoroughly constrained by the core piece of functionality.

      If you have enough money, you rebuild your core. If you don't have enough money and the competition is breathing down your neck you shift into another market or you sell your company. Venture capitalists are into this now, buying dead companies. Design-dead software.

      -- Ken Schwaber, Google tech talk on Scrum, Sep 5, 2006 (38:40)

      Netscape 6.0 is finally going into its first public beta. There never was a version 5.0. The last major release, version 4.0, was released almost three years ago. Three years is an awfully long time in the Internet world. During this time, Netscape sat by, helplessly, as their market share plummeted. It's a bit smarmy of me to criticize them for waiting so long between releases. They didn't do it on purpose, now, did they? Well, yes. They did. They did it by making the single worst strategic mistake that any software company can make: They decided to rewrite the code from scratch.

      It's important to remember that when you start from scratch there is absolutely no reason to believe that you are going to do a better job than you did the first time. First of all, you probably don't even have the same programming team that worked on version one, so you don't actually have "more experience". You're just going to make most of the old mistakes again, and introduce some new problems that weren't in the original version.

      -- Joel Spolsky on not Rewriting

      Now the two teams are in a race. The tiger team must build a new system that does everything that the old system does. Not only that, they have to keep up with the changes that are continuously being made to the old system. Management will not replace the old system until the new system can do everything that the old system does. This race can go on for a very long time. I've seen it take 10 years. And by the time it's done, the original members of the tiger team are long gone, and the current members are demanding that the new system be redesigned because it's such a mess.

      -- Robert C Martin in Clean Code (p.5)

      That's not to say it can't be done, though. The great Netscape rewrite (ridiculed by Spolsky above) -- though a commercial disaster -- metamorphosed into an open source success story. Another example of a successful rewrite is the Perl 5 rewrite of Perl 4.

        The best we can hope for is probably a slow, relentless refactoring.

        Relentless I agree with; but I question "slow".

        It is my (long) considered opinion that if you reduce the targets to Windows and *nix (I'd say POSIX, but that lives at least a decade behind Linux), then with just 10 people (they'd have to be the right people), you could refactor perl5 in one year to be:

        • Faster.
        • Simpler.
        • More maintainable.

        The secret: Fcuk the rest; get it working on these two first.

        Rationale: If it can be made to work on those two wildly disparate platforms -- i.e. pass the entire perl build test suite and perlbench, whilst having reduced the kloc by 50% and the average function size by 50% -- then it can be made to work anywhere there is sufficient will and enough bodies to tackle the task. (If there ain't, c'est la vie!)

        Detail: Whilst neither kloc nor function size is directly proportional to understanding, correctness, and maintainability, the correlation is so strong, across many studies over many decades, that it would be obtuse not to recognise that code size is inversely related to understanding, and that understanding is directly related to both correctness and maintainability.

        Target: Start small in terms of platforms (just two); start small in terms of functionality (it just does what p5 does now); start small in terms of reduction in size. I've suggested 50%, but I believe 70% is (quite easily) achievable.

        If you reduce the current code by 50%, twice as many people stand a chance of understanding it.

        Then you emulate the pugs model: everyone gets a commit bit; (a majority of) 3 people have to agree in order to rescind a commit.

        The guiding logic

        Nothing substantial -- syntactic or semantic -- changes until a 50% reduction (in compiled kloc) occurs. Then you invite both requests-for-change and patches.
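
        One rough way to keep score against those targets (a sketch only: the column-0 brace heuristic and the non-blank line count are assumptions, not an exact measure of "compiled kloc") would be a crude counter over the C sources:

            #!/usr/bin/perl
            # Rough sketch: tally non-blank lines and approximate function sizes
            # over a tree of C sources.  Heuristics only: a '{' alone in column 0
            # opens a function body, a '}' in column 0 closes it, and non-blank
            # lines stand in for "compiled kloc".
            use strict;
            use warnings;
            use File::Find;

            my $root = shift || '.';                      # directory to scan
            my ( $total, $funcs, $flines ) = ( 0, 0, 0 );

            find( sub {
                return unless /\.[ch]\z/;                 # C sources and headers
                open my $fh, '<', $_ or return;
                my ( $in_func, $len ) = ( 0, 0 );
                while ( my $line = <$fh> ) {
                    $total++ unless $line =~ /^\s*$/;     # count non-blank lines
                    if ( $line =~ /^\{\s*$/ ) {           # function body opens
                        ( $in_func, $len ) = ( 1, 0 );
                    }
                    elsif ( $in_func && $line =~ /^\}/ ) {    # function body closes
                        $funcs++;
                        $flines += $len;
                        $in_func = 0;
                    }
                    elsif ($in_func) {
                        $len++;
                    }
                }
            }, $root );

            printf "non-blank lines:    %d (%.1f kloc)\n", $total, $total / 1000;
            printf "functions (approx): %d\n", $funcs;
            printf "avg function size:  %.1f lines\n", $funcs ? $flines / $funcs : 0;

        Run over the tree before and after each round of refactoring, the same two numbers show directly whether the 50% goals are being approached.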


        Of course, the "old, huge, tangled code base problem" is not specific to the Perl 5 internals. It is a chronic problem in the software industry

        Yes, but perl 5 is an extreme case.

        In any case, IIRC Topaz was not a fiasco but a proof of concept that evolved into the Perl 6/Parrot project. Ponie's intention was to bridge XS and Parrot, something doomed from the start because XS is just a way to access the perl 5 internals directly. A high-level API hiding the implementation details is completely missing.

        Corollary: anybody wanting to advance perl 5 seriously should forget about XS compatibility. It is too heavy a stone to carry.

Re^3: The future of Perl?
by Anonymous Monk on Nov 07, 2014 at 19:27 UTC
    What OS support would you get rid of?

      NetWare, OS/2, Plan 9, QNX?, Symbian, VMS, VOS?


        Please don't remove VMS; I am so happy that I can use Perl on VMS platforms. Last week, following a change in architecture, I rewrote a 1400+ line DCL program (DCL being the equivalent of shell scripting under VMS) into a 25-line Perl program. It took me a few hours (I know, a few hours for a 25-line program might seem inefficient, but I had to face some complicated system problems having nothing to do with Perl, and I don't think anyone could have rewritten the DCL in less than several days, perhaps more).

        Although I am working mostly under Unix, I have to spend about 25 to 30% of my time on VMS platforms. And, for me, Perl is even more important under VMS than it is under Unix, because VMS, while still being pretty efficient in some respects, does not have sed, awk, cut, find, grep, redirections, pipes (well, VMS does have a form of pipe, but it is much less practical), and so on. And Perl basically gives me a substitute for all or most of these Unix utilities.
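
        For example, the everyday grep/cut/sed/find jobs map onto perl one-liners along these lines (file names are made up for illustration, and the quoting shown is Unix-shell style; under DCL the quoting rules differ):

            # grep 'ERROR' server.log        -- print matching lines
            perl -ne 'print if /ERROR/' server.log

            # cut -d: -f1 users.txt          -- print the first ':'-separated field
            perl -F: -lane 'print $F[0]' users.txt

            # sed 's/foo/bar/g' input.txt    -- global substitution
            perl -pe 's/foo/bar/g' input.txt

            # find . -name '*.log'           -- walk a directory tree
            perl -MFile::Find -e 'find(sub { print "$File::Find::name\n" if /\.log$/ }, ".")'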

        I think QNX would be a poor target for removal up front. It's a mostly POSIX-compatible system that runs very well on modern embedded hardware. ( http://www.qnx.com/products/neutrino-rtos/neutrino-rtos.html#POSIX ) If your desire is to make Perl more relevant, abandoning a system with decent, although small, market share on ARM Core and Intel Atom systems is probably not a good way to do that. Should the GNU-userland Linux systems, the BSD systems, OSX, Windows, and maybe Android be higher priority? Sure. But getting to all of those portably gets you a pretty long way toward QNX. If support for it fell off, it might not be a catastrophe. Writing it off before any attempt to gauge the work necessary to keep it seems premature.

        VMS may not be cutting edge, but it's still used in some important places.

        I don't imagine it would be hard, with a Windows version and a GNU userland installed on top of their OS, to keep the eComStation die-hards interested in maintaining OS/2 and eComStation portability. It's not a very big community, but it's still fairly active.

        I think if you wanted a better list of systems to abandon for being irrelevant, you could start with MS-DOS/PC-DOS, AmigaOS, Haiku, Mac OS Classic, and BS2000. I almost included RISC OS, but their community seems pretty headstrong and there's still work going into the OS.