PerlMonks  

Re: (OT) Programming as a craft

by BrowserUk (Patriarch)
on Dec 16, 2003 at 04:18 UTC [id://314964]


in reply to (OT) Programming as a craft

I drew my conclusions on this subject when it came up once before, and nothing has changed in the meantime, except that I ended that piece with:

Hopefully, that will remain the case at least until after I retire.

A change of mind, or maybe heart

On this I have changed my mind. Over the intervening year, I have had the time and motivation to look at where I think we are in the evolution of both the business and the profession of software development, and I have reached the conclusion that not only will its current 'craft' status change -- it must change.

Despite my natural reluctance to see the end of what one might emotionally call 'a way of life' -- the one I have pursued with interest and fun for 20+ years -- I now see its demise as not just inevitable, but desirable. I previously thought that I might get away with being a Luddite, basking in the glory of the 'lone craftsman' persona until I retire. But I have since had the time and motivation to look at specific aspects of what we do every day as programmers, and to look at them in detail -- something that, as a working programmer, I never had the time to do before. Not just 'performing the functions of a programmer', but looking closely at how the methods we currently use evolved, and where they might lead if pursued (with some hopefully intelligent guesstimation and projection) a ways into the future.

A tentative conclusion

I've reached a (tentative) conclusion that we, both collectively and individually, are on the cusp of a major leap in the way software development is performed, and in its status as a result. The change I foresee might be called, for want of a better term, automation.

Doom?

Just as with so many other mass-market products that start out being manufactured by small groups -- usually driven by highly motivated, and not necessarily "professional", individuals -- the transition to mass manufacture by a small number of very large corporations, in order to achieve the economies of scale required to fund the huge R&D costs of 'new models', is inevitable. That will probably not be a popular conclusion. The chances are that those with a little knowledge of history will envisage software moving into the 'production line' phase, with the inherent horrors of individuals performing endlessly repetitive, menial tasks in lock step, with all the skill, flair and therefore interest diluted or completely removed -- like an auto-manufacturer's assembly line. I too used to think that this was the next step in the evolution.

History

After all, this was the case in so many other industries. And not just the obvious examples either -- the auto industry, motorcycles, TVs and other electronic consumables in general, even the PC. With a little sideways glancing at many other industries, you can follow the same progression. Tourism and air travel, with package holidays and mass-market cruise ships following each other on the same schedules, just far enough apart that the wake from the preceding ship doesn't 'shake the contents'. A little harder to see are things like music, film, fashion and even art. However, I think that there is a major difference that means all is not lost. The older of the above industries have moved beyond the assembly-line stage, with thousands of people arranged on either side of a moving conveyor performing repetitive tasks like automatons. Those people have been replaced by automatons -- robots. The driving force behind that replacement was costs, mostly labour costs. The enabling technology that allowed it was computers. These gave the machines that actually performed the task of assembly the intelligence and dexterity to perform those tasks without the guidance and supervision of a human brain. This is the same technology that will (IMO) allow the software development industry to bypass the assembly-line-powered-by-humans stage and move directly to the automated assembly line.

Essentially, the craft-to-assembly-line transition involves breaking the task of overall assembly down into infinitesimally small, discrete steps, then assigning each step to an individual (human) to master and repeat endlessly. The enabling technology for this in the mechanical engineering field was using the basic hand tools of that craft -- scribes, rules, micrometers, hammers, saws and files -- to create better machines: lathes, routers, presses etc. Once this was mastered, it became possible to create specialist versions of these machines that would produce sub-assemblies, which the individual workers then put together to form the final product. This is roughly where we are today in the software industry. We have the second level of tools at our disposal -- editors, compilers, interpreters, version control, CD burners etc. With these, we can produce the software sub-assemblies: classes, libraries, objects etc. The industry is now trying to get to grips with the mechanisms of utilising these sub-assemblies in production-line environments. The moves towards offshore software development are the current signs of this, although there were plenty of corporate MIS depts. that attempted this kind of development throughout the 80's and 90's, assuming that throwing large numbers of programmers at tasks, under strictly controlled development-team regimes, would result in quicker development times, reduced costs and reliable time scales. Personally, I think that both of these strategies will eventually be discredited and discarded as failures -- and not before time. The next step is robots.

A ray of hope

I can sense (or pre-sense) the hostility that many who read that statement will feel. They, like me, do not want to give up their job, their skills, their art, their passion to a robot. Fear not: you won't have to. The reason is that, as yet, we have for the most part not reached the point where our jobs can be automated, not even by computers. More importantly, we have not yet reached the point where the number of people employed in the industry, and their total salaries, have become such a high proportion of the total available income from the potential market for our products that they govern the profitability of the industry as a whole. With cars, the markets were pretty much saturated, in terms of numbers, somewhere in the late 80's or early 90's. As such, the only ways to get higher profits were to

  1. differentiate the product to induce brand switching.
  2. reduce unit costs -- i.e. reduce staffing levels.

The software industry is far from reaching this point. In fact, the limiting factor in the applications of our products is our ability to produce them. There are a million new uses that software could be put to, and the new markets and uses grow every day, as the hardware gets cheaper and more powerful.

The limitation that stops the field growing is the productivity of the industry. It simply takes too long, using current techniques, to bring new products, or even better versions of existing ones, to fruition. During the 80s, and again in the dot-com boom of the late nineties, as the market place for our endeavours grew, the attempt was made to ramp up production by increasing productivity through assembly-line programming and by recruiting large numbers of low-skilled personnel into the industry. Both failed.

History's lesson

The lesson that can be learned from the history of other industries is that the only long-term, effective way of increasing the productivity of an industry is not to dumb down the job and throw bodies at the problem, but to increase the skills of the existing workforce and have them develop tools that perform the mundane and repetitive parts of the process. A small number of highly skilled personnel use their skills and innovation to automate those parts of their jobs that they hate -- universally, the boring and repetitive parts. The number of people employed remains static, but the education (and salary:) levels increase as they use automation to multiply their productivity. I recently typed the following in another post here:

A secret

The secret of productivity is to have tools that do more of the work, more quickly, more accurately and with less supervision. Computers, as tools, are being applied in every area of human endeavour to this end with amazing results. With one, notable and lamentable exception -- software development.

I repeat it, because I don't think I can say it any better. I think that this is not only the indicator of what we in software need to do; I also think that the time is nigh when the technology, the motivation and even the financial climate are such that we will be able to do what is necessary. And that is to produce better tools. That doesn't mean better editors, or faster sort algorithms, or easier-to-use versioning systems, or more testing, or bigger, more comprehensive code libraries. It involves the integration of all of these and more. We need tools that automate steps, and facilitate that automation. We don't just need a bigger parts catalog. We need tools that know about the parts catalog and help us to select the right parts from it. We need to be able not just to select individual parts and glue them together, but to specify the purpose of our applications at a high level and have our tools select the appropriate parts for us and integrate them together. We need to be able to interchange parts on the basis of performance, as measured by any of several criteria, and pick the one that supports the particular performance characteristics required by our application.

We specify a sort. If the application requires a faster one, we swap in a faster one, or a stable one, or one capable of handling large volumes of data, or one that is specialised in dealing with the particular attributes of our data -- transparently to the rest of the application. Perhaps the application can select the sort it uses on the basis of real time intelligence of the data it has to sort. The tools, and even the applications themselves have to become more intelligent in selecting the building blocks they use to perform the tasks at hand, given real-time intel on the nature of the data upon which they must perform it.
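A purely illustrative sketch of that idea in Perl -- the name smart_sort, the probes and the thresholds are all my own invention, not an existing API -- might have a dispatcher inspect the data at run time and swap in a specialised algorithm when the data allows it:

```perl
use strict;
use warnings;

# Hypothetical 'smart' numeric sort: probe the data, then pick a
# strategy transparently to the caller. Thresholds are illustrative.
sub smart_sort {
    my @data = @_;

    # Cheap linear probe: already ordered? Then there is nothing to do.
    return @data unless grep { $data[$_] < $data[ $_ - 1 ] } 1 .. $#data;

    # Small non-negative integers: swap in an O(n) counting sort.
    unless ( grep { $_ != int($_) || $_ < 0 || $_ > 1000 } @data ) {
        my @count;
        $count[$_]++ for @data;
        return map { ($_) x ( $count[$_] || 0 ) } 0 .. 1000;
    }

    # Otherwise fall back on the general-purpose comparison sort.
    return sort { $a <=> $b } @data;
}

my @out = smart_sort( 3, 1, 4, 1, 5, 9, 2, 6 );
print "@out\n";    # 1 1 2 3 4 5 6 9
```

The caller just sees "a sort"; which algorithm actually ran is an internal decision based on the data itself, which is the kind of transparency argued for above.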

Ultimately, this probably requires more intelligence in the software that runs the tools and the applications -- that is, the OSs themselves. We need to move beyond the current crop of models of the world and the data in it, which evolved in the 1970s and 80s to model the world in terms of the architectures and hardware, and their limitations, available at that point in time. Higher-level languages are one part of this. The integration of those languages with data stores that don't reduce the world to a series of byte streams is another.

Exciting times (I hope)

In many ways, I see this as the most exciting time in software development. Watching the tools and systems that will integrate the innovations of the human mind with the power of the hardware now available to us, will be a privilege to witness. If I can find a way to take part in the process, so much the better.


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
Hooray!

Re: Re: (OT) Programming as a craft
by chromatic (Archbishop) on Dec 16, 2003 at 05:12 UTC

    That argument ignores a fundamental difference between software and manufactured goods:

    Duplicating software is easy.

    A compiler and a copy command are your assembly line; it's already fantastically cheap.

      It's a fair point, but I wouldn't say that I ignored it, just that the terms of the analogy require reinterpretation to 'fit' them to the reality.

      A compiler and the copy command are very cheap, provided everyone who wishes to use your product has exactly the same compiler, and cp er.. copy command, but they don't, nor should they. Once you move out of your 'production plant', the effort required to copy and compile the product starts to rise. Perl itself runs more places than almost anything else you care to name, but at what cost?

      A Config that contains a little under 1000 variables.
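      That claim is easy to check against whatever perl you have installed. The exact count varies by version and platform (on recent perls it has crept somewhat past 1000):

```
$ perl -MConfig -le 'print scalar keys %Config'
$ perl -MConfig -le 'print $Config{archname}'
```

      The first command counts the build-time configuration variables in the %Config hash; the second peeks at one of them.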

      A build process that, to be frank, makes the assembly instructions for your average automatic gearbox, self-assembly PC, or even a full-blown kit car I once assembled, look relatively simple by comparison. So complicated, in fact, that it is necessary to distribute and build two copies of perl in each distribution. The first is a simplified version with just enough functionality to ease the problems of configurability, so that it can be used to glue together the many other tools, configurations and utilities that are required to build the full product.

      The alternative approach is the packaged software route as exemplified by MS. With this, each application has to be pre-built to cater for all possible eventualities, and will only run on a very limited subset of target environments. Each application becomes enormous with the weight of its runtime configurability, despite the fact that late binding is available and that 'they' control the content and functionality of the environments that they target.

      If 'production', in software terms, meant the copying of the code (compiled or source) onto a CD (or server) and distributing it, then that indeed is cheap. However, it doesn't. Most manufactured goods leave the production facilities as finished products, ready for immediate use. Software -- even the best packaged consumer software, which currently probably means games -- is rarely "ready for use" as it leaves the production facilities. Even most games require a certain amount of expertise and knowledge on the part of the purchaser in order to install them and set them up for use, except for those that run on proprietary, single-function hardware, where the variables can be much more tightly controlled than with general-purpose hardware such as PCs.

      Currently, software manufacturers leave the final assembly, tuning and shakedown of their products to the purchaser, and the costs of the training, expertise and man hours spent performing that final assembly -- and correcting it when it goes wrong -- are rarely considered alongside the purchase price, except in highly dubious 'true cost of ownership' surveys.

      So, whilst the analogy leaves much to be desired, if you extend your thinking to encompass the complete process from initial concept to ready-to-use, the differences become less clear cut.



        We're agreed that some software needs customization before it's usable by customers. (I've never noticed the apparent similarity in root words before. I'll have to look into that.)

        My difficulty with the assembly line image is that, with a physical product, the assembly line uses interchangeable workers and strict processes to make identical copies of a physical product cheaply and efficiently.

        Aside from an emotional reaction against the idea of treating programmers as interchangeable pieces, I cannot see the "make identical copies" part of the process. That, to me, is the reason an assembly line is possible! It exists precisely for mass duplication!

        Granted, there exist assembly lines dedicated to customization -- the worker who puts extra memory in laptops, for example -- but even then, the scope and breadth of the customizations are much, much smaller than in a software project.

        They also can't be divorced from the physical aspect. Certainly Apple could ship all PowerBooks with a gigabyte of memory, but they can't change one master PowerBook and duplicate it for every customer without touching every machine.

Re: Re: (OT) Programming as a craft
by woolfy (Chaplain) on Dec 16, 2003 at 13:06 UTC
    Indeed, the automation of programming is still quite far away in time, or so it seems. I have witnessed two such attempts first-hand.

    I once worked for a company that had two departments (this company ceased to exist over 10 years ago). The department I worked in, produced CBT (Computer-Based Training, or courseware, or Computer-Aided Instruction, etc.). The other department worked on designing AI (Artificial Intelligence): they made systems designed to create AI-applications, and they tried to produce AI-end-user-apps.

    The European Community subsidised this AI department with enormous amounts of money. Some spin-offs of their activities became mildly successful, like an application to support decision making. They used LISP to build their AI system, and arguably they did that in such a way that the applications could never be fast or maintainable; that decision-support application was horrendously slow. So they turned to the CBT department, working with TenCORE (nowadays a dying language). The application was completely rewritten in TenCORE (liz did a lot of that work), after which the speed was acceptable. The app could be used to decide on, e.g., the holiday destination for a family, using as many factors as that family wanted. As said, the app was mildly successful, but it was never successful enough to make up for the many EU investments.

    This AI-department tried to automate programming. And in the end, they failed, and very soon after the EU stopped investing, the department vanished. Their tools were too simple for so complex a task, their goals too high, and they produced hardly any end-user apps.

    Back to TenCORE. That was the second attempt in itself. The creators of TenCORE made a system built on TenCORE itself: TenCORE Producer. Producer was just a higher-level programming language, simplified and made accessible in a GUI -- hardly any AI in it. At the time, there was also Authorware, built in C. Just as developers with Producer could extend their end-user applications with modules built in TenCORE, CBT developers using Authorware could use C to extend their apps. By the way, developers used TenCORE and Authorware to create websites, multimedia and games (my company used TenCORE to create and maintain hundreds of websites).

    CBT can be considered as automation of education. Producer and Authorware can be considered as automation of programming this CBT.

    And why didn't this type of programming conquer the world? Because they are just tools. And there are a lot of other factors that influence success or failure: factors like marketing, economic growth, hype-ability.

    Creating good software is a lot like writing good books. Or like good project management. Or like good management of a company. You can have as many good tools as you can find, afford and use, but still the tools only make the job easier; they don't do the job. That is done by the people who use the tools.

    Because of their complexity (both from a conceptual point of view and in the implementation), I think CBT and AI have not yet been merged in a way that can be used successfully world-wide. This complexity is a problem not just for the programmers, but certainly also for managers, marketing people and end-users.

    Attention was drawn away from the further development of AI in software development by the arrival of new hypes in the nineties: first multimedia, later the internet.

    Now, there is a lot of simple (!) multimedia on the internet. And a lot of simple CBT on the internet. Even a lot of CBT combined with multimedia. But they all lack AI. I even see that most of this "modern" multimedia and CBT is simpler than what was produced 10 years ago (well, except maybe the games).

    I am convinced that Perl-specific graphical editors (like Producer and Authorware) would help Perl (just as would be the case for other languages, like PHP, Java, Javascript, Python) become more widely accepted as a tool for software development (not just internet scripts). But I don't know of any such graphical editors, and I certainly haven't seen any with AI qualities.

    I think it might be a good step in the right direction if someone built such a graphical editor in Perl for Perl. It should be easy to use, and it should enable a developer to build applications (CBT, multimedia, database management, computer-system management, content management, etc.) without having to know much of the underlying language, Perl.

    The next step would be to add AI techniques to it, so a developer would just have to give the program a description of the app to create, a lot of parameters and a lot of contents.

    The first task of that program would of course be to build courseware to explain the use of the program and the underlying concepts. Because the developers need to understand the thing they are working on/with.

      I think it might be a good step in the right direction if someone built such a graphical editor in Perl for Perl . . .

      Perl is highly non-trivial to parse, which is why any Perl IDE will have a lot of problems, even with jobs as seemingly simple as syntax highlighting. So I think this is another thing that will have to wait for Perl 6, where parsing should be easier.

      What I don't understand is why a Perl editor should be written in Perl itself. You're ultimately dealing with a stream of bytes, which is as language-agnostic a concept as you can find.

      ----
      I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
      -- Schemer

      : () { :|:& };:

      Note: All code is untested, unless otherwise stated

        What I don't understand is why a Perl editor should be written in Perl itself. You're ultimately dealing with a stream of bytes, which is as language-agnostic a concept as you can find.

        But woolfy was specifically describing a Perl-specific code editor, not an arbitrary-stream-of-bytes editor.

        Beyond just syntax highlighting, such a tool might be expected to support the features of some advanced Java and Smalltalk code editors, like:

        • symbol name lookup (tab-completion for variable names, when typing a function name show an example of the arguments it takes, etc);
        • refactoring editor (for example, select a few lines of code and invoke the "move to new subroutine" refactoring, and automatically have those lines replaced by a call to a new subroutine with proper argument passing and return value assignment);
        • run-time/debugger integration (interpreting code as you type, setting break points, editing code in a running program and then continuing);
        • documentation support (WYSIWYG POD editor, fill in some kinds of POD automatically);
        • testing support (automatic running of tests in the background, integrated display of profiling results and code or pod coverage analysis);
        • plugin support (to let others add support for working with Class::MethodMaker, Class::Contract, Aspect, or other specialized approaches to Perl coding).

        Even if it only implemented some subset of the above, I think such an editor would be providing real value, beyond the traditional solution of a good text editor with syntax coloring and automatic indenting.

        And I don't see how you could really do justice to such a tool without implementing it in Perl, or at least delegating key portions to an embedded Perl instance.

        Perl is highly non-trivial to parse, which is why any Perl IDE will have a lot of problems, even with jobs as seemingly simple as syntax highlighting.

        Nah, I am using Crimson Editor (a Win app), in which scripts can be executed with a shortcut (create your own macros and make one macro to call perl) and that includes syntax libraries for Perl and other languages. It works nicely. It does not parse Perl, it does not need to, but it works nicely together with perl.exe.

        What I don't understand is why a Perl editor should be written in Perl itself. You're ultimatly dealing with a stream of bytes, which is as a completly language-agnostic concept as you can find.

        Well, in my opinion it should be possible to create an editor with any good language. So why not an editor written in Perl? And why not a graphical editor? It is a proof of strength, of versatility. It is a goal in itself: write code with something that is written in Perl code.

        Furthermore, such an editor would be (in addition to the test suites) one of the first things to check when a new release of Perl would be published. If it stopped working, it might be a good sign something is wrong with the new release (yes yes, I know, of course, any editor contains bugs).

        Lastly, a good editor included in the Perl distributions would be a nice completion of the package. Install Perl, and start working in Perl and with Perl -- no extras needed. And while we're at it, a nice module searching/finding/installation/updating feature in the editor would be much welcomed, by at least me.

Re: Re: (OT) Programming as a craft
by BUU (Prior) on Dec 16, 2003 at 05:51 UTC
    The secret of productivity is to have tools that do more of the work, more quickly, more accurately and with less supervision. Computers, as tools, are being applied in every area of human endeavour to this end with amazing results. With one, notable and lamentable exception -- software development.


    Interesting quote, but I disagree completely. The proof of my disagreement? Perl itself (and dozens of other 'higher-level' languages, of course). What else is Perl but an attempt to produce a tool that does more of the work, more quickly and more accurately, with less supervision? It has high-level features built in, such as scalars and hashes, to do more of the work -- which directly translates to faster -- and it has a built-in garbage collector to handle things we as programmers don't need to supervise.
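    A trivial example of that point, just for illustration: counting word frequencies, where the hash quietly does all the bookkeeping (allocation, lookup structure, cleanup) that a lower-level language would make explicit:

```perl
use strict;
use warnings;

# The hash does the work a lower-level language would make us write
# by hand: no memory management, no hand-rolled lookup structure.
my %freq;
$freq{ lc $_ }++ for split /\W+/, 'The quick brown fox jumps over the lazy dog';
print "$_: $freq{$_}\n" for sort keys %freq;    # 'the' counts twice
```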

      I agree entirely that Perl is a step in the right direction. However, the language is only a part of the problem, and only a part of the solution.

      If you could write all your applications without recourse to an editor, version-control software, a browser or other software to access CPAN, a database, test tools, and all the other tools and utilities that the average perl programmer uses to write his applications, then I would agree with you. But you can't.

      Or at least, you can't do it that way and expect to produce a high-quality product that meets or exceeds the requirements, on time, within budget, at the target cost, and that will continue to perform its function with reasonable maintenance costs for a sufficient period of time to produce a positive ROI.



        If you could write all your applications without recourse to an editor, version-control software, a browser or other software to access CPAN, a database, test tools, and all the other tools and utilities that the average perl programmer uses to write his applications, then I would agree with you, but you can't.
        Oh, but I could. It would probably take longer and be harder to implement, but that's why they invented all these nifty automated utilities to speed up production of the product, isn't it?
