PerlMonks  

Re: Make it good

by BrowserUk (Patriarch)
on Oct 18, 2004 at 16:42 UTC [id://400215]


in reply to Make it good

I think there is a counterpoint to this argument. If you look at the history of software projects that went bad, the most consistent reason for failure is that they tried too hard to make things perfect. To cover all the bases. To be all things to all people. To encapsulate everything. To control everything. To handle every contingency, edge case and possibility.

As a friend of mine once put it, regarding a project that cost close to a billion dollars and ultimately went nowhere: "Designed to death!"

The art of software (design/code/maintenance) is as much about what to leave out, what not to handle, what not to encapsulate, and what not to fix, refine or refactor, as it is about doing those things.

I agree with your first point. Make it work. But I do not think that you gave it enough emphasis (despite putting it first) nor enough importance. It is better to produce something that does something--even if it does it wrong--than to produce nothing at all.

More is learnt from making a mistake and correcting it than from trying to avoid the mistake in the first place. Intellectual 'what-ifs' and 'maybes' are interesting games to play, but ultimately less productive than "It does" or "It doesn't".

Like the proverbial picture, one line of code is worth 1000 words. Even if the result is that the line of code is thrown away, the learning (experience) from having written and tried it is invaluable.

The earlier mistakes, mis-assumptions and bad design are found, the earlier they can be rectified and the less their effect upon the project end date.

And the quickest and most reliable way to find mistakes is to make them.

Unnecessary security is more than just unnecessary--it is a drain on resources, both when coding the program and when using it.

Designing in reusability, before there is an application for reuse, besides being a waste of effort if the code is never reused, frequently leads to design and coding decisions that only serve the purpose of the assumed reusability.

These decisions are often non-optimal for both the original application, and when a real application for reuse comes along, non-optimal for that too.

Better to code the solution to the problem at hand, and redesign/refactor for reuse only once the nature of the reuse is known and can be factored into the design.
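A toy Perl sketch of that progression (the subroutine names are mine, invented for illustration, not anything from the thread): solve only today's problem, then generalise once a second, real caller tells you what the reuse actually looks like.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# First pass: solve only the problem at hand. Today's caller always
# wants fields joined with a comma, so that is all we write.
sub make_report {
    my @fields = @_;
    return join(', ', @fields);
}

# Later, a second caller turns up needing tab-separated output.
# Only NOW do we generalise, because the shape of the reuse is known:
# it is the separator, not (say) the field order, that varies.
sub make_report_sep {
    my ($sep, @fields) = @_;
    return join($sep, @fields);
}

print make_report('foo', 'bar'), "\n";           # foo, bar
print make_report_sep("\t", 'foo', 'bar'), "\n";
```

Had we guessed at the axis of reuse up front, we might just as easily have parameterised the wrong thing and been stuck supporting it.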

To do otherwise places constraints and costs that rarely (IME) produce payback.


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon

Re^2: Make it good
by radiantmatrix (Parson) on Oct 18, 2004 at 17:02 UTC

    First, let me say that I don't necessarily disagree with you. But, I'm not talking about "design first, code later" mentality. Yes, things can be designed to death, but with the above goals in mind both while you design and while you code (which are really the same step, but that's another Meditation), you can find the balance.

    The art of software (design/code/maintenance) is as much about what to leave out, what not to handle, what not to encapsulate, and what not to fix, refine or refactor, as it is about doing those things.

    This, to me, falls in the category of Make it clean. Part of being clean means not doing the unnecessary.

    Unnecessary security is more than just unnecessary--it is a drain on resources, both when coding the program and when using it.

    There is really no such thing as "unnecessary security". It's possible that someone might do something unnecessary in the name of security. But, such things are actually not good security -- the illusion of security reduces security, as does needless complication from unnecessary measures. It's important to remember that the principles above work together -- you can't put all your eggs in making it secure if doing so would cause it not to be clean (or worse, not to work).

    Designing in reusability, before there is an application for reuse, besides being a waste of effort if the code is never reused, frequently leads to design and coding decisions that only serve the purpose of the assumed reusability.

    There is a big difference between designing-in reusability and coding with reusability in mind. The former, as you've said, is a bad idea unless reusability is one of your goals. However, it is a good idea to avoid practices that needlessly break reusability. Again, by using the principles of "make it work" and "make it clean", it's apparent that there are cases where reusability is less of a concern. That's why I phrased that section the way I did.

    Better to code the solution to the problem at hand, and redesign/refactor for reuse

    And this is a task made easier by keeping in mind potential reusability during coding. "Oh, I shouldn't make that a constant, what if I need to reuse this?" Of course, sometimes making it work precludes this -- like when performance trumps most other concerns.

    radiantmatrix
    require General::Disclaimer;
    "Users are evil. All users are evil. Do not trust them. Perl specifically offers the -T switch because it knows users are evil." - japhy

      I followed dragonchild's earlier link to Extreme Perl--Preface and came across the quoted text below. It (with the addition of a little context) does so much better a job of saying what I was trying to say above that I just wanted to point it out.

      Start out by

      ...do[ing] the simplest thing that could possibly work...
      and then--if required--make it (slightly) more complex in order to make it work. Iterate.

      I sort of arrived at this basic philosophy for coding (which I attempted to describe in My number 1 tip for developers) over many years, almost by accident. It is basically the result of an adverse reaction to having been forced to attempt to follow one new "programming paradigm" or "structured programming methodology" after another as each new fad came (and for the most part went).

      From the classical Waterfall Model, through SSADM/LBMS, OOA/OOD, DataFlow Modelling and Use Cases to the current trends for Data Driven and Test Driven development, AOP and XP. Each of these brought something slightly new and valuable to the subject.

      The problem is that all too often, that one new thing is seen (by some, temporarily) as the be-all and end-all of chic, and they try to design a whole "methodology" around it. The problem for the programmer is persuading management that whilst that one new pin is a useful technique, it can be incorporated into the existing working practices without throwing everything else away.



        Very well phrased, if you don't mind me saying so. I think we're in danger of violently agreeing. :)

        Yes, new "methodologies" and "shifts in paradigm" arise, shine brightly, and get relegated to the growing pile of Things That Work Just Like Everything Else. What I'm hoping to point out through the parent node is that all of these methods are just approaches to making good software, but they all focus on how rather than what.

        I do believe strongly in the iterative style of thought. Programming, like writing, is an iterative process. You brain-dump, then you refine, which might lead to refactors in some cases, which require further refinement, and so on. Perl has taught me that it isn't how you do something that matters as much as what you accomplish. If one has two approaches to solving a problem and one is faster, cleaner, more secure, etc. -- and it still works -- then that is the better solution. TMTOWTDI, but not every way is always the best way.

        The whole point of TMTOWTDI, to me, is that each approach has its strengths and weaknesses, and which is "best" to use largely depends on what you're using it for. The "Make it Good" concept is that it doesn't matter what method you choose, so long as you are reaching toward the best balance of Function, Elegance, Security, Ease-of-use (for the target audience), Reusability, and Readability.

        Now, I realize that good coding practice can't be relegated to those few short bullet points, but I strongly believe that to use these -- or something like them -- as a framework makes a lot more sense than standardizing on the "chic of the week" methodology.

Re^2: Make it good
by ysth (Canon) on Oct 18, 2004 at 18:22 UTC
    I agree with your first point. Make it work. But I do not think that you gave it enough emphasis (despite putting it first) nor enough importance. It is better to produce something that does something--even if it does it wrong--than to produce nothing at all.
    So was Matt's Script Archive a good thing in the long run?

      I said produce something. Not produce it and then promote it to the general public.

      I did start to try to objectively consider whether anything had come out of Matt's scripts that might be considered good. But then I realised that it would be pointless. Your question has nothing to do with what I said in the post to which you replied. Neither in isolation, nor in the context of the thread of which it is a part.

      At this point, I am going to invoke Godwin's Law, as in my opinion, any reference to Matt's scripts in the context of Perl programming is exactly equivalent.


        Your question has nothing to do with what I said in the post to which you replied.
        Your post caused me to ask myself the question :)

        I wasn't trying to subtly argue against you; I think you could make a case either for or against. And the answer to my question applies to your argument taken to the extreme. So if you can decide that the script archive did overall lead to more good than bad, you have a very strong argument in favor of "make it work" over all else.

      Yes, I think it was. I learnt Perl by fixing bugs in his code. I expect many others did too.
