"Get the Job Done" (GeJDo)

by dragonchild (Archbishop)
on Sep 19, 2001 at 16:38 UTC


in reply to Too Many Ways To Do It

The issue of "best" is very deceptive. Best how? I've said this in a number of nodes, but I'm going to say it here all laid out.

The best language when it comes to CPU and RAM usage is machine language. You cannot do better than machine language. At all. No way. The best compiler will only match the machine language implementation, and usually not even come close.

However, development in machine language is extremely slow. Like, slower than that. And maintenance is even harder. This is why we use higher-generation languages.

Now, every 3+G language has tradeoffs. Each language's syntax lends itself to certain applications. FORTRAN is excellent for engineering and math. COBOL is excellent for business apps. Haskell for functional programming, and so on. In addition, different compilers do different things well. You can compile the same C program with two compilers and one will give better RAM usage but worse CPU usage, while the other will be the reverse.

Perl, that 4G language we all know and love, has been optimized for development time. Now, what does that mean?

It means that no matter how I think about it, I can get a functional program that does what I want it to do in a reasonable amount of time.

  • Does it mean that I use RAM most efficiently? Nope.
  • Does it mean that I use CPU most efficiently? Nope.
  • Does it mean that I built it so that I could maintain it easily? Nope.
It means that I can write one-offs quickly and efficiently and "Get The Job Done". (GeJDo? Maybe GAJD for Get A Job Done? I dunno.) Remember that human time is the most expensive part of a development effort.
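
To make that concrete, here is the sort of throwaway one-off I have in mind (a made-up example, not anything from the original thread): tally hits per IP address from a web server access log. It is neither RAM- nor CPU-frugal, since the whole tally sits in one in-memory hash, but it takes a couple of minutes to write and it Gets The Job Done.

    #!/usr/bin/perl
    # One-off: count hits per client address in an access log whose first
    # whitespace-separated field is the address (assumed format).  The whole
    # tally lives in one hash in memory; not frugal, just done.
    use strict;
    use warnings;

    my %hits;
    while (<>) {
        my ($ip) = split ' ', $_, 2;    # grab the first field
        $hits{$ip}++ if defined $ip;
    }

    # busiest addresses first
    for my $ip (sort { $hits{$b} <=> $hits{$a} } keys %hits) {
        print "$hits{$ip}\t$ip\n";
    }

Run it as perl hits.pl access.log (the filenames are hypothetical). If the one-off ever turns into something you will keep and rerun, that is the moment to revisit it, per the "Good Enough" rule below.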

Does this mean that Perl cannot do all the things other languages do? No; it can, and "Well Enough". Perl is, as we all know, a RAM hog. It can also be a CPU hog. This is part of the tradeoff for having lightning-fast development time.

Now, what does this all mean?

It means that "Good Enough" should be your mantra. You can always optimize. You can always rewrite your thing in machine code, if you really wanted to. However, you have to determine when the gains made in runtime performance are not worth the value of the (re)development time needed to achieve those gains. At that point, you have hit diminishing returns, and should probably stop.

If your program works, it doesn't matter if the template is perfect or if your for-loops are optimized or whatever. Are you going to reuse the template? If not, then leave it alone. If you are and the rewrite would help you when you use it again, then go ahead and rewrite it. Otherwise, leave it alone.

------
We are the carpenters and bricklayers of the Information Age.

Don't go borrowing trouble. For programmers, this means: worry only about what you need to implement.

Replies are listed 'Best First'.
Re: "Get the Job Done" (GeJDo)
by thraxil (Prior) on Sep 19, 2001 at 20:19 UTC

    The best language when it comes to CPU and RAM usage is machine language. You cannot do better than machine language. At all. No way. The best compiler will only match the machine language implementation, and usually not even come close.

    there's actually a lot of debate about this point. unless you're a really, really good assembly programmer, a mature compiler working with well-written high-level code will most likely produce faster, more efficient assembly than you can. a good compiler benefits from the years of knowledge and experience of its authors, which means it knows every trick that they know. it knows the chip inside and out and can make far more sophisticated, counter-intuitive optimizations that exploit obscure side effects of infrequently used instructions. compilers have less of an advantage on RISC architectures than they do on more complex chips with hundreds of instructions, many more than most humans can hope to fully understand.

    of course, if you write bad high-level code, the compiler will happily produce bad assembly code.

    and assembly still can't be beat for code size. that and its direct access to the hardware are the reasons people still do use it.

    it's fun though, and it will teach you a great deal about computers that you just won't learn from coding in high-level languages. you haven't lived till you've spent over 24 hours straight debugging Z-80 assembly.

    anyway, i agree with your message though. TIMTOWTDI applies outside perl too. sometimes one of the ways to do it is a different language.

    anders pearson

      The last time I used assembly language was a one-line inline asm statement to issue the RDTSC instruction on x86.

      The last significant amount of asm I wrote was a function to convert UTF-16 to UTF-8. I was inspired by a comment in the reference implementation saying "I wish there was a better way to do this", next to a chain of if/else tests that work out how many leading 1 bits are in a byte. Well, the CPU has an instruction for that. Then it proved easier just to write the whole thing in asm than to interface with a helper function. After all, its job is to push bits around, something asm is good at.

      So, instructions that are not modeled in your high-level language are a good reason to drop down to asm (a rough sketch of the leading-ones count is below).

      —John
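
      For anyone who hasn't met the UTF-8 trick John describes, here is a rough sketch of my own (not his code) of what that if/else chain computes: the number of leading 1 bits in the first byte of a sequence, which gives the length of the sequence in bytes (a count of 1 marks a continuation byte). In asm this is an instruction or two, something like complementing the byte and using a count-leading-zeros instruction; in a high-level language you loop or chain comparisons.

          # Count the leading 1 bits in a byte (0..255).  For the first byte
          # of a UTF-8 sequence, this count is the sequence length in bytes
          # (0 means a one-byte ASCII character; 1 marks a continuation byte).
          sub leading_ones {
              my ($byte) = @_;
              my $count = 0;
              while ($byte & 0x80) {              # test the top bit...
                  $count++;
                  $byte = ($byte << 1) & 0xFF;    # ...then shift the next one up
              }
              return $count;
          }

          print leading_ones(0xE4), "\n";   # 0xE4 = 1110_0100, prints 3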
