Your 2nd and 3rd points are very questionable.
Very often Perl is more "real-time" than C in practice.
Agree with me or not, but I have seen several discussions where it turned out that GCC is incapable of compiling some large autogenerated programs, because it uses O(N*N) algorithms in many places. That is why people resort to splitting such sources into chunks.
An ideal programmer will write faster programs in C, but real-world programmers often write:
for (i = 0; i < strlen(s); i++) {
    /* ... and never notice that the length is recomputed on every iteration */
}
It is unrealistic to expect advanced techniques in C everywhere; in Perl it is often harder to write bad code: you naturally save precalculated values in a hash, and so on.
As for the 3rd item: embedding Perl into other applications is a very common practice (well, cases where every byte counts are an exception, but those are rare, IMHO).
These aren't faults of the language, these are faults of the implementation. There is nothing impossible about writing a compiler from Perl to machine code, or constructing a chip that natively executes (!!) Perl.
Assembly (of any kind) and Perl are both Turing-complete, so anything written in one can, in theory, be written in the other.