PerlMonks  

Re^4: Random quotes in the top left corner

by apotheon (Deacon)
on May 01, 2005 at 03:50 UTC [id://452970]


in reply to Re^3: Random quotes in the top left corner
in thread Random quotes in the top left corner

I don't think I'd say that my "conclusion" was "wrong" so much as that it was incompletely stated. To round it out nicely, I should have made reference to the idea that first, one should create a throw-away.

I know that the idea of creating a throw-away version first is an oft-cited principle of software design in general, but it applies even more so to innovations: first you innovate, then you create something useful. Sure, tightly couple in the throes of creative frenzy if you must, but then recreate your innovation with more modularity. There's a big difference between screwing around with new ideas and writing good code.

So, yeah, I think that tight coupling has its place in experimenting with new ideas, but that doesn't mean you should be calling something a release version before you've refactored and restructured so that it's more modular.

But, again, I could well be speaking too vehemently of things I don't understand well enough. Considering our relative levels of experience with Perl code, I'd be inclined to say that, all else being equal, yours is more likely the "right answer" than mine.

print substr("Just another Perl hacker", 0, -2);
- apotheon
CopyWrite Chad Perrin

Replies are listed 'Best First'.
Re^5: Random quotes in the top left corner
by tilly (Archbishop) on May 01, 2005 at 06:56 UTC
    I still disagree. In real-world cases, tightly integrated designs have often been viable a decade or more before modularized designs came to market. This is not a question of building a throw-away (though your solution will be superseded once raw performance isn't as important as, say, customizability); it is a question of being able to create the right product at the right time.

    Let's look at an analogous situation comparing languages rather than development techniques. Suppose that Perl is 10x slower than C, but it is clearly preferable to accomplish a specific task in Perl. Then from Moore's law we can project that it will be feasible to do that task in C about 5 years before it is feasible to do it in Perl. The C solution will be doomed: eventually someone will do it in Perl, and the Perl solution will be more flexible and customizable and will win. However, the C solution will not be a throw-away; it will be able to meet a need before the Perl one could. Furthermore, with the 5-year head start on development, the C project could easily remain better than the Perl one for several years after CPUs become good enough for the Perl solution. So for perhaps a decade, despite the obvious reasons to prefer Perl, everyone will use a solution built in C.
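    (The 5-year figure is easy to check: a 10x gap closes after log2(10) ≈ 3.3 doublings. A quick back-of-the-envelope sketch, assuming the classic 18-month doubling period — the numbers here are illustrative, not from the post:)

```perl
use strict;
use warnings;

# Assumption: Moore's law doubles performance every 18 months (1.5 years).
my $doubling_period = 1.5;   # years per doubling
my $slowdown        = 10;    # Perl assumed 10x slower than C

# Doublings needed to close a 10x gap: log2(10) ~ 3.32
my $doublings = log($slowdown) / log(2);

# Years by which the C solution becomes feasible before the Perl one
my $lag = $doublings * $doubling_period;

printf "C is feasible about %.1f years before Perl\n", $lag;
# prints "C is feasible about 5.0 years before Perl"
```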

    The same thing happens with modularization. For instance, in the mini-computer market, the first successful operating systems (Apollo, VMS, etc.) were tightly integrated with the hardware. Eventually they all lost to Unix, which was significantly more modular in design. But they were not throw-away solutions, and for many years they were what everyone used.

      I still disagree. In real world cases tightly integrated designs have often been viable a decade or more before modularized designs came to market.

      You seem to be implying (and please correct me if I'm wrong ;-) that loosely coupled systems are necessarily slower to market and/or perform significantly worse than tightly coupled solutions?

      If so, I'm not entirely convinced. When I see tightly coupled software being produced it's normally a combination of one or more of:

      • Developers not having the necessary knowledge of ways to create software in a loosely coupled manner (e.g. not knowing about techniques like dependency injection.)
      • Not having appropriate tools to make the development of loosely coupled software simple (e.g. a language like Perl or Java offers more features that help with loose coupling than a language like C or COBOL.)
      • Not having experience of software development practices that encourage loosely coupled software (e.g. using TDD.)
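      (To make the dependency-injection point above concrete, here is a minimal Perl sketch — all names are hypothetical, not from any CPAN module. Instead of a class constructing its own collaborator, the caller passes one in, so the two pieces stay loosely coupled and a test can inject a mock:)

```perl
use strict;
use warnings;

# Hypothetical notifier that a report generator depends on.
package EmailNotifier;
sub new  { bless {}, shift }
sub send { my ($self, $msg) = @_; print "emailing: $msg\n" }

# A tightly coupled ReportRunner would call EmailNotifier->new itself.
# The loosely coupled version accepts any object with a send() method.
package ReportRunner;
sub new {
    my ($class, %args) = @_;
    return bless { notifier => $args{notifier} }, $class;
}
sub run {
    my $self = shift;
    # ... generate the report here ...
    $self->{notifier}->send('report finished');
}

package main;
# The dependency is injected from outside.
my $runner = ReportRunner->new( notifier => EmailNotifier->new );
$runner->run;
```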

      Sure - there are some instances where a tightly coupled system has been deliberately chosen due to some constraint - but they seem rare in my experience. More often they're done first because that's the only way the developers know how to create software.

        I am not implying that.

        I am implying that maintaining loose coupling necessarily keeps you from finding certain kinds of optimizations, which means that the top possible performance (tight memory use, etc.) can only be hit by a tightly integrated application. If performance is a significant constraint (mini-computers in the '70s, GUI interfaces on PC hardware in the '80s, PDAs in the '90s), then tightly coupled systems may be viable several years before loosely coupled ones are.

        (As a practical matter, most attempts to write a tightly coupled system will result in something slower than a loosely coupled system could have been. This is certainly a strategy that one would only try with very good reason.)

        This applies to a small minority of software, and applies to virtually none in the Perl world. Places where I might expect to see this happen today include extremely high-volume servers (e.g. Google), very constrained systems (e.g. many embedded projects), and computationally intensive products (some games). None of those are good candidates for Perl because Perl is relatively big and slow.

      I have a couple of problems with your explanation.

      First, the performance and rapid-development gains from tight coupling typically aren't as significant (and, in the case of rapid development, may not exist at all) as the gains from simply choosing the right tool (language) for the job. The analogy you provide doesn't strike me as particularly apt.

      Second, the importance of incremental performance gains from tight coupling decreases over time as system performance capability increases. The reason UNIX was able to overcome less modular designs wasn't so much that those designs were superseded as that they were no longer a simple necessity of hardware restrictions. I tend to think that as system performance capability increases, we'll see further evolution toward modularization, and what we think of as loose coupling today may be classed as tight coupling in a few years.

      Older, less modular OS designs didn't develop so much because they provided a simpler path toward development as because A) greater modularity in design hadn't really been experimented with very much yet and B) more modular design simply wouldn't run on the comparatively limited hardware available at the time.

      print substr("Just another Perl hacker", 0, -2);
      - apotheon
      CopyWrite Chad Perrin

        I find it ironic that you have problems with my explanation, and then proceed to indicate that you got the main point.

        Your reason B for less modular OS designs was exactly my point. Less modular designs were feasible on the hardware before a more modular design would be. Once more modular designs became feasible, it was only a question of time until one emerged as a standard and then won due to advantages in cost, configurability, and maintainability. But there was a window where non-modular solutions made sense.
