Re^13: Beyond Agile: Subsidiarity as a Team and Software Design Principle

by einhverfr (Friar)
on Jul 23, 2015 at 07:02 UTC ( [id://1135957] )


in reply to Re^12: Beyond Agile: Subsidiarity as a Team and Software Design Principle
in thread Beyond Agile: Subsidiarity as a Team and Software Design Principle

I am still not convinced this is a difference of kind. It still looks like a difference in degree, i.e. more complexity rather than a different kind of complexity. Not all torpedoes were purely gyroscopic. Depth sensing was accomplished, for example, with pneumatic bladders.


Re^14: Beyond Agile: Subsidiarity as a Team and Software Design Principle
by Anonymous Monk on Jul 23, 2015 at 09:24 UTC

    I am still not convinced this is a difference of kind. It still looks like a difference in degree, i.e. more complexity rather than a different kind of complexity. Not all torpedoes were purely gyroscopic. Depth sensing was accomplished, for example, with pneumatic bladders.

    Ah, the complexity of pneumatic bladders; now there is a machine that will take over and suffocate the world :P

      If I understand you correctly, you are making a "More is Different" kind of argument, i.e. that software cannot be reduced to applied hardware. And at least in some areas I agree. Hardware and software are different abstractions for different domains. Software is not merely applied hardware, just as chemistry is not merely applied atomic physics and biology is not merely applied chemistry.

      My doubt, though, is with the specifics. Chemistry derives some important properties from atomic physics, and software derives important properties, including limits, from hardware. In other words, software will always be physically limited by hardware and by other important physical limits (for example, the CAP theorem becomes a physical limit when you are talking about physically separate systems). We think about logic problems and decision trees in different ways.

      So what I am getting at is that there is a lot to be gained by looking at software from a software-as-machine perspective. There is a lot of insight to be had there. I think you and the other poster (the "Managing the Mechanism" guy) are arguing about how it is different, and I am not convinced that either set of differences holds up. I think no software plays the game itself (that's a fantasy), but software is not unlimited by either logic or the physical world. Rather, the differences are differences in the abstractions we come up with to manage the differences in complexity.

      An example might be quantum physics -> solid state electronics -> integrated circuits -> chip architecture. Each of these disciplines invents additional abstractions to deal with the complexity and changes in behavior found. But if you want to really understand one level you would do well to become reasonably competent at the level immediately underlying it.

        software is not unlimited by either logic or the physical world. Rather, the differences are differences in the abstractions we come up with to manage the differences in complexity.

        Software is limited by the hardware it runs on; but those limitations are the hardware's limitations, not the software's.

        I'll take this slowly, and that is not an implied insult to you, just a necessity, because either I'm really bad at conveying my message -- though historically I've been told I'm reasonably good at that -- or this is one of those concepts -- like quantum mechanics -- that people have a real hard time grasping.

        E.g. a 64-bit process can theoretically address 16 EiB (18446744073709551616 bytes); but the first x64 processors had a physical bus limit of 44 bits. That physical reality constrained the performance and capacity of software written for that processor. But with the next generation of hardware, x64 chips had a 48-bit bus, so that same software -- if written correctly -- would, without even being recompiled, instantly have 16 times its old capacity.
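
        To put rough numbers on that, here is a quick back-of-envelope sketch in Perl; the bus widths are simply the figures quoted above:

            # Back-of-envelope: addressable bytes at each bus width,
            # using Math::BigInt because 2**64 overflows a native integer.
            use strict;
            use warnings;
            use Math::BigInt;

            my $virtual = Math::BigInt->new(2)->bpow(64);  # 64-bit virtual address space
            my $bus44   = Math::BigInt->new(2)->bpow(44);  # 44-bit physical bus
            my $bus48   = Math::BigInt->new(2)->bpow(48);  # 48-bit physical bus

            print "virtual space: ", $virtual->bstr, " bytes (16 EiB)\n";
            print "44-bit bus:    ", $bus44->bstr,   " bytes (16 TiB)\n";
            print "48-bit bus:    ", $bus48->bstr,   " bytes (256 TiB)\n";
            print "ratio:         ", $bus48->copy->bdiv($bus44)->bstr, "\n";  # 16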

        But it doesn't stop there. The same processor that has 16 times the capacity also halved the latency between chip elements, and effectively doubled the number of those elements available to the chip designer. The first means that the (unchanged) software runs (almost) twice as fast at the same clock speed; the second means that the chip designer can expand the size of the caches, increase the depth of the instruction pipelines, and increase the number of parallel execution units; with the effect that, at the same clock speed, the CPU can enact the same instructions that on the earliest version took 3, 5, 8, or 21 clock cycles to complete, so that now almost every instruction completes within a single clock cycle.

        So, the same, unchanged, even un-recompiled, software has 16 times its old capacity and executes 2, 4 or even 8 times faster.

        Software has only logical limitations.

        Contrast that with the fact that building a bridge across the Atlantic Ocean will forever be physically impossible. Traveling to other galaxies will remain physically impossible.

        Step 2 follows.



        Part 2

        In this complex machine (Babbage's Difference Engine), each gear wheel represents one decimal digit.

        Each time the bottom gear completes a full rotation (counts to 10), it causes the gear wheel above to step forward by 1/10 of a rotation. And when the bottom gear completes 10 rotations, it has caused the second wheel to rotate completely once, which in turn causes the third gear wheel to step forward 1/10 of a rotation; thus 100 is recorded.
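
        A minimal sketch in Perl of that carry mechanism, modelling each gear wheel as one decimal digit:

            # Each "gear wheel" holds one decimal digit; when a wheel completes
            # a full rotation (wraps past 9) it steps the wheel above it
            # forward by one -- i.e. a carry.
            use strict;
            use warnings;

            my @wheels = (0, 0, 0);            # three wheels, least significant first

            sub turn_bottom_wheel {
                my $i = 0;
                while ($i < @wheels) {
                    $wheels[$i]++;
                    last if $wheels[$i] < 10;  # no full rotation, so no carry
                    $wheels[$i] = 0;           # full rotation: reset this wheel ...
                    $i++;                      # ... and step the one above it
                }
            }

            turn_bottom_wheel() for 1 .. 100;
            print join('', reverse @wheels), "\n";   # prints 100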

        So, to add digits of precision, you simply add more gear wheels; but there is a problem with that. As you add more gear wheels, the force required to click over the highest digit gets larger and larger. At first you can make the bearings more accurate and better lubricated; then you can add a reduction mechanism to the (human-powered) input shaft, so that it requires more turns of the handle to drive a single rotation of the bottom gears. That means it takes longer to run through a given calculation, but the force multiplier of the reduction gear overcomes the input force limitation.

        That difference engine uses 31-digit numbers. With modern manufacturing techniques and materials, I could see that being extended to 100 or even 200 digits. The cost would be huge, and the reduction ratio of the input drive would be very high, meaning that the calculations would be very slow; so today, we'd just add an electric motor to do the donkey work.

        So then you add more digits and increase the reduction gear ratio further to compensate; but eventually physics wins out, and the force required to turn all the gears in concert is so high that the metal of the gear teeth -- whatever metal you use -- simply cannot transmit the forces required. And you've hit the fundamental physical limitations of the system.

        In software, that physical limitation doesn't exist.

        On my commodity hardware machine, using arbitrary-precision software, I can multiply 10,000-digit numbers together with ease. Slowly, but the greatest effort is inputting the numbers. And distributed computing projects like the Great Internet Mersenne Prime Search are routinely working with numbers with millions of digits. Whatever physical limits the hardware of the day has, they can be overcome by "simply" using more hardware. Well designed and properly written for the purpose, the software doesn't need to change at all.
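
        For illustration, a small sketch using Perl's core Math::BigInt module; the operands here are just repeated digit patterns, not anything meaningful:

            # Two ~10,000-digit operands multiplied with core Math::BigInt;
            # the only limits are memory and patience, not the CPU word size.
            use strict;
            use warnings;
            use Math::BigInt;

            my $a = Math::BigInt->new('123456789' x 1112);   # 10,008 digits
            my $b = Math::BigInt->new('987654321' x 1112);
            my $product = $a->copy->bmul($b);

            printf "%d x %d digits -> %d digits\n",
                length($a->bstr), length($b->bstr), length($product->bstr);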

        Suggesting that the difference is simply a matter of scale is like saying the distance from here to Alpha Centauri is just a matter of scale. The truism that every journey starts with the first step doesn't help when there are 41314127522800000 steps to take.
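
        A rough sanity check of that figure, assuming one-metre steps and about 4.37 light-years to Alpha Centauri:

            # Assumes one-metre steps and ~4.367 light-years to Alpha Centauri.
            use strict;
            use warnings;

            my $metres_per_ly = 9.4607e15;           # metres in one light-year
            my $steps = 4.367 * $metres_per_ly;      # one step ~= one metre
            printf "roughly %.3g steps\n", $steps;   # ~4.13e16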

        The differences between software and hardware are not orders of scale or magnitude, but the difference between fundamental physical laws and the human brain's ability to conceive of and coordinate the logical complexity of the problem.

        As yet, software is still in its infancy. The software (DNA) that controls the hardware (wetware) of the human brain is millions of times more complex than the software we currently write. Our ingenuity has allowed us to construct hardware that can run our, relatively speaking, currently simple software very fast. Much, much faster than the human brain.

        However, we don't yet have good algorithms for solving problems that even extremely small and primitive biological computers (e.g. the brains of ants, bumblebees, octopuses and jellyfish) take in their stride. We currently compensate for the crudity of our algorithms with brute force, using speed and the historic growth of that speed.

        That growth in speed is rapidly running out, so we are now moving to concurrency and massively distributed concurrency to compensate. In doing so we hit another fundamental physical limit: energy demands and cost. The largest HPC systems use tens of megawatts of power; but mostly what they do are quite simple algorithms; they just do them billions and billions of times to produce the results we are after.

        But it still requires the human brain to intuit the next steps in the evolution of our knowledge.


