
Re^3: The future of Perl?

by wjw (Priest)
on Nov 09, 2014 at 17:05 UTC (#1106623)

in reply to Re^2: The future of Perl?
in thread The future of Perl?

Thanks for that, BrowserUk. Those points are informative to me, and seem to be on the money.

Clearly the example I used was a poor one for the point I was trying to make (a consequence of my lack of depth in the arena). Perhaps our understanding of language is driving our innovation in hardware rather than the reverse. I wonder what might happen if the reverse were pursued more aggressively. Maybe that is what is happening, at some level, in the fields of bio-mimicry and perhaps even in quantum computing.

At any rate, I think you are right; hardware does not affect languages at this point in time, but I don't doubt that it will in the future. I find it hard to believe that all machine languages must, by their nature, operate at the most basic level on just two states. I think the success of binary thus far has limited innovation.

I have gone way off track relative to the original post and suspect I had better stop. Thanks for the interesting meditation.

...the majority is always wrong, and always the last to know about it...

Insanity: Doing the same thing over and over again and expecting different results...

A solution is nothing more than a clearly stated problem...otherwise, the problem is not a problem, it is simply an inconvenient fact

Re^4: The future of Perl?
by BrowserUk (Pope) on Nov 09, 2014 at 18:48 UTC
    I think you are right; hardware does not affect languages at this point in time, but I don't doubt that it will in the future.

    Hm. All the signs are that more and more, languages will present higher and higher level abstractions of the programming model and the hardware will adapt to the software view.

    For example, storage system manufacturers are beginning to move away from the bytes, blocks, and files view of the disks and SSDs they manage toward an object-oriented view, which does away with the need to convert binary in-memory objects to some intermediate format (e.g. JSON) so that they can be written out as files via the file system.
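    The round trip described above can be sketched as follows. This is a minimal illustration in Python, not tied to any real storage product; the `ObjectStore` class is a hypothetical stand-in for an object-storage interface:

    ```python
    import json

    # The traditional path: an in-memory object must be flattened to an
    # intermediate text format (here JSON) before the file system can store it.
    record = {"name": "node", "id": 1106623, "tags": ["perl", "hardware"]}

    with open("record.json", "w") as fh:
        json.dump(record, fh)          # object -> JSON text -> bytes on disk

    with open("record.json") as fh:
        restored = json.load(fh)       # bytes -> JSON text -> object again

    assert restored == record

    # An object store, by contrast, exposes a put/get interface keyed by an
    # object ID, so the serialization step disappears from the programmer's
    # view. (This ObjectStore is hypothetical, not a real product API.)
    class ObjectStore:
        def __init__(self):
            self._objects = {}

        def put(self, oid, obj):
            self._objects[oid] = obj

        def get(self, oid):
            return self._objects[oid]

    store = ObjectStore()
    store.put("record/1", record)
    assert store.get("record/1") == record
    ```

    In a real object-storage system the store would of course persist the object's bytes somewhere, but the point stands: the programmer hands over an object and gets the same object back, with no intermediate format in sight.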

    CPUs currently have 3 or 4 levels of caching, much of which is split into separate instruction and data caches. I wouldn't mind betting that it won't be long before we have another type of on-chip cache: a dedicated stack cache. By their nature and use, stacks have really good locality of reference, which makes breaking them up into typically short (16-byte) cache lines and mixing them in with the random accesses of heap storage sub-optimal. Better to cache stacks in 2 KB or 4 KB chunks and keep them in memory between task switches where possible. Segregated stack caches would allow a relatively small amount of dedicated on-CPU real estate to keep the stacks of many or most threads alive between context switches, while preventing stack memory from 'polluting' the general data caches.

    Further, hypervisors are well on their way to virtualising everything: disks, memory, networks, ports, CPUs, GPUs. Essentially, the 'computer' where your programs run is nothing more than a figment of the hypervisor's imagination. The programmer's view may be of a single 32-bit CPU with 2GB of RAM and a local hard drive, while the reality is a VM running on 1 of 32 cores of a 64-bit 4-CPU blade, in a farm of machines with 128GB of memory, with the drive provisioned from a multi-tiered remote drive array.

    And the actual hardware might be x64, or ARM, or SPARC, or POWER8; and neither the program nor the programmer need be aware which.

    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.

      IBM’s System/38 minicomputer took just such an “object-oriented” view. Since the computer was designed only to run their operating software, they pushed many low-level operations directly into the microcode. I don’t believe that they continued that practice in the machine’s functional successors . . .
