Archive for December 26th, 2016

The implications of an unparsable machine language aren’t just philosophical.  For the past two decades, learning to code has been one of the surest routes to reliable employment — a fact not lost on all those parents enrolling their kids in after-school code academies.  But a world run by neurally networked deep-learning machines requires a different workforce.  Analysts have already started worrying about the impact of AI on the job market, as machines render old skills irrelevant.  Programmers might soon get a taste of what that feels like themselves.
This explosion of indeterminacy has been a long time coming.  It’s not news that even simple algorithms can create unpredictable emergent behavior — an insight that goes back to chaos theory and random number generators.  Over the past few years, as networks have grown more intertwined and their functions more complex, code has come to seem more like an alien force, the ghosts in the machine ever more elusive and ungovernable.  Planes grounded for no reason.  Seemingly unpreventable flash crashes in the stock market.  Rolling blackouts.
These forces have led technologist Danny Hillis to declare the end of the age of Enlightenment, our centuries-long faith in logic, determinism, and control over nature.  Hillis says we’re shifting to what he calls the age of Entanglement.  “As our technological and institutional creations have become more complex, our relationship to them has changed,” he wrote in the Journal of Design and Science.  “Instead of being masters of our creations, we have learned to bargain with them, cajoling and guiding them in the general direction of our goals.  We have built our own jungle, and it has a life of its own.”  The rise of machine learning is the latest — and perhaps the last — step in this journey.
To nerds of a certain bent, this all suggests a coming era in which we forfeit authority over our machines.  “One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand,” wrote Stephen Hawking — sentiments echoed by Elon Musk and Bill Gates, among others.  “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”
   —    Jason Tanz
From his article:  “The End Of Code”
Appearing in the June 2016 issue of Wired magazine
[Every 10 years or so we are cautioned about computers, the end of programming, Artificial Intelligence and “the end of code”.  And, as always, I am reminded of the quote:  “The survival value of human intelligence has never been satisfactorily demonstrated.”   —   Michael Crichton from his book:  “The Andromeda Strain”.   I guess we may see, sooner rather than later.   —   kmab]
On This Day In:
2021 Live Well
2020 Every Touch Vibrates
When She Smiles
2019 Six Reasons Why #DumbDonald Will Always Be A Failure
2018 The Trouble With Bookstores
Another No Chew Diet
2017 Biased World View
2016 Control In The Age Of Entanglement
2015 Okay, Maybe Not Ceaseless
2014 Can Do
2013 Are You Helping?
2012 Inside All Truth Is A Vacuum
2011 So, Whom Are We Trying To Fool Then?
