Artificial Intelligence – Time to Take a Deep Human Breath

By Peter Bernstein
28 Jan 2019

I can’t speak for the rest of my BCStrategies colleagues, but I am incredibly tired of the hyperbole surrounding artificial intelligence (AI), machine learning, automation and “digital transformation.” The truths are:

  • There is nothing artificial about the intelligence of today’s and future computing capabilities.
  • We are not on the verge of digital Armageddon and total dystopia.
  • Conversely, we are not on the cusp of a digital utopia for businesses that trickles down to the rest of us in the next few years.
  • Digital transformation as industry elixir for enterprise communications is nothing new.

Anecdotally, on the last point, my colleagues will remember the early 1980s sign over the cafeteria of a major communications equipment vendor. It read something like: “We will be the leaders before, during and after the digital transformation.” At a lunch with Fortune 50 CIOs and CTOs, one of the visitors asked me, in my capacity as a consultant to the vendor, whether the hosts were kidding: “There is no before, during and after. There is only now and crisis management. We will likely be at our current jobs less than three years.” The career situation has only gotten worse. Pardon the digression.

It is time to take a deep breath and put a little context around all of the hyper enthusiasm. We are clearly getting “over our skis.” This is problematic at best.

The Past is Prolog?

What got me thinking about this was the recent New York Times article, “The Hidden Automation Agenda of the Davos Elite,” by Kevin Roose. It amplified the prediction made the same week on CBS’ widely viewed 60 Minutes by respected technologist Kai-Fu Lee that AI will displace 40 percent of the world’s workers within 15 years. He added, “I believe [AI] is going to change the world more than anything in the history of mankind…More than electricity.” Throw on top of this a dash of the rampant coverage of CES, which was all about AI and the disruption to be caused by 5G’s arrival. YIKES!

There is limited space here to recite the long history of AI in particular. For that I recommend everyone read the accurate description on Wikipedia, “History of artificial intelligence.” It starts with:

The history of Artificial Intelligence (AI) began in antiquity, with myths, stories and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen…

The seeds of modern AI were planted by classical philosophers who attempted to describe the process of human thinking as the mechanical manipulation of symbols. This work culminated in the invention of the programmable digital computer in the 1940s, a machine based on the abstract essence of mathematical reasoning. This device and the ideas behind it inspired a handful of scientists to begin seriously discussing the possibility of building an electronic brain.

Wikipedia appropriately also has an entry, “Timeline of machine learning.” It too is worth a read and a bookmark.

In addition, put on your must-read list “A Mind at Play: How Claude Shannon Invented the Information Age,” by Jimmy Soni and Rob Goodman. It documents why Shannon is respectfully credited as “The Father of the Information Age.” It notes that in 1950 he demonstrated a mechanical mouse capable of “learning” how to navigate a maze by trial and error, i.e., six years before the recognition of AI as a field of study.

In short, we appear to be in another period of jargon juggling, a.k.a. “let’s give them something to talk about.” This is not much different, for example, from the fascination with the Cloud. To my mind, the Cloud is an updating and rebranding of what IBM in the 1970s called “shared space computing.” In telecom we had Centrex, a dedicated computer that “hosted” several hundred features/services enterprises consumed on a subscription basis instead of via their on-premises PBXs. It too was what we’d now call Cloud. Think of all the rebranding of hosted communications since the advent of digital Centrex in the late 1970s and early 1980s. You have to love the evolution of marketing spin!

I have written and spoken many times in the past decade that we are in, “The Age of Acceleration.” It is an age where the only constants in computing and communications are change and the velocity at which it is coming at us. Those who do not adapt to change and its speed will suffer the dire consequences.  

Will AI and automation rapidly change the way in which work is done and by what? Of course. This is digital manifest destiny. It is useful to think about how many of the mundane things humans do can be done faster, more efficiently and more effectively by powerful processing-enabled non-human capabilities. History informs us that it is not, however, a reason to hyperventilate.

Will it displace human activity? Hardly. What the Information Age has taught us, if nothing else, is that while computational and communications innovation-driven change creates what many deem premature obsolescence, it also creates incredible opportunities and wealth. The real problem is not the power of new tools, but rather that the gap between the pace of change and the ability of our educational systems and public policy apparatus to adapt to our changing world is growing rather than shrinking.

The Davos Elite’s focus on short-term profits and shareholder value creation (i.e., their job security) imperils the societal value that properly focused, applied and supervised AI, et al, can deliver. It is noteworthy that the supposed revolution we are engrossed in will, like almost all revolutions before it, probably take place in evolutionary fashion, albeit somewhat quicker.

Machines may be able to learn faster, provide “insights” and execute decisions in more timely ways, but it is trust that makes the world go around. Trust, along with the exercise of emotion-imbued volition, is an essential ingredient of what separates man from machine.

A reasoned embrace of AI and its cohorts, with a long-term understanding of costs as well as benefits, is the path to a future where benefits are optimized and risks minimized. We need to get off the top of the hype curve and use some of the exponentially increasing computing power to get a handle on use cases, metrics, best practices, etc.

Some have speculated that soon the need for industry analysts and consultants will disappear. As tantalizing as this sounds to those of us looking for more time to indulge our other interests, I hope not. I like writing. What we all could use are AI-driven capabilities that, with five-nines accuracy, delete according to our instructions all of the communications filling our virtual receptors. The time we would get back from not manually emptying spam filters, reviewing emails, cleaning up our social media accounts, and deleting voicemails and other digital detritus would enhance personal and professional productivity, i.e., be invaluable. This would be AI to the max. Worth a very deep breath.
