Vanishing Point

Welcome to the Fourth Wave? Cognitive Tech and AI Are Reshaping Humanity

Posted by Eric Chase June 28, 2017 at 10:30 AM

In 1950, Alan Turing published an article in Mind entitled 'Computing Machinery and Intelligence.'[1] It posits, confidently, that computers would advance to a point at which they could be programmed to learn and to perform with skill rivaling human intelligence. At the time, the proposition was revolutionary, though it built on the computability and universal machine work Turing had been pursuing since 1936. The paper laid out the discrete state machine model, clarified the prospect of what he termed 'intelligent machinery,' and offered concrete suggestions about how Artificial Intelligence could come about.

And it introduced the famous 'Turing Test' (aka the 'Imitation Game'), in which a human and a computer would be "interrogated under conditions where the interrogator would not know which was which, the communication being entirely by textual messages." The logic of the game is that if the interrogator could not distinguish the two, it would stand as evidence that technology could be intelligent, cognitive or, at its farthest reaches, conscious, in a manner of speaking.

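The protocol Turing described is simple enough to sketch in code. The outline below is purely illustrative, not anything from Turing's paper or any real system; the judge object and its ask and identify_human methods are hypothetical stand-ins for the human interrogator.

```python
import random

def imitation_game(judge, human, machine, rounds=5):
    """A minimal, illustrative sketch of Turing's Imitation Game.

    `human` and `machine` are hypothetical callables that map a
    question (str) to a textual reply (str). `judge` is a
    hypothetical object that poses questions and renders a verdict.
    Communication is entirely textual, so the judge never learns
    which respondent is which except through conversation.
    """
    # Hide the two respondents behind randomly assigned labels.
    contestants = [human, machine]
    random.shuffle(contestants)
    respondents = dict(zip(("A", "B"), contestants))

    # The judge interrogates both sides over several rounds of text.
    transcripts = {label: [] for label in respondents}
    for _ in range(rounds):
        for label, respondent in respondents.items():
            question = judge.ask(label, transcripts[label])
            transcripts[label].append((question, respondent(question)))

    # The machine "passes" if the judge misidentifies the human.
    guess = judge.identify_human(transcripts)  # returns "A" or "B"
    return respondents[guess] is not human
```

The detail worth noticing is that text is the only channel: the test measures behavior, not biology, which is precisely the distinction the rest of this piece turns on.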

The emergence of Artificial Intelligence and machine learning technologies over the past decade confirms that machines can exhibit genuine cognitive ability, and many of these technologies even seem to evoke consciousness. Yet we know that technology, even augmented reality, is not actually biological – and consciousness fundamentally is. A recent article in the Harvard Business Review entitled 'In the AI Age, "Being Smart" Will Mean Something Completely Different' poses the question, "So, what can we do to prepare for the new world of work? Because AI will be a far more formidable competitor than any human, we will be in a frantic race to stay relevant. That will require us to take our cognitive and emotional skills to a much higher level."[2]

The article follows what is by now a familiar argument – that technologies like wearables and AI are integrating with humans, making us all part of a biodigital convergence. We know that 'smart' technologies, from something as simple as a health tracker to something as complex as a Brain Computer Interface (BCI), have merged people and products, generating new universes of data and new potential for working, thinking, and performing (along with new risk).[3]

More interesting than the article itself is a comment included in the discussion area of the piece:

The fundamental distinction between Artificial and Living intelligence is the way it is powered. Like bacteria, Artificial Intelligence can get its electrons a number of ways, whereas most animal life can only get its electrons through complexly processed means. This situation has been the architect of human predictive power, and has used bioenergetic hacks and biases to economize prediction. Empathy, imagination, and emotion are all predictive hacks, and what chiefly distinguish us from Artificial predictive processes. 


The point is that wearables, bionics, AI, and virtual/mixed/augmented reality – the bleeding-edge technologies that are (quite literally) reshaping how we work, interact, learn, and live – are doing so as 'other' to our individual, biological selves. Ironically, what we dubbed 'smart' technology has begun to seem ill-equipped for where we are going. We have reached a juncture where tracking and data analytics have given way to capabilities that some say may accelerate human evolution.

These are more than smart technologies. They are computer programs able to learn a task far faster than any human; meta-applications developed to recreate themselves, like some futuristic ouroboros; and AR devices that enhance what you see and take commands based on what you hear, say, and gesture.[4] Together, these cognitive technologies are increasing our human capacity to sense, reason, and control.

This shift has massive implications for consumers, companies, cultures, and countries. It poses hard questions about human adaptability. How much can we enhance our cognitive abilities before we lose fundamental human qualities? Will rejecting augmentation become a new basis on which people are classified, hired, and employed? While many of these questions sound like plot twists from an episode of Black Mirror, the looming opportunities and risks demand a future-focused approach.

With the broad consumer adoption of smart IoT technologies, biodigital convergence became an everyday reality. With the Internet of Everything (IoE), people and machines became even more tightly integrated into critical infrastructure. That integration poses enormous, largely unforeseen risks. It also represents significant opportunities. Resilience will depend on how well we understand, plan for, and adapt to both.

It’s time to consider how cognitive technology can be an asset for the future of work, society, and humanity.



[1] http://loebner.net/Prizef/TuringArticle.html

[2] https://hbr.org/2017/06/in-the-ai-age-being-smart-will-mean-something-completely-different

[3] https://futurism.com/we-may-be-on-the-brink-of-a-new-age-in-human-evolution/

[4] http://magicleap.com

Eric Chase

Eric Chase is a Senior Associate at Toffler with a proven track record of developing trusted customer relationships with industry and government leaders in the defense, intelligence, security, and law enforcement communities. He is a recognized thought leader with more than a decade of experience helping leaders create and implement strategies to enhance organizational innovation and agility in both the public and private sectors.
