Artificial intelligence: will engagement remain uniquely human?
Strong artificial intelligence will eventually be capable of integrating perceptions, feelings and even emotions. Will robots become engaged humanoids? Can engagement be understood outside the human sphere?
According to the American cognitive scientist Marvin Lee Minsky, the aim of artificial intelligence is to get machines to carry out actions that humans accomplish with their intelligence. (1)
A few years ago, people only talked about weak artificial intelligence: robots programmed with tailored algorithms that simulated classical intelligence. It was easy to imagine that, given its intimate and emotional nature, engagement would remain a human prerogative and an unparalleled advantage for collective performance.
Yet we now know that strong artificial intelligence will eventually be capable of integrating perceptions, feelings and even emotions. The questions posed above therefore deserve a serious answer.
While engagement is complex to understand, define and, above all, control, we do know some of its main principles, including the concept of a moral contract under which employees do their best. This is a tacit agreement based on bilateral trust between the employee and the employer, the individual and the Group.
Reciprocity is at work on several levels, making it more complex. The first is relational, involving the link between the employee and the line manager: what English speakers call "the locus of engagement." The second involves the organizational trust between the employee and the Group, since engagement implies loyalty. Trust therefore combines reason and emotion.
Engagement draws on our instincts, actively mobilizing what Frédéric Laloux calls “our three brains”(2): our guts, our heart and, as most commonly acknowledged, our brain. Rarely has a moral concept drawn on so many nerve cells, which drive our behavior. This careful blend boosts performance.
Paul Zak, a neuroscientist focusing on organizational performance, has developed a methodology for identifying the cultural drivers likely to build a culture of trust at work, a psychological state he identifies as essential to collective performance.
In particular, he demonstrates the fundamental role of the hormone oxytocin in the creation of trust between individuals. Combined with a dose of strategic vision communicated by top management, trust (which Zak calls the “moral molecule”) enables engagement and therefore performance. (3) Although artificial intelligence is already able to distinguish true from false and fair from unfair, the moral link appears to remain exclusively human.
It is highly likely that, in the short term, the trust/performance mix will remain one of the keys to a reinvented form of management: a future leadership whose top priorities will be mastering our instincts, neuronal plasticity and relational adaptability. The ability to trust and be trusted, to find our way in an uncertain world and to follow our gut instinct will be a valuable skill in an increasingly fast-changing ecosystem.
Robots could take longer to develop this skill than humans.
First published in Les Echos
(1) Marvin Minsky, The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind, Simon & Schuster, 2007
In his numerous publications, Minsky attempts to map human thought. He brings philosophical insights into artificial intelligence, which he approaches from a moral and computing standpoint.
(2) Frédéric Laloux, Reinventing Organizations: Vers des communautés de travail inspirées. Diateino, 2015
Laloux takes the example of our three brains to support his theory on the role of emotions in management practices, and in particular the role of instinct.
(3) Paul J. Zak, Trust Factor: The Science of Creating High-Performance Companies. AMACOM, 2017
Paul J. Zak, The Moral Molecule: The Source of Love and Prosperity. Dutton, 2012