Our CEO Bill McDermott says, “Everything has to start with empathy for the end user.” So what, precisely, is empathy? It is “the quality of feeling and understanding another person’s situation in the present moment and communicating this to the person.” Empathy also fosters social-emotional competencies such as self-awareness, responsible decision making, relationship skills, and self-management. A machine’s ability to read the “emotion states” of a mind (body language, voice, and facial expression) is critical in the new paradigm of autonomous systems. Empathy changes the way we feel, and, more importantly, machines can be trained to recognize, express, and “have” emotions by reading those three emotion states.
When building such a cognitive system, face detection and recognition play a huge role. Humans recognize and detect faces routinely and effortlessly. The first automatic face recognition system was developed by Kanade in 1973, and the field has since evolved to high levels of performance thanks to machine learning and increased processing power. Essentially, face recognition is a classification problem consisting of four critical modules:
- Face detection
- Face alignment
- Feature extraction
- Feature matching (classification)
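To make the four modules concrete, here is a minimal numpy-only sketch of the pipeline. Every stage is a deliberately simplified stand-in (the "detector" just returns the full frame, the "features" are centered pixel intensities, and matching is nearest-neighbor distance); all function and subject names are illustrative, not part of any real system.

```python
import numpy as np

rng = np.random.default_rng(0)

def detect_face(image):
    """Module 1, face detection: locate a face box (stubbed as the full frame)."""
    h, w = image.shape
    return (0, 0, w, h)

def align_face(image, box):
    """Module 2, alignment: crop the detected region and normalize it to 8x8."""
    x, y, w, h = box
    face = image[y:y + h, x:x + w].astype(float)
    ys = np.linspace(0, face.shape[0] - 1, 8).astype(int)
    xs = np.linspace(0, face.shape[1] - 1, 8).astype(int)
    return face[np.ix_(ys, xs)]

def extract_features(face):
    """Module 3, feature extraction: flatten and zero-center pixel intensities."""
    v = face.ravel()
    return v - v.mean()

def recognize(image, templates):
    """Module 4, matching: nearest enrolled template by Euclidean distance."""
    feats = extract_features(align_face(image, detect_face(image)))
    return min(templates, key=lambda name: np.linalg.norm(feats - templates[name]))

# Enroll two synthetic "faces" and probe with a noisy copy of one of them.
base = {name: rng.uniform(0, 255, size=(32, 32)) for name in ("alice", "bob")}
templates = {name: extract_features(align_face(img, detect_face(img)))
             for name, img in base.items()}
probe = base["alice"] + rng.normal(scale=5.0, size=(32, 32))
print(recognize(probe, templates))  # prints "alice": the probe is a noisy alice
```

A real system would replace each stub with a learned component (a trained detector, landmark-based alignment, a learned embedding, and a trained classifier), but the four-module structure stays the same.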
Detecting Expressions and Machine Learning
Paul Ekman has spent 40 years studying facial expressions in thousands of people and is the world authority on the subject. According to Ekman and other prominent scientists, facial expressions across the globe fall roughly into seven categories:
- Anger
- Contempt
- Disgust
- Fear
- Happiness
- Sadness
- Surprise
Each one of these categories has well-defined markers. For example, with surprise, the upper eyelids and brows rise and the jaw drops open. An empathic system leverages these markers with machine learning algorithms such as principal component analysis (PCA), independent component analysis (ICA), linear discriminant analysis (LDA), neural network methods, AdaBoost-based methods, and other learning algorithms. The FERET database, an image corpus consisting of 14,051 eight-bit grayscale images of human heads, was assembled to support government-monitored testing and evaluation of face recognition algorithms using standardized tests and procedures. This database is publicly available and can be leveraged as well.
Empathy in Machine Learning Provides the Human Touch
Empathic applications have profound use cases across many domains, such as customer service, companion robotic systems, and defense and security. More importantly, these systems can also provide superior caregiving to people suffering from conditions such as autism, cataracts, Alzheimer’s disease, and osteoporosis.
Machine learning enables machines to know who, what, where, when, and why, so that the machines can anticipate and respond to our needs gracefully.
In the new era of machine learning and artificial intelligence, empathy matters. As empathic systems and applications continue to evolve, they’re certainly not replacements for humans; rather, empathic machines and humans will coexist.
What Do You Think?
I’d like to hear your thoughts on the role empathy plays in machine learning. Share them here or on Twitter at @ChandranSar