Business Trends
Saravana Chandran

Predictive Thursdays: Empathy, Facial Recognition, and Machine Learning

Our CEO Bill McDermott says, “Everything has to start with empathy for the end user.” So what, precisely, is empathy? It is “the quality of feeling and understanding another person's situation in the present moment and communicating this to the person.” Empathy also fosters social-emotional competencies such as self-awareness, responsible decision making, relationship skills, and self-management. A machine's ability to read the “emotion states” of a mind (body language, voice, and facial expression) is critical in the new paradigm of autonomous systems. Empathy changes the way we feel, and, more importantly, machines can be trained to recognize, express, and “have” emotions by reading those three emotion states.

When building such a cognitive system, facial recognition and detection play a huge role. Humans recognize and detect faces routinely and effortlessly. The first automatic face recognition system was developed by Kanade in 1973, and the field has since evolved to high levels of performance thanks to machine learning and greater processing power. Essentially, face recognition is a classification problem consisting of four critical modules:

  • Detection
  • Alignment
  • Feature extraction
  • Matching
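To make the four modules concrete, here is a minimal sketch of such a pipeline in Python. The function names and the trivial detector, subsampling alignment, and nearest-neighbor matcher are all illustrative stand-ins (my assumptions, not from the original post); a production system would use trained detectors and learned features at each stage.

```python
import numpy as np

def detect(image):
    """Detection: locate a face region. Placeholder: assume the whole frame."""
    h, w = image.shape
    return (0, 0, w, h)  # (x, y, width, height) bounding box

def align(image, box):
    """Alignment: crop the face and normalize it to a canonical 8x8 patch."""
    x, y, w, h = box
    face = image[y:y + h, x:x + w]
    rows = np.linspace(0, face.shape[0] - 1, 8).astype(int)
    cols = np.linspace(0, face.shape[1] - 1, 8).astype(int)
    return face[np.ix_(rows, cols)]  # crude subsampling as a stand-in

def extract_features(face):
    """Feature extraction: flatten pixels into a unit-length vector."""
    v = face.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def match(features, gallery):
    """Matching: nearest neighbor against enrolled feature vectors."""
    names = list(gallery)
    dists = [np.linalg.norm(features - gallery[n]) for n in names]
    return names[int(np.argmin(dists))]

def recognize(image, gallery):
    """Run the four modules in sequence: detect, align, extract, match."""
    box = detect(image)
    face = align(image, box)
    feats = extract_features(face)
    return match(feats, gallery)
```

Enrollment simply stores the extracted features per identity; recognition replays the same chain on a query image and returns the closest enrolled name.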

Detecting Expressions and Machine Learning

Paul Ekman has spent 40 years watching thousands of people and is the world authority on facial expressions. According to Ekman and other prominent scientists, facial expressions across the globe fall roughly into seven categories:

  • Sadness
  • Surprise
  • Anger
  • Contempt
  • Disgust
  • Fear
  • Happiness

Each of these categories has well-defined markers. With Surprise, for example, the upper eyelids and brows rise and the jaw drops open. An empathic system leverages these markers with machine learning algorithms such as principal component analysis (PCA), independent component analysis (ICA), linear discriminant analysis (LDA), neural network methods, AdaBoost-based methods, and other learning algorithms. The FERET database, an image corpus consisting of 14,051 eight-bit grayscale images of human heads, was assembled to support government-monitored testing and evaluation of face recognition algorithms using standardized tests and procedures. This database is available and can be leveraged as well.
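As an illustration of the PCA approach mentioned above, the sketch below projects face vectors onto their top principal components (the classic "eigenfaces" idea) and classifies a query by nearest neighbor in that reduced space. The synthetic data and the choice of nearest-neighbor matching are assumptions for demonstration, not the method of any specific system.

```python
import numpy as np

def pca_basis(X, k):
    """Fit PCA on row-vector data X; return the mean and top-k components."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Rows of Vt are the principal directions of the centered data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return mean, Vt[:k]

def project(x, mean, components):
    """Project a face vector into the low-dimensional eigenface space."""
    return components @ (x - mean)

def nearest_label(q, Z, labels):
    """Classify a projected query by its nearest projected training sample."""
    dists = np.linalg.norm(Z - q, axis=1)
    return labels[int(np.argmin(dists))]
```

In practice the training rows would be aligned, flattened face images labeled with one of the seven expression categories; the same project-then-match step then assigns a category to a new face.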

Empathy in Machine Learning Provides the Human Touch

Empathic applications have profound use cases across many domains, such as customer service, companion robotic systems, defense, and security. More importantly, these systems can also provide superior caregiving to people suffering from autism, cataracts, Alzheimer's, and osteoporosis.

Machine learning enables machines to know who, what, where, when, and why, so that the machines can anticipate and respond to our needs gracefully.

In the new era of machine learning and artificial intelligence, empathy matters. As empathic systems and applications continue to evolve, they're certainly not replacements for humans; rather, empathic machines and humans will coexist.

What Do You Think?

I’d like to hear your thoughts on the role empathy plays in machine learning. Share them here or on Twitter at @ChandranSar

      2 Comments
      Former Member

      Machine learning will have to go much further than merely recognising facial expressions (e.g. Paul Ekman's microgestures) to fully read and interpret body language. Whereas Ekman has identified some 3,000 microgestures, body language comprises around 700,000 gestures. Recognition and interpretation alone have limited value; true value is subsequently gained from managing the observed body language.

      Former Member

      Hi Tony,
      Completely agree. Three elements are critical in the context of an empathic machine learning system: body language, facial expression, and voice. We are at a very early stage of building such systems, but there is great hope for accelerated innovation, given the faster processing speeds and highly available systems that exist today.

      Thanks
      Chandran