Emotional Intelligence — The Next Online Frontier: Insights from Facebook’s Compassion Research Day
Forget artificial intelligence. Forget even business intelligence. The next intelligence frontier for enterprising online platforms is emotional intelligence.
Have you noticed those new bright yellow emoticons on Facebook? Facebook calls them “stickers,” and it turns out they are based on an evolutionary deep-dive into emotions as researched and discussed by none other than Charles Darwin himself in his 1872 book The Expression of the Emotions in Man and Animals. This is more than just child’s play for Facebook, signaling a heavy investment in a curriculum, of sorts, for “emotional intelligence”: not only for the platform itself, but for us humans who use it.
I spent last Thursday at Facebook in Menlo Park at their 4th annual Compassion Research Day, during which they invite anyone who is interested to their campus to hear “everything we’ve learned in the last year about what happens when you apply the science of how people relate to each other to social technology.” Team Facebook doesn’t do this in a vacuum: researchers, engineers, and user-experience folks, both from within Facebook and from heavily credentialed research institutions such as Yale, Berkeley, Claremont McKenna, and Stanford, were all on hand to talk about their work and some recent Facebook inventions.
I came expecting a serious talk about handling online bullying, but what I heard was potentially a lot more expansive. They did drill deeply into some recent related inventions, including the new anti-bullying hub at http://www.facebook.com/safety/bullying and a new reporting system that every US teen aged 13-16 can use when they encounter offensive posts, all backed up by this same heavy-hitting research and real-time “rapid feedback” collection.
All of this is underpinned by a more expansive mission beyond helping you report more abuse to Facebook. On the contrary — Facebook wants to help people relate to each other. In this Facebook, you, your kids, and your grand-kids would be able to learn to become emotionally intelligent.
At the outset, kids need help identifying their emotions when they’re being bullied, the researchers discovered, but at the same time they want to be careful not to make a big deal out of something that might not actually be such a big deal to the targeted individual. This is where their rapid-feedback system has come in handy: it helps determine why people report things on Facebook in the first place, what is really going on for the individual reporter and reportee, and how to help kids untangle it all.
And be assured Facebook is watching and collecting everything.
Said Facebook UI researcher Pete Fleming, “We do EVERYTHING to try to understand how you use Facebook, and we also work hard to develop new methods and tools to try to understand you better.” To that end, the new reporting tools are referred to as “conflict resolution” rather than just “reporting.” Take a look at the new flows here:
As you can see from these flows, their aim is to help kids first identify and understand their own feelings, and then to try to work it out with their counterpart if possible.
To be clear, sometimes you can’t work it out directly, and sometimes there are situations where Facebook needs to pull posts down. In addition, Nikki Staubli, a member of the Facebook safety team behind the new hub at http://www.facebook.com/safety/bullying, told me that if you click that “I feel like I might harm myself” link, her team gets the report, responds immediately, and provides local resources and help lines. “We stop their Facebook experience and put them in a checkpoint.”
But the majority of the time, says Emiliana Simon-Thomas from Berkeley’s Greater Good Science Center, “the very best thing for a person to do is send a message. It’s not going to help to send it to Facebook. Facebook is not going to be able to do anything, but you can resolve it with your friend.”
Coupled with their new rapid-feedback data collection, Facebook can tell that this approach seems to be working. Paul Piff, also at Berkeley’s Greater Good Science Center, says of the new real-time tools, “We have the ability to capture what people are feeling online as they are feeling it. As a function of using these tools that make them more empathetic, people are feeling better towards the other person and other people.” Put another way by Simon-Thomas, the new frameworks serve to “contribute not just to individual wellness but greater community wellness.”
Dacher Keltner, also from The Greater Good Science Center, added that the point of the new built-in stickers is “To equip people with more ability to communicate authentically in a place where facial expressions aren’t available.” Keltner, who helped design and bring the stickers to Facebook, added that “Facebook is a place where we’re teaching each other to become emotionally intelligent.”
But he takes it a step further, hinting at what might be in the future, when referring to the amazing granularity of vocal expression:
“If we really want to make Facebook emotionally rich, we need to concentrate on the voice.”
Vocal Bursts: When a smile isn’t enough
The ultimate promise according to Keltner is no less than “building into the Facebook experience the languages evolution has crafted for millions of years,” an “unprecedented science.”
And you thought it was just about cats.
It’s becoming more and more common not just on social networks but also in business settings to actually get down to emotions. Here at SAP we’ve pioneered design thinking around empathy under the shepherding of community advocate Marilyn Pratt and others. And though we think we’re more likely to spend our cycles immersed in business intelligence at SAP, it turns out there’s a whole other intelligence underpinning a lot of our user experience wherever we turn.
There actually is a wall at Facebook
As we rely more and more on online worlds, remember that technologists of all types are thinking hard about what you write on that wall and working on how to engineer more emotional intelligence into all of our interfaces. Is it really any more of a stretch to foresee a future in which your business tools offer flows that prompt “Are those salaries balanced properly for your workers’ quality of life?” “Could you trim carbon emissions in this area?” “Did you really want to think for the short term, or do you want to extrapolate for your children’s or grandchildren’s futures?” What questions would you ask to help create the future you want to see?
Watch the recordings from Facebook’s 4th Compassion Research Day:
Very nice - I still don't have the hang of Facebook yet (I still feel very newbie) but am grateful for the interaction with family members who live far away.
Thank you for sharing this; something to think about
This fits in with what I have been thinking of for some time already. We've gone past the bits and bytes of the cold, logical machine, and we're moving more and more in the direction of "situationally aware IT" (or emotional IT).
I've seen some examples already of how to use sentiment analysis. Social media are a prime playing ground for experimenting with that. Sooner or later, it will also find its way into mainstream businesses.
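To make that concrete, the simplest form of sentiment analysis is word-list scoring: count positive and negative words in a message. This is a toy sketch; the word lists and scoring rule below are my own illustrative assumptions, not any real product's logic.

```python
import re

# Tiny hand-made word lists -- illustrative assumptions only.
POSITIVE = {"great", "love", "thanks", "happy", "awesome"}
NEGATIVE = {"hate", "awful", "stupid", "angry", "terrible"}

def sentiment_score(text: str) -> int:
    """Return (#positive words - #negative words) for a message."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("I love this, thanks"))   # positive score
print(sentiment_score("this is awful and stupid"))  # negative score
```

Real systems use trained classifiers rather than fixed lists, but the idea of turning free text into a mood signal is the same.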
It's not a coincidence that psychology has made its way into IT projects (gamification, design thinking, human-machine interaction, ...).
But what if your IT itself could detect that you are in a bad mood today, so you probably should not send this foul-mouthed mail to your colleague for not responding in time...
"Emotional IT" -- thanks for that reference Tom! brilliant. I think that's it. I think we can't ignore that we're getting our hands on technology at younger and younger ages, which means we may be saying a lot of things that remain permanently online before we've learned a whole lot about relationships (not to mention, working and getting a job in the first place, and then keeping one in the face of being able to send that foul-mouthed mail).
What do you think about Facebook self-assigning this role? On the one hand, I found it a bit daunting to think we'd relegate teaching emotional intelligence to a social enterprise like Facebook, but I warmed to the idea during the day, because if they're there already, at least Facebook is working on the task.
I have mixed feelings about what Facebook might be planning.
- Letting people decide what is "bad" and then making them go through a process to identify and classify why it's bad is cumbersome. It's the easy way out for Facebook.
- But imagine that Facebook would auto-detect what is bad. That would be creepy. You would get the feeling that someone is watching every move you make.
- No doubt this early stage of user-triggered filtering is intended as a self-learning mechanism that will eventually lead to auto-filtering. So it goes from cumbersome to creepy.
So the big question is: how will they package this, to make it acceptable to the users?
also: why limit to age-groups 13-16?
I seem to remember Google using a bit of situationally aware logic in Android. In the early days, there was a feature in Android which would detect the number of spelling and grammar mistakes in your text messages. If the algorithm decided that you were having trouble typing your message, and it was past 11 o'clock in the evening, you'd get a popup saying:
"You may be drunk, are you sure you want to send this message right now?"
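The heuristic described above is easy to sketch. This is a hypothetical reconstruction, not Android's actual logic: the dictionary, the error-ratio threshold, and the 11 p.m. cutoff are all my own assumptions.

```python
from datetime import time

# Hypothetical reconstruction of the "you may be drunk" check described above.
# The dictionary, 40% error threshold, and 11 p.m. cutoff are assumptions.
DICTIONARY = {"are", "you", "sure", "want", "to", "send", "this",
              "hey", "i", "am", "fine", "see", "tomorrow"}

def misspelling_ratio(message: str) -> float:
    """Fraction of words not found in the dictionary."""
    words = message.lower().split()
    if not words:
        return 0.0
    unknown = sum(w not in DICTIONARY for w in words)
    return unknown / len(words)

def should_warn(message: str, sent_at: time,
                max_errors: float = 0.4, cutoff: time = time(23, 0)) -> bool:
    """Warn only when it's late AND the message looks error-ridden."""
    return sent_at >= cutoff and misspelling_ratio(message) > max_errors
```

A clean message, or a sloppy one sent in the afternoon, would pass through silently; only the late-night, typo-heavy combination would trigger the popup.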
I'm not sure that Facebook thinks it's the easy way out to try to bring emotional intelligence curriculum to their end-users -- but I agree on the mixed feelings. If this is the goal, I'd also like to see some curriculum in schools too. As for auto-filtering, I didn't detect a hint of that in the future. However, they did suggest they're investigating platforms for people younger than 13 -- and many of the platforms my kid plays on filter text, which seems to be a good thing for young ages.
This is why the age limits start at 13 -- but many questioned why stop at 16.
And that "you may be drunk..." sounds like an April 1 algorithm 🙂
I have been wanting to get back to this post for a few days, as I responded on Facebook (of all places). Thanks for writing the post.
As to whether this is the right thing for Facebook to be doing? I think they are best placed to be doing it, and if they can't, then I don't think anyone will.
I do think it is slightly ironic, though, that a site originally devised and deployed by the more emotionally illiterate among us is now helping people become more emotionally literate.
I hope their motivations are pure and not just another way to mine more information about us to sell us yet more ads.
Thanks again for the write-up and I hope this is the start of good things to come.
ps it is all about cats ... this emotional intelligence stuff is just a red herring to feed the cats.
Thanks for that thoughtful contribution Nigel. It's true that Facebook has a somewhat rare position at the moment - still a relatively 'walled garden' where kids can explore being online in groups. No doubt that will be disrupted and potentially unseated, but it's a concern if properties like ask.fm or snapchat do the unseating, because those seem to be totally unmoderated at present.
Not sure about the emotional illiteracy part -- I didn't see the Zuckerberg movie, so I probably missed the part showing that, despite being schooled at a respected college institution, you can still be void of emotional good intentions. What I can say from what I observed that day is that there is a legion of truly dedicated researchers and user-experience folks paying great attention to this. I started the day skeptical and wound up feeling pretty good about the whole endeavor.
Looking forward to staying tuned indeed.