Diversity in Tech for Technologists: You’re part of the problem. You can be part of the solution.
Adapted from SAP Mentors Webinar April 9 – (Disclaimer: Views my own)
If you’re a technologist and you’re sitting it out because you think the buzz-phrase “diversity in tech” doesn’t apply to you, I challenge you to recognize that you’re part of the problem. Diversity isn’t just an HR issue, a talent-pipeline issue, a recruiting question, or an embarrassing statistic for major tech corporations. It’s a technology issue – and as a technologist, you can do something about it.
It’s been a month since the Lesbians Who Tech Summit 2018 and this may be my favorite, lingering insight.
Key themes from Lesbians Who Tech Summit 2018 – images via Camille Eddy
Since the organizers at Lesbians Who Tech worked hard to intentionally include a variety of marginalized technologists – featuring 100% female speakers, 50% speakers of color, 10% transgender and non-binary speakers – it’s no coincidence that key themes about bias in the machine were repeated across various eye-opening sessions. As technologists, it is especially our responsibility to recognize when the status quo in critical emerging technologies such as machine learning, artificial intelligence, and the algorithm economy is inherently biased – or else face the unintended consequences.
Camille Eddy: The Cultural Bias in AI – images via Camille Eddy
The Cultural Bias In AI
Camille Eddy’s brilliant talk at the Summit on the cultural bias that sneaks into AI (find her similar talk online) invites us to do just this. Too often, she said, the data sets we use to train the models that inform our mathematical algorithms are not actually diverse, and if we’ve failed to source diverse data sets, the foundation of our technology is biased and skewed such that it cannot even recognize the diverse people who use it. “What happens,” she asks, “when any segment of the population is invisible to the technology we use?”
Segregated technology happens. In her work with robotics, Camille relates how people of color are often invisible to or misidentified by facial recognition. You may have heard of the soap dispenser whose sensor couldn’t recognize black skin. You may also know that some social media photo filters have trouble optimizing black skin the same way they optimize white skin. Did you know there are optical sensors in fitness trackers that struggle to properly measure pulse through black skin? When wearables fail to source diverse data, the consequences in our tech-reliant world can be life-threatening.
How can we prevent biased technology from being released into the market? And when we fail to prevent it, Camille asks, “How do we actually interrupt the bias as it happens?”
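One concrete way to catch this kind of bias before release is to audit a model’s error rate per demographic group rather than looking only at an overall accuracy number. Below is a minimal sketch in Python; the group labels and audit records are hypothetical, invented purely for illustration, not taken from any real product’s test data:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute per-group error rates for a classifier's predictions.

    records: iterable of (group, actual, predicted) tuples --
    e.g. a facial-recognition test set tagged by skin tone.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, actual, predicted in records:
        totals[group] += 1
        if actual != predicted:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: (skin_tone, true_label, predicted_label)
audit = [
    ("lighter", "match", "match"), ("lighter", "match", "match"),
    ("lighter", "match", "match"), ("lighter", "match", "no-match"),
    ("darker", "match", "match"), ("darker", "match", "no-match"),
    ("darker", "match", "no-match"), ("darker", "match", "no-match"),
]
rates = error_rates_by_group(audit)
# A large gap between groups (here 25% vs. 75%) is the red flag
# to interrupt before shipping, not after.
```

An overall error rate on this hypothetical data would be 50% and would hide the fact that one group fails three times as often as the other; disaggregating by group is what surfaces the problem.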
Filipe Roloff, in support of a colleague recovering from being attacked
In case you are tempted to think of this as “just” a black thing or a gay thing or a female thing, consider that there are 7.6 billion of us on planet earth – each of us different. At SAP alone, we have people on the spectrum bringing special talents, people in wheelchairs helping us problem-solve, blind people helping make our user interface better, veterans, people who’ve immigrated and adapted to wildly different situations, and yes, even straight white cis males – all of us together helping make better technology just by being who we are – yet each of us being reduced to a limited set of descriptors in algorithms used to profile us, serve us more technology, and influence us.
Sandberg and Swisher at Lesbians Who Tech Summit 2018 – images via Twitter and Lesbians Who Tech
When Bias Leads to Weaponization
Probably no company is having a more public challenge about data profiling right now than Facebook. Beyond the current sensationalism in data profiling, protection, and privacy, another important question I fear we risk missing is about the potential for weaponization of bias.
What are tech providers’ roles now and going forward in tempering technology that serves and exploits bias – in essence manipulating our own confirmation bias in turn with biased algorithms – to such a massive degree that it can influence elections — the very fabric of society?
Should tech providers also have been responsible for predicting this would happen? As Kara Swisher asked Sheryl Sandberg in their talk at the Summit, “How did you not predict your platform would be weaponized?”
Sandberg admitted: “We’re playing catch-up” – yet this is far from just Facebook’s issue or a matter of recruiting diversity. It’s not just somebody else’s problem, for example, that fewer than 2% of technologists at our elite Silicon Valley companies are black. If we’re not all asking questions and testing for bias – or for the intent to cause harm (to ourselves or others) – we can consider ourselves all part of the problem.
I, too – a gay white female in tech who thinks and writes often about diversity issues – am part of the problem, with a network as inherently biased as that of any other single person in tech.
Even as magical as the Lesbians Who Tech Summit 2018 was, its wild conference diversity did not happen automatically. Lesbians Who Tech had to fight (others and themselves) to be intentional, inclusive, intersectional. (Trigger alert: the (other) “q” word may appear).
Which is the good news: that means we all can do it. We can question our bias. And we can test for it.
Disarming The Bias
There are three phases I think technologists need to go through to address bias in technology.
Get over it.
Pardon my directness, but first, we can check our privilege and get over (or promise to just try!) our issues with words like “quota” that might actually serve a productive purpose towards the diversity goals we say we have.
Challenge the room.
Then, we can take a look at who surrounds us and challenge it: Is there a diverse tribe in this room? Does this meeting room represent the communities in which we actually live? The world at large? The people who are going to use this tech or rely on it for their lives?
Be the technology.
Finally, if any of that makes us uncomfortable: we’re technologists, and as technologists, we can test our code for the same bias. Look at the diversity – or lack of it – in your data sample, and at how your technology behaves when it is used by a variety of people. There are also emerging services (built by, yes, technologists) designed specifically to help interrupt bias in tech.
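Testing a data sample for diversity can start very simply: compare each group’s share of your training sample against a reference population. Here is a minimal sketch; the group labels, reference shares, and tolerance threshold are all hypothetical assumptions for illustration, not any particular product’s methodology:

```python
from collections import Counter

def representation_gaps(sample_groups, reference_shares, tolerance=0.05):
    """Flag groups whose share of the data sample deviates from a
    reference population by more than `tolerance`.

    sample_groups: list of group labels, one per training record.
    reference_shares: dict mapping group -> expected share (sums to 1).
    Returns {group: observed_share - expected_share} for flagged groups.
    """
    counts = Counter(sample_groups)
    n = len(sample_groups)
    gaps = {}
    for group, expected in reference_shares.items():
        observed = counts.get(group, 0) / n
        if abs(observed - expected) > tolerance:
            gaps[group] = round(observed - expected, 3)
    return gaps

# Hypothetical training sample that over-represents group "a"
# and under-represents group "b" relative to the reference.
sample = ["a"] * 80 + ["b"] * 15 + ["c"] * 5
reference = {"a": 0.6, "b": 0.3, "c": 0.1}
gaps = representation_gaps(sample, reference)
```

A check like this won’t catch every kind of bias, but it turns “is our data diverse?” from a vague question into a test that can fail a build.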
Three inspirations from Lesbians Who Tech include:
Ana Arriola – images via Twitter
- Ana Arriola from Facebook talked at the Summit about how they are exploring “benevolent uses of AI,” such as detecting suicidal intent in a post, or ensuring we never mis-gender or harm anyone online.
- Aubrey Blanche from Atlassian talked about technology to review for and interrupt biases and gendered words in job descriptions (https://textio.com/), which echoes SAP’s own Business Beyond Bias.
- To help you get beyond your own biased hiring networks (yes, all of our networks are actually inherently biased), Lesbians Who Tech is architecting a tool called include.io (https://include.io/) to help validate nontraditional and underrepresented talent.
And there are many other tools coming out or not even invented yet in this essential space. Perhaps you will create one.
You can be part of the solution.
Thank you Moya Watson for this great recap. Thank you for your great work in the community too!
Back atcha on both counts --
In the community it's the same thing, isn't it? We have to operate intentionally instead of thinking community just happens by itself. I'm super proud that I get to work in this area, encouraging people to share their voices, and I feel like it makes a difference. SAP Cloud Platform is currently the most used tag in blogging, and if me and you and others in the community getting online saying "we hear you -- your voice belongs here" is what it takes, I'll keep doing that.
Very interesting thoughts. I agree with most of them. (Although in my house I am a technologist and my husband the homemaker)
Did you talk about fewer people graduating with a STEM degree? Females and males alike are avoiding these degrees. It makes it hard to find a qualified person. Add to that, finding diversity is a challenge. As you pointed out, the people who live in the area might be predominantly Caucasian. So location does become a factor. I think that is slowly going away. I work at a home office. I know others that do as well.
Just plain finding people willing to speak is a hard job as well. Vegas TechEd is a great example. Balancing out that diversity with willing and able speakers. (You and Tammy did a great job) But the SAP sessions - I found very few women. (any?) Not to say SAP is not diverse - I can read and see that it is.
It is hard to press on and share knowledge, men or women, because we might simply say something wrong. Plus there is the whole audience thing.
But I digress a bit. Totally agree with your blog. Not sure if I can follow those steps.
I am passionate about diversity and always have been. I still think it might start in high school or grade school even. Children form their outlooks from teachers, friends and parents.
I'm laughing a little about this - my niece and a good friend of ours are both going for a teaching degree. Maybe once they get into the classroom for a couple years they could think about diversity.
I really loved this blog. I missed the call - work got in the way again. So this is a nice, new recap for me. As usual probably too long of a response. I am wordy.
Thank you Michelle for reading and for these thoughtful comments!
>Females and Males alike are avoiding these degrees. It makes it hard to find a qualified person. Add to that, finding diversity is a challenge.
While this isn’t the only game in town, through our sponsorship and efforts of the awesome recruiting team at the Lesbians Who Tech Summit, we have a resume pipeline of 4000+ tech candidates from around the world, largely women and a good percentage people of color. Send whomever is having a struggle finding diversity to me – or to a conference like this 🙂
>Not to say SAP is not diverse – I can read and see that it is.
Oh no, I completely agree with you. Frankly <views my own but it’s obvious to see> the technologists we, SAP, put on stage (internally and externally) are mostly homogeneous. When we see women we usually see them in supporting, communications roles moderating the discussions but not usually dispensing technical strategy.
Bjoern Goerke did a fantastic job on the Bridge at Star Trek TechEd last year -- I believe he did it with intention (see 'quota'/etcetera). But this is far from what we usually see (which is part of why it was so electrifying, I believe totally activating dormant parts of our organization). This is why we try to raise this issue internally and externally. Just as other tech companies, we have relatively poor ratios especially in the tech positions.
This is where we really have to challenge our attitudes towards “q” word – which is what I meant by ‘get over it’ – I absolutely do not mean to let go of anything when you see it’s wrong (perhaps I should edit?). Call it what you will and maybe "intentionality" will be a less charged word -- but it doesn't happen automatically.
It’s like this word “quota” has become such a dirty word we can’t even explore why it might make a difference, as Leanne Pittsford writes in the linked article (far better than I can articulate). The “two people equally qualified” – if you really do find yourself in that situation, we can talk then, but we’re not even finding these folks because we’re looking within our own biased networks to start… Setting goals, aspirations, yes quotas, causes a fundamental shift. <disclaimer: not saying anything on behalf of SAP since I know how sensitive folks are about this word>
>I still think it might start in high school or grade school even.
Absolutely right – it starts in grade school. That’s also a great point.
and no you’re never ‘too wordy’ -- it’s a total honor to know you’ve read this and you’ve shared your thoughts on this. Maybe let’s do blog ping-pong?:)
Agreed. There are so many different ways we are diverse. Including sexual orientation. (I hope I wrote that right), US vs a different country. Aging technologist - oh no, that's me. Heavy people vs. very fit people. Perhaps even medical issues. Yes, I had a company I was working at ask about mine. My Doctor wasn't very happy to say the least.
In Michelle's perfect world, the q word would be gone. Everyone would be hired based on their experience. OK, so my perfect world would have animals in the workplace, peace on Earth, and nothing that would cause Global warming. And of course all disease would be gone. None of my four legged friends would die....
So do you get the picture? I think full equality is going to be VERY hard to obtain. Until that happens, maybe the "Q" word is not so bad. Although honestly, who wants to work for a company that hires them just because XYZ. I guess we just work harder to prove ourselves.
I can't believe that we won't get there. It's been really recently that anyone could "come out of the closet". Now I still think it takes a lot of courage to do so, BUT things are better.
Now my brain is thinking about a blog - one that echoes this one without repeating it....
The Challenge: #BlogoPolo! You're it! 🙂
You're absolutely right: there are 7.6 billion ways to be different (actually more -- all who have come before, and all who will ever be). I think our start now is to accept we are all part of the bias issue, and to do what humans do: try, fail, try again, and be different.
That's horrible about the medical thing: I sincerely hope we are not heading towards a world of lack of privacy on medical issues because of needing employment... about the age thing? I bet I'm older than you -- shhhhh: don't tell anyone 🙂
One of the things that always interests me about conversations around bias and diversity is the fact that most people only think of measurable traits, like skin color, gender, age, cultural background. What we sometimes fail to consider is that each of those things contributes to diversity of experience and perspective, which is the true value in a diverse group of minds collaborating.

And when I say "most people," I mean it's typically the policymakers within organizations who take a misguided approach to increasing diversity that is built around -- you said it, "quotas." The problem with this is the assumption that, for example, all females have the same experience and perspective; therefore, if we hire X number of females, we will have a representative sample.

I think the issue really needs to be addressed at a "how we hire people" level. Diversity shouldn't be a check box. Instead, it should be a fundamental part of the recruiting process. "Tell me how you are different. Tell me about some unique experiences you've had that have shaped how you approach problems. Tell me about things you've struggled with and overcome." I think this gives a much better insight into the diversity of a group than whether you have enough government-defined minority groups represented.

I want to go about diversity in a smarter way than we have in the past (we, being the tech industry). I don't think quotas work; they create resentment (even from those hired based on the quota) and questioning of whether people belong in their jobs, even if they're truly the most qualified. There's always that question. We have to get away from that.
BTW completely agree with your point Moya Watson on biased networks. I wrote about similar around National Women's Day and how this whole "thank you to all the women who helped me get where I am today!" nonsense doesn't actually serve women. Instead, we need to be reaching out and down to help raise up others. Cheers 🙂
tell me the link to that? i really resisted the idea that our networks are biased -- how could MINE be biased when I'm so "diverse" whatever the heck that is -- yet, indeed, i am part of the problem.
so hard and important to step out of comfort zones...
Here's the post (published on LinkedIn because it was a personal, non-SAP thing)...
"Diversity shouldn't be a check box" <-- boom, exactly. Speaking as someone who often feels like a 'check box' when I'd rather be a whole person, though, measuring things (and sadly reducing people to descriptors) is usually all we have, and if we say we want it to be better, and it may be a tool to help, then we tackle it - one descriptor at a time.
Michelle Crapo great points. Just as an FYI, my daughter's school starts the kids on programming already in 1st grade!
It's just so amazing how we start equal... and, perhaps, telling that it's around puberty when the gap widens.
This makes me want to embed Dar Williams When I Was A Boy. perhaps best saved for another round of bloggo-polo...