What would you do if you got your hands on some big data?
Maybe it’s an entire data center’s worth… purchase histories, demographics, social connections, medical records – whatever you’d like.
And let’s say you also have a platform to reach a large audience… Millions of buyers, customers, prospects, or any market segmentation of your choice.
What would you do… sell the information? Maybe use analytics for better list selection or product recommendations? Forecast future complaints, revenue, or illnesses?
Now more importantly, what would guide those decisions — how would you know what is a right or wrong action?
Hence the topic of big data morality: pursuing the value that big data can bring in light of the values for which a company stands. We’ve heard how companies must plan for the technical challenges of big data in volume, velocity, and variety – but we must also consider (before the fact!) the morality of the analyses and the potential insights discovered.
Not so sure this matters? From an external perspective, imagine the potential pushback when performing an analysis that users don’t agree with. In the case of Facebook’s emotional experiment, Kashmir Hill summed it up nicely: while the findings of a test may be neat, “…manipulating unknowing users’ emotional states to get there puts Facebook’s big toe on that creepy line.” And we know how quickly a company’s key metrics can be impacted when users respond to undesirable news, as in the case of Netflix’s announcement of Qwikster…
Granted, the guidance to be responsible and honest isn’t a new phenomenon; there are similar rallying cries behind corporate social responsibility and corporate sustainability. But the increased, and potentially more personal, impact – for better or worse – of working in big data makes it unique.
Take, for example, the data captured by the Narrative Clip – a photo every 30 seconds of our lives. While this could be quite invasive to your and others’ privacy, I hope you are also glad to hear that the CEO of Narrative, Martin Kallstrom, says one of the two most important things in designing wearables is honesty. Hence, there is no off button: if the device is out, it’s taking pictures.
So knowing that we should strive for big data morality, what are some ways we can realize it? One way is to compare your actions with your company’s values: does anything stand out or seem questionable, and how can you make adjustments? We can also learn from the great work others have started, such as the practices seen in Responsible Innovation or the Data Science Association’s Data Science Code of Professional Conduct.
Other ways to promote big data morality may simply start with open communication within your company: standing in your customers’ shoes and considering how findings will be perceived, gathering external and varied internal opinions, moderating internal debate, or playing devil’s advocate.
Just as we’re seeing data privacy and security used more and more as a competitive advantage or point of differentiation for brands, so too will be the moral use of data and analytics. Don’t just be good in and at your big data analytics practices – do good!