Werner Baumbach

Trolls, Bots and the Bowling Green Massacre

At university in the mid-nineties, one of my Brazilian classmates was thrilled that he could check news from home via Gopher (if you remember this protocol, please press like; if you don't, just click on it softly). Gopher powered an early information service. Back then, when we came home from a vacation, we would ask friends and family what had happened around the world in the last two weeks – switching off was much easier. Gopher and similar services of the early internet age have evolved into today's media landscape. Now that our smartphones have more power than the PCs of the internet pioneers' day (but also cost more), we feel comprehensively and well informed, around the clock, around the globe and with countless sources. In theory, knowing "what is going on" and understanding the world is much easier.

My feeling, though, is that we may have passed a turning point in this information age and only vaguely noticed it. I am not primarily referring to blunt misinformation or faulty news articles, sometimes described as fake news – even though these are scary enough. Distorting the facts, massaging the numbers and putting lipstick on pigs is nothing new. When Erik the Red landed on an icy desert in the middle of the North Atlantic, he called it Greenland to make it sound more attractive to the fellow Vikings who were supposed to follow (unlike Iceland, which is actually much greener). But we now seem to have entered the post-factual age: instead of lies and fiction, we now speak of alternate facts. Some of these alternate facts should be plainly obvious or can be proven incorrect with very little effort. But in our world of hyper-connectivity, mass media and social networks, the bigger threats are the subtler, more complex efforts to shape public opinion, fuel sentiments and manipulate behavior. One such elaborate method is called astroturfing, where several seemingly unconnected sources and contributors are used to create an overall impression while masking the real sponsor or initiator of the activity. We empowered social media natives may feel confident that we have done our research and cross-checked our facts, and not realize that all the pieces originate from the same source. With this technique, products are promoted, public opinion is shaped, and elections may be manipulated or at least heavily influenced. (If you are interested in hearing more about astroturfing, check out this TED Talk.)
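To make the astroturfing pattern a bit more concrete, here is a minimal sketch of my own (the account names, sample posts and the 0.5 similarity threshold are purely illustrative assumptions, not a real detection system): posts from nominally independent accounts are compared via the Jaccard similarity of their word shingles, and suspiciously similar pairs are flagged as possibly coordinated.

```python
from itertools import combinations

def shingles(text, n=3):
    """Split text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets, between 0.0 and 1.0."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_coordinated(posts, threshold=0.5):
    """Return account pairs whose posts are suspiciously similar."""
    suspicious = []
    for (acc1, t1), (acc2, t2) in combinations(posts.items(), 2):
        if jaccard(shingles(t1), shingles(t2)) >= threshold:
            suspicious.append((acc1, acc2))
    return suspicious

# Hypothetical sample data: two accounts pushing near-identical praise.
posts = {
    "@user_a": "this product changed my life highly recommend to everyone",
    "@user_b": "this product changed my life highly recommend it to all",
    "@user_c": "mediocre at best would not buy again",
}
print(flag_coordinated(posts))  # flags the @user_a / @user_b pair
```

Real campaigns paraphrase rather than copy, of course, so anything serious would need semantic similarity, posting-time correlation and network analysis – but even this crude check illustrates the principle of looking for a shared origin behind supposedly independent voices.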

As in our private lives and across society, this clearly also has an impact on business operations. We all know that informed decisions are vital for our success. Triggering action based on insights rather than gut feeling can make the difference. With data being so vital, ensuring that it is accurate remains key and will become increasingly challenging. Social media can be used to influence businesses just as well as the "end user". Think of sentiment analysis solutions that are widely used to track, for example, the performance of marketing campaigns or products. Or consider your activities during business planning or supplier selection. What sources do you use? Which of them can you trust? How can you double-check information? Partly, I believe that we can use our own data and new sources (e.g. sensors) to validate hypotheses. We can match these insights against other sources, validating their quality and trustworthiness. And I think we need to give some space back to our instincts instead of blindly trusting results when we do not know how they were created. If something feels off, it is often worth rechecking findings before acting – whether in our personal lives, financial decisions, business operations or, for example, politics.
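The cross-checking idea above can be sketched in a few lines. This is my own toy illustration, assuming a crude word-list sentiment model (the word lists, sample texts and 0.1 tolerance are made up for the example, nothing like a production sentiment engine): score two supposedly independent sources separately and flag a large divergence as a reason to recheck before acting.

```python
# Tiny illustrative sentiment lexicons (an assumption for this sketch).
POSITIVE = {"great", "love", "excellent", "good", "amazing"}
NEGATIVE = {"bad", "awful", "terrible", "poor", "broken"}

def sentiment_score(texts):
    """Crude lexicon score: (positive - negative hits) / total words."""
    pos = neg = total = 0
    for text in texts:
        for word in text.lower().split():
            total += 1
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    return (pos - neg) / total if total else 0.0

def sources_disagree(source_a, source_b, tolerance=0.1):
    """Flag when two supposedly independent sources diverge in sentiment."""
    return abs(sentiment_score(source_a) - sentiment_score(source_b)) > tolerance

# Hypothetical data: glowing social posts vs. grim support tickets.
social = ["great product love it", "amazing excellent"]
tickets = ["device broken bad experience", "terrible poor quality"]
print(sources_disagree(social, tickets))  # strong divergence: worth rechecking
```

If the two signals agree, confidence in both grows; if they diverge this sharply, one of them (perhaps the one that can be gamed from outside, like social media) deserves a closer look before any decision is triggered.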


      1 Comment
      Jelena Perfiljeva

      Correction: it should be "alternative facts" ("alternate" has a different meaning).

      Good blog and good subject. If a company relies on social media analysis for its business decisions, would it be easy to distort that data by using troll bots, for example?

      On the bright side, spam was a major problem in the early days, but recently the technology to filter it out has improved greatly. There are already sites that can spot fake reviews, for example. I believe technology-based solutions will eventually arise to deal with "alternative facts" too. It does seem like a technology merry-go-round, though: one technology creates a problem that we need another technology to solve. I guess some might call it "job security". 🙂

      Thank you for sharing!