Ethical decisions are rarely easy. Nowadays, even less so. Smart machines, cheap computation and vast amounts of consumer data not only offer incredible opportunities for modern organizations but also present a moral quandary: Is it OK, as long as it’s legal?
Algorithmic bias can take many forms — it is not always as clear-cut as racism in criminal sentencing or gender discrimination in hiring. Sometimes too much truth is just as dangerous. In 2013, an academic paper demonstrated that Facebook “likes” (which were publicly open by default at the time) could be used to predict a range of highly sensitive personal attributes, including sexual orientation, gender, ethnicity, religious and political views, personality traits, use of addictive substances, parental separation and age.
When they published their study, the researchers acknowledged that their findings could be misused, for example by third parties seeking to discriminate. But where others saw danger and risk, one of the authors’ colleagues at Cambridge University saw opportunity. In early 2014, Cambridge Analytica, a British political consulting firm, signed a deal with that colleague for a private venture that would capitalize on the trio’s research.
The colleague created a personality quiz, taking advantage of an initiative at Facebook that allowed third-party apps to access user data. An estimated 300,000 users took the quiz. It later emerged that Cambridge Analytica exploited the data to access and build profiles on 87 million Facebook users. Arguably, neither Facebook’s nor Cambridge Analytica’s decisions were strictly illegal, but in hindsight, and in the context of the scandal the program soon unleashed, they could hardly be called good judgment calls.
Over the past decade, Apple has been criticized for taking the opposite stance from peers like Facebook and Google on many of these issues. Unlike them, Apple runs a closed ecosystem with tight controls: You can’t load software on an iPhone unless Apple has authorized it.
While Facebook’s actions may have been within the letter of the law, and within the bounds of industry practice at the time, they did not have users’ best interests at heart. There may be a simple reason for this: Apple sells products to consumers. At Facebook, the product is the consumer. Facebook sells consumers to advertisers.
Your customers will expect you to use their data to create personalized, anticipatory services for them, while demanding that you prevent the inappropriate use and manipulation of their information. As you search for your own moral compass, one principle is clear: You can’t serve two masters.