Facebook currently has a Cambridge Analytica problem. It is under severe pressure to explain how the personal data of 87 million users was leaked and to offer assurances that it will not happen again. Beyond the US, Cambridge Analytica has been a player in multiple elections in Kenya and Nigeria.
This month Mark Zuckerberg testified before the US Congress. The biggest revelation of that episode was that America’s lawmakers have very little understanding of how Facebook works, and they missed a key opportunity to engage deeply with the problems at the heart of Facebook’s business model and practices.
Thanks to the overall weak line of questioning, Zuckerberg’s net worth rose $3 billion during the testimony.
Deleting Isn’t An Option
Users are outraged, and some are deleting their accounts under the #DeleteFacebook banner. It seems, though, that while many people get angry, most do little more than tut-tut.
It’s worth remembering that to actually delete your Facebook account is a privilege, as New York Times reporter Sheera Frenkel tweeted. “For much of the world, Facebook is the internet and only way to connect to family/friend/business.”
From an ICT4D perspective, the people we serve, who count on us to understand how the technology and the data work, need Facebook. And so do we, in the ICT4D offerings we deliver through WhatsApp, Messenger and Groups.
Many ICT4D orgs continue to ride the wave of the stellar uptake of Facebook and its owned services, utilising the reach, communication and engagement opportunities these offer, for example, through Free Basics.
We Do No Harm, Right?
Could the ICT4D movement have its own Facebook-Cambridge Analytica moment? The answer is yes, of course. To prevent it, or at least delay it, we need to focus vigilantly on data privacy and interrogate the choices we make in offering our services.
Knowing that external platforms that vacuum up data can be hazardous, the ICT4D community needs to reaffirm its commitment to do no harm and to ensure data privacy and security.
We’re the good guys: we are transparent with individuals whose data are collected by explaining how our initiatives will use and protect their data; we protect their data; our consent forms are written in the local language and are easily understood by the individuals whose data are being collected.
Nice words, but do we really implement them?
How Careful Are We?
Below are a few questions to ponder in the context of Cambridge Analytica-Facebook.
- Access: WIRED magazine shows you how to download and read your Facebook data. Does your app or service allow users to do the same?
- Recourse: Again, drawing on the GDPR (you can tell I’m a big fan), how easy is it for our users to contact us, request that their data be removed, or ask for the algorithm that profiles them to be explained? Do we have the capacity to meet these demands?
- Protection: Where is the data that you collect about users? What measures have you put in place to safeguard it?
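To make the Access and Recourse questions concrete, here is a minimal sketch of what a data-access and erasure capability could look like. All names here (UserDataStore, export_user_data, erase_user_data) are hypothetical illustrations, not any particular project's API; a real system would sit on top of an actual database and authentication layer.

```python
import json


class UserDataStore:
    """Toy in-memory store standing in for a real project database."""

    def __init__(self):
        self._records = {}

    def save(self, user_id, record):
        """Collect a data point about a user."""
        self._records.setdefault(user_id, []).append(record)

    def export_user_data(self, user_id):
        """Access: return everything held about a user, machine-readable."""
        return json.dumps(
            {"user_id": user_id, "records": self._records.get(user_id, [])},
            indent=2,
        )

    def erase_user_data(self, user_id):
        """Recourse: honour a deletion request; return how many records went."""
        return len(self._records.pop(user_id, []))


store = UserDataStore()
store.save("amina", {"survey": "health-2018", "location": "Kibera"})
print(store.export_user_data("amina"))   # the user can read what we hold
print(store.erase_user_data("amina"))    # and have it deleted on request
```

The point of the sketch is only that both operations should be first-class features of our systems, designed in from the start, rather than manual scrambles when a request arrives.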
Solving these challenges will take real creativity.
How Are You Transparent and Safe?
So, how is your project practicing radical transparency? Have you had to explain your actions to your users, have you been requested to delete data? Pre-emptively, in what ways have you engaged the community to explain exactly what you are doing?
Please do share your experiences.
There is value in creating templates for radically understandable ethics forms, processes for data download and explanations.
While the scale of the risk is lower for us than for Facebook in sheer numbers of affected users, the issues are no less grave. Perhaps in ICT4D, because we often come as non-profits and development agents rather than as commercial entities, data protection matters even more than it does for Facebook. We come as people who are there to help. If we fail to do no harm, how terrible is that?
We need to make sure our house is in order before it’s too late.