By Peter Adeshina
WhatsApp, the messaging app acquired by Facebook in 2014, is used by millions across the globe to keep up with loved ones and exchange information that ranges from the serious to the mundane.
For more than four years now, Facebook has been at the center of embarrassing investigations into the clandestine harvesting of users' personal data, supplied during the sign-up process and through in-app usage, for unethical political-campaign purposes. Its CEO, Mark Zuckerberg, has made appearances before the United States Congress over the issue and tendered an apology.
His apology and defense notwithstanding, the scandal battered Facebook's reputation and led to global calls for stricter scrutiny of the operations of the tech giant and its peers, especially where data privacy is concerned. It also provoked circumspection on the part of end-users about how to safely use the Facebook service – or whether to use it at all. A campaign with the hashtag 'DeleteFacebook' trended for days in the wake of the revelations.
The backlash was a reiteration of an old concern: users want social network services to provide and manage connectivity, but not to pry into personal affairs carried out using the service – especially those considered private. When, for instance, people use Instagram's private message feature to regale their friends with their escapades on a Friday night, they don't want someone on the other end listening.
To explain simply, end-to-end encryption means that information shared through the messaging services is protected by cryptographic keys accessible only to the sender and the recipient. Despite facilitating the connection, Facebook and its other messaging services will themselves not be able to read or 'decrypt' messages exchanged on the platform. It is a feature already active on WhatsApp and rival services such as Telegram, which is why they are popular choices for voice calls and messaging in less-democratic countries where illegal taps and phone hacks are rampant.
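The idea described above can be illustrated with a deliberately simplified sketch. In the toy Python example below, two users each keep a private number, exchange only public values (a Diffie-Hellman-style key agreement), and derive the same shared key; the relaying server sees only public values and ciphertext, so it cannot decrypt. The prime, generator, keys, and XOR keystream cipher here are all illustrative stand-ins, far weaker than what WhatsApp's actual protocol (the Signal protocol) uses:

```python
# Toy illustration of end-to-end encryption -- NOT real cryptography.
import hashlib

# Illustrative Diffie-Hellman parameters (a Mersenne prime and a small
# generator); real protocols use carefully chosen, much stronger ones.
P = 2**127 - 1
G = 3

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with a hash-derived keystream (toy cipher; XOR is its
    own inverse, so the same call also decrypts)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# Each user keeps a private value and publishes only G**x mod P.
alice_private, bob_private = 0xA1CE, 0xB0B
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# Both sides independently derive the same shared key; the server,
# seeing only the public values, cannot compute it.
alice_key = hashlib.sha256(str(pow(bob_public, alice_private, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_private, P)).encode()).digest()
assert alice_key == bob_key

ciphertext = keystream_encrypt(alice_key, b"See you Friday night!")
# The server relays only `ciphertext`; only Bob's key can recover it.
print(keystream_encrypt(bob_key, ciphertext))  # b'See you Friday night!'
```

The key point the sketch shows is structural: because the shared key is derived at the endpoints and never transmitted, the service in the middle can carry the message without being able to read it.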
While Facebook's planned broad enforcement of end-to-end encryption may have thrilled users and proponents of digital data privacy, it angered child rights activists and at least one top government official in the United Kingdom.
Priti Patel, the UK's Home Secretary, condemned Facebook's encryption plans as "morally wrong and dangerous" in a recent statement. Echoing the concerns of prominent child rights activists, she said the move will make it harder for the service to detect and report the criminal use of its messaging platforms to "share images of child sexual exploitation".
Her concern isn't baseless. According to the US National Center for Missing and Exploited Children (NCMEC), Facebook has filed over 16 million reports of child sexual exploitation and abuse content on its platform. At least 2,500 arrests are made annually on the strength of those reports, according to the UK's National Crime Agency (NCA).
By stripping itself of the power to access messages exchanged on the platform, Facebook is inadvertently ceding space to these criminals. WhatsApp, where end-to-end encryption is already active, has also faced repeated accusations of aiding terrorist plots and attacks, given the inscrutable nature of its messages.
In other words, Facebook is being asked, against its wishes, to retain its ability to pry into and access information exchanged on its platform. It could cede this power to a designated government agency, but the platform's global usage immediately presents a raft of problems: no single government can realistically be handed the key to the sensitive data of other countries' citizens, and in authoritarian states, government access to such information could enhance the ability to censor, repress, and abuse.
If nothing else, these complications show that, contrary to popular claims and rhetoric, there are no easy answers or silver bullets to the problems associated with data privacy. Big tech companies can either pry or not. But we have seen that both choices carry serious consequences.