On 17 March 2018, the New York Times and the Observer of London broke the news that the SCL Group and Cambridge Analytica had used the data of 50 million Facebook users – without their knowledge or permission – to help the Trump campaign influence the 2016 US presidential election. (The original New York Times article can be found here).
In Britain, Cambridge Analytica is facing investigations by Parliament and government regulators into allegations that it performed illegal work on the “Brexit” campaign. Closer to home, reports have surfaced that the companies played a role in President Uhuru Kenyatta’s 2013 and 2017 campaigns for the Kenyan presidency. The company’s Managing Director has claimed that it not only conducted a survey, but “rebranded the entire party twice, written their manifesto” and “then we’d write all the speeches and we’d stage the whole thing – so just about every element of the campaign”.
There has been a huge uproar in the US and UK, with Mark Zuckerberg being called before the US Congress and a UK parliamentary panel to answer questions on the debacle. Zuckerberg is set to appear before Congress today and tomorrow, but has declined the invitation to appear before the UK Parliament.
The data of 50 million users which is at the heart of the congressional inquiry was collected over a number of years by Aleksandr Kogan, an academic based at the University of Cambridge. Kogan developed an app, advertised on a website for doing odd jobs online, which gathered data not only from the people paid to download it, but from all of those people’s friends as well. Reportedly, of the 50 million Facebook users whose data was collected, only 270 000 had consented to having their data harvested. All that the researcher divulged to Facebook and the users was that he was collecting information for academic purposes.
It is now reported that approximately 60 000 South Africans’ data may have been breached after as few as 330 people downloaded the app designed by Aleksandr Kogan.
Facebook’s lax privacy policies have been called into question before. The American Civil Liberties Union (ACLU) has for years been calling on Facebook to clean up their act and implement more stringent data protection. (See the full ACLU post here)
In 2009, the ACLU warned of the lack of privacy when taking online quizzes:
‘Even if your Facebook profile is “private,” when you take a quiz, an unknown quiz developer could be accessing almost everything in your profile: your religion, sexual orientation, political affiliation, pictures, and groups. Facebook quizzes also have access to most of the info on your friends’ profiles. This means that if your friend takes a quiz, they could be giving away your personal information too.’
In 2016, the ACLU in California also discovered, through a public records investigation, that social media surveillance companies like Geofeedia were improperly exploiting Facebook developer data access to monitor Black Lives Matter and other activists. They again sounded the alarm to Facebook, publicly calling on the company to strengthen its data privacy policies and “institute human and technical auditing mechanisms” to both prevent violations and take swift action against developers for misuse.
The ACLU reports that Facebook has modified its policies and practices over the years to address some of these issues. Its current app platform prevents apps from accessing formerly-available data about a user’s friends. And, after months of advocacy by the ACLU along with the Center for Media Justice and Color of Change, Facebook prohibited use of its data for surveillance tools.
Facebook’s response to the Cambridge Analytica debacle demonstrates that the company still has significant issues to resolve. The ACLU points out that Facebook knew about the Cambridge Analytica data misuse back in December 2015, but did not block the company’s access to Facebook until hours before the current story broke. And its initial public response was to hide behind the assertion that “everyone involved gave their consent,” with executives conspicuously silent about the issue. It wasn’t until Wednesday, 21 March 2018, that Mark Zuckerberg surfaced and acknowledged that this was a “breach of trust between Facebook and the people who share their data with us and expect us to protect it,” and promised to take steps to repair that trust and prevent incidents like this from occurring again.
The question remains: how will Facebook improve its privacy and data retention practices? With the EU General Data Protection Regulation (GDPR) coming into force in May 2018, Facebook will be forced to comply with privacy principles which run contrary to its established business model. These include: having to request Facebook users’ consent in clear and unambiguous language to process their private data; mandatory notification of users when a data breach occurs; and providing users the ‘right to be forgotten’, which would empower users to demand that Facebook delete their data, stop any further dissemination, and require that third parties associated with Facebook stop any further processing of the data.
In South Africa, the Protection of Personal Information Act (POPI), upon coming fully into operation, will apply to the processing of data of the type used by Cambridge Analytica. Facebook and Cambridge Analytica would constitute the ‘responsible party’ and ‘operator’ respectively, placing certain duties on each of them. South African Facebook users would have recourse to the Information Regulator or the courts were a similar data breach to occur after the commencement of the Act. Their claim would lie in the fact that Facebook would have breached the conditions for lawful processing of data laid out in Chapter 3 of POPI. These conditions include requirements similar to those in the GDPR, such as: further processing limitation, which would permit Facebook to allow further processing of personal information only where it is reasonably related to the purpose for which the data was initially collected; security safeguards, meaning that Facebook would have to take reasonable and appropriate measures to protect the integrity and confidentiality of the data; and data subject participation, which gives users the right to request confirmation that Facebook holds their personal information, and to request that this information be corrected or deleted.
View the full ACLU post by Nicole Ozer, Technology & Civil Liberties Director, ACLU of California and Chris Conley, Policy Attorney, ACLU of Northern California Technology and Civil Liberties Project and their suggestions here.
By: Alexandra Ashton and Tsanga Mukumba
Disclaimer: The opinions expressed by the Realising Rights bloggers and those providing comments are theirs alone, and do not reflect the opinions of the Legal Resources Centre. The Legal Resources Centre is not responsible for the accuracy of any of the information supplied by the bloggers.