The Right to Privacy in the age of Big Data and Open Data
By Amanda Lo
You might have seen the advertising campaign that Facebook has been running across Australia to regain user trust. One of its tag lines is “data misuse is not our friend”.
The campaign comes after The Guardian and The New York Times exposed a large-scale misuse of Facebook user data in March 2018. Data analytics firm Cambridge Analytica reportedly collected and processed the personal information of over 50 million Facebook users to help target political advertisements at US voters.
A few months later, Facebook is turning to advertising to mend its relationship with users. The campaign addresses not only demands for greater protection of user privacy from data misuse, but also the need for greater transparency to combat fake news and fake accounts.
The right to privacy as a human right
The myriad ways of collecting, analysing and sharing data challenge our understanding of what privacy is and how best to protect our right to privacy. This is evident from recent high-profile cases of mismanagement of personal data by private and public entities.
The right to privacy, according to which no one shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, and the right to the protection of the law against such interference, is enshrined in Article 12 of the Universal Declaration of Human Rights and Article 17 of the International Covenant on Civil and Political Rights.
Enter: big data and open data
Most of us have heard of the terms big data or open data, but what do they mean? While there is no agreed definition, big data generally refers to the large volume of data available and the process of applying analytic techniques to identify patterns and draw various inferences.
Open data usually refers to the public sector releasing data to the public to encourage greater transparency and accountability in government. Proponents of open data also argue it improves scientific research, fosters innovation and stimulates economic growth.
In 2016, as part of its policy to facilitate greater use of health data to support medical research, the Australian Department of Health published de-identified medical billing records of around 2.9 million Australians from the Medicare and Pharmaceutical Benefits Schemes. A report by researchers from the University of Melbourne (Vanessa Teague, Chris Culnane, Benjamin Rubinstein) showed how easily individuals could be re-identified by linking their health records with other known information about them, such as the dates of medical procedures like childbirth.
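The basic mechanics of such a linkage attack can be sketched in a few lines of code. This is an illustrative toy example only, not the researchers' actual method; the dataset and the person are invented, and real attacks work at far larger scale:

```python
# A "de-identified" billing dataset: names removed, but procedure dates kept.
# All records here are fabricated for demonstration.
deidentified_records = [
    {"patient_id": "A91", "procedure": "childbirth", "date": "2014-03-02"},
    {"patient_id": "B47", "procedure": "appendectomy", "date": "2014-03-02"},
    {"patient_id": "C13", "procedure": "childbirth", "date": "2015-07-19"},
]

# Publicly known facts about a (hypothetical) person, e.g. from news
# reports or social media posts mentioning the birth of a child.
known_facts = {"name": "Jane Citizen", "procedure": "childbirth",
               "date": "2015-07-19"}

# Linkage attack: search the "anonymous" data for records that match the
# known quasi-identifiers (procedure type and date).
matches = [
    r for r in deidentified_records
    if r["procedure"] == known_facts["procedure"]
    and r["date"] == known_facts["date"]
]

if len(matches) == 1:
    # A unique match re-identifies the person, and with them their
    # entire linked billing history under that patient_id.
    print(f"{known_facts['name']} is patient {matches[0]['patient_id']}")
```

The point is that removing names is not enough: a combination of seemingly innocuous attributes can be unique to one person, and anyone holding outside knowledge about that person can exploit it.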
Updating our privacy settings: the UN Big Data-Open Data Consultation
In light of the growing volume of data collected, processed and analysed, and the advanced analytic technologies now available, we need to consider whether international legal instruments established in the post-WWII era remain sufficient to protect the individual’s right to privacy.
Acknowledging the importance of the right to privacy in the digital era, the United Nations Human Rights Council established the first Special Rapporteur on the Right to Privacy in 2015. An interim report on Big Data – Open Data was presented to the United Nations General Assembly in October 2017. To gather views and feedback from individuals, civil society, private and public sectors, the UN Special Rapporteur on the Right to Privacy led an international consultation in Sydney on 26 and 27 July 2018.
The consultation covered a range of issues, from technical aspects of data management and consumer rights to the role of human rights. De-identification techniques were examined. There was also discussion of how to ensure consumers retain power and choice over how their data is used, especially when most privacy policies are complex and difficult to understand. The importance of human rights and ethics in the application of automated decision-making technologies, and the effects of those technologies on different communities, was also considered.
Along with the right to privacy, the protection of other human rights is also affected by technological changes. On 24 July 2018, the Australian Human Rights Commission held an international conference and announced its public consultation to better understand the impact of technology on human rights. A key part of the Commission’s initiative is to encourage responsible innovation.
This means technology applications, when designed from a human rights-based approach, should be developed and applied in accordance with human values that protect important rights and freedoms, for example by recognising and preventing algorithms from perpetuating social biases and discrimination.
Amanda Lo is studying law at the Chinese University of Hong Kong and visited UNSW Law as an intern in The Allens Hub for Technology, Law and Innovation.