
No such thing as privacy

Christianah B

29/01/2020

Four years ago, the government’s CCTV watchdog, Tony Porter, warned the public that surveillance technology risked changing the “psyche of the community” by reducing individuals to numbers in a database. Fast forward to 2017 and the ECtHR case of Antovic and Mirkovic v Montenegro, in which the use of CCTV in a lecture theatre was held to breach Article 8, then to 2018 and the discovery of the Facebook/Cambridge Analytica scandal, and it becomes apparent that Porter had good reason to be concerned about the UK’s data privacy laws.

Porter had robust arguments with universities about the security case for such cameras and questioned whether they were “conducive to creating a learning environment”. A similar issue was raised in a recent case, Antovic and Mirkovic v Montenegro, where the European Court of Human Rights (ECtHR) was asked to determine whether video surveillance of a lecture hall violated a professor’s right to privacy under Article 8 of the European Convention on Human Rights (right to respect for private and family life).

The claim was challenged by the government, which argued that Article 8 did not apply in the public area of a lecture theatre. However, the court had previously found that “private life” might include professional activities; Article 8 was therefore applicable, and the camera surveillance was not a necessary state interference.

Reasoning

The ECtHR reached this decision for several reasons. The first was pre-existing case law: the court had previously found covert video surveillance at work to be an intrusion into an employee’s private life and saw no reason to depart from that finding. Secondly, the domestic courts had failed to consider whether there was any legal justification for the surveillance or to check compliance with the rulings of the Data Protection Agency. Domestic law has stringent rules on the use of surveillance; for example, surveillance may be carried out only if the aim of the measure (eg, preventing danger to property and people) cannot be achieved in another way. The Data Protection Agency found that no such danger existed, while the university’s stated justification, the surveillance of teaching, was not one recognised by law.

Judges Vucinic and Lemmens delivered a joint concurring opinion, while Judges Spano, Bianku and Kjølbro delivered a joint dissenting opinion, voting against finding a violation of Article 8 of the Convention for three main reasons.

Firstly, the majority’s expansion of Article 8 may have significant implications. The Article 8 notion was expanded without a basis in the ECtHR’s pre-existing case law and, in the dissenting judges’ words, “is not sufficiently supported by cogent legal arguments.” This view is echoed by the UK Supreme Court, which has criticised the Strasbourg court’s tendency to continuously expand the scope of Article 8 on controversial social and ethical issues, because although British courts are not bound to follow the ECtHR, in practice they treat its decisions as binding authority; this is evident in Kennedy v Charity Commission (2014) UKSC.

Secondly, the majority view raises an issue of inconsistency with previous court judgments. The ECtHR has already decided, in several cases concerning video surveillance and monitoring at workplaces, that an employer’s surveillance of employees at the workplace does not automatically raise an issue under Article 8. For example, in Peck v the United Kingdom and Perry v the United Kingdom, the ECtHR found that “The normal use of security cameras, whether in public streets or on-premises, such as shopping centres or police stations, do not as such raise issues under Article 8 of the Convention.” The present judgment contradicts that reasoning by suggesting that the normal use of security cameras (eg, recording or permanent storage) in shopping centres or police stations may interfere with an individual’s private life.

Thirdly, the majority relied only on the video monitoring itself and not on any recording, processing or possible use of the data gathered. The dissenting judges were satisfied by the specific circumstances of the case: the claimants had been notified of the video surveillance, so they had little reasonable expectation of privacy; the cameras monitored only their professional activity of teaching; the video quality was blurry, so individuals could not easily be recognised; and the footage was accessible only to the dean and was automatically deleted after 30 days, meaning the information was not stored, processed or put to further use.

Arguably, Porter would agree with the dissenting judges, because he believes surveillance is only permissible if consent or notice has been given. His argument focuses on universities being transparent with students and lecturers, and explaining their obligations. In this case, the university did what was expected of it by the government’s surveillance commissioner, by informing the claimants about the footage and being open and transparent, but failed to do what was expected of it by the law.

Shortly after the Montenegro case came the Facebook/Cambridge Analytica scandal, in which Facebook failed to prevent the data of 30 million users from being harvested by a Trump campaign affiliate. Facebook has a more than decade-long track record of incidents highlighting inadequate and insufficient measures to protect data privacy. While the severity of these individual cases varies, the sequence of repeated failures paints a larger picture of systemic problems with the UK’s privacy laws. Mass surveillance in public places, including body-worn video, drones and number plate recognition systems, has raised several privacy and civil liberties concerns. Judith Rauhofer, Director of Technology Law, argues that the use of information and communications technology and the ‘digitalisation’ of everyday tasks have resulted in a paradigm shift, in which vast amounts of personal information about individuals, their opinions and habits are generated and stored in the databases of those providing online services.

Civil rights campaigners such as JUSTICE and Liberty, who have drawn attention to the UK’s privacy issues in the media and the High Court, claim that the key issues relating to the right to privacy in the UK are: police surveillance technology (the police have been accused of introducing new technologies without proper trials, consultation or even legislation); the Snooper’s Charter and state surveillance (the government has exercised highly intrusive state-sanctioned surveillance powers via the Investigatory Powers Act 2016, which gives too much power to the security and intelligence services, who could violate individual privacy); CCTV (the use of smart CCTV has made the UK one of the most-watched nations in the world); and police databases, ID cards and criminal record checks (a vast number of employers have direct access to individuals’ criminal records, including unproven allegations). Some argue that to knock back the use of technology would be Luddite, but surveillance commissioner Porter fears it risks creating a datafied society, where “everybody is a number and everybody can be linked via ANPR to facial recognition, to another thing.”

The proposals for human rights reform in the 2019 general election

Digital government is increasingly replacing analogue administration in the UK. This shift has rule-of-law implications for the transparency and accountability of the government’s exercise of public power. Given this context, it’s important to consider what the main UK political parties say about data and digital rights in their manifestos this year. The Labour manifesto reaffirms the party’s commitment to the Human Rights Act (HRA) and the ECtHR, and promises a new Charter of Digital Rights, which it describes as the “strongest protection of data.” Labour notes that the Tory government is more “dependent on digital technology” than ever, and proposes cybersecurity measures that would allow individuals to challenge algorithmic injustice, prevent the use of digital infrastructure for surveillance, and give people control over access to and ownership of their data.

In comparison, the Conservative manifesto proposes to “empower the police to safely use new technologies like biometrics and artificial intelligence, along with the use of DNA, within a strict legal framework”. The Conservatives hope to tackle privacy issues through a balancing exercise: they plan to update the HRA to ensure that a proper balance is struck between the rights of individuals, national security and the government. The Conservative manifesto has been criticised for falling short in protecting individuals’ technology and data rights. Given that the Tories have a clear majority, extensive changes to the HRA are likely; that being said, a new Labour leader with a pro-human rights perspective takes over in March, so there may be positive changes made.

Snooper’s Charter

The controversial Investigatory Powers Act 2016, nicknamed the Snooper’s Charter, has been criticised for giving police and public bodies too much scope while not going far enough to protect innocent citizens. Civil rights campaigners argue that the law brings unprecedented levels of online surveillance to the UK by extending police powers over internet data and allowing authorities (such as the security services) to hack into devices and obtain information to aid cases. In the wake of 2017’s terrorist attacks in London and Manchester, the government sought to ensure that social media apps like WhatsApp and Snapchat did not provide a hiding place for terrorists to communicate with each other, but is this at the expense of people’s privacy? Some civil rights campaigners believe the government is fighting a losing battle, because it will never be able to coerce the global open-source community into complying, while others think oversight of current surveillance powers needs to be reformed before any new powers are introduced.

Impact of Brexit

Data protection is unlikely to be foremost in people’s minds when considering the impact of Brexit, whether it be soft or hard, deal or no deal. However, in 2018, the UK government issued a document called Data protection if there’s no Brexit deal. After Brexit, the UK’s data protection legislation will remain unchanged, since the EU Withdrawal Bill will incorporate the GDPR into UK law. Transfers of personal data from the UK to the EU won’t be affected, but transfers of personal data from the EU to the UK will, because they will depend on an adequacy decision. The European Commission has stated that if it deems the UK’s level of personal data protection essentially equivalent to that of the EU, it will make an adequacy decision allowing the transfer of personal data to the UK without restrictions. However, the reality is that the UK won’t automatically be awarded adequacy status, because the ECtHR has criticised the UK’s privacy legislation in the past. Nothing is certain about the outcome of any Brexit deal, only ambiguity. But one thing remains clear: businesses and universities that act in good faith and justify any changes to business processes and decisions will be less vulnerable than those that do not.

Conclusion 

The Montenegro case highlights the importance of employers and universities ensuring that their current surveillance practices are legitimate and can be viewed as supporting a legitimate aim, such as safety or the protection of people or property from theft or damage, as a failure to do so could leave them susceptible to challenge or in breach of a civil right. A business or university must therefore be able to justify any surveillance undertaken and ensure that it goes only as far as is necessary to protect legitimate business interests. Dr Maria Kendrick, Human Rights Lecturer at City, University of London, says: “the more technology is used the more human rights issues we will have, but I think the big question is whether the law can keep up with technological developments or whether people will have their rights infringed first and then the law react afterwards.”