Many of us see our privacy as a basic right. But in the digital world of app-addiction, geolocation tracking and social oversharing, some may have cause to wonder if that right is steadily and sometimes willingly being eroded away.

The freedom of expression and the need for privacy may be strange bedfellows today – but could full-blown estrangement beckon in a digital future that makes the leap from user-controlled content to unfiltered online sharing of, well, everything?

A future where streaming your life online becomes the norm is not unthinkable, according to Dr David Erdos, whose research in the Faculty of Law explores the nature of data protection. “Take something like Snapchat Spectacles or Google Glass,” he says. “Such technology could very quickly take off, and all of a sudden it becomes ‘normal’ that everyone is recording everything, both audibly and visually, and the data is going everywhere and being used for all sorts of purposes – some individual, some organisational.”

This makes questions about what control we have over our digital footprint rather urgent.

“You can see that we need to get some grip on how the right to privacy can be enforced as technologies continue to develop that can pose serious threats to individuals’ sense of dignity, reputation, privacy and safety,” he adds.

One enforcement mechanism Erdos points to is Google Spain, a 2014 ruling by the Court of Justice of the European Union (CJEU) that examined search engines’ responsibilities when returning links to content about us on the world wide web.

The CJEU ruled that people across all 28 EU Member States have a ‘right to be forgotten’ online, giving them the ability to prevent search engines from indexing inadequate, irrelevant or otherwise unlawful information against searches on their name. This right to be forgotten is grounded in Europe’s data protection laws and applies to all online information about a living person.

Google responded by publishing a form you can submit to have such links to content (not the content itself) removed. I put it to the test – Google refused, on the basis that web links to my long-closed business are “justified” as they “may be of interest to potential or current consumers”.

Erdos explains that data protection doesn’t always work as it was originally intended to. “On paper, the law is in favour of privacy and the protection of individuals – there are stringent rules around data export, data transparency and sensitive data, for example.

“But that law was in essence developed in the 1970s, when there were few computers. Now we have billions of computers, and the ease of connectivity of smartphones and the internet. Also, sharing online is not practically constrained by EU boundaries.

“That means the framework is profoundly challenged. There needs to be a more contextual legal approach, where the duties and possibly also the scope take into account risk as well as the other rights and interests that are engaged. That law must then be effectively enforced.”

In fact, the EU data protection law currently extends surprisingly far. “By default, the law regulates anyone who alone, or jointly with others, does anything with computerised information that mentions a living person,” Erdos explains. “That could include many individuals on social networking sites. If you’re disseminating information about a third party to an indeterminate number of people, you’re (in theory at least) responsible for adherence to this law.”

Tweeters, for instance, may have to respond to requests for data (tweets) to be rectified for inaccuracy or even removed entirely, and field ‘subject access requests’ for full lists of everything they’ve tweeted about someone. And under the new General Data Protection Regulation that comes into effect in 2018, the maximum penalty for an infringement is €20 million or, in the case of companies, up to 4% of annual global turnover, whichever is higher.

When it comes to search engines or social media, Erdos admits that a strict application of the law is “not very realistic”. He adds: “There’s a systemic problem in the gap between the law on the books and the law in reality, and the restrictions are not desperately enforced.”

Erdos believes inconsistencies in the law could be exploited online by the ruthless. “The very danger of all-encompassing, stringent laws is that it seems as if responsible organisations and individuals who take them seriously are hamstrung while the irresponsible do whatever they want.”

This also applies to ‘derogations’ – areas where the law instructs that a balance must be struck between data protection and the rights to freedom of journalistic, literary and artistic expression.

“Member states have done radically different things in their formal law here – from nothing at all through to providing a blanket exception – neither of which was the intention of the EU scheme.”

As the new law will, from 2018, empower regulators to hand out fines of up to hundreds of millions of euros to large multinational companies, Erdos is passionate about the urgency of Europe developing a clear, coordinated approach to how its citizens can exercise their data protection rights.

“We are giving these regulators quite enormous powers to enforce these rules and yet do we have a good understanding of what we want the outcome to be and what we’re expecting individuals and organisations to do?” Erdos ponders.

“To me, this means that the enforcement will become more and more important. Data protection is not just a technical phrase – people really do need protection. The substance of the law needs to be hauled into something that’s more reasonable. That protection needs to be made real.”

Erdos’ research also explores the nature of data protection and academic freedom, and he successfully argued for academic expression to be added to the list of free speech derogations in the 2018 legislation. “I have come across the most egregious examples of research guidance stipulating alleged data protection requirements, including claims that published research can’t include any identifiable personal data at all,” says Erdos.

“In a survey of EU data protection authorities, I asked whether a journalist’s undercover investigation into extremist political beliefs and tactics and an academic’s undercover research into widespread claims of police racism could be legal under data protection. Not one regulator said the activity of the journalist would in principle be illegal, but almost half said the academic’s activity would be unlawful.

“Academics aim to write something of public importance, and make it rigorous. The old law was seen to prioritise even tittle-tattle in a newspaper over academic research; one hopes this will largely be removed by the new law.”

For many, the greatest concern remains the potential threats to their privacy. In order for consumers to feel safe with emerging technology, law makers may have to legislate for potential breaches now, rather than react after the damage is done.

“We don’t want to respond in a panic of registering or documenting everything, but the alternative of collapse into an ‘anything goes’ situation is equally dangerous.

“Apps like Snapchat show many people value being able to upload certain pictures and information that soon disappear. We don’t want people forgetting what they’re sharing today, and then worrying far too late how third parties are using that information.”

Would Erdos himself ever use Snapchat Spectacles or Google Glass (he does own a smartphone)? He laughs. “Let’s face it, email, the internet, Google search… people ended up having to use them. So, never say never!”


The text in this work is licensed under a Creative Commons Attribution 4.0 International License.