Monday, October 30, 2017

GDPR step-by-step - Part 2 - Personal, Pseudonymised and Anonymous Data

This is the second post in my new series about the GDPR (General Data Protection Regulation), in which I will highlight relevant aspects of this new regulation, especially for businesses.

The GDPR shall apply from May 2018, so it is very important that businesses are fully prepared for the new rules. This series is an attempt to help business owners become aware of the new rules and of the specific challenges that they might present in different information systems. These posts are for educational purposes only; they do not substitute for a consultation with a lawyer. I hope that the content can be useful to you. All highlights and comments in yellow are mine. To read the GDPR, click here.

In this Part 2, you will find:

1- What is personal data?
2- What is pseudonymised data?
3- What is anonymous data?

[Part 1]



1- What is personal data?


As we saw last week, in Part 1 of this series, article 2 of the GDPR, which deals with the material scope of the new regulation, states that "This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data (...)".


The question that we should ask, then, is: what is personal data?

Article 4(1) of the GDPR brings this definition (which might surprise some, as information relating to a merely identifiable person is also considered personal data):


"‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;"


It is a broad and also contextual definition: if, in a given context, available technology can identify a person from a single cultural element alone, for example, then that cultural element will be considered personal data in that context and its processing will be subject to the GDPR.


Recital 26 of the GDPR helps us understand this concept:


"The principles of data protection should apply to any information concerning an identified or identifiable natural person. Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person. To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly. To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments. The principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable. This Regulation does not therefore concern the processing of such anonymous information, including for statistical or research purposes.


Again, the legal definition is contextual here: in order to determine whether the data subject can be identified, account should be taken of "objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments."
In practice, this means that a very wide range of data can qualify as personal data. So what can a business owner do? Let's then go to item 2, which deals with pseudonymisation.


2- What is pseudonymised data?

The GDPR also defines pseudonymisation. According to article 4(5):


"‘pseudonymisation’ means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person"

In recital 28, the GDPR expresses the advantages of pseudonymisation:

"The application of pseudonymisation to personal data can reduce the risks to the data subjects concerned and help controllers and processors to meet their data-protection obligations. The explicit introduction of ‘pseudonymisation’ in this Regulation is not intended to preclude any other measures of data protection."

When dealing with data protection by design and by default (Article 25), the GDPR states that:

"1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects."

Pseudonymisation is also associated with data security, as article 32, dealing with security of processing, establishes that:

"1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data;"

In any case, we cannot forget what we read in recital 26, which stated that "personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person" - meaning that the GDPR is applicable to pseudonymised data.
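
The regulation does not prescribe a specific technique, so here is a minimal sketch in Python (my own illustration, not taken from the GDPR) of what pseudonymisation can look like in practice: a direct identifier is replaced by a random token, and the lookup table that allows re-identification is kept separately, under its own technical and organisational safeguards.

# A minimal pseudonymisation sketch - illustration only, not a compliance recipe.
import secrets

def pseudonymise(records, id_field="email"):
    # Replace a direct identifier with a random token and return the
    # pseudonymised records together with a separate lookup table.
    lookup = {}   # token -> original identifier; store apart, e.g. encrypted, restricted access
    output = []
    for record in records:
        token = secrets.token_hex(16)
        lookup[token] = record[id_field]
        pseudonymised = dict(record)
        pseudonymised[id_field] = token
        output.append(pseudonymised)
    return output, lookup

customers = [{"email": "anna@example.com", "purchases": 3}]
data, key_table = pseudonymise(customers)
# "data" can now be processed with reduced risk, but because "key_table"
# makes re-identification possible, the GDPR still applies.

The important point is the separation: as long as the lookup table exists, the data remains personal data, exactly as recital 26 states.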

Now we move to the last type of data relevant for our purposes - anonymous data.

3- What is anonymous data?

According to the end of recital 26, "(...) the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable. This Regulation does not therefore concern the processing of such anonymous information, including for statistical or research purposes."

Therefore, only when information is considered anonymous - according to the GDPR's definition - will its processing fall outside the scope of the GDPR.
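
As a simplified illustration (my own example, not from the GDPR), one common route towards anonymous information is to keep only aggregate statistics and discard the individual-level records. Whether the result is truly anonymous depends on the re-identification risk discussed in recital 26 (very small groups, for instance, can still single people out), so the sketch below is only a starting point:

# Aggregation sketch: keep counts per country, drop the individual records.
from collections import Counter

orders = [
    {"customer": "token-7f3a", "country": "AT", "amount": 30},
    {"customer": "token-9c1d", "country": "AT", "amount": 45},
    {"customer": "token-42bb", "country": "SE", "amount": 20},
]

orders_per_country = Counter(order["country"] for order in orders)
print(orders_per_country)   # Counter({'AT': 2, 'SE': 1})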

As we saw today, it is very important to understand the concepts of personal data, pseudonymised data and anonymous data, as this classification determines which rules apply to the processing of the data.


*
That's all for today. Do you have comments about this post? Feel free to post them below. 



Best,

Luiza

GDPR, data protection
privacy


Tuesday, October 24, 2017

GDPR step-by-step - Part 1 - Material Scope, Geographical Scope and Lawfulness of Processing

This is the first post in my new series about the GDPR (General Data Protection Regulation), in which I will highlight relevant aspects of this new regulation, especially for businesses.

The GDPR shall apply from May 2018, so it is very important that businesses are fully prepared for the new rules. This series is an attempt to help business owners become aware of the new rules and of the specific challenges that they might present in different information systems. These posts are for educational purposes only; they do not substitute for a consultation with a lawyer.
I hope that the content can be useful to you. All highlights and comments in yellow are mine. To read the GDPR, click here.

In this Part 1, you will find:


1- The GDPR's material scope;
2- The GDPR's geographical scope;
3- Lawfulness of processing.

[Part 2]

*

1- To what type of data processing is it applicable? - material scope - article 2:

The GDPR is applicable to "the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system." (article 2)

both automated and non-automated means

It is NOT applicable to "the processing of personal data:

(a) in the course of an activity which falls outside the scope of Union law;
(b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU;
(c) by a natural person in the course of a purely personal or household activity;
(d) by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security."


2- To which territories does it apply? - geographical scope - article 3:

1. This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.

if the controller or processor is established in the European Union - the GDPR applies


2. This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: 


(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or


(b) the monitoring of their behaviour as far as their behaviour takes place within the Union.


in both situations the controller or processor is not established in the European Union, yet the GDPR still applies


3. This Regulation applies to the processing of personal data by a controller not established in the Union, but in a place where Member State law applies by virtue of public international law.



3- Lawfulness of processing - article 6:

in order to be lawful, all data processing has to be based on at least one of the grounds below

1. Processing shall be lawful only if and to the extent that at least one of the following applies:

(a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;

(b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

(c) processing is necessary for compliance with a legal obligation to which the controller is subject; 

(d) processing is necessary in order to protect the vital interests of the data subject or of another natural person;

(e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.

item "a" will be a frequent justification for business, therefore we need to know what is consent according to the GDPR. Article 4(11) explains:

"‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her"

the indication of the data subject's wishes has to be:
- freely given
- specific
- informed
- unambiguous

it has to be signified by:
- a statement or
- a clear affirmative action
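
As an illustration only (a sketch with field names I invented, not a template prescribed by the GDPR), a business could record each consent in a structure like the one below, so that it can later show when and how a specific, informed and affirmative consent was given, and whether it has been withdrawn:

# Hypothetical consent record - the field names are my own, not from the GDPR.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_subject_id: str        # reference to the person (ideally pseudonymous)
    purpose: str                # consent must be specific to a purpose
    information_shown: str      # what the person was told (informed)
    affirmative_action: str     # e.g. ticking an unchecked opt-in box
    given_at: datetime
    withdrawn_at: Optional[datetime] = None   # consent can be withdrawn later

consent = ConsentRecord(
    data_subject_id="token-7f3a",
    purpose="monthly newsletter",
    information_shown="newsletter-notice-v2",
    affirmative_action="ticked unchecked opt-in box",
    given_at=datetime.now(timezone.utc),
)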

*
That's all for today. Do you have comments about this post? Feel free to post them below.

Best,
Luiza
about.me/luizasrezende
GDPR, data protection
privacy

Monday, September 5, 2016

Browser Fingerprinting Study: Sign Up Today

Hi, all,

Please find below an invitation by Dr. Zinaida Benenson, from the University of Erlangen-Nuremberg, for you to participate in her browser fingerprinting study. Participation takes less than 1 minute per week and no account is needed to sign up.

If you would like to receive the next posts by email, don't forget to subscribe.

Luiza Rezende
about.me/luizasrezende

***

"My research group seeks support for an innovative browser fingerprinting study. Participation takes less than 1 minute per week, no account is needed to sign up: https://browser-fingerprint.cs.fau.de

The study is running for 6 months, here are the first statistics: https://browser-fingerprint.cs.fau.de/statistics

Your support would help all research groups over the world that do research on browser fingerprinting, as we are going to release an open data set of fingerprints at the end of 2016. Till now, everybody has to compile their own data set, and this is extremely time-consuming.

Our data set will be unique, because through our novel study design we have an unprecedented level of ground truth: We can assign each fingerprint to a particular (of course, anonymized) participant. In all other projects, recurring participants are recognized through cookies, which is very error-prone, as people delete their cookies"


Dr. Zinaida Benenson
Human Factors in Security and Privacy Group
Chair for IT Security Infrastructures
University of Erlangen-Nuremberg


The Unintended Consequences of “People You May Know”

Post Written By Mark Warner - usable privacy and security researcher. Twitter: @privacurity

Going to see a psychiatrist can be a daunting prospect for many due to the often-intimate information being disclosed. The doctor-patient confidentiality regulations are designed to provide an environment in which the patient feels comfortable to disclose and discuss very sensitive information without fear of negative consequences. While the intimate information disclosed during a session must remain confidential, so too should the attendance itself.

Last week, an article written by Kashmir Hill at Fusion.net reported on a psychiatrist who was made aware that her patients were being recommended as potential friends to one another over Facebook. While the psychiatrist herself reported only occasional use of the social networking platform and never shared her e-mail or phonebook contacts, the recommendation engine was able to find common factors between her patients, recommending them to one another as “people you may know”.

Facebook states that its suggestion engine works by analysing “mutual friends, work and education information, networks you’re part of, contacts you’ve imported and many other factors”. The vagueness of this statement leads to the question: what are these other factors?

Could it be that her patients have “checked in” to similar places in and around the treatment location? Could these common locations be factors that Facebook analyses to generate friend suggestions? If the patients are sharing their email and phonebook contacts, could Facebook be linking them through their common contact with the psychiatrist? If so, could this be actively exploited to identify patient details?

This example illustrates the way technology is bridging the gap between the professional space and the personal. It also acts as a warning sign for the growing use of technologies that were never designed or intended for medical use but are now fast becoming everyday tools within the industry. WhatsApp is a great example of this: it is inexpensive, simple to implement and has almost no integration with hospital or clinical systems, yet it enables real-time, media-rich communication between medical staff, and even patients.

The rapid adoption of these technologies into and on the boundaries of the medical industry could have huge benefits, but unintended consequences may result in significant personal and societal costs. How these technological changes are managed to allow society to benefit while maintaining fundamental values that protect the individual's right to privacy is at the forefront of the Privacy & Us project. These types of questions will be the focus of our multidisciplinary research over the next three years, so watch this space.


Wednesday, August 31, 2016

IFIP Summer School 2016 and 1st Privacy&Us Training Event

Last week the IFIP Summer School took place in Karlstad, Sweden, in the Computer Science building at Karlstad University (KAU), with "Privacy and Identity Management" as its main subject.

"Egg room", one of the classes where the lectures happened, at Karlstad Univesity, Sweden. Picture taken by Michael Bechinie
It was a very diversified and interdisciplinary program: from Monday to Friday (August 21-26) the participants had the opportunity to join multiple sessions, ranging from law to computer science and from ethics to HCI (human-computer interaction), all of them investigating this very interesting and rich field of Privacy and Identity Management. You can check the entire program here.

Among the speakers were Amelia Andersdotter (dataskydd.net), Jan Camenisch (IBM Research – Zürich, Switzerland), Roger Clarke (Xamax Consultancy Pty Ltd., Australia), Jolanda Girzl (Konsument Europa, Director, ECC Sweden Swedish Consumer Agency), Marit Hansen (Privacy Commissioner of Schleswig-Holstein, ULD, Germany), Rainer Knyrim (Preslmayr Rechtsanwälte AG, Austria), Steven Murdoch (University College London, UK – TBC), Charles Raab (University of Edinburgh, UK), Angela Sasse (University College London, UK), Bernd Carsten Stahl (De Montfort University, Leicester, UK) and Vicenc Torra (University of Skövde).


As a continuation of the summer school, the 1st Privacy&Us (Privacy & Usability) training event started on August 25th, attended by the PhD students, their supervisors and the business partners of the project. This was the first training event of the program and had various interesting lectures, always dealing with privacy, usability, or the intersection of both fields. For example, on the first day the participants had a lecture on "Privacy of Personal Health Data" with Angela Sasse, another lecture about the General Data Protection Regulation with Rainer Knyrim, and a workshop on "Introduction to Usability" with Angela Sasse and Michael Bechinie.

In the end, after such a productive week, we could all relax and enjoy a delicious barbecue - Swedish style! - in Karlstad, in a sunset atmosphere. You can check the beautiful pictures (all taken by the participants of the program) below.

Stay tuned to receive more news about Privacy&Us (and more news about privacy as well!). Don't forget to subscribe to the blog (on the right side of the page).

All the best,

Luiza Rezende

One of the lectures. Picture taken by Emiliano de Cristofaro

In the city of Karlstad. Picture by Michael Bechinie

The beautiful sunset. Picture taken by Alex Railean