Monday, September 5, 2016

Browser Fingerprinting Study: Sign Up Today

Hi all,

Please find below an invitation from Dr. Zinaida Benenson of the University of Erlangen-Nuremberg to participate in her browser fingerprinting study. Participation takes less than 1 minute per week, and no account is needed to sign up.

If you would like to receive the next posts by email, don't forget to subscribe.

Luiza Jarovsky
Lawyer and PhD Fellow Researching Data Privacy


"My research group seeks support for an innovative browser fingerprinting study. Participation takes less than 1 minute per week, and no account is needed to sign up:

The study is running for six months; here are the first statistics:

Your support would help all research groups around the world that work on browser fingerprinting, as we are going to release an open data set of fingerprints at the end of 2016. Until now, every group has had to compile its own data set, which is extremely time-consuming.

Our data set will be unique because, through our novel study design, we have an unprecedented level of ground truth: we can assign each fingerprint to a particular (of course, anonymized) participant. In all other projects, recurring participants are recognized through cookies, which is very error-prone, as people delete their cookies."

Dr. Zinaida Benenson
Human Factors in Security and Privacy Group
Chair for IT Security Infrastructures
University of Erlangen-Nuremberg
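The ground-truth advantage described in the invitation can be illustrated with a small sketch. This is not the study's actual pipeline; the attribute names, hashing scheme, and participant IDs below are hypothetical, chosen only to show why a stable participant ID survives browser changes that a raw fingerprint does not.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash a canonical, sorted serialisation of browser attributes.

    A browser fingerprint is typically derived from client attributes
    such as the user agent, screen size, and timezone. Any change to an
    attribute (e.g. a browser update) changes the resulting hash.
    """
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# In a study with ground truth, each submitted fingerprint is stored
# against a stable (anonymized) participant ID rather than a cookie.
dataset = []  # list of (participant_id, fingerprint) pairs

browser_a = {
    "user_agent": "ExampleBrowser/1.0",  # hypothetical values
    "screen": "1920x1080",
    "timezone": "UTC+1",
}
dataset.append(("participant-007", fingerprint(browser_a)))

# After a browser update the fingerprint changes, but the participant ID
# still links both records -- exactly the linkage that cookie-based
# recognition loses when a participant deletes their cookies.
browser_a_updated = dict(browser_a, user_agent="ExampleBrowser/2.0")
dataset.append(("participant-007", fingerprint(browser_a_updated)))
```

The two records have different fingerprints but the same participant ID, so a researcher can measure how fingerprints evolve over time for one person, which a cookie-only data set cannot guarantee.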

The Unintended Consequences of “People You May Know”

Post Written By Mark Warner - usable privacy and security researcher. Twitter: @privacurity

Going to see a psychiatrist can be a daunting prospect for many, given the intimate information often disclosed. Doctor-patient confidentiality rules are designed to create an environment in which the patient feels comfortable disclosing and discussing very sensitive information without fear of negative consequences. While the intimate information disclosed during a session must remain confidential, so too should the fact of attendance itself.

Last week, an article by Kashmir Hill reported on a psychiatrist who discovered that her patients were being recommended to one another as potential friends on Facebook. While the psychiatrist herself reported only occasional use of the social network and had never shared her e-mail or phonebook contacts, the recommendation engine was able to find common factors between her patients, suggesting them to one another as “people you may know”.

Facebook states that its suggestion engine works by analysing “mutual friends, work and education information, networks you’re part of, contacts you’ve imported and many other factors”. The vagueness of this statement raises the question: what are these other factors?

Could it be that her patients have “checked in” to similar places in and around the treatment location? Could these common locations be factors that Facebook analyses to generate friend suggestions? If the patients are sharing their email and phonebook contacts, could Facebook be linking them through their common contact with the psychiatrist? If so, could this be actively exploited to identify patient details?
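To make the shared-contact hypothesis concrete, here is a minimal sketch of how two users who have both uploaded the same phone number could be linked. This is purely illustrative: the names and numbers are invented, and nothing here reflects Facebook's actual algorithm.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical uploaded phonebooks. Both patients have stored the same
# clinic number ("+15550001"); the third user has no overlap.
uploaded_contacts = {
    "patient_a": {"+15550001", "+15550002"},
    "patient_b": {"+15550001", "+15550003"},
    "unrelated": {"+15550004"},
}

def suggest_by_shared_contact(contacts: dict) -> dict:
    """Suggest pairs of users whose uploaded phonebooks share any entry."""
    suggestions = defaultdict(set)
    for (user1, book1), (user2, book2) in combinations(contacts.items(), 2):
        if book1 & book2:  # at least one phone number in common
            suggestions[user1].add(user2)
            suggestions[user2].add(user1)
    return dict(suggestions)
```

Under this toy model, the two patients are suggested to each other solely because both hold the clinic's number, while the unrelated user receives no suggestion. The point is that neither patient ever shared anything about the other; a single common contact is enough to create the link.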

This example illustrates how technology is bridging the gap between the professional space and the personal. It also acts as a warning about the growing use of technologies that were never designed or intended for medical use but are fast becoming everyday tools within the industry. WhatsApp is a good example: it is inexpensive, simple to deploy, and has almost no integration with hospital or clinical systems, yet it enables real-time, media-rich communication between medical staff, and even with patients.

The rapid adoption of these technologies into, and on the boundaries of, the medical industry could have huge benefits, but unintended consequences may carry significant personal and societal costs. How these technological changes can be managed so that society benefits while the fundamental values protecting the individual's right to privacy are maintained is at the forefront of the Privacy & Us project. These questions will be the focus of our multidisciplinary research over the next three years, so watch this space.
