[Switzerland] CH: Revision of Federal Data Protection Act (incl. draft)
Report on ePrivacy Regulation draft (Oct 2017)
REPORT of 23 October 2017 on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications)
(COM(2017)0010 – C8-0009/2017 – 2017/0003(COD))
http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A8-2017-0324&language=EN
[Hogan Lovells] Google’s DeepMind and GDPR
[Germany] Datenschutz-Anpassungs- und -Umsetzungsgesetz EU (DSAnpUG-EU) – the EU Data Protection Adaptation and Implementation Act
CJEU: Breyer vs. Germany
CJEU landmark decision in the case Breyer v. Federal Republic of Germany (decision dated 19 October 2016, case number C-582/14).
DLA Piper analysis
David Vasella
http://datenrecht.ch/bgh-i-s-breyer-vi-zr-13513-16-5-17-personenbezug-dynamischer-ip-adressen/
IAPP article
(from 2015) Rethinking Personal Data Breaches (EU)
So as the world stands still – and waits for GDPR to pass the European Parliament vote in a few days, and just before we are all hit by a wave of audit/certification/consulting firms selling their services – here’s a quick look at Personal Data Breaches.
According to Opinion 03/2014 of the Article 29 Working Party – which back in the day was just an opinion, but now carries quite a bit more muscle: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp213_en.pdf
Most people think of a data breach as an event in which data is accessed by an unauthorized person, resold on the darknet, made public by some miscreant, etc.
The Article 29 Working Party took a much more holistic view – and includes loss of integrity and timely availability along with the loss of confidentiality.
Opinion 03/2014 gives examples of data breaches, and walks the reader through assessing the impact. While the GDPR will provide us with more details and requirements (e.g. to notify within 72 hours), the Opinion does a good job illustrating the underlying thinking.
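The 72-hour notification window is easy to operationalise in tooling; here is a minimal sketch (the function names are my own, not from the Regulation), computing the latest notification time from the moment the controller becomes aware of the breach:

```python
from datetime import datetime, timedelta

# GDPR: notify the supervisory authority without undue delay and,
# where feasible, no later than 72 hours after becoming aware.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(awareness_time: datetime) -> datetime:
    """Latest time to notify the supervisory authority."""
    return awareness_time + NOTIFICATION_WINDOW

def is_overdue(awareness_time: datetime, now: datetime) -> bool:
    """True if the 72-hour window has already passed."""
    return now > notification_deadline(awareness_time)
```

For example, awareness at 09:00 on 25 May 2018 yields a deadline of 09:00 on 28 May 2018.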
So quoting from the Opinion:
“Case 1: Four laptop computers were stolen from a “Children’s Healthcare Institute”; they stored sensitive health and social welfare data as well as other personal data concerning 2050 children.
- Potential consequences and adverse effects of the confidentiality breach:
The first impact is a breach of medical secrecy: the database contains intimate medical information on the children which are available to unauthorized people. [..]
- Potential consequences and adverse effects of the availability breach:
It may disturb the continuity of children’s treatment leading to aggravation of the disease or a relapse. [..]
- Potential consequences and adverse effects of the integrity breach:
The lost data may affect the integrity of the medical records and disrupt the treatments of the children. For example, if only an old back-up of the medical records exists, all changes to the data that were made on the stolen computers will be lost, leading to corruption of the integrity of the data. The use of medical records that are not up-to-date may disrupt the continuity of children’s treatments leading to aggravation of the disease or a relapse. [..]
“
So the overall paradigm is a bit different from elsewhere. – It will be interesting to see how many changes were made last minute to the GDPR, but assessments like the one above should be commonplace in 2018 and beyond.
(from 2016) – Lessons from living with high privacy fines (Spain)
The GDPR introduces some very high fines for violations, and for many countries in Europe this will be a major change. – In this context, it’s interesting to have a look at Spain, where the Data Protection Authority has already been able to impose fines of up to 600,000 EUR for several years.
Ricard Martinez of the Spanish Data Protection Association APEP wrote a very interesting article on the challenges that come with high privacy fines.
My key take-aways from his post are:
- The total annual amount of fines in Spain has been between 15 and 20 million EUR over the last decade.
- The majority of the sanctioned companies are in the telecommunications, video surveillance, and financial industries. Their relative share stays about the same year after year. – So the high fines do not appear to be a crucial deterrent.
- The legislator had to modulate the sanctions to balance the impact on small and medium enterprises. – It’s important that the DPAs harmonize around this before the GDPR becomes effective, as the overall effect might otherwise be unfair.
- The volume of complaints is increasing steadily from year to year. This has an impact on the DPA’s ability to take action: the number of actual infringement statements is staying constant. – Any news of DPA actions seems to increase the volume of complaints further.
There’s much more information in Ricard Martinez’s post, and I encourage you to read more at http://www.phaedra-project.eu/the-challenge-of-the-enforcement-in-the-proposal-for-a-general-data-protection-regulation-2/
(from 2016) UK court decision on whether clinical trial data can be adequately anonymised
Very interesting article from Freshfields, which shows the UK Information Commissioner (supported by the First Tier Tribunal) taking a practical approach to the anonymisation of personal data. It is also a reminder that clinical trial data may be subject to freedom-of-information requests in the UK under some conditions.
http://knowledge.freshfields.com/en/Global/r/1640/can_clinical_trial_data_be_adequately_anonymised__
Key points of interest include:
“There was no evidence that a third party, alone, could identify participants. The evidence showed that identification would be possible by combining the patient data with NHS data, but this would have involved an NHS employee breaching professional, legal and ethical obligations, and having the skill and motivation to do so. This level of conjecture was considered remote. It is not ‘any conceivable means of identification’ that must be considered, but only ‘those reasonably likely to be used’. We ‘must consider whether any individual is reasonably likely to have the means and the skill to identify any participant and also whether they are reasonably likely to use those skills for that purpose’. ”
High-level summary
“The Information Commissioner had ordered Queen Mary University London to disclose patient data from a trial on chronic fatigue syndrome under the Freedom of Information Act. The Tribunal reviewed this decision.
QMUL ran several arguments but the one the Tribunal most struggled with was whether the data had been anonymised enough that it should no longer be considered personal data. If so, it would likely be disclosable under FOIA. If the data was not sufficiently anonymised, it would still be ‘personal data’ and would therefore have to be withheld from disclosure.
Although the Tribunal was split in its decision, the majority was in favour of upholding the Information Commissioner’s decision that the data had been adequately anonymised. QMUL was therefore ordered to disclose it.”
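One concrete technical proxy for "adequately anonymised" is a k-anonymity check – to be clear, this is not the test the Tribunal applied, just a common technique from the anonymisation literature, and the sketch below (function and field names are my own) only illustrates the idea:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Check whether every combination of quasi-identifier values
    (e.g. age band, region) appears in at least k records, so that
    no participant is singled out by those attributes alone."""
    groups = Counter(
        tuple(record[attr] for attr in quasi_identifiers)
        for record in records
    )
    return all(count >= k for count in groups.values())
```

A dataset failing this check for a reasonable k would be a warning sign that combining the released attributes with outside data (such as the NHS data discussed above) could re-identify participants.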
Privacy as a Service in Digital Health
… a paper by Xiang Su, Jarkko Hyysalo, Mika Rautiainen, Jukka Riekki, Jaakko Sauvola, Altti Ilari Maarala, and Harri Honko
at https://arxiv.org/ftp/arxiv/papers/1605/1605.00833.pdf
I still need to let it truly sink in before I’m ready to comment on it – but I am glad that this kind of privacy design thinking is now happening. GDPR offers some challenges and many opportunities. Having a technical layer to complement the privacy processes we’ll all have to put in place can be very helpful. Let’s hope for some reasonable open data scheme to make the legal aspects more digestible to tools and algorithms.
Let’s just hope it won’t go the way of the P3P protocol.