When Data Protection by Design and Data Subject Rights Clash
Cite as: Michael Veale, Reuben Binns and Jef Ausloos (2018) When Data Protection by Design and Data Subject Rights Clash. International Data Privacy Law (2018) doi:10.1093/idpl/ipy002.

[Note: An earlier draft was entitled "We Can't Find Your Data, But A Hacker Could: How 'Privacy by Design' Trades-Off Data Protection Rights"]

Abstract

➔ Data Protection by Design (DPbD), a holistic approach to embedding data protection principles in the technical and organisational measures undertaken by data controllers, building on the notion of Privacy by Design, is now a qualified duty in the GDPR.

➔ Practitioners have seen DPbD less holistically, instead framing it through the confidentiality-focussed lens of Privacy Enhancing Technologies (PETs).

➔ Focussing primarily on confidentiality risk, we show that some DPbD strategies deployed by large data controllers result in personal data which, despite remaining clearly reidentifiable by a capable adversary, the controller finds difficult to grant data subjects rights over (eg access, erasure, objection) for the purposes of managing this risk.

➔ Informed by case studies of Apple's Siri voice assistant and Transport for London's Wi-Fi analytics, we suggest three main ways to make deployed DPbD more accountable and data subject-centric: building parallel systems to fulfil rights, including dealing with volunteered data; making inevitable trade-offs more explicit and transparent through Data Protection Impact Assessments; and through ex ante and ex post information rights (arts 13-15), which we argue may require the provision of information concerning DPbD trade-offs.

➔ Despite steep technical hurdles, we call both for researchers in PETs to develop rigorous techniques to balance privacy-as-control with privacy-as-confidentiality, and for DPAs to consider tailoring guidance and future frameworks to better oversee the trade-offs being made by primarily well-intentioned data controllers employing DPbD.