The Trouble with Article 25 (and How to Fix It): The Future of Data Protection by Design and Default

International Data Privacy Law (2020) doi: 10.1093/idpl/ipz019

29 Pages. Posted: 12 Mar 2021

Ira Rubinstein

New York University (NYU) - Information Law Institute

Nathaniel Good

University of California, Berkeley - School of Information

Date Written: September 30, 2019

Abstract

In its simplest formulation, data protection by design and default uses technical and organizational measures to achieve data protection goals. Although privacy regulators have endorsed privacy-enhancing technologies (PETs) for well over thirty years, Article 25 of the General Data Protection Regulation (GDPR) breaks new ground by transforming this idea into a binding legal obligation. But Article 25 as presently conceived is poorly aligned with privacy engineering methods and related PETs. This is especially true of “hard” PETs, which place limited trust in third parties (including data controllers) and instead rely on cryptographic techniques to achieve data minimisation. To advance data protection in its own right, rather than merely reinforce the general principles of the GDPR, Article 25 must be interpreted as requiring the implementation of privacy engineering and hard PETs. A bold way to achieve this is by mandating that data controllers use available hard PETs for data minimisation. More gradual steps include data protection regulators insisting on a central role for privacy engineering and PETs in public sector projects; issuing guidance on Article 25 in forceful terms that clearly require the implementation of “state of the art” privacy technology; and using their enforcement powers to reward good examples of privacy engineering rather than to penalize failures.

NB: This is a pre-copyedited, preprint version of an article accepted for publication in International Data Privacy Law following peer review. The final and updated version was published in International Data Privacy Law, Volume 10, Issue 1, February 2020, Pages 37–56, https://doi.org/10.1093/idpl/ipz019.

Keywords: privacy by design, data protection by design, privacy-enhancing technologies, privacy engineering, trust, data minimisation, privacy, data protection, GDPR

Suggested Citation

Rubinstein, Ira and Good, Nathaniel, The Trouble with Article 25 (and How to Fix It): The Future of Data Protection by Design and Default (September 30, 2019). International Data Privacy Law (2020), doi: 10.1093/idpl/ipz019. Available at SSRN: https://ssrn.com/abstract=3773333

Ira Rubinstein (Contact Author)

New York University (NYU) - Information Law Institute

40 Washington Square South
New York, NY 10012-1301
United States

Nathaniel Good

University of California, Berkeley - School of Information

102 South Hall
Berkeley, CA 94720-4600
United States
