Mobile Devices as Stigmatizing Security Sensors: The GDPR and a Future of Crowdsourced 'Broken Windows'
International Data Privacy Law, Volume 8, Issue 1, Pages 69–85, doi: 10.1093/idpl/ipx024
University of Groningen Faculty of Law Research Paper No. 13/2018
26 Pages. Posted: 28 Jan 2018; last revised: 8 Jun 2019
Date Written: 2018
Abstract
Various smartphone apps and services encourage users to report where and when they feel they are in an unsafe or threatening environment. This user-generated content may be used to build datasets that show which areas are considered 'bad' and to map out 'safe' routes through such neighbourhoods. Despite certain advantages, this data inherently carries the danger that streets or neighbourhoods become stigmatized and that already existing prejudices are reinforced. Such stigmas might also have negative consequences for property values and businesses, causing irreversible damage to certain parts of a municipality. Overcoming such an 'evidence-based stigma' — even one based on biased, unreviewed, outdated, or inaccurate data — becomes nearly impossible and raises the question of how such data should be managed.
Keywords: Apps, Crowdsourcing, GDPR, Privacy, Public Space, Security