MPI-CSL Working Papers
2021 | 03
Artificial Intelligence
and the Right to
Data Protection
Ralf Poscher*
Abstract
With respect to technological advancement, lawyers often come into play merely
as external restrictions. That is, they are asked whether a given technology is
consistent with legal regulations or to evaluate its foreseeable liability risks.
As a legal researcher, my interest is the exact opposite: How do new technolo-
gies influence our legal framework, concepts, and doctrinal constructions?
This contribution shows how Artificial Intelligence (AI) challenges the tradi-
tional understanding of the right to data protection and presents an outline of
an alternative conception, one that better deals with emerging AI technologies.
* Director of the Department of Public Law at the Max Planck Institute for the Study of Crime, Security
and Law. Honorary Professor at the University of Freiburg.
1 M Albers, ´Realizing the Complexity of Data Protection´ in S Gutwirth/R Leenes/P De Hert (eds), Re-
loading Data Protection (2014) 217 (hereafter Albers, ´Complexity´); K Vogelsang, Grundrecht auf in-
formationelle Selbstbestimmung? (1987) 39–88.
data. 2 The Court explained that the individual has a “right to determine himself, when
and in which boundaries personal data is disseminated” 3 – just as an owner has the
right to determine herself when she allows someone to use her property. 4 This idea,
which is already illusory in the analog world, has often been ridiculed as naive in our
contemporary, technologically interconnected and socially networked reality, in which
a vast spectrum of personal data is disseminated and exchanged at all levels almost
all of the time. 5 Data simply does not possess the kind of exclusivity that would justify paral-
lels with property ownership. 6 Our Constitutional Court seems to have recognized
this. It has not explicitly revoked the property-like formula, but it has made decreas-
ing use of it and makes no reference to it at all in more recent decisions on the sub-
ject. 7
Even if everyone can agree that the right to data protection is, in substance, not akin
to a property interest in one’s personal data, the right to data protection is formally
handled as if it were a property right. In the same way that any non-consensual use
of one’s property by someone else is regarded as a property rights infringement, any
non-consensual use – gathering, storage, processing, and transmission – of per-
sonal data is viewed as an infringement of the right to data protection. This formal
conception of data protection is not only still prevalent in the German context, but the
European Court of Justice (ECJ) perceives the right to data protection under Art. 8 of
the Charter of Fundamental Rights of the European Union (CFR) in much the same
way. In one of its latest decisions, the ECJ confirmed that data retention as such
constitutes an infringement irrespective of substantive inconveniences for the per-
sons concerned:
It should be made clear, in that regard, that the retention of traffic and location
data constitutes, in itself, … an interference with the fundamental rights to respect
for private life and the protection of personal data, enshrined in Articles 7 and 8 of
the Charter, irrespective of whether the information in question relating to private
2 There is a certain parallel between this conceptualization of the right to privacy and its scope under
the U.S. Supreme Court’s early Fourth Amendment jurisprudence: The Supreme Court, until Katz v.
U.S., 389 U.S. 347 [1967], applied the Fourth Amendment only to the search and seizure of a citizen’s
personal property and effects (see, e.g., Olmstead v. United States, 277 U.S. 438 [1928]) and was
thus tied in substance to a property right.
3 Bundesverfassungsgericht (1 BvR 209/83) (Judgement) [1983] BVerfGE 65, 1 (42): „... Befugnis des Einzelnen, grundsätzlich selbst zu entscheiden, wann und innerhalb welcher Grenzen persönliche Lebenssachverhalte offenbart werden.“ (“... the authority of the individual, in principle, to decide for himself when and within which limits personal matters of life are disclosed.”)
4 Albers, ´Complexity´ (n 1) 219.
5 M Albers, ´Information als neue Dimension im Recht´ (2002) 33 Rechtstheorie 61 (81) (hereafter Al-
bers, ´Information´); K Ladeur, ´Das Recht auf informationelle Selbstbestimmung: Eine Juristische
Fehlkonstruktion?´(2009) 62 DÖV 45 (46–47).
6 Cf. J Fairfield/C Engel, ´Privacy as a Public Good´ in R A Miller (ed), Privacy and Power (2017).
7 E.g., Bundesverfassungsgericht (1 BvR 2388/03) (Decision) [2008] BVerfGE 120, 351 (360); Bun-
desverfassungsgericht (1 BvR 2074/05) (Judgement) [2008] BVerfGE 120, 378 (397–398).
8 Court of Justice of the European Union (La Quadrature du Net and others v. Premier ministre and
others) (Judgement) [2020] CJEU Joined Cases C-511/18, C-512/18 and C-520/18, para. 115.
9 Albers, ´Complexity´ (n 1) 219.
10 Bundesverfassungsgericht (1 BvR 2226/94) (Judgement) [1999] BVerfGE 100, 313 (366); Bun-
desverfassungsgericht (1 BvR 518/02) (Decision) [2006] BVerfGE 115, 320 (343-344); Bundesverfas-
sungsgericht (1 BvR 256, 263, 586/08) (Judgement) [2009] BVerfGE 125, 260 (310); Bundesverfas-
sungsgericht (1 BvR 1299/05) (Decision) [2012] BVerfGE 130, 151 (184); Bundesverfassungsgericht
(1 BvR 142/15) (Decision) [2018] BVerfGE 150, 244 (265–266).
11 Bundesverfassungsgericht (1 BvR 1254/05) (Judgement) [2008] BVerfGE 120, 378 (400–401); Bun-
desverfassungsgericht (1 BvR 142/15) (Decision) [2018] BVerfGE 150, 244 (266).
12 H Surden, ´Machine Learning and Law´ (2014) 89 Washington L Rev 87 (88–90) (hereafter Surden,
´Machine Learning´); W Hoffmann-Riem, ´Verhaltenssteuerung durch Algorithmen – Eine Herausfor-
derung für das Recht´ (2017) 142 AöR 3 (hereafter Hoffmann-Riem, ´Verhaltenssteuerung´); W Hoff-
mann-Riem, ´Artificial Intelligence as a Challenge for Law and Regulation´ in T Wischmeyer/T Rade-
macher (eds), Regulating Artificial Intelligence (2020) 3 (hereafter Hoffmann-Riem, ´Artificial Intelli-
gence´).
13 Surden, ´Machine Learning´ (n 11) 93.
Importantly, while the algorithms may be transparent to their designers, 14 after the
system has cycled through hundreds, thousands, or even millions of recursive, self-
programming patterns, even the system programmers will no longer know which type
of data was processed in which way, which inferences were drawn from which data
correlations, and how certain data have been weighted. 15
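The following sketch is purely illustrative (it assumes Python with NumPy and scikit-learn and uses synthetic data): even with full access to every trained parameter of a learning system, nothing in those parameters states which data record was processed in which way, which correlations were drawn, or how particular data were weighted.

```python
# Illustrative sketch only: synthetic "personal data" and an assumed scikit-learn setup.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                 # 1,000 hypothetical records, 20 features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # some latent decision rule

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X, y)                                 # thousands of self-adjusting weight updates

# Every learned parameter is fully inspectable ...
n_params = sum(w.size for w in model.coefs_) + sum(b.size for b in model.intercepts_)
print(f"{n_params} trained parameters available for inspection")

# ... but the parameters themselves do not state which record, feature, or
# correlation drove a given prediction; such attributions must be reconstructed
# afterwards with separate, and only approximate, explanation techniques.
print("prediction for the first record:", model.predict(X[:1]))
```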
The self-adaptive “behavior” of at least certain types of AI technologies leads to a
lack of transparency. This phenomenon is often referred to as the black box issue of
AI technologies. 16 Why is this a problem for the traditional approach to evaluating
data protection?
If we wish to follow the analytical approach and base our judgement on which individual personal data have been used, how many times they have been processed, and with what types of other data they have been cross-analyzed, 17 AI’s lack of transparency seems to rule this out. Thus, AI creates problems for the traditional
understanding and treatment of the right to data protection due to its lack of transpar-
ency. 18 These issues are mirrored in the transparency requirements of the General
Data Protection Regulation, which rests very much on the traditional conception of
the fundamental right to data protection. 19
destructive; the second part resolves the tension created by a proposal that is doctri-
nally mundane but shifts the perspective on data protection rights substantially.
Among other advantages, the proposed shift in perspective could render the right to
data protection more suitable for handling issues arising from AI.
The first part is a no-right-thesis. It contends that there is no fundamental right to
data protection. That is, the right to data protection is not a right of its own standing.
This explains why the ongoing quest for a viable candidate as the proper object of
the right to data protection has been futile. 21 Art. 8 CFR, which seems to guarantee
the right to data protection as an independent fundamental right, rests on the misun-
derstanding that the fundamental rights developments in various jurisdictions,
namely also in the jurisdiction of the German Federal Constitutional Court, have cre-
ated a new, substantive fundamental right with personal data as its object. There is
no such new, substantive fundamental right. This, however, does not mean that
there is no fundamental rights protection against the collection, storage, processing,
and dissemination of personal data. Yet data protection does not take the form of a
new fundamental right – property-like or otherwise.
The second part of the thesis reconstructs the “right” by shifting the focus to already
existing fundamental rights. Data protection is provided by all of the existing funda-
mental rights, which can all be affected by the collection, storage, processing, and
dissemination of personal data. 22 In his instructive article “Nothing to Hide,” Daniel
Solove developed a whole taxonomy of possible harms that can be caused by data
collection. 23 They include the loss of life and liberty, infringements on property inter-
ests and the freedom of expression, violations of privacy, and denials of due process
guarantees. It is easy to see how the dissemination of personal finance information
can lead to the loss of property. He cites tragic cases, which have even led to a loss
of life, such as when a stalker was handed the address of his victim by public author-
ities ‒ data he used to locate and kill her. 24 Solove’s list suggests that the essence of
data protection cannot be pinned down to merely a single liberty or equality interest
but instead potentially involves every fundamental right. Understood correctly, the
right to data protection consists in the protection that all fundamental rights afford to
all the liberty and equality interests that might be affected by the collection, storage,
processing, and dissemination of personal data.
The way in which fundamental rights protect against the misuse of personal data re-
lies on doctrinally expanding the concept of rights infringement. Fundamental rights
usually protect against actual infringements. For example, the state encroaches
upon your right of personal freedom if you are incarcerated, your right to freedom of
assembly is infringed when your meeting is prohibited or dispersed by the police,
and your freedom of expression is violated when you are prevented from expressing
your political views. Usually, however, fundamental rights do not protect against the
purely abstract danger that the police might incarcerate you, might disperse your as-
sembly, or might censor your views. You cannot go to the courts claiming that certain
police behavioral patterns increase the danger that they might violate your right to
assembly. The courts would generally say that you have to wait until they either al-
ready do so or are in the concrete process of doing so. In some cases, your funda-
mental rights might already protect you if there is a concrete danger that such in-
fringements are about to take place, so that you do not have to suffer the infringe-
ment in the first place if it were to violate your rights. 25 These cases, however, are
exceptions.
The right to data protection works differently. What is unique about data protection is
its generally preemptive character. It already protects against the abstract dangers
involved in the collection, storage, and processing of personal data. 26 Data protec-
tion preemptively protects against violations of liberty or equality interests that are
potentially connected to using personal data. 27 The collection, aggregation, and pro-
cessing of data as such does no harm. 28 This has often been expressed in
25 See Bundesverfassungsgericht (2 BvR 1060/78) (Decision) [1979] BVerfGE 51, 324, in which the
Court saw it as an infringement of the right to physical integrity to proceed with a criminal trial if the
defendant runs the risk of suffering a heart attack during the trial; cf. also Bundesverfassungsgericht
(1 BvR 542/62) (Decision) [1963] BVerfGE 17, 108 (high-risk medical procedure – lumbar puncture –
with the aim of determining criminal accountability for a misdemeanor); Bundesverfassungsgericht (1
BvR 614/79) (Decision) [1979] BVerfGE 52, 214 (220) (eviction of a suicidal tenant) and R Poscher,
Grundrechte als Abwehrrechte (2003) 388–390 (hereafter Poscher, ‘Abwehrrechte’).
26 Cf. Marsch, Datenschutzgrundrecht (n 20) 109, with a focus on the internal peace of mind of decid-
ing on one’s exercise of fundamental rights.
27 E.g., the collection of comprehensive data in the course of a nationwide census is not in itself an imminent threat, but it is dangerous because of the potential (mis-)use of the mass of gathered data, cf. Bundesverfassungsgericht (1 BvR 209/83) (Judgement) [1983] BVerfGE 65, 1; the col-
lection of data for an anti-terrorism or anti-Nazi database is problematic because of potential negative
impacts for those mentioned in it, cf. Bundesverfassungsgericht (1 BvR 1215/07) (Judgement) [2013]
BVerfGE 133, 277 (331-332).
28 Albers, ´Complexity´ (n 1) 225.
conjunction with the idea that data needs to become information in certain contexts
before it gains relevance. 29 It is only the use of data in certain contexts that might in-
volve a violation of liberty or equality interests. The collection of personal data on po-
litical or religious convictions of citizens by the state is generally prohibited, for exam-
ple, because of the potential that it could be misused to discriminate against political
or religious groups. Data protection demands a justification for the collection of per-
sonal data, even if such misuse is only an abstract danger. 30 It does not require con-
crete evidence that such misuse took place, or even that such misuse is about to
take place. The right to data protection systematically enhances every other funda-
mental right already in place to protect against the abstract dangers that accompany
collecting and processing personal data. 31
A closer look at the court practice regarding the right to data protection reveals that,
despite appearances, courts do not treat the right to data protection as a right of
its own standing but instead associate it with different fundamental rights, depending
on the context and the interest affected. 32 Even at the birth hour of the right to data
protection in Germany, in the famous “Volkszählungs-Urteil” (census decision), the
examples the court gave to underline the necessity for a new fundamental right to
“informational self-determination” included a panoply of fundamental rights, such as
the right to assembly. 33 In an unusual process of constitutional migration, the court
pointed to the “chilling effects” the collection of data on assembly participation could
have for bearers of that right, 34 as they were first discussed by the U.S. Supreme
29 M Albers, ´Zur Neukonzeption des grundrechtlichen „Daten“schutzes´ in A Haratsch et al (eds), Her-
ausforderungen an das Recht der Informationsgesellschaft (1996) 121–23, 131–33; Albers, ´Informa-
tion´ (n 5) 75; M Albers, Informationelle Selbstbestimmung (2005) 87–148; M Albers, ´Umgang mit
Personenbezogenen Informationen und Daten´ in W Hoffmann-Riem, E Schmidt-Aßmann, A Voß-
kuhle (eds) Grundlagen des Verwaltungsrechts (2nd edn 2012) 7–28; G Britz, ´Informationelle Selbst-
bestimmung zwischen rechtswissenschaftlicher Grundsatzkritik und Beharren des Bundesverfas-
sungsgerichts´ in W Hoffmann-Riem (ed), Offene Rechtswissenschaft (2010) 566–568 (hereafter Britz
´Informationelle Selbstbestimmung´); Albers, ´Complexity´ (n 1) 222–224.
30 Cf. the examples mentioned in note 27. This preemptive protection against state action is not to be
confused with the duties to protect against unlawful infringements of liberty interests by third parties,
cf. Poscher, ´Abwehrrechte´ (n 23) 380–387 on the duty to protect under the German Basic Law. As
far as such duties to protect are accepted, data protection would also address preemptive dimensions
of these duties.
31 Cf. J Masing, ´Datenschutz – ein unterentwickeltes oder überzogenes Grundrecht?´ (2014) RDV 3
(4); Marsch, Datenschutzgrundrecht (n 20) 109–110; T Rademacher, ´Predictive Policing im Deut-
schen Polizeirecht´ (2017) 142 AöR 366 (402); Marsch ´Artificial Intelligence´ (n 14) 40.
32 Cf. Britz, ´Informationelle Selbstbestimmung´ (n 27) 571, 573, who first characterized the German
right to informational self-determination as an “accessory“ right.
33 Bundesverfassungsgericht (1 BvR 209/83) (Judgement) [1983] BVerfGE 65, 1 (43).
34 Bundesverfassungsgericht (1 BvR 209/83) (Judgement) [1983] BVerfGE 65, 1 (43).
Court. 35 The German Federal Constitutional Court thus drew on an idea developed by the U.S. Supreme Court to create a data protection right that the latter itself never accepted. Be that as it may, even in its constitutional birth certificate, data protection is not only put forth as a right on its own but also associated with various substantive fundamental rights, such as the right to assembly.
Further evidence of the idea that personal data is not the object of a substantive
stand-alone right is provided by the fact that data protection does not seem to stand by
itself, even in a jurisdiction in which it is explicitly guaranteed. Art. 8 CFR explicitly
guarantees a right to data protection. In the jurisprudence of the Court of Justice of
the European Union, however, it is always cited together with another right. 36 The
right to data protection needs another right in order to provide for a substantive inter-
est – usually the right to privacy, 37 but sometimes also other rights, such as free
speech. 38 Thus, even when data protection is codified as an explicit, independent
fundamental right, as it is in the Charter, it is nevertheless regarded as an accessory
to other more substantive fundamental rights. 39 This is odd if the right to data protec-
tion is taken at face value as a substantive right on its own but only natural if taken
as a general enhancement of other fundamental rights.
35 U.S. Supreme Court (Wieman v. Updegraff) [1952] 344 U.S. 183, 195.
36 Court of Justice of the European Union (Schecke and Eifert v. Hesse) (Judgement) [2010] CJEU
Joined Cases C-92/09 and C-93/09, para. 47; Court of Justice of the European Union (Digital Rights
Ireland v. Minister for Communications, Marine and Natural Resources and others and Kärntner
Landesregierung and others v. Seitlinger and others) (Judgement) [2014] CJEU Joined Cases C-
293/12 and C-594/12, para. 53; Court of Justice of the European Union (Maximilian Schrems v. Data
Protection Commissioner) (Judgement) [2015] CJEU Case C-362/14, para. 78; Court of Justice of the
European Union (Data Protection Commissioner v. Facebook Ireland and Maximilian Schrems)
(Judgement) [2020] CJEU Case C-311/18, para 168.
37 Cf. Court of Justice of the European Union (Digital Rights Ireland v. Minister for Communications,
Marine and Natural Resources and others and Kärntner Landesregierung and others v. Seitlinger and
others) (Judgement) [2014] CJEU Joined Cases C-293/12 and C-594/12, para 37; Court of Justice of
the European Union (La Quadrature du Net and others v. Premier ministre and others) (Judgement)
[2020] CJEU Joined Cases C-511/18, C-512/18 and C-520/18, para. 115.
38 Court of Justice of the European Union (Digital Rights Ireland v. Minister for Communications, Ma-
rine and Natural Resources and others and Kärntner Landesregierung and others v. Seitlinger and
others) (Judgement) [2014] CJEU Joined Cases C-293/12 and C-594/12, para. 28; Court of Justice of
the European Union (La Quadrature du Net and others v. Premier ministre and others) (Judgement)
[2020] CJEU Joined Cases C-511/18, C-512/18 and C-520/18, para. 118; Court of Justice of the Eu-
ropean Union (Privacy International v. Secretary of State for Foreign and Commonwealth Affairs and
others) (Judgement) [2020] CJEU Case C-623/17, para. 72.
39 Marsch, Datenschutzgrundrecht (n 20) 132–133.
equality interests they protect, it becomes clear how personal data processing sys-
tems must be evaluated. They have to be evaluated against the background of the
question: to what extent does a certain form of data collection and processing sys-
tem pose an abstract danger for the exercise of what type of fundamental right?
Looking at data collection issues in this way has important implications – also for the
legal evaluation of AI technologies.
Not every abstract risk exceeds the threshold of a fundamental rights infringement. There
are general life risks that are legally moot. In extreme weather, even healthy trees in
the city park carry the abstract risk that they might topple, fall, and cause considera-
ble damage to property or even to life and limb. Courts, however, have consistently
held that this abstract danger does not allow for public security measures or civil
claims to chop down healthy trees. 40 They consider it part of everyday life risks that
we all have to live with if we stroll in public parks or use public paths.
The threshold for everyday life risks holds in the analog world and should hold in the
digital world, too. In our digital society, we have to come to grips with a – probably
dynamic – threshold of everyday digital life risks that do not constitute a fundamental
rights infringement, even though personal data have been stored or processed. On
one of my last visits to my physician, I was asked to sign a form that would allow his
assistants to use my name, which is stored in their digital patient records, in order to
call me from the waiting room when the doctor is ready to see me. The form cited the
proper articles of the, at the time, newly released General Data Protection Regula-
tion of the European Union (Art. 6 par. 1 lit. a, Art. 9 par. 2 lit. a). It is not the case
that there is no risk involved in letting other patients know my name. If the physician
happens to be an oncologist, it might lead to people spreading the rumor that I have
a terminal illness. This might find its way to my employer at a time when my contract
is up for an extension. So, there can indeed be some risk involved. We have, how-
ever, always accepted this risk – also in a purely analog world – as one that comes
with visiting a physician, just as we have accepted the risk of healthy trees being
uprooted by a storm and damaging our houses, cars, or even ourselves. As we have
accepted everyday life risks in the analog world, we have to accept everyday digital
life risks in the digital world.
For AI technologies, this could mean that they can be designed and implemented
such that they remain below the everyday digital life risk threshold. When an AI sys-
tem uses anonymized personal data, there is always a risk that the data will be
deanonymized. If sufficient safeguards against deanonymization are installed in the
system, however, they may lower the risk to such a degree that it does not surpass
the level of our everyday digital life risk. This may be the case if the AI system uses
data aggregation for planning purposes or resource management, which do not
threaten substantive individual rights positions. An example of a non-AI application is
the German Corona-Warn-App, which is designed in such a way as to avoid central-
ized storage of personal data and thus poses almost no risk of abuse.
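Purely as an illustration of such a safeguard (hypothetical data and an assumed suppression threshold; this does not describe the Corona-Warn-App or any particular system), aggregation with small-group suppression shows how planning-level statistics can be released while keeping re-identification risk low:

```python
# Illustrative sketch only: aggregate counts are released, and groups smaller than an
# assumed threshold K are suppressed to limit re-identification risk.
from collections import Counter

K = 5  # assumed suppression threshold

visits = [  # hypothetical records: (district, service used)
    ("district_a", "vaccination"), ("district_a", "vaccination"),
    ("district_a", "vaccination"), ("district_a", "vaccination"),
    ("district_a", "vaccination"), ("district_b", "vaccination"),
    ("district_b", "counselling"),
]

counts = Counter(visits)
released = {group: n for group, n in counts.items() if n >= K}
suppressed = [group for group, n in counts.items() if n < K]

print("released aggregates:", released)        # only groups of at least K individuals
print("suppressed as too small:", suppressed)  # withheld from publication
```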
c) A Systemic Perspective
Third, the alternative approach implies a more systemic perspective on data collec-
tion and data processing measures. It allows us to step back from the idea that each
and every instance of personal data processing constitutes an infringement of a fun-
damental right. If data protection is understood as protection against abstract dan-
gers, then we do not have to look at the individual instances of data processing. In-
stead, we can concentrate on the data processing system and its context in order to
evaluate the abstract danger it poses.
Unlike the traditional approach, focusing on abstract dangers for substantive funda-
mental rights that are connected with AI technologies does not require the full trans-
parency of the AI system. The alternative approach does not require knowledge of
what kind of data is processed exactly how and when. What it needs, however, is a
risk analysis and an evaluation of the risk reduction, management, correction, and
compensation measures attuned to the specific context of use. 41 It requires regula-
tion on how false positives and negatives are managed in the interaction between AI
and human decision makers. At the time of our conference, the New York Times re-
ported on the first AI-based arrest generated by a false positive of facial recognition
software. 42 As discussed in the report, to rely solely on AI-based facial recognition
software for arrests seems unacceptable given the failure rate of such systems. A le-
gal regulation has to counterbalance the risks stemming from AI by forcing the police
to corroborate AI results with additional evidence. A fundamental rights analysis of
the facial recognition software should include an evaluation not only of the technol-
ogy alone but also of the entire socio-technological arrangement in the light of ha-
beas corpus rights and the abstract dangers for the right to personal liberty that
come with it. The actual cases, however, are not about some formal right to data pro-
tection but about substantive rights, such as the right to liberty or, the right against
racial discrimination and the dangers AI technologies pose for these rights.
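A corroboration requirement of the kind just described could, purely as an illustrative sketch (hypothetical field names and threshold, not an existing legal or technical standard), be expressed as a rule under which the AI match counts only as an investigative lead and an arrest additionally requires evidence independent of the system:

```python
# Illustrative sketch only: an AI match alone never authorizes an arrest.
from dataclasses import dataclass, field

@dataclass
class Lead:
    match_confidence: float                                      # recognition score, 0 to 1
    corroborating_evidence: list = field(default_factory=list)   # e.g. verified witness ID

def arrest_permissible(lead: Lead, min_confidence: float = 0.99) -> bool:
    # Rule 1: the AI output is treated only as an investigative lead.
    if lead.match_confidence < min_confidence:
        return False
    # Rule 2: at least one piece of evidence independent of the AI system is required.
    return len(lead.corroborating_evidence) >= 1

print(arrest_permissible(Lead(0.997)))                                  # False: AI match alone
print(arrest_permissible(Lead(0.997, ["verified witness statement"])))  # True
```

Under such a rule, the fundamental rights evaluation attaches to the whole arrangement of system output and human corroboration, not to any single act of data processing.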
For AI technologies, the differences between the traditional approach and the suggested approach to the right to data protection resemble differences in how the systems themselves are scientifically studied and described. Whereas the study and description of computational systems has traditionally been dominated by computer science, there is a developing trend to approach AI systems – especially because of their lack of informational transparency – with a more holistic, interdisciplinary methodology. AI systems are studied in their deployment context with behavioral methodologies, which are focused not so much on the
41 Cf. Albers, ´Complexity´ (n 1) 232, who draws a parallel to risk management in environmental law.
42 Kashmir Hill, Wrongfully Accused by an Algorithm, New York Times, June 24, 2020 (nytimes.com/2020/06/24/technology/facial-recognition-arrest.html).
inner informational workings of the systems but on their output and their effects in a
concrete environment. 43 The traditional approach tends toward a more technical, in-
formational analysis of AI systems, which is significantly hampered by the black box
phenomenon. The shift to the substantive rights perspective would lean toward a
more behavioral approach to AI. The law would not have to delve into the computational intricacies of when, how, and what type of personal data is processed. It could take a step back and assess how an AI system “behaves” in the concrete socio-technological setting in which it is employed and what type of risks it generates for which substantive fundamental rights.
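An output-oriented, behavioral evaluation of this kind might, purely as an illustrative sketch (synthetic data, hypothetical group labels, and a random stand-in for the opaque system), compare false-positive rates across groups without any access to the system’s inner workings:

```python
# Illustrative sketch only: the system is treated as a black box and judged by its
# outputs, here the false-positive rate per group on probes known not to match.
import random

random.seed(0)

def black_box_match(probe_id: int) -> bool:
    """Stand-in for an opaque recognition system; its internals are never inspected."""
    return random.random() < (0.02 if probe_id % 2 == 0 else 0.08)

# ground truth for this audit: none of these probes is actually on the watchlist
probes = [(i, "group_a" if i % 2 == 0 else "group_b") for i in range(10_000)]

false_positives = {"group_a": 0, "group_b": 0}
totals = {"group_a": 0, "group_b": 0}
for probe_id, group in probes:
    totals[group] += 1
    if black_box_match(probe_id):
        false_positives[group] += 1

for group in totals:
    rate = false_positives[group] / totals[group]
    print(f"{group}: false-positive rate {rate:.2%}")
```

Such an audit says nothing about how the system computes its matches; it only records how the system “behaves” in a given deployment and for whom its errors accumulate.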
5. Resumé
From a doctrinal, fundamental rights perspective, AI could have a negative and a
positive implication. The negative implication pertains to the traditional conceptual-
ization of data protection as an independent fundamental right on its own. The tradi-
tional formal model, which focuses on each and every processing of personal data
as a fundamental rights infringement, could be on a collision course with AI’s techno-
logical development. AI systems do not provide the kind of transparency that would
be necessary to stay true to the traditional approach. The positive implication per-
tains to the alternative model I have been suggesting for some time. The difficulties
AI may pose for the traditional conceptualization of the right to data protection could
generate some wind beneath the wings of the alternative conception, which seems
better equipped to handle AI’s black box challenge with its more systemic and be-
havioral approach. The alternative model might seem quite revisionary, but it holds
the promise of redirecting data protection towards the substantive fundamental rights
issues at stake – also, but not only, with respect to AI technologies.
43 An overview of this emerging field is provided in I Rahwan et al, ´Machine behaviour´ (2019) 568 Nature 477
(481–482).