Act which prohibits the processing of inaccurate data (and hence also non-essential data within the meaning of section 31(1)(2) of the Federal Data Protection Act – see Overbeck, 2016). In the longer term there will be a need for a body of law designed to ensure the quality of stored data. A normative mooring for such a legal regime already exists today in the principle of data accuracy enunciated in Article 5(1)(d) GDPR (see Pötters, 2018, on Article 5 GDPR, point 24). The contours of this area of the law and of the obligations that controllers have to fulfil with regard to the accuracy of the data they process, however, have scarcely been developed at all to date (Hoeren, 2016). One legally simple way of overcoming this problem certainly lies in the rights of data subjects to information and rectification (first sentence of Article 16 GDPR; for more details see Domurath and Neubeck, 2018). In this respect, however, data privacy law suffers from a considerable mobilisation deficit (Härting, 2015; Spindler, Thorun and Wittmann, 2017).

3.2 Scientific basis of scoring processes

Section 31(1)(2) of the Federal Data Protection Act prescribes that scoring processes must meet certain scientific standards (see section B.IV.1 above). With this provision, the legislature excludes at least the use for scoring purposes of data that cannot contribute anything to the predictive performance of a scoring process (Domurath and Neubeck, 2018). Where it is not even possible to demonstrate a correlation between a particular type of data and the event whose probability is to be predicted, the use of that type of data would be contrary to section 31(1)(2) of the Federal Data Protection Act.

Attempts are sometimes made to frame requirements for the instrumentality of the data that are used which go beyond proof of correlation. Formulating these requirements in such a way that they are usable in the practical application of the law has proved to be a difficult undertaking. This applies, for example, to the case that is sometimes made for the restriction of usable data to those that are “contractually relevant” (Domurath and Neubeck, 2018, cite examples). The types of data in question are those that influence the probability of the target behaviour in a particularly direct way (see also Buchner, 2018, on section 31 of the Federal Data Protection Act, point 8, who states that such a provision would require users “to demonstrate plausibly and verifiably that the data used to calculate the score are of direct relevance”). It remains unclear how the catalogue of these data types could ever be reliably defined.

The ‘correlation’ requirement laid down in section 31(1)(2) of the Federal Data Protection Act implies that those who undertake predictive scoring processes must never rely, when designing the process, on statistically unverified routine plausibility checks. In this respect, scoring needs “empirical reinforcement”. This requirement is far from self-evident, for there is no general obligation on those who enjoy the fundamental right of freedom of expression enshrined in the first sentence of Article 5(1) of the Basic Law to confine themselves to rationally justified utterances, not even when they are communicating alleged facts. Seen in that light, the rationality requirement in section 31(1)(2) of the Federal Data Protection Act already looks like a thoroughly significant legislative intervention, although, given the social significance of communicated probability scores, a plausible justification can be found for it.

That a process is scientific within the meaning of section 31(1)(2) of the Federal Data Protection Act is not guaranteed solely by the fact that its predictive performance is delivered with a level of reliability appropriate to the relevant area of people’s lives. The fact is that any process which delivers a better predictive performance than the toss of a coin can be the result of proficient application of statistical methods and, as such, constitute a significant and praiseworthy scientific achievement. But it does not answer the question whether the procedure should or should not be applied in a particular area of people’s lives. Specific quality criteria are not associated with the obligation to follow a scientific procedure. In this respect the legal regime covering predictive scoring has a regulatory void, which becomes particularly striking when contrasted with something like the law governing the capital adequacy of credit institutions, which was outlined above (see subsection E.I.3.4). This does not mean that section 31(1)(2) of the Federal Data Protection Act is a toothless tiger, but it does have biting inhibitions.
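To make the requirement of “empirical reinforcement” more tangible, the following sketch shows how a scorer might verify statistically – rather than merely assume – that a candidate attribute is associated with the event to be predicted, and that the process as a whole performs better than the toss of a coin. It is a minimal illustration on synthetic data; the attribute names, tests and figures are assumptions of the sketch, not requirements derived from section 31 of the Federal Data Protection Act.

# Sketch (synthetic data): checking a candidate attribute for association with
# the predicted event, and checking overall predictive performance against a
# coin toss. Attribute names and figures are purely illustrative.
import numpy as np
import pandas as pd
from scipy.stats import pointbiserialr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
past_defaults = rng.poisson(0.3, n)      # genuinely predictive attribute
shoe_size = rng.normal(42, 2, n)         # irrelevant attribute (pure noise)
p_default = 1 / (1 + np.exp(-(-2.0 + 1.2 * past_defaults)))
default = rng.binomial(1, p_default)
df = pd.DataFrame({"past_defaults": past_defaults,
                   "shoe_size": shoe_size,
                   "default": default})

# Step 1: is each attribute demonstrably correlated with the target event?
for col in ("past_defaults", "shoe_size"):
    r, p = pointbiserialr(df["default"], df[col])
    print(f"{col}: point-biserial r={r:+.3f}, p={p:.3g}")

# Step 2: does the process as a whole beat a coin toss (AUC of 0.5)?
X_train, X_test, y_train, y_test = train_test_split(
    df[["past_defaults", "shoe_size"]], df["default"], random_state=0)
model = LogisticRegression().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.3f}")

In such a sketch, the irrelevant attribute fails the association test and would, under the reading of section 31(1)(2) set out above, have to be excluded from the scoring process, while a better-than-chance AUC alone would not settle whether the process should be used in a given area of life.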
4. Guaranteeing transparency and comprehensibility

The General Data Protection Regulation explicitly anointed transparency as a principle to which all processing of personal data must adhere. The third principle set out in Article 5(1)(a) GDPR is that personal data must be “processed in a transparent manner in relation to the data subject”. This principle of transparency is developed programmatically in recitals 39, 58 and 60 of the GDPR. The circuitous wording of the cited sources must not obscure the fact that the level of abstraction of the transparency principle is still considerable. Which precise duties are actually incumbent on the controller in respect of each specific data processing operation remains uncertain (see above before section E.I.1 and, for example, Roßnagel, 2018, Wachter, Mittelstadt and Floridi, 2017, and Selbst and Powles, 2017). The catalogue of obligations is fleshed out somewhat in Articles 12 to 15 GDPR.

Article 12 GDPR
Transparent information, communication and modalities for the exercise of the rights of the data subject

The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. (…)

Article 13 GDPR
Information to be provided where personal data are collected from the data subject (Article 14 is similar: Information to be provided where personal data have not been obtained from the data subject)

(…) In addition to the information referred to in Paragraph 1, the controller shall, at the time when personal data are obtained, provide the data subject with the following further information necessary to ensure fair and transparent processing:
(…)
the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
Article 15 GDPR
Right of access by the data subject

The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information:
(…)
the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

The provisions prescribe the fulfilment of extensive information obligations to the data subject (WP 29, 2018) and give the latter far-reaching rights of access to information, which are also rooted in fundamental rights (second sentence of Article 8(2) of the Charter of Fundamental Rights of the European Union). But these provisions likewise leave considerable latitude for the application of the law. There are two reasons for this.

First of all, interests that conflict with the principle of transparency have also been recognised and must therefore be taken into account in the interpretation of the neutrally framed terminology of the transparency regime. Recital 63 makes this clear, stating that “A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing. (…) That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject.” It is recognisable that a regulation problem has been identified here but not resolved. The General Data Protection Regulation is unable to establish consensus on the issue of how the interest of safeguarding trade secrecy and that of access to information are balanced in current data privacy law.

Secondly, the General Data Protection Regulation, in what are key provisions in terms of scoring transparency, defines the catalogue of obligations incumbent on the controller in a conspicuously uninformative manner. Article 13(2)(f), Article 14(2)(g) and Article 15(1)(h) GDPR each require the provision of information about “the logic involved” (la logique sous-jacente; die involvierte Logik) in automated decision-making within the meaning of Article 22 GDPR. It might be supposed that, in the disciplines in which algorithms feature, the term ‘logic’ related to an algorithm as described from a particular perspective and that the legislature had made reference to this non-legal term with a view to preparing it for reception by the legal community (examples of such processes are described in Klement, 2006, and Mathis, 2017). This supposition is wide of the mark. Mathematicians, computer scientists and software engineers have a no less vague notion than legal scholars as to what the “logic involved” in automated decision-making might be.

The lively debate (see section B.I.3 above) on the disclosure of the attributes used as input variables in Schufa credit scores and their weighting is indicative of the lack of normative guidance provided by the transparency regime of the General Data Protection Regulation. If we assume that the calculation of a Schufa score amounts to decision-making within the meaning of Article 22 GDPR, it is still a moot point which items of information on the genesis of a score are covered by the description “the logic involved” (evidence of views on the scope of the provisions can be found in Wischmeyer, 2018; for a more restrictive interpretation, see, for example, Paul and Hennemann, 2018, on Article 13 GDPR, point 31; for a broader interpretation, see, for instance, Bäcker, 2018, on Article 13 GDPR, point 54). It is sometimes assumed, by explicit reference to the Schufa judgment of the Federal Court of Justice, that the obligation to give access to information goes further than the boundaries set by the current legal position. As Florian Schmidt-Wudy writes, “With regard to the scope of the information on the ‘logic involved’, it remains to be seen whether the non-disclosure, approved by the Federal Court of Justice, of the scoring formula will remain tenable, for without knowledge of the scoring formula, it is scarcely possible for the
data subject to discover and correct errors in the score (…). On the other hand, unrestricted disclosure of the score may jeopardise the business model of credit reference agencies (…).

Because of the analogous application of Article 15(4) GDPR, however, and the balance it prescribes with fundamental rights and freedoms, strict secrecy of scoring formulae as approved by the Federal Court of Justice will not be maintainable if knowledge of them is essential for a data subject to be able to identify flawed calculations and have them corrected. On the contrary, it will depend on the individual case, which means that in certain cases both the scoring formula and its underlying parameters may certainly be the subject of a disclosure.” (Schmidt-Wudy, 2018, on Article 15 GDPR, point 78.3). The cautious way in which the commentator expresses his interpretation of the law is illustrative of the strikingly weak normative guidance provided by Articles 13 to 15 GDPR (but see Heuzeroth and Seibel, 2018). The present legal position is still lagging behind the normative guidance provided by section 34 of the Federal Data Protection Act (old version), on the basis of which the Federal Court of Justice outlined the information access claim against Schufa – and that provision itself is far from unequivocal.

In the light of the above, it is no surprise that the scope of transparency requirements arising from the General Data Protection Regulation is a subject of controversy. The crystallisation point in the debate is the question whether the GDPR grants the data subject a ‘right of explanation’ of an automated individual decision. The object of this discussion, conducted on an international scale, is to build a bridge between, on the one hand, the transparency requirements of the General Data Protection Regulation and, on the other hand, the lively discussion on ways of making complex algorithmic decision-making systems comprehensible to people (see section B.I.4 above as well as Gesellschaft für Informatik, 2018, Selbst and Powles, 2017, Selbst and Barocas, 2018, and Wachter, Mittelstadt and Floridi, 2017).

It is certainly unmistakable that, in its transparency requirements, the General Data Protection Regulation follows on from its forerunner in EU law, the Data Protection Directive. This suggests a very cautious interpretation of the transparency requirements set out in Articles 13 to 15 GDPR (Wischmeyer, 2018). The information to be disclosed under these provisions would then be kept very general and would be confined to a merely superficial presentation of the program functions. On the other hand, this cautious circumscription of the transparency requirements in data privacy law may reflect the fact that the question how it is possible in practice to establish transparency (see section B.I.4 above) is still under discussion. At the heart of the transparency debate at the present time is not legal permissibility but technical feasibility (see Selbst and Barocas, 2018, Burrell, 2016, and Lipton, 2016). The technical-sounding but substantively vague description of the transparency entitlement, with terms like “the logic involved”, “significance” and “envisaged consequences”, may therefore prove to be especially receptive to future developments in legal scholarship.
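What a disclosure of “the logic involved” could look like in practice is left open by the Regulation. Purely as an illustration – and without suggesting that this is what Articles 13 to 15 GDPR require, or how any real credit reference agency operates – the following sketch attaches to a deliberately simple, hypothetical logistic-regression score a per-person breakdown of how each assumed input attribute raised or lowered the result. Such a breakdown conveys the effect of the weighting in the individual case without handing over the complete scoring formula; whether that would satisfy the provisions is precisely the open interpretive question discussed above.

# Sketch: a hypothetical explanation of "the logic involved" for one person.
# Attribute names, weights and the model form are assumptions of this sketch.
import numpy as np

FEATURES = ("years_at_address", "number_of_current_loans", "past_payment_defaults")
WEIGHTS = np.array([0.04, -0.15, -0.85])   # assumed trained coefficients
INTERCEPT = 1.2                            # assumed trained intercept

def score_with_explanation(person: dict) -> dict:
    """Return the predicted repayment probability together with the
    contribution of each input attribute to the underlying logit."""
    x = np.array([person[f] for f in FEATURES])
    contributions = WEIGHTS * x
    logit = INTERCEPT + contributions.sum()
    probability = 1.0 / (1.0 + np.exp(-logit))
    ranked = sorted(zip(FEATURES, contributions), key=lambda t: t[1])
    return {
        "probability_of_repayment": round(float(probability), 3),
        "contribution_per_attribute": {f: round(float(c), 3)
                                       for f, c in zip(FEATURES, contributions)},
        "attributes_lowering_the_score": [f for f, c in ranked if c < 0],
    }

print(score_with_explanation({"years_at_address": 6,
                              "number_of_current_loans": 2,
                              "past_payment_defaults": 1}))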
5. Guaranteeing non-discrimination

Section 1 of the General Equal Treatment Act
Purpose

The purpose of this Act is to prevent or to stop discrimination on the grounds of race or ethnic origin, gender, religion or belief, disability, age or sexual orientation.

Section 3 of the General Equal Treatment Act
Definitions

(1) Direct discrimination shall be taken to occur where one person is treated less favourably than another is, has been or would be treated in a comparable situation on any of the grounds referred to under Section 1. (…)

(2) Indirect discrimination shall be taken to occur where an apparently neutral provision, criterion or practice would put persons at a particular disadvantage compared with other persons on any of the grounds referred to under Section 1, unless that provision, criterion or practice is objectively justified by a legitimate aim and the means of achieving that aim are appropriate and necessary.

5.1 Discriminatory acts and discriminatory effect

It is difficult for current anti-discrimination law to accommodate the problem of discriminatory scoring in its conceptual framework (see chapter B.II above), because it typically checks whether the reasons that people or institutions give for their actions are legitimate from an anti-discrimination perspective. Whether a reason for an action is objectionable on grounds of incompatibility with anti-discrimination law may be ascertained in the following two steps:

In the first step, the question to be asked is whether the motive for the behaviour being tested for conformity with the law requires attention in the light of anti-discrimination law. This may be so because one of the grounds listed in section 1 of the General Equal Treatment Act was a determinant factor for the behaviour in question. To take an example, someone refuses to conclude a contract on grounds of the other party’s ethnic origin (see section 3(1) of the General Equal Treatment Act). Closer scrutiny is also called for, however, in the case of modes of behaviour with seemingly innocuous motives if those motives are particularly detrimental to any persons on account of one of the grounds listed in section 1 of the General Equal Treatment Act. For example, someone refuses to conclude a contract because of the other party’s insufficient knowledge of the German language (see section 3(2) of the General Equal Treatment Act). The second step involves an examination of whether reliance on the suspect ground is justified in the given situation. At the end of this examination, it will have been established whether or not prohibited discrimination has taken place. To discriminate unlawfully, then, means to act on prohibited grounds (for a detailed treatment, see Schramm, 2013). Anti-discrimination law is ‘input-focused’. Its attention is fixed on the interaction of certain decision-making criteria and their admissibility. In the realm of scoring, this method of applying the law may have unwanted results. For instance, a seller declines to do a deal with a prospective buyer because of the latter’s low score. In so doing, the seller is not acting on the basis of a protected characteristic but simply of a score. This ground for refusal does not alter the fact that the sex of the prospective buyer, for instance, played a significant role in the calculation of the score. It could be argued, on the basis of that fact, that this is a case of unequal treatment requiring attention in the light of anti-discrimination law (Moos and Rothkegel, 2016, advance this argument; see also section C.III.5 above). The seller, of course, does not refuse to enter into a contract because of the other party’s sex but because of the inadequate score. Although attempts can be made to bring such cases into the ambit of anti-discrimination law by means of rules on indirect discrimination, that will not resolve the difficulties.
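The tension described here can be made visible with a small simulation. In the assumed scenario below, the score is calculated without the protected attribute ever appearing among its inputs; because one innocuous-looking input is correlated with that attribute, the acceptance rates of the two groups nevertheless diverge. The variables and numbers are invented for illustration only.

# Sketch (synthetic data): a score that never uses the protected attribute can
# still disadvantage one group through a correlated, seemingly neutral input.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
group = rng.binomial(1, 0.5, n)                           # protected attribute, unknown to the model
proxy = rng.normal(loc=1.0 * group, scale=1.0, size=n)    # assumed input correlated with group
income = rng.normal(loc=3.0, scale=1.0, size=n)           # assumed neutral input

# The 'scoring formula' sees only proxy and income, never the group itself.
score = 0.6 * income - 0.8 * proxy + rng.normal(0.0, 0.3, n)
accepted = score > np.quantile(score, 0.4)                # the best-scoring 60 % get the contract

for g in (0, 1):
    print(f"group {g}: acceptance rate {accepted[group == g].mean():.1%}")
# Typical output: a markedly lower acceptance rate for group 1, although the
# decision rule refers only to the score.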
Many individual variables, possibly even inestimably many, go into the calculation of a score. An audit that examined every one of the input variables and assessed its admissibility under anti-discrimination law would be practically unmanageable. Moreover, such an audit would run the risk of discovering mere spurious correlations and highlighting them as requiring justification, even though their occurrence is virtually inevitable in sufficiently large volumes of data.

For complex scoring processes, the ‘input-focused’ analysis of compatibility with anti-discrimination law must be supplemented by an ‘impact-focused’ analysis. In other words, attention should not be fixed solely on the decision-making criteria but also on the effects that decisions have. The European legislature displayed a delicate linguistic touch when providing for the possibility that motive-related anti-discrimination law would reach its limits when confronted with complex data-processing operations, and hence with scoring. In recital 71 of the General Data Protection Regulation, it does not speak of “discrimination” arising from data-processing operations but of “discriminatory effects”.

An impact-focused analysis can establish its legal bearings by reference to the fact that section 3(2) of the General Equal Treatment Act refers not only to “an apparently neutral provision” or “criterion” but also to a “practice” (Verfahren). By means of the term “practice”, anti-discrimination law releases itself from confinement to the scrutiny of individual motives. It opens the door to analyses which can identify even complex and scarcely penetrable “practices” (Block, 2018, on section 3 of the General Equal Treatment Act, point 69) as problematic in terms of anti-discrimination law. As far as the question of the reference point is concerned, neither national anti-discrimination law nor the underlying European directives stipulate that individual criteria must give rise to disadvantages; they even permit a general overview of several provisions or entire processes (Schiek, 2007, on section 3 of the General Equal Treatment Act, point 33). Starting points for an anti-discrimination regime that does not focus primarily on motives for decisions but on the results of systems that operate in incomprehensible ways are also to be found in the case law of the Court of Justice of the European Union. German anti-discrimination law is shaped to a great extent by this case law, which has found that pay structures may be judged discriminatory on grounds of gender inequality even without the need to isolate individual discriminatory factors (ECJ judgment of 27 October 1993 in case No C-127/92 – Enderby [EU:C:1993:859]).

An ‘impact-focused’ consideration of scoring practices relates not only to the individual variables that go into the calculation of the score but also to the effects of the scoring process. The scoring process itself is the “practice” (Verfahren) within the meaning of section 3(2) of the General Equal Treatment Act that must be guaranteed compatible with anti-discrimination law (Hacker, 2018).

5.2 Challenges posed by impact-focused protection against discrimination

The difficulties involved in trying to remedy the discriminatory effects of scoring on particular groups of people can be considerable. Still comparatively easy to address are those discriminatory effects that are attributable to flaws in the technical design of the scoring process. If the process produces quality disparities for various groups of people (see section B.II.3 above) and those disparities could have been avoided at no extra cost, the scoring practice is incompatible with anti-discrimination law (Hacker, 2018). Additional costs may also be imposed on the scorer if a greater degree of freedom from discrimination is thereby achievable (Hacker, 2018). However, in cases where the discriminatory effects of a scoring practice also increase its predictive power, an opportunity is provided for the scorer to justify these discriminatory effects (for more details, see Hacker, 2018).

Then there are the difficulties that arise when it comes to proving the discriminatory effects of a scoring practice (Hacker, 2018). A plaintiff who suspects discrimination will not have the comparative data to underpin his assertion (Hacker, op. cit., Hildebrandt, 2015) and demonstrate the discriminatory effects of a scoring practice. And even from a bird’s eye view from which a wide panorama of data sets could be seen, it would still be hard to identify the discriminatory effects of a scoring practice, for apart from age and gender, data on the attributes that are customarily at the root of discrimination are generally unavailable. “Even to collect them would be problematic, because no one may be required to disclose his or her sexual orientation or religion. The establishment of ethnic origin raises a fundamental problem, namely whether there are ‘objective’ factors at all for determining a person’s ethnicity other than his or her nationality.” (Grünberger, 2013, p. 664; cf. also Article 9 GDPR).

A simple enlargement of the material scope of the General Equal Treatment Act to include “automated decision-making practices” would not suffice to deal with the problem of discriminatory scoring. Data privacy law offers potential for a solution. The principle that personal data should be processed “fairly” (de manière loyale; nach Treu und Glauben) enshrined in Article 5(1)(a) GDPR is a normative anchorage in this respect (Hacker, 2018). In the case of remediable discriminatory quality differences in scoring practices, the principle of accuracy (Article 5(1)(d) GDPR) is also affected (Hacker, op. cit.). If these legal principles open data privacy law to the normative objective of protection against discrimination, rights of access to information under Article 15(1)(h) GDPR and data protection impact assessments under Article 35 GDPR will offer ways of addressing the problem of discriminatory scoring (for more details, see Hacker, 2018).
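One way to operationalise an impact-focused check – sketched here purely as an illustration, not as a measure prescribed by the General Equal Treatment Act or the GDPR – is to compare the rates of favourable outcomes that a scoring practice produces for different groups and to flag large disparities for closer scrutiny. The four-fifths threshold used below is borrowed from US disparate-impact practice and serves only as an example; the sketch also presupposes that group labels are available at all, which, as noted above, is precisely what is often missing.

# Sketch: comparing favourable-outcome rates across groups as one possible
# impact-focused check on a scoring practice. Threshold and data are illustrative.
from collections import Counter

def impact_check(outcomes, threshold=0.8):
    """outcomes: iterable of (group_label, favourable_decision) pairs.
    Returns per-group favourable rates, the ratio of the lowest to the
    highest rate, and whether that ratio falls below `threshold`."""
    favourable = Counter(g for g, ok in outcomes if ok)
    total = Counter(g for g, _ in outcomes)
    rates = {g: favourable[g] / total[g] for g in total}
    ratio = min(rates.values()) / max(rates.values())
    return rates, ratio, ratio < threshold

# Toy decision log of a scoring practice (group label, contract concluded?).
decisions = ([("A", True)] * 720 + [("A", False)] * 280
             + [("B", True)] * 450 + [("B", False)] * 550)

rates, ratio, flagged = impact_check(decisions)
print("favourable rates:", rates)
print(f"impact ratio: {ratio:.2f}", "-> closer scrutiny advisable" if flagged else "")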
IV. Supervision
The legal requirements for scoring relate to various aspects of scoring, among the most prominent of which are transparency, quality and non-discrimination. The requirements differ, depending on who is scoring whom, for what purpose and in what way. This wide diversity of material requirements is matched by a wide array of legal implementation mechanisms. Some impression of the diversity of conceivable institutional law-enforcement arrangements can be obtained from data privacy law alone, which plays a key role in the formulation of legal requirements for scoring (see Schantz and Wolff, 2017, pp. 295ff.). Among these are legal remedies for adversely affected individual consumers, scope for class actions on the part of plaintiffs such as consumer advice centres and even rules for business organisations (Spindler, 2011, provides a comprehensive review, dealing with data protection on pp. 270ff.) as well as state supervisory mechanisms ensuring that scoring is conducted in compliance with the law.

State supervision is the key instrument for enforcement of the aforementioned quality requirements for predictive scoring prescribed by section 31 of the Federal Data Protection Act. BaFin, the Federal Financial Supervisory Authority, oversees compliance with the quality requirements governing models for the assessment of credit default risks (see section E.III.3 above). For the bonus programmes of statutory health insurance funds (see section E.II.3 above) there are legal bases that allow particularly close supervision (Ullrich, 2018, on section 65a of Book V of the German Social Code, point 7). It is possible to add to the substantive law the observance of which these bodies oversee and so to extend their respective supervisory missions.

On these grounds alone, there appears to be ample potential for sovereign supervision of scoring, because issues of confidentiality – in the sense of trade secrecy, for example – and of adverse social consequences of transparency (see section B.I.1 above) do not arise in this context, for disclosure to the state supervisory authority does not extend ad infinitum the circle of those who know how the relevant scoring algorithm works. The overseeing state functionaries, for their part, can be sworn to secrecy, and indeed they already are as a rule (see, for example, section 30 and section 29(2) of the Administrative Procedure Act (Verwaltungsverfahrensgesetz)).

Seen in this light, the problem of official supervision is one of adequate staffing and equipping of the relevant supervisory authorities. These must be enabled to conduct even complex audits of compliance with substantive law (see SVRV, 2016; also Gesellschaft für Informatik, 2018).
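What such a “complex audit of compliance with substantive law” might involve in technical terms is not spelled out here. As a hedged sketch – it does not describe any actual procedure of BaFin or of a data protection authority – a supervisory authority given access, under its secrecy obligations, to a scoring model and a hold-out data set could, for instance, check the model’s predictive power and calibration along the following lines.

# Sketch: quality checks an adequately equipped supervisory authority might run
# on a submitted scoring model's hold-out predictions. Purely illustrative; no
# specific supervisory procedure is implied.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

def audit_report(y_true: np.ndarray, predicted_prob: np.ndarray) -> dict:
    """Summarise predictive power (AUC), overall calibration (Brier score)
    and calibration by decile (mean predicted vs. observed event rate)."""
    order = np.argsort(predicted_prob)
    deciles = np.array_split(order, 10)
    return {
        "auc": round(float(roc_auc_score(y_true, predicted_prob)), 3),
        "brier_score": round(float(brier_score_loss(y_true, predicted_prob)), 3),
        "calibration_by_decile": [
            (round(float(predicted_prob[d].mean()), 3),
             round(float(y_true[d].mean()), 3)) for d in deciles
        ],
    }

# Toy hold-out data standing in for what a scorer might be required to submit.
rng = np.random.default_rng(2)
predicted = rng.uniform(0.01, 0.6, 2_000)
observed = rng.binomial(1, predicted)
print(audit_report(observed, predicted))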
Advisory Council for Consumer Affairs

The Advisory Council for Consumer Affairs is an advisory body of the Federal Ministry of Justice and Consumer Protection (BMJV). It was set up in November 2014 by the Federal Minister of Justice and Consumer Protection. The Advisory Council for Consumer Affairs is tasked with using research findings and drawing on the Federal Ministry of Justice and Consumer Protection's practical experience to help shape consumer policy. The Advisory Council is independent and is based in Berlin.