people and discovers patterns – people in age X on average sleep in hours Y-Z;
      stress causes insomnia; events x, y, z cause stress -> based on this, if someone
      is in age X and is not asleep in hours Y-Z and undergoes an event x or y or z, the
      program decides to display the commercial to such a person. Later, it evaluates
      its own choice by seeing how many of these people actually clicked the link. It
      might see that most of the people who clicked were living in cities, or in
      northern countries, etc. It would add this to its database for the future etc. It
      generally learns a lot about people from observing what they do, how they
      respond to incentives etc. In this sense, when the algorithm gets the task ‛display
      the commercial to 1000 people having the highest chance of buying it’ a few
      months later, it can ‛decide’ to show it to a completely different set of people,
      possibly in a more effective way. Even though no human re-programmed it. It did
      re-program itself in a manner foreseen by the programmer, but not in the
      direction foreseen (direction was ‘chosen’ by the software itself). That is why
      ‘autonomous’ is not autonomous in a human sense. It is limited. But it is not
      merely automatic. Whether this is the change of the source code, or just
      modification of a database, depends on particular programming technique and
      our terminology – but at least for legal purposes, this does not matter that much.
      What matters is that the process was not merely ‘automatic’.”
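The learning loop described in the quoted passage can be pictured in a few lines of code. The following Python sketch is purely illustrative: the attribute names, weights and update rule are invented for the purposes of this report and do not describe any real system.

      from collections import defaultdict

      class AdTargetingAgent:
          """Toy self-updating audience selector, loosely following the quoted example."""

          def __init__(self):
              # Learned weights per (attribute, value) pair; the starting heuristics
              # are supplied by the programmer, everything else is learned later.
              self.weights = defaultdict(float)
              self.weights[("awake_at_night", True)] = 1.0
              self.weights[("recent_stress_event", True)] = 1.0

          def score(self, person):
              # Sum the learned weights of a person's observed attributes.
              return sum(self.weights[(k, v)] for k, v in person.items())

          def select_audience(self, people, n=1000):
              # 'Display the commercial to the n people with the highest chance of buying.'
              return sorted(people, key=self.score, reverse=True)[:n]

          def learn_from_clicks(self, shown, clicked):
              # Reinforce attributes shared by those who clicked; the 'direction' of
              # the change is chosen by the data, not by the programmer.
              for person in shown:
                  delta = 0.1 if person in clicked else -0.01
                  for k, v in person.items():
                      self.weights[(k, v)] += delta

After a few such rounds the same instruction ("display the commercial to 1000 people") can select a completely different audience, even though no human has re-programmed the agent.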
The difference is relevant because a software agent can make a choice which is legally not
correct. In the above example the scenario could be regarded as a prohibited type of
aggressive advertising. In a slightly modified form the choice could be regarded as
discrimination based on gender or ethnicity (where higher prices are offered to those who are
prepared to pay more; the potential addressees of advertising are filtered out on the basis of
their political opinion or sexual orientation). Both anti-discrimination law181 and fair trading
law182 establish boundaries for such practices. The fact that systematic breaches of the law
are possible is one key reason why rules on algorithms should be enshrined in law. Such
illegal practices are generally hidden from view, although they are certainly no mirage, as initial empirical studies have already shown.183 If a self-learning algorithm “decides” not to pass information on to a particular country, the potential users are cut off from access. The
software agent operates on the basis of a predefined goal, but the software agent itself
chooses which concrete actions to take to achieve that goal. The way in which the software
agent decides which action to take is pre-programmed, but no one can predict precisely what
the software agent will do.


181
    For instance, section 19 (1) of the General Act on Equal Treatment (Allgemeines
Gleichbehandlungsgesetz, AGG): “Any discrimination on the grounds of race or ethnic origin, sex,
religion, disability, age or sexual orientation shall be illegal when establishing, executing or terminating
civil-law obligations”. As far as EU directives are concerned, see: Directive 2000/43/EC implementing
the principle of equal treatment between persons irrespective of racial or ethnic origin and Directive
2004/113/EC implementing the principle of equal treatment between men and women in the access to
and supply of goods and services, prohibiting discrimination in access to goods and services, based
on race and gender respectively; see also Brownsword, The E-Commerce Directive, Consumer
Transactions, and the Digital Single Market: Questions of Regulatory Fitness, Regulatory
Disconnection and Rule Redirection, lecture given on 18 June 2016 at the SECOLA conference in
Tartu, Estonia.
182
    See Directive 2005/29/EC on Unfair Commercial Practices, in which “to materially distort the
economic behaviour of consumers” is defined as “using a commercial practice to appreciably impair
the consumer’s ability to make an informed decision, thereby causing the consumer to take a
transactional decision that he would not have taken otherwise”, which would include the late-night
advertising example; see Brownsword, The E-Commerce Directive, Consumer Transactions, and the
Digital Single Market: Questions of Regulatory Fitness, Regulatory Disconnection and Rule
Redirection, lecture given on 18 June 2016 at the SECOLA conference in Tartu, Estonia.
183
    Sweeney, “Discrimination in Online Ad Delivery” (2013), Communications of the ACM, Vol. 56 No.
5, p. 44–54; Chander, “The Racist Algorithm?”, (2017), 115 Michigan Law Review, Forthcoming UC
Davis Legal Studies Research Paper No. 498.

As well as this internal perspective, that is, the relationship between the principal and the agent, account also needs to be taken of the network effect. The complexity of self-learning algorithms is based on the interaction between systems, between the algorithms themselves in an interconnected environment. This gives rise to emergent behaviour in which responsibility is
distributed across many actors. In an interconnected city, for example, numerous sensors
and systems interact with driverless cars. It is conceivable that in future systems will be able
to continue developing their own algorithms, in which case algorithms will write algorithms.
That is not to say that these algorithms operate in a legal vacuum; it is only that software agents are not explicitly regulated by law. Under the law as it currently stands,
software agents are regarded as tools applied by humans, by an enterprise or a body
responsible for the algorithm. Whenever a software agent produces discriminatory, unfair or
misleading market practices, the person or body responsible for producing the algorithm will
be held responsible. As things currently stand, the law of general terms and conditions, anti-
discrimination law and, above all, fair trading law play a key role. The only sanction provided
for under German law is a stop-order mechanism, that is, the offending practice is prohibited ex nunc.
At least that is what it looks like on paper. However, since the actions which software agents
take are largely hidden from view, there is little chance of legal breaches being found out. As
a result, enterprises’ willingness to abide by the law necessarily drops too. In the terminology of the economic analysis of law, this means that where the costs of incorporating legal requirements into the algorithm are higher than the expected loss if illegal actions
are found out, enterprises will see little need to comply with legal rules ex ante. This leads to
the call for algorithms, even self-learning ones, to have to comply with applicable law. The
principle of transparency should apply to algorithms and it ought to be possible to check
whether the law is being complied with.
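The deterrence calculus referred to here can be written as a simple expected-value comparison. The figures in the following Python fragment are invented and serve only to illustrate the reasoning.

      # Hypothetical figures, for illustration only.
      cost_of_compliance = 500_000        # cost of building legal requirements into the algorithm
      probability_of_detection = 0.02     # chance that an illegal practice is ever found out
      sanction_if_detected = 1_000_000    # loss (fine, stop order, reputational damage) if it is

      expected_loss = probability_of_detection * sanction_if_detected   # 20,000
      # A purely profit-oriented enterprise complies ex ante only if compliance is cheaper:
      comply_ex_ante = cost_of_compliance < expected_loss               # False in this example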

II.    Big data, information asymmetry and profiling
There are as yet no commonly accepted definitions of either “big data” or “profiling”, and so
various definitions are in use. A simple and pragmatic approach will suffice for our purposes.
“Big data” is here defined as any technology which permits unlimited quantities of data to be
gathered and processed, whereby the data are accessible because users have put them onto
the Internet and sufficient technical capacities are available for evaluating them.184 “Profiling”
is here defined as any technology which permits conclusions to be drawn from existing data
and profiles regarding individual behaviour.185 Article 4(4) of the General Data Protection
Regulation contains a legal definition:
      “‘Profiling’ means any form of automated processing of personal data consisting
      of the use of personal data to evaluate certain personal aspects relating to a
      natural person, in particular to analyse or predict aspects concerning that natural
      person’s performance at work, economic situation, health, personal preferences,
      interests, reliability, behaviour, location or movements.”

1.     The problem
Information asymmetry is not a new problem. The whole of existing consumer law is based
on the idea that asymmetries can be eliminated using government-prescribed information
rules. Despite all the criticism, above all from the behavioural sciences, consumer policy is
sticking to this paradigm, not least because society rightly assumes that legal subjects act
autonomously and under their own responsibility. The need to differentiate between different
consumer models does not change that premise. No-one is calling the normative model of
democratic societies into question.

184
    Hildebrandt/Gaakeer, (eds), Human Law and Computer Law: Comparative Perspectives, (Springer
Verlag, 2013).
185
    Hildebrandt/Gutwirth, (eds), Profiling the European Citizen. Cross-Disciplinary Perspectives,
(Springer Verlag, 2008).

The linking of insights gained in the behavioural sciences with big data and profiling
technologies has given rise to a new kind of information asymmetry. In the analogue world
the business sector knows more about products and markets. Consumer behaviour research
used to be time-consuming and expensive, and its findings were only of limited use. Big data
and profiling give the business world the chance to analyse consumer behaviour in a
targeted manner, to better understand why a particular decision was taken and thus to get to
know consumers better than they know themselves (or even want to know themselves).186
This advantage means targeted advertising campaigns can be used in entirely new ways to
drastically reduce costs and increase the efficiency of the advertising means employed.
      Example
      Imagine a person with the following Google profile: male/female, aged 26–30,
      works from 7 a.m. to 6 p.m., buys medicines online. If that person permits
      localisation on their smartphone, then Google knows when they are at home and
      how long they sleep at night and are motionless. Google can also see when that
      person gets up at 3 a.m. and picks up their mobile phone. A supplier of sleeping
      pills could commission Google to offer its products around that time of night to
      those consumers who have difficulty sleeping and who browse the Internet to
      distract themselves. Such a transaction would not be the result of information
      asymmetry about the product or market, but about the manner in which, the
      conditions under which and the time of day when consumers take a decision. The
      entrepreneur knows what consumers do not know, namely that they behave as
      predicted by Google under the given conditions.
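The kind of inference described in the example can be sketched as follows. The code is a simplified illustration only: in practice the sleep window would itself be learned from weeks of motion and location data, and the field names and thresholds are invented.

      from datetime import datetime

      def restless_night_use(events, sleep_start=23, sleep_end=6):
          # `events` stands in for the timestamped activity data a platform might hold:
          # a list of (timestamp, screen_on) tuples.
          for timestamp, screen_on in events:
              in_sleep_window = timestamp.hour >= sleep_start or timestamp.hour < sleep_end
              if screen_on and in_sleep_window:
                  return True
          return False

      # If the profile matches, the advertiser's campaign is triggered at that very moment.
      events = [(datetime(2016, 11, 28, 3, 4), True)]
      if restless_night_use(events):
          print("schedule sleeping-pill advertisement for this user now")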

2.     Legal classification
Existing protective mechanisms in applicable consumer contract law cover these situations no more than the relatively recent Unfair Commercial Practices Directive does, which
the German legislature has incorporated into the Act against Unfair Competition. As far as
the situation upstream of the conclusion of a contract is concerned, under applicable law it is
already doubtful whether consumers actually have a right to information. The prevailing
opinion is that the prohibition of misleading advertising or of a misleading omission is not
equal to the consumer’s right to objective information about the qualities of a product or
service. In our example the problem does not revolve around information about the product
or service. In any case, stricter requirements are applied in the law of medicines to limit the
scope of advertising much more than in other fields. The suppliers of sleeping pills are only
interested in the consumer’s behaviour. It is specifically that knowledge about the
consumer’s behaviour which opens up new sales methods to them.
The extremely wide definition of advertising applied in the Directive and thus also in German
law permits marketing oriented to consumer behaviour to be subsumed under the
requirements of fair trading law. However, under applicable law consumers would seem to have no right to be informed that the offer to buy sleeping pills at 3 a.m. is based on a sophisticated analysis of their behaviour profile.187 The question also remains of whether, even if they were
aware of that when making their purchase, consumers would behave differently or whether
they would be happy to take up the offer. Consumers may take a different decision if they are
able to find out beforehand what kind of profile Google has of them, knowing that they can
influence their data by deleting or changing data, for instance.
Articles 13 and 15 of the General Data Protection Regulation give consumers rights to
information and rights of access which specifically also cover profiling. In the light of the

186
    Lunn, Regulatory Policy and Behavioural Economics, (Organisation for Economic Co-operation and
Development, 2014), <http://www.oecd-ilibrary.org/content/book/9789264207851-en> (last retrieved
28 Nov. 2016).
187
    Brownsword, The E-Commerce Directive, Consumer Transactions, and the Digital Single Market:
Questions of Regulatory Fitness, Regulatory Disconnection and Rule Redirection, lecture given on 18
June 2016 at the SECOLA conference in Tartu, Estonia.

Federal Data Protection Act, it seems reasonable to interpret the General Data Protection
Regulation such that consumers are at least to be informed about the basic assumptions
applied in the algorithm logic on which the profiling is based.188 EU law trusts in the power
and assertiveness of each individual. Individuals first need to be aware of the problem, then assert their claim and possibly apply to a court. What we need is not just one
Schrems, but many Schrems. It is more than doubtful whether it would be possible to get a
grip on information asymmetry even if bundling were possible. In addition, the General Data
Protection Regulation does not provide for data protection authorities to intervene instead of
the consumer. Since data processing is not linked to binding general requirements, the data
protection authorities entrusted with their implementation also appear to lack the competence
to measure the algorithms applied against a standard benchmark and, where applicable, to
demand that corrections be made.

III.   Potential solutions as regards regulating algorithms and big data
The use of algorithms, and the prospects opened up by self-learning algorithms which update their own source code, raise questions of an altogether different dimension than the micro perspective of digital services does. The issue here is not only one of maintaining the
autonomy of consumers, which was to be the driver behind potential solutions as regards the
law of digital services, but of human dignity in the age of artificial intelligence (AI). The
political challenge is to answer the question of how to ensure that self-learning algorithms
“act” in an ethically responsible manner. Can politics trust in business, in competition, in
independent ethical behaviour on the part of those who are responsible for driving forward
developments when it comes to AI? And, even more difficult, what will happen when AI takes
on a life of its own? How can a self-controlling process be politically, ethically and legally
mainstreamed?
The Advisory Council believes that it is the political realm which is called upon to act. The question
is no longer whether political action is necessary, but what type of action that could be. A
normative component needs to be incorporated into the algorithms. Under the lofty rubric of
“human dignity” and the autonomy of human beings, the issue when it comes to consumer
law would be compliance with the prohibition of discrimination, fair advertising, consumer
data protection law and fair terms and conditions. Once this basic issue has been solved –
and the Advisory Council is convinced that political action is what is needed – we will find a
series of obstacles strewn across the path towards implementation of that goal which have
their origin in the different rationality behind law and technology.189

1.     Requirements under the Federal Data Protection Act
The 20th century legislative model requires that the government create a legal framework for
technology within the context of which business itself develops its own technical standards.
The manufacturers of technical products are obliged by the legislature to comply with the
state of technology and the state of scientific knowledge. Standardisation bodies have a key
role to play in this, since it is they which flesh out the framework provided by the legislature.
In Germany, consumers are involved in the process of standardisation through the DIN
Consumer Advisory Board. Once adopted, standards enjoy privileged status. Once the manufacturer has certified, either itself or via independent third-party institutions (e.g. TÜV), that its products meet these standards, the products can be put on the market without
further governmental control. In the event of a claim, it is assumed until the opposite is
proven that the manufacturer has met any legal obligations. The EU took over this model
cum grano salis in the mid-1980s and applied it to technical regulation in Europe.



188
    See Schmechel, (op. cit., fn. 16), who cites Paal, Beck’sche Kompakt Kommentare Datenschutz-
Grundverordnung, Paal/Pauly (eds), C.H. Beck Verlag 2017, margin no. 31 re Article 13 of the General
Data Protection Regulation.
189
    Boer, Legal Theory, Sources of Law, and the Semantic Web, (IOS Press, 2009).

The German legislature took a different path in section 28b of the Federal Data Protection Act. In that provision it obliged credit agencies in particular to use scientifically recognised mathematical-statistical procedures and did not allow them to process especially sensitive data within the meaning of section 3 no. 9 of the Federal Data Protection Act. The provisions
read as follows.
                                   Federal Data Protection Act
                                          Section 28b
                                            Scoring
      For the purpose of deciding on the creation, execution or termination of a
      contractual relationship with the data subject, a probability value for certain future
      action by the data subject may be calculated or used if
      1. the data used to calculate the probability value are demonstrably essential for
      calculating the probability of the action on the basis of a scientifically
      recognised mathematical-statistical procedure [emphasis added],
      2. in case the probability value is calculated by a credit inquiry agency, the
      conditions for transferring the data used under section 29 and in all other cases
      the conditions of admissible use of data under section 28 are met,
      3. (...)
      4. (...)
                                            Section 3
                                        Further definitions
      (...)
      (9) “Special categories of personal data” means information on a person’s racial
      or ethnic origin, political opinions, religious or philosophical convictions, union
      membership, health or sex life.
      (...)
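The “probability value” referred to in section 28b is typically produced by a standard statistical model. The following Python sketch assumes a logistic-regression score with invented features and coefficients; it also illustrates how the special categories listed in section 3 (9) can be kept out of the calculation altogether.

      import math

      # Features permitted for the score. The special categories of section 3 (9)
      # (ethnic origin, political opinions, religion, union membership, health, sex life)
      # must not appear in this list at all.
      PERMITTED_FEATURES = ["income", "existing_loans", "payment_defaults", "years_at_address"]

      # Hypothetical coefficients of a (fictitious) validated logistic-regression model.
      COEFFICIENTS = {"income": 0.00001, "existing_loans": -0.4,
                      "payment_defaults": -1.2, "years_at_address": 0.1}
      INTERCEPT = 0.5

      def probability_value(applicant):
          # Probability of the future action (e.g. repayment) as a logistic function
          # of the permitted features only.
          z = INTERCEPT + sum(COEFFICIENTS[f] * applicant[f] for f in PERMITTED_FEATURES)
          return 1.0 / (1.0 + math.exp(-z))

      print(probability_value({"income": 42_000, "existing_loans": 1,
                               "payment_defaults": 0, "years_at_address": 6}))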
A case is pending before the Federal Constitutional Court against the Schufa credit agency
concerning the matter of whether Schufa should have to disclose its scoring algorithms. The Federal Court of Justice answered that question in the negative.190 The Federal Constitutional Court has not yet
declared whether it will accept the constitutional complaint for decision. Germany’s highest
court has therefore not yet clarified which requirements are to be made of a scientifically
recognised mathematical-statistical procedure. As regards sensitive data, section 28 (8) read
in conjunction with subsection (6) of the Federal Data Protection Act at any rate sets limits
when it comes to those criteria which may be applied when determining the score value.
What has not yet been clarified is the extent to which the boundaries set in section 19 and
section 20 of the General Equal Treatment Act (Allgemeines Gleichbehandlungsgesetz,
AGG) have an impact on data capture. The US Equal Credit Opportunity Act is clearer in
that respect.191 Monitoring compliance with statutory requirements is the responsibility of the
data protection authorities. In view of the relatively low mathematical/technical complexity of
scoring and the possibility of assigning responsibilities, competent monitoring ought to be
safeguarded.192
The Advisory Council notes that the existing rule in section 28b of the Federal Data
Protection Act represents a useful starting point when it comes to regulating self-
learning algorithms.

190
    See Federal Court of Justice, judgment of 28 Jan. 2014, file no. VI ZR 156/13 (Gießen Regional
Court, Gießen Local Court). A constitutional complaint has been filed against the Federal Court of
Justice’s ruling (file no. 1 BvR 756/14).
191
    See Metz, “Scoring: New Legislation in Germany”, (2012), 35 Journal of Consumer Policy, p. 297–
305.
192
    See Part IV, III. 1. above.

2.     Requirements under the General Data Protection Regulation
Nevertheless, the provision in section 28b of the Federal Data Protection Act cannot be
transferred to self-learning algorithms which autonomously update and change programs and
network among themselves. The US Federal Trade Commission has taken up this problem and is investigating the need for, and options for, increasing transparency.193
Concrete results are not yet available.
The General Data Protection Regulation only addresses algorithms in the form of an
individual entitlement to information and access. This regulatory technique is well-known, as
it was used in Directive 2008/48/EC, where the obligation to issue credit responsibly is conceived merely as an information duty.194
                             General Data Protection Regulation
                                            Article 13
                      Information to be provided where personal data are
                                collected from the data subject
      (...)
      (2) In addition to the information referred to in paragraph 1, the controller shall, at
      the time when personal data are obtained, provide the data subject [emphasis
      added] with the following further information necessary to ensure fair and
      transparent processing:
      (...)
      (f) the existence of automated decision-making [emphasis added], including
      profiling, referred to in Article 22(1) and (4) and, at least in those cases,
      meaningful information about the logic involved, as well as the significance and
      the envisaged consequences of such processing for the data subject.
      (...)
                                           Article 15
                              Right of access by the data subject
      (1) The data subject [emphasis added] shall have the right to obtain from the
      controller confirmation as to whether or not personal data concerning him or her
      are being processed, and, where that is the case, access to the personal data
      and the following information:
      (...)
      (h) the existence of automated decision-making [emphasis added], including
      profiling, referred to in Article 22(1) and (4) and, at least in those cases,
      meaningful information about the logic involved, as well as the significance and
      the envisaged consequences of such processing for the data subject. (…)
                                           Article 9
                       Processing of special categories of personal data
      (1) Processing of personal data revealing racial or ethnic origin, political opinions,
      religious or philosophical beliefs, or trade union membership, and the processing
      of genetic data, biometric data for the purpose of uniquely identifying a natural
      person, data concerning health or data concerning a natural person’s sex life or
      sexual orientation shall be prohibited.
      (…)



193
  See Part V, II. below, Foreign models.
194
  See also Directive 2014/17/EU on credit agreements for consumers relating to residential
immovable property and amending Directives 2008/48/EC and 2013/36/EU and Regulation (EU) No
1093/2010 (OJ L 60, 28.2.2014, p. 34).

However, unlike section 28b of the German Federal Data Protection Act, the General Data
Protection Regulation does not make any legally binding requirements in respect of scoring,
apart from in Recital 71, according to which “the controller should use appropriate
mathematical or statistical procedures for the profiling”. Unlike section 28b of the Federal
Data Protection Act, requirements made of business under the Regulation are subject to a
threefold restriction:
      • the requirements made under Recital 71 should be complied with, not “are to be” or
        “must be” complied with,
      • the procedures must be appropriate and not necessarily “scientific”,
      • the procedure should be mathematical or statistical and not mathematical-statistical.
Normally, matters on which no political agreement can be reached are moved into the
recitals. Ultimately, it is then up to the European Court of Justice to decide to what extent
enterprises must use mathematical-statistical procedures, what that means, what standards
are to be applied to the mathematical or statistical procedures or what happens if enterprises
do not comply with the requirements set in Recital 71. What concrete impact this lowering of
standards will have on the distribution of competencies and what scope the German
legislature actually retains in view of full harmonisation will need to be discussed
elsewhere.195 At least Article 9(1) of the General Data Protection Regulation, like the Federal
Data Protection Act, prohibits the processing of sensitive data. The restrictions imposed on
this prohibition will not be addressed here.
Even reading Recital 71 of the General Data Protection Regulation as, in effect, binding business generally in the same way as section 28b of the Federal Data Protection Act cannot hide the fact that the primary addressees of the EU requirements are citizens who
want to assert their right to information and access. However, under the provision of Recital
63, that right “should not adversely affect the rights and freedoms of others, including trade
secrets or intellectual property and in particular the copyright protecting the software”. In the
light of the Federal Data Protection Act, it is obvious that the General Data Protection
Regulation should be interpreted to mean that consumers should at least be informed about
the basic assumptions made in the algorithm logic.196 Depending on the outcome of the
proceedings pending before the Federal Constitutional Court, the question of the relationship
between EU law and Germany’s Basic Law could also be raised. Even if it were possible to
push through the German legal position across Europe – perhaps after it is underpinned by
constitutional law – we are still left, in regard to this complex issue, with requirements under
EU law which do not go very far because they are entirely guided by the power of individuals
and their ability to assert their rights.
There are considerable consequences as regards public legal redress. Profiling also has to
be measured against the provisions of the General Data Protection Regulation on the
admissibility of personal data processing. However, under Article 58(1a) of the Regulation,
the data protection authorities are also tasked with monitoring and enforcing the application of the Regulation. This concerns the principles applied to data processing as set out in
Chapter II (Articles 5 to 11) of the Regulation. It is not entirely clear whether the monitoring
obligation also applies to the algorithms used, which are only referred to in regard to the
rights of the data subject in Chapter III, and then only in the recitals, which have no legal
force. Even if there were such an obligation, there are no uniform standards to which the
authorities could gear their activities.




195
  See Part V, III. 4.
196
  Schmechel (op. cit., fn. 16), who cites Paal, Beck’sche Kompakt Kommentare Datenschutz-
Grundverordnung, Paal/Pauly (eds), C.H. Beck Verlag 2017, margin no. 31 re Article 13 of the General
Data Protection Regulation.

The Advisory Council notes that the rudimentary approaches to regulating algorithms
set out in the General Data Protection Regulation are insufficient and fall below even
the standard applied in section 28b of the Federal Data Protection Act.

3.        Re the three possible options for a regulatory approach
The Advisory Council notes that there are theoretically three possible options for
regulating this matter:
      •   proactive (legality by design): the legislature could oblige enterprises to
          incorporate binding legal requirements into algorithm development;

      •   reactive: the legislature could restrict itself to obliging enterprises to comply
          with the law when developing algorithms (which actually goes without saying)
          and then focus on ex-post monitoring;

      •   the happy medium: the legislature could set a regulatory framework which
          combines binding governmental requirements with self-regulation.
These options will be outlined and analysed in the following.

4.        Re lack of transferability of technical regulation
If the legislature decides to take the proactive approach, in the light of a century’s worth of
experience, it would make sense to oblige industry to comply with the rules of technology.
The following triad has become established both legislatively and constitutionally197 when it
comes to regulating product safety:198 the generally recognised rules of technology; the state
of the art; and the current state of science and technology. It is obvious even at first glance
that the German legislature has set the bar high in section 28b of the Federal Data Protection
Act. Credit institutions must apply scientifically validated methods, that is, not only those which are generally recognised and generally applied but those which stand up to being measured against scientifically validated standards. In each field one of these three standards has taken root: the generally recognised rules of technology for consumer goods, and the current state of science and technology for medicinal products. Where products are
subject to pre-market control exercised by government authorities, these are obliged to
examine compatibility with binding government requirements when licensing products.
Where no such pre-market controls are conducted, which – for good reason – is the case for
all technical consumer goods, either the manufacturers themselves or authorised certification
agencies establish whether the product meets the generally recognised rules of technology.
The point of reference when conducting this assessment is generally the technical standards
drawn up by German standardisation bodies or by EU standardisation institutions. Within the
EU, self-certification or third-party certification guarantees manufacturers (or importers)
access to the Single Market. However, manufacturers are not obliged to abide by technical
standards. They can also apply other methods to ensure they are complying with the
statutory safety requirements. Corrective measures are taken under liability law. Where
products give rise to damage despite standards being complied with, the courts can hold
manufacturers liable in so far as this proves justified.
Transferring the above approach to digitalisation, the legislature could set binding standards
as regards developing algorithms. One conceivable option would be, for example, to
reformulate Article 9 of the General Data Protection Regulation (the prohibition of processing
sensitive data and its exceptions) in this way. As simple and convincing as such a rule may

197
   Federal Constitutional Court, order of 8 August 1978, file no. 2 BvL 8/77.
198
   Marburger, Die Regeln der Technik im Recht, (Heymanns Verlag, 1982);
Joerges/Falke/Micklitz/Brüggemeier, “Die Sicherheit von Konsumgütern und die Entwicklung der
Europäischen Gemeinschaft”, (1988), Schriftenreihe des Zentrums für Europäische Rechtspolitik,
Vol. 2, p. 523.

appear, it would at best solve questions concerning automated programming by software
agents, but not programming by autonomous software agents. Compliance with legal
requirements can, therefore, only be guaranteed if they are not only incorporated into the
source code but if they are also automatically taken into account whenever an autonomous
change is made. To be able to do that, legal rules would have to be made compatible with
the logic of the “code”, which only understands “yes” and “no” and cannot cope with vaguely
formulated general legal clauses (e.g. “good faith” or “good morals”).
Across the world research teams are working on the options which legality by design opens
up. Opinions differ as to their feasibility. Thinking this through to the end, full compatibility
would mean reducing the law down to a “yes” or a “no” and incorporating legal reality into this
“yes/no” logic. Legality by design would have to be shaped in such a way that all possible
cases could be broken down into “yes/no”. It would also be worth thinking about
incorporating an option into an algorithm in which a competent human being would have to
be called in where uncertainty arises as to how to handle reality. It is clear that a great deal
more research needs to be done here. It is currently still unclear whether such compatibility
can actually be created by technical means.
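The human-in-the-loop option mentioned above can be pictured as a compliance gate that is run on every autonomous modification: clearly codified prohibitions yield a hard “yes” or “no”, while a vaguely formulated general clause triggers escalation to a competent human being. The rule names in the following Python sketch are invented for illustration only.

      from enum import Enum

      class Verdict(Enum):
          ALLOW = "allow"
          BLOCK = "block"
          ESCALATE = "escalate to a human reviewer"

      # Absolute prohibitions can be expressed in yes/no logic;
      # general clauses ("good faith", "good morals") cannot.
      ABSOLUTE_PROHIBITIONS = {"uses_sensitive_data", "discriminates_by_ethnicity"}
      GENERAL_CLAUSES = {"possibly_against_good_faith", "possibly_against_good_morals"}

      def compliance_gate(change_flags):
          # Check a proposed self-modification before it takes effect.
          if ABSOLUTE_PROHIBITIONS & change_flags:
              return Verdict.BLOCK        # clear "no"
          if GENERAL_CLAUSES & change_flags:
              return Verdict.ESCALATE     # uncertainty: call in a competent human being
          return Verdict.ALLOW            # clear "yes"

      print(compliance_gate({"possibly_against_good_faith"}))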
In fact, the trend when it comes to standard-setting in consumer law is towards general
clauses. It is not least the adoption of the idea of social protection (the protection of the
weakest under law) which has meant that the number of legal rules which bind the
contracting parties to the principles of good faith, good morals and, less spectacularly,
compliance with sensible and adequate rules has increased exponentially. The politically
desired greater level of protection in private-law relationships contrasts with a loss of legal
certainty. At any rate, the functional logic of algorithms could have positive consequences if
the legislature were forced to differentiate more strictly than before between prohibitions
which are absolute and those which are linked to sensible benchmarks. The development of
black lists in fair trading law and the law of general terms and conditions, as well as the
prohibitions of discrimination, which are absolute, bear witness to the possible developments
which modern consumer legislation might undergo.199 Even if it were possible to shift the
focus of consumer law, we would still be left with many rules where the standards
themselves leave considerable scope for interpretation on account of being formulated in the
style of general clauses. As well as considerable doubts as to how complex legal realities
can be processed, the criticism raised against the feasibility of implementing the law in
algorithms is above all directed against the fact that it is hardly conceivable how general
clauses are to be translated into a mathematical programming language.
The Advisory Council notes that it will not be possible to regulate algorithms using the
means and technologies available for regulating industrial products.

5.     Re the deficits and consequences of a reactive approach
In reality, control is currently being exercised purely reactively. Enterprises in the digital
economy use the freedom afforded by liberal market economies to define algorithms
independently. To what extent existing algorithms comply with the requirements of applicable
consumer law and of anti-discrimination law, to name just two legal fields, is currently largely
not subject to any ex post factum control of whatever shape or form. The reason is simple:
Potentially illegal results can only be identified by the respective addressee, and even then only in theory.
If one nevertheless wanted to advocate purely ex-post controls, then there would be two
prerequisites: (1) a digital agency which has the requisite technical and legal resources to be
able to check whether the technology is compatible with the law and (2) an obligation to
disclose the algorithm with all its autonomous modifications to a closed circle of government
controllers.

199
   The report commissioned by the Advisory Council and submitted by Rott (op. cit., fn. 157) adopts
the same approach.

The need for a digital agency, entirely independently of the existence of a law of algorithms, is addressed elsewhere.200 Letting things go on as before and trusting in the self-responsibility
of business and the self-regulatory power of competition without an obligation to register and
without the obligation to disclose algorithms is at any rate not a serious option. In view of the
current pace of social change, not only in the world of business, and its potential impact on
human beings, a purely reactive political approach is not an option.
The Advisory Council is convinced that sticking to “business as usual” is, politically
speaking, not a serious option. The political realm is called upon to drop the option of ex-post controls, the de facto approach, and to look for a regulation which does justice to
the specific features of algorithms.

6.     Re the limited possibilities of co-regulation
Attempts to link governmental and private regulation can be found along the spectrum
between the two extremes of pre-market and post-market controls. All these considerations
are, tacitly, based on the idea that it will be possible to get a handle on algorithms in the same way as it was possible to get a grip on the health and safety risks posed by consumer goods on the one hand and the machines and technology used in the production of goods on
the other.
Gerald Spindler and Christian Thorun put forward a carefully elaborated proposal for co-
regulation in a report they submitted to the registered society Selbstregulierung
Informationswirtschaft.201 The basic idea is that the (German) legislature should adopt
framework legislation which sets out the minimum requirements as regards standard-setting
(clear targets, participatory approach, decision-making, transparency, financing,
standardisation organisation gets no copyright) as well as regarding enforcing those
standards (binding commitment, monitoring, complaints mechanism, sanctions).202
Spindler/Thorun do not themselves address co-regulation as a means of capturing the risks of automated and self-learning algorithms employed by software agents. They test their proposal in four
areas: data protection; unfair competition; IT security; liability law and telemedia law with civil
law and ancillary areas (in particular consumer protection law). Without calling the potential
of co-regulation in regard to the four areas into question from the outset, scepticism nevertheless predominates as to whether the model proposed by Spindler/Thorun could be transferred to the regulation of algorithms.
Even the EU’s attempts to take advantage of the tried and tested system of governmental
framework-setting and private standard-setting for services by and large miss the mark. One
could raise the objection that there is as yet no European legislation available for
standardising services;203 in addition, when it comes to the digital world, it is hard to see why
the digital economy should agree to set voluntary standards which could go beyond general
guidelines or even codes of practice. The digital economy is dynamic; new business models
are constantly evolving which generally involve algorithms. However, standard-setting is a
rather more static process. Private standard-setting tends to codify the past, at any rate in so
far as standards describe products. If one takes the example of health apps,204 the question
arises of why companies providing these services should cooperate with each other, given

200
    See Part V.
201
    See https://sriw.de/images/pdf/Spindler_Thorun-Eckpunkte_digitale_Ordnungspolitik_final.pdf (last
retrieved 28 Nov. 2016), since published as Spindler/Thorun, “Die Rolle der Ko-Regulierung in der
Informationsgesellschaft: Handlungsempfehlung für eine digitale Ordnungspolitik”, (2016), MultiMedia
und Recht Beilage, Vol. 6, p.1–28.
202
    Busch’s editorial in “Towards a ‘New Approach’ in European Consumer Law: Standardisation and
Co-Regulation in the Digital Single Market”, (2016), Journal of European Consumer and Market Law,
Vol. 5, p. 197–232, p. 197 takes the same approach.
203
    Van Leeuwen, European Standardisation of Services and its Impact on Private Law Paradoxes of
Convergence, (Bloomsbury 2017).
204
    Adam/Micklitz (op. cit., fn. 78).