As the automation of the legal system progresses, more and more data protection issues arise. Legal tech applications in particular are affected. As soon as automated systems and artificial intelligence (AI) are used, Article 22 of the GDPR comes into play. The question is thus to what extent automated decisions produce legal effects in individual cases. In my article, I explore the implications of Article 22 for legal tech, which have received little attention so far.
Objective of Article 22 GDPR
The aim of Article 22 of the GDPR is to give individuals a right of defense against fully automated decision-making processes and to prevent them from being reduced to the mere object of a purely automated decision. It generally prohibits decision-making based exclusively on automated processing.
In this context, the question arises whether the automated data processing merely aids the decision-making, with the decision ultimately made by a human being, or whether the automated processing operation is the sole basis for the specific individual decision.
The automated decision must also have a certain materiality for the data subject. A mere offer to enter into a contract or direct advertising does not usually entail legally relevant disadvantageous consequences; the same applies to a refusal to enter into a contract, in the absence of a contractual obligation to do so. The situation is different if, for example, approval is refused by means of an automated administrative act, as this may already have a detrimental legal effect because fundamental rights are affected.
In all cases of permissible automated decision-making, it must therefore be ensured that the data subject's point of view is not ignored in fact or in law as a result of the automation. The entire decision-making process is to be kept fair and transparent by linking any exemption of the controller from the prohibition to specific minimum guarantees.
Broad Spectrum of Legal Tech Applications
In some cases, legal tech applications are merely intermediary platforms between people seeking legal advice and lawyers. Their technical implementation does not represent a disruptive innovation. However, the term also covers very advanced forms of artificial intelligence (AI) based on neural networks, which can indeed be regarded as a disruptive innovation.
My analysis is based on a broad definition of legal tech. It comprises every form of software-supported handling of legally relevant tasks. The use of automated software solutions is possible in almost all areas of the legal system.
The spectrum of automated computer programs ranges from so-called expert systems, which apply knowledge about contexts and rules to evaluate a specific issue and draw conclusions, to artificial neural networks, which aim to recognize relationships of meaning in data sets.
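To illustrate the first end of that spectrum, a rule-based expert system can be sketched in a few lines. The rules below are a deliberately simplified, hypothetical check loosely modeled on air-passenger compensation under Regulation 261/2004; they are illustrative only, not an actual product's rule base:

```python
# Minimal illustrative "expert system": facts are matched against
# if-then rules, and the first matching rule yields the conclusion.
# The rules here are simplified examples, not legal advice.

rules = [
    # (condition over the facts, conclusion drawn)
    (lambda f: f["flight_delay_hours"] >= 3 and f["distance_km"] <= 1500,
     "compensation of EUR 250 under Regulation 261/2004"),
    (lambda f: f["flight_delay_hours"] < 3,
     "no compensation claim"),
]

def evaluate(facts: dict) -> str:
    """Apply the rule base to a specific issue and draw a conclusion."""
    for condition, conclusion in rules:
        if condition(facts):
            return conclusion
    return "no applicable rule"

print(evaluate({"flight_delay_hours": 4, "distance_km": 1200}))
```

The point of the sketch is that such a system is fully deterministic and explainable, in contrast to the neural networks mentioned above, whose conclusions emerge from learned patterns in data.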
Automated systems can be used, for example, by the police to prevent threats. The administration also has a wide range of options for the automated issuing of administrative acts, for which the legal basis has now been created in various procedural regulations.
Significant Restrictions in Public Law
Article 22 GDPR represents a significant restriction, especially for the use of legal tech in the justice sector. This is already apparent in the case of predictive policing. Only the automated use of anonymized data appears unproblematic. In the specific case, permissibility will depend on whether the automated data processing also leads to a decision. The mere creation of a danger prognosis does not yet amount to such a decision, as it lacks legal effect or a comparable impairment.
When creating an offender profile for the purpose of criminal prosecution, anonymized data processing is not an option. The direct, automated initiation of investigations based on such a profile is therefore inadmissible.
The automated drafting of judgments or decisions is theoretically conceivable. Since state decisions in civil proceedings usually have adverse legal consequences for at least one party, as do criminal judgments except in cases of acquittal, there are only a few conceivable cases in which an automated judgment would not violate Article 22 (1) GDPR.
In administration, too, the distinction between automated administrative acts that are permissible and those that are impermissible under data protection law can be drawn according to whether they are of a favorable or onerous nature.
More Freedom in Private Law
Outside the sovereign sector in particular, a wide variety of possible uses of automated legal tech applications by law firms, companies and private individuals are being discussed and created. In the field of private law, automated legal tech applications are per se more permissible than in the sovereign sector. In the case of private-law predictive analytics, for example, a legal effect would be questionable. Even if no anonymized evaluation of decisions were to take place, a comparable impairment would generally not be significant as long as the decisions cannot be viewed directly.
Language analysis tools do not necessarily violate Article 22 either. These are applications that automatically identify content representations as well as legal arguments in continuous text and can present their results at high speed. This makes it possible to create pleadings or contracts using legal apps. Thus, especially in areas of law that rely on largely standardized contracts (e.g., corporate or employment law), various contract texts can be created automatically. However, if the automatic analysis leads to automatic contract adjustments that are disadvantageous for the contractual partner, it is inadmissible.
The mere creation of a draft contract with the help of AI, on the other hand, should not be seen as problematic, because legal effects or comparable impairments are generally not associated with a mere draft. Other letters, such as automated demand letters from a collection agency, are by contrast likely to be inadmissible if no further examination of the content takes place and immediate consequences such as the filing of a lawsuit or the initiation of enforcement measures are threatened.
In the private sector, scoring and profiling are also used. Scoring uses existing data to determine the likelihood of certain behavior on the part of people with identical characteristics. Profiling can be used to develop automated personality profiles. Legal tech providers dedicated to automated law enforcement are discussed more frequently; these include, for example, the assertion of air passenger rights or the review of issues relating to rental contracts. Even the use of chatbots to answer legal questions from private individuals is increasingly playing a role.
In the case of scoring and profiling by private companies, automated data processing may be inadmissible if it leads directly, in automated form, to a decision that is disadvantageous for the data subject with legal effect or a comparable impairment, and if, for example, the score value alone is decisive for this.
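The pattern just described, a decision in which the score value alone is decisive, can be made concrete with a small sketch. The score formula, the threshold and the input values below are entirely invented for the example:

```python
# Illustrative sketch of a fully automated, score-only decision --
# the constellation Article 22 (1) GDPR targets when the outcome is
# legally disadvantageous. Formula and threshold are hypothetical.

def credit_score(income: float, defaults: int) -> float:
    """Toy score: higher income raises it, past payment defaults lower it."""
    return min(100.0, income / 1000.0) - 25.0 * defaults

def automated_decision(income: float, defaults: int) -> str:
    # No human review: the outcome follows directly from the score alone.
    return "contract offered" if credit_score(income, defaults) >= 50.0 else "contract refused"

print(automated_decision(income=80000, defaults=2))
```

If a human instead reviewed the score as one factor among others before deciding, the processing would merely aid the decision-making and Article 22 would not be triggered.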
With smart contracts, automated processing always leads to an automated decision. This is accompanied by the loss of a legally protected position for one party (e.g., transfer of a right), which may result in an adverse legal effect for that party and thus constitute a violation of Article 22 (1) GDPR. However, smart contracts are designed precisely for this type of automation of legal changes, so that the exceptions set out in points (a) and (c) of Article 22 (2) will regularly be met.
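Purely as an illustration of that mechanism, the automated legal change a smart contract performs can be sketched as a conditional transfer of a right. The names and the payment condition are invented for the example; a real smart contract would run on a blockchain platform rather than as plain code:

```python
# Illustrative sketch (not a real blockchain contract): once the agreed
# condition is met, the right transfers automatically -- the "decision"
# is produced entirely by automated processing, with no human review.

from dataclasses import dataclass

@dataclass
class License:
    holder: str

def smart_contract_step(lic: License, payment_received: bool, buyer: str) -> License:
    """Transfer the right automatically if the agreed condition is met."""
    if payment_received:
        return License(holder=buyer)  # automated loss of a legal position for the seller
    return lic

lic = License(holder="seller")
lic = smart_contract_step(lic, payment_received=True, buyer="buyer")
# The holder of the right has changed purely through automated processing.
```

This is exactly why the parties' explicit agreement to such automation, per points (a) and (c) of Article 22 (2), matters: the adverse legal effect is the intended function of the contract.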
Chatbots do not pose any problems if they are used purely to provide legal advice. There is no adverse effect here. The same applies if they only create sample contracts or prepare letters.
The provisions of Article 22 GDPR cover all types of automated systems and thus also artificial neural networks. These will eventually be able to make not only automated but also autonomous decisions based on automated data processing operations. With the increasing use of such advanced forms of AI, the relevance of Article 22 GDPR will grow and may ultimately become a yardstick for innovation. Since legal tech is used in the administration of justice, an area sensitive to fundamental rights, Article 22 GDPR has an important protective function with regard to the EU Charter of Fundamental Rights.
We support you in keeping an eye on relevant data protection developments in IT law and advise you on all aspects of the use of legal tech. In doing so, we help you identify relevant topics and regulate them contractually.