Author

Dr. Christian Frank, Licencié en droit (Paris II / Panthéon-Assas)

Partner


12 October 2022

Fair trial to determine AI liability?

  • In-depth analysis

On Sept. 28, 2022, the EU Commission published its draft Directive creating a framework for non-contractual civil liability for artificial intelligence systems, COM(2022) 496 final. The draft is an important cornerstone of the agenda to create a balanced legal ecosystem for the development and use of artificial intelligence systems, following up on the draft AI Regulation presented in April 2021 (COM(2021) 206 final). The proposed Directive itself contains only nine articles, preceded by 33 recitals and 17 pages of explanatory notes. The Commission has decided to pursue a procedure-based approach: it is intended to make it easier for plaintiffs to meet their burden of proof when claiming compensation for damage caused by an AI system. The alternative option of creating a harmonized system of strict liability, as in "classic" product liability, possibly with a cap on the amount that can be claimed and supplemented by mandatory insurance, has been rejected for the time being. It is not off the table, however, and is to be considered again when the measures now proposed are reviewed five years after the expiry of the transposition period for the Directive.

The core provisions are contained in Art. 3 and 4 (Articles cited below without further specification refer to the draft AI Liability Directive, COM(2022) 496 final): Art. 3 determines, on the one hand, under which conditions, by whom and from whom the disclosure or preservation of evidence can be demanded. It also regulates the consequences if a defendant does not comply with an obligation to disclose or preserve evidence. Art. 4 lays down the conditions for a rebuttable presumption of a causal link in the case of fault on the part of the defendant.

In the explanatory memorandum to the draft, the Commission explicitly states that the proposal does not shift the burden of proof, so that providers, operators and users of AI systems are not exposed to higher liability risks that could hinder innovation and reduce the uptake of AI-enabled products and services.

From a litigator's perspective, the draft is a workable concept, which would, however, benefit from revision and greater precision in some areas. This article takes a closer look at the concept of disclosure:

Liability conditions in the field of transport and provider responsibility not covered

Before getting into this, however, a few comments on the scope of application: the draft deals only with non-contractual civil liability; contract law and criminal law are thus left out. According to Art. 1(3)(a), the Directive shall not affect Union law with regard to liability conditions in the field of transport. The exact scope of this exclusion is vaguely drafted and is not sufficiently clear from the explanations in the recitals either: the wording itself appears comprehensive, while recital 11 mentions the liability of transport companies, which is to remain unaffected. According to the impact assessment, claims by a person injured in a traffic accident against the manufacturer of the autonomously controlled vehicle involved in that accident are meant to be covered. Here, a more precise wording in the Directive itself and an addition to the recitals would be necessary. The draft is also intended to leave unaffected the liability exemptions and due diligence obligations under the Digital Services Act, i.e. primarily hosting and platform provider responsibilities.

The possible actors

According to Art. 2, claims can be brought by the injured person himself (the plaintiff according to Art. 2 para. 6 lit. (a)), by his legal successors, including assignees, and by whoever acts on behalf of one or more injured persons, such as a plaintiffs' association. In addition, a potential plaintiff is entitled to file a disclosure request. This is a natural or legal person who is considering asserting a claim for damages but has not yet done so. According to the definition, this can also be a potential assignee or a potential plaintiffs' association.

The potentially affected parties include, first of all, the defendant as the person against whom the claim for damages is asserted. However, the disclosure provisions also cover third parties who are not, or are not to be, sued themselves; more on this in a moment.

The procedure

Disclosure requests can only relate to high-risk AI systems as defined in the draft AI Regulation that are suspected of having caused damage, Art. 3(1). Currently, such systems are regulated in Art. 6 of the draft AI Regulation (COM(2021) 206 final). These are, for example, AI systems that are intended to be used as a safety component of a product covered by the legislation listed in Annex II, or that are themselves such a product, and that are subject to a conformity assessment, such as medical devices. In addition, AI systems in the areas listed in Annex III of the draft AI Regulation are considered high-risk, for example systems for the biometric identification and categorization of natural persons or for the management and operation of critical infrastructure. Disclosure requests therefore cannot relate to "certain" AI systems that are only subject to the transparency obligations under Art. 52, such as emotion recognition systems or chatbots, or to the vast majority of other AI systems with minimal or no risk, such as AI-enabled video games or spam filters.

 

The Harry Lime Dilemma

A plaintiff may request the court to order the disclosure of relevant evidence to him. The disclosure request may, but need not, be directed against the defendant. The plaintiff may also request disclosure from a third party who is a provider, a person subject to the obligations of a provider, or a user of such a system and who has this evidence at its disposal. These terms are taken from the draft AI Regulation: the provider is the person who develops the system or has it developed in order to place it on the market or put it into service in his own name or under his own trademark; the user uses such a system outside the purely private sphere, Art. 3 No. 2 and 4 draft AI Regulation. Individual provider obligations are imposed, for example, on product manufacturers in Art. 24, importers in Art. 26 and distributors in Art. 27 draft AI Regulation.

If the plaintiff's request for disclosure is directed at such a non-defendant, the plaintiff must show that it has first made all "proportionate attempts" to gather the relevant evidence from the defendant itself, Art. 3(2). This requires at least a request for disclosure. A similar obligation applies to the potential plaintiff under Art. 3(1) sentence 1 (more on this below), who is, however, only required to have made a request which "was refused". The different wordings are certainly not a drafting error. However, the draft does not specify what further efforts the plaintiff must make in order to safely clear the hurdle of "proportionate attempts". Theoretically, this could range from the enforcement of substantive claims (for example, for the provision of information) and the assertion of procedural rights (in the German Code of Civil Procedure, for example, under Section 142 ZPO) to merely the plaintiff's own research and attempts at analysis. The difference is significant; in this respect, it would make sense for the Directive to give the Member States correspondingly specific guidance.

The plaintiff will also have to show why disclosure is necessary and, in its view, proportionate. The former will require more than a blanket assertion of a need for evidence or of a lack of transparency. Recital 20 at least suggests this: if the court is to limit disclosure to evidence that is necessary for a decision on the respective claim for damages, the applicant must first have explained to it, for example, which data it specifically needs and for what purpose. The example given there, however, reveals a dilemma: it will not always be easy for the claimant to determine in advance those parts of the relevant records or data sets, which are unknown to him, that are necessary to prove non-compliance with a requirement of the AI Regulation. The wording is nevertheless certainly deliberate, as a reference to "verifying compliance with the requirements of the AI Regulation" would open the door to broad fishing expeditions.

At this stage, the court will not be able to avoid formally involving the data holder in the proceedings: otherwise, the latter will be able neither to assert its legitimate interests in the protection of its trade secrets nor to appeal against corresponding orders.

In this context, it is also interesting to ask whether other third parties can, or even must, be included in the proceedings: if the request is directed against a user of a system, it is quite conceivable that this user merely received the "relevant evidence", i.e. the relevant data, from its actual owner on a contractual basis. A mere non-exclusive licensee, however, will have a different interest in the secrecy of data merely licensed to him than the actual owner. The latter is also in a much better position to set out why the requirements for a trade secret under the national transposition of the Trade Secrets Directive are met, and thus either to prevent or restrict disclosure or to have it made the subject of a secrecy order, as provided for, e.g., in Section 16 of the German Trade Secrets Act. A further dilemma will become apparent here in practice: there are too few court decisions on essential aspects of the Trade Secrets Directive, and rulings by the ECJ in this regard are hardly to be expected in the near future either. Relevant disputes are essentially decided in injunction proceedings, because the possible loss of trade secrets requires rapid clarification for purely practical reasons; five to ten years of main proceedings make little sense for most companies in terms of time. The number of proceedings conducted primarily to develop the law and clarify the meaning of important legal concepts is accordingly rather limited in practice.

Art. 3(4), last sentence, obliges the Member States to ensure that the persons against whom a disclosure order is issued have appropriate procedural remedies against it. Here, too, implementation is anything but simple: if a plaintiff has lost first-instance proceedings in a damages case because he mistakenly believed that he already had sufficient evidence, can he still file a disclosure request against a third party in the appeal proceedings? Or should this be treated as a belated submission and request? If the motion is admissible, does it become the subject of separate disclosure proceedings? If not, is the third party (only then) drawn into the appeal proceedings, and can it then file an independent appeal to the final appellate court against a disclosure order issued by the (ordinary) appellate court? If the Directive wants to prescribe such decisions, it must intervene deeply in the procedural laws of the Member States in order to achieve harmonization. If it leaves this to transposition by the Member States, there is a risk of inhomogeneous implementation and corresponding dissonance.

Only plaintiffs may also request preservation of relevant evidence

Under Article 3(3), only plaintiffs may also apply to the courts for measures to preserve the relevant evidence. Potential plaintiffs, on the other hand, have no corresponding standing to file such a request. No justification is given for this differentiation. Presumably, it is the demonstrated seriousness of pursuing one's own claims, evidenced by having already filed a lawsuit, that justifies the possibility of a motion for preservation.

 

No compulsory enforcement, consequences unclear where a third party is involved

If a defendant fails to comply with a disclosure order, the court presumes that he has breached his relevant duty of care, Art. 3(5). The defendant may rebut this presumption. It also follows indirectly from this that, under the concept of the draft, disclosure by a defendant cannot be compelled. Recital 21 refers to enforcement possibilities under national law, which could, however, delay claims for damages and thus possibly cause additional costs for the parties to the proceedings. The legal consequence of the rebuttable presumption is therefore considered sufficient.

The draft is silent as to the consequences if the party affected by the order is a third party rather than a defendant in the proceedings. The obvious answer is that there are no consequences at all: the presumption of a breach of the duty of care is tied to the defendant's own conduct. It is at least not apparent on what basis one would presume a breach of the duty of care on the part of the defendant because a third party did not comply with a disclosure order aimed at that third party. Without such attribution, it cannot be presumed that the defendant failed to comply with his duty of care, at least where this is to relate to facts connected with the design of the high-risk AI system that were the subject of the disclosure order.

If the order cannot be enforced against the defendant, then, based on the provisions of the draft, it certainly cannot be enforced against a third party.

Potential plaintiffs

The term potential plaintiff covers natural and legal persons who are considering asserting a claim for damages. Following the logic of the plaintiff concept, potential assignees and potential plaintiffs' associations are thus also entitled to file a disclosure request. The circle of potential plaintiffs is therefore very broad. A certain degree of concretization is sought through the requirements described below. Whether this is actually sufficient to screen out purely speculative disclosure requests by potential plaintiffs is, however, at least not obvious. One starting point could be the wording of Art. 3(1), last sentence, of the German language version, according to which the potential plaintiff must plausibly present "its" claim for damages, from which one could conclude that only the question of whether the action will actually be brought may remain open. The English and French versions, however, use an indefinite article here ("a claim for damages" and "d'une action en réparation", respectively). It would be helpful if the EU legislator unified the wording and, if necessary, improved the clarity of the recitals.

Before filing a disclosure request with the court, potential plaintiffs must first have asked the provider, a person subject to the provider's obligations, or a user for disclosure, without success. In practice, this will amount to a corresponding demand letter with a deadline. However, in contrast to the plaintiff's arguably more extensive obligation to make all proportionate attempts to gather the relevant evidence from the defendant, potential plaintiffs do not have to fulfill any further obligations here.

They must then sufficiently demonstrate the plausibility of their claim for damages by presenting facts and evidence, Art. 3(1), last sentence. There is no corresponding explicit requirement for the actual plaintiff, but this makes no difference in the end, since he must include the respective submissions in his statement of claim anyway.

Conclusion

In conclusion on disclosure and preservation of evidence under Art. 3: the inclusion of third parties in disclosure orders is an ambitious approach to enable and, where necessary, accelerate proceedings. However, the issues that arise in this regard are complicated. On some aspects the draft has gaps, for example as to what follows if a provider or user who is affected by a disclosure order but has not been sued simply ignores it. The concept of a rebuttable presumption in the case of disregard of a disclosure order appears balanced in terms of content and suitable for accelerating proceedings.

A Directive will (have to) leave room for the Member States to transpose it. In order to reduce the resulting dissonance as far as possible, it would be desirable for the provisions themselves, or at least the explanatory notes, to be made more specific, at least in some areas.

Practitioners, however, should already begin to prepare their agreements for such scenarios. The agreements concluded now for the development of AI systems will presumably still be relevant once the Directive has entered into force and been implemented. Clauses on information and cooperation obligations for the purposes of such disclosure in the event of a threatened or actual claim should therefore be drafted or revised now.
