How may the AI Liability Directive impact startups?

September 22, 2022
AI Liability Directive

For many AI startups, the global challenges we are currently facing (COVID-19, climate change, and the energy crisis) are opportunities to use technology to improve people’s lives. This is why AFS has actively participated in the policy debate on AI since the first White Paper on the topic was published in February 2020. Advocating for a policy and legal framework that allows AI to thrive and be developed and implemented by startups is a key priority for our organisation.

We’ve had a sneak peek at the new AI Liability Directive developed by the European Commission, which intends to address compensation of those affected by AI-powered decisions, something which is not regulated by the AI Act. Its objective is to lay out common rules for Member States in this regard. In particular, it would set common rules on the disclosure of information on high-risk AI systems and rules on the burden of proof. 

The good: We were glad to see that no-fault liability did not make it into the draft, since we believe it would have made it very difficult for entrepreneurs and startups to apply AI solutions. If an entrepreneur using AI were liable for alleged harm to individuals regardless of whether a causal link existed between the system’s use and that harm, it would hamper AI’s take-up and development across Europe by deterring entrepreneurs from using it in the first place.

As always, startups should only have to scale up once, not 27 times, and harmonising rules prevents startups from having to adapt to 27 different frameworks. 

The not-so-good: The current draft directive allows national courts to presume that a particular high-risk AI system caused damage if it failed to comply with relevant national or European rules on the matter (such as the upcoming AI Act) or if relevant data is not disclosed. If these presumptions are implemented, the AI Act should establish very clear and startup-friendly compliance requirements; otherwise, the directive would again deter entrepreneurs from starting up. Disclosure requirements should likewise be proportionate, clear, and respectful of intellectual property; otherwise, startups would be discouraged from developing or commercialising AI solutions in Europe.

Another key point concerning liability rules in AI has to do with allocating responsibilities between the different parties involved in the deployment of an AI system, such as developers, deployers, and users. Developers should not be held accountable for outcomes deriving from the application of an AI system that has been bought by a user or deployer and over which they have no control. Moreover, definitions in both texts (the AI Act and the AI Liability Directive) should be the same to avoid confusion.

In addition, the draft directive provides that the text is to be reviewed after five years, especially regarding the need to introduce a no-fault liability regime and/or mandatory insurance for certain AI systems, mostly those affecting legal interests such as life, health, and property. No-fault liability would make it very difficult for startups to innovate and apply AI in sensitive but key fields such as healthcare. As for mandatory insurance, its impact on startups’ ability to innovate should be carefully considered: we are wary of how it could affect the startup ecosystem, as it has the potential to create market-entry barriers for smaller actors.

As far as harmonisation is concerned, we have long advocated for regulation that allows startups to scale up once, instead of 27 times. However, the draft directive opts for a minimum-harmonisation approach and allows claimants to invoke more favourable national rules. Legal fragmentation can hinder startups’ ability to scale up legally compliant AI solutions at a European level, and clearly defining concepts such as fault or damage would strengthen the Digital Single Market. That is why we believe that damage should be clearly defined in the directive and limited to physical or material damage.

Policymakers: Startups need a reasonable, clear, and proportionate liability regime that protects consumers while fostering innovation and economic growth, so that startups can in turn provide the solutions the EU is after.