
8 Points about AI Development Agreements that can be learned from the “Contract Guidance on Utilization of AI and Data”

1.  Liability for damage sustained by the user when engaged in AI development

Concrete Example:
The vendor’s delivery is delayed because training takes longer than expected, owing to a mistake of a level that a vendor with an ordinary level of technology would not normally make.

First, even where the legal nature of an AI development agreement is a quasi-delegation contract, it obviously does not mean that the vendor (the mandatary) has no liability [for any damages] whatsoever.
Even in a quasi-delegation contract, the vendor is still subject to the “duty of care of mandatary” (under Article 644 of the Civil Code: “A mandatary shall assume a duty to administer the mandated business with the care of a good manager in compliance with the main purport of the mandate.”).
Therefore, there is no reason to distinguish between AI development and conventional system development with respect to liability arising from engaging in development.
Accordingly, Article 22, Paragraph 1 of the model AI development agreement provides as follows:

Article 22, Paragraph 1
“If, in connection with the performance of this Agreement, the user or the vendor sustains damage due to causes attributable to the other party, it may claim compensation for damages from the other party (provided, however, that such compensation is limited to ordinary damage directly and actually incurred). However, no such claim may be made after ● months have elapsed from the date of confirmation of completion of the work.”

2. Liability for damage sustained by the user due to use of AI software that is a deliverable

Concrete Example:
The vendor developed and delivered to the user AI software that screens semi-finished goods in a factory for anomalies. When the user used the software in its own company, it missed certain anomalies, resulting in the shipment of defective goods to the user’s customer and causing the user to sustain significant damage.

Unlike the “liability for damage sustained by the user when engaged in AI development” in item 1 above, I believe that it would be very difficult to hold the vendor liable for “liability for damage sustained by the user due to use of AI software that is a deliverable.”
The reason it is difficult to hold the vendor liable stems from the differences between conventional system development and AI software development that I introduced at the beginning: it is technologically difficult to assure, in advance, how a trained model will perform on unknown input data.
Furthermore, the AI Guidelines point out that it is technologically difficult to inspect cause-and-effect relationships after the fact, that the performance and other qualities of a trained model depend on the training dataset, and that the characteristics of AI products depend on the quality of the input data in the utilization phase.
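To illustrate why prior performance assurance is difficult, the following minimal Python sketch (my own illustration, not taken from the Guidance; the data, model, and numbers are hypothetical) shows how a model that performs well on data resembling its training dataset can degrade on input data drawn from a different distribution, and how far it degrades cannot be known in advance.

# Minimal illustrative sketch (hypothetical data and model, not from the Guidance):
# a classifier that does well on data like its training set can degrade on
# "unknown" input data drawn from a shifted distribution.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Training data: two well-separated clusters (standing in for the vendor's training dataset).
X_train = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y_train = np.array([0] * 500 + [1] * 500)
model = LogisticRegression().fit(X_train, y_train)

# Test data from the same distribution as training: accuracy is high.
X_test = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y_test = np.array([0] * 200 + [1] * 200)
print("accuracy on data like the training set:", model.score(X_test, y_test))

# "Unknown" input data from a shifted distribution (e.g. a different production line):
# accuracy drops even though the delivered model itself has not changed.
X_shift = np.vstack([rng.normal(-0.5, 1.5, (200, 2)), rng.normal(0.5, 1.5, (200, 2))])
print("accuracy on shifted input data:", model.score(X_shift, y_test))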
Therefore, with respect to this “liability for damage sustained by the user due to use of AI software that is a deliverable”, I think it would be reasonable for the AI development agreement to provide that the vendor is not liable or, if the vendor is to be liable, to fix a maximum amount of compensation for damages.
Article 20 of the model AI development agreement provides that, in principle, the vendor is not liable for [damages attributable to], among other things, the use of deliverables such as trained models.

Article 20
The use, reproduction, and modification of the Deliverables, etc. by the user, and the use of products generated by such use, reproduction, modification, and the like (hereinafter, “Use, etc. of the Deliverables, etc.”), shall be carried out at the user’s own expense and responsibility. Except where otherwise provided in this Agreement or where there are causes attributable to the vendor, the vendor shall not be liable to compensate the user for damage incurred by the user as a result of the user’s Use, etc. of the Deliverables, etc.

Of course, depending on the parties’ needs and power relationship, there may also be a desire for the vendor to be liable for damages attributable to the use of the trained model. However, even in that case, I think it would be reasonable to fix a maximum amount for compensation for damages.
Where an upper limit is fixed, this would mean using the same terms as Article 22, Paragraph 2 of the model development agreement, which provides that “[t]he compensation for damages for which the vendor shall be liable to the user shall be limited to the subcontracting fees in this Agreement, regardless of whether the claim is based on default [non-performance of an obligation], statutory defect liability, infringement of intellectual property rights, unjust enrichment, tort, or any other grounds.”

3. Liability attributable to the user’s infringement of the intellectual property rights of a third party due to the user’s use of AI software which is a deliverable

Concrete Example:
A patent has been granted to company A for a certain learning method. The vendor exploited that patent without company A’s consent to generate a trained model, which it delivered to the user. When the user started providing that model to numerous, unspecified third parties, the user received a warning letter from company A claiming infringement of its patent.

Although this “liability attributable to the user’s infringement of the intellectual property rights of a third party due to the user’s use of AI software which is a deliverable” is one kind of damage sustained by the user due to use of AI software, the issue of intellectual property rights infringement needs to be carefully considered since it is of great concern to the user.
Although the user will usually request a warranty of non-infringement of intellectual property rights from the vendor (i.e., a warranty that no third-party intellectual property rights are infringed), generally speaking it is often extremely difficult, from a financial perspective, for the vendor to exhaustively investigate and confirm whether any infringement exists, including infringement of foreign patents.
Below are three potential approaches.

  1. A pattern with no warranties at all
  2. A pattern with only a warranty of non-infringement for copyright
  3. A pattern with a warranty of non-infringement for all the intellectual property rights

I think the reason for item 2 (a pattern with only a warranty of non-infringement of copyright) is that many vendors are able to warrant non-infringement [of copyright], since being practically identical to the copyrighted work is a required element of copyright infringement (for example, infringement of the copyright in a program).
I have proposed Article 21, draft B for item 2 above and Article 21, draft A for item 3 above.

Summary: The “8 Points about AI Development Agreements” that can be learned from the “Contract Guidance on Utilization of AI and Data”

1. Performance assurance, acceptance inspections, and defect liability

(1) Mutual understanding of characteristics of AI and its limitations
(2) Dividing the process from the contract
(3) Devising the contents of a development agreement

2. Rights/Intellectual Property

(1) Among the materials, interim deliverables, and deliverables, know which are or are not covered by intellectual property rights
(2) With respect to (1) above, know who has what rights under the default rules (i.e., a legal rule)
(3) Know how to craft contract provisions that benefit your own company (without being particular about the “ownership of intellectual property rights”, prioritize the “terms of use”)
(4) Know the limitations of the contract

3. Liability

Postscript August 8, 2018
I added a note on the relationship between the ○/× table of default rules for intellectual property rights and the similar table included in METI’s 「オープンなデータ流通構造に向けた環境整備」.
Postscript August 10, 2018
I clarified the meaning of the term “AI” as used in this article.
(Taichi Kakinuma, attorney-at-law)