Who is Liable When AI Goes Wrong?

When you use a new generative artificial intelligence (GenAI) tool, you probably click impatiently through the user or licensing agreement without taking a minute to skim its terms. You likely don’t imagine that you could get sued, land in a murky legal situation, or inadvertently share trade secrets through your use of the tool.

Think again. The ease of use and rapid advancement of generative AI tools have created a gap between the power of these technologies and what most users understand about the legal rights and liabilities involved in their use.

Those apps, platforms, and tools come with a wide variety of potential ‘gotchas’ in the user agreements, licenses, and contracts you must accept as a condition of use. Agreeing to the terms within those documents, which are typically boilerplate, may limit your ownership of anything you create using AI, expose you to potential liability in the event of a lawsuit, and share your output with the large language models behind the AI.

Ownership of AI-Created Material

Content created solely by AI is not copyrightable, according to the U.S. Copyright Office. Direct output from a generative AI tool is therefore not protected by copyright, which means that any other person or business can use that output without legal repercussions. Patents work similarly: only people can be inventors, although AI-assisted inventions can be patented if they meet U.S. Patent and Trademark Office criteria.

“What isn’t settled is how much human input does there need to be in a work for that work to have copyright value,” said Craig Auge, a partner with Columbus, OH-based law firm Vorys, Sater, Seymour and Pease LLP. “It’s a fine line that’s still being drawn.”

These copyright issues are more than academic. They can apply if you use source code derived from an AI application and then put it into a larger application. If that app has commercial potential, you (and your company, if you are affiliated with one) need to know which app or apps you used to generate the code, when you created it, and what your rights to that code were at the time. The original AI use could imperil the product’s future commercial viability.
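One lightweight way to capture that record (a sketch only; the field names and file paths here are hypothetical, and real record-keeping should follow your company’s and counsel’s guidance) is to log a provenance entry whenever AI-derived code enters a project:

```python
# Hypothetical provenance log for AI-assisted code: records which tool was
# used, when, and a saved snapshot of the license terms in force at the time.
import json
from datetime import date

provenance = {
    "file": "src/parser.py",                         # hypothetical path
    "ai_tool": "ExampleGenAI v2",                    # hypothetical tool name
    "generated_on": date.today().isoformat(),
    "terms_snapshot": "legal/terms-snapshot.pdf",    # saved copy of the agreement
    "human_contribution": "substantially edited after generation",
}

# Append one JSON record per AI-assisted file to a running log.
with open("ai_provenance.jsonl", "a") as log:
    log.write(json.dumps(provenance) + "\n")
```

A log like this is what lets you answer, months later, the questions Auge raises about what tool you used and under what terms.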

“If you’re using generative AI as an employee or a business owner, you need to think about what you’re using it for and what’s potentially sensitive,” Auge said. “Maybe if it’s terribly important, call a lawyer, but at a minimum, stop, slow down, read a little bit and ask some questions to make sure you don’t get sideswiped later in terms of not having the kinds of protections or restrictions you need in a competitive business world.” 

AI Liability

GenAI tools offer varying liability protections. Liability protection is an important issue with the large language models behind GenAI because of their tendency to hallucinate, or make up, facts and statements. Terms also differ by jurisdiction: the European Union has created rules around GenAI use, privacy, and liability that the U.S. does not have.

Typically, the user agreements for free GenAI tools differ from those for GenAI tools offered under corporate licenses from companies such as Microsoft, Google, and OpenAI. Corporations may negotiate licensing agreements with these companies for stronger protections, or protections may be offered within the agreements available to paid users. For example, Microsoft Copilot has provided liability protection against copyright infringement if the tool is used correctly and not intentionally misused, and other AI companies have followed that lead.

AI user agreements, licenses, and contracts may be subject to change without notice, so you can’t necessarily rely on what an agreement says today, because it could change tonight or tomorrow.

To be protected by any AI agreements your company has in place, you need to follow the protocols it has established. If you want to use AI tools beyond those approved, have the agreements, contracts, and licenses reviewed by lawyers with experience in AI law. Double-check any facts or sources that come from an AI for accuracy.

AI liability isn’t limited to copyright infringement, the most common risk for which AI companies offer indemnification. Lawsuits are possible in situations where AI was used to create a flawed product, for example, according to Matt Henshon, chair of the American Bar Association Artificial Intelligence and Robotics Committee. In such a case, the plaintiff claiming harm would likely sue you, your company, and the AI company whose tool was used to create the code.

To protect yourself against liability when using AI, make sure that all AI uses are ultimately checked or controlled by you or others. “Never surrender human control,” advised Marcus Denning, senior lawyer at MK Law in Melbourne, Australia. “Courts are beginning to treat AI like any other business tool; you’re responsible if you use it carelessly.”
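As a minimal sketch of what keeping a human in control might look like in code (the names here are hypothetical, not drawn from any real framework), a team could require an explicit, recorded sign-off before any AI output is released:

```python
# Hypothetical human-in-the-loop gate: AI output is released only after a
# named reviewer approves it, leaving an audit trail of who signed off and when.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ReviewRecord:
    reviewer: str
    approved: bool
    reviewed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def require_human_approval(ai_output: str, reviewer: str, approved: bool) -> tuple[str, ReviewRecord]:
    """Release AI output only after an explicit human decision."""
    record = ReviewRecord(reviewer=reviewer, approved=approved)
    if not record.approved:
        raise PermissionError(f"AI output rejected by {reviewer}; do not use.")
    return ai_output, record

# Usage: nothing the model produces reaches production unreviewed.
draft = "...code or text produced by a GenAI tool..."
released, audit = require_human_approval(draft, reviewer="a.engineer", approved=True)
```

The audit trail matters as much as the gate itself: if a dispute arises, it documents that a person, not the tool, made the final call.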

Sharing Output with LLMs

Many free generative AI tools—and even some subscription ones—use any information you input to train their large language models (LLMs). Material used to train LLMs can be passed on to other app users any time the app responds to a similar prompt. In fact, in 2023, Samsung banned employees from using AI tools such as ChatGPT after engineers fed proprietary source code into the tool.

“If you stick confidential information or information that’s a company trade secret into a public generative AI model, you kind of blew it,” said Auge. “If you use generative AI coding tools for anything proprietary or that you want to keep secured in any way, use a secure generative AI coding app, not an unsecured model, even if it’s slower.”
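As an illustration of that kind of guardrail (a sketch only; the patterns and helper are hypothetical, and a real deployment would need far broader coverage), a team might screen prompts for obvious secrets before anything reaches a public model:

```python
# Hypothetical pre-submission check: block a prompt from going to a public
# GenAI tool if it appears to contain credentials or marked-confidential text.
import re

SECRET_PATTERNS = [
    re.compile(r"(?i)\b(api[_-]?key|secret|password|token)\b\s*[:=]\s*\S+"),
    re.compile(r"(?i)\bconfidential\b|\btrade secret\b|\binternal use only\b"),
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
]

def safe_to_send(prompt: str) -> bool:
    """Return False if the prompt matches any obvious-secret pattern."""
    return not any(p.search(prompt) for p in SECRET_PATTERNS)

prompt = "Refactor this function: API_KEY = 'sk-live-1234'"
if safe_to_send(prompt):
    pass  # forward the prompt to the approved GenAI service
else:
    print("Blocked: prompt appears to contain sensitive material.")
```

The point is the process, not these particular patterns: a check like this is a backstop, not a substitute for using a secured tool in the first place.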

To avoid having your content shared with the LLMs behind any free AI tools you use, check whether you can opt out of having that content used to train models. If you can, follow the designated process to fully opt out.

Amy Buttell is a Silver Spring, MD-based technology, legal, and business journalist, content creator, writer, and ghostwriter.