Legal Risk Using Generative AI Tools
6:31
Legal Disclaimer 
The information provided in this video does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available on this site are for general informational purposes only. The law changes fast, so information in the video may not reflect the most up-to-date legal or other information. 
Transcript

00:06
Adam Stofsky
Annette, what is copyright? 


00:08

Annette Hurst
So copyright is a kind of intellectual property in content. It covers the things that we make. 


00:18

Annette Hurst
When we're creative and when we're expressing ourselves. 


00:22

Annette Hurst
So text, audio, music, video, visual arts, painting: these expressions are the core of what's covered by copyright. It doesn't matter whether we try to sell it; it can be purely private. But that creative spark is the essence of what is covered by copyright. 


00:47

Adam Stofsky
For your average company, maybe you're a tech company, maybe you're a big company, maybe you're a mom-and-pop shop making marketing materials using some kind of AI, there are going to be millions of uses, right? 


01:00

Annette Hurst
Yeah. 


01:00

Adam Stofsky
Can you explain what they need to worry about? Where is a violation of copyright law likely to happen? 


01:09

Annette Hurst
That's a great question. I think a lot of companies are probably going to use these for marketing materials, right? And in that case, the copyright risk is relatively low. Marketing material is generally very factual; it's not the kind of thing that the Copyright Office in the US recognizes as having any real meaningful scope of copyright protection. So I think in that area, your risks are relatively low. The risks go up when you're using generative AI to help you create something that you intend to commercialize. Whatever that content is, if you intend to make money off of it and put it out in the marketplace, then there's a greater risk that if it matches some pre-existing material, you might have a copyright infringement problem. 


02:00

Adam Stofsky
Can you give an example of what that might look like? 


02:03

Annette Hurst
Yeah. So let's say that you are using it to create a visual work of some kind. Maybe you've created some art and you want to put it in an NFT because you're, you know, a crypto company or something like that. Well, if it turns out that the art output you want to embody in an NFT token and sell to the world actually matches something else that's already out there in the world, and that may have been part of the training of the model, and you don't even know it, right? Because you're just using it to create some art. There could be an infringement problem. 


02:44

Adam Stofsky
There doesn't even need to be an NFT or anything involving fancy technology, right? I could be a greeting card company, and I decide these artists are really expensive, I'd rather just use some AI. These models are trained on lots of other artwork, and I know I'm saying this crudely, but someone else's artwork ends up in yours, or something that looks a lot like it. 


03:08

Annette Hurst
Or something that looks like it. The way these models work, it's not cutting and pasting; it's not taking from a database of material. The way they work is that they learn relationships among information and then use those relationships to generate new combinations of information. So in the case of art, which is what we're talking about now, they learn about the relationships in visual imagery and then use the data they've learned about those relationships to make new images. But the possibility exists for you to go through that process and, either intentionally, because humans are giving the tool a suggestion that causes it to happen, or unintentionally, the tool spits out something that ends up looking like part of the training data, and then you could have an infringement problem. 


03:59

Adam Stofsky
Is there a legal standard yet that explains how much it has to look like the previous work? Is it a kind of "you know it when you see it" thing? Who decides this, and what's the standard? 


04:11

Annette Hurst
Right. So courts are the ones who decide it. And in general, the test is something called substantial similarity. Depending on where you are in the country, that can take different legal forms, but "you know it when you see it" is a pretty good summary. 


04:29

Adam Stofsky
Anything else to say for your average business owner on this risk? How do you avoid it? 


04:37

Annette Hurst
It's not clear whether you can really avoid it if you don't have control over the model and you don't understand what went into the training dataset. If you really care about avoiding this risk, what you want to do is look at the tools available to you and think about using a tool that actually has a completely authorized dataset. That way, even if you do end up matching something on the back end, it doesn't matter. 


05:04

Annette Hurst
Because you have all the permissions that you need. 


05:06

Annette Hurst
So it's more of a risk mitigation through tool selection, and also through the 


05:12

Annette Hurst
contractual terms that you might enter into, giving you protection from the tool provider, right? 


05:19

Adam Stofsky
So a provider might say, hey, you know what, we're going to indemnify you against any IP infringement claims, because we know this is all our stuff. 


05:26

Annette Hurst
Right? Exactly. 


05:28

Adam Stofsky
Any other major things to think about in terms of copyright and the right to use that copyright? 


05:36
Annette Hurst
Sure. I mentioned earlier that there are a lot of different kinds of creative works that are subject to copyright. One of the other major areas where people are using generative AI tools in an area covered by copyright is software: source code, and source code is covered by copyright. There are a number of tools out there, all the way from open source to commercially licensed models, that are helping coders write code. Again, you go through this risk calculus where you look at: what am I using it for? Is it purely for internal purposes, or am I writing a program that I'm going to try to sell to the world? Depending on the answer to that question, you might have to approach the question of whether you have the freedom to use this in different ways.

© 2024 Briefly