Generative AI and Confidential Information
Legal Disclaimer 
The information provided in this video does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available on this site are for general informational purposes only. The law changes fast, so information in the video may not reflect the most up-to-date legal or other information. 
Transcript

00:08
Adam Stofsky
Can you comment a bit on the issue of kind of confidentiality and trade secrets and generative AI? This kind of worries me. What do we have to think about? 


00:17

Annette Hurst
Yeah. So this is worrying a lot of people. So, you know, in a business, you might have confidential business information that you don't want disclosed to your competitors or to the outside world. You know, some examples are you've designed a product and it's super confidential and you haven't released it yet. And let's say you're working on a marketing plan for that brand new product. Your competitors don't know about it yet, and your marketing team starts using a generative AI and putting in prompts related to your new product, and they describe the new product and they disclose all kinds of things about it in those prompts. 


00:53

Annette Hurst
Well, if you don't have the right agreements in place, those prompts can be used to train the model, and as a result, the data can become available to the model and can be spit back out to somebody else. Worst case scenario, your competitor. Right. Another example is, let's say you're using a generative AI tool to summarize or analyze company confidential financial information. In the worst case scenario, if you're a public company, that could result in disclosing material inside information. Right. So there can be some very serious risks associated with disclosing information in prompting these tools or providing them with input that's based on company operations. Now, those risks can be mitigated in a lot of instances by entering into commercial contracts for the use of the tools within the company. 


01:51

Annette Hurst
But what you don't want to do is have your employees out there using them without your knowledge so that you don't even have a chance to identify and mitigate those risks through appropriate commercial arrangements. 


02:03

Adam Stofsky
And one of those contracts, in broad strokes, what does it look like? 


02:07

Annette Hurst
Yeah. So the key is, like, if you enter into an enterprise agreement with a provider who's a reputable commercial supplier, then you'll get a non-disclosure agreement. Right. You'll get an agreement that they're not going to use your inputs to retrain the model, so your inputs can remain confidential. You might also get some reps and warranties or some indemnities related to the operation of the model. 


02:34

Adam Stofsky
Sending out and signing NDAs is a routine matter. In 2023, should we make sure they include language on AI and AI training? 


02:44

Annette Hurst
Wow. That's a good question. I mean, yeah, I have not thought about that. But just hearing you say it, right, you would probably want to specify that information I supply to you under NDA is not going to be input into an AI without adequate confidentiality protections from the service provider. 


03:08

Adam Stofsky
Right. So this means if some other entity you're in business with has access to your information, they can't use it to train any kind of AI tool because it might spit out something that's confidential. 


03:19

Annette Hurst
Yeah. In this case, we're not really talking about training, just inputs, or prompting. Training is a whole different thing, and that's an area where you need to worry about trade secrets and privacy and data harms as well. 


03:31

Adam Stofsky
Right. So their prompts shouldn't include anything about your company or its data. 


03:36

Annette Hurst
Right. 


03:37
Adam Stofsky
That's interesting. Yeah.

© 2024 Briefly