
AI in Content Creation: Key Takeaways from Loeb & Loeb’s AI Summit Roundtable

At Loeb & Loeb’s AI Summit in Los Angeles on April 21, 2026, I hosted a roundtable on the use of AI in content creation.

Generative AI is transforming the creative process—from pitch writing and concept development to game design and visual world-building. In-house legal departments are navigating uncharted territory, but there is genuine excitement about AI’s possibilities. As a result, many AI providers are working collaboratively with content companies—including across film, television, documentary, unscripted and gaming—to build new business and legal frameworks for the innovations ahead.

One theme emerged clearly in the roundtable discussion—in-house legal teams are finding more ways to say “yes” to using AI in content creation. Whether the driver is creative ambition, budget efficiency or business necessity, getting to “yes”—or at least “maybe”—requires constructive engagement with colleagues, talent, vendors and AI providers to find workable solutions for the legal issues that can arise.

Our roundtable discussion focused on four key areas: (1) AI legal language in talent and vendor agreements, (2) copyright implications, (3) AI tool “terms of use” and (4) state-specific digital replica laws.

Talent and Vendor Agreements: Form Contract Language for AI

Two AI-related provisions have become central to agreements between content companies and their third-party talent and vendors. First, third-party talent and vendors are typically restricted from using AI tools in connection with their services or inputting any of the IP they create into AI systems unless approved in advance by the content company's legal department. Second, content companies continue to reserve broad rights to digitally alter talent’s image, likeness, voice and performance, but are now including additional language around their right to broadly use AI and machine learning technologies for those alterations.

So, how and when are legal departments saying “yes” (or “maybe”) to the use of AI tools? And how broad are the rights of content companies to digitally alter talent's image, likeness, voice and performance with AI? The roundtable discussion centered on copyright, AI providers’ “terms of use” and state-specific digital replica laws.

Copyright: Human/Machine Collaboration in the Gray Zone

The customarily black-and-white world of work-for-hire agreements—where humans create IP and copyright ownership flows clearly to the commissioning party—is becoming increasingly gray as AI tools collaborate with human creators. 

The U.S. Copyright Office’s guidance on Copyright and Artificial Intelligence, Part 2: Copyrightability, published January 17, 2025, analyzes “the type and level of human contribution sufficient to bring [AI-assisted] outputs within the scope of copyright protection.”

This is not a new issue. In 1965, the Copyright Office observed: “The crucial question appears to be whether the ‘work’ is basically one of human authorship, with the computer merely being an assisting instrument, or whether the traditional elements of authorship in the work... were actually conceived and executed not by man but by a machine.” The 2025 guidance builds on this: “Where AI merely assists an author in the creative process, its use does not change the copyrightability of the output. At the other extreme, if content is entirely generated by AI, it cannot be protected by copyright.”

Critically, the Copyright Office concludes that prompts alone “do not provide sufficient human control to make users of an AI system the authors of the output” and that repeatedly revising prompts is merely “‘re-rolling’ the dice”—not authorship. However, a human may “select or arrange AI-generated material in a sufficiently creative way” or “modify material originally generated by AI technology to such a degree that the modifications meet the standard for copyright protection.”

Roundtable participants noted that comfort is often found when machine-created elements are in the background—not central to key creative elements, brand representations or monetizable assets. But when those machine-created elements are central, the gray areas around human vs. machine creation can leave copyright protections falling short. The good news—many content companies are developing internal workflows to help creative teams understand where the line is, and AI providers are increasingly offering transparency and collaboration around how their tools interact with human-created intellectual property.

AI Terms of Use: A Contrast with Traditional Vendor Agreements

Content companies have long relied on VFX and post-production vendors who provide fulsome representations, warranties, indemnities and insurance—if their delivered IP is infringing, there is a clear path to risk mitigation. By contrast, standard AI provider click-through “terms of use” are very different—AI tools are typically provided “as is” with few (if any) warranties, indemnification runs one way (the customer typically indemnifies the AI provider, not vice versa) and liability is capped at the amount of fees paid or at hundreds or thousands (not millions) of dollars.

The key takeaway—track when the use of AI tools falls under click-through “terms of use” and be very cognizant of those terms, and, when possible, negotiate case-specific enterprise licenses with AI providers. The encouraging news is that many AI providers recognize the content industry’s unique needs and are willing to engage in substantive negotiations. Additionally, content companies with large enterprise agreements may be able to allocate seats under those agreements to their third-party production companies and vendors—creating a more protected ecosystem for AI-assisted content creation.

Digital Replica Laws: Navigating CA and NY's State-Specific Requirements

As AI tools are increasingly used to enhance and modify talent’s image, likeness and voice, state-specific digital replica laws are essential to consider. 

A “digital replica” is a synthetic performance generated by AI using an individual’s image, likeness or voice. Both California and New York require contracts for digital replicas to include a “reasonably specific description of the intended uses”—though what that means remains ambiguous. California’s legislative history offers some guidance—broad grants allowing perpetual use of a performer’s “name, voice (actual or simulated), likeness (actual or simulated) and biography... in any and all media... throughout the universe and in perpetuity” would not be enforceable. Both states do provide a safe harbor from the “reasonably specific” requirement if the contract is negotiated by a lawyer or labor union.

Practical recommendations from roundtable participants—be very specific in describing intended uses, ensure that talent has an attorney to negotiate on their behalf and require talent to explicitly confirm that attorney representation in the contract. These requirements add process but ultimately protect both parties—creating clearer expectations and reducing the risk of disputes down the road.  

Lastly, while NY and CA are often the states most relevant to the creation of content, it is important to determine whether any digital replica/AI laws from other states apply—which will likely depend on where the content is being made and where the talent, entertainment companies and AI providers reside.

Conclusion

The legal analysis for AI in content creation is increasingly nuanced, but this complexity need not be an absolute block. Content companies that invest in understanding these issues—and that work collaboratively with AI providers to address them—will be best positioned to harness AI’s benefits while managing associated risks. The AI legal landscape is evolving rapidly, and so are the legal departments and AI providers leading the charge, shaping a future where creativity, compliance and innovation can advance in tandem.

* Finally, note that the above discussion does not address guild requirements, biometric data, privacy, E&O insurance, AI-related bond company requirements and other evolving considerations—all of which also require careful legal analysis, and many of which were discussed in other roundtables at Loeb & Loeb's LA AI Summit.