Collaborative AI Fan Fiction Video: A Context Primer
This post was inspired by a recent rewatch of Freaks and Geeks during a layover in Heathrow. It’s a great show that should have had a second season, but it didn’t. I hate when this happens. So what if we could just… make our own second season? Fan fiction is already a thing. With advances in AI video generation, maybe video fan fiction is feasible soon. What would it take—legally, technically, culturally? That’s what I’m starting to investigate, starting with this post.
Document Purpose
This is an AI-generated summary [take it with a huge grain of salt] of an exploratory conversation I had with Claude about the future of collaborative fan-created video content using AI tools. It covers legal frameworks, technical infrastructure, existing precedents, and a proposed vision for how this space might evolve. Its purpose is to serve as a personal context primer for future conversations with AI assistants and industry professionals. If you’re reading it online, I’ve shared it here to document my progress and hopefully attract feedback and collaborators.
Table of Contents
- The Core Idea
- Existing Models for Collaborative Fiction
- Legal Landscape
- Fan Fiction Legal Status
- Deepfake and AI-Generated Content Laws
- Platform Liability and Section 230
- Right of Publicity
- Decentralized Platform Considerations
- Why Decentralization
- Existing Decentralized Moderation Approaches
- The Moderation Challenge
- The Sample Library Analogy
- Proposed Vision
- Path Forward
- Open Questions and Concerns
- Key References and Resources
The Core Idea
With the emergence of AI video generation tools, it’s now technically possible for fans to create new episodes of TV shows using AI-generated versions of characters, actors’ likenesses, and established fictional worlds. This raises a fundamental question:
Could there be a legitimate, community-driven ecosystem where people collaboratively create fan video content—not for commercial purposes, but for creative expression, skill development, and community engagement?
The analogy to consider: In the music world, sample libraries democratized orchestral composition. Someone who could never afford the London Symphony Orchestra can now practice arranging, create demos, and develop their skills using virtual instruments. This doesn’t compete with real orchestras—it creates a training ground and enables creative expression.
Could the same model apply to film and television? Could the “raw building blocks” of a TV show—characters, voices, storylines—become available as creative assets for non-commercial use?
Existing Models for Collaborative Fiction
How Collaborative Fan Fiction Currently Works
Most fan fiction operates without coordination toward a shared canon. Each author writes standalone stories, and there’s no attempt to make them compatible. Platforms like AO3 and FanFiction.net are publishing platforms, not collaborative writing spaces.
For projects that do attempt shared continuity, several governance models exist:
| Model | Description | Example |
|---|---|---|
| Curated/Voted | Community upvotes/downvotes submissions; low-rated content is removed | SCP Foundation |
| Moderated | Editorial gatekeepers review submissions and maintain canon “bible” | Star Wars Expanded Universe (pre-Disney) |
| Claimed Territories | Authors stake out different corners of the world to avoid overlap | Various shared universe projects |
| Forking | Someone takes shared foundation and spins off their own branch | Open source model |
SCP Foundation: The Gold Standard
The SCP Foundation represents the most successful large-scale collaborative fiction project. Key elements of their model:
- Wiki platform with community voting
- Deletion threshold for low-quality content
- Senior “staff” who maintain tone guides and can intervene
- Strong culture of critique and revision before posting
- High barrier to participation (must write well enough to survive voting)
- Modular format (individual entries, not continuous narrative)
Amazon Kindle Worlds (2013-2018): A Failed Experiment
Amazon attempted to create a commercial platform for licensed fan fiction:
- Authors received 35% royalties for works over 10,000 words
- Rights holders got a cut; Amazon took theirs
- Strict content restrictions: no pornography, no crossovers, no “offensive content”
Why it failed: In one month of its operation (June), 46 Pretty Little Liars stories were posted to Kindle Worlds, while 6,000 stories were posted to other fan fiction sites during the same period. FanFiction.net alone sees roughly 100 new stories per hour.
Key insight: Fan fiction isn’t produced for profit, but for love. The gift economy nature of fan creation doesn’t translate well to commercial structures. Heavy-handed content restrictions killed engagement.
Star Trek Fan Film Guidelines
CBS and Paramount released official guidelines in 2016 allowing fan films under specific conditions:
- Maximum 15 minutes for a single story, or 30 minutes total across two segments
- No additional seasons, sequels, or remakes
- Fundraising capped at $50,000
- Must be amateur (no compensation for participants)
- Must be family-friendly (no profanity, nudity, drugs, etc.)
- Cannot be distributed physically (DVD/Blu-ray)
- Must include disclaimer that it’s not affiliated with CBS/Paramount
This represents a “we won’t sue you if you stay small and clean” approach.
Legal Landscape
Fan Fiction Legal Status
Written fan fiction exists in a legal gray area that has evolved into an informal detente:
Why it’s technically infringement:
- Fan creations based on existing works are derivative works
- Copyright holders have exclusive right to create or authorize derivative works
- Using core characters, settings, or storylines weighs against fair use
Why it’s tolerated:
- Non-commercial, transformative works are more likely protected under fair use
- Rights holders recognize fan works as free marketing
- Suing fans creates PR backlash
- The moment someone monetizes, they become a target
Fair use factors courts consider:
- Purpose and character of use (non-commercial/transformative favored)
- Nature of the copyrighted work (creative works get strong protection)
- Amount and substantiality of portion used
- Effect on potential market for original work
Deepfake and AI-Generated Content Laws
Federal Law (as of 2025):
The Take It Down Act (signed May 2025) is the first federal legislation to criminalize non-consensual deepfake pornographic content. It:
- Creates a nationwide framework for victims
- Gives victims the right to demand takedowns from platforms
- Allows victims to report violations to federal authorities
- Enables lawsuits across state lines
For non-pornographic deepfakes, federal prosecutors must show the defendant intended to cause, or did cause, financial, psychological, or reputational harm.
Penalties for publishing deepfake pornography: 18 months to three years of federal prison time, plus fines and property forfeiture. The harshest penalties apply when the image depicts a minor.
State Laws:
Over half of states have enacted laws prohibiting deepfake pornography, with significant variation:
- Some require proof of intent to harm
- Some are broader than others in what they cover
- California, Florida, Virginia, Georgia, Hawaii, New York have specific provisions
- Inconsistency creates a patchwork where legal recourse varies by state
Key distinction: Pornographic deepfakes are now clearly illegal for creators. Non-pornographic fan content using AI-generated likenesses remains legally ambiguous—sitting somewhere between existing fan fiction tolerance and newer deepfake concerns.
Platform Liability and Section 230
Section 230 of the Communications Decency Act (1996):
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
What this means:
- Platforms cannot be held liable for content users post
- Platforms can choose to moderate (or not) without becoming liable
- This has been called “the 26 words that created the internet”
Recent changes and pressures:
- FOSTA (2018) carved out sex trafficking from Section 230 protections
- The Take It Down Act conditions some protections on platform action
- Proposed Deepfake Liability Act would require “duty of care” for platforms to maintain immunity
- Multiple legislative efforts aim to further narrow Section 230
Practical implications:
- US-based platforms that comply with DMCA takedowns retain safe harbor
- Platforms that stop complying become directly liable
- This is why “alt-tech” platforms like Rumble still enforce copyright—they use automated flagging for copyright infringement and follow standard DMCA procedures
Right of Publicity
Separate from copyright, actors own their likeness through right of publicity laws:
- Using someone’s face without permission is a potential lawsuit
- This is state law (California’s is particularly strong)
- Survives death in some states
- Creates additional legal exposure beyond just copyright issues
Emerging frameworks:
SAG-AFTRA has been establishing frameworks for AI likeness licensing:
- Agreement with Replica Studios for digital voice replicas in video games
- Agreement with Narrativ for voice licensing in advertising
- Core principles: consent + compensation + control
- Performers can set their rates and approve/deny uses
This suggests a potential path toward legitimate licensing of likenesses for creative use.
Decentralized Platform Considerations
Why Decentralization
The concern: Any centralized platform hosting fan-created AI video would be subject to:
- DMCA takedowns
- Direct legal liability if they stop complying
- Potential shutdown through legal pressure
A decentralized approach could theoretically provide:
- No central point to sue or shut down
- Censorship resistance
- Community-driven moderation instead of corporate control
Legal Status of Decentralized Protocols
Key precedents:
- Sony Betamax (1984): If a technology has “substantial non-infringing uses,” the maker isn’t liable for how users misuse it
- BitTorrent itself: The protocol is legal; Bram Cohen was never sued
- BitTorrent clients: All legal—they’re just software implementing a protocol
Where liability attaches:
- Napster: Had central servers indexing infringing files—shut down
- Grokster: Supreme Court found liable because they actively induced infringement (marketing targeted Napster refugees)
- Pirate Bay: Founders imprisoned—ran a site specifically indexing infringing content
Modern decentralized social platforms:
- Mastodon (federated)
- Nostr (protocol-level, no central operator)
- IPFS (just a protocol)
- Farcaster (decentralized, VC-funded)
All are legal. The liability concentrates at the discovery/curation layer—whoever runs the index or search that helps people find content.
Existing Decentralized Moderation Approaches
Token Curated Registries (TCRs)
A blockchain-based approach where token holders vote on content inclusion/exclusion:
- Content creators stake tokens to submit
- Token holders can challenge inclusions
- Winners get staked tokens; losers lose their stake
- Creates economic incentives for quality curation
Challenges:
- “Free-rider” problem: token holders stay passive, hoping others curate well
- “Vote memeing”: copying others’ voting behavior
- “Coin flipping”: random votes to save time
- Complexity barriers to participation
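To make the staking-and-challenge mechanics described above concrete, here is a minimal, illustrative sketch of a token curated registry in TypeScript. It is an in-memory toy, not tied to any blockchain or existing library; the names (`Listing`, `submit`, `challenge`, `resolve`), the simple majority-vote resolution, and the winner-takes-the-pot payout are assumptions for illustration only.

```typescript
// Minimal in-memory sketch of a token curated registry (TCR).
// Simplifications assumed: plain number balances, token-weighted majority
// voting, no commit/reveal phase, no partial dispensation to voters.

type Listing = {
  id: string;
  owner: string;
  stake: number;            // tokens the submitter put at risk
  challenger?: string;      // set while a challenge is open
  challengeStake?: number;
  votesFor: number;         // token-weighted votes to keep the listing
  votesAgainst: number;     // token-weighted votes to remove it
};

class TokenCuratedRegistry {
  private listings = new Map<string, Listing>();
  constructor(private minStake: number, private balances: Map<string, number>) {}

  // A creator stakes tokens to propose content for inclusion.
  submit(id: string, owner: string, stake: number): void {
    if (stake < this.minStake) throw new Error("stake below minimum");
    this.debit(owner, stake);
    this.listings.set(id, { id, owner, stake, votesFor: 0, votesAgainst: 0 });
  }

  // Any token holder can challenge a listing by matching its stake.
  challenge(id: string, challenger: string): void {
    const l = this.mustGet(id);
    this.debit(challenger, l.stake);
    l.challenger = challenger;
    l.challengeStake = l.stake;
  }

  // Token holders vote with the weight of their holdings.
  vote(id: string, voter: string, keep: boolean): void {
    const weight = this.balances.get(voter) ?? 0;
    const l = this.mustGet(id);
    if (keep) l.votesFor += weight; else l.votesAgainst += weight;
  }

  // Resolve an open challenge: the losing side's stake goes to the winner.
  resolve(id: string): "kept" | "removed" {
    const l = this.mustGet(id);
    if (!l.challenger || l.challengeStake === undefined) throw new Error("no open challenge");
    const pot = l.stake + l.challengeStake;
    if (l.votesFor >= l.votesAgainst) {
      this.credit(l.owner, pot);
      l.challenger = undefined;
      return "kept";
    }
    this.credit(l.challenger, pot);
    this.listings.delete(id);
    return "removed";
  }

  private mustGet(id: string): Listing {
    const l = this.listings.get(id);
    if (!l) throw new Error("unknown listing");
    return l;
  }
  private debit(who: string, amount: number): void {
    const bal = this.balances.get(who) ?? 0;
    if (bal < amount) throw new Error("insufficient balance");
    this.balances.set(who, bal - amount);
  }
  private credit(who: string, amount: number): void {
    this.balances.set(who, (this.balances.get(who) ?? 0) + amount);
  }
}
```

Even this toy version makes the free-rider problem visible: nothing in the mechanism pays passive token holders for carefully evaluating submissions, only for backing the winning side.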
Web of Trust (Nostr’s approach)
Instead of one global truth about “what’s good,” each user sees content filtered through the trust graph of people they trust:
- Users represented by public keys
- Moderation handled client-side
- Users choose whom to follow or block
- Relays can block public keys, but users can switch relays
Key insight: “The curation of a simple list by one’s web of trust should be considered the atomic unit of the decentralized web.”
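A minimal sketch of how a client might compute that per-user view, assuming a simple two-hop trust graph over public keys. The follow-graph structure and the `visibleTo` helper are illustrative, not the API of any particular Nostr client.

```typescript
// Sketch: client-side web-of-trust filtering over public keys.
// Content surfaces for a viewer only if its author falls inside the
// viewer's trust graph (here: people they follow, plus follows-of-follows).

type Pubkey = string;

// follows.get(a) = the set of pubkeys that `a` has chosen to follow.
function trustedBy(viewer: Pubkey, follows: Map<Pubkey, Set<Pubkey>>, hops = 2): Set<Pubkey> {
  const trusted = new Set<Pubkey>([viewer]);
  let frontier: Pubkey[] = [viewer];
  for (let i = 0; i < hops; i++) {
    const next: Pubkey[] = [];
    for (const key of frontier) {
      for (const followed of follows.get(key) ?? []) {
        if (!trusted.has(followed)) {
          trusted.add(followed);
          next.push(followed);
        }
      }
    }
    frontier = next;
  }
  return trusted;
}

type Post = { author: Pubkey; content: string };

// Filter a feed entirely client-side: no global moderator, only the
// viewer's own trust graph plus an optional personal blocklist.
function visibleTo(viewer: Pubkey, feed: Post[], follows: Map<Pubkey, Set<Pubkey>>, blocked: Set<Pubkey> = new Set()): Post[] {
  const trusted = trustedBy(viewer, follows, 2);
  return feed.filter(p => trusted.has(p.author) && !blocked.has(p.author));
}
```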
Opt-in Blocklists
Similar to email spam filtering:
- Users, clients, and relays opt into blocklists they want to use
- Blocklist providers maintain and update lists
- Different providers can have different standards
- Market determines which blocklists succeed
Potential application: Relay operators subscribe to blocklists; users choose relays based on moderation quality they prefer.
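As a sketch of the spam-filter analogy, a relay or client could merge whichever blocklists it subscribes to into one local filter. The `BlocklistProvider` interface and its `fetch` method are hypothetical placeholders, not an existing standard.

```typescript
// Sketch: opt-in blocklist subscriptions, analogous to email spam filtering.
// Each operator chooses its own providers; users pick relays whose choices
// match the moderation standards they want.

type Pubkey = string;

// Hypothetical provider interface: each provider publishes the set of
// pubkeys (or content hashes) it considers should be blocked.
interface BlocklistProvider {
  name: string;
  fetch(): Promise<Set<Pubkey>>;
}

async function buildLocalBlocklist(providers: BlocklistProvider[]): Promise<Set<Pubkey>> {
  const merged = new Set<Pubkey>();
  for (const provider of providers) {
    const entries = await provider.fetch(); // refreshed periodically in practice
    for (const key of entries) merged.add(key);
  }
  return merged;
}

// A relay applies the merged list on ingest; a client could apply the same
// check at display time instead.
function shouldAccept(author: Pubkey, blocklist: Set<Pubkey>): boolean {
  return !blocklist.has(author);
}
```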
Relay-Level Filtering
Tools like Nostr filter relay packages can filter content based on:
- Content type (SFW/NSFW)
- User type
- Language
- Hate speech/toxic comments
- Sentiment
- Topic
Individual relay operators run their own filtering; users choose which relays to connect to.
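One way to picture this: a relay's content policy as a small declarative config that an ingest pipeline checks against per-post metadata. The field names below (`allowNsfw`, `languages`, `toxicityThreshold`) are assumptions for illustration, not the schema of any existing relay filter package.

```typescript
// Sketch: a relay-level filtering policy. Each relay operator declares its
// own policy; users pick relays whose policies suit them.

type PostMetadata = {
  nsfw: boolean;
  language: string;          // e.g. "en"
  toxicityScore: number;     // 0..1, from some upstream classifier (assumed)
  topics: string[];
};

type RelayPolicy = {
  allowNsfw: boolean;
  languages: string[];       // empty array = allow all languages
  toxicityThreshold: number; // reject posts scoring above this
  blockedTopics: string[];
};

function relayAccepts(post: PostMetadata, policy: RelayPolicy): boolean {
  if (post.nsfw && !policy.allowNsfw) return false;
  if (policy.languages.length > 0 && !policy.languages.includes(post.language)) return false;
  if (post.toxicityScore > policy.toxicityThreshold) return false;
  if (post.topics.some(t => policy.blockedTopics.includes(t))) return false;
  return true;
}

// Example: a strict, English-only, SFW relay.
const strictRelay: RelayPolicy = {
  allowNsfw: false,
  languages: ["en"],
  toxicityThreshold: 0.4,
  blockedTopics: ["harassment"],
};
```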
The Moderation Challenge
The core tension: A decentralized, censorship-resistant platform could be abused for content the community doesn’t want (pornography, illegal material, harassment).
Possible hybrid approach:
- Protocol layer is dumb and permissionless
- Relays compete on curation quality (some permissive, some strict)
- Web of Trust + reputation determines what surfaces by default
- Opt-in blocklists for CSAM and truly illegal content
None of these approaches have been battle-tested at scale for video content.
The Sample Library Analogy
This is the central insight that reframes the entire discussion.
How Music Sample Libraries Work
- Spitfire Audio sells you a virtual orchestra for $500
- You’re not competing with the London Symphony Orchestra
- You’re learning to arrange, building a portfolio, making demos
- Nobody expects you to pay royalties on your practice pieces
- Maybe 1% of users “graduate” to real orchestras, but that’s not the point
- The value is in the practice, the community, the creative expression
The Evolution of Music Industry Norms
Originally, the only person who could have a Steinway piano on a recording was someone with access to a Steinway. Then:
- More pianos became available to rent
- Sample packs emerged
- Virtual instruments became ubiquitous
- Sampling became normalized (with legal frameworks)
The industry “grew up” about what reasonably could be controlled as intellectual property.
Applied to Film/TV
What if someone could license a “Freaks and Geeks Character Pack”—or it was simply available for non-commercial creative use?
- You’re not competing with NBC/Apatow
- You’re learning to direct, write, produce—making demos
- The community shares and enjoys what you make
- Maybe 1% of users get noticed and go on to real productions
- The value is in the practice, the community, the creative expression
Why This Framing Helps Studios
If a studio executive thinks about this correctly:
- This is a free farm system for talent development
- People making fan episodes are practicing to make shows for the industry someday
- Studios can see who’s good before hiring them
- IP stays culturally relevant at zero cost
- None of this competes with anything being sold (especially for canceled shows)
- It’s not commercial use, so legal exposure is different
Proposed Vision
The Non-Commercial Creative Commons for Visual Media
A future where:
- “Raw building blocks” of TV shows (characters, voices, storylines) are available as creative assets
- Non-commercial use is explicitly permitted under clear terms
- Community can share, collaborate, and enjoy fan-created content
- Creators develop skills in directing, producing, writing
- No expectation of commercial revenue—this is practice, sketching, creative expression
- Clear distinction between “demo/practice” and “commercial release”
What Currently Doesn’t Exist
Music has decades of evolved infrastructure:
- Standard licensing terms everyone understands
- Platforms built for sharing (SoundCloud, Bandcamp)
- Cultural norms around what’s okay
- Clear delineation between “demo/practice” and “commercial release”
Film/TV has none of this:
- No standard “non-commercial creative use” license for characters/likenesses
- No platform designed for this kind of sharing
- No cultural norms around AI fan video
- No framework distinguishing “I’m practicing/sketching” from “I’m pirating”
Path Forward
Phase 1: Proof of Concept with Willing Participants
Find IP holders who want this to exist. Candidates:
- Creators of canceled/dormant shows who’d love their work to live on
- Independent filmmakers thrilled to have characters become “practice material”
- Estates of deceased creators wanting to keep work culturally alive
- Public domain adjacent content (older works, lapsed copyrights)
Build a small community around those willing participants. Document everything, and demonstrate that:
- It works
- It produces good creative work
- It doesn’t spiral into abuse
Phase 2: Demonstrate Value to Industry
- Track how many community participants go on to professional work
- Show the “talent pipeline” value proposition to studios
- Document the PR/cultural relevance benefits
- Build case studies proving the model
Phase 3: Formalize the Framework
Once there’s proof it works:
- Work with SAG-AFTRA on a “non-commercial creative use” framework for likenesses
- Work with studios on a “dormant IP creative commons” model
- Build standard terms that everyone can adopt
- Create platform infrastructure that makes compliance easy
The Bootstrapping Problem
This is fundamentally a “public good” problem. Someone must:
- Build the platform
- Recruit willing IP holders
- Set up community norms and moderation
- Do it without revenue (since it’s non-commercial)
- Have enough credibility that IP holders trust them
The value accrues to everyone (aspiring creators, IP holders, the industry, culture broadly), but nobody has a direct incentive to build and fund it.
Open Questions and Concerns
Legal Questions
- How would non-commercial AI fan video be treated under fair use analysis?
- Can a framework be created that clearly distinguishes non-commercial creative use from infringement?
- How do right of publicity laws interact with fan-created content using likenesses?
- Could a compulsory licensing model (like music) ever emerge for derivative video content?
Technical Questions
- What platform architecture balances decentralization with necessary moderation?
- How do you prevent abuse (pornographic content, harassment) in a permissionless system?
- Can Web of Trust models scale to video content communities?
- What’s the right discovery/curation layer that doesn’t create liability?
Economic Questions
- Who funds the infrastructure for a non-commercial platform?
- How do IP holders benefit enough to participate willingly?
- Can this create a genuine talent pipeline that studios value?
- What’s the business model for sustainable operation?
Community Questions
- How do you establish content norms without killing creative freedom?
- What governance model prevents the community from fragmenting?
- How do you handle disputes about canon/continuity in collaborative works?
- How do you build critical mass without a commercial driver?
Cultural Questions
- Will established creators/actors see this as a threat or an opportunity?
- How long until norms shift to accept AI fan video like they accept written fan fiction?
- What catalyzing event might accelerate acceptance (like a high-profile positive example)?
Key References and Resources
Legal Frameworks
- Section 230 of the Communications Decency Act (1996): Platform immunity for user-generated content
- Take It Down Act (2025): Federal criminalization of non-consensual deepfake pornography
- DEFIANCE Act: Proposed civil remedy for digital forgeries
- Star Trek Fan Film Guidelines (2016): Example of IP holder creating explicit fan work terms
- Sony Betamax (1984): Substantial non-infringing use doctrine
Industry Frameworks
- SAG-AFTRA AI Agreements: Replica Studios (2024), Narrativ (2024)—consent + compensation + control model
- NO FAKES Act: Proposed federal right to voice and likeness
Platforms and Precedents
- Kindle Worlds (2013-2018): Failed commercial fan fiction experiment
- SCP Foundation: Successful large-scale collaborative fiction
- Nostr: Decentralized social protocol with client-side moderation
- Token Curated Registries: Blockchain-based content curation mechanism
Community Resources
- Archive of Our Own (AO3): Non-commercial fan fiction platform
- Organization for Transformative Works: Fan advocacy organization
- Memory Alpha / Memory Beta: Star Trek wiki communities (canon and non-canon)
Document History
- Created: December 2024
- Context: Exploratory conversation about collaborative AI fan video
- Purpose: Context primer for future AI conversations and industry discussions
This document represents speculative exploration of emerging technology and legal questions. It is not legal advice. Anyone considering projects in this space should consult qualified legal counsel.