Be careful in your use of moderators
Section 512(c) of the Digital Millennium Copyright Act (DMCA) is largely responsible for the plethora of user-generated content on the Internet. Section 512(c) provides that a service provider (e.g., a website) will not be liable for damages on copyright infringement claims arising from content posted “at the direction of a user.” Facebook, Twitter, YouTube, Instagram, and the like could not exist without this little section of law.
The importance of the immunity granted here cannot be overstated. However, there are limitations on this immunity. One such limitation is that the content must be on the site “at the direction of a user.” In April, the Ninth Circuit Court of Appeals in San Francisco ruled in Mavrix v. LiveJournal that moderating user-generated content prior to posting might remove the immunity defense.
The Case
LiveJournal is a collection of user-generated communities with various themes, where users post content and each community creates its own rules for submitting and commenting on posted content. Content is moderated prior to posting, primarily by volunteer users, to confirm that it complies with the community’s rules. At first blush, LiveJournal community content appears to be about as “user-generated” as can be.
“Oh No They Didn’t” is a LiveJournal community devoted to celebrity gossip. Mavrix, a photo agency specializing in celebrity photos, sued LiveJournal for hosting 20 of its photos in the gossip community. The District Court granted summary judgment, finding that LiveJournal was shielded by the DMCA immunity provisions. The Ninth Circuit Court of Appeals reversed, however, sending the case back to the District Court to determine whether the photos qualify as user-generated content.
What’s the issue?
As mentioned, Section 512(c) requires that the content be posted “at the direction of a user,” which means that the user, not the service provider, is the one posting the content. However, the Court of Appeals considered that the moderation process as implemented by LiveJournal might have converted the moderators into agents of LiveJournal, which would mean the content was posted “at the direction of LiveJournal,” resulting in no immunity.
In making its decision, the Court of Appeals reviewed the level of control that LiveJournal exercised over the moderators. For instance, in addition to the community-generated rules, LiveJournal gave the moderators express direction on which content to accept or reject, and almost two-thirds of submitted posts were rejected. LiveJournal also hired a moderator to supervise the volunteer moderators.
The problem
The problem is that legal uncertainty in this area may negatively impact users of the many sites built on user-generated content. Moderators who filter user-generated content greatly benefit users by removing undesirable material such as pornography, hateful or racist comments, irrelevant comments, and commercial postings. The Electronic Frontier Foundation, among others, has strongly objected to the Ninth Circuit’s decision.
It may also appear that this case strengthens the rights of copyright holders. However, copyright protection organizations such as the Motion Picture Association of America (MPAA) have demanded that platforms moderate content in order to remove infringing material. Ironically, the effect of this case might be that platforms remove or minimize moderation in order to preserve their DMCA immunity, thereby increasing infringing activity.
What to do now?
There is nothing in the DMCA that prohibits the use of moderators, so the answer is not to remove them. Rather, sites now need to review their moderation policies to consider whether the level of moderation might support a claim that the moderators are acting as agents of the site.
Other methods of moderation that are not problematic include (1) automated filters and (2) moderation after posting to remove undesirable content. A minimal sketch of the first approach appears below.
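For illustration only, here is a minimal sketch of an automated pre-posting filter, assuming a simple blocklist of terms. The term list and function names are hypothetical, and a real platform would rely on far more sophisticated signals; the point is only that no human moderator exercises judgment before a post goes live.

```python
import re

# Hypothetical blocklist; a real site would maintain a much larger,
# regularly updated list and combine it with other signals (spam
# scores, image matching, etc.).
BLOCKED_TERMS = {"spamword", "buy-followers"}

def passes_automated_filter(post_text: str) -> bool:
    """Return True if the post contains none of the blocked terms.

    The check runs with no human in the loop: the post is either
    published immediately or held, based purely on the rule set.
    """
    tokens = set(re.findall(r"[a-z0-9'-]+", post_text.lower()))
    return tokens.isdisjoint(BLOCKED_TERMS)

if __name__ == "__main__":
    for text in ("A harmless celebrity post", "Cheap spamword deals!!!"):
        status = "published" if passes_automated_filter(text) else "held"
        print(f"{status}: {text!r}")
```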