By: Steve Outing
Over at Backfence.com — one of a wave of new “citizen-journalism” enterprises beginning to dot the Internet landscape — when co-founder Mark Potts looks at his expense sheet, he sees lots of legal bills. Indeed, says the online pioneer who helped start Washingtonpost.com a decade ago, legal expenses are the biggest line item of the entire start-up business so far.
That’s in large part because citizen journalism (or grassroots media, or whatever you want to call it) is so new that those experimenting with it are trying to figure out the legal angles of this new kind of media. If you want to run a citizen-journalism Web site, find yourself a good lawyer before opening up to the public.
The elephant in the living room is legal liability for what citizen contributors post on these sites. If someone posts something that, say, libels someone else on Backfence.com or a citizen-journalism site of any of a number of newspapers or other news outlets that have entered this game, could the site be jointly held liable in the event that the aggrieved party files a lawsuit?
Probably not, if the site handles the situation correctly. It can depend on whether citizen-submitted content is edited or not. It can also depend on how a California court case turns out.
The risk perhaps isn’t as great as you might think for publishing unvetted citizen content — but any publisher entering the citizen-journalism space needs to go in with eyes wide open and knowledge of the risks.
To edit or not to edit
In reviewing the growing number of citizen-journalism Web sites, you generally find a dividing line on the editing issue.
On the one side are sites like Backfence.com, which permits anyone to post an article or photograph to its local Web sites without it being screened by an editor prior to publication. Inappropriate content from citizens is handled via a “report misconduct” button or link attached to each article or photo. If a site user notices something that’s defamatory, infringes copyright, is racist, pornographic, a prohibited commercial message, etc. — anything that violates the site’s terms of service agreed to by site users — then editors are alerted and can remove the item if necessary.
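As a rough sketch of that publish-first, review-on-report workflow, the toy model below captures the key property: nothing is screened before it goes live, and removal happens only after a reader flag and a human review. The class and method names here are my own illustration, not Backfence’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    body: str
    flags: list = field(default_factory=list)  # reasons given by readers who reported it
    removed: bool = False

class HandsOffSite:
    """Hands-off model: publish immediately; editors review only flagged items."""

    def __init__(self):
        self.posts = []          # everything that has been published
        self.review_queue = []   # items awaiting an editor's look

    def publish(self, post):
        # No pre-screening: the citizen's submission goes live as written.
        self.posts.append(post)

    def report_misconduct(self, post, reason):
        # A reader's "report misconduct" click records the complaint
        # and alerts the editors -- it does not remove anything by itself.
        post.flags.append(reason)
        if post not in self.review_queue:
            self.review_queue.append(post)

    def review(self, post, violates_terms):
        # Only after human review does anything come down.
        self.review_queue.remove(post)
        post.removed = violates_terms
```

The point of the design is that editorial judgment is exercised only after notice, which (as the legal discussion below suggests) is also the lower-liability posture.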
On the other side are sites that believe that editing citizen content is a good thing — that it enhances the overall product and makes for a better read for site users, by ensuring higher quality and by weeding out “bad” stuff before it makes its way to a user’s computer screen. The legal liability is a bit greater with this approach, but not significantly at this point in the evolution of Internet publishing law.
Why take a hands-off approach? Potts says there are two principal reasons: less legal liability, and lower cost. The former is what swayed him. The more heavily you edit citizen submissions, the more potentially liable you are for them, he says.
And besides, he says, the Backfence.com experience so far is that the content submitted to the company’s two local-news sites by community members is “pretty high quality” — at least as good as the typical press release that shows up in editors’ inboxes. He doesn’t expect pro-quality content from citizen contributors, because the site is about community members speaking in their own voices. And the sites have seen very little abuse; Potts recalls deleting only one citizen submission (a commercial message).
Critics of the hands-off approach to editing are coming from a mindset of classic editorial control — something you need to rid yourself of if you want to succeed in citizen journalism, Potts says.
The yes-we-edit model is represented by a number of Web sites, including iTalkNews.com. Co-founder Liz Lee says her site edits citizen submissions for journalistic-accountability reasons. It’s about using the power of professional editors to craft a Web site that adheres to accuracy, ethics, and basic journalistic writing standards. Her site may have fewer articles than free-for-all sites because not everything is accepted, but those that are accepted are generally of higher quality, she says.
That approach suits Clyde Bentley, an associate professor of journalism at the University of Missouri and advisor to the student citizen-journalism site MyMissourian. His view is that submissions to citizen-journalism sites should be treated no differently than a newspaper’s letters to the editor, which typically are line-edited for spelling and grammar and screened for issues like defamation, copyright infringement, factual accuracy, taste, and so on.
To simply let anything and everything be published unaided on a citizen-journalism site is shirking our duties as journalists, he says. “That’s the coward’s way out. And we’re not a profession of cowards.”
Of course, this means hiring enough editors to screen everything that comes in. But that shouldn’t be too onerous, Bentley believes, because probably 90% of what is submitted by community members is of the innocuous variety — Aunt Jane’s recipe, reports of a little league team’s triumph, etc. — and will require little more than a quick glance. You’ll need editors, but not too many. “It’s gonna cost you something” to do it right, he says.
And what’s the big advantage of editing citizen content? A better citizen-driven editorial product — and thus a competitive advantage over entrepreneurs who choose the no-editing, free-for-all approach, says Bentley.
What the law says …
OK, so now you know the two basic choices of citizen journalism. But what does the law say? Does it demand that you choose one over the other in order to publish safely?
Actually, no. Strictly from a risk-minimization perspective, the hands-off model is clearly the safest way to go, say two media and intellectual-property attorneys I consulted for this article. But for just a little more risk, the decision to edit or pre-screen citizen submissions is still a rational one.
Pardon me while I review a few (U.S.) cases that lead up to today, to help you understand today’s situation.
In the U.S., case law is still evolving, but it dates all the way back to 1991 and the case of Cubby v. Compuserve Inc., in which the proprietary online service was held not to be responsible for the content its members posted on the company’s bulletin boards. (That would seem to apply to citizen-journalism sites that allow community submissions and don’t edit them.)
In 1995, the picture was rounded out a bit with Stratton Oakmont v. Prodigy, which held that an online service could be held liable for subscriber content under certain circumstances. Prodigy, another proprietary online service, had positioned itself as a monitor of content posted in its forums — in order to make Prodigy a family-friendly place. So the court in this case ruled that Prodigy was liable because of its actions as overseer and nanny over the content in its forums. (This would seem to argue for taking the no-editing stance with citizen content, since editing invites risk.)
Now jump to 1996, when the U.S. Congress passed the much-maligned Communications Decency Act. Many aspects of the CDA were ruled unconstitutional restrictions on freedom of speech and struck down in 1997, but Section 230 of the act was not challenged and became law. Section 230 has been “a valuable defense for Internet intermediaries ever since,” according to the Electronic Frontier Foundation’s Legal Guide to Blogging.
What Section 230 says is that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” and it pre-empts state laws to the contrary. As applied to citizen-journalism sites (presuming that the courts indeed would consider such a site an “interactive computer service”), Section 230 offers considerable protection to them from being sued if someone posts something defamatory. The poster is liable, but generally not the hosting Web site.
Intellectual property and media lawyer Jonathan Hart, of the Washington, D.C., law firm Dow, Lohnes & Albertson, explains that almost without fail, the courts are reading Section 230 to cover publishers who provide forums for others to speak. Ergo, it should cover citizen-journalism sites.
Publisher vs. distributor
Where things may get sticky is in the difference between being a “publisher” and a “distributor” of content. Section 230 speaks only of “publisher” liability; it says nothing about “distributor” liability, leaving open whether the two are treated the same. Distributor liability is best understood by looking at how bookstores are treated. If a bookstore carries a book that contains a defamatory falsehood, it can’t be held liable for the content if its managers have not been alerted to the defamation. But if the store managers were told of the defamatory falsehood and did not remove the book from the shelves, then the store could in theory be sued by the defamed party.
So, as applied to citizen-journalism sites, if they are considered to be distributors of content submitted by community members, then it would appear that it’s vital that they remove any citizen content when informed that it contains a defamatory falsehood. Otherwise, they could be held as jointly liable along with the defamer should a lawsuit be filed by an aggrieved party.
That sounds like a pretty reasonable thing to do, eh? If you’re informed that a citizen contributor has posted something that appears to falsely defame another on your site (but you don’t really know for sure), you take it down in order to protect your site from liability.
Not so fast. Let’s think about the impact on the online community of the issues involved here, points out Hart. Of course it’s prudent to remove something that appears to be a defamatory falsehood. But a knee-jerk reaction isn’t always the right one.
Consider the example of a citizen who submits an article about a little-league game, and mentions how the coach slapped a kid for dropping a ball. What if the coach complains to the citizen-journalism site’s editor that this article defames him and is not true? The poster of the article might have witnesses to back up his claim. So does the site editor remove the article just because the coach claims it’s a false charge?
Hart points out that when a reporter’s story is subjected to such a charge, the newspaper’s editors typically don’t retract it without an investigation. So should citizen journalism be treated completely differently?
Says Hart, “If the operator of the [citizen-journalism] site takes down every posting about which he receives a complaint, he’s essentially allowing members of the community to censor other people’s speech, which undercuts the integrity of the community forum, in which, ideally, offending speech is countered not with threats of litigation but with more counter-speech.”
A California case enters the picture
We get to today with a case currently before the California Supreme Court, Barrett v. Rosenthal. In this case, the court is reviewing a lower-court decision that held that service providers can be liable for a defamatory falsehood as a result of their republication of the defamation if they knew, or had reason to know, of the falsity of the statements.
If the California Supreme Court upholds that view of Section 230, then it would say to publishers of citizen-journalism sites (at least, in California) that they must remove all content that they are informed is false and defamatory or else face potential joint liability for what’s been published. If the decision goes the other way, it will be a resounding reaffirmation of the scope of Section 230 protections, says Hart, and should make publishers feel much more comfortable in allowing third parties to speak out.
So, where do we go from here? Some thoughts:
* The most secure safety net is not to touch or edit citizen submissions, and to take down citizen content that someone has complained about as being false and defamatory, infringing copyright, divulging a trade secret, violating privacy rights, and so on.
* Legal precedent is still evolving, but it’s probably safe to edit citizen submissions and still not be held liable for the content. But be careful with this, because if you introduce defamation into a citizen submission during the editing process that wasn’t there before, you may be liable, points out Hart.
* You should be safe in removing “bad” content that’s been pointed out to you by other users, or spotted by your site’s editors. Until that California case is decided (probably sometime this year), it’s probably prudent to do so promptly.
* Watch the discussion forums on your citizen-journalism site, too. The same principles apply in terms of liability as with the rest of the site.
* Watch for the outcome of the California case. If indeed the court reaffirms the protections under Section 230, it’ll give you a lot more leeway in terms of editing and reviewing citizen content. It will make it a bit safer to not remove a citizen posting that you may have received a complaint about. You’ll be more free to let citizen contributors have their say, even if it’s controversial or potentially troublesome.
* It’s a good idea for citizen-journalism sites to carry libel insurance, especially for smaller operations that might otherwise not be able to resist pressure from a deep-pocketed challenger who wants something removed from the site and can force the issue through sheer financial advantage. It’s insurance against being forced to capitulate to every threat, says Hart.
Newspapers operating citizen-journalism sites will need to add a rider to their existing libel insurance policy, according to Kevin Heller, an Internet and intellectual property attorney operating a private practice in New Jersey and New York. A typical newspaper policy will not cover citizen submissions or even a “blog network,” he says.
* Be careful in writing the terms of service for a citizen-journalism site. It should be broad enough to allow the site’s editors to remove just about anything imaginable that might be objectionable. And as Heller points out, it’s questionable whether some website terms of service are even binding. “Make sure that the readers/reporters have to check a box and click to agree,” he advises.
* Have site editors be on the lookout not only for defamation, but also for copyright infringement and trademark and trade-secret violations. That’s the kind of thing you can’t necessarily count on site users being savvy enough to spot.