Moderation and Free Speech Online
Censorship. Moderation. For many users of virtual communities, the two terms are synonymous. However, while censorship stifles speech, moderation can increase the diversity of online communication.
There is a crucial difference between censorship and moderation. In censorship, someone is granted power to prohibit speech about a particular topic or event across a wide range of communication platforms. Censorship can deny communication about particular topics to large groups of individuals. In Mainland China, for example, all online traffic must travel through government-controlled firewalls that monitor banned phrases like "Dalai Lama." Companies providing archival or search capabilities, like Google, must deny Mainland Chinese users access to search results from prohibited sites.
In contrast, moderation is the practice of prohibiting speech in a particular virtual community by authorities within that community. A topic that is moderated on one virtual community can be communicated elsewhere. Moderation occurs in distinct forms in different virtual communities. On most posting forums, the moderator deletes or edits others' posts. On Internet Relay Chat, or IRC, the moderator grants speaking rights to particular individuals, and removes individuals who violate the moderation policy. On Usenet, a distributed posting forum, the moderator screens postings before allowing them to be viewed.
Moderation allows for a diversity of speech because, without it, online groups can stifle the speech of members of other groups without resorting to any formal regulation. Like every other action one can take, communicating in a forum has costs and benefits. Communication in a forum also has network effects: the benefit increases as more interested parties join the forum. Censorship inhibits communication either by making it too costly to communicate or by reducing the benefit of communicating to almost nothing.
There are numerous natural constraints on online speech. For example, most Usenet clients function like e-mail clients: users must download fragments of all new messages (e.g., the subject line) to access them. The more messages that exist in a Usenet forum (called a newsgroup), the longer it takes for a person to access them. As the number of irrelevant or hurtful messages in a newsgroup increases, the cost to the reader per relevant message increases. Once that cost outweighs the benefits of the community, the reader leaves. As more readers leave, the benefit of the community to others also drops, creating a cascade that causes still more people to leave. The effect of a large number of irrelevant or hurtful messages is thus the same as censorship; that is, a group's ability to discuss a particular topic is curtailed.
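The departure cascade described above can be made concrete with a simple threshold model. The following is a minimal illustrative sketch, not drawn from the text: the tolerance values, message counts, and the linear cost and benefit functions are all assumptions chosen only to show how a rising noise ratio can empty a forum entirely.

```python
# Illustrative simulation of the reader-departure cascade.
# All parameters (tolerances, message counts) are invented for this sketch.

def remaining_readers(readers, noise_messages, relevant_messages):
    """Iterate departures until the reader population is stable.

    Each reader stays while the benefit (proportional to how many
    other readers remain) exceeds the cost (noise messages downloaded
    per relevant message). Tolerance for noise varies per reader.
    """
    cost = noise_messages / max(relevant_messages, 1)
    # Reader i stays only while: remaining > tolerance_i * cost
    tolerances = [0.5 + 0.1 * i for i in range(readers)]
    remaining = readers
    while True:
        staying = sum(1 for t in tolerances if remaining > t * cost)
        if staying == remaining:
            return remaining
        remaining = staying  # fewer readers -> lower benefit -> more leave

print(remaining_readers(100, noise_messages=50, relevant_messages=100))    # low noise: 100 stay
print(remaining_readers(100, noise_messages=5000, relevant_messages=100))  # heavy noise: 0 stay
```

With low noise, every reader's benefit exceeds the cost and the group is stable; past a threshold, the least tolerant readers leave first, which lowers the benefit for everyone else and triggers the full cascade, mimicking the censorship-like collapse the paragraph describes.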
To illustrate the problem, consider the cases of soc.culture.jewish (SCJ) and sci.psychology.psychotherapy (SPP). Neither newsgroup is moderated. As a result, it is exceedingly difficult for Jews to discuss Jewish issues with other Jews on SCJ, or for psychotherapists to discuss psychotherapy with other psychotherapists on SPP. Because anyone can join and send any message, the purpose for which each newsgroup was established becomes diluted.
In most cases where a group finds value in only a few of the messages in a newsgroup, the problem can be addressed without moderation. Simply, members of the group that find low utility in the newsgroup migrate elsewhere. For example, Singaporean users of Usenet migrated from soc.culture.asean into a new newsgroup called soc.culture.singapore, because they wanted to talk about Singapore without having to receive communications about Indonesia or the Philippines.
There is one circumstance where community migration is not feasible: when the community is opposed by an adversarial group. An adversarial group defines itself as the opponent of another group (called the opposed group here). Nazis, creationists/intelligent designers, and Scientologists, for example, are opposed to Jews, evolutionists, and psychotherapists, respectively. Any group that exists online must communicate its beliefs, as there is no online presence without communication. For example, a group that espouses intelligent design must speak out against evolution, as opposition to evolution is an inherent part of believing in intelligent design. Moreover, adversarial groups seek conflict. Because Scientology opposes psychotherapy, some members of the online group of Scientologists will seek conflict with psychotherapists. The relationship between the adversarial and opposed groups is inherently parasitic: to assert its identity, the adversarial group must argue with the opposed group, and must therefore frequent the virtual communities that the opposed group inhabits.
Conversely, the opposed group does not necessarily define itself in relationship to the adversarial group. Jews, evolutionists, and psychotherapists would generally prefer that Nazis, intelligent designers, and Scientologists leave them alone. Thus, every time a member of the adversarial group communicates his adversarial belief to the opposed group, the value of the virtual community decreases for every member of the opposed group. Furthermore, for the opposed group, emigration is not an option: regardless of where they move their online discussion, the opposed group will be pursued by the adversarial group. Jews pursued by Nazis have no space in these open forums to discuss how to raise their children following Jewish traditions. Biologists pursued by creationists cannot discuss the latest biological research. Thus, for opposed groups, adversarial group messages have properties identical to censorship.
For such opposed groups, moderation provides an important way to enable them to speak. Under moderation, members of the adversarial group are prohibited from communicating on the opposed group's newsgroup, which in turn frees up bandwidth and capacity for the opposed group to communicate. Moderation does not, however, prohibit members of the adversarial or opposed group from joining other virtual communities. Indeed, because members of the adversarial group remain a threat to the opposed group on other virtual communities, some members of the opposed group will argue with the adversarial group elsewhere. Thus, ironically, the total diversity of speech across all forums is increased: the opposed group speaks on the moderated forum, while the adversarial group speaks on another forum where the adversarial and opposed groups co-exist.
To illustrate, compare the conversation topics on soc.culture.jewish.moderated (SCJM) and sci.psychology.psychotherapy.moderated (SPPM) with their non-moderated counterparts soc.culture.jewish (SCJ) and sci.psychology.psychotherapy (SPP). Topics on the moderated groups tend to address within-group issues. For example, a typical topic on soc.culture.jewish.moderated might extol the virtues of schmaltz (cooking fat rendered from poultry). In contrast, the non-moderated groups contain cross-group arguments. Debates on Christian vs. Jewish concepts, for example, are common on non-moderated newsgroups.
Furthermore, it is possible to demonstrate that within-group conversations only successfully emerge when the opposed group creates a moderated forum. SCJ and SPP predate SCJM and SPPM, and the justifications for the creation of SCJM and SPPM explicitly state the need to preserve within-group conversations from adversarial groups. A visit to the various newsgroups will reveal the difference in conversation topics.
Moderation has real value in encouraging free speech. Without moderation, speech by adversarial groups would quickly kill off speech by opposed groups. Thus, speech is most diverse when some virtual communities are moderated while others are not: speech by adversarial groups thrives in unmoderated environments, while speech by opposed groups thrives in moderated ones.