Tuesday, June 24, 2008

Clique cohesion requires an enemy

Wikipedia editors are by and large too individualistic to stick together in cliques. Even in situations where several editors work together on an article, there is not likely to be full agreement and editors must learn to accept that others approach the topic differently. It's not much fun to make big compromises, and most editors prefer to work on articles where they can make edits unhindered by the need to accommodate others.

Cohesion in a clique requires a force to counteract the friction of working with others. That force is usually ideological, though personal ties reinforce the ideological. A clique will form with the intent of protecting territory from an enemy. A sense of in-group and out-group develops. The out-group is seen to have reprehensible beliefs or to belong to a reprehensible segment of the population. The clique requires an enemy, and will assign that role to almost any editor who enters their territory, imputing to this stray editor the worst of motives.

In such cliques, there is not likely to be much agreement beyond the agreement that the enemy must be kept from the territory. The pages protected by a clique are likely to be badly written and full of erroneous information. The reason for the low quality is not the constant attacks of the enemy, but rather that the clique controlling the territory is unable to organize itself for the task of article writing. In fact, the members of the clique might not know much about the subject of the pages they are defending, since ideological fervor is often a sign that one does not understand the full complexity of an issue (an example of the Dunning-Kruger effect).

The ability of the clique to control territory is based upon the willingness of clique members to back each other up in an edit dispute. In Wikipedia, of course, edit disputes are over content, so there is always an argument about content at the surface of a dispute. Clique members have the illusion that they have won a dispute because they were correct about content, when in fact they have only won because of their numbers and their cohesiveness. Arguments over content are likely to be repeated whenever a new editor wanders into their territory, and the clique finds its arguments more compelling each time they are repeated (an "availability cascade"). Trust simultaneously strengthens among the members, as each sees that the others, again and again, back them up. Cliques thus can become stronger over time.

Cliques can usually stay within the boundaries of the rules. A numerical advantage allows a clique to out-revert the solitary editors who oppose it, and the ease with which the clique wins content disputes helps its members stay calm and civil. Solitary editors in conflict with the clique are more likely to run afoul of the rules, reverting too often or exploding with frustration and saying uncivil things. As the rules currently stand, cliques are not often threatened by administrators.

Cliques are a special form of the "ownership" problem on Wikipedia. One proposal I've seen is that all of the editors active on an article or its talk pages can be asked to leave (for several months), opening up the article to a new set of editors. For this to work, a rule must determine when an article has reached a state at which all of the active editors must go elsewhere. It must be a simple rule, requiring little research on the part of the administrator, so that it cannot be contested--like the 3RR rule.

Wednesday, February 20, 2008

Free riding in Wikipedia

Axelrod and Hamilton (1981) showed that a winning strategy in repeated prisoner's dilemma games is tit-for-tat; that is, one should start out behaving cooperatively, and then on every subsequent move simply copy what the other agent did on her previous move. This result, however, applies only to two-person, repeated games. In games where multiple agents interact with each other, even a small number of defectors will prompt the conditional cooperators to also defect, so that the game moves to an equilibrium of no cooperation. Thus, the model cannot explain how cooperation persists in groups with multi-agent interactions.
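The tit-for-tat idea is simple enough to sketch in a few lines. The payoff values below (mutual cooperation 3, mutual defection 1, exploiting a cooperator 5, being exploited 0) are the conventional prisoner's dilemma numbers, chosen here for illustration rather than taken from the paper:

```python
# Repeated prisoner's dilemma: tit-for-tat vs. unconditional defection.
# Moves are "C" (cooperate) or "D" (defect).

PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(opponent_history):
    # Cooperate first; afterwards, copy the opponent's previous move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Play a repeated game; return (score_a, score_b)."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)
        move_b = strategy_b(history_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b
```

Two tit-for-tat players cooperate throughout (30 points each over ten rounds), while tit-for-tat loses only the first round to an unconditional defector before matching its defection. The multi-agent failure described above arises when conditional cooperators like tit-for-tat meet defectors in a group: each defection they copy is itself copied by other conditional cooperators, and cooperation unravels.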

Experimental games have shown that cooperation can be maintained when defectors are punished. But since punishment is costly to the punisher, it would be rational for a player to let others punish the defectors--these rational players are called second-order free riders (since they are free riding by letting others punish free riders). Without rewards for the punishers, or punishment for the second-order free riders, no punishment of defectors will occur.
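The second-order free-rider problem can be shown with one round of a public goods game with costly punishment. All parameter values here (contribution c, multiplier r, punishment cost k, fine f) are illustrative assumptions, not figures from the experimental literature:

```python
# One round of a public goods game with costly punishment.
# Three types: cooperators who never punish (second-order free riders),
# punishing cooperators, and defectors.

def round_payoffs(n_coop, n_punish, n_defect, c=1.0, r=3.0, k=0.2, f=1.0):
    """Return (cooperator, punisher, defector) payoffs for one round."""
    n = n_coop + n_punish + n_defect
    pot = (n_coop + n_punish) * c          # defectors contribute nothing
    share = r * pot / n                    # multiplied pot split equally
    cooperator = share - c                 # contributes, never punishes
    punisher = share - c - k * n_defect    # also pays k per defector punished
    defector = share - f * n_punish        # fined f by each punisher
    return cooperator, punisher, defector
```

With, say, five cooperators, five punishers, and two defectors, punishment makes defection unprofitable, but the non-punishing cooperators always end the round ahead of the punishers: they enjoy the cooperation that punishment sustains without paying for it, which is exactly the second-order free-riding described above.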

Panchanathan and Boyd (2003) have shown that reputation works well in models as the reward sought by punishers, who gain in reputation by helping the deserving (those with good reputations) and punishing the undeserving (those with bad reputations--i.e., persistent defectors). The ultimate reward of a good reputation is that others willingly cooperate with its holder. The reputation model requires that agents know each other's reputations, which is only realistic in small groups.
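The "standing" rule at the heart of such models can be sketched very simply. This is a bare simplification of the idea, not the actual Panchanathan-Boyd model: discriminating agents help only recipients in good standing, and refusing to help damages one's own standing only when the refusal is unjustified (i.e., when the recipient was in good standing):

```python
# The "standing" rule from models of indirect reciprocity (simplified).
GOOD, BAD = "good", "bad"

def discriminator_helps(recipient_standing):
    # Help the deserving; withhold help from persistent defectors.
    return recipient_standing == GOOD

def update_standing(helped, recipient_standing):
    """Return the donor's new standing after a help/refuse decision."""
    if helped:
        return GOOD
    # Refusal is justified against a bad-standing recipient, so it is
    # not punished; refusing a good-standing recipient costs standing.
    return GOOD if recipient_standing == BAD else BAD
```

The key feature is that withholding cooperation from a defector does not count against the punisher's reputation, so punishment is no longer a pure cost: it is how an agent keeps (or earns) the good standing that attracts future cooperation.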

Second-order free riding is a common problem on Wikipedia. One often encounters editors who appear to be doing something wrong: inserting a strange point of view; moving pages without discussion; deleting good work done by others. But it is costly to battle these people. Conflict itself is unpleasant, and one is likely to break some rules when in a conflict. If an administrator observes the broken rule, she is unlikely to have the time to look back at the history of the conflicting parties and figure out who is the rogue and who is the guardian, and she will simply punish whoever broke the rules. What she lacks is ready access to information on the reputations of the parties in conflict.

A mechanism to record and update reputation would encourage good editors to oppose the actions of bad editors--it would reduce the problem of second-order free riding, and make it more difficult for rogue editors to get their way.
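Such a mechanism might be as simple as a running score per editor, updated from dispute outcomes. The sketch below is entirely hypothetical--the class, the update rule, and the editor names are invented for illustration and correspond to no existing Wikipedia feature:

```python
# Hypothetical reputation ledger for editors (not a real Wikipedia
# mechanism). Scores rise when an editor's position in a dispute is
# upheld and fall when it is overturned or sanctioned.
from collections import defaultdict

class ReputationLedger:
    def __init__(self):
        self.score = defaultdict(int)

    def record_dispute(self, upheld_editor, overturned_editor):
        # Consensus vindicated one party and went against the other.
        self.score[upheld_editor] += 1
        self.score[overturned_editor] -= 1

    def standing(self, editor):
        # What an administrator would consult before handing out blocks.
        return self.score[editor]
```

An administrator arriving at a fresh conflict could then weigh a broken rule against the parties' records, instead of punishing the guardian and the rogue alike.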
  1. Axelrod, Robert, and William D. Hamilton. 1981. “The Evolution of Cooperation.” Science 211: 1390-1396.
  2. Panchanathan, K., and R. Boyd. 2003. “A tale of two defectors: the importance of standing for evolution of indirect reciprocity.” Journal of Theoretical Biology 224: 115-126.