Friday, May 8, 2009

In the long run, we are all dead

The Classical school of political economy holds the view that recessions are short-lived and self-correcting. When a recession occurs, some labor and capital fall idle. According to the Classicals, these idle factors of production react by accepting employment at a lower price; factor prices continue to fall until firms have willingly hired all "surplus" factors. Thus, unemployment (of capital and labor) automatically ends, through the mechanism of falling factor prices--a mechanism requiring no government intervention.
John Maynard Keynes famously disagreed with the Classical view, arguing that factor prices are "downwardly rigid"--they do fall when factors are idle, but reluctantly and slowly. "In the long run," Keynes acknowledged, falling factor prices would eliminate the recession, "but in the long run, we are all dead." In other words, the self-correcting mechanism works too slowly to be useful.
This is also the problem with the eventualist view of Wikipedia. Sure, given enough editors, articles will eventually evolve toward something fair and balanced, but the problem is that "eventually" can be a very long time. Too long to be useful.
Some parts of Wikipedia move towards excellence much faster than others. A sophisticated user of Wikipedia understands this, and has learned which parts are worthy of trust and which parts should best be avoided. The problem is that the average user of Wikipedia is not sophisticated--she is a high school student or college student, looking up a topic about which she knows next to nothing. That the topic may eventually have a good article is not much consolation to the student who is imbibing misinformation today. And it is certainly no consolation to her teacher, who will continue to ban Wikipedia as a legitimate source for research papers.

Tuesday, June 24, 2008

Clique cohesion requires an enemy

Wikipedia editors are by and large too individualistic to stick together in cliques. Even in situations where several editors work together on an article, there is not likely to be full agreement and editors must learn to accept that others approach the topic differently. It's not much fun to make big compromises, and most editors prefer to work on articles where they can make edits unhindered by the need to accommodate others.

Cohesion in a clique requires a force to counteract the friction of working with others. That force is usually ideological, though personal ties reinforce the ideological. A clique will form with the intent of protecting territory from an enemy. A sense of in-group and out-group develops. The out-group is seen to have reprehensible beliefs or to belong to a reprehensible segment of the population. The clique requires an enemy, and will assign that role to almost any editor who enters their territory, imputing to this stray editor the worst of motives.

In such cliques, there is not likely to be much agreement beyond the agreement that the enemy must be kept from the territory. The pages protected by a clique will be badly written and full of erroneous information. The reason for the low quality is not the constant attacks of the enemy, but rather that the clique controlling the territory is unable to organize itself for the task of article writing. In fact, the members of the clique might not know much about the subject of the pages they are defending, since ideological fervor is often a sign that one does not understand the full complexity of an issue (an example of the Dunning-Kruger effect).

The ability of the clique to control territory is based upon the willingness of clique members to back each other up in an edit dispute. In Wikipedia, of course, edit disputes are over content, so there is always an argument about content at the surface of a dispute. Clique members have the illusion that they have won a dispute because they were correct about content, when in fact they have only won because of their numbers and their cohesiveness. Arguments over content are likely to be repeated whenever a new editor wanders into their territory, and the clique finds its arguments more compelling each time they are repeated (an "availability cascade"). Trust simultaneously strengthens among the members, as each sees that the others, again and again, back them up. Cliques thus can become stronger over time.

Cliques can usually stay within the boundaries of the rules. A numerical advantage allows a clique to out-revert the solitary editors who oppose it, and the ease with which it wins content disputes helps its members stay calm and civil. Solitary editors in conflict with the clique are more likely to run afoul of the rules, reverting too often or exploding with frustration and saying uncivil things. As the rules currently stand, cliques are not often threatened by administrators.

Cliques are a special form of the "ownership" problem on Wikipedia. One proposal I've seen is that all of the editors active on an article or its talk pages can be asked to leave (for several months), opening up the article to a new set of editors. For this to work, a rule must determine when an article has reached a state in which all of the active editors must go elsewhere. It must be a simple rule, requiring little research on the part of the administrator, so that it cannot be contested--like the 3RR rule.

Wednesday, February 20, 2008

Free riding in Wikipedia

Axelrod and Hamilton (1981) showed that a winning strategy in prisoner's dilemma games is tit-for-tat; that is, one should start out behaving cooperatively, and then on every subsequent move simply match what the other agent did in her previous move. This strategy, however, describes only 2-person, repeated games. In games where multiple agents interact with each other, even a small number of defectors will prompt the conditional cooperators to also defect, so that the game moves to an equilibrium of no cooperation. Thus, the model cannot explain how cooperation persists in groups with multi-agent interactions.
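The two-player result can be sketched in a few lines. The payoff numbers below are the standard illustrative values, not taken from the paper itself: tit-for-tat sustains cooperation against another tit-for-tat player, but against an unconditional defector it is drawn into mutual defection after a single exploited round.

```python
# Iterated prisoner's dilemma with standard illustrative payoffs:
# both cooperate -> 3 each; both defect -> 1 each;
# lone defector -> 5, exploited cooperator -> 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []          # each side's record of the *opponent's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation: (30, 30)
print(play(tit_for_tat, always_defect))  # cooperation collapses: (9, 14)
```

With many agents, each conditional cooperator's history soon contains a defection from someone, so the same copying rule spreads defection through the whole group--which is the limitation noted above.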

Experimental games have shown that cooperation can be maintained when defectors are punished. But since punishment is costly to the punisher, it would be rational for a player to let others punish the defectors--these rational players are called second-order free riders (since they are free riding by letting others punish free riders). Without rewards for the punishers, or punishment for the second-order free riders, no punishment of defectors will occur.
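The arithmetic of second-order free riding can be made concrete with a one-round public goods game. The numbers here are my own illustrative choices, not figures from the experimental literature: punishment is costly to the punisher, so a cooperator who lets someone else do the punishing always ends the round richer than the punisher.

```python
# One round of a public goods game with costly punishment.
# Illustrative numbers: each of 4 players gets an endowment of 10;
# contributions are doubled and split equally; punishing costs the
# punisher 1 and the punished player 3.
ENDOWMENT, MULTIPLIER = 10, 2.0
COST_TO_PUNISH, FINE = 1, 3

def payoffs(contributions, punishments):
    """punishments[i] is the set of players whom player i fines."""
    n = len(contributions)
    pot_share = MULTIPLIER * sum(contributions) / n
    pay = [ENDOWMENT - c + pot_share for c in contributions]
    for i, targets in enumerate(punishments):
        for j in targets:
            pay[i] -= COST_TO_PUNISH   # punishing is costly to the punisher
            pay[j] -= FINE             # the defector pays the fine
    return pay

# Players 0-2 contribute fully; player 3 defects. Player 0 punishes
# the defector; players 1 and 2 free ride on that punishment.
print(payoffs([10, 10, 10, 0], [{3}, set(), set(), set()]))
# [14.0, 15.0, 15.0, 22.0]
```

The punisher (player 0) ends up worse off than the cooperators who did nothing, so a rational cooperator waits for someone else to punish--and without some reward for punishing, nobody does.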

Panchanathan and Boyd (2003) have shown that reputation works well in models as the reward sought by punishers, who gain in reputation by helping the deserving (those with good reputations) and punishing the undeserving (those with bad reputations--i.e., persistent defectors). The ultimate reward of reputation is that others willingly cooperate with them. The reputation model requires that agents know each other's reputations, which is only realistic in small groups.

Second-order free riding is a common problem on Wikipedia. One often encounters editors who appear to be doing something wrong: inserting a strange point-of-view; moving pages without discussion; deleting good work done by others. But it is costly to battle these people. Conflict itself is unpleasant, and one is likely to break some rules when in a conflict. If an administrator observes the broken rule she is unlikely to have the time to look back at the history of the conflicting parties and figure out who is the rogue and who is the guardian, and she will simply punish whoever broke the rules. What she lacks is ready access to information on the reputations of the parties in conflict.

A mechanism to record and update reputation would encourage good editors to oppose the actions of bad editors--it would reduce the problem of second-order free riding, and make it more difficult for rogue editors to get their way.
  1. Axelrod, Robert, and William D. Hamilton. 1981. “The Evolution of Cooperation.” Science 211: 1390-1396.
  2. Panchanathan, K., and R. Boyd. 2003. “A tale of two defectors: the importance of standing for evolution of indirect reciprocity.” Journal of Theoretical Biology 224:115-126.

Sunday, November 11, 2007

Tempers boiling!

Disputes are easy to enter. One can very easily lose sight of Wikipedia as a place one writes an encyclopedia, and instead view Wikipedia as the place one fights back against that jerk. Two psychological phenomena appear to be responsible for this problem.

Tit for tat: Game theory has established that the strategy most likely to help a social organism pass on its genes is one of tit-for-tat. Start out cooperating with other people, but if they fail to cooperate then immediately switch to their behavior. In Wikipedia, when an editor refuses to hear our comments, we respond by refusing to hear their comments. After a few exchanges, there is no longer any realistic chance for dialog, since the initial exchanges have created a pair of editors who don't want to cooperate with each other.

A major problem with tit-for-tat is that the anger and hostility created by one non-cooperative encounter is likely to be carried on to the next encounter. If an editor has proved to be a real pain, then we are primed to consider that the next editor we meet will also prove to be a big pain. We are ready to drop into non-cooperative behavior at the first provocation.

Self-serving assessments: We are fairly accurate at determining the biases and errors of others, but much less accurate at identifying our own biases (Epley and Dunning 2000). Each editor in a dispute will be very conscious of the misbehaviors and errors of the others, but not conscious of her own. This has an evolutionary advantage in that we can convincingly present ourselves to other people as virtuous, and gain their acceptance as desirable partners in reciprocal relationships--we are convincing because we believe the story ourselves. At the same time, it is also to our advantage to assess correctly the suitability of others as partners, so we are not so biased when it comes to looking at other people.

The problem here is that few editors have the insight to understand that they are partially at fault: most will perceive that the fault lies overwhelmingly on the other side. And even if both have this insight, confessing that one is at fault is a good example of the prisoners' dilemma. If both editors can admit that they are at fault, then the conflict can quickly end and cooperation begin. If only one editor admits that she is at fault, then the other will emerge victorious in the conflict, so the one who admits fault is in a bad situation. To avoid this bad situation, both editors will deny that they are at fault.
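The dilemma in the last paragraph can be written out as a payoff matrix. The ordinal payoffs are assumptions of mine chosen to match the story (higher is better for that editor); with them, denying fault is each editor's best reply no matter what the other does.

```python
# The "admit fault" game with illustrative ordinal payoffs:
# both admit        -> conflict ends, cooperation begins   (3, 3)
# I admit, you deny -> you win the dispute                 (0, 5)
# I deny, you admit -> I win                               (5, 0)
# both deny         -> the conflict grinds on              (1, 1)
PAYOFF = {("admit", "admit"): (3, 3), ("admit", "deny"): (0, 5),
          ("deny", "admit"): (5, 0), ("deny", "deny"): (1, 1)}

def best_response(opponent_move):
    """What should one editor do, given the other's move?"""
    return max(["admit", "deny"],
               key=lambda my: PAYOFF[(my, opponent_move)][0])

for theirs in ("admit", "deny"):
    print(f"if the other editor plays {theirs!r}, best reply: {best_response(theirs)}")
# Denying is the best reply either way, so (deny, deny) is the
# equilibrium -- even though (admit, admit) leaves both better off.
```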


  • Epley, Nicholas, and David Dunning. 2000. "Feeling 'holier than thou': Are self-serving assessments produced by errors in self- or social prediction?" Journal of Personality and Social Psychology 79(6): 861-875.



Wednesday, October 31, 2007

Laziness

Lots of editors are just plain lazy. The last thing they intend to do is to look up sources. If they look up sources, they use Google and find something on-line. But even that is usually too much trouble. Even writing is too much trouble. Even reading is too much trouble. So what's left? Opinions, expressed on a talk page. Deletions of someone else's work. Pasting tags: POV, DELETE, whatever. These editors aren't trolls, but they are slackers. And like slackers in a college class, you just want to give them an F and send them home. But you can't, of course, and you have to waste your own energy countering their opinions, reverting their edits, and teaching them policy.

Wikipedia is full of friction. This is not a gang of enthusiastic boy scouts eagerly working together to put up camp. This is a bunch of eccentric strangers each trying to do things their own way. Interaction between editors is seldom efficient teamwork. One seldom hears, "I'll take care of this, and you do that!" Instead one hears, "What exactly do you think you are doing?", followed by a day of exchanges on a talk page. It's best to avoid other editors: they are either trolls or slackers; they just slow you down and cause friction.

But who would want to be here if there were no other editors? Isn't the problem with Citizendium that it's just too lonely? Other editors are friction, sure, but they are also the reason we come here, to participate with a community doing something worthwhile. And since we are all eccentric strangers, we don't pull together in harness in the same direction. Friction is inevitable.

The evolved rules help reduce friction. Trolls are beasts that generate way too much friction, and the rules have evolved to force out the incorrigibly trollish, and to make the rest of us suppress our trollish natures. Slackers generate friction too, but the rules don't appear to touch them.

I would like to see, not a formal rule, but a norm, that anyone who complains that an article is unbalanced has the obligation to do something about it. This wouldn't apply so much to someone who notices an objectively evident problem (for example, that the article lacks citations), but would apply full force to people who delete large sections of an article because they object to its POV, or paste POV tags on an article. A norm like this would reduce the friction caused by slackers.

Sunday, October 14, 2007

Irrational exuberance

Stock markets are one of the most obvious examples of the "Wisdom of Crowds." The efficient market hypothesis holds that the current price of a stock reflects all available information, that any changes in a stock's price can only come from information that is unavailable, and that therefore no one can predict changes in a stock's price from available information. The most important piece of information, of course, is the current and future earnings of the stock-issuing firm.

In the 1990s, stock investors became markedly more optimistic about stocks, without really encountering much in the way of new information. Alan Greenspan famously described this as a period of "irrational exuberance". The crowd--not as wise as one would have hoped--succeeded in creating a stock bubble, which eventually popped.

In Wikipedia, one can encounter editors who are possessed with what may be described as "irrational exuberance." For example, an editor may engage in a months-long battle with other editors over something as trivial as the renaming of a page, writing hundreds of talk page comments, provoking other editors to respond with even more comments, and eventually winning because the others finally give up waging such a pointless battle.

Monomania can be rewarded in Wikipedia, in that someone absurdly devoted to a particular issue can get his way. Nevertheless, the monomaniac can only win if the issue has no devoted partisans on the other side, and in practice this means that he can win only if the issue is relatively trivial. And the monomaniac can only win for a short time. As soon as he grows tired of Wikipedia and leaves for other interests, his work will be reverted, since new editors will find it so out of tune with the prevailing view.

Telling the teacher

In elementary school the teacher is an all-powerful enforcer, who interprets the rules of the school, and metes out reward and punishment. When children encounter conflict they turn to this enforcer, they "tell the teacher." There is some opprobrium associated with tattling, so it is considered only a last resort, and chosen only by the loser in a conflict, after peers have refused to rally to his side.

Wikipedia is celebrated as an anarchy, where it is possible to ignore rules, and where everything should be settled by consensus among reasonable editors. Nevertheless, these editors all came up through elementary schools, and all have learned the tactic of telling the teacher when they feel unfairly used. The analogue of the teacher in Wikipedia is the administrator, and administrators must constantly lend an ear to the complaints of the insecure and the marginalized.

In fact, administrators can't do much. A conflict takes time to develop, and may be spread over many pages; the users all have a background, providing context for the conflict, spreading over many more pages. A just appraisal of a conflict requires too much work, and no one is interested in a conflict to which they have not been a party.

But, on the other hand, users appeal regularly to administrators, and it seems wrong to simply ignore the appeal. There are a few rules that provide a clean way to assign guilt to a user, without requiring an administrator to study background material: the three-revert rule, and rules against threats, especially legal threats. Violate one of these rules, and the case is immediately decided; otherwise, the case is more murky, and the administrator's most sensible response is to say, in diplomatic terms, "don't be a crybaby, learn to get along with other children."

A system like Wikipedia needs a few rules whose violation leads to near-certain punishment. A good system of rules relies on a very narrow set of information: detection of violators should require reading only a small amount of text, and should require no effort to interpret what the user really meant. Violating these rules should function as a tripwire, leading to automatic punishment. Most of the worst species of trolls are weeded out with these tripwire rules.

Such rules create an environment in which a certain kind of troll has evolved: a troll who appears to be polite, but who edits aggressively, who is a master of subtle tactics of obfuscation and obstruction, and whose polite comments often appear to their recipients as thinly veiled taunts. Never in violation of the tripwire rules, this kind of troll flourishes in Wikipedia, and can only be detected by the subjective and laborious task of reading his posts and interpreting his meaning--a task that no administrator is willing to take on.