Sunday, November 11, 2007

Tempers boiling!

Disputes are easy to enter. One can very easily lose sight of Wikipedia as a place where one writes an encyclopedia, and instead come to view it as the place where one fights back against that jackass. There are two psychological phenomena apparently responsible for this problem.

Tit for tat: Game theory has established that the strategy most likely to help a social organism pass on its genes is one of tit-for-tat: start out cooperating with other people, but if they fail to cooperate, immediately switch to matching their behavior. In Wikipedia, when an editor refuses to hear our comments, we respond by refusing to hear their comments. After a few exchanges, there is no longer any realistic chance for dialog, since the initial exchanges have created a pair of editors who don't want to cooperate with each other.
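The dynamic is easy to see in a toy simulation. Below is a minimal sketch in Python, with invented payoff numbers and a bare-bones tit_for_tat helper; nothing here is drawn from actual Wikipedia data. Two tit-for-tat players meet, one of them opens uncooperatively, and the retaliation echoes back and forth without ever dying out.

```python
# Iterated prisoner's dilemma with two tit-for-tat players (toy payoffs).
# Payoffs are (player A, player B); the numbers are purely illustrative.
PAYOFFS = {
    ("C", "C"): (3, 3),   # both cooperate
    ("C", "D"): (0, 5),   # A cooperates, B defects
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # both defect
}

def tit_for_tat(opponent_history):
    """Cooperate on the first move; afterwards copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def play(rounds=6, b_opening="D"):
    """Both players use tit-for-tat, but B's first move is forced to b_opening."""
    a_hist, b_hist = [], []
    for r in range(rounds):
        a_move = tit_for_tat(b_hist)
        b_move = b_opening if r == 0 else tit_for_tat(a_hist)
        a_hist.append(a_move)
        b_hist.append(b_move)
    return list(zip(a_hist, b_hist))

if __name__ == "__main__":
    # One uncooperative opening by B keeps echoing: each defection is answered
    # with a defection, and the pair never returns to mutual cooperation.
    for moves in play():
        print(moves, "->", PAYOFFS[moves])
```

The point of the sketch is only that tit-for-tat has no built-in way to forgive: a single defection keeps reverberating, which is roughly what a pair of editors trading reverts looks like.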

A major problem with tit-for-tat is that the anger and hostility created by one non-cooperative encounter is likely to be carried into the next encounter. If an editor has proved to be a real pain, we are primed to expect that the next editor we meet will be one too. We are ready to drop into non-cooperative behavior at the first provocation.

Self-serving assessments: We are fairly accurate at determining the biases and errors of others, but much less accurate at identifying our own biases (Epley and Dunning 2000). Each editor in a dispute will be very conscious of the misbehaviors and errors of the others, but not conscious of her own. This has an evolutionary advantage in that we can convincingly present ourselves to other people as virtuous, and gain their acceptance as desirable partners in reciprocal relationships--we are convincing because we believe the story ourselves. At the same time, it is also to our advantage to assess correctly the suitability of others as partners, so we are not so biased when it comes to looking at other people.

The problem here is that few editors have the insight to understand that they are partially at fault: most will perceive that the fault lies overwhelmingly on the other side. And even if both have this insight, confessing that one is at fault is a good example of the prisoner's dilemma. If both editors can admit that they are at fault, then the conflict can quickly end and cooperation begin. If only one editor admits that she is at fault, then the other will emerge victorious in the conflict, so the one who admits fault is in a bad situation. To avoid this bad situation, both editors will deny that they are at fault.
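A toy payoff table makes the dilemma concrete; the numbers below are invented purely for illustration. Mutual admission pays each editor more than mutual denial, yet denying fault is the better reply whatever the other editor does, so mutual denial is where the dispute settles.

```python
# Toy payoff table for the "admit fault" dilemma (values are illustrative).
# Each entry maps (my_choice, their_choice) -> my payoff.
PAYOFF = {
    ("admit", "admit"): 2,   # conflict ends, cooperation resumes
    ("admit", "deny"): -1,   # I look guilty, the other editor "wins"
    ("deny", "admit"): 3,    # I emerge victorious
    ("deny", "deny"): 0,     # the dispute grinds on
}

def best_response(their_choice):
    """Return the choice that maximizes my payoff against a fixed opponent choice."""
    return max(("admit", "deny"), key=lambda mine: PAYOFF[(mine, their_choice)])

if __name__ == "__main__":
    for theirs in ("admit", "deny"):
        print(f"If the other editor will {theirs}, my best reply is to {best_response(theirs)}")
    # Both best replies come out "deny", even though (admit, admit) pays each
    # editor more than (deny, deny).
```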


  • Epley, N., and D. Dunning. 2000. Feeling "holier than thou": Are self-serving assessments produced by errors in self- or social prediction? Journal of Personality and Social Psychology 79(6): 861-875.



Wednesday, October 31, 2007

Laziness

Lots of editors are just plain lazy. The last thing they intend to do is to look up sources. If they look up sources, they use Google and find something on-line. But even that is usually too much trouble. Even writing is too much trouble. Even reading is too much trouble. So what's left? Opinions, expressed on a talk page. Deletions of someone else's work. Pasting tags: POV, DELETE, whatever. These editors aren't trolls, but they are slackers. And like slackers in a college class, you just want to give them an F and send them home. But you can't, of course, and you have to waste your own energy countering their opinions, reverting their edits, and teaching them policy.

Wikipedia is full of friction. This is not a gang of enthusiastic boy scouts eagerly working together to put up camp. This is a bunch of eccentric strangers each trying to do things their own way. Interaction between editors is seldom efficient teamwork. One seldom hears, "I'll take care of this, and you do that!" Instead one hears, "What exactly do you think you are doing?", followed by a day of exchanges on a talk page. It's best to avoid other editors: they are either trolls or slackers; they just slow you down; they just cause friction.

But who would want to be here if there were no other editors? Isn't the problem with Citizendium that it's just too lonely? Other editors are friction, sure, but they are also the reason we come here, to participate with a community doing something worthwhile. And since we are all eccentric strangers, we don't pull together in harness in the same direction. Friction is inevitable.

The evolved rules help reduce friction. Trolls are beasts that generate way too much friction, and the rules have evolved to force out the incorrigibly trollish, and to make the rest of us suppress our trollish natures. Slackers generate friction too, but the rules don't appear to touch them.

I would like to see, not a formal rule, but a norm, that anyone who complains that an article is unbalanced has the obligation to do something about it. This wouldn't apply so much to someone who notices an objectively evident problem (for example, that the article lacks citations), but would apply full force to people who delete large sections of an article because they object to its POV, or paste POV tags on an article. A norm like this would reduce the friction caused by slackers.

Sunday, October 14, 2007

Irrational exuberance

Stock markets are one of the most obvious examples of the "Wisdom of Crowds." The efficient market hypothesis holds that the current price of a stock reflects all available information, that changes in a stock's price can come only from new information that is not yet available, and that therefore no one can predict changes in a stock's price from available information. The most important piece of information, of course, is the current and future earnings of the stock-issuing firm.

In the 1990s, stock investors became markedly more optimistic about stocks, without really encountering much in the way of new information. Alan Greenspan famously described this as a period of "irrational exuberance". The crowd--not as wise as one would have hoped--succeeded in creating a stock bubble, which eventually popped.

In Wikipedia, one can encounter editors who are possessed by what may be described as "irrational exuberance." For example, an editor may engage in a months-long battle with other editors over something as trivial as the renaming of a page, writing hundreds of talk page comments, provoking other editors to respond with even more, and eventually winning because the other editors have finally given up waging such a pointless battle.

Monomania can be rewarded in Wikipedia, in that someone absurdly devoted to a particular issue can get his way. Nevertheless, the monomaniac can only win if the issue has no devoted partisans on the other side, and in practice this means that he can win only if the issue is relatively trivial. And the monomaniac can only win for a short time. As soon as he grows tired of Wikipedia and leaves for other interests, his work will be reverted, since new editors will find it so out of tune with the prevailing view.

Telling the teacher

In elementary school the teacher is an all-powerful enforcer, who interprets the rules of the school, and metes out reward and punishment. When children encounter conflict they turn to this enforcer, they "tell the teacher." There is some opprobrium associated with tattling, so it is considered only a last resort, and chosen only by the loser in a conflict, after peers have refused to rally to his side.

Wikipedia is celebrated as an anarchy, where it is possible to ignore rules, and where everything should be settled by consensus among reasonable editors. Nevertheless, these editors all came up through elementary schools, and all have learned the tactic of telling the teacher when they feel unfairly used. The analogue of the teacher in Wikipedia is the administrator, and these must constantly lend an ear to the complaints of the insecure and the marginalized.

In fact, administrators can't do much. A conflict takes time to develop, and may be spread over many pages; the users all have a background, providing context for the conflict, spreading over many more pages. A just appraisal of a conflict requires too much work, and no one is interested in a conflict to which they have not been a party.

But, on the other hand, users appeal regularly to administrators, and it seems wrong to simply ignore the appeal. There are a few rules that provide a clean way to assign guilt to a user, without requiring an administrator to study background material: the three-revert rule, and rules against threats, especially legal threats. Violate one of these rules, and the case is immediately decided; otherwise, the case is more murky, and the administrator's most sensible response is to say, in diplomatic terms, "don't be a crybaby, learn to get along with other children."

A system like Wikipedia needs a few rules whose violation leads to near-certain punishment. A good system of rules relies on a very narrow set of information: detection of violators should require reading only a small amount of text, and should require no effort to interpret what the user really meant. Violating these rules should function as a tripwire, leading to automatic punishment. Most of the worst species of trolls are weeded out with these tripwire rules.
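As an illustration of how narrow that information set can be, here is a sketch of the kind of check an administrator or bot could run for the three-revert rule. The revert log, editor names, and article title below are all hypothetical; the only point is that a tripwire rule needs nothing but timestamps and a counter, with no interpretation of what anyone meant.

```python
from datetime import datetime, timedelta

# Hypothetical revert log: (editor, article, time of the revert).
REVERTS = [
    ("EditorA", "Some contested article", datetime(2007, 10, 14, 9, 0)),
    ("EditorA", "Some contested article", datetime(2007, 10, 14, 13, 30)),
    ("EditorA", "Some contested article", datetime(2007, 10, 14, 20, 15)),
    ("EditorA", "Some contested article", datetime(2007, 10, 15, 7, 45)),
    ("EditorB", "Some contested article", datetime(2007, 10, 14, 9, 5)),
]

def three_revert_violations(reverts, window=timedelta(hours=24), limit=3):
    """Flag any (editor, article) pair with more than `limit` reverts inside `window`."""
    violations = set()
    history = {}  # (editor, article) -> timestamps of reverts still inside the window
    for editor, article, ts in sorted(reverts, key=lambda r: r[2]):
        key = (editor, article)
        # Keep only reverts that fall inside the trailing window, then add this one.
        recent = [t for t in history.get(key, []) if ts - t <= window]
        recent.append(ts)
        history[key] = recent
        if len(recent) > limit:
            violations.add(key)
    return violations

if __name__ == "__main__":
    # EditorA's fourth revert lands within 24 hours of the first, tripping the wire.
    print(three_revert_violations(REVERTS))
```

Anything subtler than counting, of course, slips right past a tripwire like this.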

Such rules create an environment in which a certain kind of troll has evolved: a troll who appears to be polite, but who edits aggressively, who is a master of subtle tactics of obfuscation and obstruction, and whose polite comments often appear to their recipients as thinly veiled taunts. Never in violation of the tripwire rules, this kind of troll flourishes in Wikipedia, and can only be detected by the subjective and laborious task of reading his posts and interpreting his meaning--a task that no administrator is willing to take on.

Sunday, September 23, 2007

Gresham's Law

"Bad money drives out good," is an adage ascribed to Sir Thomas Gresham, who pointed out that valuable coins were hoarded by anyone so lucky as to receive them, while debased coins were spent, so that only the debased coins remained in circulation.

The adage applies also to any kind of intellectual exchange, such as academic email lists or Wikipedia, and can be expressed in this context as "bad contributors drive out good." I once participated on an email list populated by radical economists, some quite famous. For the first year or so, lurking on the list was one of the highlights of my life, much like sitting in at a conference and listening to what those shaping the field really thought. But very soon, some less famous people began to post, and post frequently. By the second year, it seemed that every third post was from the same graduate student, who had an opinion about everything and couldn't wait to share it. The famous economists fell silent, and then the less-famous but still lucid ones began to disappear, and finally the graduate student had the list all to himself.

My local newspaper allows its online readers to write comments on its articles, a feature that it calls a readers' forum. Very few of the readers are gentlemen or scholars, though, and the average contribution is an angry vent about illegal aliens, government officials, Muslims, or other people's children. Much of the commentary, perhaps the majority of it, takes as its target other commentators, rather than the content of the news article. The comments do have some value, in that they show how some people think in private, and there is also the rare post where someone actually adds information to the article, or provides a lucid analysis. But these rare posts are rare precisely because of Gresham's Law: the bad contributors drive out the good. Why are they driven out? Partly, no doubt, because the audience of other contributors is so obviously indifferent to intelligent discourse, but also because the usual standards of civility do not prevail here--one is in the world of the mob, where barbaric insult is piled on barbaric insult for no other reason than to get a rise out of another commentator. These commentators are trolls, and trolls drive out people looking for intelligent discourse.

Wikipedia has problems of bad editors driving out good. In the first of his laws, Raul maintains that, of the "...highly dedicated users who have left, the vast majority left as a result of trolls, vandals, and/or POV warriors." Since dedicated users are Wikipedia's main resource, he concludes that "...such problem users should be viewed as Wikipedia's biggest handicap." But compared to the anarchy of the newspaper forum, Wikipedia is a highly structured environment, replete with rules designed to ensure that all editors behave civilly, and avoid any personal attacks. As part of this effort, there is a rule that the talk pages focus on the articles rather than on the personalities of the editors. When followed, these rules guarantee that the level of discourse will stay high enough to attract intelligent contributors. When the rules are not followed, there are real sanctions--the violator is banned, at least temporarily.

Nevertheless, Wikipedia will always suffer from problems with trolls, because of two of its features: unrestricted entry of contributors, and contributor anonymity. Unrestricted entry brings in unworthy people--those who are uninformed, uncivil, or both--and anonymity emboldens people to behave in ways they would never behave were they using their real names. Citizendium has sought to remove these two features, and therefore provides a better environment for the highly qualified contributor, but has run into a different problem: the rate of progress (in terms of article creation and improvement) is much higher with unrestricted entry. Perhaps Citizendium is too lonely a place, and therefore not attractive, even to the highly qualified. But perhaps the swarm of random Wikipedia editors does manage to do some good, as long as their work is shaped a bit by a smaller core group of competent editors.

It seems, then, that the best wiki structure is one in which large numbers of people edit, but rules are structured so as to readily identify and discourage troll-like behavior, and a core group of dedicated editors are able to exert a great deal of influence. Wikipedia has all of these features: anonymity and free entry help bring in large numbers of users; rules have evolved to drive out trolls; and a Wikipedia elite, both formally designated and informally acknowledged, shapes policy and content.

Friday, August 31, 2007

Discouraging cliques

Cliques form readily in Wikipedia, primarily because editors dislike conflict, and tend to drift towards articles where like-minded editors are writing. Cliques can be good, especially in articles requiring some specialized knowledge, since they serve to keep incompetent editors at bay. But cliques are often undesirable, because they narrow the range of viewpoints expressed in an article, so that the article does not reflect an NPOV. This is an especially acute problem in articles relating, however tangentially, to some kind of ethnic conflict.

Much Wikipedia policy has evolved to discourage cliques. One interesting behavioral guideline is the rule against canvassing. When a community discussion occurs, editors are forbidden from recruiting allies. An editor may inform others of the discussion, but they are obligated to "keep the number of notifications small..., keep the message text neutral, and not preselect recipients according to their established opinions." As it happens, it is relatively easy to locate allies in Wikipedia, because of the custom of installing userboxes on userpages. These small icons may announce, for example, that the user is a "Libertarian", and will attach the category "User: Libertarian" to his userpage. An editor seeking libertarian allies need only go to the category page for "User: Libertarian" to find a long list of potential helpers. For this reason, political userboxes have recently been deprecated.

Here cliques are viewed as a problem because they disrupt community discussions, a realm populated by administrators, where important decisions should be made. At the level of the article, canvassing is not an issue. At the article level, perhaps the most important policy discouraging cliques is that about "ownership." Editors are admonished to avoid taking a proprietary interest in their articles; they are warned that, within reason, they should not prevent others from editing those articles. Thus a clique is not on firm ground when it seeks to discourage editors with dissenting views. Those editors can appeal to an administrator, who would cite the policy regarding ownership and caution the clique that they must not try to control the article. The exceptions would be, as mentioned in an earlier post, those cases where general cultural prejudices would cause the administrator to view the dissenting editor as a rogue with false or even despicable views. The general cultural prejudices most evident on Wikipedia appear to include ethnocentrism, political correctness, and secularism.

Discouraging ownership gives editors another reason to roam from article to article, making minor changes, rather than fully researching an article. No one enjoys the feeling of spending weeks putting an article together, only to see a stranger show up and, with all the right on his side, change the article's structure and meaning. A few experiences like this, and editors will adopt the style of putting only a little effort into many articles, rather than a lot of effort into one. So the policy of discouraging ownership actually tends to amplify the already existing tendency on Wikipedia for editors to make only minor edits--a tendency, as argued earlier, due to the desire of editors to avoid conflict.

Nevertheless, most articles do have owners, who will defend them. It even seems that most articles that manage to improve do so only because an editor has taken on the task of assembling the good edits into a coherent whole, and reverting the bad edits. The lesson from this might be that only some kinds of articles are capable of becoming good articles in Wikipedia: those where cliques are formed on the basis of specialized expertise, where the article content is not likely to be controversial to those with expertise, and where ownership fulfills the function of defending and improving the article.

Thursday, August 23, 2007

Cliques

As mentioned in an earlier post, cliques form spontaneously in Wikipedia due to the desire of editors to avoid conflict. Editors will abandon articles where their edits are resisted or reverted and gravitate toward articles where their edits are accepted without much fuss. Thus, most active articles will be worked on by a relatively homogeneous clique of editors. On the one hand this is useful, since the editors are happy and a great deal of work can be done when there is not constant bickering about each contribution. On the other hand it is difficult to achieve a neutral point of view when all contributors are in such happy agreement with each other.

Cliques can form due to special expertise. For example, the editors of game theory articles are likely to be among that small group of people who know something about game theory. Arguably, nothing is served in such articles by bringing in outsiders who know nothing about game theory. So certain articles are well-served by clique structure, since it keeps out the incompetent and the ill-informed. A corollary of this is that, for articles requiring special expertise to edit, a tacit selection process ensures that only the most qualified Wikipedians edit. For one can imagine that if a widely recognized game theorist were to appear and edit game theory articles, the graduate students and minor academics who had previously done the work would defer to her judgment. For these kinds of articles the model of open-access editing is not really harmful: the most qualified editors will eventually form cliques and control editing.

Other kinds of cliques are obviously harmful. The most obvious of these in Wikipedia are ethnic cliques. As mentioned in an earlier post, the hatred of Armenians for Turks has led to some obvious cases of article bias. This bias tends to persist, in part, because Armenian editors receive substantial support from other editors, to the point that editors writing from the Turkish perspective are immediately branded as trolls. The support of editors with names like "John Smith" for Armenians is no doubt due to very deep anti-Turkish prejudices, persisting from medieval times, prejudices that can be seen in many parts of Christian European culture, such as--for example--the geographical location of the land of Mordor in Tolkien's Lord of the Rings.

One lesson from this example is that cliques are most likely to maintain hegemony over articles when the worldview of the clique jibes well with the prejudices of the contemporary Anglophone world. What are those prejudices? The phrase "political correctness", though a pejorative, captures a broad swath of those prejudices, and one could well predict that cliques with politically correct perspectives are much more likely to maintain hegemony over articles than cliques without those perspectives. The article on Race and intelligence serves as a good example of how an article about politically incorrect scientific research is dominated by a politically correct clique. The point I'm trying to make is that in Wikipedia it is extremely unlikely that this kind of research would be presented except in a politically correct way.

Another group of prejudices centers on Laïcité, Secularism, and Science. Many Wikipedians are proud skeptics, and consider themselves the enemy of all superstition. But, as Richard Dawkins notes in his magnificent The God Delusion, religion is privileged in our culture, protected from any criticism or analysis. Thus skeptics tend to avoid articles on recognized religions, and instead to form cliques around articles related to the paranormal. An example would be the article on Electronic voice phenomena, where one of the most notable researchers in that field, Tom Butler, was discouraged from editing by a skeptical clique. Again, the point is that only the skeptics would have the backing from general prejudices to form cliques dominating paranormal articles.




Sunday, August 12, 2007

Edit conflict

Edit conflict is perhaps the most unpleasant feature of participating in Wikipedia. Even in the best circumstances--where those concerned try to behave civilly--a conflict can be extremely annoying, as the editors must discuss their actions on a talk page. Back and forth they go, pointing out each other's errors and explaining the fundamental correctness of their own positions. At the end, if an agreement is reached, the number of words and the amount of energy expended on the talk page is many times more than the words and energy expended on the article page. A whole day or two can be spent just to get another sentence into the article. Edit conflict is emotionally draining, it wastes time, and very few editors seem to enjoy it.

Several mechanisms have evolved to help editors avoid edit conflict. First and foremost, there is the effort in Wikipedia to set the cultural environment so that discussions are civil, and to encourage editors to assume good faith. The three-revert rule allows an editor no more than three reverts on a given article within 24 hours, and serves to push conflicts to the talk pages. These policies serve to reduce conflict between editors, mostly by forcing discussion onto talk pages, and focusing discussion on the article, rather than on the personalities of the editors. But beyond these official policies (which like almost everything in Wikipedia evolved spontaneously from the joint work of editors), several unofficial and largely unrecognized mechanisms have evolved.

One of these mechanisms is the tendency to form cliques. A cluster of related articles will typically be edited by a group of editors with similar interests, who have accepted each other as valid contributors. Even though they may not always agree, they have developed a tolerance and respect towards each other that allows editing to proceed without constant reversion. If an outsider wanders into this cluster of articles and begins to make edits that go against the prevailing norm, he will immediately find himself in conflict with, not one, but a whole clique of editors. The outsider will soon give up in disgust, and go elsewhere, where he fits in better and can himself become a part of a prevailing clique. Cliques are therefore the result of a sorting mechanism, an emergent property created by the desire of individual editors to avoid edit conflict.

While cliques solve one problem, they create another: the informational cascade. The homogeneity of clique members can create a very fragile consensus--fragile because based on a narrow set of information--which would be overturned were other viewpoints considered. In other words, clique control causes articles to be biased. A good example of clique-created bias is the article on the Armenian genocide: the article is entirely from the perspective of the Armenians, and any editor who attempts to explain the Turkish perspective will eventually give up in frustration.

A second unofficial and unrecognized mechanism to avoid edit conflict is the tendency of editors to avoid substantial edits, instead devoting themselves to reverting vandalism, fixing typos, and making minor changes for readability. Very few editors are willing to spend time on research, finding new sources, filling in the untold parts of the story. Even within a clique, big changes in an article require a lot of explaining.

A third mechanism to avoid edit conflict would be creating articles: rather than work on an existing article, create an entirely new one. New articles typically have empty talk pages for many months. So rather than work on that obviously unfinished article about a major figure, create a new article on a minor figure. New articles have the great advantage that they have yet to acquire a clique.

The desire of editors to avoid conflict with other editors can therefore explain many of the features of Wikipedia. It explains the terrific pace at which articles have been created, and it also explains why so many important articles lie about unfinished. It explains why so many good editors have given up writing articles and instead engage in more routine activities such as clearing up vandalism. And, most importantly, it explains why cliques form and take control of certain articles.

Wednesday, August 8, 2007

Who edits?

One problem with Wikipedia is that editors often know little or nothing about the articles they edit. So even if, ceteris paribus, the collective judgment is superior to an individual judgment, a collective judgment from the completely clueless will not be superior to a judgment from an expert. What one needs is a collective judgment from experts or near-experts.

This is not such a problem on articles that attract little popular attention and that require some relatively rare technical competence for an editor to make any contribution. Articles in mathematics and statistics, and in the hard sciences, generally have few editors, and these are generally quite knowledgeable. But articles on current events, on pop culture, on the softer sciences and history--these articles attract the modal Wikipedia editor, who feels free to make any "improvement" and move on.

Who is the modal Wikipedia editor? Someone who finds life on Wikipedia more fulfilling than life in the real world: someone for whom anonymity and egalitarianism offer an increase in status, someone for whom interaction with other Wikipedians offers an increase in the quality of human relationships--someone whose real life is characterized by subordination and loneliness. The modal Wikipedian is a teenager, alone in his room after school.

To most academics Wikipedia doesn't look like an attractive place to "publish." Journal referees may often make stupid comments, but they are vastly more knowledgeable than the swarm of random editors that snipe at articles in Wikipedia. Dealing with other editors feels a bit like dealing with students, but with the big difference that students are almost always respectful, while Wikipedians, even the really dumb ones (maybe especially the dumb ones), all think they are just as good as anyone else. Anonymity and egalitarianism, for academics, mean a loss in status; it is only those with no reputation and low status who will see a status improvement in Wikipedia.

Thus only those with no reputation and low status will have an incentive to participate heavily in Wikipedia. Teenagers, because of their unfortunate position in the life cycle, fall into this group. And teenagers, because of their youth, lack both the experience and the education to have near-expert knowledge in any domain (except, perhaps, popular culture).

Sunday, June 17, 2007

The evolution of Wikipedia

On my Wikipedia user page, I have a short statement of faith in the veracity and usefulness of Wikipedia:

James Surowiecki's book, The Wisdom of Crowds, begins with Francis Galton's anecdote about an ox-weighing contest at a country fair: for a half-shilling, one could purchase a ticket on which to write an estimate of the slaughtered and dressed weight of a displayed living ox. The ticket with the guess closest to the actual weight would win a prize. Galton found that the mean of all guesses was in fact more accurate than the best guess, even though the guessers included livestock experts. This is a good illustration of the fact that a collective judgment may often be more correct than the judgment of any individual expert — something which appears to be true in financial markets, for example.
Wikipedia is a mechanism for producing collective judgments about the accuracy and importance of factual statements. I think this makes Wikipedia very exciting — any statement placed in Wikipedia is immediately subject to review and revision, and if everyone is animated by the same sense of trying to achieve truth, the text can quite rapidly evolve to something accurate and balanced.
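The statistical mechanism behind that claim can be seen in a small simulation. The numbers below are invented for illustration (they are not Galton's data): each guesser estimates the true weight with independent, unbiased error, and the error of the crowd's mean comes out far smaller than the error of a typical individual.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

TRUE_WEIGHT = 1198   # pounds; the figure usually cited in retellings of Galton's anecdote
N_GUESSERS = 800     # arbitrary crowd size for the simulation

# Simplifying assumption: each guess = truth + independent Gaussian noise.
guesses = [TRUE_WEIGHT + random.gauss(0, 75) for _ in range(N_GUESSERS)]

crowd_mean = sum(guesses) / len(guesses)
typical_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"Crowd mean: {crowd_mean:.1f} lb (off by {abs(crowd_mean - TRUE_WEIGHT):.1f} lb)")
print(f"Typical individual guess: off by {typical_individual_error:.1f} lb")
```

Averaging cancels errors only when they are independent and unbiased; when editors form cliques and work from the same narrow set of information, the errors are neither, and the magic goes away.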


But is this faith well-placed? Do articles always or even usually "evolve to something accurate and balanced"? What are some of the mechanisms skewing articles toward falsehood and bias? What institutions (policies, traditions) have spontaneously emerged to mitigate these problems of falsehood and bias? I hope to address questions such as these in this blog.

Tuesday, June 12, 2007

Who cares about truth?

This may be the great age of the autodidact. For the first time in history, a person can access a significant fraction of the world's knowledge, from home, without investing heavily in books. The internet should make it possible for the intelligent person of low income to become truly knowledgeable.
I remember an English professor, who told us of his military service, and spoke warmly of a sergeant, a man who kept a shelf of Great Books next to his cot, and who spent every free moment perusing them. This kind of autodidact is like a character out of a Jack London story, a working class man with an insatiable thirst for knowledge, a type of person increasingly rare today.
Why does it seem that so few of the working class care about knowledge? One might well retort, "Why do so few of the middle class care about knowledge?" Knowledge itself has fallen out of esteem. Beauty is as fashionable as ever, but truth has become a bit risible. Why so? Why do young people no longer dream over maps? Why do students act as if motivated only by status and income, and not at all by insight and understanding?
Perhaps I like Wikipedia because I detect in so many of the editors a love of knowledge that is not often found elsewhere.

Wednesday, May 23, 2007

Getting Started

Don't expect anything worth reading for a few weeks. Until I get the hang of it, this will be one of the many blogs with one writer and one reader.