Karma system

Revision as of 12:26, 29 July 2010


A Karma system, also known as a web reputation system, is, at its most general, an automated or semi-automated method of characterizing online user behavior in a way useful to a particular venue. The term "reputation system" is broader than the World Wide Web context alone: it draws from the distributed trust model used by Pretty Good Privacy to assess the reliability of cryptographic keys and, indirectly, the reputation of their users. Closely related mechanisms may be used to detect spammers in electronic mail systems.

"There are three types of lies - lies, damn lies, and facts found on the Web." — Dr. Tim Finin, paraphrasing the well-known quotation about statistics attributed to Benjamin Disraeli[1]

Farmer and Glass have described some general terms for characterizing karma systems. First, a system may be public, with its findings immediately visible to all or most users, or private, available only to administrators who control others' access. In all cases, the karma score is context-specific and should not be generalized beyond what it actually measures, either within the site or to other venues. They mention, as one example of overuse of a reputation, the FICO score for creditworthiness widely used in the United States, which has been controversially applied to such things as risk in granting insurance. More relevant to web use are attempts to apply eBay seller scores to other contexts, but

The eBay Feedback score reflects only the transaction worthiness of a specific account, and it does so only for particular products bought or sold on eBay. The user behind that identity may in fact steal candy from babies, cheat at online poker, and fail to pay his credit card bills.[2]

Some of the first applications appeared on the early online discussion forum The Well, as a computer-assisted means of measuring the quality of posts. In each rating interval, a random sample of users was selected to make basic quality assessments of individual posts. Over time, the poster's reputation in the community was established.
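The interval-based sampling described above can be sketched as follows. The panel size, the numeric score scale, and the blending weight used to fold interval scores into a running reputation are illustrative assumptions, not The Well's actual parameters:

```python
import random
from collections import defaultdict

def run_rating_interval(users, posts, rating_fn, panel_size=5):
    """Pick a random panel of raters and average their scores per post.

    `rating_fn(rater, post)` returns that rater's quality score for a post;
    both the function and the panel size are hypothetical.
    """
    panel = random.sample(users, min(panel_size, len(users)))
    scores = defaultdict(list)
    for post in posts:
        for rater in panel:
            scores[post["id"]].append(rating_fn(rater, post))
    return {pid: sum(vals) / len(vals) for pid, vals in scores.items()}

def update_reputation(reputation, posts, interval_scores, weight=0.2):
    """Fold each post's mean score into its author's running reputation
    with an exponential moving average (the weight is an assumption)."""
    for post in posts:
        author = post["author"]
        reputation[author] = ((1 - weight) * reputation.get(author, 0.0)
                              + weight * interval_scores[post["id"]])
    return reputation
```

Because only a random subset of users rates in any interval, no single rater controls a poster's reputation, and the moving average smooths out individual intervals.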

Discussion forums

As with the Well, formal ratings of posts by users, statistically controlled to avoid bias, are one approach. While the raters were public, their ratings were not. Regression-based statistical approaches have also been used to assess the community's appreciation of posts.[3]

With growing reputation, a user typically gains either quantitatively more of an existing privilege or qualitatively new privileges; drops in reputation can reduce privilege. One common problem is that a minority of new users may flood a site with posts that are spam, or that are simply written without a good understanding of the local culture. Typical karma systems therefore place a dynamic quota on the volume or number of posts from new users, relaxing and eventually removing the limits as reputation builds. In privilege-limiting systems, it should be easy for users to request waivers from administrators, or, as here, Constables, who can apply human judgment.
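A dynamic quota of this kind can be sketched in a few lines; the base allowance, the linear growth rule, and the trust threshold are hypothetical values chosen only to illustrate the mechanism:

```python
def daily_post_quota(reputation, base=2, per_point=1, trusted=10):
    """Hypothetical policy: new users start at `base` posts per day,
    earn `per_point` more per reputation point, and have the limit
    removed entirely (None) once reputation reaches `trusted`."""
    if reputation >= trusted:
        return None  # limit removed for established users
    return base + per_point * int(reputation)

def may_post(posts_today, reputation):
    """True if the user is still under quota. An administrator's waiver
    could be modeled as raising `reputation` past `trusted`."""
    quota = daily_post_quota(reputation)
    return quota is None or posts_today < quota
```

The key property is that the limit is a function of reputation rather than a fixed setting, so new users are throttled automatically and established users never hit it.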

Slashdot uses a karma system, which

... primarily represents how your comments have been moderated in the past. Karma is structured on the following scale: "Terrible, Bad, Neutral, Positive, Good, and Excellent." If a comment you post is moderated up, your karma will rise. Consequently, if you post a comment that has been moderated down, your karma will fall. In addition to moderation, other things factor into karma as well. You can get some karma by submitting a story that we decide to post. Also, metamoderation can cause your karma to change. This encourages good moderators, and ideally removes moderator access from bad ones.[4]

Metamoderation is a second level of moderation that judges whether past moderation decisions were fair: any logged-in user may be asked to rate the fairness of ten random ratings.[5]

Automatic karma

Some software systems, such as the MediaWiki software that powers Citizendium and Wikipedia, have an automatic system for gaining reputation, under which users are automatically granted new privileges as they reach certain milestones. The MediaWiki software defines three such milestones, based on having a confirmed email address, on the number of edits made, and on the length of time since registration. The exact numbers are set by an administrator; a typical configuration might allow users to move pages once they have confirmed their email address, and then allow them to delete pages once they have also made 90 edits and been registered for 60 days. If an administrator wishes, they may promote a user early, bypassing these thresholds.
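The milestone check described above might look like the following sketch, using the example thresholds from the text (confirmed email for moving pages; 90 edits and 60 days of registration, in addition, for deleting). The field names and group names are invented for illustration and are not MediaWiki's actual configuration keys:

```python
from datetime import datetime, timedelta

def autopromote(user, now, edit_threshold=90, min_age=timedelta(days=60)):
    """Return the privilege groups a user currently qualifies for,
    per the example milestones above (names and thresholds illustrative)."""
    groups = set()
    if user["email_confirmed"]:
        groups.add("move_pages")
        # Deletion additionally requires the edit-count and account-age milestones.
        if (user["edit_count"] >= edit_threshold
                and now - user["registered"] >= min_age):
            groups.add("delete_pages")
    return groups
```

An administrator promoting a user early would simply assign the group directly, bypassing this check.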

Such automatic systems can only work with careful control and strict moderation, as they rely on problem users being removed before they reach the preset milestones. Should problems go unnoticed, the user will gain the new powers regardless and be able to cause more trouble than before.

References

  1. Models of Trust for the Web (MTW'06), 15th International World Wide Web Conference (WWW2006), May 22-26, 2006
  2. Randy Farmer and Bryce Glass, On Karma: Top-line Lessons on User Reputation Design, Building Web Reputation Systems: The Blog
  3. Chiao-Fang Hsu, Elham Khabiri, and James Caverlee, Ranking Comments on the Social Web, Department of Computer Science and Engineering, Texas A&M University
  4. Comments and Moderation, Slashdot
  5. Meta-moderation, Slashdot