Ian Pye

From WikiPapers

Ian Pye is an author.

Tools

Tool: WikiTrust
Description: WikiTrust is an open-source, on-line reputation system for Wikipedia authors and content.


Publications

Only those publications related to wikis are shown here.
Title: Reputation systems for open collaboration
Published in: Communications of the ACM
Language: English
Date: 2011
Abstract: Algorithmic-based user incentives ensure the trustworthiness of evaluations of Wikipedia entries and Google Maps business information.
R: 0  C: 0

Title: Detecting Wikipedia Vandalism using WikiTrust - Lab Report for PAN at CLEF 2010
Language: English
Date: 2010
R: 0  C: 1

Title: Assigning Trust to Wikipedia Content
Published in: WikiSym
Language: English
Date: 2008
Abstract: The Wikipedia is a collaborative encyclopedia: anyone can contribute to its articles simply by clicking on an "edit" button. The open nature of the Wikipedia has been key to its success, but has also created a challenge: how can readers develop an informed opinion on its reliability? We propose a system that computes quantitative values of trust for the text in Wikipedia articles; these trust values provide an indication of text reliability. The system uses as input the revision history of each article, as well as information about the reputation of the contributing authors, as provided by a reputation system. The trust of a word in an article is computed on the basis of the reputation of the original author of the word, as well as the reputation of all authors who edited text near the word. The algorithm computes word trust values that vary smoothly across the text; the trust values can be visualized using varying text-background colors. The algorithm ensures that all changes to an article's text are reflected in the trust values, preventing surreptitious content changes. We have implemented the proposed system, and we have used it to compute and display the trust of the text of thousands of articles of the English Wikipedia. To validate our trust-computation algorithms, we show that text labeled as low-trust has a significantly higher probability of being edited in the future than text labeled as high-trust.
R: 0  C: 7

Title: Measuring Author Contributions to the Wikipedia
Published in: WikiSym
Language: English
Date: 2008
R: 0  C: 5

Title: Robust content-driven reputation
Keyword(s): Reputation, User generated content, Wikipedia
Published in: Proceedings of the ACM Conference on Computer and Communications Security
Language: English
Date: 2008
Abstract: In content-driven reputation systems for collaborative content, users gain or lose reputation according to how their contributions fare: authors of long-lived contributions gain reputation, while authors of reverted contributions lose reputation. Existing content-driven systems are prone to Sybil attacks, in which multiple identities, controlled by the same person, perform coordinated actions to increase their reputation. We show that content-driven reputation systems can be made resistant to such attacks by taking advantage of the fact that the reputation increments and decrements depend on content modifications, which are visible to all. We present an algorithm for content-driven reputation that prevents a set of identities from increasing their maximum reputation without doing any useful work. Here, work is considered useful if it causes content to evolve in a direction that is consistent with the actions of high-reputation users. We argue that content modifications that require no effort, such as the insertion or deletion of arbitrary text, are invariably non-useful. We prove a truthfulness result for the resulting system, stating that users who wish to perform a contribution do not gain by employing complex contribution schemes, compared to simply performing the contribution at once. In particular, neither splitting the contribution into multiple portions nor employing the coordinated actions of multiple identities yields additional reputation. Taken together, these results indicate that content-driven systems can be made robust with respect to Sybil attacks. Copyright 2008 ACM.
R: 0  C: 0
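The "Assigning Trust to Wikipedia Content" abstract describes the core idea: a word's trust starts from its original author's reputation and rises when higher-reputation editors leave it intact. The following is a minimal illustrative sketch of that idea only; the function names, the inheritance factor, and the update rule are assumptions for illustration, not the published WikiTrust algorithm.

```python
# Hypothetical sketch of word-trust computation in the spirit of WikiTrust.
# Assumption: trust moves a fixed fraction of the way toward a reviewing
# editor's reputation when that editor leaves the word unchanged.

def initial_trust(author_reputation, scale=1.0):
    """New text starts with trust proportional to its author's reputation."""
    return scale * author_reputation

def revised_trust(current_trust, editor_reputation, inheritance=0.2):
    """If an editor with higher reputation leaves a word intact, the word's
    trust inherits a fraction of the gap to that editor's reputation;
    otherwise trust is unchanged."""
    if editor_reputation > current_trust:
        return current_trust + inheritance * (editor_reputation - current_trust)
    return current_trust

# Example: a word written by a low-reputation author (reputation 1.0)
# survives edits by editors with reputations 5.0 and 9.0, so its trust
# rises smoothly but stays below the editors' reputations.
t = initial_trust(1.0)
for rep in (5.0, 9.0):
    t = revised_trust(t, rep)
```

Capping the inherited fraction below 1.0 mirrors the paper's observation that trust should change smoothly and that no single edit can silently confer full trust on new text.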