From open-source software to Wikipedia: 'Backgrounding' trust by collective monitoring and reputation tracking
|Author(s)||de Laat P.B.|
|Published in||Ethics and Information Technology|
|Keyword(s)||Bots, Computational methods, Open-source software, Reputation, Trust, Vandalism, Wikipedia|
From open-source software to Wikipedia: 'Backgrounding' trust by collective monitoring and reputation tracking is a 2014 journal article written in English by de Laat P.B. and published in Ethics and Information Technology.
Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that communities of open-source software continue to rely mainly on hierarchy (reserving write-access for higher echelons), which substitutes (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. Trust (i.e. full write-access) is 'backgrounded' by means of a permanent mobilization of Wikipedians to monitor incoming edits. Computational approaches have been developed for the purpose, yielding both sophisticated monitoring tools that are used by human patrollers, and bots that operate autonomously. Measures of reputation are also under investigation within Wikipedia; their incorporation in monitoring efforts, as an indicator of the trustworthiness of editors, is envisaged. These collective monitoring efforts are interpreted as focusing on avoiding possible damage being inflicted on Wikipedian spaces, thereby keeping the discretionary powers of editing intact for all users. Further, the essential differences between backgrounding and substituting trust are elaborated. Finally it is argued that the Wikipedian monitoring of new edits, especially by its heavy reliance on computational tools, raises a number of moral questions that need to be answered urgently.
This publication is probably cited by other works, but no citing articles are currently available in WikiPapers.