Browse wiki

When the levee breaks: Without bots, what happens to Wikipedia's quality control processes?
Abstract In the first half of 2011, ClueBot NG - one of the most prolific counter-vandalism bots in the English-language Wikipedia - went down for four distinct periods, each period of downtime lasting from days to weeks. In this paper, we use these periods of breakdown as naturalistic experiments to study Wikipedia's heterogeneous quality control network, which we analyze as a multi-tiered system in which distinct classes of reviewers use various reviewing technologies to patrol for different kinds of damage at staggered time periods. Our analysis showed that the overall time to revert edits almost doubled when this software agent was down. Yet while a significantly smaller proportion of edits made during the bot's downtime were reverted, we found that those edits were eventually reverted later. This suggests that other agents in Wikipedia took over this quality control work, but performed it at a far slower rate. Categories and Subject Descriptors H.5.3 [Information Systems]: Group and Organization Interfaces - computer-supported collaborative work. Copyright 2010 ACM.
Bibtextype inproceedings
Doi 10.1145/2491055.2491061
Has author Geiger R.S., Aaron Halfaker
Has extra keyword Robot, Information quality, Peer production, Sociotechnical systems, Wikipedia, Automation, Maintenance, Quality assurance, Quality control, Software agents, Websites
Has keyword Automation, Robot, Information quality, Peer production, Socio-technical systems, Software agents, Wikipedia
Isbn 9781450318525
Language English
Number of citations by publication 0
Number of references by publication 0
Published in Proceedings of the 9th International Symposium on Open Collaboration, WikiSym + OpenSym 2013
Title When the levee breaks: Without bots, what happens to Wikipedia's quality control processes?
Type conference paper
Year 2013
Creation date 8 November 2014 03:24:59
Categories Publications without license parameter, Publications without remote mirror parameter, Publications without archive mirror parameter, Publications without paywall mirror parameter, Conference papers, Publications without references parameter, Publications
Modification date 8 November 2014 03:24:59
Date 2013