Analyzing the Wikisphere: Tools and Methods for Wiki Research
Author(s): Jeffrey Charles Stuckman
Keyword(s): crawler, distribution, Gini, measurement, MediaWiki, wiki
We present tools and techniques that facilitate wiki research, along with an analysis of wikis found on the Internet. We developed WikiCrawler, a tool that downloads and analyzes wikis, and used it to build a corpus of 151 MediaWiki wikis. We also developed a wiki analysis toolkit in R which, among other tasks, fits probability distributions to discrete data and tests the fit with a Monte Carlo method. From the corpus we determined that, like Wikipedia, most wikis were authored collaboratively, but users contributed at unequal rates. We proposed a distribution-based method for measuring inequality on wikis and compared it to the Gini coefficient. We also analyzed the distributions of edits across pages and users, producing data that can motivate or verify future mathematical models of behavior on wikis. Future research could analyze user behavior and establish measurement baselines to facilitate evaluation, or generalize Wikipedia research by testing hypotheses across many wikis.
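The two measurement ideas named above can be sketched concretely. The following is a minimal illustration, not the thesis's actual R toolkit: it computes the Gini coefficient of per-user edit counts, and runs a parametric-bootstrap Monte Carlo goodness-of-fit test. The choice of a geometric distribution, the KS-style statistic, and the sample data are all assumptions made for the sketch.

```python
import math
import random

def gini(values):
    """Gini coefficient of nonnegative counts (0 = perfectly equal, 1 = maximal inequality)."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    # Standard rank-weighted formula: G = 2*sum(i*x_i)/(n*total) - (n+1)/n, ranks i = 1..n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

def fit_geometric(data):
    """MLE for a geometric distribution on {1, 2, ...}: p = n / sum(x)."""
    return len(data) / sum(data)

def ks_stat(data, p):
    """KS-style distance between the empirical CDF and the fitted geometric CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(abs((i + 1) / n - (1 - (1 - p) ** x)) for i, x in enumerate(xs))

def sample_geometric(p, n, rng):
    """Draw n geometric variates on {1, 2, ...} by inverse transform (assumes 0 < p < 1)."""
    q = math.log(1.0 - p)
    return [int(math.log(1.0 - rng.random()) / q) + 1 for _ in range(n)]

def mc_pvalue(data, n_sim=500, seed=42):
    """Parametric bootstrap: refit on each simulated sample and compare test statistics."""
    rng = random.Random(seed)
    p = fit_geometric(data)
    observed = ks_stat(data, p)
    exceed = 0
    for _ in range(n_sim):
        sim = sample_geometric(p, len(data), rng)
        if ks_stat(sim, fit_geometric(sim)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_sim + 1)  # +1 correction avoids a p-value of exactly 0

# Hypothetical edit counts per user on a small wiki: a few heavy editors dominate.
edits = [1, 1, 2, 3, 5, 40, 200]
print(f"Gini: {gini(edits):.3f}")
print(f"Monte Carlo p-value for geometric fit: {mc_pvalue(edits):.3f}")
```

A high Gini here reflects the unequal contribution rates observed in the corpus; a small Monte Carlo p-value would indicate that the fitted distribution does not describe the data well.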
This publication is probably cited by other works, but no articles citing it are available in WikiPapers.