Property:Description
From WikiPapers
This is a property of type Text.
Pages using the property "Description"
Showing 25 pages using this property.
A
AVBOT: AVBOT is an [[anti-vandalism]] [[bot]] on the Spanish Wikipedia. It uses [[regular expressions]] and scores to detect vandalism.
Alternative MediaWiki parsers: Alternative parsers is a compilation of various alternative MediaWiki parsers that can, or are intended to, translate MediaWiki's text markup syntax into something else.
AssessMediaWiki: AssessMediaWiki is an open-source web application that, connected to a MediaWiki installation, supports hetero-, self-, and peer-assessment procedures while keeping track of the collected assessment data, so that supervisors can obtain reports to help assess students.
Authorship Tracking: This code implements the algorithms for tracking the authorship of text in revisioned content published in WWW 2013 (http://www2013.wwwconference.org/proceedings/p343.pdf). The idea is to attribute each portion of text to the earliest revision where it appeared. For instance, if a revision contains the sentence "the cat ate the mouse", and the sentence is deleted and then reintroduced in a later revision (not necessarily as part of a revert), once reintroduced it is still attributed to its earliest author. Precisely, the algorithm takes a parameter N: if a sequence of tokens of length equal to or greater than N has appeared before, it is attributed to its earliest occurrence. See the paper for details. The code works by building a trie-based representation of the whole revision history in an object of the class AuthorshipAttribution. Each time a new revision is passed to the object, the object updates its internal state and computes the earliest attribution of the new revision, which can then be easily obtained. The object itself can be serialized (and deserialized) using JSON-based methods. To keep the representation of the past history from growing too large, the object discards information about content that has been absent from revisions (a) for at least 90 days and (b) for at least 100 revisions; these are configurable parameters. With these choices, for Wikipedia, the serialized object typically has a size between 10 and 20 times that of a typical revision, even for pages with very long revision lists. See the paper for detailed experimental results.
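To make the attribution rule concrete, the following is a simplified, brute-force Python sketch of the idea: each run of at least N tokens that also occurs in an earlier revision inherits the (earliest) labels of that earlier occurrence, while unmatched tokens are attributed to the current revision. This is only an illustration of the description above, not the published trie-based AuthorshipAttribution code; the function name and data layout are hypothetical.

```python
# Simplified illustration of earliest-origin authorship attribution.
# The published implementation uses a trie over the full revision history
# and prunes stale content; this brute-force version is for clarity only.

def attribute(revisions, N=3):
    """Label every token of every revision with the index of the
    earliest revision it is attributed to."""
    history = []        # (tokens, labels) for all past revisions
    all_labels = []
    for rev_idx, tokens in enumerate(revisions):
        labels = [rev_idx] * len(tokens)       # default: new content
        i = 0
        while i < len(tokens):
            best = None                        # longest match of length >= N
            for past_tokens, past_labels in history:
                for j in range(len(past_tokens)):
                    k = 0
                    while (i + k < len(tokens) and j + k < len(past_tokens)
                           and tokens[i + k] == past_tokens[j + k]):
                        k += 1
                    if k >= N and (best is None or k > best[0]):
                        best = (k, past_labels[j:j + k])
            if best:
                length, inherited = best
                labels[i:i + length] = inherited   # keep earliest attribution
                i += length
            else:
                i += 1
        history.append((tokens, labels))
        all_labels.append(labels)
    return all_labels

revs = [
    "the cat ate the mouse".split(),
    "a new sentence appears".split(),
    "a new sentence appears and the cat ate the mouse".split(),
]
print(attribute(revs, N=3)[-1])
# -> [1, 1, 1, 1, 2, 0, 0, 0, 0, 0]: the sentence deleted in revision 1
#    and reintroduced in revision 2 keeps its attribution to revision 0.
```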
C
Catdown: Catdown is a tool to download images in Wikimedia Commons categories.
ClueBot: ClueBot is an [[anti-vandalism]] [[bot]] on the English Wikipedia.
CoCoBi: CoCoBi is a Corpus of Comparable Biographies in [[German]] and contains 400 annotated biographies of 141 famous people. Automatic annotation was done in the same way and with the same tools as in [[WikiBiography]]. The biographies come from different sources, mainly Wikipedia and the Brockhaus Lexikon.
Commons explorer: Commons explorer is a map tool for exploring [[Wikimedia Commons]] multimedia files by location and year.
Contropedia: Analysis and visualization of controversies within Wikipedia articles. More info, publications, and a demo are available at http://contropedia.net/
Coordinates in Wikipedia articles: Coordinates in Wikipedia articles is a compilation of all the [[coordinates]] added to Wikipedia, language by language.
D
DBpedia: DBpedia is a community effort to extract structured information from Wikipedia and to make this information available on the web. DBpedia allows you to ask sophisticated queries against Wikipedia, and to link other data sets on the web to Wikipedia data.
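As an illustration of the kind of query this enables, here is a minimal Python sketch against DBpedia's public SPARQL endpoint (https://dbpedia.org/sparql). The endpoint's availability and the particular terms used (dbo:birthPlace, dbr:Berlin) are assumptions made for the example, not something stated in the description above.

```python
# Minimal sketch: ask DBpedia for people born in Berlin via SPARQL.
import requests

QUERY = """
PREFIX dbo:  <http://dbpedia.org/ontology/>
PREFIX dbr:  <http://dbpedia.org/resource/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?person ?name WHERE {
  ?person dbo:birthPlace dbr:Berlin ;
          rdfs:label ?name .
  FILTER (lang(?name) = "en")
}
LIMIT 10
"""

resp = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["name"]["value"], "->", row["person"]["value"])
```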
Deletionpedia: Deletionpedia is an [[archive]] of about 62,471 pages that have been deleted from the [[English Wikipedia]].
DiffDB: DiffDB consists of DiffIndexer and DiffSearcher.
Domas visits logs: Domas visits logs are page view statistics for Wikimedia projects.
Dump-downloader: dump-downloader is a script to request and download the full-history dump of all the pages in a MediaWiki. It is meant to work with Wikia's wikis but could work with other wikis. Source code: https://github.com/Grasia/wiki-scripts/tree/master/wikia_dump_downloader
E
EPIC/Oxford Wikipedia quality assessment: This dataset comprises the full, anonymized set of responses from the blind assessment of a sample of Wikipedia articles across languages and disciplines by academic experts. The study was conducted in 2012 by EPIC and the University of Oxford and sponsored by the Wikimedia Foundation.
H
HistoryFlow: HistoryFlow is a tool for visualizing dynamic, evolving documents and the interactions of multiple collaborating authors. In its current implementation, HistoryFlow is used to visualize the evolutionary history of wiki pages on Wikipedia.
I
Ikiwiki: Ikiwiki supports storing a wiki as a git repository.
Images for biographies: Images for biographies is a tool that suggests images for [[biographies]] in several Wikipedias.
Infobox2rdf: infobox2rdf generates huge RDF datasets from the infobox data in Wikipedia dump files.
J
Java Wikipedia Library: Java Wikipedia Library is an application programming interface that allows access to all the information in Wikipedia.
L
Listen to Wikipedia: Listen to Wikipedia is a visual and audio illustration of live editing activity on Wikipedia.
M
Manypedia.com: Manypedia.com is a web tool in which you can compare the [[Linguistic Points Of View]] (LPOV) of different language Wikipedias. For example, are the communities of editors of the English, Arabic, and Hebrew Wikipedias crystallizing different histories of the Gaza War?
MediaWiki: MediaWiki is a famous [[wiki engine]]. It is used by [[Wikipedia]].
MediaWiki API: MediaWiki API provides direct, high-level access to the data contained in [[MediaWiki]] databases.
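For illustration, here is a minimal Python sketch that calls the standard api.php entry point of the English Wikipedia to run a search query. The endpoint URL and parameters follow the documented action API, but the specific query shown is just an example, not something taken from the description above.

```python
# Minimal sketch: full-text search via the MediaWiki action API.
import requests

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "search",
        "srsearch": "anti-vandalism bot",
        "srlimit": 5,
        "format": "json",
    },
    timeout=30,
)
resp.raise_for_status()
for hit in resp.json()["query"]["search"]:
    print(hit["title"])
```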