SmartWiki: A reliable and conflict-refrained Wiki model based on reader differentiation and social context analysis
Abstract Wiki systems, such as Wikipedia, provide a multitude of opportunities for large-scale online knowledge collaboration. Despite Wikipedia's success with the open editing model, dissenting voices give rise to unreliable content due to conflicts among contributors. Controversial articles that are frequently modified by dissenting editors rarely present reliable knowledge, and overheated controversial articles may be locked by Wikipedia administrators, who might leave their own bias in the topic. This could undermine both the neutrality and freedom policies of Wikipedia. Following Richard Rorty's suggestion to "Take Care of Freedom and Truth Will Take Care of Itself" [1], we present a new open Wiki model in this paper, called TrustWiki, which brings readers closer to reliable information while allowing editors to contribute freely. From our perspective, the conflict issue results from presenting the same knowledge to all readers, without regard for differences among readers and without revealing the underlying social context, which both causes contributor bias and affects readers' perception of knowledge. TrustWiki differentiates two types of readers: "value adherents", who prefer compatible viewpoints, and "truth diggers", who crave the truth. It provides a different knowledge representation model for each type of reader. Social context, including social background and relationship information, is embedded in both knowledge representations to present readers with personalized and credible knowledge. To our knowledge, this is the first paper on knowledge representation that combines psychological acceptance and truth revelation to meet the needs of different readers. Although this new Wiki model focuses on reducing conflicts and reinforcing Wikipedia's neutrality policy, it also casts light on other content reliability problems in Wiki systems, such as vandalism and minority opinion suppression.
© 2013 Elsevier B.V. All rights reserved.
Bibtex type: article
DOI: 10.1016/j.knosys.2013.03.014
Authors: Haifeng Zhao, Kallander W., Johnson H., Wu S.F.
Extra keywords: Community discoveries, Confirmation bias, Natural language generation, On-line social networks, Trust, Wikipedia, Knowledge representation, Online systems, Social networking (online)
Keywords: Community discovery, Confirmation bias, Knowledge representation, Natural language generation, Online social network, Trust, Wikipedia
ISSN: 0950-7051
Language: English
Number of citations by publication: 0
Number of references by publication: 0
Pages: 53–64
Published in: Knowledge-Based Systems
Title: SmartWiki: A reliable and conflict-refrained Wiki model based on reader differentiation and social context analysis
Type: journal article
Volume: 47
Year: 2013
Creation date: 8 November 2014 06:48:06
Categories: Publications without license parameter, Publications without remote mirror parameter, Publications without archive mirror parameter, Publications without paywall mirror parameter, Journal articles, Publications without references parameter, Publications
Modification date: 8 November 2014 06:48:06
Date: 2013