Journal of Information Processing

From WikiPapers


Only those publications related to wikis already available at WikiPapers are shown here.
Each entry lists: Title · Author(s) · Keyword(s) · Language · Date · Abstract · R · C
Title: A Malicious Bot Capturing System using a Beneficial Bot and Wiki
Author(s): Takashi Yamanoue, Kentaro Oda, Koichi Shimozono
Keyword(s): Information security, Network analysis
Language: English
Date: February 2013
Abstract: Locating malicious bots in a large network is problematic because the internal firewalls and network address translation (NAT) routers of the network unintentionally help to hide the bots' host addresses and malicious packets. However, eliminating firewalls and NAT routers merely to locate bots is generally not acceptable. In this paper, we propose an easy-to-deploy, easy-to-manage network security control system for locating a malicious host behind internal secure gateways. The proposed system consists of a remote security device and a command server. The remote security device is installed as a transparent link (implemented as an L2 switch) between a subnet and its gateway, in order to detect a host in the target subnet that has been compromised by a malicious bot while minimizing the impact of deployment. The security device is controlled remotely by polling the command server, which eliminates the NAT traversal problem and is firewall friendly. Since the remote security device acts as a transparent, remotely controlled, robust security gateway, we regard it as a beneficial bot. We adopt a web server with wiki software as the command server in order to take advantage of its customizability, ease of use, and ease of deployment.
R: 5  C: 2
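The polling design described in this abstract can be sketched briefly: because the security device only ever makes outbound HTTP requests to the wiki-based command server, it works through NAT routers and firewalls without any inbound ports. The sketch below is a minimal illustration of that idea; the URL, the `<command>` markers, and the function names are hypothetical assumptions, not the authors' actual protocol.

```python
import re
import urllib.request

# Hypothetical command-page URL on the wiki-based command server.
COMMAND_PAGE_URL = "http://example.org/wiki/BotCommands"

def parse_commands(page_text: str) -> list:
    """Extract commands embedded in the wiki page between <command> markers.
    (The marker format is an illustrative assumption.)"""
    return re.findall(r"<command>(.*?)</command>", page_text, re.DOTALL)

def poll_once(url: str = COMMAND_PAGE_URL) -> list:
    """One polling cycle: an outbound HTTP GET, so it traverses NAT routers
    and internal firewalls without requiring any inbound connection."""
    with urllib.request.urlopen(url) as resp:
        return parse_commands(resp.read().decode("utf-8"))

# Offline demonstration of the parsing step:
page = "Status <command>capture eth0</command> text <command>report</command>"
print(parse_commands(page))  # → ['capture eth0', 'report']
```

In a real deployment the device would call `poll_once` on a timer and execute each command locally, posting results back to another wiki page.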
Title: Effects of implicit positive ratings for quality assessment of Wikipedia articles
Author(s): Yu Suzuki
Keyword(s): Edit history
Language: English
Date: 2013
Abstract: In this paper, we propose a method to identify high-quality Wikipedia articles by using implicit positive ratings. One of the major approaches to assessing Wikipedia articles is based on the text survival ratio: when a text survives multiple edits, it is assessed as high quality. The problem, however, is that many low-quality articles are misjudged as high quality, because editors do not always read the whole article. If a low-quality text sits at the bottom of a long article and is never seen by other editors, it survives many edits and is therefore assessed as high quality. To solve this problem, we use a section or a paragraph, rather than the whole page, as the unit of assessment. In our method, when an editor edits an article, the system considers that the editor gives positive ratings to the sections or paragraphs that the editor edits. Experimental evaluation confirmed that the proposed method improves the accuracy of quality values for articles.
R: 0  C: 0
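The per-section rating idea in this abstract can be illustrated with a small sketch: comparing two revisions section by section, only the sections the editor actually changed (and therefore presumably read) receive an implicit positive rating, so untouched text at the bottom of a long article accumulates no false credit. The data structures and function name below are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

def update_ratings(ratings, old_sections, new_sections):
    """One edit event: give an implicit positive rating to each section the
    editor modified. Sections are keyed by heading; values are section text."""
    for name, text in new_sections.items():
        if old_sections.get(name) != text:
            # The editor changed this section, so they presumably read it.
            ratings[name] += 1
    return ratings

# Demonstration with two hypothetical revisions of an article:
ratings = defaultdict(int)
v1 = {"Intro": "A wiki is a site.", "History": "Founded in 1995."}
v2 = {"Intro": "A wiki is an editable site.", "History": "Founded in 1995."}
update_ratings(ratings, v1, v2)
print(dict(ratings))  # → {'Intro': 1}
```

A whole-page variant would have credited "History" as well, even though the editor may never have looked at it; the per-section unit is what avoids that misjudgment.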
Title: Classification of Recommender Expertise in the Wikipedia Recommender System
Author(s): Christian D. Jensen, Povilas Pilkauskas, Thomas Lefévre
Language: English
Date: 2011
R: 0  C: 0