A framework for benchmarking entity-annotation systems
Abstract In this paper we design and implement a benchmarking framework for fair and exhaustive comparison of entity-annotation systems. The framework is based upon the definition of a set of problems related to the entity-annotation task, a set of measures to evaluate systems' performance, and a systematic comparative evaluation involving all publicly available datasets, containing texts of various types such as news, tweets and Web pages. Our framework is easily extensible with novel entity annotators, datasets and evaluation measures for comparing systems, and it has been released to the public as open source. We use this framework to perform the first extensive comparison among all available entity annotators over all available datasets, and draw many interesting conclusions upon their efficiency and effectiveness. We also draw conclusions about academic versus commercial annotators. Copyright is held by the International World Wide Web Conference Committee (IW3C2).
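The abstract mentions "a set of measures to evaluate systems' performance" without spelling them out here. As an illustration only, the sketch below shows one common way to score entity annotations against a gold standard (exact match on mention offsets and linked entity); the annotation representation and matching rules are assumptions for this example, not taken from the paper.

```python
# Illustrative sketch (not the paper's code): precision/recall/F1 under exact
# annotation match, the kind of evaluation measure such a framework compares.
# An annotation is modelled here as (start_offset, length, entity_title).

def prf1(gold, predicted):
    """Return (precision, recall, F1) for two sets of annotations."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)                       # found and correct
    precision = tp / len(predicted) if predicted else 1.0
    recall = tp / len(gold) if gold else 1.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: the system links "Obama" correctly but misses "White House".
gold = {(0, 5, "Barack_Obama"), (16, 11, "White_House")}
pred = {(0, 5, "Barack_Obama")}
print(prf1(gold, pred))   # (1.0, 0.5, 0.666...)
```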
Bibtextype: inproceedings
Has author: Cornolti M., Paolo Ferragina, Massimiliano Ciaramita
Has extra keyword: Comparative evaluations, Design and implements, Entity annotation, Evaluation measures, Systems performance, Wikipedia, Benchmarking, World Wide Web, Data processing
Has keyword: Benchmark framework, Entity annotation, Wikipedia
Isbn: 9781450320351
Language: English
Number of citations by publication: 0
Number of references by publication: 0
Pages: 249–259
Published in: WWW 2013 - Proceedings of the 22nd International Conference on World Wide Web
Title: A framework for benchmarking entity-annotation systems
Type: conference paper
Year: 2013
Creation date: 6 November 2014 15:07:33
Categories: Publications without license parameter, Publications without DOI parameter, Publications without remote mirror parameter, Publications without archive mirror parameter, Publications without paywall mirror parameter, Conference papers, Publications without references parameter, Publications
Modification date: 6 November 2014 15:07:33
Date: 2013
Properties that link here
A framework for benchmarking entity-annotation systems: Title