A hybrid method based on WordNet and Wikipedia for computing semantic relatedness between texts
Abstract In this article we present a new method for computing semantic relatedness between texts. For this purpose we use a two-phase approach. The first phase involves modeling document sentences as a matrix to compute semantic relatedness between sentences. In the second phase, we compare text relatedness using the relatedness of their sentences. Since semantic relations between words must be looked up in a lexical semantic knowledge source, selecting a suitable source is very important: a correct selection produces more accurate results. In this work, we attempt to capture the semantic relatedness between texts with higher accuracy. For this purpose, we use a combination of two well-known knowledge bases, namely WordNet and Wikipedia, which provides a more complete data source for calculating semantic relatedness. We evaluate our approach by comparison with other existing techniques (on the Lee dataset).
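The two-phase scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `word_relatedness` is a hypothetical stand-in for a WordNet/Wikipedia-backed measure, and the aggregation (best match per word, best match per sentence) is one plausible reading of the matrix-based comparison.

```python
# Sketch of the two-phase text-relatedness approach from the abstract.
# word_relatedness is a toy placeholder; the paper draws word relatedness
# from WordNet and Wikipedia instead.

def word_relatedness(w1: str, w2: str) -> float:
    # Hypothetical stand-in: exact string match only.
    return 1.0 if w1 == w2 else 0.0

def sentence_relatedness(s1: str, s2: str) -> float:
    # Phase 1: for each word in s1, take its best match in s2, then average.
    words1, words2 = s1.lower().split(), s2.lower().split()
    scores = [max(word_relatedness(a, b) for b in words2) for a in words1]
    return sum(scores) / len(scores)

def text_relatedness(t1: str, t2: str) -> float:
    # Phase 2: build a sentence-by-sentence relatedness matrix and
    # average the best score each sentence of t1 achieves in t2.
    sents1 = [s for s in t1.split('.') if s.strip()]
    sents2 = [s for s in t2.split('.') if s.strip()]
    matrix = [[sentence_relatedness(a, b) for b in sents2] for a in sents1]
    return sum(max(row) for row in matrix) / len(matrix)
```

With the placeholder word measure, identical texts score 1.0 and fully disjoint texts score 0.0; substituting a graded WordNet/Wikipedia measure yields intermediate values.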
Bibtextype inproceedings  +
Doi 10.1109/AISP.2012.6313727  +
Has author Malekzadeh R. + , Bagherzadeh J. + , Noroozi A. +
Has extra keyword Lexical semantics + , Semantic relatedness + , Semantic similarity + , Wikipedia + , Wordnet + , Artificial intelligence + , Information retrieval + , Ontology + , Semantics + , Signal processing + , Websites + , Natural language processing systems +
Has keyword Information retrieval + , Lexical semantic knowledge + , Semantic relatedness + , Semantic similarity + , Wikipedia + , Wordnet +
Isbn 9781467314794  +
Language English +
Number of citations by publication 0  +
Number of references by publication 0  +
Pages 107–111  +
Published in AISP 2012 - 16th CSI International Symposium on Artificial Intelligence and Signal Processing +
Title A hybrid method based on WordNet and Wikipedia for computing semantic relatedness between texts +
Type conference paper  +
Year 2012 +
Creation date 6 November 2014 12:37:12  +
Categories Publications without license parameter  + , Publications without remote mirror parameter  + , Publications without archive mirror parameter  + , Publications without paywall mirror parameter  + , Conference papers  + , Publications without references parameter  + , Publications  +
Modification date 6 November 2014 12:37:12  +
Date 2012  +
A hybrid method based on WordNet and Wikipedia for computing semantic relatedness between texts + Title