Omnipedia: Bridging the Wikipedia Language Gap
|Author(s)||Patti Bao, Brent Hecht, Samuel Carton, Mahmood Quaderi, Michael Horn, Darren Gergle|
|Published in||International Conference on Human Factors in Computing Systems|
|Keyword(s)||Wikipedia, multilingual, hyperlingual, language barrier, user-generated content, text mining|
Omnipedia: Bridging the Wikipedia Language Gap is a 2012 conference paper written in English by Patti Bao, Brent Hecht, Samuel Carton, Mahmood Quaderi, Michael Horn, and Darren Gergle, published in the International Conference on Human Factors in Computing Systems.
We present Omnipedia, a system that allows Wikipedia readers to gain insight from up to 25 language editions of Wikipedia simultaneously. Omnipedia highlights the similarities and differences that exist among Wikipedia language editions, and makes salient information that is unique to each language as well as that which is shared more widely. We detail solutions to numerous front-end and algorithmic challenges inherent to providing users with a multilingual Wikipedia experience. These include visualizing content in a language-neutral way and aligning data in the face of diverse information organization strategies. We present a study of Omnipedia that characterizes how people interact with information using a multilingual lens. We found that users actively sought information exclusive to unfamiliar language editions and strategically compared how language editions defined concepts. Finally, we briefly discuss how Omnipedia generalizes to other domains facing language barriers.
This publication is probably cited by other works, but no articles for them are currently available in WikiPapers.