
Word Embeddings for Practical Information Retrieval

dc.contributor.author: Galke, Lukas
dc.contributor.author: Saleh, Ahmed
dc.contributor.author: Scherp, Ansgar
dc.contributor.editor: Eibl, Maximilian
dc.contributor.editor: Gaedke, Martin
dc.date.accessioned: 2017-08-28T23:47:39Z
dc.date.available: 2017-08-28T23:47:39Z
dc.date.issued: 2017
dc.description.abstract: We assess the suitability of word embeddings for practical information retrieval scenarios. Thus, we assume that users issue ad-hoc short queries, for which we return the first twenty retrieved documents after applying a boolean matching operation between the query and the documents. We compare the performance of several techniques that leverage word embeddings in the retrieval models to compute the similarity between the query and the documents, namely word centroid similarity, paragraph vectors, Word Mover's distance, as well as our novel inverse document frequency (IDF) re-weighted word centroid similarity. We evaluate the performance using the ranking metrics mean average precision, mean reciprocal rank, and normalized discounted cumulative gain. Additionally, we inspect the retrieval models' sensitivity to document length by using either only the title or the full text of the documents for the retrieval task. We conclude that word centroid similarity is the best competitor to state-of-the-art retrieval models. It can be further improved by re-weighting the word frequencies with IDF before aggregating the respective word vectors of the embedding. The proposed cosine similarity of IDF re-weighted word vectors is competitive with the TF-IDF baseline and even outperforms it on the news domain by a relative margin of 15%.
dc.identifier.doi: 10.18420/in2017_215
dc.identifier.isbn: 978-3-88579-669-5
dc.identifier.pissn: 1617-5468
dc.language.iso: en
dc.publisher: Gesellschaft für Informatik, Bonn
dc.relation.ispartof: INFORMATIK 2017
dc.relation.ispartofseries: Lecture Notes in Informatics (LNI) - Proceedings, Volume P-275
dc.subject: Word embeddings
dc.subject: Document representation
dc.subject: Information retrieval
dc.title: Word Embeddings for Practical Information Retrieval
gi.citation.endPage: 2167
gi.citation.startPage: 2155
gi.conference.date: 25.-29. September 2017
gi.conference.location: Chemnitz
gi.conference.sessiontitle: Deep Learning in heterogenen Datenbeständen
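The IDF re-weighted word centroid similarity described in the abstract can be sketched roughly as follows: scale each word vector by its inverse document frequency, average the scaled vectors into a centroid per query and document, and rank by cosine similarity. This is a minimal illustrative sketch, not the authors' implementation; the toy embeddings and corpus in any usage are assumptions, and the paper's actual pipeline (boolean matching, top-20 cutoff, evaluation metrics) is not reproduced here.

```python
import numpy as np

def idf_weights(corpus_tokens):
    """Inverse document frequency per term, idf(t) = log(N / df(t))."""
    n_docs = len(corpus_tokens)
    df = {}
    for doc in corpus_tokens:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    return {t: np.log(n_docs / c) for t, c in df.items()}

def centroid(tokens, embeddings, idf=None):
    """Average the word vectors of `tokens`, optionally IDF re-weighted."""
    vecs = [embeddings[t] * (idf.get(t, 1.0) if idf else 1.0)
            for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0)

def wcs(query, doc, embeddings, idf=None):
    """(IDF re-weighted) word centroid similarity: cosine of the centroids."""
    q = centroid(query, embeddings, idf)
    d = centroid(doc, embeddings, idf)
    return float(q @ d / (np.linalg.norm(q) * np.linalg.norm(d)))
```

With pre-trained embeddings in place of the toy vectors, documents would be ranked by `wcs(query_tokens, doc_tokens, embeddings, idf)` in descending order after the boolean matching step.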

Files

Original bundle
Name: B29-2.pdf
Size: 421.04 KB
Format: Adobe Portable Document Format