Inductive Learning of Concept Representations from Library-Scale Bibliographic Corpora
dc.contributor.author | Galke, Lukas | |
dc.contributor.author | Melnychuk, Tetyana | |
dc.contributor.author | Seidlmayer, Eva | |
dc.contributor.author | Trog, Steffen | |
dc.contributor.author | Förstner, Konrad U. | |
dc.contributor.author | Schultz, Carsten | |
dc.contributor.author | Tochtermann, Klaus | |
dc.contributor.editor | David, Klaus | |
dc.contributor.editor | Geihs, Kurt | |
dc.contributor.editor | Lange, Martin | |
dc.contributor.editor | Stumme, Gerd | |
dc.date.accessioned | 2019-08-27T12:55:21Z | |
dc.date.available | 2019-08-27T12:55:21Z | |
dc.date.issued | 2019 | |
dc.description.abstract | Automated research analyses are becoming increasingly important as the volume of research items grows at an accelerating pace. We pursue a new direction for the analysis of research dynamics with graph neural networks. So far, graph neural networks have been applied only to small-scale datasets and primarily to supervised tasks such as node classification. We propose an unsupervised training objective for concept representation learning that is tailored to bibliographic data with millions of research papers and thousands of concepts from a controlled vocabulary. We evaluate the learned representations on clustering and classification downstream tasks, and additionally conduct nearest-concept queries in the representation space. Our results show that the representations learned by graph convolution with our training objective are comparable to those learned by the DeepWalk algorithm. Our findings suggest that concept embeddings can be derived solely from the text of associated documents, without a lookup-table embedding. Thus, graph neural networks can operate on arbitrary document collections without re-training. This property makes graph neural networks useful for the analysis of research dynamics, which is often conducted on time-based snapshots of bibliographic data. | en |
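As a rough illustration of the inductive idea described in the abstract, the following minimal PyTorch sketch (not the authors' implementation) shows how concept embeddings could be computed purely from the text features of the documents annotated with each concept, so that no per-concept lookup-table embedding is needed and unseen document collections only require new text features. The names `InductiveConceptConv`, `doc_features`, and `edges` are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming precomputed document text vectors (e.g. TF-IDF or
# averaged word embeddings) and (document, concept) annotation pairs.
# This is an illustrative single graph-convolution step, not the paper's code.
import torch
import torch.nn as nn


class InductiveConceptConv(nn.Module):
    def __init__(self, text_dim: int, embed_dim: int):
        super().__init__()
        self.proj = nn.Linear(text_dim, embed_dim)

    def forward(self, doc_features: torch.Tensor, edges: torch.Tensor,
                num_concepts: int) -> torch.Tensor:
        # Mean-aggregate the text vectors of all documents annotated with each
        # concept, then project; new documents only need new feature vectors,
        # no re-trained per-concept embedding table.
        doc_idx, concept_idx = edges[:, 0], edges[:, 1]
        sums = torch.zeros(num_concepts, doc_features.size(1))
        sums.index_add_(0, concept_idx, doc_features[doc_idx])
        counts = torch.zeros(num_concepts).index_add_(
            0, concept_idx, torch.ones(concept_idx.size(0)))
        mean = sums / counts.clamp(min=1).unsqueeze(1)
        return torch.tanh(self.proj(mean))


# Toy usage: 4 documents with 8-dim text features, 3 concepts.
docs = torch.randn(4, 8)
edges = torch.tensor([[0, 0], [1, 0], [2, 1], [3, 2]])
layer = InductiveConceptConv(text_dim=8, embed_dim=16)
concept_embeddings = layer(docs, edges, num_concepts=3)  # shape (3, 16)
```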
dc.identifier.doi | 10.18420/inf2019_26 | |
dc.identifier.isbn | 978-3-88579-688-6 | |
dc.identifier.pissn | 1617-5468 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/24973 | |
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V. | |
dc.relation.ispartof | INFORMATIK 2019: 50 Jahre Gesellschaft für Informatik – Informatik für Gesellschaft | |
dc.relation.ispartofseries | Lecture Notes in Informatics (LNI) - Proceedings, Volume P-294 | |
dc.subject | machine learning | |
dc.subject | representation learning | |
dc.subject | neural networks | |
dc.subject | graph mining | |
dc.title | Inductive Learning of Concept Representations from Library-Scale Bibliographic Corpora | en |
dc.type | Text/Conference Paper | |
gi.citation.endPage | 232 | |
gi.citation.publisherPlace | Bonn | |
gi.citation.startPage | 219 | |
gi.conference.date | 23.-26. September 2019 | |
gi.conference.location | Kassel | |
gi.conference.sessiontitle | Data Science |