Authors: Schrumpf, Johannes; Thelen, Tobias; Henning, Peter A.; Striewe, Michael; Wölfel, Matthias
Date issued: 2022-08-23
Date available: 2022-08-23
ISBN: 978-3-88579-716-6
URI: https://dl.gi.de/handle/20.500.12116/38852

Abstract: Digital Study Assistant (DSA) systems for higher education seek to support learners in identifying, structuring, and pursuing their personal educational goals. One strategy to achieve this is to galvanize learner interest in engaging with educational resources beyond the scope of their known, pre-determined curriculum. For this purpose, DSA systems may provide a recommendation engine that matches learner interests expressed in natural language to educational resources covering the topic of interest. To offer a rich assortment of educational resources, these resources need to be fetched from multiple sources, such as MOOC and OER repositories or the learning management system of a local university. In a previous publication, we presented SidBERT, a BERT-based natural language processing neural network for educational resource classification and recommendation that has been in active use in a prototypical Digital Study Assistant system. This work follows up on the SidBERT architecture by introducing its evolution, SemBERT, which is capable of comparing educational resources at a more fine-grained level, thereby addressing multiple shortcomings of the SidBERT architecture and its application within the DSA software. We present the network architecture and training parameters, and evaluate SemBERT on two datasets. We compare SemBERT to SidBERT and discuss the implications of SemBERT for DSA systems at large.

Language: en
Title: Re-thinking Transformer based educational resource recommendation engines for higher education
Type: Text/Conference Paper
DOI: 10.18420/delfi2022-014
ISSN: 1617-5468