Text document

In-Database Machine Learning: Gradient Descent and Tensor Algebra for Main Memory Database Systems

Date

2019

Publisher

Gesellschaft für Informatik, Bonn

Abstract

Machine learning tasks such as regression, clustering, and classification are typically performed outside of database systems using dedicated tools, necessitating the extraction, transformation, and loading of data. We argue that database systems, when extended to enable automatic differentiation, gradient descent, and tensor algebra, are capable of solving machine learning tasks more efficiently by eliminating the need for costly data communication. We demonstrate our claim by implementing tensor algebra and stochastic gradient descent, using lambda expressions for loss functions, as a pipelined operator in a main memory database system. Our approach enables common machine learning tasks to be performed faster than by extended disk-based database systems, and at least as fast as dedicated tools once the time needed for data extraction is eliminated. This work aims to incorporate gradient descent and tensor data types into database systems, allowing them to handle a wider range of computational tasks.
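To make the core idea concrete, the following is a minimal, hypothetical Python sketch of stochastic gradient descent in which the per-tuple loss is supplied as a lambda expression, in the spirit of the pipelined operator described above. It is not the paper's SQL operator syntax, and it approximates gradients with finite differences, whereas the described system derives them via automatic differentiation inside the database.

import random

def sgd(tuples, loss, weights, lr=0.01, epochs=200, eps=1e-6):
    """Run SGD over (x, y) tuples; `loss` maps (weights, x, y) to a scalar."""
    data = list(tuples)          # copy so the caller's list is not reordered
    w = list(weights)
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:        # one pass over the streamed tuples
            base = loss(w, x, y)
            grad = []
            for i in range(len(w)):          # forward-difference gradient
                w_plus = list(w)
                w_plus[i] += eps
                grad.append((loss(w_plus, x, y) - base) / eps)
            w = [wi - lr * gi for wi, gi in zip(w, grad)]
    return w

# Least-squares loss for a univariate linear model, written as a lambda.
points = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]
squared_error = lambda w, x, y: (w[0] + w[1] * x - y) ** 2
print(sgd(points, squared_error, weights=[0.0, 0.0]))

Swapping in a different lambda (for example, a logistic loss) changes the learning task without changing the operator, which is the flexibility the lambda-expression approach is meant to provide.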

Description

Schüle, Maximilian; Simonis, Frédéric; Heyenbrock, Thomas; Kemper, Alfons; Günnemann, Stephan; Neumann, Thomas (2019): In-Database Machine Learning: Gradient Descent and Tensor Algebra for Main Memory Database Systems. BTW 2019. DOI: 10.18420/btw2019-16. Gesellschaft für Informatik, Bonn. PISSN: 1617-5468. ISBN: 978-3-88579-683-1. pp. 247-266. Wissenschaftliche Beiträge. Rostock, March 4-8, 2019.
