Show simple item record

dc.contributor.author: Ehlers, Jens
dc.contributor.author: Hasselbring, Wilhelm
dc.contributor.editor: Reussner, Ralf
dc.contributor.editor: Grund, Matthias
dc.contributor.editor: Oberweis, Andreas
dc.contributor.editor: Tichy, Walter
dc.date.accessioned: 2019-01-17T13:47:12Z
dc.date.available: 2019-01-17T13:47:12Z
dc.date.issued: 2011
dc.identifier.isbn: 978-3-88579-277-2
dc.identifier.issn: 1617-5468
dc.identifier.uri: http://dl.gi.de/handle/20.500.12116/19873
dc.description.abstract: In addition to studying the construction and evolution of software services, the software engineering discipline needs to address the operation of continuously running software services. A requirement for their robust operation is a means for effective monitoring of software runtime behavior. In contrast to profiling for construction activities, monitoring of operational services should impose only a small performance overhead. Furthermore, instrumentation should be non-intrusive to the business logic, as far as possible. Monitoring of continuously operating software services is essential for achieving high availability and high performance of these services. A main issue for dynamic analysis techniques is the amount of monitoring data that is collected and processed at runtime. On one hand, more data allows for accurate and precise analyses. On the other hand, probe instrumentation, data collection, and analyses may cause significant overheads. Consequently, a trade-off between analysis quality and monitoring coverage has to be reached. In this paper, we present a method for self-adaptive, rule-based performance monitoring. Our approach aims at a flexible instrumentation to monitor a software system's timing behavior. A performance engineer's task is to specify rules that define the monitoring goals for a specific software system. An inference engine decides at which granularity level a component will be observed. We employ the Object Constraint Language (OCL) to specify the monitoring rules. Our goal-oriented, self-adaptive method is based on the continuous evaluation of these rules. The implementation is based on the Eclipse Modeling Framework and the Kieker monitoring framework. In our evaluation, this implementation is applied to the iBATIS JPetStore and the SPECjEnterprise2010 benchmark.
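The abstract describes rules that let an inference engine adjust the monitoring granularity of each component at runtime. A minimal sketch of that idea, in Python rather than the paper's actual OCL/Kieker/EMF implementation (the rule, threshold, and granularity levels below are illustrative assumptions, not taken from the paper):

```python
# Hypothetical sketch of rule-based self-adaptive monitoring: a rule maps
# observed per-component statistics to a monitoring granularity level.
# This is NOT the paper's implementation, which uses OCL rules evaluated
# over EMF models and the Kieker monitoring framework.
from dataclasses import dataclass, field
from statistics import mean

GRANULARITY = ["off", "component", "operation", "trace"]  # coarse -> fine

@dataclass
class ComponentMonitor:
    name: str
    level: int = 1                      # start with coarse component-level probes
    samples: list = field(default_factory=list)

    def record(self, response_time_ms: float) -> None:
        """Collect one response-time observation for this component."""
        self.samples.append(response_time_ms)

    def evaluate_rule(self, threshold_ms: float = 200.0) -> str:
        """Illustrative rule: refine granularity while the mean response
        time violates the threshold; otherwise coarsen it again to keep
        the monitoring overhead low."""
        if self.samples and mean(self.samples) > threshold_ms:
            self.level = min(self.level + 1, len(GRANULARITY) - 1)
        else:
            self.level = max(self.level - 1, 1)
        self.samples.clear()            # next rule evaluation uses fresh data
        return GRANULARITY[self.level]

m = ComponentMonitor("Catalog")
m.record(350.0); m.record(420.0)
print(m.evaluate_rule())  # "operation" -- threshold violated, refine
m.record(50.0)
print(m.evaluate_rule())  # "component" -- behavior normal, coarsen again
```

Continuously re-evaluating such rules realizes the trade-off the abstract names: fine-grained data is collected only where a rule currently demands it.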
dc.language.iso: en
dc.publisher: Gesellschaft für Informatik e.V.
dc.relation.ispartof: Software Engineering 2011 – Fachtagung des GI-Fachbereichs Softwaretechnik
dc.relation.ispartofseries: Lecture Notes in Informatics (LNI) - Proceedings, Volume P-183
dc.title: Self-adaptive software performance monitoring
dc.type: Text/Conference Paper
dc.pubPlace: Bonn
mci.reference.pages: 51-62
mci.conference.sessiontitle: Regular Research Papers
mci.conference.location: Karlsruhe
mci.conference.date: February 21-25, 2011

