This extended abstract summarizes our paper entitled "A Comparison Framework for Runtime Monitoring Approaches" published in the Journal of Systems and Software, vol. 125, 2017 (https://doi.org/10.1016/j.jss.2016.12.034). The paper provides the following contributions: (i) a framework that supports analyzing and comparing runtime monitoring approaches using different dimensions and elements; (ii) an application of the framework to analyze and compare 32 existing monitoring approaches; and (iii) a discussion of perspectives and potential future applications of our framework, e.g., to support the selection of an approach for a particular monitoring problem or application context.

Index Terms: Runtime monitoring, literature review, comparison framework
I. SUMMARY

Many of today's large-scale software systems are complex and heterogeneous, and need to be monitored since their full behavior often only emerges at runtime (e.g., when interacting with other systems or the environment). Diverse monitoring approaches for various kinds of systems and purposes have been proposed [1], e.g., requirements monitoring [2], monitoring of architectural properties [3], and runtime verification [4]. The desired runtime behavior is often formally expressed using temporal logic or domain-specific (constraint) languages. The defined constraints are checked based on events and data collected from systems at runtime, typically through instrumentation.

Existing monitoring approaches are very diverse: only a few provide (end-user) tool support; some cover specific architectural styles (e.g., service-oriented architectures), while others are general-purpose; some automatically generate instrumentation based on models, while others require probes to be developed manually. Approaches also differ regarding their expressiveness [5], e.g., the degree of support for checking the occurrence and/or order of runtime events (temporal behavior), the interactions occurring between different (sub-)systems (structural behavior), and/or the properties held by certain runtime data (data checks). This diversity makes it difficult to analyze and compare existing approaches.
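As a minimal illustration of the kind of constraint checking described above (this sketch is not taken from the paper; the function `check_response` and the event names `"request"`/`"reply"` are hypothetical), a simple temporal "response" property, i.e., every request is eventually followed by a reply, could be checked over a collected event trace as follows:

```python
# Hypothetical sketch of checking a temporal "response" constraint over
# an event trace collected at runtime (e.g., via instrumentation):
# every "request" event must eventually be followed by a "reply" event.

def check_response(trace):
    """Return True if every 'request' in the trace is matched by a later 'reply'."""
    pending = 0  # number of requests still awaiting a reply
    for event in trace:
        if event == "request":
            pending += 1
        elif event == "reply" and pending > 0:
            pending -= 1
    return pending == 0

print(check_response(["request", "log", "reply"]))      # True: request answered
print(check_response(["request", "reply", "request"]))  # False: last request open
```

Real monitoring approaches differ precisely in how such constraints are specified (e.g., temporal logic vs. a domain-specific language), how events are collected, and whether checks run online or on recorded traces.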
We thus developed a comparison framework for runtime monitoring approaches [1] based on the results of a systematic literature review [2], building on existing taxonomies for monitoring languages and patterns [5], [6], and taking inspiration from comparison frameworks in other domains such as software architecture and software product lines [7], [8].

Specifically, our paper [1] provides the following contributions: (i) a framework that supports analyzing and comparing runtime monitoring approaches using different dimensions and elements; (ii) an application of the framework to analyze and compare 32 existing monitoring approaches; and (iii) a discussion of perspectives and potential future applications of our framework, e.g., to support the selection of an approach for a particular monitoring problem or application context.