Statistical Model Checking of Response Times for Different System Deployments

Bernhard K. Aichernig, Severin Kann, Richard Alexander Schumi

Publication: Contribution in book/report/conference proceedings › Conference contribution › peer-reviewed


Performance testing is becoming increasingly important for interactive systems. Evaluating their performance with respect to user expectations is complex, especially across different system deployments. Various load-testing approaches and performance-simulation methods aim at such analyses. However, these techniques have certain disadvantages, such as the high testing effort of load testing and the questionable model accuracy of simulation methods. Hence, we propose a combination of both techniques: we apply statistical model checking with a learned timed model and evaluate the results on the real system with hypothesis testing. Moreover, we check the hypotheses established on a reference system against various system deployments (configurations), such as different hardware or network settings, and analyse their influence on the performance. Our method is realised with a property-based testing tool that is extended with algorithms from statistical model checking. We illustrate the feasibility of our technique with an industrial case study of a web application.
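As a rough illustration of the hypothesis-testing step the abstract describes (not the paper's actual tool chain, which builds on a property-based testing framework), a response-time hypothesis such as "at least 90% of requests answer within 2 s" can be checked sequentially with Wald's sequential probability ratio test. All names, distributions, and parameters below are hypothetical stand-ins:

```python
import math
import random

def sprt(sample, p0, p1, alpha=0.05, beta=0.05, max_samples=10_000):
    """Wald's sequential probability ratio test (SPRT).

    Decides between H0: success probability >= p0 and
    H1: success probability <= p1 (with p1 < p0).  Each call to
    sample() must return True (success) or False (failure);
    alpha and beta bound the two error probabilities.
    """
    upper = math.log((1 - beta) / alpha)   # crossing it accepts H1
    lower = math.log(beta / (1 - alpha))   # crossing it accepts H0
    llr = 0.0                              # running log-likelihood ratio
    for n in range(1, max_samples + 1):
        if sample():
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "inconclusive", max_samples

# Hypothesis: at least 90% of requests respond within 2 seconds.
# The "system" is a stand-in model here: exponentially distributed
# response times with a mean of 0.5 s (purely illustrative numbers).
random.seed(1)
verdict, n = sprt(lambda: random.expovariate(1 / 0.5) <= 2.0,
                  p0=0.9, p1=0.8)
```

The appeal of the sequential test is that it draws only as many samples from the deployed system as the observed data require, which matches the abstract's goal of reducing load-testing effort.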
Title: Dependable Software Engineering. Theories, Tools, and Applications – 4th International Symposium, SETTA 2018
Publisher: Springer
Publication status: Published - 2018
Event: 4th International Symposium on Dependable Software Engineering (SETTA 2018) - Beijing, China
Duration: 4 Sep 2018 to 6 Sep 2018


Conference: 4th International Symposium on Dependable Software Engineering (SETTA 2018)

Fields of Expertise

  • Information, Communication & Computing

