Generating Test Sequences to Assess the Performance of Elastic Cloud-based Systems
Abstract
Elasticity is one of the main features of cloud-based systems (CBS), where elastic adaptations, such as scaling computational resources in or out, help to meet performance requirements under varying workloads. There is an industrial need to find configurations of elastic adaptations and workload that could lead to performance degradation in a CBS, possibly serving millions of users. However, the potentially large number of such configurations poses a challenge: executing and verifying all of them on the cloud can be prohibitively expensive in both time and cost. We present an approach that models elasticity adaptation due to workload changes as a classification tree and then generates short test sequences of configurations covering all T-wise interactions between the parameters in the model. These test sequences, when executed, help assess the performance of elastic CBS. Using MongoDB as a case study, test sequences generated by our approach reveal several significant performance degradations.
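To illustrate the kind of T-wise test-sequence generation the abstract refers to, the following is a minimal sketch of greedy pairwise (T = 2) covering-configuration generation in Python. The parameter names and values (workload, adaptation, data_size) are hypothetical placeholders, not the paper's classification-tree model, and the greedy heuristic stands in for whatever combinatorial generator the approach actually uses.

```python
from itertools import combinations, product

# Hypothetical parameters for an elastic CBS scenario (illustrative only;
# the paper's classification-tree model defines its own parameters).
parameters = {
    "workload": ["low", "medium", "high"],
    "adaptation": ["scale_out", "scale_in", "none"],
    "data_size": ["small", "large"],
}

names = list(parameters)

def all_pairs(params):
    """Every value pair between two distinct parameters (the T=2 coverage goal)."""
    pairs = set()
    for p1, p2 in combinations(names, 2):
        for v1, v2 in product(params[p1], params[p2]):
            pairs.add(((p1, v1), (p2, v2)))
    return pairs

def pairs_of(config):
    """Value pairs exercised by a single full configuration."""
    items = [(p, config[p]) for p in names]
    return set(combinations(items, 2))

def greedy_pairwise(params):
    """Greedily pick configurations until every pair is covered at least once."""
    remaining = all_pairs(params)
    candidates = [dict(zip(names, vals))
                  for vals in product(*(params[p] for p in names))]
    suite = []
    while remaining:
        best = max(candidates, key=lambda c: len(pairs_of(c) & remaining))
        gained = pairs_of(best) & remaining
        if not gained:
            break  # safety net; cannot happen while pairs remain uncovered
        suite.append(best)
        remaining -= gained
    return suite

if __name__ == "__main__":
    for i, config in enumerate(greedy_pairwise(parameters), 1):
        print(i, config)
```

Ordering such covering configurations into executable test sequences, and mapping them onto actual cloud adaptations and workloads, is the part specific to the paper's approach and is not reproduced here.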