Towards Leveraging Tests to Identify Impacts of Metamodel and Code Co-evolution
Abstract
Models play a significant role in Model-Driven Engineering (MDE), and metamodels are commonly transformed into code. Developers rely intensively on the generated code to build language services and tooling, such as editors and views, which are also tested to ensure their behavior. When a metamodel evolves between releases, the generated code is updated accordingly, which may impact the developers' additional client code. The impacted code must therefore be co-evolved as well, but there is no guarantee that its behavioral correctness is preserved. This paper envisions an automatic approach for checking the correctness of code co-evolution. It first traces the tests impacted by the metamodel evolution, both before and after the code co-evolution, and then compares their outcomes to analyze the behavior of the code. A preliminary evaluation on two Eclipse projects, OCL and Modisco, showed that we could successfully trace the impacted tests automatically, selecting 738 and 412 tests before and after co-evolution, respectively, based on 303 metamodel changes. By running these impacted tests, we observed both behaviorally correct and incorrect code co-evolutions.
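Since the abstract only outlines the envisioned approach, the following is a minimal sketch of its two steps: selecting the tests impacted by metamodel changes (assuming trace links from each test to the generated-API elements it exercises) and comparing test verdicts before and after co-evolution. All names and data here (CoEvolutionCheck, selectImpactedTests, the toy trace map) are hypothetical illustrations, not the paper's actual tooling.

```java
import java.util.*;

/** Hypothetical sketch of impacted-test selection and verdict comparison. */
public class CoEvolutionCheck {

    /** A test is impacted if any generated-API element it exercises
     *  corresponds to a metamodel element touched by the evolution. */
    static Set<String> selectImpactedTests(Map<String, Set<String>> testToApiUsage,
                                           Set<String> changedElements) {
        Set<String> impacted = new TreeSet<>();
        for (var entry : testToApiUsage.entrySet()) {
            for (String apiElement : entry.getValue()) {
                if (changedElements.contains(apiElement)) {
                    impacted.add(entry.getKey());
                    break;
                }
            }
        }
        return impacted;
    }

    /** Tests that passed before co-evolution but fail after it are
     *  candidates for behaviorally incorrect co-evolution. */
    static Set<String> regressions(Map<String, Boolean> before,
                                   Map<String, Boolean> after) {
        Set<String> broken = new TreeSet<>();
        for (var e : before.entrySet()) {
            if (e.getValue() && Boolean.FALSE.equals(after.get(e.getKey()))) {
                broken.add(e.getKey());
            }
        }
        return broken;
    }

    public static void main(String[] args) {
        // Toy data standing in for real trace links and JUnit verdicts.
        Map<String, Set<String>> usage = Map.of(
            "testRenameVisitor", Set.of("Expression.getName"),
            "testPrinter",       Set.of("Statement.print"));
        Set<String> changed = Set.of("Expression.getName"); // e.g., a renamed property

        Set<String> impacted = selectImpactedTests(usage, changed);
        System.out.println("Impacted tests: " + impacted);

        Map<String, Boolean> before = Map.of("testRenameVisitor", true);
        Map<String, Boolean> after  = Map.of("testRenameVisitor", false);
        System.out.println("Suspected regressions: " + regressions(before, after));
    }
}
```

In this reading, a test that fails in both runs, or passes in both, gives no signal on its own; it is the pass-to-fail transitions on impacted tests that point at a co-evolution step whose behavior diverged.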