Automated Test Model Generation and Evaluation for Model Transformation Verification in Simulink-Based Systems Development

Abstract

Rigorous testing is essential to ensure the correctness of model transformations, particularly those involving Simulink models used in the design of complex systems. A central challenge in model transformation testing is verifying that semantics are preserved between input and output models. This thesis proposes a method to automatically generate test model suites tailored to this semantic verification task. The student will implement the method and integrate automated execution and evaluation of the generated test models.

Problem Statement

Model transformations are a core element in maintaining model continuity throughout the development of large systems, where diverse tools and formalisms are used in different design stages. Semantic inconsistencies between a source model and its transformed counterpart are a major risk. Manually creating test models for transformation verification is time-consuming, error-prone, and does not scale.

Research Questions

  1. How can a systematic method be defined to automatically generate suites of test models that specifically target the preservation of semantics between Simulink input models and their transformed counterparts?
  2. How can this method be effectively implemented to provide a practical tool for streamlining model transformation testing?
  3. What mechanisms are best suited for automating the execution and evaluation of generated test models to provide informative verification results?

Methodology

  1. Method Development: Design a method based on a formal analysis of Simulink model structures and transformation rules. The method will generate test input models that exercise diverse model elements and semantic relationships (see the first sketch after this list).
  2. Implementation: Implement the method as a software tool, potentially integrated within an existing modeling environment.
  3. Case Studies: Develop representative Simulink-based case studies with accompanying model transformations that demonstrate different transformation patterns.
  4. Test Generation and Evaluation: Use the implemented tool to generate test suites, execute the transformations on these test models, and develop automated evaluation techniques that assess the semantic equivalence of input and output models (see the second sketch after this list).
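
To make the generation step concrete, the sketch below builds a minimal Simulink test model programmatically via the MATLAB Engine API for Python. This is an illustrative sketch, not the proposed method itself: the model name, block selection, and parameter values are assumptions, and a local MATLAB installation with the engine API is presumed.

    # Sketch: programmatically build a minimal Simulink test model that
    # exercises a single block type (Gain). Model name, blocks, and
    # parameter values are illustrative assumptions.
    import matlab.engine

    def build_test_model(eng, name='gen_test_model'):
        """Create and save a tiny Stimulus -> Gain -> Out test model."""
        eng.new_system(name, nargout=0)
        eng.add_block('simulink/Sources/Sine Wave', name + '/Stimulus', nargout=0)
        eng.add_block('simulink/Math Operations/Gain', name + '/Gain', nargout=0)
        eng.add_block('simulink/Sinks/Out1', name + '/Out', nargout=0)
        eng.set_param(name + '/Gain', 'Gain', '2', nargout=0)
        # Wire the blocks: Stimulus -> Gain -> root outport.
        eng.add_line(name, 'Stimulus/1', 'Gain/1', nargout=0)
        eng.add_line(name, 'Gain/1', 'Out/1', nargout=0)
        eng.save_system(name, nargout=0)
        return name

    if __name__ == '__main__':
        eng = matlab.engine.start_matlab()
        build_test_model(eng)
        eng.quit()

An actual generator would iterate such constructions over the block types, parameter ranges, and interconnection patterns identified by the formal analysis, rather than hard-coding a single topology.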
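
For the evaluation step, one plausible mechanism is back-to-back simulation: feed the same stimulus to the input model and its transformed counterpart and compare the logged output traces within a numeric tolerance. The sketch below assumes both models log their root outports to 'yout' in Dataset format; the model names and the tolerance are placeholders.

    # Sketch: back-to-back simulation check for semantic equivalence.
    # Assumes the MATLAB Engine API for Python and that both models log
    # their root outports to 'yout' in Dataset format.
    import matlab.engine
    import numpy as np

    TOLERANCE = 1e-6  # illustrative tolerance for trace comparison

    def simulate(eng, model):
        """Simulate `model` and return its first logged output trace."""
        eng.load_system(model, nargout=0)
        # Run the simulation in the MATLAB workspace and pull the logged
        # data for the first root outport back into Python.
        eng.eval("out = sim('%s'); y = out.yout{1}.Values.Data;" % model,
                 nargout=0)
        return np.array(eng.workspace['y'])

    def semantically_equivalent(eng, source_model, transformed_model):
        """Compare the two output traces pointwise within TOLERANCE."""
        y_src = simulate(eng, source_model)
        y_trg = simulate(eng, transformed_model)
        return y_src.shape == y_trg.shape and bool(
            np.allclose(y_src, y_trg, atol=TOLERANCE))

    if __name__ == '__main__':
        eng = matlab.engine.start_matlab()
        # 'input_model' and 'output_model' are placeholder model names.
        verdict = semantically_equivalent(eng, 'input_model', 'output_model')
        print('semantic equivalence:', 'PASS' if verdict else 'FAIL')
        eng.quit()

Numeric trace comparison is only one possible oracle; structural checks or formal equivalence arguments could complement it, which is precisely what the third research question asks.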

Student Opportunities

  • Model-Driven Engineering: Deepen understanding of Model-Driven Engineering (MDE) concepts, model transformation techniques, and their role in system design.
  • Functional Safety: Explore potential applications of model transformation testing to ensure functional safety in critical systems.
  • Agile Development: Collaborate within a team using Scrum-like methodologies for project management and software development.

Significance

This project addresses the crucial need for automated, reliable verification of model transformations, especially within Simulink-based design environments. Its findings will contribute to increased efficiency and trustworthiness in developing large, complex systems where model continuity and semantic consistency are paramount.
