OBJECT-ORIENTED PROGRAMMING, SYSTEMS, LANGUAGES and APPLICATIONS
 
 

MDAbench: A Tool for Customized Benchmark Generation Using MDA

Courtyard (room C)
Tuesday, 12:00, 45 minutes

 


 

Liming Zhu, School of Computer Science and Engineering, University of New South Wales, Australia; Empirical Software Engineering Program, National ICT Australia Ltd.
Yan Liu, Empirical Software Engineering Program, National ICT Australia Ltd.
Ian Gorton, Empirical Software Engineering Program, National ICT Australia Ltd.
Ngoc Bao Bui, Faculty of Information Technology, University of Technology Sydney, Australia

Demonstration number: 6

Designing component-based applications that meet performance requirements remains a challenging problem, and usually requires a prototype to be constructed to benchmark performance. Building a custom benchmark suite is, however, costly and tedious due to the complexity and "plumbing" involved in modern component containers and the ad hoc mechanisms they adopt for performance measurement. This demonstration illustrates an approach to generating customized component-based benchmark applications using Model Driven Architecture (MDA). All the platform-related plumbing and basic performance-testing routines are encapsulated in MDA generation "cartridges", along with default implementations of the testing logic.

We will show how to use a tailored version of the UML 2.0 Testing Profile to model a customized load-testing client. The load-testing design is logically structured following the testing profile. The performance configuration (such as transaction mix, number of concurrent users, test duration, and load-spike simulation) can also be expressed in the UML model and then generated into code and configuration files. Executing the generated deployable code collects the performance-testing data automatically.

The tool is implemented on top of AndroMDA, a widely used open-source MDA framework. We extended AndroMDA with a cartridge for a performance-testing-tailored version of the UML 2.0 Testing Profile, which can be used to model and generate a load-testing suite and performance-measurement infrastructure. Essentially, we use OO-based meta-modeling to design and implement a lightweight performance-testing domain-specific language, with supporting infrastructure, on top of the existing UML testing standard.
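To give a feel for what a generated load-testing client might look like, the following is a minimal, hypothetical Java sketch (not MDAbench output): a performance configuration of the kind the abstract says is modeled in UML (transaction mix, concurrent users, request count) driving a pool of simulated users. All class, field, and method names here are illustrative assumptions; the placeholder "transaction" simply counts completions where generated code would invoke the component under test and record timings.

```java
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of a generated load-test driver (names are illustrative)
public class LoadTestSketch {
    // Performance configuration that would normally come from the UML model
    static final int CONCURRENT_USERS = 4;
    static final int REQUESTS_PER_USER = 25;
    // Transaction mix: operation names and weights are assumptions for illustration
    static final Map<String, Double> TRANSACTION_MIX =
            Map.of("browse", 0.7, "purchase", 0.3);

    // Runs all simulated users and returns the number of completed transactions
    public static int runLoadTest() throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(CONCURRENT_USERS);
        AtomicInteger completed = new AtomicInteger();
        for (int u = 0; u < CONCURRENT_USERS; u++) {
            pool.submit(() -> {
                for (int r = 0; r < REQUESTS_PER_USER; r++) {
                    // Placeholder for the generated call into the component
                    // under test; real generated code would also time the call
                    completed.incrementAndGet();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runLoadTest());
    }
}
```

In the tool described above, the equivalent of these constants and the test loop would be generated from the UML 2.0 Testing Profile model rather than written by hand.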