Sunday, 26 October

8:30-17:00 Full day

Tutorial 19: Automated Software Testing: Hands On and Interactive!
Workshop 24: Middleware Benchmarking

8:30-12:00 Morning

Tutorial 5: Evolutionary Design
Workshop 17: Extreme Programming Practices in the First CS Courses

Wednesday, 29 October

13:30-17:00 Afternoon

Tutorial 45: Test-Driven Development with "fit", the Framework for Integrated Test
Tutorial 48: Guided Inspection of UML Models

19 Automated Software Testing: Hands On and Interactive!

Sunday, 26 October – 8:30-17:00 Full day

Gerard Meszaros, ClearStream Consulting, gerard.meszaros@acm.org
Ralph Bohnet, ClearStream Consulting, ralph@clrstream.com

This tutorial takes the "extreme" out of Extreme Programming. It brings the expertise acquired by the agile software development community to mainstream software developers working on all kinds of projects. The Extreme Programming community has shown the value of writing automated software unit and acceptance tests for increasing quality, reducing development cost and improving agility. Now that we know this is possible, how can we apply these lessons to more traditional software projects? And if you are an XP developer, how can you improve the way you write tests?

The XUnit family of testing frameworks (including JUnit, SUnit, NUnit, VbUnit, etc.) provides an important first step in software test automation. But test automation is much more than knowing how to program a test in XUnit. The neophyte test-writer is faced with a deluge of questions and issues, e.g.:

  • What should I be testing?
  • How many tests are enough?
  • What should I focus on while writing my tests?
  • How do I test my tests?
  • How can I make my tests more understandable?
  • How do my tests relate to use cases?
  • How can I make my software easier to test?

This tutorial addresses these and many other questions.
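For readers unfamiliar with the xUnit style these questions refer to, the following sketch shows its essential pattern: a fixture created fresh in setUp before each test, and small test methods that assert on expected behavior. The ShoppingCart class and the hand-rolled assertion helpers are hypothetical illustrations, not tutorial material; a real project would use JUnit's @Test and Assert instead.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical class under test.
class ShoppingCart {
    private final List<Double> prices = new ArrayList<>();
    void add(double price) { prices.add(price); }
    double total() {
        return prices.stream().mapToDouble(Double::doubleValue).sum();
    }
    int itemCount() { return prices.size(); }
}

public class ShoppingCartTest {
    private ShoppingCart cart;

    // xUnit fixture: each test starts from a known, fresh state.
    void setUp() { cart = new ShoppingCart(); }

    void testNewCartIsEmpty() {
        assertEquals(0, cart.itemCount());
    }

    void testTotalSumsItemPrices() {
        cart.add(19.99);
        cart.add(5.01);
        assertEquals(25.00, cart.total(), 0.001);
    }

    // Minimal stand-ins for JUnit's assertion methods.
    static void assertEquals(int expected, int actual) {
        if (expected != actual)
            throw new AssertionError(expected + " != " + actual);
    }
    static void assertEquals(double expected, double actual, double delta) {
        if (Math.abs(expected - actual) > delta)
            throw new AssertionError(expected + " != " + actual);
    }

    public static void main(String[] args) {
        ShoppingCartTest t = new ShoppingCartTest();
        t.setUp(); t.testNewCartIsEmpty();        // setUp runs before
        t.setUp(); t.testTotalSumsItemPrices();   // every test method
        System.out.println("all tests passed");
    }
}
```

Note that the tests express intent (an empty cart, a summed total) rather than implementation detail, which is one answer to "what should I focus on while writing my tests?"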

Attendee background

This tutorial is intended for professional software developers who would like to learn how to test their software more effectively.

Prerequisites: Participants must be familiar with the object-oriented software development paradigm and be fluent in one or more object-oriented languages. Familiarity with Java or C# is useful but not required, and prior exposure to a unit testing framework such as JUnit is helpful.


Lecture and hands-on exercises


Gerard built his first unit testing framework in 1996 and has been doing automated unit testing ever since. Along the way, he has become an expert in refactoring software, refactoring tests, and designing for testability. Gerard has applied automated unit and acceptance testing on projects ranging from full-on eXtreme Programming to traditional waterfall development. He has presented tutorials at the past three OOPSLAs, organized workshops at four previous OOPSLAs, and presented papers on developing and testing object-oriented software frameworks at two previous OOPSLAs as well as at XP2001 and XP2002.

Ralph Bohnet is a senior consultant and trainer with ClearStream Consulting and has been doing agile development since 2000. He is an avid advocate of automated testing and test-first development. In his 13 years of IT experience, he has acted as team leader, mentor, business analyst, OO developer, and instructor. He currently leads a testing team for a national railway company, and he has been president of the Calgary Java Users Group for the past four years.

24 Middleware Benchmarking

Sunday, 26 October – 8:30-17:00 Full day

Paul Brebner, CSIRO Mathematical and Information Sciences, Australia, paul.brebner@csiro.au
Emmanuel Cecchet, INRIA Rhone-Alpes, France, emmanuel.cecchet@inrialpes.fr
Julie Marguerite, ObjectWeb Consortium, France, julie.marguerite@inrialpes.fr
Petr Tuma, Charles University, Czech Republic, petr.tuma@mff.cuni.cz

The goal of the workshop is to help advance the current practice of gathering performance characteristics of middleware implementations through benchmarking. The workshop will serve as a meeting point between middleware developers and middleware users as two representative groups that are typically involved in middleware benchmarking but tend to have different requirements. Positions are solicited especially from people with previous or impending benchmarking experience.

The participants of the workshop will identify requirements and obstacles of middleware benchmarking and form a position on issues such as: designing a framework that supports designing, running, and evaluating a diverse range of benchmarks over a diverse range of middleware; defining benchmark criteria that allow meaningful comparison of results collected on different platforms; designing a framework suitable for regular regression testing; and providing means to verify benchmarking results and their applicability to specific usage situations.

Keywords: middleware, benchmarking, performance evaluation


5 Evolutionary Design

Sunday, 26 October – 8:30-12:00 Morning

Joshua Kerievsky, Extreme Programmer and Coach, Industrial Logic, Inc., joshua@industriallogic.com
Russ Rufer, Extreme Programmer and Coach, Industrial Logic, Inc., russ@industriallogic.com

While Test-Driven Development and Refactoring are extremely useful software development practices, they are insufficient for evolving great designs. What's missing are the thinking and coding practices that real-world evolutionary designers use to evolve top-notch designs effectively. Such practices include the critical learning that results from early end-to-end system development, the significant time savings obtained by determining what does not need to be automated, the eye-opening design simplicity achieved by automating failing acceptance tests before writing code, the important design progress that results, paradoxically, from undoing previous design work, and more.

This tutorial takes the mystery out of Evolutionary Design by naming and explaining what its thinking and coding practices are and how to implement them. You'll be challenged to solve Evolutionary Design exercises and you'll experience how a game of blackjack evolves from a first failing UI test to a functioning version of the game. Along the way you'll learn how not to evolve blackjack, you'll study micro-snapshots of the evolution-in-progress and you'll understand what the evolution of a game teaches us about Evolutionary Design on real-world projects.
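To give a flavor of what "automating a failing test before writing code" means in practice, here is a hypothetical first step in a blackjack evolution (illustrative only, not the tutorial's actual exercise). The test in main would be written first, fail because Hand does not yet exist, and then drive the simplest Hand class that satisfies it.

```java
public class BlackjackSketch {

    // The simplest Hand that makes the first test pass; later tests
    // (aces counting as 1 or 11, dealer rules, etc.) would force it
    // to grow.
    static class Hand {
        private int points;
        void add(int cardValue) { points += cardValue; }
        int score() { return points; }
        boolean isBust() { return points > 21; }
    }

    public static void main(String[] args) {
        // The "first failing test": written before Hand existed.
        Hand hand = new Hand();
        hand.add(10);
        hand.add(7);
        if (hand.score() != 17 || hand.isBust())
            throw new AssertionError("expected score 17, not bust");
        System.out.println("first test passes");
    }
}
```

The point of the exercise is not the code itself but the order of events: the test defines the next small increment of behavior, and only then is the production code written or reshaped to meet it.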

Attendee background

Prerequisites: Participants should be able to read Java™ code to get the most out of the session. No background is required in Agile Development, Refactoring, or Test-Driven Design.


Interactive lecture and programming demonstration


Joshua Kerievsky has been programming professionally since 1987, and is the founder of Industrial Logic (http://industriallogic.com), a company specializing in Extreme Programming (XP). Since 1999, Joshua has been coaching and programming on small, large and distributed XP projects and teaching XP to people throughout the world. He is the author of numerous XP and patterns-based articles, simulations and games, including the forthcoming book, Refactoring to Patterns (http://industriallogic.com/xp/refactoring/).

Russ Rufer has been building software systems for 15 years. His wide-ranging experience includes desktop applications, embedded firmware, telecommunications, networking, satellite simulation, and productivity tools. Russ leads weekly meetings of the Silicon Valley Patterns Group, which he founded in 1998 (http://pentad.com/SiliconValleyPatterns.html), and regularly organizes pre-publication review teams to provide feedback on new literature from the software patterns and agile development communities. Russ has worked with Industrial Logic for several years. He divides his time between pure development, coaching, and leading workshops on Extreme Programming, Testing, Refactoring, and Patterns.

17 Extreme Programming Practices in the First CS Courses

Sunday, 26 October – 8:30-12:00 Morning

Joseph Bergin, Pace University, berginf@pace.edu
James Caristi, Valparaiso University, James.Caristi@valpo.edu
Daniel Steinberg, Dim Sum Thinking, Inc., DSteinberg@core.com

Most of the practices of Extreme Programming are beneficial to students in their computer science courses. But in order to teach students properly, pedagogical changes are needed as early as CS1. This workshop seeks participants who have significant ideas for changes that can be made in early computer science courses that involve integrating any of the practices of Extreme Programming or other agile methodologies.

Prospective participants should send in a short position paper outlining one or two ideas they have. During the workshop, participants will critically discuss the ideas that have been suggested and explore any new ones that arise. Participants will agree to allow their ideas to be shared via a web page to be posted in various CS educational resource repositories.


45 Test-Driven Development with "fit", the Framework for Integrated Test

Wednesday, 29 October – 13:30-17:00 Afternoon

Ward Cunningham, Cunningham & Cunningham, Inc., ward@c2.com

This tutorial introduces the Framework for Integrated Test (fit) and demonstrates its use in Test-Driven Development (TDD), as practiced in Extreme Programming and other agile development methods. Projects use fit-style tests both to guide programming and to test the correctness of the result. Test-driven designs are more easily "refactored," making TDD the only programming method that expects programs to get "cleaner" over time.

Short lectures will explain just enough of Extreme Programming to establish the context for test-driven design. These will be followed by live demonstrations and laboratory exercises. The labs will use simple Java, but the emphasis is on familiarity with the frameworks, tools and techniques, not programming. If you are unfamiliar with Java, you will learn enough just by watching to be able to complete some of the exercises and obtain all of the benefits of the tutorial.
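For a sense of what a fit test looks like: fit tests are ordinary HTML documents built from tables. The first row names a fixture class, the second row names inputs and expected outputs (a trailing "()" marks a computed value to be checked), and each subsequent row is one test case. The sketch below follows the classic Division example from the fit documentation:

```
<table>
  <tr><td colspan="3">eg.Division</td></tr>
  <tr><td>numerator</td><td>denominator</td><td>quotient()</td></tr>
  <tr><td>1000</td><td>10</td><td>100</td></tr>
  <tr><td>12.6</td><td>3</td><td>4.2</td></tr>
</table>
```

On the Java side, the matching fixture (a subclass of fit.ColumnFixture) declares public fields numerator and denominator and a quotient() method; the framework binds the table columns to them and marks each expected cell as passing or failing.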

Bring a laptop, or join someone who has one, to do hands-on exercises. Bring a wireless networking card to participate in additional "online" activities. Install Java at home to save time in class. Google "sun java download" to find a version for your computer.

Attendee background

Prerequisites: Some programming experience with an object-oriented language is required (not necessarily Java). Familiarity with downloading and installing software is also required.


Lecture, demonstration, and optional exercises


Ward Cunningham is a founder of Cunningham & Cunningham, Inc. He has served as Director of R&D at Wyatt Software and as Principal Engineer in the Tektronix Computer Research Laboratory. Ward is well known for his contributions to the developing practice of object-oriented programming, the variation called Extreme Programming, and the communities hosted by his WikiWikiWeb.

48 Guided Inspection of UML Models

Wednesday, 29 October – 13:30-17:00 Afternoon

John McGregor, Clemson University, johnmc@cs.clemson.edu

There is widespread agreement that finding defects as early in the development life cycle as possible is cost-effective; however, there are few systematic techniques for accomplishing this goal. Guided inspection is an inspection technique that is "guided" by test cases. By constructing a "complete" set of test cases, the guided inspection technique identifies elements missing from the model and evaluates the quality of those that are present. This tutorial illustrates the technique using design models created in the Unified Modeling Language. Checklists designed for use at various points in a typical development process help the inspector select the most effective test cases.

Guided Inspection has several benefits:

  • Objectivity - systematically selects test cases to give all portions of the model equal coverage.
  • Traceability - links the faults detected back to specific requirements.
  • Testability - identifies portions of the design that are complex and require much effort to test.

After taking this tutorial, participants will be able to:

  • define test scenarios from use cases.
  • apply these test cases to an actual system model.
  • adapt the technique and checklists to the maturity of a specific model.

Attendee background

Prerequisites: Attendees should be familiar with UML. It will be helpful if attendees have participated in software reviews and inspections previously.


Lecture and exercises


Dr. John D. McGregor is a partner in Luminary Software and an associate professor of computer science at Clemson University. He conducts research, teaches graduate software engineering courses, and serves as a consultant to companies in several domains. Dr. McGregor has conducted research for organizations such as the Software Engineering Institute, National Science Foundation, DARPA, IBM and AT&T. He has applied those research results on projects in telecommunications, insurance, and financial institutions. He is co-author of "A Practical Guide to Testing Object-Oriented Software," published by Addison-Wesley. Dr. McGregor's current research interests include software product lines, design quality, testing and measurement.