Middleware Benchmarking: Approaches, Results, Experiences

OOPSLA 2003 Workshop
October 26, 2003
Anaheim, California, USA

The workshop summary article is currently in the preprint phase.

The goal of the workshop is to advance the current practice of gathering performance characteristics of middleware implementations through benchmarking. The workshop will serve as a meeting point between middleware developers and middleware users, two representative groups that are typically involved in middleware benchmarking but tend to have different requirements. Positions are solicited especially from people with prior or planned benchmarking experience.

The participants of the workshop will identify the requirements and obstacles of middleware benchmarking and form a position on issues such as: designing a framework that would allow designing, running, and evaluating a diverse range of benchmarks over a diverse range of middleware; defining benchmark criteria that would allow a meaningful comparison of results collected on different platforms; designing a framework suitable for regular regression testing; and providing means to verify benchmarking results and their applicability to specific usage situations.
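To make the framework discussion concrete, the following is a minimal sketch of the kind of harness interface such a framework might expose. It is purely illustrative: the `Benchmark` and `Harness` names and the warm-up/measurement split are assumptions for the sake of the example, not taken from any workshop submission. Collecting per-iteration samples rather than a single average is what makes cross-platform comparison and regression detection possible later.

```java
// Hypothetical sketch of a minimal benchmark-framework interface. The warm-up
// phase and per-iteration timing are common benchmarking practice, assumed
// here for illustration; no specific workshop framework is being described.

import java.util.ArrayList;
import java.util.List;

interface Benchmark {
    void setUp();        // prepare middleware resources (connections, stubs)
    void runOperation(); // one measured operation, e.g. a remote invocation
    void tearDown();     // release resources
}

final class Harness {
    // Runs discarded warm-up iterations followed by measured iterations,
    // returning the duration of each measured iteration in nanoseconds.
    static List<Long> run(Benchmark b, int warmup, int measured) {
        b.setUp();
        try {
            for (int i = 0; i < warmup; i++) b.runOperation();
            List<Long> samples = new ArrayList<>(measured);
            for (int i = 0; i < measured; i++) {
                long start = System.nanoTime();
                b.runOperation();
                samples.add(System.nanoTime() - start);
            }
            return samples;
        } finally {
            b.tearDown();
        }
    }

    public static void main(String[] args) {
        // Stand-in workload: a trivial no-op "middleware operation".
        Benchmark noop = new Benchmark() {
            public void setUp() {}
            public void runOperation() {}
            public void tearDown() {}
        };
        List<Long> samples = run(noop, 1000, 100);
        System.out.println("samples=" + samples.size()); // prints "samples=100"
    }
}
```

Keeping the raw samples allows the same harness to feed both a cross-platform comparison (via summary statistics) and a regression-testing pipeline (via comparison against stored historical runs), two of the framework concerns raised above.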

Schedule

To reflect the focus of the submissions, the schedule divides the workshop into two parts. The morning part groups presentations related to middleware benchmark construction, whereas the afternoon part groups presentations related to workload generation and characterization. The schedule was arranged to provide time for debate after each presentation.

The schedule ends with the preparation of the workshop statement, which will become a basis for the workshop poster and later also for the workshop publication. To help achieve convergence, a set of open questions relevant to middleware benchmarking will be raised by the participants during the opening of the workshop, and each presenter will be asked to conclude by explaining which of these questions the presentation addresses and how.

8:30-8:35 Welcome and introduction
8:35-8:45 Round of raising questions
Topic 1: Middleware Benchmark Construction
8:45-9:15 Bruno Dillenseger, Emmanuel Cecchet: CLIF is a Load Injection Framework (replacing Paul Brebner: The Impact of Object Oriented Characteristics of Middleware Benchmarks)
9:30-10:00 Arvind S. Krishna, Jaiganesh Balasubramanian, Aniruddha Gokhale, Douglas C. Schmidt, Diego Sevilla, Gautam Thaker: Empirically Evaluating CORBA Component Model Implementations
10:30-11:00 Lubomir Bulej, Petr Tuma: Regression Benchmarking in Middleware Development
11:15-11:45 Bruno Dufour, Laurie Hendren, Clark Verbrugge: Problems in Objectively Quantifying Benchmarks using Dynamic Metrics
Topic 2: Workload Generation and Characterization
13:30-14:00 Lieven Eeckhout, Andy Georges, Koen De Bosschere: Selecting a Reduced but Representative Workload
14:15-14:45 Stephane Frenot, Tudor Balan: A CPU Resource Consumption Prediction Mechanism for EJB Deployment on a Federation of Servers
15:15-15:45 Octavian Ciuhandu, John Murphy: Reflective QoS-Enabled Load Management Framework for Component-Based Middleware
16:00-17:00 Workshop statement preparation


The organizing committee will disseminate all submissions to the workshop attendees prior to the workshop and select position papers and experience reports to be presented at the workshop. The submissions will also serve as a basis for a joint workshop report, to be published in the Concurrency and Computation: Practice and Experience journal by Wiley.

Statements of interest, position papers and experience reports should be mailed to the organizing committee at oopsla-workshop@d3s.mff.cuni.cz. Any of the DOC, PDF, PS or RTF formats is accepted; for formatting instructions, please see the ACM instructions.

Important dates

Submission deadline: PASSED
Notification of selection for presentation: PASSED
Early conference registration deadline: September 18, 2003
OOPSLA conference: October 26-30, 2003
OOPSLA workshop: October 26, 2003

Organizing committee