Performance evaluation experiments involve substantial technical work and must cope with increasingly complex software and hardware stacks to deliver correct results. We publish reusable elements of our experiments to reduce redundant work and to help avoid experimental errors.

Measurement Methodology

Our research contributions to measurement methodology began with our CORBA benchmarking experiments [7] [8] [9]. A summary of the technical obstacles to correct benchmarking, illustrated in the context of CORBA but easily generalized, is available in [10].

Seeing that many experimental issues persist across projects and platforms, we have also created a compact summary of the technical obstacles to benchmarking Java, presented as a tutorial at ICPE 2015 [5]. Slides and handouts are also available.

We also investigated the overhead of dynamic monitoring in Java [6], formulating practical rules for configuring dynamic performance monitoring so that it collects reliable (sufficiently precise) data. Our experimental data set is freely available.


Tool: Java Microbenchmark Agent (Project Page, Source Repository)
Description: A compact JVMTI agent that can provide microbenchmarks with information on major JVM events, hardware performance counters, and time.
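The agent itself is implemented with JVMTI, but the reason such information matters can be illustrated with a small stand-alone sketch: a measurement that overlaps a garbage collection or a JIT compilation is usually not representative and should be discarded. The sketch below approximates this with the standard java.lang.management API; the class and method names are ours, not part of the agent.

```java
import java.lang.management.CompilationMXBean;
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class JvmEventAwareTimer {
    /** Total collection count across all garbage collectors. */
    static long gcCount() {
        long count = 0;
        for (GarbageCollectorMXBean bean : ManagementFactory.getGarbageCollectorMXBeans()) {
            long c = bean.getCollectionCount();
            if (c >= 0) count += c; // -1 means the counter is unavailable
        }
        return count;
    }

    /**
     * Times one run of the workload. Returns the elapsed nanoseconds,
     * or -1 if a JVM event (GC or JIT compilation) overlapped the run.
     */
    static long timeClean(Runnable workload) {
        CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
        boolean jitMonitored = jit != null && jit.isCompilationTimeMonitoringSupported();
        long gcBefore = gcCount();
        long jitBefore = jitMonitored ? jit.getTotalCompilationTime() : 0;
        long start = System.nanoTime();
        workload.run();
        long elapsed = System.nanoTime() - start;
        boolean disturbed = gcCount() != gcBefore
                || (jitMonitored && jit.getTotalCompilationTime() != jitBefore);
        return disturbed ? -1 : elapsed;
    }

    public static void main(String[] args) {
        long ns = timeClean(() -> {
            double x = 0; // trivial stand-in workload
            for (int i = 0; i < 1_000; i++) x += Math.sin(i);
        });
        System.out.println(ns < 0 ? "measurement disturbed, retry" : ns + " ns");
    }
}
```

A JVMTI agent can report such events with much lower overhead and finer granularity (and can also read hardware performance counters), which is why the agent exists as a native component rather than relying on the management beans shown here.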

Experiment Randomization

On contemporary platforms, many factors that impact the observed performance are determined long before the measurement itself. These include obvious factors, such as the compiler configuration relevant to optimization, but also factors that are difficult to both observe and control, such as the choice of physical memory pages or disk blocks. We show that these effects can influence measurement results significantly and propose more robust methods of processing the measurements [1] [2] [3]. We also show that the effects of memory allocation, which later attracted considerable attention, can be made more predictable by controlling the allocation policies [4].
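The core idea behind the more robust processing can be illustrated with a simplified sketch (not the code from the papers): because a whole execution shares one random initial state, samples from the same execution are correlated, so the benchmark is repeated across independent executions and the confidence interval is computed from the between-execution variance of the per-execution means. The class name, the stand-in workload, and the normal-approximation quantile 1.96 are our assumptions; the papers work with proper random-effects models.

```java
import java.util.Random;

public class RandomSetupBenchmark {
    /**
     * Per-execution means -> { grand mean, half-width of an approximate
     * 95% confidence interval based on the between-execution variance }.
     */
    static double[] summarize(double[] executionMeans) {
        int n = executionMeans.length;
        double grand = 0;
        for (double m : executionMeans) grand += m;
        grand /= n;
        double var = 0;
        for (double m : executionMeans) var += (m - grand) * (m - grand);
        var /= (n - 1); // between-execution variance
        double halfWidth = 1.96 * Math.sqrt(var / n); // normal approximation
        return new double[] { grand, halfWidth };
    }

    public static void main(String[] args) {
        int executions = 10, samples = 1_000;
        double[] means = new double[executions];
        for (int e = 0; e < executions; e++) {
            // Hypothetical stand-in workload: a fresh seed models a new random
            // initial state (new process, new memory layout) per execution.
            Random setup = new Random(e);
            double sum = 0;
            for (int s = 0; s < samples; s++) sum += setup.nextDouble();
            means[e] = sum / samples;
        }
        double[] result = summarize(means);
        System.out.printf("mean %.4f +- %.4f%n", result[0], result[1]);
    }
}
```

Averaging all samples as if they were independent would produce a deceptively narrow interval; aggregating per execution first makes the interval reflect the variability introduced by the random initial state.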


Lubomír Bulej lubomir.bulej<at-sign>d3s.mff.cuni.cz
Vojtěch Horký vojtech.horky<at-sign>d3s.mff.cuni.cz
Petr Tůma petr.tuma<at-sign>d3s.mff.cuni.cz


  • [1]Kalibera T., Bulej L., Tůma P., Benchmark Precision and Random Initial State, In proceedings of the International Symposium on Performance Evaluation of Computer and Telecommunication Systems (SPECTS), Cherry Hill, NJ, USA, SCS, ISBN: 1-56555-300-4, pp. 853-862, July 2005. PDF, PDF
  • [2]Kalibera T., Bulej L., Tůma P., Automated Detection of Performance Regressions: The Mono Experience, In proceedings of the 13th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS), Atlanta, GA, USA, IEEE, ISBN: 0-7695-2458-3, ISSN: 1526-7539, pp. 183-190, September 2005. PDF, PDF, Link
  • [3]Kalibera T., Tůma P., Precise Regression Benchmarking with Random Effects: Improving Mono Benchmark Results, In proceedings of the 3rd European Performance Engineering Workshop (EPEW), Budapest, Hungary, Springer, LNCS 4054, ISBN: 3-540-35362-3, ISSN: 0302-9743, pp. 63-77, June 2006. PDF, PDF, Link
  • [4]Hocko M., Kalibera T., Reducing Performance Non-determinism via Cache-aware Page Allocation Strategies, Accepted at First Joint WOSP/SIPEW International Conference on Performance Engineering, September 2009. PDF, Link
  • [5]Horký V., Libič P., Steinhauser A., Tůma P., DOs and DON'Ts of Conducting Performance Measurements in Java (tutorial), In proceedings of the 6th ACM/SPEC International Conference on Performance Engineering (ICPE), Austin, Texas, USA, ACM, ISBN: 978-1-4503-3248-4, pp. 337-340, January 2015. Link
  • [6]Horký V., Kotrč J., Libič P., Tůma P., Analysis of Overhead in Dynamic Java Performance Monitoring, In Proceedings of the 7th ACM/SPEC International Conference on Performance Engineering (ICPE), Delft, Netherlands, ACM, ISBN: 978-1-4503-4080-9, March 2016. Link, Resources
  • [7]Distributed Systems Research Group, CORBA Comparison Project, Final Project Report, June 1998. PDF
  • [8]Distributed Systems Research Group, CORBA Comparison Project, Project Extension Final Report, August 1999. PDF
  • [9]Tůma P., Buble A., Open CORBA Benchmarking, In proceedings of the 2001 International Symposium on Performance Evaluation of Computer and Telecommunication Systems (SPECTS 2001), Orlando, FL, USA, SCS, July 2001. PDF
  • [10]Buble A., Bulej L., Tůma P., CORBA Benchmarking: A Course With Hidden Obstacles, In proceedings of the 17th International Parallel & Distributed Processing Symposium (IPDPS), Workshop on Performance Modeling, Evaluation and Optimization of Parallel and Distributed Systems (PMEOPDS), Nice, France, IEEE CS, ISBN: 0-7695-1926-1, ISSN: 1530-2075, pp. 279.1, CDROM DATA/W18_PMEO_11.PDF, 6 pg., April 2003. PDF, PDF, Link
Modified on 2016-05-27