System Behavior Models and Verification – NSWI101

Model Checking
Time and Location: Winter Semester 2017/2018
Lab: ???
Guaranteed by: Department of Distributed and Dependable Systems
Winter Term: 2/2, credit and exam (Zk+Z)
Lecturers: František Plášil
e-mail: frantisek.plasil<at-sign>
Jan Kofroň
e-mail: jan.kofron<at-sign>
Lab: Jan Kofroň
e-mail for HW: nswi101<at-sign>
Information in SIS: NSWI101

Basic concepts of behavior description of parallel and distributed systems. Equivalence checking and model checking — techniques and tools.


  • Practical examples of behavior modeling and verification
    • The SPIN model checker (developed at Bell Labs), which has been used successfully since 1989 for the analysis of communication and cryptographic protocols, distributed algorithms, and parts of OS kernels (e.g., process schedulers)
    • The nuXmv (SMV) symbolic model checker, based on Ordered Binary Decision Diagrams
    • The UPPAAL model checker
  • Mathematical structures for behavior modeling: labeled transition systems, Kripke structures
  • Timed automata
  • Specification of system properties using temporal logic
  • Basic verification tasks: equivalence checking and model checking
    • Decidability and complexity (of equivalence checking and model checking) depending on the type of model
    • Software tools for equivalence checking and model checking
  • Bounded model checking, probabilistic model checking
  • Open issues in formal verification: infinite-state systems, state explosion problem
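To give a flavor of the basic verification task listed above, the following minimal Python sketch performs explicit-state reachability analysis of a safety property, which is the core of what explicit-state tools such as SPIN do internally. It is illustrative only: all names are made up for this example, real model checkers compress state storage and handle full temporal-logic properties via automata, and the exhaustive enumeration here is exactly where the state explosion problem mentioned above arises.

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state model checking of a safety property.

    Breadth-first search over all reachable states; returns None if
    `invariant` holds in every reachable state, otherwise a shortest
    counterexample path from `initial` to a violating state.
    """
    parent = {initial: None}          # visited set + backpointers
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            # Reconstruct the counterexample path via backpointers.
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in successors(state):
            if nxt not in parent:     # explore each state only once
                parent[nxt] = state
                queue.append(nxt)
    return None                       # invariant holds everywhere

# Toy system: a counter that either increments modulo 8 or resets to 0.
# The invariant "counter < 4" is violated; BFS yields a shortest trace.
cex = check_invariant(
    initial=0,
    successors=lambda s: [(s + 1) % 8, 0],
    invariant=lambda s: s < 4,
)
print(cex)  # [0, 1, 2, 3, 4]
```

The returned path plays the role of the counterexample trace a model checker reports, which is what makes the technique useful for debugging and not only for proving correctness.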


The purpose of the lab is to provide students with hands-on experience with verification tools (SPIN, SMV, UPPAAL), higher-level behavior specification languages (process algebras, behavior protocols), and temporal logics (LTL, CTL).

There will be two assignments (one requiring approximately 8 hours of homework, the other about an hour). The assignments are to be submitted via e-mail to the homework address listed above.

Note: 10 % of your score will be deducted for every calendar day your assignment is late. Consequently, no assignment will be accepted more than 10 calendar days past its due date.


Final grades will be determined by the quality of the homework assignments and the result of the final exam, weighted as follows:

  • 55 % Assignments (homework)
  • 45 % Final exam


  • P. Regan, S. Hamilton: NASA's Mission Reliable, IEEE Computer, vol. 37, no. 1, Jan 2004
  • G. J. Holzmann: The Spin Model Checker, Addison-Wesley, 2003
  • E. M. Clarke, Jr., O. Grumberg, D. A. Peled: Model Checking, MIT Press, 2002
  • J. A. Bergstra, A. Ponse, S. A. Smolka: Handbook of Process Algebra, Elsevier, 2001
  • R. Milner: Communication and Concurrency, Prentice Hall, 1989
  • C. Stirling: Modal and Temporal Properties of Processes, Springer, 2001
  • F. Plasil, S. Visnovsky: Behavior Protocols for Software Components, IEEE Transactions on Software Engineering, vol. 28, no. 11, Nov 2002 (link)
  • J. Adamek, F. Plasil: Component Composition Errors and Update Atomicity: Static Analysis, Journal of Software Maintenance and Evolution: Research and Practice, 2005 (link)
  • D. Engler: Static Analysis versus Software Model Checking for Bug Finding (paper, slides; other information can be found here)
  • J. Esparza: Software Reliability (slides; other information can be found here [text of the web pages in German, referenced documents in English])
  • B. Nielsen: LTS-Based Testing and IOCO (slides; other information can be found here)
  • T. Latvala: Reactive Systems (slides)
Modified on 2017-07-20