Rachid Guerraoui (EPFL Lausanne, Switzerland)

Towards a Theory of Transactional Memory

The Transactional Memory (TM) paradigm is argued to be as easy to use as coarse-grained locking and as efficient as hand-crafted, fine-grained locking. It is thus very appealing for leveraging multi-core architectures, and a large body of work has been dedicated to implementing the TM paradigm and exploring its limitations. Very little work, however, has been devoted to exploring its theoretical ramifications. This tutorial will introduce the TM paradigm, summarize some of the underlying theoretical work, and open new research perspectives for establishing correctness properties and verifying the correctness of TM implementations.
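To make the contrast concrete, the following minimal Python sketch, purely illustrative and not any particular algorithm from the tutorial, places a coarse-grained locked transfer next to an optimistic, retry-based transactional one; the TVar class and its version-based commit scheme are assumptions invented for this example.

    import threading

    class TVar:
        # A shared variable with a version counter (illustrative only).
        def __init__(self, value):
            self.value = value
            self.version = 0
            self.lock = threading.Lock()

    global_lock = threading.Lock()

    def transfer_coarse(src, dst, amount):
        # Coarse-grained locking: trivially correct, but every transfer
        # in the program is serialized behind a single lock.
        with global_lock:
            src.value -= amount
            dst.value += amount

    def transfer_tm(src, dst, amount):
        # Optimistic transaction: read a snapshot, then commit only if
        # neither variable has changed in the meantime; otherwise retry.
        while True:
            v1, b1 = src.version, src.value
            v2, b2 = dst.version, dst.value
            # Lock in a fixed order so concurrent commits cannot deadlock.
            first, second = (src, dst) if id(src) < id(dst) else (dst, src)
            with first.lock, second.lock:    # short commit-time critical section
                if src.version == v1 and dst.version == v2:
                    src.value, dst.value = b1 - amount, b2 + amount
                    src.version += 1
                    dst.version += 1
                    return                   # committed
            # Validation failed: a concurrent commit interfered; retry.

Transfers touching disjoint variables proceed in parallel, while conflicting ones retry; pinning down when such schemes are correct is precisely the kind of question a theory of TM addresses.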

Jaeha Kim (Stanford, US)

Mixed-Signal System Verification: A High-Speed Link Example

This tutorial starts by showing various mixed-signal circuit examples in high-speed links, whose verification cannot be done via the established methods for digital systems and has instead relied on compute-intensive time-domain simulations. While much effort has been spent on extending the prevailing digital verification methods to these mixed-signal systems, this talk examines the properties that designers most want to verify for these circuits and claims that most of them are properties of a linear system model, e.g. gain, bandwidth, and damping factor. This is because most of the mixed-signal circuits in high-speed links are motivated by the need to replace analog circuits with digital ones for ease of process portability. The resulting mixed-signal circuits are still intended to have analog functionalities, which are best described by linear system models. Therefore, the most suitable way to formally verify these circuits is to extend the traditional AC analysis in SPICE. This tutorial will demonstrate possible ways to extend AC analysis to measure, for example, the phase-domain transfer functions of PLLs or the noise transfer functions of sigma-delta ADCs. The tutorial also examines the limitations of linear analysis, namely its inability to detect start-up failures, a problem that can be addressed by leveraging digital verification techniques and coverage concepts.
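To give a flavour of such a linearized analysis, the Python sketch below evaluates a textbook second-order phase-domain PLL model on a frequency grid and reads off two linear-system properties, DC gain and -3 dB bandwidth; the model and the parameter values (zeta, wn) are illustrative assumptions, not figures from the tutorial.

    import numpy as np

    zeta = 0.7                     # damping factor
    wn = 2 * np.pi * 1e6           # natural frequency in rad/s (1 MHz)

    f = np.logspace(4, 8, 2000)    # frequency grid: 10 kHz .. 100 MHz
    s = 1j * 2 * np.pi * f

    # Textbook second-order closed-loop phase transfer function:
    # H(s) = (2*zeta*wn*s + wn^2) / (s^2 + 2*zeta*wn*s + wn^2)
    H = (2 * zeta * wn * s + wn**2) / (s**2 + 2 * zeta * wn * s + wn**2)
    mag_db = 20 * np.log10(np.abs(H))

    dc_gain_db = mag_db[0]                       # ~0 dB: output phase tracks input at DC
    bw = f[np.argmax(mag_db < dc_gain_db - 3)]   # first crossing below -3 dB
    print(f"DC gain {dc_gain_db:.2f} dB, -3 dB bandwidth {bw / 1e6:.2f} MHz")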

Jean Krivine (Harvard Medical School, US)

Representation, Analysis and Simulation of Large Molecular Networks in the Kappa Calculus

The behavior of concurrent systems is the result of the interaction of independent processes or agents that are loosely coupled by means of messages, signals or direct physical contacts. Such systems may have various scales of transaction complexity and dynamics, from several clients communicating with a single server, to topologically decentralized interaction networks involving thousands of protein types in a living cell.

The analysis of such systems is a challenge that requires not only correct modeling of the pertinent agents (the clients, the server, each protein type), but also an understanding of the global properties that arise from their interactions. In computer science, one can usually check these emergent properties against a specification based on a reference system. In biology, the need for such a formal specification, especially in the context of signaling pathways, gave rise to "systems biology". This discipline is now engaged in an unbalanced competition between the accumulation of biological facts, which is growing rapidly thanks to new technologies, and their integration into formal models, which is still rather limited in scope. This gap between understanding the properties of an agent (here, the interaction capabilities of a protein) and its potential role in a global pattern of interest (the tissue, the cell, the signaling cascade) has its correspondence in software verification. It is natural to investigate to what extent the languages and verification techniques of computer science can contribute to the effort of formalization in biology. There are several challenges specific to systems biology that need to be addressed by a proper modeling platform: first, the size of the networks to be modeled imposes strong scalability requirements on both the static (verification) and dynamic (simulation) components of the formalism; second, the need to let the community of biologists use the tools by themselves requires a certain degree of transparency in how biological facts are represented and analyzed.

Among the formalisms attempting to address these issues, this tutorial will focus on Kappa, a simple language for protein networks. We will walk through concepts familiar from software verification, such as abstract interpretation and causality analysis, that can be applied to help biologists assemble, tune and analyze large models of combinatorial systems. We will then present the simulation algorithm that allows one to generate stochastic traces from a Kappa model with a per-event simulation cost that is independent of the dimensions of the model (the number of agents, or their possible combinations). These tools have been implemented by Plectix Biosystems (a spin-off from Harvard Medical School) and are available as a rapidly evolving beta version at www.cellucidate.com.
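As a toy illustration of what a stochastic trace is, the Python loop below runs a plain Gillespie-style simulation over two invented binding/unbinding rules. The real Kappa engine operates on agent/site graphs and, as stated above, achieves a per-event cost independent of the model's combinatorial size, which this naive population-based sketch makes no attempt to reproduce.

    import random

    state = {"A": 1000, "B": 800, "AB": 0}

    # Each rule: (propensity function of the state, population update).
    # Rules and rate constants are made up for this example.
    rules = [
        (lambda s: 1e-3 * s["A"] * s["B"], {"A": -1, "B": -1, "AB": +1}),  # A + B -> AB
        (lambda s: 0.1 * s["AB"],          {"A": +1, "B": +1, "AB": -1}),  # AB -> A + B
    ]

    t, t_end = 0.0, 10.0
    while t < t_end:
        props = [p(state) for p, _ in rules]
        total = sum(props)
        if total == 0:
            break                        # no rule applies: the system is stuck
        t += random.expovariate(total)   # exponential waiting time to the next event
        pick, acc = random.uniform(0, total), 0.0
        for prop, (_, update) in zip(props, rules):
            acc += prop                  # choose a rule proportionally to its rate
            if pick <= acc:
                for species, delta in update.items():
                    state[species] += delta
                break

    print(t, state)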

Joseph Sifakis (VERIMAG/CNRS, France)

Component-based Construction of Heterogeneous Real-time Systems in BIP

We present a framework for the component-based construction of real-time systems. The framework is based on the BIP (Behaviour, Interaction, Priority) semantic model, characterized by a layered representation of components. Compound components are obtained as the composition of atomic components, specified by their behaviour and interface, by using connectors and dynamic priorities. Connectors describe structured interactions between atomic components in terms of two basic protocols: rendezvous and broadcast. Dynamic priorities are used to select among possible interactions, in particular to express scheduling policies.
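The three layers can be illustrated in a few lines of Python: components are state machines with ports (Behaviour), connectors define rendezvous over sets of ports (Interaction), and a rule picks among the enabled interactions (Priority). The producer/consumer components and the priority rule below are invented for this sketch; the BIP language itself is, of course, much richer.

    class Component:
        # An atomic component: a state machine with ports (Behaviour layer).
        def __init__(self, name, transitions, state):
            self.name = name
            self.transitions = transitions   # {(state, port): next_state}
            self.state = state

        def enabled(self, port):
            return (self.state, port) in self.transitions

        def fire(self, port):
            self.state = self.transitions[(self.state, port)]

    producer = Component("producer", {("idle", "put"): "idle"}, "idle")
    consumer = Component("consumer", {("wait", "get"): "busy",
                                      ("busy", "get"): "busy",
                                      ("busy", "done"): "wait"}, "wait")

    # Interaction layer: a rendezvous is enabled iff every port is enabled.
    connectors = [
        ("transfer", [(producer, "put"), (consumer, "get")]),
        ("finish",   [(consumer, "done")]),
    ]

    # Priority layer: when both interactions are enabled (consumer in
    # "busy"), prefer "finish" so the consumer drains before taking more.
    priority = {"finish": 2, "transfer": 1}

    for step in range(4):
        ready = [c for c in connectors
                 if all(comp.enabled(port) for comp, port in c[1])]
        if not ready:
            break
        name, ports = max(ready, key=lambda c: priority[c[0]])
        for comp, port in ports:
            comp.fire(port)              # all ports of the interaction fire atomically
        print(step, name, producer.state, consumer.state)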

The BIP framework has been implemented in a language and a toolset. The BIP language offers primitives and constructs for modelling and composing atomic components described as state machines, extended with data and functions in C. The BIP toolset includes an editor and a compiler for generating, from BIP programs, C++ code executable on a dedicated platform. It also supports simulation and verification of BIP programs using model checking techniques.

BIP supports a three-step, model-based design methodology.

We present the basic theoretical results about BIP, including the modelling of interactions with connectors, the modelling of priorities, incremental verification, and expressiveness. We also present two examples illustrating the methodology, as well as experimental results obtained with the BIP toolset.

Further information is available at: http://www-verimag.imag.fr/~async/bip.php
