Project Colloquium



Upcoming (to be confirmed):

Mar 22: Fuyuki Ishikawa

Thu 15 March 2018, 16:30–18:00

ERATO MMSD Takebashi Site Common Room 3

Thorsten Wissman (...),  


Thu 1 March 2018, 16:30–18:00

ERATO MMSD Takebashi Site Common Room 3

Étienne André (University Paris 13),  
Parametric timed automata: theory and applications

Critical real-time systems must be formally verified to prevent any undesired behavior. When some timing constants are known with only a limited precision, traditional timed model-checking techniques may not apply anymore.

We use the formalism of parametric timed automata. Parametric timed automata are very expressive, and decision problems are notoriously hard (in fact, most interesting decision problems are undecidable). On the one hand, we review decidability results and propose new decidable subclasses. On the other hand, we adopt a more pragmatic approach and show that parametric timed automata allow us to model and verify distributed real-time systems; using the IMITATOR software, we derive best- and worst-case execution times for a real-time system designed by Thales that is subject to uncertainty in some task periods.

We also briefly report on recent techniques to speed up verification using abstractions, and on distributed parametric model checking.
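The parameter-synthesis idea behind the talk can be illustrated on a deliberately tiny schedulability problem. The task parameters below are invented, and the exhaustive search over integer valuations merely stands in for the symbolic algorithms of tools like IMITATOR, which handle dense parameter domains:

```python
# Hypothetical two-task fixed-priority system: we search for the integer
# values of the parametric period/deadline p of the low-priority task
# for which its worst-case response time fits within p.

C1, T1 = 2, 5   # high-priority task: WCET 2, period 5 (assumed values)
C2 = 3          # low-priority task: WCET 3, period/deadline p (the parameter)

def response_time(p):
    # classic response-time iteration, aborted once the deadline p is missed
    r = C2
    while True:
        nxt = C2 + -(-r // T1) * C1   # ceil(r / T1) preemptions by task 1
        if nxt == r or nxt > p:
            return nxt
        r = nxt

feasible = [p for p in range(1, 16) if response_time(p) <= p]
print(feasible)   # the synthesized "good" parameter valuations
```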

Thu 22 February 2018, 16:30–18:00

ERATO MMSD Takebashi Site Common Room 3

Takeo Uramoto (Tohoku University),  
Recent developments in algebraic language theory

The purpose of this talk is to introduce a recent (since 2008) active research trend in algebraic language theory. This field, among other approaches to formal languages, is characterized by its methodology: one reduces combinatorial / logical problems on formal languages (say, regular languages) to more tractable problems on algebraic structures (resp., finite semigroups). Work in the last decade has shed new light on this field (and its methodology) from the viewpoint of Stone-type duality theorems and, most importantly, has indicated several new research directions extending the classical frameworks. In this talk, based on our own work, we introduce this research trend (and its background motivation) and then carefully discuss its possible future directions.
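The language-to-semigroup reduction can be made concrete by computing the transition monoid of a DFA. A toy example, invented for illustration: the two-state automaton for "even number of a's", whose transition monoid is the two-element group, reflecting that the language is recognized by Z/2Z:

```python
states = (0, 1)
gens = {'a': (1, 0),   # reading 'a' swaps the states (parity flip)
        'b': (0, 1)}   # reading 'b' acts as the identity

def compose(f, g):
    # the state function for "read f's word, then g's word"
    return tuple(g[f[s]] for s in states)

monoid = {(0, 1)}                 # start from the identity (empty word)
changed = True
while changed:                    # close under composition with generators
    changed = False
    for f in list(monoid):
        for g in gens.values():
            h = compose(f, g)
            if h not in monoid:
                monoid.add(h)
                changed = True

print(sorted(monoid))             # the transition monoid of the DFA
```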


Thu 15 February 2018, 16:30–18:00

ERATO MMSD Takebashi Site Common Room 3

Georg Moser (University of Innsbruck),  
Uniform Resource Analysis

In this talk, I'll describe how rewriting techniques can be successfully employed to build state-of-the-art automated resource analysis tools which compare favourably to other approaches. Furthermore, I'll sketch the genesis of a uniform framework for resource analysis, emphasising success stories without hiding intricate weaknesses, and how those can be overcome in future work.

Thu 28 December 2017, 10:00–11:30

ERATO MMSD Takebashi Site Common Room 3

Toru Takisaka (Kyoto University),  
On Use of Martingales to Refute Almost Reachability

It is well known that supermartingales can be used to prove almost-sure termination of probabilistic transition systems. Recently, Chatterjee et al. [1] proposed a way of using them to refute almost-sure termination. A crucial tool in their proofs is Azuma's inequality, which roughly states that one does not have much chance of winning money in a fair gamble. In this talk we carry out an argument similar to that of [1], using a different inequality in place of Azuma's. The resulting martingale technique is much simpler, and complete: finding a witnessing supermartingale for a given non-almost-surely-terminating system is always possible.

[1] K. Chatterjee, P. Novotny and D. Zikelic. Stochastic Invariants for Probabilistic Termination. In Proc. of POPL 2017.
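The "fair gamble" reading of Azuma's inequality is easy to check empirically. The sketch below simulates a ±1 martingale and compares the observed tail probability with the Azuma-Hoeffding bound P(S_n ≥ t) ≤ exp(-t²/(2n)); the parameters are arbitrary:

```python
import math
import random

rng = random.Random(0)
n, t, trials = 100, 30, 20000

hits = 0
for _ in range(trials):
    s = sum(rng.choice((-1, 1)) for _ in range(n))  # a fair +-1 gamble
    hits += (s >= t)

empirical = hits / trials
bound = math.exp(-t * t / (2 * n))   # Azuma-Hoeffding tail bound
print(empirical, bound)              # the empirical tail sits below the bound
```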

Thu 30 November 2017, 16:30–18:45

ERATO MMSD Takebashi Site Common Room 3


Jérémy Dubut (NII),  
Some thoughts on bisimilarity using open morphisms

In this talk, I will present some work I did on open morphisms during my PhD. I will start with an introduction to this categorical formalism from Joyal et al.'s work on bisimilarity, defined as the existence of a span of morphisms having certain lifting properties with respect to "execution shapes". I will give examples, and relational and logical characterisations of this bisimilarity.

Next, I would like to present two quite different aspects of this theory that I used during my PhD. First, a general framework, called accessible categorical models, where the open-morphism formalism is particularly nice. This is a case where it is possible to define a nice notion of trees. In this framework, we show that bisimilarity can be precisely captured by the existence of a bisimulation relation on executions, but also that there is a fine notion of unfolding, which acts very similarly to a universal covering. Secondly, a particular case of the theory of Joyal et al.: the bisimilarity of diagrams, namely functors from a small category to a specified category of values. I will show that this case is equivalent to the existence of a nice notion of bisimulations, which allows us to translate the problem of bisimilarity to a problem of isomorphisms in the category A, which is useful for (un)decidability. I will finally show how to translate the general setting of Joyal et al., where the category of paths is small, into diagrams, and I will describe the relation between the different bisimilarities involved.
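For ordinary labelled transition systems, the open-morphism bisimilarity of Joyal et al. coincides with classical Park-Milner strong bisimilarity, which can be computed by naive partition refinement. A small sketch on a made-up transition system:

```python
trans = {  # state -> set of (label, successor) pairs (invented example)
    'p':  {('a', 'p1'), ('a', 'p2')},
    'p1': {('b', 'p')},
    'p2': {('b', 'p')},
    'q':  {('a', 'q1')},
    'q1': {('b', 'q')},
}

def bisimilar(s, t):
    part = {x: 0 for x in trans}          # start with a single block
    while True:
        # a state's signature: the labelled blocks it can step into
        sig = {x: frozenset((l, part[y]) for (l, y) in trans[x])
               for x in trans}
        blocks, new = {}, {}
        for x in trans:                   # split blocks by signature
            new[x] = blocks.setdefault(sig[x], len(blocks))
        if new == part:                   # partition is stable
            return part[s] == part[t]
        part = new

print(bisimilar('p', 'q'), bisimilar('p', 'p1'))
```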


Akihisa Yamada (NII),  

Mathematics for Complexity Analysis

IsaFoR is a library of Isabelle-formalized theorems for validating program analyzers, originally for term rewriting. My previous work extended the library towards more practical programming languages and also enriched it with deep results from mathematics. In this talk, I will explain how complexity analysis leads us to formalizing algebraic numbers, Berlekamp factorization, Jordan normal forms, and the Perron-Frobenius theorem.
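One reason these theorems matter for complexity analysis: in matrix interpretations of rewrite systems, derivation heights are governed by the growth of matrix powers, which is controlled by the spectral radius (Perron-Frobenius for nonnegative matrices). A tiny numeric check with a nonnegative matrix whose spectral radius is the golden ratio:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 1], [1, 0]]      # nonnegative; its entries' powers grow like phi^n
P = A
for _ in range(29):
    P = matmul(P, A)      # P = A^30; the entries are Fibonacci numbers

growth = matmul(P, A)[0][0] / P[0][0]   # growth factor of successive powers
print(growth)             # close to the Perron eigenvalue (1 + sqrt 5) / 2
```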

Thu 9 November 2017, 16:30–18:45

ERATO MMSD Takebashi Site Common Room 3


Jurriaan Rot (Radboud University),  
Traces and Triangles

In the theory of coalgebras, trace semantics can be defined in various distinct ways, including through algebraic logics, the Kleisli category of a monad or its Eilenberg-Moore category. I will talk about recent joint work with Bart Jacobs, which elaborates two new unifying ideas in the theory of coalgebraic trace semantics: 1) previous approaches can be placed and connected in so-called state-and-effect triangles, that arise in the semantics of programs; 2) coalgebraic trace semantics is naturally presented in terms of corecursive algebras. This perspective puts the different approaches under a common roof, and allows us to derive conditions under which they coincide.


Kenta Cho (Radboud University),  

String diagrams in probability theory

String diagrams are a graphical language for monoidal categories, which has become very popular in the categorical quantum mechanics initiated by Abramsky and Coecke. In this talk I will explain that string diagrams are also useful for (classical) probability theory. The Kleisli categories of the distribution monad and of the Giry monad give concrete interpretations of string diagrams for discrete probability and for general measure-theoretic probability, respectively. Topics include disintegration, Bayesian inversion, and conditional independence. The talk is based on joint work with Bart Jacobs; see the preprint.
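In the discrete case, sequential composition of string-diagram wires is Kleisli composition for the distribution monad: probabilities multiply along paths and sum over intermediate values. A minimal sketch (the "coin" and "noisy channel" below are invented):

```python
def kleisli(f, g):
    # f : A -> Dist(B), g : B -> Dist(C); distributions as dicts
    def h(a):
        out = {}
        for b, p in f(a).items():
            for c, q in g(b).items():
                out[c] = out.get(c, 0.0) + p * q   # sum over the middle wire
        return out
    return h

coin = lambda _: {'H': 0.5, 'T': 0.5}                        # fair coin
noisy = lambda b: {b: 0.9, ('T' if b == 'H' else 'H'): 0.1}  # noisy copy

obs = kleisli(coin, noisy)(None)
print(obs)
```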

Thu 26 October 2017, 16:30–18:45

ERATO MMSD Takebashi Site Common Room 3


Zhenya Zhang (SOKENDAI, ERATO MMSD),  
Localization of Linearizability Faults on the Coarse-Grained Level

Linearizability is an important correctness criterion that guarantees the safety of concurrent data structures. Due to the nondeterminism of concurrent executions, reproducing and localizing a linearizability fault remain challenging. Existing work mainly focuses on model checking the thread-schedule space of a concurrent program on a fine-grained (state) level, and hence suffers from severe state-space explosion. This paper presents a tool called CGVT that builds a small test case sufficient for reproducing a linearizability fault. Given a possibly long history that has been detected to be non-linearizable, CGVT first locates the operations causing the linearizability violation, and then synthesizes a minimal test case for further investigation. Moreover, we present several optimization techniques to improve the effectiveness and efficiency of CGVT. We have applied CGVT to 10 concurrent objects, the linearizability of some of which was previously unknown. The experiments show that CGVT is powerful and efficient enough to build test cases well suited to fine-grained fault localization.
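The property CGVT works with can be illustrated by the naive, brute-force check it improves upon: a history of a shared register is linearizable iff some total order of its operations respects real-time precedence and is a legal sequential behaviour. The histories below are invented:

```python
from itertools import permutations

def linearizable(history, init=0):
    # each operation is (start, end, kind, value)
    for order in permutations(history):
        # reject orders violating real-time precedence: an op placed
        # later must not have finished before an earlier op started
        if any(a[1] < b[0] for i, b in enumerate(order)
               for a in order[i + 1:]):
            continue
        val, ok = init, True
        for (_, _, kind, v) in order:   # replay as a sequential register
            if kind == 'write':
                val = v
            elif val != v:              # a read must see the current value
                ok = False
                break
        if ok:
            return True
    return False

# write(1) overlaps a read returning 1: linearizable
h1 = [(0, 4, 'write', 1), (1, 2, 'read', 1)]
# the read returns 2, which is never written: not linearizable
h2 = [(0, 4, 'write', 1), (1, 2, 'read', 2)]
print(linearizable(h1), linearizable(h2))
```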


David Sprunger (NII, ERATO MMSD),  

Bisimulation and precongruence approaches to coinductive predicates

In this talk, I will take a very low-level view of coalgebras to contrast two very similar approaches to behavioral equivalence. Bisimulation has a long history stretching back to Milner and Park's use in the context of concurrent systems. Often bisimulation is used in concert with a restriction on the transition system type to ensure certain nice properties of bisimulations hold, namely that the type functor preserve weak pullbacks. I will use several representation results due to Adamek and various collaborators to bypass the typical categorical presentation of coalgebras and give a view closer to "bare metal" on the weak pullback preservation condition and why it is often needed when using bisimulations.

A lesser known approach to behavioral equivalence involves a type of relation called a precongruence, introduced by Aczel and Mendler in 1989. Precongruences are subtly different from bisimulations for system types which do not preserve weak pullbacks. However, they retain many of the important properties that the weak-pullback-preservation property gives us for bisimulations. I will show how a precongruence approach to two coinductive predicates--behavioral equivalence and language inclusion--obviates the need for this property.

Thu 12 October 2017, 16:30–18:45

ERATO MMSD Takebashi Site Common Room 3


Krzysztof Czarnecki (University of Waterloo),  
Designing and Testing Autonomous Vehicles: What the Engineers are Doing and Thinking

How do you design and then test a vehicle that is supposed to drive itself.  In any environment. Without getting into an accident.  Ever?  This presentation will explore autonomous vehicles from an engineering standpoint.  What is an autonomous vehicle and how (and when) will vehicles evolve to reach this state?  Crash course on how autonomous vehicles work.  What are the challenges in programming and then testing automated vehicle technologies? What are the legal and ethical issues that arise?

The second part of the presentation will focus on the ongoing work at the University of Waterloo on safety assurance and validation and verification of autonomous vehicles.


Sean Sedwards (National Institute of Informatics),  

Lightweight Verification of Markov Decision Processes

Markov decision processes (MDP) are a useful abstraction of discrete nondeterminism that can be used to optimise both discrete- and continuous-time models. MDPs may be verified numerically (i.e. optimised) in polynomial time with respect to the size of their state space; however, this size increases exponentially with the number of state variables. Popular existing approximate verification techniques tend to address this "state explosion" only partially, and have other problems.

I will present a recently developed technique for "lightweight" verification of Markov decision processes, based on an O(1) memory representation of history-dependent schedulers. This allows schedulers to be chosen at random and constructed on the fly, thus facilitating scalable verification on massively parallel computational architectures.
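A toy rendition of the O(1)-memory scheduler idea: each scheduler is just an integer seed, and its history-dependent choice at any point is recomputed by hashing the seed together with the history, so schedulers can be sampled and replayed without ever being stored. The two-action MDP below is invented for illustration:

```python
import random

ACTIONS = {'s0': ['safe', 'risky'], 'goal': [], 'bad': []}
STEP = {('s0', 'safe'):  [('goal', 0.9), ('bad', 0.1)],
        ('s0', 'risky'): [('goal', 0.5), ('bad', 0.5)]}

def choose(seed, history, acts):
    h = seed
    for i in history:                 # deterministic hash of seed + history:
        h = (h * 31 + i) % (1 << 31)  # O(1) memory, no stored scheduler
    return acts[h % len(acts)]

def estimate(seed, runs=2000):
    rng, bad = random.Random(0), 0
    for _ in range(runs):
        state, history = 's0', ()
        while ACTIONS[state]:
            acts = ACTIONS[state]
            act = choose(seed, history, acts)
            history += (acts.index(act),)
            r, acc = rng.random(), 0.0
            for nxt, prob in STEP[(state, act)]:
                acc += prob
                if r < acc:
                    state = nxt
                    break
        bad += (state == 'bad')
    return bad / runs

# sampling seeds = sampling schedulers; report the worst one found
worst = max(estimate(seed) for seed in range(10))
print(worst)   # near 0.5, the maximum probability of reaching 'bad'
```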

Fri 22 September 2017, 15:00–17:00

NII meeting room 2010 

Masaaki Konishi (Research Manager, ERATO MMSD),
Practical issues and measures of search-based testing for engine control software

My former job was verification and validation application development for vehicle control software at an automotive company. Automotive control software is getting larger and more complicated to meet customers' demands and various kinds of regulations, and it takes a huge amount of time and manpower to ensure software quality. In this situation, "falsification" (search-based testing) using closed-loop simulation is expected to be a promising approach. However, there still remain some practical issues in applying it and expanding its usage. In this talk I will introduce the issues and measures on two items: 1) metrics to measure the extent of the search, and 2) evaluation of realistic input scenarios.

Fri 15 September 2017, 15:00–17:15

NII meeting room 2010 


Xavier Rival (INRIA/CNRS/ENS Paris),  
Towards a library of abstract domains to describe memory properties

In this talk, we will present MemCAD, an abstract interpreter that integrates a library of abstract domains to represent properties of memory states. It can describe structures, dynamic structures, and arrays. It can also interact with abstract domains that represent numeric properties and constraints over set symbols. It relies on interfaces that allow adding novel abstract domains and defining combination operators, so as to build up advanced abstract domains from basic ones.

We will present the structure of the analyzer and highlight the main abstract domains that it implements. We will also highlight the core analysis algorithms, and show a few examples of static analyses using these domains.
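The "combine simple domains into richer ones" idea can be sketched in miniature with an interval abstract domain supporting join and widening, the kind of basic numeric building block an analyzer composes with memory domains. This is a generic textbook construction, not MemCAD's actual code:

```python
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi          # bounds may be +-inf

    def join(self, other):                 # least upper bound of two states
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def widen(self, other):                # jump to +-inf on any growth,
        lo = self.lo if other.lo >= self.lo else float('-inf')
        hi = self.hi if other.hi <= self.hi else float('inf')
        return Interval(lo, hi)            # forcing loop analyses to converge

    def __repr__(self):
        return f'[{self.lo}, {self.hi}]'

a, b = Interval(0, 1), Interval(0, 10)
print(a.join(b), a.widen(b))
```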


Kokichi Futatsugi (National Institute of Advanced Industrial Science and Technology),  Specification Verification with CafeOBJ

A specification is a description of a model, and its verification shows that the model has desired properties, by deducing those properties from the specification. As the proverb says, "all models are wrong, but some are useful", and specification verification is an effective way of getting to the useful models.

CafeOBJ is an executable algebraic specification language system which can be used as a specification verification system. A CafeOBJ specification is a set of equations and transition rules (including conditional ones), and the deduction is equational inference. The equational inference is implemented through reduction, by interpreting the specification's equations as left-to-right rewriting/reduction rules. This transparent view of CafeOBJ's specification verification relaxes the inherent difficulty of specification verification. The CafeOBJ language also has powerful constructs for modules, including parameterized ones, which make it easy to write well-formed, reusable specifications and proof scores (scripts).

This talk gives an overview of specification verification with CafeOBJ, focusing on recent developments about "specification calculus" and "refinement of proof constants".
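The "equations as left-to-right rewrite rules" view can be sketched with Peano addition, specified by add(0, y) = y and add(s(x), y) = s(add(x, y)). The toy reducer below applies these as rewrite rules; it is an illustration of the idea, not CafeOBJ itself:

```python
def reduce(term):
    # terms are nested tuples: ('0',), ('s', t), ('add', t1, t2)
    if term[0] == 'add':
        x, y = reduce(term[1]), reduce(term[2])
        if x == ('0',):
            return y                          # add(0, y) -> y
        if x[0] == 's':
            return ('s', reduce(('add', x[1], y)))  # add(s(x), y) -> s(add(x, y))
    if term[0] == 's':
        return ('s', reduce(term[1]))
    return term

one = ('s', ('0',))
two = ('s', ('s', ('0',)))
print(reduce(('add', two, one)))   # 2 + 1 reduces to s(s(s(0)))
```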

Fri 1 September 2017, 13:00–15:00

NII meeting room 2010 

Shaukat Ali (Simula Research Laboratory),  Uncertainty-based Test Case Generation and Minimization for Cyber-Physical Systems: A Multi-Objective Search-based Approach

Cyber-Physical Systems (CPSs) typically operate in highly indeterminate environmental conditions, which require the development of testing methods that must explicitly consider uncertainty in test design, test generation, and test optimization. Towards this direction, we propose uncertainty-wise test case generation and test case minimization strategies that rely on test ready models explicitly specifying subjective uncertainty. We propose two test case generation strategies and four test case minimization strategies based on Uncertainty Theory and multi-objective search. These strategies include a novel methodology for designing and introducing indeterminacy sources in the environment during test execution and a novel set of uncertainty-wise test verdicts. We performed an extensive empirical study to select the best algorithm out of eight commonly used multi-objective search algorithms, for each of the four minimization strategies, with five use cases of two industrial CPS case studies. The minimized sets of test cases obtained with the best algorithm for each minimization strategy were executed on the two real CPSs. The results showed that our best test strategy managed to observe 51% more uncertainties due to unknown indeterminate behaviors of the physical environment of the CPSs as compared to the rest of the test strategies. In addition, the same test strategy managed to observe 118% more unknown uncertainties as compared to the unique number of known uncertainties.

Fri 25 August 2017, 15:00–17:00

NII meeting room 1901 

Yusuke Kawamoto (National Institute of Advanced Industrial Science and Technology),  Hybrid method for program analysis -- combining statistical analyses with symbolic techniques

In modeling and analyzing a probabilistic system, the statistical approach is often useful for efficiently evaluating quantitative aspects of the system's behavior. It is usually considered a black-box testing approach, in which the analyst runs the system many times and records the execution traces to construct an approximate model of the system. Due to this random sampling of the system, statistical analysis provides only approximate estimates, though it can evaluate the accuracy and error of the analysis, e.g. by statistical hypothesis testing. Although a large sample size may be required to obtain accurate estimates, it can be reduced by employing symbolic techniques from program analysis.

In this talk we introduce basic ideas of the statistical analysis of probabilistic programs, and of a hybrid method that combines statistical analysis with symbolic techniques. We then show how the hybrid method (presented in [1,2]) is used to estimate entropy-based properties in the context of quantitative information flow. In particular, we present the mathematical details of how to correct the bias of the estimate and how to compute a confidence interval to evaluate the accuracy of the estimation.

[1] Yusuke Kawamoto, Fabrizio Biondi and Axel Legay.

Hybrid Statistical Estimation of Mutual Information for Quantifying Information Flow.

In Proc. of 21st International Symposium on Formal Methods (FM 2016), Lecture Notes in Computer Science, Vol.9995, pp.406-425, November 2016.

[2] Fabrizio Biondi, Yusuke Kawamoto, Axel Legay, and Louis-Marie Traonouez.

HyLeak: Hybrid Analysis Tool for Information Leakage.

In Proc. of the 15th International Symposium on Automated Technology for Verification and Analysis (ATVA 2017), Lecture Notes in Computer Science, October 2017, To appear.
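The purely statistical half of this story can be sketched with a plug-in estimate of mutual information from samples of a toy channel, compared against the exact value. The channel (a binary symmetric channel with flip probability 0.1) and all parameters are invented; the papers above add symbolic precomputation, bias correction and confidence intervals on top of this baseline:

```python
import math
import random

rng = random.Random(7)
n, flip = 50000, 0.1

counts = {}
for _ in range(n):
    x = rng.randrange(2)                       # secret input, uniform
    y = x if rng.random() > flip else 1 - x    # noisy observable output
    counts[(x, y)] = counts.get((x, y), 0) + 1

def H(ps):                                     # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in ps if p > 0)

pxy = {k: c / n for k, c in counts.items()}
px = [sum(p for (xv, yv), p in pxy.items() if xv == b) for b in (0, 1)]
py = [sum(p for (xv, yv), p in pxy.items() if yv == b) for b in (0, 1)]

est = H(px) + H(py) - H(list(pxy.values()))    # plug-in I(X; Y)
exact = 1 - H([flip, 1 - flip])                # BSC with uniform input
print(est, exact)
```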

Fri 4 August 2017, 15:00–17:00

NII meeting room 2010 

Masako Kishida (National Institute of Informatics),  Introduction to event- and self-triggered control

Many recent control systems are connected via networks. This is because using a network for communication among the system's elements reduces deployment cost and increases flexibility compared with conventional wired communications. However, the use of a network often forces the system to perform control tasks with limited resources such as communication channel bandwidth and battery power. Thus, the importance of resource-aware control approaches that satisfy given performance requirements has increased.

This talk starts with an introduction to control theory, where we will go over the basics and concepts of control theory as well as some approaches to different types of control problems. Following the introduction, recent studies of event-triggered and self-triggered control, which are designed to be resource-efficient, will be presented.
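The event-triggered idea can be sketched with a toy discrete-time simulation of the scalar plant dx/dt = a*x + b*u (all numbers below are invented): the controller only re-samples the state when it has drifted more than a threshold from the last sampled value, so communication happens at events rather than at every step:

```python
a, b, k, dt = 0.5, 1.0, -1.5, 0.1   # unstable plant, stabilizing gain
threshold = 0.05
x, x_held, updates = 1.0, 1.0, 0

for _ in range(200):
    if abs(x - x_held) > threshold:  # event: transmit the current state
        x_held = x
        updates += 1
    u = k * x_held                   # control computed from the last sample
    x += dt * (a * x + b * u)        # Euler integration step

# the state stays near zero while the controller communicated
# far fewer than 200 times
print(round(x, 3), updates)
```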

Fri 14 July 2017, 15:00–17:00

NII meeting room 2010 

Hayato Waki (Kyushu University),  Computational aspects of non-strictly feasible semidefinite programs

Semidefinite programming (SDP) is convex minimization over matrix variables. Efficient algorithms to solve SDPs have been proposed, e.g. the ellipsoid method and interior-point methods, and applications in combinatorial optimization, polynomial optimization, control, statistics, etc. are well known. Strict feasibility of an SDP guarantees the celebrated strong duality theorem and the convergence of algorithms that solve the SDP. However, non-strictly feasible SDPs often appear in these applications, and the behavior of the algorithms can then be numerically unstable. We talk about computational aspects of such SDPs. In particular, we present some properties of non-strictly feasible SDPs arising in control theory, and a numerical remedy for such SDPs.
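A classic textbook example of the sensitivity near loss of strict feasibility (not from the talk): minimize t subject to [[t, 1], [1, s]] being positive semidefinite. For a 2x2 matrix, PSD means nonnegative diagonal and determinant, so the optimum is t = 1/s, which blows up as s approaches 0 and the problem loses strict feasibility:

```python
def min_t(s):
    # smallest t with [[t, 1], [1, s]] PSD, from det = t*s - 1 >= 0;
    # at s = 0 the feasible set is empty / the value is unbounded
    return float('inf') if s <= 0 else 1.0 / s

for s in (1.0, 1e-2, 1e-4, 0.0):
    print(s, min_t(s))   # optimal value degrades as strict feasibility is lost
```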

Fri 30 June 2017, 15:00–17:00

NII meeting room 2010 

Georgios Fainekos (Arizona State University),  Beyond Requirements Falsification : Semi-formal methods and tools for the analysis of Cyber-Physical Systems

Correct-by-design synthesis methods for Cyber-Physical Systems (CPS) are still in their infancy for CPS with complex physical dynamics. For that reason, a combination of design theories for simpler systems and/or ad-hoc design approaches is utilized. Hence, numerous design and implementation errors are discovered while CPS are operational in the field. Such errors can have catastrophic effects on human life and on the economy. Over the last few years, requirements-guided falsification methods have proven to be a practical approach to the verification problem of industrial-size CPS. However, requirements falsification is just one component of the necessary tooling for the development of safe and reliable CPS. In this talk, we provide an overview of our research in providing support for all stages of CPS development, from formal requirements elicitation and mining, to system conformance, to on-line monitoring. Most of our methods have been implemented in a Matlab (TM) toolbox called S-TaLiRo (System's TemporAl LogIc Robustness). Finally, in this talk, we demonstrate that S-TaLiRo can provide answers to challenge problems from the automotive industry.
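The quantitative semantics that drives falsification can be shown in its simplest form: the robustness of "always (speed < bound)" over a sampled trace is the worst-case margin, and a negative value exhibits a falsifying trace. The traces and bound below are invented, and this is not S-TaLiRo's actual API:

```python
def robustness_always_lt(trace, bound):
    # min over time of (bound - value): negative iff the property fails
    return min(bound - v for v in trace)

ok_trace = [80, 95, 110, 100]
bad_trace = [80, 95, 121, 100]   # briefly exceeds the bound
print(robustness_always_lt(ok_trace, 120),
      robustness_always_lt(bad_trace, 120))
```

A falsifier then searches over input signals to minimize this robustness value, driving it below zero.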


Georgios Fainekos is an Associate Professor at the School of Computing, Informatics and Decision Systems Engineering (SCIDSE) at Arizona State University (ASU). He is director of the Cyber-Physical Systems (CPS) Lab and he is currently affiliated with the NSF I/UCRC Center for Embedded Systems (CES) at ASU. He received his Ph.D. in Computer and Information Science from the University of Pennsylvania in 2008 where he was affiliated with the GRASP laboratory. He holds a Diploma degree (B.Sc. & M.Sc.) in Mechanical Engineering from the National Technical University of Athens and an M.Sc. degree in Computer and Information Science from the University of Pennsylvania. Before joining ASU, he held a Postdoctoral Researcher position at NEC Laboratories America in the System Analysis & Verification Group. He is currently working on Cyber-Physical Systems (CPS) and robotics. In particular, his expertise is on formal methods, logic, control theory, artificial intelligence, and optimization. His research has applications to automotive systems, medical devices, autonomous (ground and aerial) robots and human-robot interaction (HRI). In 2013, Dr. Fainekos received the NSF CAREER award. He was also recipient of the SCIDSE Best Researcher Junior Faculty award for 2013 and of the 2008 Frank Anger Memorial ACM SIGBED/SIGSOFT Student Award. Two of his conference papers have been nominated for student best paper awards. In 2016, Dr. Fainekos was the program co-Chair for the ACM International Conference on Hybrid Systems: Computation and Control.

Fri 23 June 2017, 15:00–17:00

NII meeting room 2010 

Gidon Ernst (ERATO MMSD),  Flashix: A verified, crash-tolerant flash file system

In the Flashix project, a file system for flash memory has been developed. It is proven functionally correct and tolerates system crashes, such as abrupt power cuts, at any point during its execution. The development approach is based on incremental and modular refinement. This talk gives an overview of the verification approach, the challenges of the practical development, and the results.

Fri 16 June 2017, 15:00–17:00

NII meeting room 2010 

Kenji Tei (NII),  Models@run.time for self-adaptive systems: A controller synthesis approach

In many application domains, continuous operation is a desirable attribute for software-intensive systems. As the environment or system requirements change, so the system should change and adapt without stopping or unduly disturbing its operation. There is, therefore, a need for sound engineering techniques that can cope with dynamic change.

In this talk, I will present a models@run.time-based approach for self-adaptive systems. Models are held and used even at runtime, for analysis of and reasoning about the software system, in order to guarantee adaptations. I will give an overview of the architecture and key techniques of this approach.

 In particular, I will talk about the problem of dynamic update of controllers in reactive systems when the specification (environment assumptions, requirements and interface) of the current system changes. I will present a general approach to specifying correctness criteria for dynamic update and a technique for automatically computing a controller that handles the transition from the old to the new specification, assuring that the system will reach a state in which such a transition can correctly occur. Indeed, using controller synthesis I will show how to automatically build a controller that guarantees both progress towards update and safe update.

Thu 15 June 2017, 14:30–20:00

NII meeting room 1208-1210 

ERATO MMSD Project Kick-off Workshop

14:30-14:35 Introduction
14:35-14:55 Kengo Kido
           Nonstandard Static Analysis
           (Metatheoretical Transfer via Logic)
15:00-15:20 Natsuki Urabe
           Coalgebraic Unfolding in Forward and Backward Simulations
           (Metatheoretical Transfer via Category Theory)
15:25-15:45 Shin-ya Katsumata
           Simulations and Fibrational Liftings
           (Categorical Foundation for System Approximation)
15:45-16:00 Break
16:00-16:05 Interlude
16:05-16:25 Takamasa Okudono
           Generating Polynomial Interpolant with Semidefinite Programming
           (Numerical Optimization and Formal Methods)
16:30-16:50 Masaki Waga
           Efficient Timed Pattern Matching by Automata-Based Skipping
           (Automata Theory in Light-Weight Formal Methods)
16:55-17:15 Fuyuki Ishikawa
           Insights into Verification of Complex Systems from Software Testing Research
           (Formal Methods and Intelligence)

18:00-20:00 Continued Discussion

Fri 26 May 2017, 15:00–17:00

NII meeting room 2010 

Natsuki Urabe (The University of Tokyo & ERATO MMSD),  Categorical Liveness Checking by Corecursive Algebras

Final coalgebras as "categorical greatest fixed points" play a central role in the theory of coalgebras. Somewhat analogously, most proof methods studied therein have focused on greatest fixed-point properties like safety and bisimilarity. Here we make a step towards categorical proof methods for least fixed-point properties over dynamical systems modeled as coalgebras. Concretely, we seek a categorical axiomatization of well-known proof methods for liveness, namely ranking functions (in nondeterministic settings) and ranking supermartingales (in probabilistic ones). We find an answer in a suitable combination of coalgebraic simulation (studied previously by the authors) and corecursive algebra as a classifier for (non-)well-foundedness.
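The ranking-function method being axiomatized here has a very concrete instance: for the terminating loop "while x > 0: x -= 1 or x -= 2", the function r(x) = x is nonnegative wherever the loop continues and strictly decreases along every transition, certifying termination. An exhaustive check on a finite window of states (a toy illustration, not the categorical construction of the talk):

```python
def successors(x):
    # the loop body as a nondeterministic transition function
    return [x - 1, x - 2] if x > 0 else []

def rank(x):
    # candidate ranking function: nonnegative and strictly decreasing
    return x

ok = all(rank(x) >= 0 and all(rank(y) < rank(x) for y in successors(x))
         for x in range(1, 100))
print(ok)
```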


Fri 28 April 2017, 15:15–17:00

NII meeting room 2010 

Sean Sedwards (ERATO MMSD), Rare Event Simulation for Statistical Model Checking

Statistical model checking (SMC) avoids the intractable number of states associated with numerical model checking by estimating the probability of a property from a number of execution traces (simulations). Rare events nevertheless pose an important challenge to SMC because interesting properties, such as bugs and optima, are often very rare and occur infrequently in simulations. A key objective for SMC is thus to reduce the number and length of simulations necessary to produce an estimate with a given level of statistical confidence. In this talk I will describe two variance reduction approaches that address this problem: importance sampling and importance splitting. Importance sampling weights the probabilities of a system to make a rare property occur more frequently in simulations, then compensates the results by the weights. Importance splitting decomposes a rare property into the product of sub-properties that are less rare and easier to simulate. I will outline the challenges of using these techniques in SMC and highlight some practical solutions, including the recent successful application of importance sampling to stochastic timed automata.
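A minimal importance-sampling sketch (a generic binomial example, not the stochastic timed automata application): the rare tail P[Bin(20, 0.05) ≥ 8] is estimated by sampling from the tilted distribution Bin(20, 0.4), whose mass sits on the rare event, and reweighting by the likelihood ratio:

```python
import math
import random

n, p, q, t = 20, 0.05, 0.4, 8
rng = random.Random(42)

est, runs = 0.0, 20000
for _ in range(runs):
    k = sum(rng.random() < q for _ in range(n))   # draw from Bin(n, q)
    if k >= t:
        # likelihood ratio of the true law w.r.t. the tilted one
        est += (p / q) ** k * ((1 - p) / (1 - q)) ** (n - k)
est /= runs

exact = sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
            for k in range(t, n + 1))
print(est, exact)   # the weighted estimate tracks the tiny exact tail
```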

Thu 13 April 2017, 17:00–18:00

Takebashi Site Common Room 3

Zhoulai Fu (UC Davis), A Mathematical Method for Reasoning about Numerical Code

I will present Mathematical Execution (ME), a new, unconventional method for reasoning about numerical code. The idea is to reduce the problem of testing/verifying a program to the problem of minimizing a derived representing function. ME is particularly efficient for numerical code; it directs input space exploration by executing only the representing function, which avoids static or symbolic reasoning about the program semantics. We have applied ME to several problem instances: (1) satisfiability solving, (2) boundary value analysis, (3) coverage-based testing, and (4) path reachability. The results are extremely promising. During my presentation I will explain the unified method, ME, behind these results, using (1) and (2) above as case studies.
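The core idea can be shown in miniature: to find an input making a numerical condition hold, build a representing function that is zero exactly on satisfying inputs and positive elsewhere, then minimize it by unconstrained search. The condition, starting point and toy pattern search below are invented stand-ins for ME's actual minimization backend:

```python
def condition(x):
    return x * x - 4.0 == 0.0    # the branch we want to drive to "true"

def representing(x):
    r = x * x - 4.0              # zero exactly on satisfying inputs,
    return r * r                 # positive everywhere else

def minimize(f, x0):
    # derivative-free pattern search: probe left/right, halve the step
    # whenever neither direction improves
    x, step = x0, 8.0
    while step > 1e-9:
        moved = False
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x, moved = cand, True
        if not moved:
            step /= 2.0
    return x

found = minimize(representing, 7.3)
print(found, representing(found))   # an (approximate) satisfying input
```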

Fri 7 April 2017, 15:00–16:00

Takebashi Site Common Room 3

Ichiro Hasuo (NII & ERATO MMSD), The ERATO MMSD Project: an Overview

Goals, Strategies and Tactics of the project.