Contact : firstname.lastname@example.org
Benoît Barbot (LACL)
Uniform Sampling for Timed Automata with Application to Language Inclusion Measurement
Monte Carlo model checking, introduced by Smolka and Grosu, is an approach to analysing non-probabilistic models using sampling, drawing conclusions with a given confidence interval by applying statistical inference. Though not exhaustive, the method enables verification of complex models, even in cases where the underlying problem is undecidable. In this paper we develop Monte Carlo model checking techniques to evaluate quantitative properties of timed languages. Our approach is based on uniform random sampling of behaviours, as opposed to isotropic sampling, which chooses the next step uniformly at random. The uniformity is defined with respect to the volume measure of timed languages previously studied by Asarin, Basset and Degorre. We improve over their work by employing a zone graph abstraction instead of the region graph abstraction, and by incorporating uniform sampling within a zone-based Monte Carlo model checking framework. We implement our algorithms using the tools PRISM, SageMath and COSMOS, and demonstrate their usefulness on statistical language inclusion measurement in terms of volume. Joint work with Nicolas Basset, Marc Beunardeau, and Marta Kwiatkowska.
Simon Martiel (LSV, ENS Cachan)
Reversible causal graph dynamics
Causal Graph Dynamics extend Cellular Automata to arbitrary, bounded-degree, time-varying graphs. The whole graph evolves in discrete time steps, and this global evolution is required to have a number of physics-like symmetries: shift-invariance (it acts everywhere the same) and causality (information has a bounded speed of propagation). We add a further physics-like symmetry, namely reversibility.
Ludovic Patey (IRIF)
An introduction to the computational analysis of theorems
Many theorems can be viewed as abstract problems. For instance, König's lemma states that every infinite, finitely branching tree admits an infinite path. König's lemma can thus be seen as the problem whose instances are infinite, finitely branching trees, and whose solutions are infinite paths through those trees.
Among these classically true theorems, some are not computably true, in the sense that they admit instances with no solution computable from the instance. In this talk, we will introduce the tools of computable reduction in order to compare the degrees of non-computability of such theorems.
Which theorems are computably true? How can the computational content of one theorem be used to solve another? You will find out… after the break.
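As a sketch of the instance/solution view described above, König's lemma can be written as a problem in the following standard form (a common presentation in computable mathematics, not specific to this talk):

```latex
\mathsf{KL}:\quad
\begin{array}{ll}
\text{Instances:} & \text{infinite, finitely branching trees } T \subseteq \mathbb{N}^{<\omega} \\
\text{Solutions for } T: & \text{sequences } P \in \mathbb{N}^{\omega} \text{ all of whose finite prefixes belong to } T
\end{array}
```

A theorem is then "computably true" when every computable instance has a computable solution; for $\mathsf{KL}$ this fails, which is what makes its computational analysis interesting.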
Luidnel Maignan (LACL)
Global (Graph) Transformations
Most graph rewriting models are “asynchronous”: when many parts of a graph should be rewritten, a rewriting order has to be chosen. They are also often “non-deterministic”: rewriting one part may prevent another, overlapping part from being rewritten, making the different orders non-equivalent. Determinism is achieved only under a kind of mutual exclusion. In the spirit of cellular automata, we design a formal framework for graph rewriting where the whole graph is modified synchronously and determinism is achieved by a notion of “mutual agreement”. The proposed rewriting process is discussed and finally formalized efficiently using the categorical toolkit. This is joint work with Antoine Spicher.
Cécile Gonçalves (LACL)
An introduction to elliptic-curve cryptography
In this talk, we will briefly present the main cryptographic protocols based on elliptic curves, as well as the challenges they raise in terms of security.
Nicolas Gastineau (Univ. Bourgogne Franche-Comté)
On-line algorithms and on-line coloring
In graph theory, an on-line algorithm considers a graph together with an order on its vertices, and takes a decision for each vertex following that order, knowing only information about the previous vertices. In this talk, we focus on the on-line coloring problem in order to illustrate the behavior of such algorithms. We then present Grundy coloring, the best-known on-line coloring algorithm. We present some results about the complexity of decision problems for on-line and Grundy coloring. Finally, we present several theorems about the maximum number of colors that Grundy coloring can use on different classes of graphs.
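The greedy on-line coloring described above (often called First-Fit, whose worst case over vertex orders yields the Grundy number) can be sketched in a few lines of Python. This is an illustration, not the speaker's code:

```python
# On-line First-Fit coloring: vertices arrive one by one, and each receives
# the smallest color not already used by its previously colored neighbors.
# The largest number of colors First-Fit can be forced to use over all
# arrival orders is the Grundy number of the graph.

def first_fit_coloring(vertices, adjacency):
    """Color `vertices` in the given on-line order.

    adjacency: dict mapping each vertex to the set of its neighbors.
    Returns a dict vertex -> color (colors are 0, 1, 2, ...).
    """
    color = {}
    for v in vertices:
        used = {color[u] for u in adjacency[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# On the path a-b-c-d, the arrival order matters: a bad order forces
# 3 colors even though paths are 2-colorable off-line.
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(first_fit_coloring(["a", "b", "c", "d"], adj))  # uses 2 colors
print(first_fit_coloring(["a", "d", "b", "c"], adj))  # uses 3 colors
```

The second order colors both endpoints first with color 0, which then forces color 2 on vertex c.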
Selma Djelloul Naboulsi (UPEC)
Graph classes of bounded treewidth and equational sets
At the beginning of the 1990's, researchers in algorithmic graph theory (re)discovered under different formalisms the notion of treewidth of a graph and its impact on parameterized complexity of many hard problems of graph theory. This gave rise to a flourishing literature by researchers from different fields. Indeed, tree-decompositions are used to describe deep structural properties of classes of graphs as well as to design FPT-algorithms. These are the most known ways treewidth is dealt with. In this talk, we focus on another field where treewidth hides, and which is less known than the previously mentioned two fields. More precisely, we present how algebras of graphs can be defined, how a graph can be viewed as the value of a term over a signature, and how the components of the least solution of a polynomial system over a signature can be related to sets of graphs with bounded treewidth.
Benjamin Monmege (LIF Marseille)
To Reach or not to Reach? Efficient Algorithms for Total-Payoff Games
Quantitative games are two-player zero-sum games played on directed weighted graphs. Total-payoff games—that can be seen as a refinement of the well-studied mean-payoff games—are the variant where the payoff of a play is computed as the sum of the weights. In this talk, I will describe the first pseudo-polynomial time algorithm for total-payoff games in the presence of arbitrary weights. It consists of a non-trivial application of the value iteration paradigm. Indeed, it requires to study, as a milestone, a refinement of these games, called min-cost reachability games, where we add a reachability objective to one of the players. For these games, we give an efficient value iteration algorithm to compute the values and optimal strategies (when they exist), that runs in pseudo-polynomial time. I will also present heuristics able to speed up the computations. Joint work with Thomas Brihaye, Gilles Geeraerts, and Axel Haddad.
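The value iteration idea for min-cost reachability games can be sketched as follows. This is a toy illustration restricted to nonnegative weights on a small graph, where plain iteration converges quickly; handling arbitrary (possibly negative) weights is precisely the non-trivial contribution of the work presented in the talk:

```python
# Value iteration for a min-cost reachability game: player Min wants to
# reach the target with minimal total cost, player Max wants to maximize it.
# Each vertex is owned by one player, who picks the outgoing edge.

INF = float("inf")

def min_cost_reachability(edges, owner, target, n_rounds):
    """edges: dict v -> list of (weight, successor).
    owner: dict v -> 'min' or 'max' (who chooses the move at v).
    Returns dict v -> value of the game from v."""
    V = {v: (0 if v == target else INF) for v in edges}
    for _ in range(n_rounds):
        for v in edges:
            if v == target:
                continue
            costs = [w + V[u] for (w, u) in edges[v]]
            V[v] = min(costs) if owner[v] == "min" else max(costs)
    return V

# Min decides at s; Max decides at a and will pick the expensive edge.
edges = {
    "s": [(1, "a"), (4, "t")],   # Min: go via a (cost 1) or directly (4)
    "a": [(0, "t"), (5, "t")],   # Max: will choose the cost-5 edge
    "t": [],
}
owner = {"s": "min", "a": "max", "t": "min"}
V = min_cost_reachability(edges, owner, "t", n_rounds=10)
print(V["s"])  # 4: going via a would cost 1 + 5 = 6, so Min goes directly
```

The fixed point assigns value 5 to a (Max's choice) and value 4 to s (Min's best response).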
Ilias Garnier (ENS Paris)
Giry and the Machine
We present a general method - the Machine - to analyse and characterise in finitary terms natural transformations between (iterates of) Giry-like functors in the category Pol of Polish spaces. The method relies on a detailed analysis of the structure of Pol and a small set of composable categorical conditions on the domain and codomain functors. We apply the Machine to transformations from the Giry and positive measures functors to combinations of the Vietoris, multiset, Giry and positive measures functors. We also show that for some combinations of these functors, there cannot exist more than one natural transformation between the functors, in particular the Giry monad has no natural transformations to itself apart from the identity. Finally we show how the Dirichlet and Poisson processes can be constructed with the Machine.
Matteo Mio (ENS-Lyon)
Regular Languages of Trees and Probability
The decidability of the SAT(isfiability) problem for most branching-time temporal logics (such as CTL) follows from Rabin’s theorem about the decidability of the Monadic Second Order Theory of Two Successors (S2S). However, the decidability of the SAT problem for probabilistic logics (such as probabilistic CTL) is open. In an attempt to solve this problem it is natural to investigate probability-related variants and extensions of Rabin’s theorem. In this talk I will present some recent results on this line of research.
Rodica Bozianu (LACL)
The complexity of synthesis with a rational environment (Rational Synthesis)
In this talk, we study the computational complexity of the cooperative and non-cooperative rational synthesis problems, as introduced by Kupferman, Vardi and coauthors in recent papers for general LTL objectives. From those papers it follows that the cooperative and non-cooperative rational synthesis problems are 2ExpTime-complete for specifications expressed in linear temporal logic (LTL), thus exactly matching the complexity of classical zero-sum two-player LTL synthesis.
We investigate these problems on multiplayer turn-based games played on graphs, and provide complexity results for the classical omega-regular objectives, namely reachability, safety, Büchi, co-Büchi, parity, Rabin, Streett and Muller objectives. Most of these complexity results are tight and shed light on how to solve those problems optimally.
We also study the computational complexity of solving those games when the number of players is fixed. This parametrized analysis makes sense as the number of components forming the environment may be limited in practical applications.
Joint work with Emmanuel Filiot, Raffaella Gentilini and Jean-François Raskin.
Catalin Dima (LACL)
Entropy Games and Matrix Multiplication Games
Two intimately related new classes of games are introduced and studied: entropy games (EGs) and matrix multiplication games (MMGs). An EG is played on a finite arena by two-and-a-half players: Despot, Tribune and the non-deterministic People. Despot wants to make the set of possible People’s behaviors as small as possible, while Tribune wants to make it as large as possible. An MMG is played by two players that alternately write matrices from some predefined finite sets. One player wants to maximize the growth rate of the product, while the other wants to minimize it. We show that in general MMGs are undecidable in quite a strong sense. On the positive side, EGs correspond to a subclass of MMGs, and we prove that such MMGs and EGs are determined, and that the optimal strategies are simple. The complexity of solving such games is in NP ∩ coNP.
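The quantity the two MMG players fight over is the growth rate of an ever-longer matrix product. The following toy sketch (an illustration, not from the talk) measures that growth rate for one fixed alternating strategy pair:

```python
# Growth rate of a matrix product: multiply a fixed alternating sequence
# of matrices and take the n-th root of the resulting norm. In an MMG the
# two players would each choose their matrix adversarially at every turn.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def growth_rate(matrices, length):
    """max-norm of matrices[0]*matrices[1]*... (cyclically), to the power 1/length."""
    P = [[1, 0], [0, 1]]  # 2x2 identity
    for i in range(length):
        P = mat_mul(P, matrices[i % len(matrices)])
    norm = max(abs(x) for row in P for x in row)
    return norm ** (1.0 / length)

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print(growth_rate([A, B], 20))  # ~1.59, tending to the golden ratio as length grows
```

Here alternating A and B yields the product (AB)^k with Fibonacci-like entries, so the per-step growth rate approaches the golden ratio.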
Based on joint work with Eugene Asarin, Aldric Degorre and Florian Horn (IRIF-Paris Diderot), and Victor Kozyakin (IITP Moscow).
Cecile Gonçalves (LACL)
A Point Counting Algorithm for Cyclic Covers of the Projective Line
We present a Kedlaya-style point counting algorithm for cyclic covers of the projective line over a finite field. This algorithm generalizes the Gaudry-Gürel algorithm for superelliptic curves to a more general class of curves, and has essentially the same complexity. Our practical improvements include a simplified algorithm exploiting the automorphism of the curve, refined bounds on the p-adic precision, and an alternative pseudo-basis for the Monsky-Washnitzer cohomology which leads to an integral matrix in some cases, which we will make precise. Each of these improvements can also be applied to the original Gaudry-Gürel algorithm. Our algorithm allowed us to compute Weil polynomials of some large-genus cyclic covers.
Adrien Basso-Blandin (IBISC Lab)
A behavioural approach to biology: towards automating the design of synthetic functions
Biology commands a vast body of knowledge, but the disparity and fragmentation of its data make it difficult to apprehend systems as a whole. This is especially true in synthetic biology, whose goal is to design, from existing components, new functionalities that do not exist in nature: partial knowledge of these components entails serious safety and security risks when assembling new organisms.
In this context, we propose a major line of work centred on the definition of a tower of formal models allowing, at the most abstract level, a behavioural description of system functionalities and, at the most concrete level, a description of the mechanistic interactions between elementary components.
The long-term objective is to offer the biology community a set of decision-support tools enabling the aggregation of biological knowledge into a single knowledge representation, the inference of new knowledge, and both formal and in-vitro validation and verification during the design of new biological functions.
Edon Kelmendi (LaBRI)
Limit-sure reachability in a class of stochastic semiperfect-information games
We study two-player stochastic games with reachability conditions where the maximizer has no information whereas the minimizer has perfect information. The question of limit-sure reachability is the following: is it true that for all epsilon>0 and all finite-memory strategies tau (for the minimizer) there exists a finite word w (for the maximizer) such that the probability of reaching the final states is larger than 1-epsilon when the game unravels according to w and tau? This question is undecidable in general, the reason being that if the minimizer has no choice at all, such a game is equivalent to a probabilistic finite automaton, for which the value 1 problem is undecidable. Building on previous work on probabilistic finite automata, we identify a robust class of games where limit-sure reachability is decidable. We also show that there are games where the optimal strategies of the minimizer require infinite memory, which is slightly surprising since the minimizer has perfect information.
Olivier Serre (LIAFA)
How Good Is a Strategy in a Game With Nature?
We consider games with two antagonistic players — Éloïse (modelling a program) and Abélard (modelling a byzantine environment) — and a third, unpredictable and uncontrollable player, that we call Nature. Motivated by the fact that the usual probabilistic semantics very quickly leads to undecidability when considering either infinite game graphs or imperfect information, we propose two alternative semantics that lead to decidability where the probabilistic one fails: one based on counting and one based on topology.
Christophe Bahadoran (Université de Clermont-Ferrand)
Phase transitions for road-traffic models with disorder
The TASEP (Totally Asymmetric Simple Exclusion Process), a particle system introduced by molecular biologists in 1968, is one of the basic models of non-equilibrium statistical mechanics. It is regarded as a simplified model of road traffic, but can also be seen as a system of queues in series or as a directed percolation model. In this talk I will discuss hydrodynamic limits (the macroscopic description of the system) and the condensation and traffic-jam phenomena caused by spatial or vehicular disorder.
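The TASEP dynamics are simple enough to simulate in a few lines. The sketch below (an illustration under simplifying assumptions, not the speaker's model) uses random sequential updates on a ring:

```python
# TASEP on a ring: each site holds at most one particle (1 = car, 0 = empty),
# and a particle may hop one step to the right only if the target is empty.
import random

def tasep_step(sites, rng):
    """One sweep of random sequential updates on a ring."""
    n = len(sites)
    for _ in range(n):
        i = rng.randrange(n)
        j = (i + 1) % n
        if sites[i] == 1 and sites[j] == 0:
            sites[i], sites[j] = 0, 1

rng = random.Random(42)
road = [1] * 5 + [0] * 15   # a jam of 5 cars at the start of a 20-site ring
for _ in range(100):
    tasep_step(road, rng)
print(sum(road))  # 5: the exclusion rule moves cars but conserves their number
```

Starting from a jam, repeated sweeps spread the cars out, while the total number of cars is conserved — the macroscopic density profiles of such simulations are what hydrodynamic limits describe.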
Anatol Slissenko (LACL)
Graph Isomorphism: the History Continues
In November 2015, László Babai (University of Chicago) announced that graph isomorphism can be decided in quasi-polynomial time. This would be a major breakthrough in the complexity of this problem, the first since the beginning of the 1980s. His result has not yet been fully checked: the algorithm is very complicated (a manuscript of 85 pages is available) and uses non-trivial group theory as well as previous results, especially those due to E. Luks and L. Babai himself. In the talk I plan to explain the problem and the history of research on this subject (restricted to the history leading to Babai's result), and perhaps to outline some of the difficulties Babai had to overcome.
Mathieu Hoyrup (LORIA)
Reading a program versus running it: what is the difference?
What can we learn about a function if it is given to us: as a program computing the function, or as a black box that lets us query the function's values on any input?
Having the program gives at least as much information as having access to the black box. Does it give more? If so, what kind of information?
This question is one of the most fundamental problems of theoretical computer science and has given rise to a great deal of work, starting with Turing's. I will present the historical results as well as recent developments. The research area is computability theory.
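The contrast between the two modes of access can be made concrete in a few lines. In this sketch (an illustration of the intuition, not of any specific theorem), the "program" is a piece of source text we can read, while the "black box" only answers evaluation queries:

```python
# Program access vs black-box access to the same function.
program_text = "def f(n):\n    return 0\n"

namespace = {}
exec(program_text, namespace)
f = namespace["f"]  # the black box: we may only call it

# Reading the program: syntactic facts are available immediately.
print("return 0" in program_text)  # True — read off the text directly

# Running it: any finite number of queries reveals only finitely many values.
samples = [f(n) for n in range(1000)]
print(all(v == 0 for v in samples))  # True, yet no finite sample alone can
                                     # certify that f is the constant 0
```

The asymmetry illustrated here — a syntactic property checked at a glance versus a semantic property no finite set of queries can settle — is the starting point of the comparison discussed in the talk.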
Alexis Bès (LACL)
Logic, automata and substitutions
In this survey talk, we consider the sets of integers recognizable by a finite automaton when integers are read in base k >= 2. We show that these sets admit several characterizations, notably in terms of substitutions and of logical definability. We will also see that these properties are sensitive to the choice of the base k (Cobham's theorem). We then focus on the logical approach, showing how some combinatorial and algorithmic problems in the area can be tackled using definability and decidability arguments (and automated theorem provers).
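A classic example of a base-k recognizable set (a standard textbook illustration, not taken from the talk) is the set of multiples of 3 read in base 2, accepted by a 3-state automaton:

```python
# A DFA recognizing the multiples of 3 written in base 2: the state tracks
# the value modulo 3 of the prefix read so far. Reading one more bit b
# replaces the value v by 2*v + b, so the state update is (2*state + b) % 3.

def accepts_multiple_of_3(binary_word):
    state = 0  # value of the bits read so far, modulo 3
    for bit in binary_word:
        state = (2 * state + int(bit)) % 3
    return state == 0

print([n for n in range(20) if accepts_multiple_of_3(format(n, "b"))])
# → [0, 3, 6, 9, 12, 15, 18]
```

The same set is recognizable in base 4 but not, by Cobham's theorem, in base 3 by any finite automaton in a non-trivial way — divisibility sets are recognizable in every base, which is why sensitivity to the base concerns more exotic sets such as the powers of 2.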
Sebastian Barbieri (ENS-Lyon)
The domino problem for structures between Z and Z^d
We define the domino problem for subshifts where the symbols are restricted to subsets of Z^d. More specifically, we focus on subsets which have a self-similar structure defined by a family of substitutions. In this setting we exhibit non-trivial families of structures on which the domino problem is decidable, and others on which it is undecidable. Among these, we show that the Sierpinski triangle has a decidable domino problem, while the Sierpinski carpet has an undecidable domino problem.
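The self-similar structures in question are generated by substitutions. The following sketch (an illustration of the idea, not the talk's formal setting) builds the Sierpinski triangle by a 2D substitution of constant size 2x2:

```python
# The Sierpinski triangle as a self-similar structure: each filled cell is
# replaced by a 2x2 block with one corner removed, each empty cell by an
# empty 2x2 block. Iterating from a single filled cell yields ever finer
# approximations of the triangle.

def substitute(grid):
    out = []
    for row in grid:
        top, bottom = [], []
        for cell in row:
            if cell:
                top += [1, 0]
                bottom += [1, 1]
            else:
                top += [0, 0]
                bottom += [0, 0]
        out += [top, bottom]
    return out

grid = [[1]]
for _ in range(5):
    grid = substitute(grid)
print(sum(map(sum, grid)))  # 243 = 3**5 filled cells in a 32x32 grid
```

Each filled cell produces exactly 3 filled cells, so after n steps the structure occupies 3^n of the 4^n positions — the fractal support on which the domino problem is then posed.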
Daniele Varacca (LACL)
On the Buridan’s Principle by Leslie Lamport
Abstract to be updated
Nicolas Markey (LSV, ENS Cachan)
ATL with strategy contexts
The alternating-time temporal logic (ATL) was defined 15 years ago as an extension of CTL for expressing and verifying properties of multi-player games. However, in order to inherit nice algorithmic properties of CTL, ATL has limited expressiveness. In this talk, I will begin with presenting our extension of ATL with strategy contexts, which contrary to ATL can express rich properties with interactions between strategies. I will then present QCTL, the extension of CTL with quantification over atomic propositions, and explain how it can be used to get algorithms for ATL with strategy contexts. This talk is based on joint works with Thomas Brihaye, Arnaud Da Costa Lopes, and François Laroussinie.
Andreï Romashchenko (LIRMM, Montpellier)
Quasiperiodicity and non-computability in tilings.
We study tilings of the plane that combine strong properties of different nature: combinatorial and algorithmic. We prove the existence of a tile set that accepts only quasiperiodic and non-recursive tilings. We characterise the classes of Turing degrees that can be represented by quasiperiodic tilings. We also show that every minimal 1-dim subshift can be implemented as a subaction of a 2-dim minimal SFT.
Reem Yassawi
Decidability of topological conjugacy between constant-length substitution shifts
Let $(X,\sigma)$ and $(Y,\sigma)$ be two shifts: a (topological) conjugacy between these two systems is a homeomorphism $\Phi:X\rightarrow Y$ that commutes with the shift. In other words, a conjugacy is a bijective cellular automaton between X and Y. Given two shifts, we would like to know whether a conjugacy between them exists. We prove that if $(X,\sigma)$ and $(Y,\sigma)$ are generated by constant-length substitutions, then there is an algorithm that enumerates all conjugacies between the two systems. This is joint work with Ethan Coven and Anthony Quas.
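The best-known example of a constant-length substitution shift (a standard illustration, not taken from the talk) is generated by the Thue-Morse substitution, which has constant length 2:

```python
# The Thue-Morse substitution 0 -> 01, 1 -> 10 has constant length 2.
# Iterating it from the seed "0" generates longer and longer prefixes of
# the Thue-Morse word, whose orbit closure under the shift is the
# Thue-Morse substitution shift.

def iterate_substitution(rules, seed, n):
    """Apply the substitution `rules` to `seed`, n times."""
    word = seed
    for _ in range(n):
        word = "".join(rules[c] for c in word)
    return word

thue_morse = {"0": "01", "1": "10"}
w = iterate_substitution(thue_morse, "0", 5)
print(w)  # 01101001100101101001011001101001
```

A conjugacy between two such shifts is given by a sliding block code (a cellular automaton rule) that is bijective; the talk's result is that for constant-length substitutions all such codes can be effectively enumerated.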
Mathieu Sassolas (LACL)
Polynomial Interrupt Timed Automata
Modeling real-world systems implies taking into account the continuous elapsing of time along with the discrete set of states the system can operate in, yielding so-called hybrid systems. However, verification of such systems is mostly undecidable, and finding decidable fragments has been an ongoing research topic for the last 25 years. In this work we propose a decidable model of hybrid systems taking into account constraints expressed by polynomials instead of the more widespread linear constraints. The decision procedure relies on the cylindrical algebraic decomposition of the reals.
This talk is based on joint work with Béatrice Bérard, Serge Haddad, Claudine Picaronny, and Mohab Safey El Din.
Mehdi Haddad (LACL)
Inference of sensitive information in data integration systems
The inference problem, in an access control context, refers to the ability of a malicious user to synthesize sensitive information from a combination of non-sensitive information. This problem is prominent in data integration systems, where a mediator is defined that provides a unique entry point to several heterogeneous sources.
In this talk we describe an incremental methodology able to tackle the inference problem in a data integration context. This methodology has three phases. The first phase, the propagation phase, combines the source policies to generate a preliminary policy at the mediator level. The second phase, the detection phase, characterizes the role of semantic constraints in inducing inferences about sensitive information. In this phase we also introduce a graph-based approach able to enumerate all indirect accesses that could lead to sensitive information. To deal with the previously detected indirect accesses, we introduce the reconfiguration phase, which provides two solutions: the first can be implemented at design time, the second at runtime.
Benoit Monin (UPEC LACL)
A small history of K-triviality
The Kolmogorov complexity of a string is, informally, the length of the smallest program that produces this string. We write C(s) = n to mean that the smallest program producing the string s has length n.
An infinite binary sequence X is said to be C-trivial if the Kolmogorov complexity of its prefixes is minimal, that is, if there is a constant d such that for every n we have C(X↾n) < C(n) + d (here X↾n denotes the first n bits of X). It is clear that any computable (infinite binary) sequence is C-trivial. Chaitin proved that the converse holds:
A sequence X is computable iff it is C-trivial. (1)
Chaitin also successfully used Kolmogorov complexity to give a formal definition of the intuitive idea we have of a random sequence. To do so, he needed a variant of the plain Kolmogorov complexity, called “prefix Kolmogorov complexity” and denoted by K. Using this prefix Kolmogorov complexity K, he defined a sequence Z to be random if there is a constant d such that for every n we have K(Z↾n) > n - d. The intuition is that the prefix Kolmogorov complexity should be maximal for prefixes of random sequences. This definition of randomness is still the most studied today, for many reasons that we shall not detail during the talk.
Chaitin conjectured that (1) also holds for the prefix Kolmogorov complexity K, that is, that X is computable iff X is K-trivial (that is, there is d such that for every n we have K(X↾n) < K(n) + d). Solovay later refuted the conjecture by constructing a non-computable K-trivial sequence A. The notion of K-triviality was born, and would go on to reveal many surprises through its numerous characterizations and its connections with algorithmic randomness.
After introducing the main concepts in more detail, we will try during this talk to give some explanations and intuitions about the work done by various people on K-triviality over the last 15 years.
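Kolmogorov complexity itself is not computable, but compressed size provides a computable upper-bound proxy that conveys the intuition behind C-triviality and randomness. The following sketch is only an illustration of that intuition, not a definition:

```python
# Compressed size as a (crude, computable) stand-in for Kolmogorov
# complexity: a highly regular string has a short description, while a
# "random"-looking string of the same length is essentially incompressible.
import random
import zlib

regular = b"0" * 10000                                   # trivially describable
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(10000))  # pseudo-random bytes

print(len(zlib.compress(regular)))  # a few dozen bytes: near-minimal "complexity"
print(len(zlib.compress(noisy)))    # close to 10000: no exploitable regularity
```

A C-trivial sequence is one whose every prefix compresses as well as the all-zero string does here; a random sequence is one whose prefixes behave like the noisy string, with no description shorter than the string itself.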
Benoît Barbot (LACL - UPEC)
Building Power Consumption Models from Executable Timed I/O Automata Specifications
We develop a novel model-based hardware-in-the-loop (HIL) framework for optimising energy consumption of embedded software. Software components are specified as networks of parameterised timed input/output automata, which are translated into executable code run on a microcontroller connected to an executable plant model and a power monitor. We use timed Petri nets as an intermediate representation of the executable specification, which facilitates efficient code generation and fast simulations. The framework is able to produce optimal values of timing parameters that minimise energy usage, without compromising a given safety requirement, together with a probabilistic power consumption model derived from real measurement data. Our framework uniquely combines the advantages of rigorous distributed real-time specifications with accurate power measurements and online power model construction.
LACL, Département d'Informatique