# Seminars 2018-19

Contact: seminar@lacl.fr
# Mondays, from 2pm - UPEC CMC - Room P2-131

### May 27, 2019

*TBA*

**Stéphane Le Roux**
(LSV - ENS Cachan)

### May 20, 2019

*On the semantics of higher-order probabilistic programs*

**Jean Goubault-Larrecq**
(LSV - ENS Cachan)

### May 13, 2019

*Continuous models of computation: computability, complexity, universality*

**Amaury Pouly**
(IRIF - CNRS)

### May 6, 2019

*Algebra, tilings and Nivat*

**Etienne Moutot**
(ENS-Lyon)

### April 15, 2019

*The complexity of mean payoff games using universal graphs*

**Nathanael Fijalkow**
(LABRI - Université de Bordeaux)

### April 8, 2019

*The recurrence function of a random Sturmian word*

**Pablo Rotondo**
(LIGM)

### April 1, 2019

*Ontology-Mediated Query Answering with OWL 2 QL Ontologies: Combined Complexity and Succinctness of Rewritings*

**Meghyn Bienvenu**
(CNRS - LABRI)

### March 25, 2019

*Completeness for Identity-free Kleene Lattices*

**Amina Doumane**
(Université de Varsovie)

### March 18, 2019

*Aperiodic tilings and coding of Z^2-actions on the torus*

**Sébastien Labbé**
(LABRI - Université de Bordeaux)

### February 18, 2019

*The power of programs over monoids taken from some small varieties of finite monoids*

**Nathan Grosshans**
(ENS)

TBA

It seems pretty obvious to add random choice to your preferred higher-order functional language. One can then give it an operational semantics in the form of an infinite Markov chain whose states are machine configurations, and with easily understandable rules.

Denotational semantics provide helpful invariants to reason about programs, and Jones and Plotkin’s probabilistic powerdomain (1990) models random choice elegantly. One can then interpret higher-order probabilistic programs in the category of dcpos (directed-complete partial orders), and that works perfectly well… except that some dcpos are rather pathological, and that prevents us from proving all the theorems we would like to have. As a case in point, it is unknown whether Fubini’s theorem holds on all dcpos, which means that drawing x then y at random is not known to be equivalent to drawing y then x.
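For finite, discrete distributions the two sampling orders agree trivially; the open question above concerns general dcpos, not this finite setting. A minimal Python sketch of the finite case (all names are illustrative; this is in no way the powerdomain construction):

```python
from itertools import product

def joint(dist_x, dist_y):
    """Joint distribution of two independent finite distributions,
    drawing from dist_x first, then dist_y.
    Each distribution is a dict mapping value -> probability."""
    out = {}
    for (x, px), (y, py) in product(dist_x.items(), dist_y.items()):
        out[(x, y)] = out.get((x, y), 0.0) + px * py
    return out

dist_x = {0: 0.5, 1: 0.5}
dist_y = {0: 0.25, 1: 0.75}

xy = joint(dist_x, dist_y)  # draw x, then y
# draw y, then x, and swap the pairs back for comparison
yx = {(x, y): p for (y, x), p in joint(dist_y, dist_x).items()}

assert xy == yx  # Fubini holds trivially in the finite case
```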

Such problems do not occur with so-called continuous dcpos, but then we must face the Jung-Tix problem (1998): we do not know of any category of continuous dcpos that can handle both higher-order features and the probabilistic powerdomain.

We will show that there is a simple way of getting around the Jung-Tix problem, relying on a variant of Levy’s call-by-push-value paradigm (2003), and provided we also include a form of demonic non-deterministic choice (related to must-termination, operationally).

We will argue that the language satisfies adequacy: on programs of base type (int), the denotational semantics computes a subprobability distribution whose mass at any given number n is exactly the minimal probability that the output of the program will be n.

If we have time, we will then study full abstraction, namely the relation between denotational (in)equality and the observational preorder. Our language is not fully abstract. One reason is expected: the absence of parallel conditionals. Another has surfaced more recently, and that is the absence of so-called statistical termination testers. With both added, however, our language is fully abstract.

In 1941, Claude Shannon introduced a continuous-time analog model of computation, namely the General Purpose Analog Computer (GPAC). The GPAC is a physically feasible model in the sense that it can be implemented in practice through the use of analog electronics or mechanical devices. It can be proved that the functions computed by a GPAC are precisely the solutions of a special class of differential equations where the right-hand side is a polynomial. Analog computers have since been replaced by their digital counterparts. Nevertheless, one can wonder how the GPAC compares to Turing machines.
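As a concrete, purely illustrative instance of such a polynomial ODE: the system x' = -y, y' = x started at (1, 0) computes cosine and sine. The Euler discretisation below is only a numerical sketch of this class of equations, not the GPAC model itself:

```python
import math

def gpac_sin_cos(t, steps=100_000):
    """Integrate the polynomial ODE system x' = -y, y' = x
    (a GPAC-definable system) with the explicit Euler method.
    Its exact solution from (x, y)(0) = (1, 0) is (cos t, sin t)."""
    h = t / steps
    x, y = 1.0, 0.0
    for _ in range(steps):
        # both right-hand sides are polynomials in the state: -y and x
        x, y = x - h * y, y + h * x
    return x, y

x1, y1 = gpac_sin_cos(1.0)
assert abs(x1 - math.cos(1.0)) < 1e-3
assert abs(y1 - math.sin(1.0)) < 1e-3
```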

A few years ago, it was shown that Turing-based paradigms and the GPAC have the same computational power. However, this result did not shed any light on what happens at the level of computational complexity. In other words, analog computers make no difference as to what can be computed; but maybe they could compute faster than a digital computer. A fundamental difficulty of continuous-time models is to define a proper notion of complexity. Indeed, a troubling problem is that many models exhibit the so-called Zeno’s phenomenon, also known as space-time contraction.

In this talk, I will present results from my thesis that make several fundamental contributions to these questions. We show that the GPAC has the same computational power as the Turing machine, at the complexity level. As a side effect, we also obtain a purely analog, machine-independent characterization of P and of Computable Analysis.

I will present some recent work on the universality of polynomial differential equations. We show that, when we impose no restrictions at all on the system, it is possible to build a fixed equation that is universal in the sense that it can approximate arbitrarily well any continuous curve over R, simply by changing the initial condition of the system.

If time allows, I will also mention some recent applications of this work showing that chemical reaction networks are strongly Turing complete with the differential semantics.

Nivat’s conjecture states that every configuration (colouring of Z^2) of low complexity (meaning that the number of patterns appearing in it is “small”) is necessarily periodic.

In 2015, Michal Szabados and Jarkko Kari published their first article using a new approach to attack this conjecture: an algebraic approach.

Their idea is to represent a configuration as a formal power series; by studying the structure of certain objects attached to it (such as polynomial ideals), they manage to use theorems from algebra to get closer to Nivat’s conjecture.

In this talk, I will present the work I carried out with Jarkko Kari as a continuation of Michal Szabados’s thesis. I will present two theorems that use these algebraic tools to make further progress towards Nivat’s conjecture, in two different directions: the first shows that Nivat’s conjecture holds for a certain class of subshifts, while the second proves the decidability of the domino problem for subshifts of low complexity (a result that Nivat’s conjecture would imply almost immediately).
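To make the low-complexity hypothesis concrete: the rectangular complexity P(m, n) counts the distinct m × n patterns of a configuration, and Nivat’s conjecture asserts that P(m, n) ≤ mn for some m, n forces periodicity. A small Python sketch on a (trivially periodic) checkerboard configuration; the periods used to bound the scan are an assumption hard-coded for this example:

```python
def pattern_complexity(config, m, n):
    """Number of distinct m x n patterns in a Z^2 configuration given
    by a function config(i, j). We scan one fundamental domain only,
    assuming `config` is periodic with the periods hard-coded below."""
    PX, PY = 2, 2  # assumed horizontal and vertical periods of `config`
    patterns = set()
    for i in range(PX):
        for j in range(PY):
            # read off the m x n pattern whose corner is at (i, j)
            pat = tuple(config(i + a, j + b)
                        for a in range(m) for b in range(n))
            patterns.add(pat)
    return len(patterns)

checkerboard = lambda i, j: (i + j) % 2
# The checkerboard has only 2 distinct 2x2 patterns, below m*n = 4:
# low complexity, and indeed the configuration is periodic.
assert pattern_complexity(checkerboard, 2, 2) == 2
```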

TBA

TBA

TBA

TBA

TBA

The computational model of programs over monoids, introduced by Barrington and Thérien in the late 1980s, generalises the notion of (classical) recognition through morphisms into monoids. It does so in such a way that almost all open questions about the internal structure of the complexity class NC^1 can be reformulated as understanding which languages (and, in fact, even which regular languages) can be program-recognised by monoids taken from some given variety of finite monoids. Unfortunately, so far this finite-semigroup-theoretic approach has not helped to prove any new result about the internal structure of NC^1. Even worse, every attempt to reprove well-known results about this internal structure using techniques stemming from algebraic automata theory has failed; one such result is the fact that the language of words over the binary alphabet containing a number of 1s not divisible by some fixed integer greater than 1 is not in AC^0.

In this talk, I shall present the model of programs over monoids, explain how it relates to “small” circuit complexity classes and present some of the contributions I made during my Ph.D. thesis to the understanding of the computational power of programs over monoids, focusing on the well-known varieties of finite monoids DA and J (giving rise to “small” circuit complexity classes well within AC^0). I shall conclude with a word about ongoing work and future research directions.
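As a minimal, simplified sketch of the model: a program over a monoid is a sequence of instructions, each reading one input position and emitting a monoid element, and acceptance depends only on the product of the emitted elements. The Python sketch below (all names are illustrative) uses the monoid (Z/2, +) to program-recognise the parity language, i.e. the divisibility language above for divisor 2:

```python
def run_program(program, accepting, identity, multiply, word):
    """Evaluate a program over a monoid: each instruction (i, f) reads
    input position i and emits the monoid element f(word[i]); the word
    is accepted iff the product of all emitted elements is accepting."""
    acc = identity
    for i, f in program:
        acc = multiply(acc, f(word[i]))
    return acc in accepting

# Program over the monoid (Z/2, +) recognising PARITY on words of
# length 4: one instruction per position, emitting the letter itself.
n = 4
program = [(i, lambda a: a % 2) for i in range(n)]
parity = lambda w: run_program(program, accepting={1},
                               identity=0,
                               multiply=lambda x, y: (x + y) % 2,
                               word=w)

assert parity([1, 0, 1, 1]) is True   # three 1s: odd
assert parity([1, 1, 0, 0]) is False  # two 1s: even
```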