### November 27, 2017

**Lucas Boczkowski** (IRIF)

In this talk, I will present and analyze a model of search trees with probabilistic memory faults. Specifically, let T be a tree of maximum degree Delta. One of its leaves, tau, is chosen adversarially. The goal is to find tau by walking along the edges of T, starting at the root. To guide the algorithm in its search, each node is decorated with an indication called advice. For a typical node v, this advice is a pointer to the neighbor of v that is closer to tau. However, each node v is declared faulty with some small probability q, in which case its advice is chosen uniformly at random among v's neighbors.

Usual models of noise assume the information can be refreshed by resampling. In our case, the advice at a node is fixed forever (hence the term "memory faults"). This prevents the algorithm from increasing its confidence by visiting the same node several times. I will present upper and lower bounds on the noise regime q that can be tolerated for efficient search, under different measures of complexity: we consider the number of steps needed to reach tau, but also the number of node queries.
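To make the model concrete, here is a minimal simulation sketch on the simplest tree, a path, with tau at the far endpoint. All names (`make_path_advice`, `search`) and the specific search strategy are illustrative assumptions, not the algorithms from the talk: the walker follows the permanently fixed advice with probability `p_follow` and otherwise moves to a uniform neighbor, so it cannot be trapped forever by a faulty node.

```python
import random

def make_path_advice(n, q, rng):
    """Advice on a path 0..n-1 with target tau = n-1.
    Each node's advice is sampled ONCE and never refreshed:
    the correct neighbor with probability 1-q, otherwise a
    uniformly random neighbor (a permanent memory fault)."""
    advice = {}
    for v in range(n - 1):  # tau itself needs no advice
        neighbors = [u for u in (v - 1, v + 1) if 0 <= u < n]
        if rng.random() < q:
            advice[v] = rng.choice(neighbors)  # faulty, fixed forever
        else:
            advice[v] = v + 1  # the neighbor closer to tau
    return advice

def search(n, q, p_follow=0.8, seed=0):
    """Walk from the root (node 0) toward tau (node n-1), following
    the fixed advice with probability p_follow and a uniform neighbor
    otherwise; returns (number of steps, number of nodes queried)."""
    rng = random.Random(seed)
    advice = make_path_advice(n, q, rng)
    v, steps, queried = 0, 0, {0}
    while v != n - 1:
        neighbors = [u for u in (v - 1, v + 1) if 0 <= u < n]
        v = advice[v] if rng.random() < p_follow else rng.choice(neighbors)
        steps += 1
        queried.add(v)
    return steps, len(queried)
```

With q = 0 and p_follow = 1 the walk is the advice-following path of length n-1; with q > 0, the gap between `steps` and the number of distinct nodes queried illustrates why the two complexity measures discussed in the talk can differ.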

Joint work with Amos Korman and Yoav Rodeh.