Enno Mammen
Strong Approximations for Robbins-Monro Procedures
The Robbins-Monro algorithm is a recursive, simulation-based stochastic procedure for approximating the zeros of a function that can be written as an expectation. It is known that, under some technical assumptions, Gaussian limit theorems describe the stochastic performance of the algorithm. Here, we are interested in strong approximations for Robbins-Monro procedures. The main tool for obtaining them is local limit theorems, that is, the study of the convergence of the density of the algorithm. The analysis relies on a version of parametrix techniques for Markov chains converging to diffusions. The main difficulty that arises here is that the drift is unbounded. The talk is based on joint work with Valentin Konakov, Moscow, and Lorick Huang, Toulouse.
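As a concrete illustration of the recursion behind this abstract (not of the strong-approximation analysis itself), the following is a minimal sketch of the Robbins-Monro iteration for finding the zero of f(theta) = E[H(theta, X)]. The step-size choice gamma_n = c/n and the toy linear example are assumptions made for illustration.

```python
import numpy as np

def robbins_monro(noisy_h, theta0, c=1.0, n_steps=10000, rng=None):
    """Iterate theta_n = theta_{n-1} - (c / n) * H(theta_{n-1}, X_n)."""
    rng = rng or np.random.default_rng(0)
    theta = theta0
    for n in range(1, n_steps + 1):
        theta -= (c / n) * noisy_h(theta, rng)   # only noisy evaluations of f
    return theta

# Toy example: f(theta) = theta - 2 observed under additive noise, zero at 2.
noisy_h = lambda theta, rng: (theta - 2.0) + rng.normal()
print(robbins_monro(noisy_h, theta0=0.0))        # converges to approx. 2
```

The recursion never evaluates f itself, only the noisy samples H(theta, X); the decreasing step size averages the noise out, which is exactly the regime in which the Gaussian (and here, strong) approximations apply.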
The first afternoon will close with a poster session, where the participants present their research topics. Knowing the broad research interests of their peers from the start of the workshop will facilitate discussions and interactions. All conference participants will be able to vote for the prize for the best poster presentation. This will provide additional incentives for excellent presentations and help the development of early-career researchers.
Hans Reimann
Data-Driven Impulse Control in Multiple Dimensions via Non-Parametric Estimation of the Optimal Stopping Rule
We investigate optimal stopping in multiple dimensions and corresponding non-parametric approaches for data-driven impulse control strategies. The analysis can be separated into two steps: understanding the underlying optimal stopping problem in higher dimensions for diffusion processes with known components, and constructing as well as evaluating non-parametric approaches for the case of unknown system components.
The key insights are as follows: optimal stopping in multiple dimensions can be formulated via an operator that constructs a value-function substitute with desirable stability properties with respect to errors in its characterizing quantities. These quantities can be estimated reliably by estimating the unknown components therein. Based on these results, we propose a data-driven strategy and evaluate its performance.
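As a hedged illustration of one ingredient of such a data-driven strategy, the sketch below estimates an unknown drift non-parametrically from a discretely observed diffusion path; the Euler scheme, the Nadaraya-Watson form, and all names (`simulate_path`, `drift_estimator`) are illustrative assumptions, not the construction of the talk.

```python
import numpy as np

def simulate_path(b, sigma, x0, dt, n, rng):
    """Euler scheme for dX_t = b(X_t) dt + sigma dW_t in R^d."""
    d = len(x0)
    x = np.empty((n + 1, d))
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + b(x[k]) * dt + sigma * np.sqrt(dt) * rng.normal(size=d)
    return x

def drift_estimator(path, dt, h):
    """Nadaraya-Watson estimate of the drift from one discretely observed
    path: increments / dt are noisy, unbiased observations of b at the
    left-hand state."""
    incr = (path[1:] - path[:-1]) / dt
    states = path[:-1]
    def b_hat(x):
        w = np.exp(-np.sum((states - x) ** 2, axis=1) / (2 * h ** 2))
        return w @ incr / w.sum()
    return b_hat

rng = np.random.default_rng(0)
path = simulate_path(lambda x: -x, 0.5, np.array([1.0, -1.0]), 0.01, 20000, rng)
b_hat = drift_estimator(path, dt=0.01, h=0.3)
print(b_hat(np.array([0.5, 0.5])))   # roughly (-0.5, -0.5) for this OU drift
```

A plug-in stopping rule would then replace the unknown component b by its estimate b_hat in the characterizing quantities of the stopping problem.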
Maarten Hoeneveld
Community Detection in Geometric Random Graphs
In recent years, there has been increasing interest in community detection in random graphs [1]. The primary objective of my project is to extend existing achievability and impossibility proofs to more general and robust models. Specifically, my work generalises random graph models from the current literature [2,3,4] to accommodate an arbitrary number of communities and dimensions, as well as more flexible, distance-dependent connection probability functions. This research aims to expand the knowledge of community recovery to more realistic geometric random graph models.
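For concreteness, the following sketch samples from one instance of the model class described: a geometric random graph on the d-dimensional torus with K hidden communities and a distance-dependent connection probability. The hard-threshold profile and all parameter values are illustrative assumptions, not the models of [2,3,4].

```python
import numpy as np

def sample_grg(n, K, d, p_in, p_out, radius, rng):
    labels = rng.integers(0, K, size=n)            # hidden communities
    pos = rng.random((n, d))                       # points on the d-torus
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            diff = np.abs(pos[i] - pos[j])
            dist = np.linalg.norm(np.minimum(diff, 1 - diff))  # torus metric
            base = p_in if labels[i] == labels[j] else p_out
            # Distance-dependent connection probability: a hard ball here,
            # but any decreasing profile function could be used instead.
            prob = base if dist <= radius else 0.0
            adj[i, j] = adj[j, i] = rng.random() < prob
    return adj, labels, pos

rng = np.random.default_rng(1)
adj, labels, pos = sample_grg(n=200, K=2, d=2, p_in=0.9, p_out=0.2,
                              radius=0.15, rng=rng)
print(adj.sum() // 2, "edges among", len(labels), "nodes")
```

The recovery question is then whether the latent `labels` can be reconstructed (exactly or partially) from `adj` alone, and for which parameter regimes this is possible or provably impossible.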
Ali Jalali
Topological transitions in statistical identification under complex information loss: A new application of topological data analysis to clinical trial evaluations
Topological data analysis (TDA) provides a geometric framework for addressing challenges in traditional statistical inference and identification when the datasets under study are large, high-dimensional, and structurally complex. In addition to introducing new tools for data scientists, TDA has contributed a broader language and set of objects for studying structural features that may emerge in data and machine learning models. Motivated by persistent challenges in clinical trials conducted in complex care settings, where substantial and systematic missing data arise due to patient attrition, I study the topology of likelihood-based statistical inference itself as data become progressively incomplete. Instead of treating information loss as a reduction in effective sample size, as is current practice, I model longitudinal missingness as a monotone deformation parameter acting directly on the geometry of the likelihood function. This formulation yields a one-parameter family of likelihood landscapes indexed by increasing information loss (the proportion of follow-up outcome data missing) while holding the nominal sample size fixed. Using tools from Morse theory and Reeb graph constructions, I characterize how the critical-point structure of these surfaces evolves under realistic attrition patterns.
A key finding of the simulations is that likelihood-based identification does not degrade smoothly. Beyond a critical level of missingness, the global mode corresponding to the correctly specified parameter degenerates into a saddle-like surface and disappears, giving rise to competing local optima and a transition to a biased identification regime. I further observe an inverse persistence phenomenon: the estimator associated with the unbiased parameter often exhibits lower topological persistence than biased modes arising from relatively stable, selectively observed subpopulations. I conclude that under monotone information loss, the curvature supporting the true parameter erodes more rapidly than the curvature stabilizing certain biased alternatives, rendering the latter topologically more persistent. While not claimed as a universal property, this pattern raises concerns for common statistical practices that favor the most stable or robust estimates under missing data, as such estimates may correspond to biased inferential regimes. Overall, this work demonstrates how TDA can be applied to the topology of statistical inference itself, providing new insights into identification robustness and showing how complex likelihood geometry can arise even in low-dimensional models under progressive information loss.
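The sketch below illustrates, in a deliberately simple toy model, the construction of a one-parameter family of likelihood landscapes indexed by missingness; the Gaussian model and the monotone attrition rule are assumptions for illustration, not the clinical-trial setting of the study. In this toy case the surface stays unimodal, but the global mode drifts away from the truth as attrition grows; exhibiting the mode exchange described above requires richer models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
y_full = rng.normal(1.0, 1.0, size=n)            # true mean is 1

def log_likelihood(mu, sigma, y):
    return np.sum(-0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma))

def landscape(frac_missing):
    """Likelihood surface over (mu, sigma) after monotone, selective
    attrition: the largest outcomes drop out first, mimicking informative
    dropout while the nominal sample size stays fixed at n."""
    kept = np.sort(y_full)[: int(n * (1 - frac_missing))]
    mus, sigmas = np.linspace(-1, 3, 81), np.linspace(0.3, 2.0, 61)
    L = np.array([[log_likelihood(m, s, kept) for s in sigmas] for m in mus])
    return mus, sigmas, L

for frac in (0.0, 0.3, 0.6):
    mus, sigmas, L = landscape(frac)
    i, j = np.unravel_index(L.argmax(), L.shape)
    print(f"missing={frac:.1f}: mode at mu={mus[i]:.2f}, sigma={sigmas[j]:.2f}")
```

The Morse-theoretic analysis in the talk studies exactly such families L(·, ·; frac), tracking how critical points appear, merge, and vanish as the deformation parameter increases.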
Patrick Bastian
TWIN: Two window inspection for online change point detection
We propose a new class of sequential change point tests, both for changes in the mean parameter and in the overall distribution function. The methodology builds on a two-window inspection scheme (TWIN), which aggregates data into symmetric samples and applies strong weighting to enhance statistical performance. The detector yields logarithmic rather than polynomial detection delays, representing a substantial reduction compared to state-of-the-art alternatives. Delays remain short even for late changes, where existing methods perform worst. Moreover, the new procedure also attains higher power than current methods across broad classes of local alternatives. For mean changes, we further introduce a self-normalized version of the detector that automatically cancels out temporal dependence, eliminating the need to estimate nuisance parameters. The advantages of our approach are supported by asymptotic theory, simulations and an application to monitoring COVID-19 data, where structural breaks associated with new virus variants are detected almost immediately by our new procedures. This indicates potential value for the real-time monitoring of future epidemics. Mathematically, our approach is underpinned by new exponential moment bounds for the global modulus of continuity of the partial sum process, which may be of independent interest beyond change point testing.
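To make the two-window idea concrete, here is an illustrative sketch of a detector that compares the means of two adjacent windows and flags a change once the standardized gap exceeds a threshold. This is a generic scheme in the spirit of TWIN, not the authors' statistic: the window width, the weighting, the threshold, and the variance estimate are all placeholder assumptions.

```python
import numpy as np

def two_window_detector(x, w, threshold):
    """Flag the first time at which the standardized gap between the means
    of two adjacent length-w windows exceeds `threshold`; None otherwise."""
    sigma = np.std(x[: 2 * w], ddof=1)           # crude nuisance estimate
    for t in range(2 * w, len(x) + 1):
        left, right = x[t - 2 * w : t - w], x[t - w : t]
        stat = abs(right.mean() - left.mean()) * np.sqrt(w / 2.0) / sigma
        if stat > threshold:
            return t
    return None

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(1, 1, 200)])
print(two_window_detector(x, w=50, threshold=3.0))  # flags shortly after 500
```

Because both windows slide together, the scheme stays equally sensitive to early and late changes, which is the intuition behind the short delays reported for late change points.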
Mathis Rost
Likelihood Approximation for Gibbs Point Processes
Although the likelihood function of a Gibbs point process is typically intractable, it is fundamental for likelihood-based inference, likelihood ratio tests, and Bayesian analysis, making accurate likelihood approximation an important challenge.
In this talk, we present a new method for approximating the likelihood function of Gibbs point processes. Building on recent probabilistic results, we derive a novel likelihood representation expressed entirely in terms of the Papangelou conditional intensity, which is typically tractable, and the void probability, i.e., the probability that a given region contains no points.
We introduce a new algorithm for approximating these void probabilities, based on newly derived structural characteristics of void probabilities for Gibbs processes, and compare its performance to existing state-of-the-art methods. Through a simulation study, we illustrate how this approach enables faster likelihood approximation for a broad class of Gibbs models.
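To illustrate the quantity at the centre of this representation, the sketch below estimates a void probability by plain Monte Carlo and checks it against the closed form exp(-lambda |B|) available in the Poisson case; for genuine Gibbs models one would instead simulate, e.g., via birth-death MCMC. All parameter choices are illustrative assumptions, and this naive estimator is not the algorithm of the talk.

```python
import numpy as np

def mc_void_probability(lam, side, n_sim=20000, rng=None):
    """Monte Carlo estimate of P(no points in B = [0, side]^2) for a
    homogeneous Poisson process of intensity lam on the unit square."""
    rng = rng or np.random.default_rng(0)
    empty = 0
    for _ in range(n_sim):
        n_pts = rng.poisson(lam)                 # total points in [0, 1]^2
        pts = rng.random((n_pts, 2))
        empty += not np.all(pts <= side, axis=1).any()
    return empty / n_sim

lam, side = 50.0, 0.2                            # |B| = 0.04
print(mc_void_probability(lam, side), "vs exact", np.exp(-lam * side ** 2))
```

The cost of such direct simulation grows quickly with model complexity, which is what makes structural characterizations of void probabilities, as derived in the talk, attractive for fast likelihood approximation.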
Albertas Dvirnas
Bridging Matrix Profiles and Empirical Dynamic Modelling in the Search for Patterns and Predictions in Environmental Data
Empirical dynamic modelling (EDM) and matrix profiles offer complementary ways to discover structure in complex time series. EDM reconstructs low-dimensional attractors from high-dimensional observations, enabling local analogue forecasting and causal inference, while matrix profiles provide a scalable, domain-agnostic mechanism for fast motif discovery, anomaly detection, and nearest-neighbour search. This poster explores how these two perspectives can be combined to analyse high-dimensional environmental data, such as multi-species environmental DNA (eDNA) time series.
By interpreting matrix profile subsequences as embedded states in EDM’s reconstructed phase space, we obtain a unified framework for identifying recurrent dynamical patterns and constructing local, interpretable forecasts. The approach naturally extends to streaming settings, where incremental updates to the matrix profile support real-time pattern tracking and prediction as new observations arrive. We illustrate this with examples in seasonal environmental monitoring, highlighting how the joint use of matrix profiles and EDM can reveal candidate mechanisms, regime shifts, and nonlinear dependencies that are obscured by purely statistical or purely mechanistic models. The goal is to position this synergy as a practical toolkit for exploratory analysis and prediction in modern, high-dimensional environmental datasets.
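A minimal sketch of the two ingredients being bridged: a brute-force matrix profile (nearest-neighbour distances between z-normalized subsequences) and a time-delay embedding of the kind used in EDM for state reconstruction. Both are textbook constructions; the window length, delay, and embedding dimension below are illustrative assumptions.

```python
import numpy as np

def znorm(a):
    return (a - a.mean()) / (a.std() + 1e-12)

def matrix_profile(x, m):
    """Brute-force matrix profile: distance from each length-m subsequence
    to its nearest neighbour, with an exclusion zone around trivial
    self-matches."""
    n = len(x) - m + 1
    subs = np.array([znorm(x[i : i + m]) for i in range(n)])
    mp = np.full(n, np.inf)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - m // 2) : i + m // 2 + 1] = np.inf
        mp[i] = d.min()
    return mp

def delay_embedding(x, dim, tau):
    """EDM-style state reconstruction: row t is (x_{t+(dim-1)tau}, ..., x_t)."""
    rows = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + rows] for i in range(dim)][::-1])

rng = np.random.default_rng(0)
t = np.arange(1000)
x = np.sin(2 * np.pi * t / 90) + 0.2 * rng.normal(size=t.size)
mp = matrix_profile(x, m=80)
states = delay_embedding(x, dim=3, tau=10)
print("motif candidate at index", int(mp.argmin()), "| states:", states.shape)
```

Low values of `mp` point to recurrent patterns, and the same subsequences reappear as nearby points in the embedded state space, which is the correspondence the poster exploits for analogue forecasting and regime detection.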