Dr. Sam Canning (ETH Zürich)
Non-tautological cycles on the moduli space of smooth curves
Abstract:
It is more difficult to find non-tautological algebraic cycles on moduli spaces of smooth curves than on moduli spaces of stable curves. In fact, only 11 pairs (g,n) were previously known for which M_{g,n} has a non-tautological algebraic cycle, due to work of Graber--Pandharipande and van Zelm. I will explain how to produce non-tautological algebraic cycles in infinitely many more cases. In particular, M_g has non-tautological algebraic cycles whenever g is at least 16. This is joint work with V. Arena, E. Clader, R. Haburcak, A. Li, S.C. Mok, and C. Tamborini.
13:30 • ETH Zentrum, Building ITS, Room
Alexandre Heinlein (Delft University)
Domain decomposition for physics-informed neural networks
Abstract:
Physics-informed neural networks (PINNs) are a class of methods for solving differential equation-based problems using a neural network as the discretization. They were introduced by Raissi et al. and combine the pioneering collocation approach for neural network functions introduced by Lagaris et al. with the incorporation of data via an additional loss term. PINNs are very versatile: they do not require an explicit mesh, allow for the solution of parameter identification problems, and are well suited for high-dimensional problems. However, the training of a PINN model is generally not very robust and may require a lot of hyperparameter tuning. In particular, due to the so-called spectral bias, training PINN models is notoriously difficult when scaling up to large computational domains, as well as for multiscale problems.

In this talk, overlapping domain decomposition-based techniques for PINNs are discussed. Compared with other domain decomposition techniques for PINNs, in the finite basis physics-informed neural networks (FBPINNs) approach the coupling is done implicitly via the overlapping regions and does not require additional loss terms. Using the classical Schwarz domain decomposition framework, a very general framework that also allows for multi-level extensions can be introduced. The method outperforms classical PINNs on several types of problems, including multiscale problems, both in terms of accuracy and efficiency. Furthermore, the combination of the multi-level domain decomposition strategy with multifidelity stacking PINNs for time-dependent problems will be discussed. It can be observed that combining multifidelity stacking PINNs with a domain decomposition in time clearly improves on the reference results without a domain decomposition.
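The collocation idea underlying PINNs can be sketched on a toy problem. The following is a schematic illustration only, not the speaker's method: for the ODE u'(x) = -u(x) with u(0) = 1, the differential equation residual is enforced at collocation points and the initial condition enters as an extra "data" equation. A cubic polynomial stands in for the neural network, which makes the problem linear least squares; a real PINN minimizes the same kind of loss over network weights using automatic differentiation.

```python
import numpy as np

# Collocation points in [0, 1] and a cubic polynomial ansatz
# u(x) = c0 + c1*x + c2*x^2 + c3*x^3 standing in for a neural network.
xs = np.linspace(0.0, 1.0, 20)
basis = np.vstack([xs**0, xs, xs**2, xs**3]).T        # phi_k(x)
basis_dx = np.vstack([0 * xs, xs**0, 2 * xs, 3 * xs**2]).T  # phi_k'(x)

# Rows enforcing the ODE residual u'(x) + u(x) = 0 at each collocation
# point, plus one "data" row enforcing u(0) = 1.
A = np.vstack([basis_dx + basis,
               [[1.0, 0.0, 0.0, 0.0]]])
b = np.concatenate([np.zeros(len(xs)), [1.0]])

# Minimizing the summed squared residuals (the PINN loss) is here an
# ordinary linear least-squares problem in the coefficients.
c, *_ = np.linalg.lstsq(A, b, rcond=None)

u1 = basis[-1] @ c  # approximation at x = 1; exact solution is exp(-x)
print(abs(u1 - np.exp(-1.0)) < 1e-2)
```

Domain decomposition approaches such as FBPINNs partition the domain into overlapping subdomains and assemble the global solution from local networks; the sketch above corresponds to a single subdomain.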
14:00 • Université de Genève, Conseil Général 7-9, Room 1-05
Dr. Eren C. Kizildag (Columbia, US)
Computational Limits of Random Optimization Problems
Abstract:
Optimization problems with random objective functions are central in computer science, probability, and modern data science. Despite their ubiquity, finding efficient algorithms for solving these problems remains a major challenge. Interestingly, many random optimization problems share a common feature, dubbed the statistical-computational gap: while the optimal value can be pinpointed non-constructively, all known polynomial-time algorithms find strictly sub-optimal solutions. That is, an optimal solution can only be found through brute force search, which is computationally expensive.

In this talk, I will discuss an emerging theoretical framework for understanding the computational limits of random optimization problems, based on the Overlap Gap Property (OGP). This is an intricate geometrical property that yields sharp algorithmic lower bounds against the best known polynomial-time algorithms for a wide range of random optimization problems. I will focus on two models to demonstrate the power of the OGP framework: (a) the symmetric binary perceptron, a simple neural network classifying/storing random patterns and a random constraint satisfaction problem, widely studied in probability, statistics, and computer science, and (b) the random number partitioning problem as well as its planted counterpart, which is closely related to the design of randomized controlled trials. In addition to yielding sharp algorithmic lower bounds, our techniques also give rise to new toolkits for the study of statistical-computational gaps in other models, including the online setting.
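The random number partitioning problem mentioned in (b) can be illustrated concretely. The toy sketch below (my own illustration, not from the talk) splits random numbers into two groups with nearly equal sums: brute force over all 2^n sign patterns finds the true optimum, while the polynomial-time Karmarkar-Karp differencing heuristic returns a valid but typically far larger discrepancy, which is the gap between non-constructive optima and efficient algorithms described in the abstract.

```python
import heapq
import itertools
import random

random.seed(0)
n = 16
xs = [random.random() for _ in range(n)]

# Brute force: minimize |sum of +/- x_i| over all 2^n sign assignments.
best = min(abs(sum(s * x for s, x in zip(signs, xs)))
           for signs in itertools.product([1, -1], repeat=n))

# Karmarkar-Karp differencing heuristic (polynomial time): repeatedly
# replace the two largest numbers by their difference; the final value
# is the discrepancy of some partition, so it upper-bounds the optimum.
heap = [-x for x in xs]  # negate for a max-heap via heapq
heapq.heapify(heap)
while len(heap) > 1:
    a, b = -heapq.heappop(heap), -heapq.heappop(heap)
    heapq.heappush(heap, -(a - b))
kk = -heap[0]

print(best <= kk)  # brute force is never worse than the heuristic
```

For i.i.d. inputs, the optimal discrepancy is exponentially small in n, while differencing-type heuristics remain much larger; the OGP framework explains why no known polynomial-time algorithm closes this gap.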
14:15 • ETH Zentrum, Rämistrasse 101, Zürich, Building HG, Room G 19.1 + Zoom talk
Robert Stelzer (Ulm University)
Time-varying Lévy-driven state space models, locally stationary approximations and asymptotic normality
Abstract:
We first introduce time-varying Lévy-driven state space models as a class of time series models in continuous time encompassing continuous-time autoregressive moving average processes with parameters changing over time. In order to allow for their statistical analysis, we define a notion of locally stationary approximations for sequences of continuous-time processes and establish laws of large numbers and central limit type results under θ-weak dependence assumptions. Finally, we consider the asymptotic behaviour of the empirical mean and autocovariance function of time-varying Lévy-driven state space models under appropriate conditions.

This talk is based on:
Bitter, A., Stelzer, R., Ströh, B. (2023): Continuous-time Locally Stationary Time Series Models. Advances in Applied Probability, 55 no. 3, 965 - 99.
Stelzer, R., Ströh, B. (2022): Asymptotics of Time-varying Processes in Continuous-Time using Locally Stationary Approximations. arXiv:2105.00223.
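For orientation, such a model can be sketched (up to parameterization conventions, which may differ from the papers above) as a state equation driven by a Lévy process L together with an observation equation:

\[
dX_t = A(t)\, X_t \, dt + B(t)\, dL_t, \qquad Y_t = C(t)^\top X_t,
\]

where constant coefficients A, B, C recover the classical Lévy-driven state space (CARMA-type) case, and letting them vary in t gives the time-varying models of the talk.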
16:30 • EPF Lausanne, UniL campus, Extranef - 110