EUROPT Summer School 2024

School dates, venue and topics

The school took place on 24–25 June 2024 at Lund University, Sweden. It was in-person only; no live streaming was provided. There were two courses covering different areas of continuous optimization, specifically

1. Computational optimal transport

2. Multiobjective optimization

The lectures on optimal transport were delivered by Gabriel Peyré and those on multiobjective optimization by Gabriele Eichfelder. Each day included 6 hours of lectures plus coffee breaks.

Attendance 

Attendance was free of charge, but registration was mandatory. The lectures were particularly suited to PhD students and young researchers, giving them the chance to attend two high-level courses on continuous optimization, but the school was open to everyone wishing to participate.

Registration was through an online form, to be completed by 5 May. Participants were asked to register only when sure of attending; the available classroom was large, so everyone should have had a place, and in the very unlikely event of too many registrations, priority would not have been based on early booking.

Timetable 

Monday 24 June – Gabriel Peyré

Tuesday 25 June – Gabriele Eichfelder

  9:00 – 10:30 Lecture I

10:30 – 11:00 Coffee break

11:00 – 12:30 Lecture II

12:30 – 14:00 Lunch break

14:00 – 15:30 Lecture III

15:30 – 16:00 Coffee break

16:00 – 17:00 Lecture IV/Discussion

The courses

Computational optimal transport

by Gabriel Peyré

Optimal transport (OT) is a fundamental mathematical theory that intersects optimization, partial differential equations, and probability. It has recently emerged as a crucial tool for addressing a surprisingly wide range of problems in data science, such as shape registration in medical imaging, structured prediction problems in supervised learning, and training of deep generative networks. This mini-course will intertwine descriptions of the mathematical theory with recent developments in scalable numerical solvers. This approach will underscore the significance of recent advancements in regularized approaches for OT, which make it possible to address high-dimensional learning problems. In the first part of the course, I will explain Monge’s original transportation problem and detail Brenier’s theorem, which guarantees the existence and uniqueness of the optimal map. This theorem is then applied in the contexts of Gaussian distributions and one-dimensional densities. I will then present Kantorovich’s formulation, a convex relaxation that has a strong connection with linear programming. In particular, I will explain the celebrated Birkhoff–von Neumann theorem, which establishes an equivalence between Monge’s and Kantorovich’s formulations. I will also discuss the metric properties of optimal transport, which define a distance capable of quantifying the convergence in law of distributions, making OT a powerful loss function in imaging and learning. The last part will focus on high-dimensional problems, which require the use of entropic regularization. I will draw the connection between this approach and the Schrödinger problem in statistical physics and highlight the associated Sinkhorn algorithm, which is highly parallelizable on GPUs and has a significant connection with deep learning and neural networks.
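To make the entropic approach mentioned above concrete, the following is a minimal NumPy sketch of the Sinkhorn iterations for two discrete measures; the toy densities, regularization value, and function name are illustrative choices and not material from the course (the official computational resources are linked below).

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """Entropic-regularized OT between discrete measures a and b
    with cost matrix C, via Sinkhorn's matrix-scaling iterations."""
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)              # enforce column marginals
        u = a / (K @ v)                # enforce row marginals
    P = u[:, None] * K * v[None, :]    # approximate transport plan
    return P, np.sum(P * C)            # plan and its transport cost

# Toy example: two discretized densities on [0, 1]
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.0, 1.0, 60)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(y - 0.7) ** 2 / 0.02); b /= b.sum()
C = (x[:, None] - y[None, :]) ** 2     # squared-distance cost
P, cost = sinkhorn(a, b, C)
print(f"entropic OT cost ~ {cost:.4f}")
```

Each iteration involves only matrix–vector products with the Gibbs kernel, which is what makes the method easy to parallelize on GPUs.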

Materials for the course, including a small book, slides, and computational resources, can be found online at

https://optimaltransport.github.io/


Multiobjective optimization

by Gabriele Eichfelder

In real-life applications, multiple objective functions often arise that have to be optimized at the same time, such as costs, energy consumption, and quality. This is known as a multiobjective optimization problem, which involves optimizing a vector-valued objective map. A naïve approach is to take a weighted sum of the objectives. However, it is difficult to choose appropriate weights beforehand, and the values of the individual objective functions can have different scales. Moreover, for non-convex optimization problems, valuable solutions may be overlooked by simplifying with a weighted sum, even if the weights are varied. Thus, several other approaches have been developed to solve these types of problems. The most prominent are nonlinear scalarizations, i.e., reformulations as a parameter-dependent single-objective replacement problem, such as the epsilon-constraint method. Direct methods for special classes of multiobjective problems, using for instance descent directions or branch-and-bound techniques, have also been proposed over the last decades.
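For reference, the two scalarizations mentioned above can be written as follows for a problem with objectives f_1, …, f_m and feasible set X; the notation here is generic and not taken from the lecture material.

```latex
% Weighted-sum scalarization, with weights w_i >= 0, \sum_i w_i = 1:
\min_{x \in X} \; \sum_{i=1}^{m} w_i \, f_i(x)

% Epsilon-constraint method: minimize one objective f_j and
% bound the remaining objectives by parameters \varepsilon_i:
\min_{x \in X} \; f_j(x)
\quad \text{s.t.} \quad f_i(x) \le \varepsilon_i \quad \text{for all } i \neq j
```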

These lectures provide an introduction to multiobjective optimization. They begin by presenting the basic optimality concepts and the necessary theoretical background. The focus is then on numerical methods for solving such problems. The lectures present the classical scalarization approaches and discuss their advantages and limitations, both in terms of quality guarantees and for specific classes of optimization problems. A further topic will be descent methods, for which we examine optimality conditions such as stationarity. The course continues with additional ingredients for direct methods for solving multiobjective optimization problems. Participants will gain the ability to select appropriate solution approaches for specific multiobjective optimization problems and to critically evaluate the outcomes of standard methods.
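As one concrete instance of the descent methods mentioned above, a standard construction (multiobjective steepest descent in the sense of Fliege and Svaiter) chooses a search direction at the current iterate x by solving the subproblem below; the point x is Pareto critical (stationary) exactly when d = 0 is optimal. This is one common variant and not necessarily the formulation used in the lectures.

```latex
% Multiobjective steepest-descent subproblem at the iterate x,
% for smooth objectives f_1, ..., f_m on R^n:
\min_{d \in \mathbb{R}^n} \; \max_{i = 1, \dots, m} \nabla f_i(x)^{\top} d \, + \, \tfrac{1}{2} \|d\|^2
```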


The lecturers

Gabriel Peyré (CNRS and École Normale Supérieure, Paris)

Gabriel Peyré is a CNRS senior researcher and professor at the École Normale Supérieure, Paris. He works at the interface between applied mathematics, imaging, and machine learning. He has obtained two ERC grants (a Starting Grant in 2010 and a Consolidator Grant in 2017), the Blaise Pascal prize from the French Academy of Sciences in 2017, the Magenes Prize from the Italian Mathematical Union in 2019, and the CNRS silver medal in 2021. He was an invited speaker at the European Congress of Mathematics in 2020. He is the deputy director of the Prairie Institute for Artificial Intelligence, the director of the ENS Center for Data Science, and the former director of the GdR CNRS MIA. He is the head of the ELLIS (European Lab for Learning & Intelligent Systems) Paris Unit. He is engaged in reproducible research and code education, in particular through the platform www.numerical-tours.com.

Gabriele Eichfelder (Technische Universität Ilmenau)

Gabriele Eichfelder is a full professor of Mathematical Methods of Operations Research at the Institute of Mathematics, Technische Universität Ilmenau, Germany. She earned her doctoral degree in 2006 and completed her habilitation at the Department of Applied Mathematics of the University of Erlangen-Nuremberg in 2012. She works in the field of mathematical optimization, with a special interest in nonlinear optimization with vector-valued and set-valued objective functions. In addition to fundamental theoretical studies, she has also been working on numerical solvers for applied engineering problems. Professor Eichfelder has authored two Springer research monographs, has published extensively in top international journals, and has received several publication awards. Recently, she was elected program director of the SIAM Activity Group on Optimization. Moreover, she has served as a member of the program committee for many conferences, including the EURO Conference 2021 in Athens and the recent EUROPT conferences, and as a member of the organizing committee of the 2021 SIAM Conference on Optimization. She serves on the editorial boards of leading optimization journals, and as an area editor of the Journal of Optimization Theory and Applications. She was elected EUROPT Fellow in 2024.


Organisation