If you would like to submit an event, please contact us at [email protected]
INVITED SPEAKERS:
In this talk, I explain my point of view on Big Data as a mathematical optimizer, particularly concerned with discrete (integer) decisions. I advocate a tight integration of machine learning and mathematical optimization (among other fields) to address the challenges of decision-making in data science. For such an integration I concentrate on three questions: 1) What can optimization do for machine learning? 2) What can machine learning do for optimization? 3) Which new applications can be solved by combining machine learning and optimization? Finally, I will discuss in detail two areas in which machine learning techniques have been successfully applied within mixed-integer programming. [PDF]
Symmetry is the essential element of lifted inference, which has recently demonstrated the possibility of performing very efficient inference in highly connected but symmetric probabilistic models, a.k.a. relational probabilistic models. This raises the question of whether the same holds for optimization problems in general. In this talk I shall demonstrate that for a large class of mathematical programs this is actually the case. More precisely, I shall introduce the concept of fractional symmetries of linear and convex quadratic programs (QPs), which lie at the heart of many machine learning approaches, and exploit it to lift, i.e., to compress, them. These lifted QPs can then be tackled with the usual optimization toolbox (off-the-shelf solvers, cutting plane algorithms, stochastic gradients, etc.). If the original QP exhibits symmetry, then the lifted one will generally be more compact, and hence its optimization is likely to be more efficient. [PDF]
This talk is based on joint work with Martin Mladenov, Martin Grohe, Leonard Kleinhans, Pavel Tokmakov, Babak Ahmadi, Amir Globerson, and many others.
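To illustrate the idea behind lifting (not the speaker's actual algorithm), here is a minimal sketch: a toy equality-constrained convex QP whose data are invariant under permuting its three variables, so the whole orbit of variables can be collapsed into a single "lifted" variable of multiplicity k. All names and the problem data below are illustrative assumptions.

```python
import numpy as np

# Toy convex QP:  min (1/2) x^T Q x - c^T x   s.t.  A x = b,
# with Q, c, A invariant under any permutation of the k variables.
k = 3
Q = np.eye(k)
c = np.ones(k)
A = np.ones((1, k))
b = np.array([1.0])

# Full (grounded) problem: solve the KKT system
#   [Q  A^T] [x     ]   [c]
#   [A   0 ] [lambda] = [b]
kkt = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
sol = np.linalg.solve(kkt, np.concatenate([c, b]))
x_full = sol[:k]
obj_full = 0.5 * x_full @ Q @ x_full - c @ x_full

# "Lifted" problem: symmetry collapses {x_1, ..., x_k} into one
# variable y of multiplicity k, giving a 1-dimensional QP:
#   min (k/2) y^2 - k y   s.t.  k y = 1   =>   y = 1/k
y = b[0] / k
obj_lifted = 0.5 * k * y**2 - k * y

print(x_full)                  # all coordinates equal by symmetry
print(obj_full, obj_lifted)    # identical optimal values
```

The point of the sketch is only that the compressed one-variable problem recovers the same optimal value as the k-variable original; the talk's contribution is detecting and exploiting such (fractional) symmetries automatically for general LPs and convex QPs.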