Key figures
189 people work at the LJLL
86 permanent members
80 permanent researchers and faculty members
6 engineers, technicians and administrative staff
103 non-permanent members
74 PhD students
15 postdocs and ATER (temporary teaching and research assistants)
14 emeritus members and volunteer collaborators
Figures as of January 2022
Leçons Jacques-Louis Lions 2020: Dejan Slepčev
The Leçons Jacques-Louis Lions 2020 are postponed to a date to be announced later.
Initially scheduled for 2-5 June 2020, the Leçons Jacques-Louis Lions 2020, to be given by Dejan Slepčev (Carnegie Mellon University, Pittsburgh), have had to be postponed because of the Covid-19 epidemic.
The new date (in the autumn of 2020 or later) will be announced as soon as possible.
The Leçons Jacques-Louis Lions 2020 will consist of
— a mini-course,
Variational problems on random structures: analysis and applications to data science,
in three sessions initially scheduled for Tuesday 2, Wednesday 3 and Thursday 4 June 2020, from 11:30 am to 1 pm,
— and a colloquium,
Machine learning meets calculus of variations,
initially scheduled for Friday 5 June 2020, from 2 pm to 3 pm.
Abstract of the mini-course
Variational problems on random structures: analysis and applications to data science
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals, defined using the available random sample, which specify the desired properties of the object sought.
While the data typically lie in a high-dimensional space, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points.
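To make this construction concrete, here is a minimal sketch (illustrative only, not taken from the lectures) of an ε-proximity graph built on a uniform random sample with NumPy; the sample size n and the radius eps are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 200, 2, 0.15                 # sample size, dimension, connectivity radius (illustrative values)
x = rng.uniform(size=(n, d))             # random sample standing in for the observed data points

# Connect pairs of points at distance less than eps (epsilon-proximity graph)
dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
W = (dists < eps).astype(float)          # adjacency / weight matrix of the graph
np.fill_diagonal(W, 0.0)                 # no self-loops

Graph-based functionals such as cuts or Dirichlet energies are then defined in terms of this weight matrix W.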
The lectures will discuss a mathematical framework suitable for studying the asymptotic properties of variational problems posed on random samples and related random geometries (e.g. proximity graphs). In particular, we will discuss the passage from discrete variational problems on random samples to their continuum limits. We will also consider approaches based on dynamics on graphs and connect these with the evolution equations describing the continuum limits.
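A prototypical instance of such a discrete-to-continuum passage (written schematically here; the exact normalization depends on the chosen scaling of the connectivity radius) is the graph Dirichlet energy of a function u on a sample x_1, …, x_n drawn from a density ρ:

\[
E_{n,\varepsilon}(u) \;=\; \frac{1}{n^{2}\,\varepsilon^{2}} \sum_{i,j=1}^{n} \eta_{\varepsilon}(x_i - x_j)\,\bigl(u(x_i) - u(x_j)\bigr)^{2}
\;\longrightarrow\;
\sigma_{\eta} \int |\nabla u(x)|^{2}\,\rho(x)^{2}\,dx ,
\]

where η_ε is a rescaled interaction kernel and σ_η is a constant depending only on the kernel. Results of this type make precise in what sense graph-based algorithms approximate continuum variational problems.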
The lectures will introduce the basic elements of the background material on the calculus of variations and optimal transportation. They will also explain the motivation for studying the given functionals and their significance to machine learning. The asymptotic consistency of several important machine learning algorithms will be shown.
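As a toy illustration of the optimal transportation background and of asymptotic consistency (a computation chosen here for exposition, not drawn from the lectures): in one dimension the Wasserstein-1 distance between two empirical measures with the same number of atoms is obtained by matching sorted samples.

import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(0.0, 1.0, size=n)        # sample from N(0, 1)
y = rng.normal(0.5, 1.0, size=n)        # sample from N(0.5, 1)

# In 1D the optimal coupling matches order statistics, so W1 is a mean of sorted differences
w1 = np.mean(np.abs(np.sort(x) - np.sort(y)))
print(f"empirical W1 = {w1:.3f}")       # concentrates near the true value 0.5 as n grows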
Finally, the lectures will discuss how insights from the calculus of variations and partial differential equations can be used to improve the design of the functionals used in machine learning.
Abstract of the colloquium
Machine learning meets calculus of variations
Modern data-acquisition techniques produce a wealth of data about the world we live in. Extracting the information from the data leads to machine learning tasks such as clustering, classification, regression, dimensionality reduction, and others. These tasks are often described as optimization problems by introducing functionals that specify the desired properties of the object considered.
The functionals take as input the available data samples, yet we seek to draw conclusions about the true distribution of the data.
To compare the outcomes based on finite data with the ideal outcomes one would obtain if full information were available, we study the asymptotic properties of discrete optimization problems based on finite random samples. We will discuss how the calculus of variations and partial differential equations provide tools to compare the discrete and continuum descriptions for many relevant functionals. Furthermore, we will highlight how the connection between the discrete and continuum functionals can be used to improve the modeling of learning tasks on finite data.
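One concrete example of such a discrete problem (a toy sketch with arbitrary parameters, not the speaker's code) is spectral bi-partitioning, which clusters a finite sample by thresholding the second eigenvector of the graph Laplacian of a proximity graph; its continuum counterpart involves eigenfunctions of a weighted Laplacian-type operator.

import numpy as np

rng = np.random.default_rng(2)
n, eps = 300, 0.2
# Toy data: two well-separated Gaussian clusters in the plane
x = np.vstack([rng.normal((0.0, 0.0), 0.1, size=(n // 2, 2)),
               rng.normal((1.0, 1.0), 0.1, size=(n // 2, 2))])

# Epsilon-proximity graph and its unnormalized graph Laplacian
d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
W = (d < eps).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

# Threshold the second-smallest eigenvector (Fiedler vector) to obtain a two-way clustering
eigvals, eigvecs = np.linalg.eigh(L)
labels = (eigvecs[:, 1] > np.median(eigvecs[:, 1])).astype(int)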