Machine learning has achieved remarkable successes in various applications, but there is wide agreement that a mathematical theory of deep learning is still missing. Recently, first mathematical results have been derived in areas such as mathematical statistics and statistical learning. Any mathematical theory of machine learning will have to combine tools from different fields, including nonparametric statistics, high-dimensional statistics, empirical process theory, and approximation theory. The main objective of the workshop was to bring together leading researchers contributing to the mathematics of machine learning.
A focus of the workshop was theory for deep neural networks. Mathematically speaking, neural networks define function classes with a rich mathematical structure that are extremely difficult to analyze because of the non-linearity in their parameters. Until very recently, most theoretical results could not cope with distinctive characteristics of deep networks such as multiple hidden layers or the ReLU activation function. Other topics of the workshop were procedures for quantifying the uncertainty of machine learning methods and the mathematics of data privacy.
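To make the function class in question concrete, the following minimal NumPy sketch evaluates a fully connected feedforward network with ReLU activations. The architecture (two hidden layers of width 4 mapping R^3 to R) and all variable names are illustrative assumptions, not taken from the workshop; the point is only to exhibit the non-linear dependence of the network output on its parameters.

```python
import numpy as np

def relu(x):
    # ReLU activation: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def relu_network(x, params):
    # Evaluate a fully connected ReLU network: each hidden layer
    # applies an affine map followed by ReLU; the output layer is
    # affine only. `params` is a list of (weight matrix, bias) pairs.
    h = x
    for W, b in params[:-1]:
        h = relu(W @ h + b)
    W, b = params[-1]
    return W @ h + b

rng = np.random.default_rng(0)
# Illustrative architecture: two hidden layers of width 4, R^3 -> R
params = [
    (rng.standard_normal((4, 3)), rng.standard_normal(4)),
    (rng.standard_normal((4, 4)), rng.standard_normal(4)),
    (rng.standard_normal((1, 4)), rng.standard_normal(1)),
]

x = np.array([0.5, -1.0, 2.0])
y = relu_network(x, params)

# The map from parameters to output is non-linear: scaling all
# weights and biases by 2 does not simply scale the output by 2.
params2 = [(2 * W, 2 * b) for W, b in params]
y2 = relu_network(x, params2)
```

With a single hidden layer the composition of two affine maps and one ReLU is still comparatively tractable; with multiple hidden layers the repeated composition makes the parameter-to-function map highly non-linear, which is one source of the analytical difficulty noted above.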