
Mini-Workshop: Interpolation and Over-parameterization in Statistics and Machine Learning

dc.date.accessioned: 2023-11-29T06:00:34Z
dc.date.available: 2023-11-29T06:00:34Z
dc.date.issued: 2023
dc.identifier.uri: http://publications.mfo.de/handle/mfo/4090
dc.description.abstract: In recent years it has become clear that, contrary to traditional statistical beliefs, methods that interpolate (fit exactly) the noisy training data can still be statistically optimal. In particular, this phenomenon of "benign overfitting" or "harmless interpolation" appears to be close to the practical regimes of modern deep learning systems and arguably underlies many of their behaviors. This workshop brought together experts on the emerging theory of interpolation in statistical methods, its theoretical foundations, and its applications to machine learning and deep learning.
dc.title: Mini-Workshop: Interpolation and Over-parameterization in Statistics and Machine Learning
dc.rights.license (de): Dieses Dokument darf im Rahmen von § 53 UrhG zum eigenen Gebrauch kostenfrei heruntergeladen, gelesen, gespeichert und ausgedruckt, aber nicht im Internet bereitgestellt oder an Außenstehende weitergegeben werden.
dc.rights.license (en): This document may be downloaded, read, stored and printed for your own use within the limits of § 53 UrhG, but it may not be distributed via the internet or passed on to external parties.
dc.identifier.doi: 10.14760/OWR-2023-41
local.series.id: OWR-2023-41
local.subject.msc: 62
local.date-range: 17 Sep - 22 Sep 2023
local.workshopcode: 2338b
local.workshoptitle: Mini-Workshop: Interpolation and Over-parameterization in Statistics and Machine Learning
local.organizers: Mikhail Belkin, San Diego; Alexandre Tsybakov, Palaiseau; Fanny Yang, Zürich
local.report-name: Workshop Report 2023,41
local.opc-photo-id: 2338b
local.publishers-doi: 10.4171/OWR/2023/41


Files in this item: Report
