Abstract
Big data and deep learning are modern buzzwords
that currently permeate all fields of science and technology.
These new concepts impress with the stunning results
they achieve across a large variety
of applications. However, the theoretical justification
for their success is still very limited. In this snapshot,
we highlight some of the very recent mathematical
results that form the beginnings of a solid theoretical
foundation for the subject.