Department of Information Technology

A Crash Course on Mathematics of Deep Learning


Mohammad Motamed, Associate Professor of Computational Mathematics, The University of New Mexico

General description

This is an introductory course on an advanced topic. Its goal is to introduce students to the approximation properties of deep neural networks; specifically, we aim to understand how and why deep neural networks outperform classical linear and nonlinear approximation methods. We proceed in two steps.

First, we cover the key ideas and concepts underlying deep networks and their compositional nonlinear structure. We formalize the neural network problem by casting regression and classification as optimization problems. We briefly discuss the stochastic gradient descent algorithm and the back-propagation formulas used to solve these optimization problems, and address several issues related to the performance of neural networks, including the choice of activation functions, cost functions, overfitting, and regularization.

Second, having formalized the network problem, we shift our focus to the approximation theory of neural networks. We begin with an introduction to the concept of density in polynomial approximation and, in particular, study the Stone-Weierstrass theorem for real-valued continuous functions. Then, within the framework of linear approximation, we review a few classical results on the density and convergence rate of feedforward networks, followed by more recent developments on the complexity of deep networks in approximating Sobolev functions. Using nonlinear approximation theory, we further elaborate on the approximation power of ReLU networks and their superiority over other classical methods of nonlinear approximation.
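To make the ingredients of the first step concrete, here is a minimal sketch (not part of the course material; the architecture, hyperparameters, and target function are our own illustrative choices) of a one-hidden-layer ReLU network trained by stochastic gradient descent, with the back-propagation formulas derived by hand for a squared-error cost:

```python
import numpy as np

# A network y = W2 @ relu(W1 x + b1) + b2 fitted to f(x) = sin(pi x),
# illustrating the compositional structure, SGD, and back-propagation
# discussed in the first part of the course.
rng = np.random.default_rng(0)

# Toy regression data on [-1, 1]
X = rng.uniform(-1.0, 1.0, size=(256, 1))
y = np.sin(np.pi * X)

width = 32                                # hidden width (arbitrary choice)
W1 = rng.normal(0.0, 1.0, (width, 1))
b1 = np.zeros((width, 1))
W2 = rng.normal(0.0, 1.0 / np.sqrt(width), (1, width))
b2 = np.zeros((1, 1))
lr = 0.05                                 # SGD step size

def forward(x):
    z = W1 @ x + b1                       # hidden pre-activation
    a = np.maximum(z, 0.0)                # ReLU activation
    return z, a, W2 @ a + b2              # prediction

for step in range(10000):
    i = rng.integers(len(X))              # one random sample: "stochastic" GD
    x, t = X[i:i + 1].T, y[i:i + 1].T
    z, a, pred = forward(x)
    # Back-propagation for the cost 0.5 * (pred - t)^2
    d_pred = pred - t
    dW2, db2 = d_pred @ a.T, d_pred
    d_a = W2.T @ d_pred
    d_z = d_a * (z > 0)                   # derivative of ReLU
    dW1, db1 = d_z @ x.T, d_z
    # Gradient-descent updates, e.g. W1 <- W1 - lr * dC/dW1
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Mean squared error over the training set after training
mse = np.mean((forward(X.T)[2].T - y) ** 2)
print(f"final MSE: {mse:.4f}")
```

The same updates, applied to mini-batches instead of single samples and to deeper compositions of affine maps and nonlinearities, are exactly what deep-learning libraries automate.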

Required background

Students should be comfortable with calculus, linear algebra, probability, and numerical analysis.

Outline of the course

 #   Date & time            Location   Topic
 1   May 24, 13:15-15:00    101150
 2   May 25, 14:15-16:00    101150
 3   May 31, 10:15-12:00    101150
 4   Jun 2,  13:15-15:00    101150
 5   Jun 7,  10:15-12:00    101150
 6   Jun 9,  10:15-12:00    101150
Updated 2022-05-24 15:50:50 by Murtazo Nazarov.