
*Required

Course addition - Syllabus: Course Title*

Division*:

CEMSE

Course Number:

AMCS ****

Course Title (Limited to 40 characters)*:

A Mathematical Introduction to Deep Learning

Expected Starting Academic Semester*:

Fall

Expected Starting Academic Year*:

2022  

Course proposer(s)*:

Name(s) *:

Jinchao Xu

Phone:

Email*:

jinchao.xu@kaust.edu.sa

Instructor(s) information*:

Name(s) *:

Jinchao Xu

Phone:

Email*:

jinchao.xu@kaust.edu.sa

Prerequisite Course Number*:

Linear algebra; multivariable calculus

Comprehensive Course Description*:

This is a graduate course introducing the basic mathematical, numerical, and practical aspects of deep learning techniques. It provides students with the mathematical background and the practical tools needed to understand, analyze, and further develop numerical methods for deep learning and its applications. The course is geared both towards mathematics students who want to learn about the emerging technology of deep learning and towards students from other fields who are interested in deep learning applications but would like to strengthen their theoretical foundation and mathematical understanding.

Course Description for Program Guide*:

This is a graduate course introducing the basic mathematical, numerical, and practical aspects of deep learning techniques.

Goals and Objectives*:

1. Understand the basic ideas of machine learning and why deep learning works.

2. Learn to implement deep learning algorithms using Python and PyTorch.

3. Apply deep learning to image classification.
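As a minimal sketch of the kind of implementation work these goals involve, the following plain-Python example trains a logistic regression classifier by full-batch gradient descent (the week 1-2 material). The toy data, learning rate, and step count are illustrative assumptions only, not course materials; the course itself would use PyTorch.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(points, labels, lr=0.5, steps=2000):
    """Fit (w, b) by full-batch gradient descent on the logistic
    (cross-entropy) loss for 1-D inputs."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(points, labels):
            p = sigmoid(w * x + b)
            gw += (p - y) * x / n  # d(loss)/dw for this sample
            gb += (p - y) / n      # d(loss)/db for this sample
        w -= lr * gw
        b -= lr * gb
    return w, b

# Hypothetical 1-D toy data: class 0 clusters near -2, class 1 near +2.
xs = [-2.5, -2.0, -1.5, 1.5, 2.0, 2.5]
ys = [0, 0, 0, 1, 1, 1]
w, b = train(xs, ys)
preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(preds)  # separable data, so predictions match the labels
```

In PyTorch the same loop would instead use `torch.nn.functional.binary_cross_entropy` and an optimizer such as `torch.optim.SGD`, with gradients computed by autograd rather than by hand.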

Required Knowledge*:

The course will be accessible to all graduate students with basic knowledge of linear algebra and multivariable calculus. Some programming experience with Python will be helpful.

Reference Texts*:

  1. Goodfellow I., Bengio Y. and Courville A. Deep learning. MIT press, 2016.
  2. Xu J. Deep Learning and Analysis, Lecture Notes (to be published by Springer).

Method of evaluation (Percentages & graded content such as assignments, oral quizzes, projects, midterm exam, final exam, attendance and participation, etc.):

  30% - Homework

  20% - Midterm exam

  20% - Final project

  20% - Final exam


Nature of the assignments (assigned reading, case study, paper presentation, group project, written assignment, etc.):

Homework consists of written assignments covering conceptual and theoretical questions, and programming assignments for algorithm implementation and practical exercises. The final project will be closely related to topics covered in the course. Groups will be formed to work on the final project, and both group and individual presentations will be scheduled.

Course Policies (Absences, Assignments, late work policy, etc.):

Please pay attention to the due dates of the assignments; no late homework will be accepted. Attendance is mandatory. Students should notify the instructor in advance of missing any class, or as soon as possible thereafter.

Additional Information:

NOTE

The instructor reserves the right to make changes to this syllabus as necessary.

Tentative Course Schedule:

(Time, topic/emphasis & resources)

Week/Lecture    Topic

1     Introduction; logistic regression
2     Multivariable calculus; convexity; gradient descent method
3     Elements of probability; stochastic gradient descent
4     Elements of machine learning theory
5     Python, implementation, and MNIST
6     Introduction to linear finite element spaces
7     Shallow neural network (NN) functions and approximation theory
8     Implementation: shallow NN for MNIST
9     Deep neural networks; convolutional neural networks
10    Initialization; batch normalization; implementation: CNN for MNIST
11    The Poisson equation and the linear finite element method
12    Gradient descent and smoothing properties; multigrid method
13    MgNet: from multigrid to a special CNN
14    MgNet: applications
15    Transformers and other neural networks; review