*Required
Course addition - Syllabus: Course Title* | |
Division*: | CEMSE |
Course Number: | AMCS **** |
Course Title (Limited to 40 characters)*: | A Mathematical Introduction to Deep Learning |
Expected Starting Academic Semester*: | Fall |
Expected Starting Academic Year*: | 2022 |
Course proposer(s)*: | |
Name(s) *: | Jinchao Xu |
Phone: | |
Email*: | jinchao.xu@kaust.edu.sa |
Instructor(s) information*: | |
Name(s) *: | Jinchao Xu |
Phone: | |
Email*: | jinchao.xu@kaust.edu.sa |
Prerequisite Course Number*: | Linear algebra; multivariable calculus |
Comprehensive Course Description*: | This is a graduate course introducing the basic mathematical, numerical, and practical aspects of deep learning techniques. It will provide students with the mathematical background and practical tools needed to understand, analyze, and further develop numerical methods for deep learning and its applications. The course is geared both towards math students who want to learn about the emerging technology of deep learning and towards students from other fields who are interested in deep learning applications but would like to strengthen their theoretical foundation and mathematical understanding. |
Course Description for Program Guide*: | This is a graduate course introducing the basic mathematical, numerical, and practical aspects of deep learning techniques. |
Goals and Objectives*: | 1. Understand the basic ideas of machine learning and why deep learning works. 2. Learn to implement deep learning algorithms using Python and PyTorch. 3. Apply deep learning to image classification. |
Required Knowledge*: | The course will be accessible to all graduate students with basic knowledge of linear algebra and multivariable calculus. Some programming experience with Python will be helpful. |
Reference Texts*: |
|
Method of evaluation (Percentages & Graded content such as Assignments, Oral quizzes, Projects, Midterm exam, Final Exam, Attendance and participation, etc.): | 30% Homework; 20% Midterm exam; 20% Final project; 20% Final exam |
Nature of the assignments (assigned reading, case study, paper presentation, group project, written assignment, etc.): | Homework consists of written assignments for conceptual and theoretical questions and programming assignments for algorithm implementation and practical exercises. The final project will be closely related to topics in this course. Students will form groups to work on the final project, and group and individual presentations will be scheduled. |
Course Policies (Absences, Assignments, late work policy, etc.): | Please pay attention to the due date of the assignments. No late homework will be accepted. Attendance is mandatory. Students should notify the instructor in advance of missing any class or as soon as possible thereafter. |
Additional Information: |
NOTE
The instructor reserves the right to make changes to this syllabus as necessary.
Tentative Course Schedule: (Time, topic/emphasis & resources) | |
Week/Lecture | Topic |
1 | Introduction; logistic regression |
2 | Multivariable calculus, convexity, gradient descent method |
3 | Elements of probability; stochastic gradient descent |
4 | Elements of machine learning theory |
5 | Python, implementation and MNIST |
6 | Introduction to linear finite element space |
7 | Shallow neural network (NN) functions and approximation theory |
8 | Implementation: shallow NN for MNIST |
9 | Deep neural networks; convolutional neural networks |
10 | Initialization; batch normalization; implementation: CNN for MNIST |
11 | The Poisson equation and linear finite element method |
12 | Gradient descent and smoothing properties; multigrid method |
13 | MgNet: from multigrid to a special CNN |
14 | MgNet: Applications |
15 | Transformers and other neural networks; Review |