
CSCI 3280

Introduction to Multimedia Systems

(2026 Term 2)

Computer Science & Engineering

The Chinese University of Hong Kong


AI vs. Machine learning vs. Deep Learning

AI is the ultimate goal; machine learning and deep learning are ways to achieve it.


What is Machine Learning?


Learning is not “memorization” but generalization.

What is Machine Learning?


Definition of Machine Learning


Machine learning = looking for a function f()

An example of bank credit card approval.


Components of Learning 

An example of bank credit card approval.


Solution Components


The Perceptron


Simple Learning - PLA
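The perceptron learning algorithm (PLA) can be sketched in a few lines: on each pass, find a misclassified point and rotate the weight vector toward its correct side. The toy data, function names, and iteration cap below are illustrative assumptions, not from the slides.

```python
def pla(points, labels, max_iters=1000):
    """Perceptron Learning Algorithm on (assumed) linearly separable data.
    points: feature tuples, labels: +1 / -1. Returns weights with bias at w[0]."""
    w = [0.0] * (len(points[0]) + 1)
    for _ in range(max_iters):
        mistake = False
        for x, y in zip(points, labels):
            xa = (1.0,) + tuple(x)                          # prepend 1 for the bias term
            s = sum(wi * xi for wi, xi in zip(w, xa))
            if (1 if s > 0 else -1) != y:                   # found a misclassified point
                w = [wi + y * xi for wi, xi in zip(w, xa)]  # correct w toward the right side
                mistake = True
        if not mistake:                                     # a full pass with no mistakes: done
            break
    return w

# Hypothetical toy data: +1 above the line x1 + x2 = 1, -1 below it.
pts = [(2, 2), (1, 1.5), (-1, -1), (0, -2)]
ys = [1, 1, -1, -1]
w = pla(pts, ys)
```

On separable data PLA is guaranteed to stop; on non-separable data the `max_iters` cap is what terminates the loop.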


Classification vs. Regression


What is Deep Learning?

Definition

A family of methods that uses deep architectures to learn high-level feature representations.


Example of Trainable Features


Why do we need “Deep”?

Why Deep Model?

  1. Fill in the gap between low-level features and semantic meaning
  2. Learn useful high-level abstractions with less variance/noise

(Figure: low-level inputs map to high-level semantics, e.g. signals → sentence, pixels → object.)


Three Steps for Deep Learning

Step 1: Neural Network

Step 2: Cost Function

Step 3: Optimization

Step 1. A neural network is a function composed of simple functions (neurons)

  • Usually we design the network structure, and let the machine find the parameters from data

Step 2. A cost function evaluates how good a set of parameters is

  • We design the cost function based on the task

Step 3. Find the best function in the set (e.g. by back propagation)
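The three steps can be sketched end to end on the simplest possible case: a one-neuron "network" fitted to a hypothetical toy dataset generated by y = 2x + 1 (all names, data, and hyperparameters here are illustrative).

```python
# Step 1: the "network" - here the simplest possible one, a single linear neuron.
def f(w, b, x):
    return w * x + b

# Hypothetical toy data generated by y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]

# Step 2: the cost function scores a parameter set (w, b) on the data (mean squared error).
def cost(w, b):
    return sum((f(w, b, x) - y) ** 2 for x, y in data) / len(data)

# Step 3: optimization - gradient descent on (w, b).
w, b, eta = 0.0, 0.0, 0.1
for _ in range(2000):
    dw = sum(2 * (f(w, b, x) - y) * x for x, y in data) / len(data)
    db = sum(2 * (f(w, b, x) - y) for x, y in data) / len(data)
    w, b = w - eta * dw, b - eta * db
```

After training, (w, b) approaches (2, 1): the machine found the parameters, while we designed the structure and the cost.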


A Type of Neural Networks

(Figure: a column of nodes per layer, with every node in one layer connected to every node in the next.)

Output of a neuron: a_i^l = σ(z_i^l) for neuron i at layer l

Output of one layer: a^l = (a_1^l, a_2^l, …), a vector


Fully Connected Layer (1)

(Figure: the nodes of layer l−1 fully connected to the nodes of layer l.)

w_ij^l : weight from neuron j (layer l−1) to neuron i (layer l)

W^l : the matrix collecting all weights from layer l−1 to layer l


Fully Connected Layer (2)

b_i^l : bias for neuron i at layer l

b^l : vector of biases for all neurons in layer l


Fully Connected Layer (3)

z_i^l = Σ_j w_ij^l a_j^(l−1) + b_i^l : input to the activation function of neuron i at layer l

z^l = W^l a^(l−1) + b^l


Fully Connected Layer (4)

a_i^l = σ(z_i^l)

a^l = σ(z^l) = σ(W^l a^(l−1) + b^l) : relation between the outputs of layer l−1 and layer l


Fully Connected Network
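The forward pass of a fully connected network is just a composition of layer functions, each computing a^l = σ(W^l a^(l−1) + b^l). A minimal sketch, with hypothetical hand-picked weights for a tiny 2-2-1 network:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer_forward(W, b, a_prev):
    """One fully connected layer: a_i = sigma(sum_j W[i][j] * a_prev[j] + b_i)."""
    return [sigmoid(sum(w_ij * a_j for w_ij, a_j in zip(row, a_prev)) + b_i)
            for row, b_i in zip(W, b)]

def network_forward(layers, x):
    """The whole network is a composition of simple layer functions."""
    a = x
    for W, b in layers:
        a = layer_forward(W, b, a)
    return a

# Hypothetical hand-picked weights for a tiny 2-2-1 network.
layers = [
    ([[1.0, -1.0], [0.5, 0.5]], [0.0, -0.5]),  # layer 1: 2 inputs -> 2 neurons
    ([[1.0, 1.0]], [0.0]),                     # layer 2: 2 neurons -> 1 output
]
y = network_forward(layers, [1.0, 2.0])
```

In practice the parameters are not hand-picked like this; Steps 2 and 3 (cost function and optimization) find them from data.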


Three Steps for Deep Learning (recap)

Step 1: Neural Network → Step 2: Cost Function → Step 3: Optimization

 

 

 

Cost Function

Function set: including all different w and b.

(Figure: network outputs for class 1 and class 2, each plotted against z around z = 0.)


Cost Function

Sigmoid function: σ(z) = 1 / (1 + e^(−z)), which squashes any z into the range (0, 1).


 

 

 

 

 

Cost Function - Cross Entropy

Cross entropy between the network output y and the target ŷ: C(y, ŷ) = −[ŷ ln y + (1 − ŷ) ln(1 − y)]
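Binary cross entropy over a sigmoid output can be sketched as below (the function names and the z values of ±4 are illustrative assumptions): a confident correct prediction costs almost nothing, a confident wrong one costs a lot.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy(y, y_hat):
    """C(y, y_hat) = -[y_hat*ln(y) + (1-y_hat)*ln(1-y)];
    y is the network output in (0, 1), y_hat is the 0/1 target."""
    return -(y_hat * math.log(y) + (1.0 - y_hat) * math.log(1.0 - y))

low = cross_entropy(sigmoid(4.0), 1)    # output ~0.982 for target 1: small loss
high = cross_entropy(sigmoid(-4.0), 1)  # output ~0.018 for target 1: large loss
```

This asymmetry is what makes cross entropy a good cost for classification: the loss, and hence the gradient, grows as the prediction gets more confidently wrong.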


Cross Entropy vs. Square Error

(Figure: total loss surface over parameters w1 and w2 — the cross-entropy surface has much steeper gradients away from the minimum than the square-error surface, so gradient descent makes progress more easily.)

Three Steps for Deep Learning (recap)

Step 1: Neural Network → Step 2: Cost Function → Step 3: Optimization

Gradient Descent

  • Consider a loss function 𝐿(𝑤) with one parameter w
  • (Randomly) pick an initial value w0
  • Compute the slope dL/dw at w0: a negative slope means increase w, a positive slope means decrease w


Gradient Descent

  • Consider a loss function 𝐿(𝑤) with one parameter w
  • (Randomly) pick an initial value w0
  • Update the parameter: w1 = w0 − η dL/dw (evaluated at w0)
  • η is called the “learning rate”

Gradient Descent

  • Consider a loss function 𝐿(𝑤) with one parameter w
  • (Randomly) pick an initial value w0
  • Repeat the update for many iterations: w0 → w1 → w2 → … → wT
  • Gradient descent may stop at a local minimum rather than the global minimum
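The update rule above can be sketched as a short loop. The loss L(w) = (w − 3)² is a hypothetical example with a single minimum at w = 3, so there is no local-minimum issue here; the learning rate and iteration count are likewise illustrative.

```python
def gradient_descent(grad, w0, eta=0.1, iters=100):
    """Repeat w <- w - eta * dL/dw for a fixed number of iterations."""
    w = w0
    for _ in range(iters):
        w = w - eta * grad(w)
    return w

# L(w) = (w - 3)^2 has its minimum at w = 3, and dL/dw = 2*(w - 3).
w_star = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

With η = 0.1 the iterates shrink the distance to the minimum by a factor of 0.8 per step; too large an η would instead make them diverge.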


Backpropagation – Summary

Forward pass: starting from the input x, compute every neuron's output layer by layer.

Backward pass: starting from the output, propagate the loss derivatives backward layer by layer.

Combining the two passes gives ∂L/∂w for all w.
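For a single sigmoid neuron with squared-error loss (a hypothetical minimal case, not the slides' full network), the two passes and a finite-difference sanity check look like this:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    """Forward pass: compute and keep the intermediate values z and a."""
    z = w * x + b
    a = sigmoid(z)
    return z, a

def backward(w, b, x, y):
    """Backward pass for L = (a - y)^2 with a = sigmoid(w*x + b).
    Chain rule: dL/dw = dL/da * da/dz * dz/dw."""
    _, a = forward(w, b, x)
    dL_da = 2.0 * (a - y)
    da_dz = a * (1.0 - a)                      # derivative of the sigmoid
    return dL_da * da_dz * x, dL_da * da_dz    # (dL/dw, dL/db)

w, b, x, y = 0.5, 0.1, 2.0, 1.0
gw, gb = backward(w, b, x, y)

# Sanity check: a central finite difference should match the analytic dL/dw.
eps = 1e-6
loss = lambda w_: (forward(w_, b, x)[1] - y) ** 2
num_gw = (loss(w + eps) - loss(w - eps)) / (2.0 * eps)
```

In a multi-layer network the same chain rule is applied layer by layer, reusing the values cached in the forward pass.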


Convolutional Neural Networks


Recurrent Neural Network

  • The same function f is applied at every time step: (h^t, y^t) = f(h^(t−1), x^t)

(Figure: f unrolled over time: h0, x1 → h1, y1; h1, x2 → h2, y2; h2, x3 → h3, y3; ……)

No matter how long the input/output sequence is, we only need one function f.

h and h' are vectors with the same dimension.
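A minimal sketch of such an f, assuming one common parameterization h' = tanh(W_h h + W_x x), y = W_y h' (the weight values and dimensions below are hypothetical):

```python
import math

def rnn_step(W_h, W_x, W_y, h, x):
    """The function f: (h, x) -> (h', y), with h' = tanh(W_h h + W_x x) and y = W_y h'."""
    h_new = [math.tanh(sum(a * hj for a, hj in zip(row_h, h)) +
                       sum(c * xj for c, xj in zip(row_x, x)))
             for row_h, row_x in zip(W_h, W_x)]
    y = [sum(a * hj for a, hj in zip(row, h_new)) for row in W_y]
    return h_new, y

# Hypothetical weights: 2-d hidden state, 1-d input and output.
W_h = [[0.5, 0.0], [0.0, 0.5]]
W_x = [[1.0], [-1.0]]
W_y = [[1.0, 1.0]]

h = [0.0, 0.0]                      # h0
outputs = []
for x in ([1.0], [0.5], [-1.0]):    # x1, x2, x3: the same f is reused at every step
    h, y = rnn_step(W_h, W_x, W_y, h, x)
    outputs.append(y[0])
```

The loop runs for as many steps as the sequence has elements, yet the weights (and hence f) never change: that is why one f handles sequences of any length.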


Auto Encoder


Summary

  • Concepts of machine learning and deep learning;

  • Three steps of deep learning;

  • Types of deep neural networks.