[Description] This worksheet demonstrates the simplest supervised learning model: linear regression trained with gradient descent.

Linear Regression Matrix Operations Practice Worksheet

[Learning Objectives]
1. Understand linear regression with matrix-form input data and weights
2. Compute the forward pass via matrix multiplication
3. Derive the gradient of the MSE loss (matrix differentiation)
4. Update the weight matrix via backpropagation

[Network Structure]
• Input matrix X: 10 samples × 8 features (10×8 matrix)
• Weight matrix W: 8 features × 1 output (8×1 vector)
• Bias b: scalar (broadcast equally to all samples)
• Output Y_pred: 10 samples × 1 output (10×1 vector)
• Loss function: MSE (Mean Squared Error)

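The shapes listed above can be checked with a few lines of NumPy (the values here are placeholders, not worksheet data):

```python
import numpy as np

X = np.ones((10, 8))   # input: 10 samples × 8 features
W = np.ones((8, 1))    # weights: 8 features × 1 output
b = 0.5                # scalar bias, broadcast over all 10 rows
Y_pred = X @ W + b     # (10×8)·(8×1) → (10×1)
print(Y_pred.shape)    # (10, 1); each entry is 8·1 + 0.5 = 8.5
```

Broadcasting lets the scalar b be added to every row of the 10×1 product without building a 10×1 bias vector by hand.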
[Mathematical Expression]
Forward:  Y_pred = X · W + b                      (matrix multiplication + broadcasting)
Loss:     L = (1/2n) × Σ(y_pred - y_true)²
Gradient: ∂L/∂W = (1/n) × X^T · (Y_pred - Y_true)
          ∂L/∂b = (1/n) × Σ(y_pred - y_true)
Update:   W_new = W - lr × ∂L/∂W
          b_new = b - lr × ∂L/∂b
Note: the 1/2 in the loss cancels the factor of 2 from differentiating the square, which is why the gradients carry 1/n rather than 2/n.

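The formulas above amount to one gradient-descent step. A minimal NumPy sketch, where the data values, random seed, and learning rate are placeholders chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 8))        # 10 samples × 8 features
Y_true = rng.normal(size=(10, 1))   # 10×1 target vector
W = np.zeros((8, 1))                # 8×1 weight vector
b = 0.0                             # scalar bias
lr = 0.1                            # learning rate (assumed value)

n = X.shape[0]
Y_pred = X @ W + b                                 # forward: (10×8)·(8×1) + broadcast b
loss = np.sum((Y_pred - Y_true) ** 2) / (2 * n)    # L = (1/2n) Σ(y_pred - y_true)²
dW = X.T @ (Y_pred - Y_true) / n                   # ∂L/∂W = (1/n) Xᵀ·(Y_pred - Y_true)
db = np.sum(Y_pred - Y_true) / n                   # ∂L/∂b = (1/n) Σ(y_pred - y_true)
W = W - lr * dW                                    # update step
b = b - lr * db
```

With a sufficiently small learning rate, the loss recomputed after the update is lower than before it, which is a useful sanity check when filling in the worksheet by hand.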
[Color Legend]
■ Input matrix (X)
■ Weight (W)
■ Output/Prediction (Y_pred)
■ Gradient
■ Student input cell
■ Correct answer
■ Wrong answer

[How to Use]
1. Enter your calculation results in the yellow cells of the Forward_Propagation sheet
2. Verification cells show ✓ (correct) or ✗ (wrong)
3. Tolerance: 0.0001 (answers are checked to 4 decimal places)
4. Hint: matrix multiplication is computed in 'row × column' order: each output entry is the dot product of a row of X with a column of W
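The verification rule in steps 2 and 3 can be sketched as a small Python check (the function name `check` is mine, not part of the worksheet):

```python
def check(student_value, answer, tol=1e-4):
    """Return the worksheet's mark: ✓ if within tolerance 0.0001, else ✗."""
    return "✓" if abs(student_value - answer) <= tol else "✗"

print(check(8.50004, 8.5))  # ✓ (difference 0.00004 is within tolerance)
print(check(8.51, 8.5))     # ✗ (difference 0.01 exceeds tolerance)
```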