Session 4d
ACM AI + ACM TeachLA
Slides Link: https://teachla.uclaacm.com/resources
Linear Regression
w/ TensorFlow
while z != func_min:  # when do we want to stop?
    x = x - calculate_step(0.25, calculate_gradient(x))  # step x against the gradient at the old x (subtract to go downhill)
    y = y - calculate_step(0.25, calculate_gradient(y))  # basically the same for y!
    z = f(x, y)  # now calculate our new z!
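The pseudocode above can be run for real once we pick a concrete function. A minimal sketch, assuming f(x, y) = x² + y² (minimum 0 at the origin), a fixed learning rate of 0.25, and a tolerance-based stopping rule instead of exact equality (in practice z never lands exactly on the minimum):

```python
def f(x, y):
    return x**2 + y**2

def gradient(x, y):
    # partial derivatives of f: (df/dx, df/dy)
    return 2 * x, 2 * y

def gradient_descent(x, y, learning_rate=0.25, tolerance=1e-10):
    z = f(x, y)
    while z > tolerance:  # stop once z is close enough to the minimum
        gx, gy = gradient(x, y)
        x = x - learning_rate * gx  # step against the gradient at the old x
        y = y - learning_rate * gy  # same for y
        z = f(x, y)                 # now calculate our new z
    return x, y, z

x, y, z = gradient_descent(3.0, -2.0)
```

Each iteration halves x and y here (x − 0.25·2x = 0.5x), so the loop converges quickly to the origin.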
Homework Recap
Recap: ML Math
How do we Multiply Matrices?
[Figure: worked example of multiplying a matrix by a column vector]
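As a concrete illustration of the row-times-column rule (these numbers are reconstructed from the slide export and may not match the original slide exactly):

```python
import numpy as np

A = np.array([[1, 3],
              [6, 0],
              [4, 3]])   # 3-by-2 matrix
x = np.array([[2],
              [2]])      # 2-by-1 column vector

# Each entry of the result is the dot product of a row of A with x:
# row 1: 1*2 + 3*2 = 8, row 2: 6*2 + 0*2 = 12, row 3: 4*2 + 3*2 = 14
result = A @ x           # 3-by-1 column vector: [[8], [12], [14]]
```

Note the shape rule: (3-by-2) times (2-by-1) gives (3-by-1); the inner dimensions must match.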
Today: TensorFlow
What is a Computation Graph?
IMPORTANT: scalars, row vectors, column vectors, and matrices are ALL tensors. A scalar is a single-entry row vector, which is a 1-by-1 matrix, which is a 2D tensor. For linear regression, we will only use matrices. We will use 3D+ tensors when we get to convolutional neural networks.
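In numpy (the same shape conventions carry over to TensorFlow tensors), the hierarchy above looks like this; the specific values are placeholders:

```python
import numpy as np

s = np.array([[5]])            # a scalar as a 1-by-1 matrix (2D tensor)
row = np.array([[1, 2, 3]])    # 1-by-3 row vector (2D tensor)
col = np.array([[1], [2]])     # 2-by-1 column vector (2D tensor)
M = np.ones((2, 3))            # 2-by-3 matrix (2D tensor)
imgs = np.zeros((10, 28, 28))  # 3D tensor, e.g. a stack of images (used later for CNNs)
```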
Linear Regression Computation Graph
Z = W * X
X: n-by-m tensor, W: 1-by-n tensor, Z: 1-by-m tensor
Ypred = Z + b
b: 1-by-1 tensor (scalar), Ypred: 1-by-m tensor
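A quick shape check of this graph in numpy, assuming n = 3 features and m = 5 training examples (the values are random placeholders):

```python
import numpy as np

n, m = 3, 5
X = np.random.rand(n, m)  # n-by-m: one column per training example
W = np.random.rand(1, n)  # 1-by-n: one weight per feature
b = np.random.rand(1, 1)  # 1-by-1: the bias (a scalar)

Z = W @ X                 # (1-by-n) @ (n-by-m) -> 1-by-m
Ypred = Z + b             # broadcasting adds b to every entry; still 1-by-m
```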
What is TensorFlow?
TensorFlow Steps
TensorFlow Code Walkthrough
Import the TensorFlow library, using a name alias so we don’t have to type out the whole name every time
Clears the default graph stack and resets the global default graph, so each run starts from a fresh graph.
Placeholders: values we will feed in when we run our session
TensorFlow Code Walkthrough
Variables: model parameters
Build computation graph
Specify loss metric and optimizer
TensorFlow Code Walkthrough
Print MSE on training set every 50 iterations -- should only decrease!
Train for 1000 iterations by running session 1000 times and using optimizer to adjust W and b so that MSE is minimized
Initialize global variables in computation graph
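Putting the walkthrough together, here is a minimal end-to-end sketch. The slides use TensorFlow 1.x session-style code; this version assumes TensorFlow 2.x with the `tf.compat.v1` compatibility module to get the same API. The toy data (y = 3x + 1) and learning rate are made up for illustration:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # assumption: TF 2.x, using the 1.x-style API
tf.disable_eager_execution()
tf.reset_default_graph()           # clear the default graph stack

# Toy data: y = 3x + 1, in the 1-by-m shape from the slides
x_train = np.linspace(0, 1, 20).reshape(1, -1).astype(np.float32)
y_train = 3 * x_train + 1

# Placeholders: values we feed in when we run our session
X = tf.placeholder(tf.float32, shape=[1, None])
Y = tf.placeholder(tf.float32, shape=[1, None])

# Variables: model parameters
W = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1, 1]))

# Build the computation graph
Z = tf.matmul(W, X)     # Z = W * X
Ypred = Z + b           # Ypred = Z + b

# Specify the loss metric and optimizer
mse = tf.reduce_mean(tf.square(Ypred - Y))
train_op = tf.train.GradientDescentOptimizer(0.25).minimize(mse)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize W and b
    for i in range(1000):                        # train for 1000 iterations
        _, loss_val = sess.run([train_op, mse],
                               feed_dict={X: x_train, Y: y_train})
        if i % 50 == 0:
            print("iteration {}: MSE = {:.4f}".format(i, loss_val))
```

Each `sess.run` of `train_op` takes one gradient-descent step on W and b, so the printed MSE should only decrease.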
Homework!
Submission Instructions
Thanks!
ACM AI