
FRC Programming

Machine Learning

December 11, 2024

FRC#2022 TITAN ROBOTICS


Attendance


Colab


A few differences

Instead of running each iteration on the whole dataset, run it on a small random subset of data points, called a batch

This is useful because it makes each training step faster, especially when there is a lot of data

When the gradient is computed on a batch rather than the full dataset, the method is called stochastic gradient descent (SGD) instead of plain gradient descent (see the sketch after this list)
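
To make the batching idea concrete, here is a minimal sketch of mini-batch SGD on a one-variable linear model. The data, learning rate, and batch size are illustrative assumptions, not values from the slides.

```python
import numpy as np

# Synthetic data for y = 3x + 2 plus noise (illustrative, not from the slides)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=1000)
y = 3.0 * X + 2.0 + rng.normal(scale=0.1, size=1000)

w, b = 0.0, 0.0   # parameters of the model y_hat = w * x + b
lr = 0.1          # learning rate (assumed value)
batch_size = 32   # each step only looks at this many points

for epoch in range(20):
    order = rng.permutation(len(X))          # new batch order every epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx], y[idx]

        # Gradient of mean squared error, computed on the batch only
        err = (w * xb + b) - yb
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)

        # Ordinary gradient descent update; "stochastic" because the
        # gradient comes from a random batch, not the whole dataset
        w -= lr * grad_w
        b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")  # should end up near 3 and 2
```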


Convolutional Neural Networks

Same idea as a regular neural network, but it needs fewer parameters and works better on images

The learned parameters form a filter, or kernel, which is slid across the image

This can be repeated in layers, with an activation function after each layer (a sketch of a single convolution follows below)
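
As a sketch of what one convolution does, the loop below slides a 3x3 kernel over a small image and takes a dot product at each position. The image and kernel values are made up for illustration; in a real CNN the kernel entries are the learned parameters.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image, taking a dot product at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]   # window under the kernel
            out[i, j] = np.sum(patch * kernel)  # one output value
    return out

image = np.arange(25, dtype=float).reshape(5, 5)  # toy 5x5 "image"
# A hand-picked vertical-edge kernel; in a CNN these 9 numbers are learned,
# which is why a convolutional layer has so few parameters
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])
feature_map = conv2d(image, kernel)
print(feature_map)  # 3x3 output; an activation like ReLU would be applied next
```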


Overfitting

Sometimes the model picks up noise in the training data instead of the real pattern, which is called overfitting

To tell how well a model is actually doing, the data is split into a training part and a validation part: the model is fit on the training data and scored on the held-out validation data (see the sketch below)
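
A minimal sketch of a train/validation split in plain NumPy; the 80/20 ratio and the data are assumptions for illustration. Libraries such as scikit-learn provide train_test_split for the same purpose.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 examples, 3 features (made up)
y = rng.integers(0, 2, size=100)       # binary labels (made up)

# Shuffle the indices, then hold out 20% of the data for validation
order = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, val_idx = order[:split], order[split:]

X_train, y_train = X[train_idx], y[train_idx]
X_val, y_val = X[val_idx], y[val_idx]

# The model is fit only on the training set; because the validation set is
# never used for fitting, a gap between training and validation scores is
# the signal that the model has started memorizing noise
print(len(X_train), "training examples,", len(X_val), "validation examples")
```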