FRC Programming
Machine Learning
2024 December 11
FRC#2022 TITAN ROBOTICS
A few differences
Instead of running each iteration on the whole dataset, run it on a small random subset of the data points - a batch
Useful because each update is much cheaper to compute, so training goes faster, especially when we have a lot of data
With this change, the algorithm is called stochastic gradient descent (SGD) instead of gradient descent (see the sketch below)
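A minimal NumPy sketch of the idea on a made-up linear-regression problem; the data, learning rate, and batch size here are illustrative assumptions, not part of the lesson:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 1 plus noise (made up for illustration)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 1 + 0.1 * rng.normal(size=200)

w, b = 0.0, 0.0           # parameters to learn
lr, batch_size = 0.1, 16  # assumed hyperparameters

for epoch in range(50):
    # Shuffle, then step through the data one small batch at a time
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]
        err = w * xb + b - yb
        # Gradient of mean squared error over just this batch
        w -= lr * 2 * (err @ xb) / len(idx)
        b -= lr * 2 * err.mean()

print(w, b)  # should land near 3 and 1
```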
Convolutional Neural Networks
Same idea as a neural network, but it requires fewer parameters and works better on images
The learned parameters form a filter, or kernel, which is slid across the image
This can be repeated in layers, with an activation function after each layer (see the sketch below)
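A small NumPy sketch of one convolutional layer; the 8x8 "image" and the vertical-edge kernel values are made up for illustration:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image` (no padding, stride 1)."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # One output value = weighted sum of one image patch
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.random.default_rng(0).random((8, 8))  # made-up 8x8 "image"
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]])  # a common vertical-edge filter

# One layer = convolution followed by an activation function (ReLU here)
layer1 = np.maximum(conv2d(image, kernel), 0)
print(layer1.shape)  # (6, 6): one output per position of the 3x3 filter
```

Note the parameter count: the 3x3 kernel has 9 weights no matter how big the image is, whereas a fully connected layer on an 8x8 image would need 64 weights per output.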
Overfitting
Sometimes the model picks up noise in the training data instead of the real pattern
To tell how well a model is actually doing, the data is split into a training set and a validation set (example below)
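A minimal sketch of that split, with made-up features and labels and an assumed 80/20 ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 2))     # made-up features
y = rng.integers(0, 2, 100)  # made-up labels

# Hold out 20% of the data; the model never trains on it
order = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, val_idx = order[:split], order[split:]

X_train, y_train = X[train_idx], y[train_idx]
X_val, y_val = X[val_idx], y[val_idx]

print(len(X_train), len(X_val))  # 80 training points, 20 held out

# Train on (X_train, y_train); score on (X_val, y_val).
# A big gap between training and validation scores signals overfitting.
```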