Event Registration - Google Talk
Event: Industry Talk from Dr Benoit Dherin (Google) and Dr David Barrett (DeepMind)
Event Timing: 18th February 2021 at 16:00
To view and/or participate, please register here and you will be sent a link in advance of the event.

Not a member? Feel free to come along anyway, or find out how to join here: https://sites.google.com/view/siam-ima-dublin/become-a-member

Title: Implicit Gradient Regularization

Abstract: Large deep neural networks used in modern supervised learning have a large submanifold of interpolating solutions, most of which generalize poorly. However, it has been observed experimentally that gradient descent tends to converge in the vicinity of flat interpolating solutions, producing trained models that generalize well to new data points, and increasingly so as the learning rate grows. Using backward error analysis, we will show that gradient descent actually follows the exact gradient flow of a modified loss surface, which can be described by a regularized loss that prefers optimization paths with shallow slopes, and in which the learning rate plays the role of a regularization rate.
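The backward-error-analysis claim in the abstract can be illustrated numerically. A minimal sketch, assuming the leading-order modified loss from the associated paper, L̃(θ) = L(θ) + (h/4)‖∇L(θ)‖² with learning rate h (the abstract itself does not state the formula): on a toy 1D quadratic loss L(θ) = aθ²/2, one gradient descent step should track the exact gradient flow of L̃ more closely (third order in h) than the flow of the original L (second order). The quadratic example and constants below are illustrative, not from the talk.

```python
import math

def gd_step(theta, a, h):
    """One explicit gradient descent step on L(theta) = a * theta**2 / 2."""
    return theta - h * a * theta

def flow(theta, rate, t):
    """Exact solution at time t of the linear gradient flow theta' = -rate * theta."""
    return theta * math.exp(-rate * t)

theta0, a = 1.0, 2.0
for h in [0.1, 0.05, 0.025]:
    gd = gd_step(theta0, a, h)
    # Flow of the original loss L: rate a.
    err_orig = abs(gd - flow(theta0, a, h))
    # Flow of the modified loss L + (h/4)*(dL/dtheta)**2: rate a + h*a**2/2.
    err_mod = abs(gd - flow(theta0, a + h * a * a / 2, h))
    print(f"h={h}: error vs original flow {err_orig:.2e}, vs modified flow {err_mod:.2e}")
```

Halving h cuts the error against the original flow by roughly 4x but the error against the modified flow by roughly 8x, consistent with gradient descent following the modified loss surface to one order higher in the learning rate.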
Name *
Email *