Danbury AI
@AndrewJRibeiro | AndrewRib.com
Art has always been cherished as the most expressive and human of productions. The idea that a computer, a logical machine, could create the quintessential human artifact strikes some as preposterous. As anyone who has engaged in the artistic process will tell you, much of art is based on emotion, not logical rules. In this talk we will discuss the connectionist history leading to convolutional networks and their application in style transfer. I hope the topics herein demonstrate that machine learning is a dramatic departure from rule-based computing and that it does mimic intelligent behavior.
Overview
Origin of Neural Networks
The Connectionist Timeline
“Either the universe is composable or God exists.”
-I heard Yann LeCun paraphrase this quote
*An incomplete history
Mark 1 Perceptron
Frank Rosenblatt
The Neuron
Biological Inspiration
Harbingers of the AI Winter
Neural Network Basics
Regularization coefficient
Regularization
Cost function
K Classes - Multi-Class Classification
Essentially Multinomial Logistic Regression
Hypothesis Parameterized by Theta ( Feature Weights )
Learning as an optimization problem: find the Theta that minimizes J( Theta )
Hypothesis function: A linear combination of the bias and weights on features.
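The hypothesis on this slide can be sketched in a few lines; a minimal illustration (the names here are illustrative, not from the slides):

```python
import numpy as np

# Minimal sketch of the hypothesis: a linear combination of a bias
# term and per-feature weights.
def hypothesis(theta, x):
    # theta[0] is the bias; theta[1:] are the feature weights.
    return theta[0] + np.dot(theta[1:], x)

theta = np.array([0.5, 2.0, -1.0])   # bias + two weights
x = np.array([3.0, 4.0])
print(hypothesis(theta, x))          # 0.5 + 2*3 - 1*4 = 2.5
```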
Convolutions
Key Question: Why do ConvNets give us better accuracy for visual object recognition over the standard MLP?
Key Point: Convolutions, in the form we are interested in for ConvNets, compute new values of a matrix based on surrounding values. This gives us a means of producing higher abstractions of local structure.
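The key point above can be sketched directly: each output value is computed from the 3x3 neighbourhood of input values around it. A minimal NumPy sketch (note that, as in most deep learning libraries, this is technically cross-correlation, i.e. the kernel is not flipped):

```python
import numpy as np

# Each output value is a weighted sum of the kernel-sized neighbourhood
# of the input ("valid" mode, so a 3x3 kernel shrinks the output by 2
# in each dimension).
def conv2d(A, K):
    kh, kw = K.shape
    oh, ow = A.shape[0] - kh + 1, A.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(A[i:i+kh, j:j+kw] * K)
    return out

A = np.arange(16, dtype=float).reshape(4, 4)
K = np.ones((3, 3)) / 9.0            # simple averaging kernel
print(conv2d(A, K))                  # [[5. 6.] [9. 10.]]
```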
“In order to make decisions, or make sense of things, we must reduce the input feature space to a lower dimension.”
Try performing image recognition with the atoms of the objects as the features.
Sobel Operator
3x3 kernel convolutions that approximate the image derivatives -- the x and y changes. Let A be the source image and Gx, Gy be the x and y derivative approximations.
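The kernels referred to above are the standard Sobel operator (where * denotes 2-D convolution):

```latex
\mathbf{G}_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} * \mathbf{A}
\qquad
\mathbf{G}_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{bmatrix} * \mathbf{A}
\qquad
\mathbf{G} = \sqrt{\mathbf{G}_x^2 + \mathbf{G}_y^2}
```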
Convolutional Neural Networks ( CNN )
A lot of the material on this slide is adapted from the deep learning book: http://www.deeplearningbook.org/
Pooling and Feature Maps
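Pooling can be illustrated with a small sketch; here is 2x2 max pooling with stride 2 (a toy feature map, not real network activations):

```python
import numpy as np

# 2x2 max pooling, stride 2: each output value is the maximum of a
# non-overlapping 2x2 window of the feature map.
def max_pool(F, size=2):
    h, w = F.shape
    out = F[:h - h % size, :w - w % size]
    out = out.reshape(h // size, size, w // size, size)
    return out.max(axis=(1, 3))

F = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.],
              [9., 1., 2., 3.],
              [0., 5., 4., 4.]])
print(max_pool(F))                   # [[4. 8.] [9. 4.]]
```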
Very Deep Convolutional Networks ( VGG )
STYLE VS. CONTENT
THE BIG IDEA
Figure out how to do this for different domains, and you will be the king/queen of ML.
Style Transfer Predecessors
Image Quilting
Image Quilting Vs. Neural Style Transfer
Style
Content
Image Quilting
Neural
Note: I took a screenshot from the paper to get the style, content, and image quilting result images. They were probably scaled down in the paper, so we didn't get great results, but it's still illustrative.
content loss: 1.22706e+06
style loss: 6.59507e+05
total loss: 1.89246e+06
A Neural Algorithm of Artistic Style
This is the composite loss function, consisting of the loss function for style and content, which we minimise to get our wonderful style transfer result.
Where:
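The composite loss from the paper is

```latex
\mathcal{L}_{total}(\vec{p}, \vec{a}, \vec{x})
  = \alpha \, \mathcal{L}_{content}(\vec{p}, \vec{x})
  + \beta \, \mathcal{L}_{style}(\vec{a}, \vec{x})
```

with p the photograph supplying content, a the artwork supplying style, x the image being generated, and α, β the weighting factors trading content fidelity against style fidelity.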
I posit: creativity is optimization with competing constraints
Vocabulary and Concepts
“Extracting correlations between neurons is a biologically plausible computation that is, for example, implemented by so-called complex cells in the primary visual system”
We can visualise the information at different processing stages in the CNN by reconstructing the input image from only knowing the network's responses in a particular layer. We reconstruct the input image from layers 'conv1_1' (a), 'conv2_1' (b), 'conv3_1' (c), 'conv4_1' (d) and 'conv5_1' (e) of the original VGG-Network.
Algorithm Overview
Mixing Style and Content
Methods
Formulae at a Glance
Key Definitions
Content Representation
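The content loss from the paper, for a chosen layer l:

```latex
\mathcal{L}_{content}(\vec{p}, \vec{x}, l)
  = \frac{1}{2} \sum_{i,j} \left( F^{l}_{ij} - P^{l}_{ij} \right)^2
```

where F^l and P^l are the layer-l feature maps of the generated image x and the content photograph p, with i indexing filters and j indexing positions.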
Style Representation
a is the original image and x is the image to be generated. A^l and G^l are the style representations in layer l for a and x respectively.
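The style representation in a layer is the Gram matrix of its feature maps: G^l_ij is the inner product between the vectorised activations of filters i and j. A minimal sketch (the feature map here is random stand-in data, not real VGG activations):

```python
import numpy as np

# F has shape (num_filters, num_positions): each row is one vectorised
# feature map. The Gram matrix captures which filters co-activate,
# discarding spatial arrangement.
def gram_matrix(F):
    return F @ F.T

rng = np.random.default_rng(0)
F = rng.normal(size=(64, 32 * 32))   # e.g. 64 filters on a 32x32 map
G = gram_matrix(F)
print(G.shape)                       # (64, 64)
```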
Style Representation ( 2 )
See the paper for a discussion of the per-layer weighting factors w_l that were experimentally found to work well.
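The per-layer style loss and the total style loss from the paper, using the w_l weighting factors:

```latex
E_l = \frac{1}{4 N_l^2 M_l^2} \sum_{i,j} \left( G^{l}_{ij} - A^{l}_{ij} \right)^2
\qquad
\mathcal{L}_{style}(\vec{a}, \vec{x}) = \sum_{l=0}^{L} w_l \, E_l
```

where N_l is the number of filters in layer l, M_l is the size of each feature map, and G^l, A^l are the Gram matrices of the generated and style images.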
Style Representation ( 3 )
Mixing Style and Content
Where:
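The mixing step is just gradient descent on the generated image under the weighted sum of the two losses. A toy, self-contained sketch of that loop (assumptions: the "feature extractor" is a fixed random linear map rather than a CNN, and the gradient is numerical; this only illustrates the optimization structure, not the real algorithm's scale):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))          # stand-in "feature extractor"

def features(x):
    return W @ x

def gram(f):
    f = f.reshape(4, 2)               # 4 "filters", 2 "positions"
    return f @ f.T

p = rng.normal(size=16)               # content image
a = rng.normal(size=16)               # style image
x = rng.normal(size=16)               # image being generated

alpha, beta = 1.0, 1e-3               # content vs. style weights

def total_loss(x):
    content = 0.5 * np.sum((features(x) - features(p)) ** 2)
    style = 0.25 * np.sum((gram(features(x)) - gram(features(a))) ** 2)
    return alpha * content + beta * style

# Central-difference numerical gradient (fine at this toy scale).
def num_grad(f, x, eps=1e-5):
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

loss_before = total_loss(x)
for _ in range(200):
    x -= 0.01 * num_grad(total_loss, x)
loss_after = total_loss(x)
print(loss_before, "->", loss_after)
```

The real algorithm replaces W with VGG activations at several layers and uses backpropagation for the gradient, but the structure is the same: the pixels of x are the optimization variables.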
Implementations
RESULTS
Look at the links
Discussion & Wrap Up
Thanks for coming!
Questions for you:
Questions for me?
By: Andrew Ribeiro of Knowledge-Exploration Systems
@kexpsocial
https://github.com/k-exp
WWW.KEXP.IO
Andrewnetwork@gmail.com
@AndrewJRibeiro
https://github.com/Andrewnetwork
AndrewRib.com
I REALLY WANT A TALK ON
Generative Adversarial Networks
Interested in giving one? Co-Author one with us?
Resources and Sources
Papers
BEST Machine Learning Book:
http://www.deeplearningbook.org/
Unused Slides
Google Deep Dream
The Connectionist Ideas of Leibniz
TBA
Wavenet