1 of 11

Intro to Tensorflow

Ashish Gaurav

2 of 11

Agenda

  • Tensorflow 1 vs Tensorflow 2
    • We'll be using Tensorflow 1 (low-level API)
  • Data types (dtypes)
  • Computation Graphs
  • Constants, Variables, Tensors
  • Tensorflow vs Numpy
  • Sessions
  • Neural Networks
  • Model to classify handwritten digits (MNIST)

3 of 11

Tensorflow 1 vs Tensorflow 2

https://www.tensorflow.org/guide/effective_tf2
https://www.tensorflow.org/guide/migrate

What changed?

  • Eager execution by default
  • Variables with no remaining references are garbage-collected (no hidden global collections)
  • session.run(f(placeholder), feed_dict = {placeholder: input}) is replaced by the direct function call f(input)
  • Tighter integration with Keras (tf.keras)
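The API change in the third bullet can be sketched side by side. This is a minimal sketch, assuming TensorFlow 2.x is installed; the TF1-style half goes through the tf.compat.v1 shim, which is not part of the original slides.

```python
import tensorflow as tf

v1 = tf.compat.v1  # TF1-style API via the compat shim

# TF1 style: build a graph with a placeholder, then feed inputs via a session.
graph = tf.Graph()
with graph.as_default():
    x = v1.placeholder(tf.float32, shape=())
    y = x * 2.0
with v1.Session(graph=graph) as sess:
    result_v1 = sess.run(y, feed_dict={x: 3.0})

# TF2 style: eager execution; f(input) is just a function call.
def f(x):
    return x * 2.0

result_v2 = float(f(tf.constant(3.0)))

assert result_v1 == result_v2 == 6.0
```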

4 of 11

Data types (dtypes)

Following are the most common dtypes (full list: https://www.tensorflow.org/api_docs/python/tf/dtypes/DType):

  • float16/32/64
  • int8/16/32/64, uint8/16/32/64
  • bool
  • string
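A quick sketch of these dtypes in use (assuming TensorFlow and NumPy are installed); TF dtypes also map onto NumPy dtypes:

```python
import numpy as np
import tensorflow as tf

# Constants carry an explicit or inferred dtype.
a = tf.constant(1.5, dtype=tf.float32)
b = tf.constant(2, dtype=tf.int64)
c = tf.constant(True, dtype=tf.bool)
d = tf.constant("mnist", dtype=tf.string)

assert a.dtype == tf.float32
assert b.dtype == tf.int64
# Each TF dtype has a corresponding NumPy dtype:
assert tf.int64.as_numpy_dtype == np.int64
```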

5 of 11

Computation Graphs

https://www.tensorflow.org/api_docs/python/tf/Graph

  • Graphs consist of operations and tensors
  • Inputs can be specified through placeholders
  • Given an input, the output can be obtained through session.run(...)
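The three bullets above can be sketched in a few lines (a sketch using the tf.compat.v1 shim, so it also runs under a TF2 install; node names "x" and "y" are chosen for illustration):

```python
import tensorflow as tf

v1 = tf.compat.v1

graph = tf.Graph()
with graph.as_default():
    x = v1.placeholder(tf.float32, shape=(), name="x")  # input via placeholder
    y = tf.add(x, 1.0, name="y")                        # an operation node

# The graph records operations, not values:
op_names = [op.name for op in graph.get_operations()]
assert "x" in op_names and "y" in op_names

# Given an input, the output is obtained through session.run(...):
with v1.Session(graph=graph) as sess:
    out = sess.run(y, feed_dict={x: 2.0})
assert out == 3.0
```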

6 of 11

Constants, Variables, Tensors

  • Constant nodes in the graph can be made through tf.constant(...)
  • Variable nodes can be made through
    • tf.Variable(...)
    • tf.get_variable(...)
  • Variables need initializers
  • Placeholders can be made through tf.placeholder(...)
  • When any operation is done on the nodes, a Tensor output is created
    • Example of an operation: tf.add(...)
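All four node kinds in one sketch (via the tf.compat.v1 shim; the variable name "w" and the feed value are illustrative):

```python
import tensorflow as tf

v1 = tf.compat.v1

graph = tf.Graph()
with graph.as_default():
    c = tf.constant(2.0)                        # constant node
    w = v1.get_variable("w", initializer=3.0)   # variable node (needs an initializer)
    p = v1.placeholder(tf.float32, shape=())    # placeholder node
    out = tf.add(tf.add(c, w), p)               # operations return Tensor outputs

with v1.Session(graph=graph) as sess:
    sess.run(v1.global_variables_initializer())  # variables must be initialized
    result = sess.run(out, feed_dict={p: 5.0})

assert result == 10.0
```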

7 of 11

Tensorflow vs Numpy
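The slide's comparison did not survive conversion; as a sketch of the core contrast (assuming TensorFlow 2.x and NumPy are installed, TF1 style via the tf.compat.v1 shim): NumPy computes immediately, while TF1 only builds graph nodes until a session runs them.

```python
import numpy as np
import tensorflow as tf

v1 = tf.compat.v1

# NumPy: operations execute immediately and return arrays.
a_np = np.ones((2, 2))
b_np = a_np + a_np           # computed right away
assert b_np.sum() == 8.0

# TF1: the same operations only build graph nodes; values appear
# when a session runs the graph.
graph = tf.Graph()
with graph.as_default():
    a_tf = tf.ones((2, 2))
    b_tf = a_tf + a_tf       # a Tensor in the graph, not a value yet
with v1.Session(graph=graph) as sess:
    b_val = sess.run(b_tf)

assert b_val.sum() == 8.0
```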

8 of 11

Sessions

To run our graph, we need a session:

  • Sessions can be created as tf.Session(...)
  • All variables must be initialized before use
    • sess.run(tf.global_variables_initializer())
  • Any node's value can be obtained as:
    • sess.run(node)
    • sess.run(node, feed_dict = {placeholder:input})
    • sess.run([node1, node2, …], …)
  • A session can be closed with sess.close()
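The session workflow above, end to end (a sketch via the tf.compat.v1 shim; the node names and feed values are illustrative):

```python
import tensorflow as tf

v1 = tf.compat.v1

graph = tf.Graph()
with graph.as_default():
    p = v1.placeholder(tf.float32, shape=())
    w = v1.Variable(10.0)
    node1 = w + p
    node2 = w * p
    init = v1.global_variables_initializer()

sess = v1.Session(graph=graph)            # create the session
sess.run(init)                            # initialize all variables before use
a = sess.run(node1, feed_dict={p: 2.0})   # fetch a single node
a2, b = sess.run([node1, node2], feed_dict={p: 2.0})  # fetch several at once
sess.close()                              # release the session's resources

assert (a, a2, b) == (12.0, 12.0, 20.0)
```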

9 of 11

Neural Networks

Consider a neural network with one hidden layer:

10 of 11

An example:

Input x (1x5) => linear transformation xW+b => relu => hidden outputs (1x100)
hidden (1x100) => linear transformation xW+b => relu => outputs (1x10)
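The two layers above can be sketched in TF1 style (via the tf.compat.v1 shim; the variable names and default initializers are illustrative, and relu is applied to the output layer as written on the slide):

```python
import numpy as np
import tensorflow as tf

v1 = tf.compat.v1

graph = tf.Graph()
with graph.as_default():
    x = v1.placeholder(tf.float32, shape=(1, 5))
    # Layer 1: 1x5 -> 1x100
    W1 = v1.get_variable("W1", shape=(5, 100))
    b1 = v1.get_variable("b1", shape=(1, 100))
    hidden = tf.nn.relu(tf.matmul(x, W1) + b1)
    # Layer 2: 1x100 -> 1x10
    W2 = v1.get_variable("W2", shape=(100, 10))
    b2 = v1.get_variable("b2", shape=(1, 10))
    output = tf.nn.relu(tf.matmul(hidden, W2) + b2)

with v1.Session(graph=graph) as sess:
    sess.run(v1.global_variables_initializer())
    out = sess.run(output, feed_dict={x: np.ones((1, 5), np.float32)})

assert out.shape == (1, 10)
```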

11 of 11

In general,

Layer inputs (1xp) => xW+b => elementwise relu => Layer outputs (1xq)
where the weight matrix W has dims pxq and the bias b has dims 1xq
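Since this general layer is plain matrix arithmetic, a NumPy sketch suffices (the function name dense_relu and the sample p, q are illustrative):

```python
import numpy as np

def dense_relu(x, W, b):
    """One layer: (1,p) @ (p,q) + (1,q), then elementwise relu."""
    return np.maximum(x @ W + b, 0.0)

p, q = 5, 3
x = np.random.randn(1, p)
W = np.random.randn(p, q)   # weights: p x q
b = np.random.randn(1, q)   # bias: 1 x q
y = dense_relu(x, W, b)

assert y.shape == (1, q) and (y >= 0).all()
```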