Soo Kyung Kim
Department of AI
AI IoT (AI Programming) - Course Introduction
What is AI?
What is Machine Learning?
What is Deep Learning?
What is Deep Learning?: State-of-the-art performance
What is Deep Learning?
What is Deep Learning?
How is this Possible?
How is this Possible?
Modern AI
(Almost) Infinite Compute + (Almost) Infinite Data
Large Language Models (LLMs)
DALL-E 2, Imagen, Stable Diffusion
AlphaFold2
Transformer Architecture
Weekly Plan
Installing Conda
What's Tensorflow?
Install Tensorflow – Virtual environment
https://www.tensorflow.org/install
python3 -m venv venv3 # Create a Python 3 virtual environment
source ./venv3/bin/activate # Activate the virtual environment
pip install --upgrade pip
pip install tensorflow jupyterlab numpy
conda create -n myenv python=3.10
conda activate myenv
conda install tensorflow jupyterlab numpy
conda deactivate
export PATH="/Users/sookim/miniconda3/bin:$PATH"
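Whichever route you take (venv or conda), a quick sanity check confirms the packages import. This is a sketch; run it with the interpreter of the environment you just activated:

```shell
# Verify the installation inside the activated environment
python -c "import tensorflow as tf; print(tf.__version__)"
python -c "import numpy as np; print(np.__version__)"
```

If both commands print a version number, the environment is ready for the examples that follow.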
Data Flow Graphs
Data Flow Graphs
What is a Tensor?
And so on
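A tensor is an n-dimensional array of data. As a small sketch (using `tf.constant`, with illustrative values), tensors of increasing rank look like this:

```python
import tensorflow as tf

scalar = tf.constant(3)                         # rank 0: a single number
vector = tf.constant([1.0, 2.0, 3.0])           # rank 1: a 1-D array
matrix = tf.constant([[1, 2], [3, 4]])          # rank 2: a 2-D array
cube   = tf.constant([[[1], [2]], [[3], [4]]])  # rank 3: a 3-D array

# Each tensor reports its rank (number of axes) and shape
for t in (scalar, vector, matrix, cube):
    print(t.ndim, t.shape)
```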
Data Flow Graphs
import tensorflow as tf
a = tf.add(3, 5)
Why x, y?
TF automatically names the nodes when you don’t explicitly name them.
x = 3
y = 5
Data Flow Graphs
import tensorflow as tf
a = tf.add(3, 5)
Nodes: operators, variables, and constants
Edges: tensors

Tensors are data.
TensorFlow = tensor + flow
= data + flow
Data Flow Graphs
import tensorflow as tf
a = tf.add(3, 5)
print(a)
>> Tensor("Add:0", shape=(), dtype=int32)
(Not 8)
Data Flow Graphs
import tensorflow as tf
a = tf.add(3, 5)
print(a.numpy())
>> 8
More Graph
x = 2
y = 3
op1 = tf.add(x, y)
op2 = tf.multiply(x, y)
op3 = tf.pow(op2, op1)
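In TF2's eager mode these operations evaluate as soon as they are written, so the graph above can be checked step by step. A minimal sketch:

```python
import tensorflow as tf

x = 2
y = 3
op1 = tf.add(x, y)       # 2 + 3 = 5
op2 = tf.multiply(x, y)  # 2 * 3 = 6
op3 = tf.pow(op2, op1)   # 6 ** 5 = 7776

print(op3.numpy())  # 7776
```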
import tensorboard
%tensorboard --logdir logs/func
Okay! Let's go deeper now.
TensorFlow 1.x vs TensorFlow 2.x - TF1: Build a graph, then run it in a session.
conda config --set subdir osx-64 # if you are a Mac user on M1/M2 Apple silicon
conda create --name tf1.4 python=3.6
conda activate tf1.4
pip install tensorflow==1.4
pip install jupyterlab numpy
conda env remove -n tf1.4 # Remove the environment
TensorFlow 1.x vs TensorFlow 2.x - TF1: Build a graph, then run it in a session.
TensorFlow 1.x vs TensorFlow 2.x - TF2: You can check a node's value without a session.
conda create --name tf tensorflow
pip install jupyterlab numpy
TensorFlow 1.x vs TensorFlow 2.x - TF2: You can check a node's value without a session.
TensorFlow 1.x vs TensorFlow 2.x - TF2: Keras is built into TF2.
TensorFlow 1.x vs TensorFlow 2.x - TF2: You can explore layers interactively.
TF2: Three model-building styles - Sequential, Functional, Subclassing
Sequential Model
Functional Model
Subclassing Model
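The same two-layer classifier can be written in all three styles. A sketch, with layer sizes chosen to match the MNIST example later in these slides:

```python
import tensorflow as tf

# 1. Sequential: a linear stack of layers
seq = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# 2. Functional: layers called like functions on tensors
inputs = tf.keras.Input(shape=(784,))
h = tf.keras.layers.Dense(512, activation='relu')(inputs)
outputs = tf.keras.layers.Dense(10, activation='softmax')(h)
func = tf.keras.Model(inputs, outputs)

# 3. Subclassing: define your own Model class with a call() method
class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.d1 = tf.keras.layers.Dense(512, activation='relu')
        self.d2 = tf.keras.layers.Dense(10, activation='softmax')

    def call(self, x):
        return self.d2(self.d1(x))

sub = MyModel()
```

Sequential is the simplest; Functional allows multiple inputs/outputs and shared layers; Subclassing gives full control over the forward pass.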
TF2: Two training approaches - Built-in and Custom
Built-in
Custom (define your own)
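A minimal sketch contrasting the two approaches on a toy regression (the data, learning rate, and epoch counts are illustrative, not prescriptive):

```python
import tensorflow as tf

# Toy regression data: y = 2x
x = tf.constant([[1.0], [2.0], [3.0]])
y = tf.constant([[2.0], [4.0], [6.0]])

# 1) Built-in: compile() + fit() runs the training loop for you
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.SGD(0.1), loss='mse')
model.fit(x, y, epochs=10, verbose=0)

# 2) Custom: write the loop yourself with tf.GradientTape
model2 = tf.keras.Sequential([tf.keras.layers.Dense(1)])
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(100):
    with tf.GradientTape() as tape:          # record operations
        loss = loss_fn(y, model2(x))         # forward pass + loss
    grads = tape.gradient(loss, model2.trainable_variables)
    optimizer.apply_gradients(zip(grads, model2.trainable_variables))
```

The custom loop is what `fit()` does under the hood; you write it yourself when you need non-standard losses, multiple optimizers, or fine-grained control per step.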
Tensorboard
import tensorboard
%tensorboard --logdir logs/func
# Load the TensorBoard notebook extension
%load_ext tensorboard
import tensorflow as tf
import datetime

mnist = tf.keras.datasets.mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28), name='layers_flatten'),
    tf.keras.layers.Dense(512, activation='relu', name='layers_dense'),
    tf.keras.layers.Dropout(0.2, name='layers_dropout'),
    tf.keras.layers.Dense(10, activation='softmax', name='layers_dense_2')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)

model.fit(x=x_train,
          y=y_train,
          epochs=5,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])

%tensorboard --logdir logs/fit
https://www.tensorflow.org/tutorials/quickstart/beginner
Do a simple MNIST example
https://github.com/fastscience-ai/Ewha/tree/main/lab
CODE
Next Class?