1 of 107

Deep Learning with TensorFlow & Earth Engine

Nicholas Clinton, Alexandrina GV (AGV)

Developer Relations, Geo for Environment

Oct 2022 | https://goo.gle/tf2_g4g_22 | #GeoForGood22

60min recording of this session on YouTube: https://youtu.be/aiqAN1Zlhdk

2 of 107

Alexandrina GV

Developer Relations Engineer, Google Earth Outreach

Nicholas Clinton

Developer Relations Engineer, Google Earth Outreach

David Cavazos

Developer Relations Engineer, Google Cloud

👋 Hello!

3 of 107

Agenda

01. Options for training a model (5 minutes)

02. What is Deep Learning? (10 minutes)

03. Architecture components to train a model using TensorFlow (10 minutes)

04. Demo: Land cover classification model (similar to Dynamic World) (25 minutes)

05. Conclusion & resources (5 minutes)

#GeoForGood22

4 of 107

In case you missed it! 👌🏽

🐝

5 of 107

1. Options for

training a model w/

Earth Engine

5 mins


6 of 107

Options for training

(spiciness rating)

Classifiers built into Earth Engine 🌶

Google Cloud + Earth Engine 🌶🌶🌶🌶🌶🌶🌶🌶

7 of 107

Options for training

Classifiers built into Earth Engine

Google Cloud + Earth Engine

8 of 107

Start with Earth Engine classifiers

Classifiers built into Earth Engine:

Supervised (start here): CART, NaiveBayes, SVM, RandomForest

Unsupervised: Clustering

9 of 107

Options for training

Classifiers built into Earth Engine

Google Cloud + Earth Engine

10 of 107

When to build a custom model

Built-in algorithms: Supervised (start here: CART, NaiveBayes, SVM, RandomForest) and Unsupervised (Clustering)

Do-it-yourself: Custom architecture

11 of 107

When to build a custom model

Import/build models in Keras, TensorFlow, PyTorch, scikit-learn

Building a state-of-the-art model (e.g. Dynamic World land cover)

The number of training points (training dataset) is very large

12 of 107

Error: User memory limit exceeded

Limit is 1 million rows* of training points

*Less if you have lots of columns 🙂

Training data doesn't fit into one Earth Engine task

You end up looking for hacky workarounds to export the table

13 of 107

When to build a custom model

Built-in classifiers: requires knowing only basic ML concepts; built into Earth Engine; fast to use

Custom model: complex model building; longer training time; any size training data (2+ EE tasks)
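For comparison, training one of the built-in classifiers stays entirely inside Earth Engine. A minimal sketch (the composite image, labeled points, and band list below are hypothetical placeholders, not code from this deck):

import ee
ee.Initialize()

# Sample training data from a labeled FeatureCollection ('landcover' is the class property).
training = composite.select(BANDS).sampleRegions(
    collection=labeled_points, properties=['landcover'], scale=10)

# Train a built-in random forest and classify the image.
classifier = ee.Classifier.smileRandomForest(numberOfTrees=50).train(
    features=training, classProperty='landcover', inputProperties=BANDS)
classified = composite.select(BANDS).classify(classifier)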

14 of 107

Similar to how weather prediction computation evolved historically

Simulate EVERY atom in the system → just find patterns

15 of 107

2. What is Deep Learning?

10mins


16 of 107

Deep learning is a great general-purpose model (supervised learning)

Algorithms to experiment with

Linear regression

Artificial Neural Networks

Support Vector Machines

K-means clustering

Logistic regression

Decision Trees

Anomaly Detection

Naive Bayes

Principal Component Analysis

Gaussian Mixture Model

17 of 107

Examples of patterns

Supervised learning: tell a computer the right answers to look for, through examples

18 of 107

Deep learning is a flexible & general purpose model (supervised learning)

🎥 Video

🖼️ Image

🎧 Audio

Types of multidimensional data

RGB values across 2D space

Amplitude across time

RGB values across space + time

19 of 107

Training a model to classify tree species is similar to an image segmentation problem, which classifies every pixel in an image

20 of 107

🛰 Satellite sensors collect a wider range of bands than typical RGB channels

Image Classification

Geospatial Classification

21 of 107

Deep learning approaches data problems differently.

You don't write a function with explicit, sequential steps that reviews every single pixel, one by one, for every image

22 of 107

To modify such a function to add more tree species, you may need to rewrite it altogether

23 of 107

Deep learning can simplify things instead of hand-crafting every instruction

Simulate system

Just find patterns

24 of 107

Let the computer learn from labels of tree species as examples

25 of 107

Simply add images of new species, and retrain the model

Cedar tree

Oak tree

26 of 107

Recap → journey with deep learning

What it can help with

Go-to for 🗺 images, 🎧 audio, 🎥 video (general purpose)

Why it's different from traditional programming

👁 Learns from patterns in examples

How to approach a problem

……

27 of 107

Example: Deforestation due to extractive supply chains

1984

2020

28 of 107

Most machine learning models only understand tensors

Numbers

29 of 107

Tensors are numbers stored as an array or vector in one dimension

Numbers

1 dimension

Shape: [4]

30 of 107

…or as a matrix or table in 2 dimensions

Numbers

1 dimension

2 dimensions

31 of 107

…or as multi-dimensional data

3 dimensions

Numbers

1 dimension

2 dimensions

32 of 107

Convert satellite images into tensors: every pixel has one number for every band
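A tiny NumPy illustration of these shapes (the band count and patch size are made up):

import numpy as np

pixel = np.array([0.12, 0.34, 0.56, 0.78])   # 1 dimension: 4 band values, shape [4]
row = np.stack([pixel] * 256)                # 2 dimensions: a row of pixels, shape [256, 4]
patch = np.stack([row] * 256)                # 3 dimensions: a full patch, shape [256, 256, 4]
print(pixel.shape, row.shape, patch.shape)   # (4,) (256, 4) (256, 256, 4)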

33 of 107

Get started: we need a dataset of satellite images with an even number of labels for 🌲 & ⛔️🌲

34 of 107

Next, we have to frame our model differently depending on our goal

Input images

Output images

35 of 107

Here are a few common goals:

36 of 107

Regression =

Predicts a number

37 of 107

Binary classification =

Tree

Not a tree

Makes a prediction between two categories

38 of 107

Classifies things across multiple categories

39 of 107

Classifies every pixel in an image

40 of 107

We want to know whether there are trees in every pixel; this is a binary semantic segmentation problem

41 of 107

Based on the goal: outputs are the probability of trees for every pixel, between 0 and 1

42 of 107

Zero represents no trees, and 1 represents high confidence that there are trees

43 of 107

🛰 Satellite image

256 pixels

256 pixels

🌲 Probability of Trees

How do we go from input images to probabilities of trees?

44 of 107

3 common ways…

Dense Neural Network (DNN): input 1 pixel → output 1 pixel

Convolutional Neural Network (CNN): input 1 patch (neighboring pixels) → output 1 pixel (center)

Fully Convolutional Network (FCN): input 1 patch (neighboring pixels) → output 1 patch

FCNs great for building a map w/ predictions!

45 of 107

Dense Neural Network (DNN): input 1 pixel → output 1 pixel

Convolutional Neural Network (CNN): input 1 patch (neighboring pixels) → output 1 pixel (center)

Fully Convolutional Network (FCN): input 1 patch (neighboring pixels) → output 1 patch

FCNs get context from surroundings like our 👁 eyes!
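A minimal Keras sketch of the three patterns (layer choices, patch size, and the 13-band input are illustrative assumptions, not the demo's actual architecture):

import tensorflow as tf
from tensorflow.keras import layers

BANDS = 13

# DNN: one pixel in -> one prediction out.
dnn = tf.keras.Sequential([
    layers.InputLayer(input_shape=[BANDS]),
    layers.Dense(32, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])

# CNN: one patch in -> one prediction for the center pixel.
cnn = tf.keras.Sequential([
    layers.InputLayer(input_shape=[33, 33, BANDS]),
    layers.Conv2D(32, 3, activation='relu'),
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation='sigmoid'),
])

# FCN: one patch in -> a same-sized patch of predictions out (great for maps).
fcn = tf.keras.Sequential([
    layers.InputLayer(input_shape=[None, None, BANDS]),
    layers.Conv2D(32, 3, padding='same', activation='relu'),
    layers.Conv2D(1, 1, activation='sigmoid'),
])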

46 of 107

🛰 Sentinel 2

Fully Convolutional Networks (FCN)

A model is a collection of interconnected layers.

47 of 107

🛰 Sentinel 2

Arrange layers (like Lego bricks)

At each layer, data is transformed based on the desired outputs

48 of 107

Transformations are performed by activation functions at each layer

49 of 107

🛰 Sentinel 2

3 layers 👍 → good baseline (Normalization, 2D Convolutional, and DeConvolutional)

3 layers

50 of 107

4th & last layer (depending on goal): choose loss function to score how the model’s training went
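A hedged sketch of that baseline in Keras (kernel sizes, strides, and the 13-band input are assumptions; the final layer and loss depend on your goal):

import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=[None, None, 13])                                      # Sentinel-2 bands
x = layers.Normalization()(inputs)                                                   # 1: rescale raw band values (call adapt() on training data first)
x = layers.Conv2D(32, 3, strides=2, padding='same', activation='relu')(x)            # 2: learn spatial features
x = layers.Conv2DTranspose(16, 3, strides=2, padding='same', activation='relu')(x)   # 3: back up to full resolution
outputs = layers.Conv2D(1, 1, activation='sigmoid')(x)                               # 4: output layer matched to the goal
model = tf.keras.Model(inputs, outputs)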

51 of 107

Common ML goals and functions

Based on your goal, friendly recommendations for activation & loss functions
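As a rough cheat sheet, common pairings look like this (these are conventional defaults, not official recommendations from the deck):

# goal: (last-layer activation, loss function)
RECOMMENDED = {
    'regression (predict a number)':     ('linear',  'mean_squared_error'),
    'binary classification':             ('sigmoid', 'binary_crossentropy'),
    'multi-class classification':        ('softmax', 'categorical_crossentropy'),
    'semantic segmentation (per pixel)': ('softmax', 'sparse_categorical_crossentropy'),
}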

52 of 107

Reminder: most of this work is experimenting repeatedly

53 of 107

3. Architecture components to train & host a deep learning model (using TensorFlow)

10mins


54 of 107

Accounts needed

Google Earth Engine

Google Cloud

+

$300 free trial

Commercial licenses (NEW 2022)

Free for nonprofits/academia

55 of 107

Products by function

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

ML library

Vertex AI

Cloud product

Open Source

Geo product

Other Google product

56 of 107

Fully managed service for executing Apache Beam pipelines

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

Data processing

ML library

Vertex AI

Cloud product

Open Source

Geo product

Other Google product

57 of 107

ML library

Vertex AI

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

ML model building

Cloud product

Open Source

Geo product

ML Platform to train models and/or host them to make predictions

Other Google product

58 of 107

Cloud AI Platform vs Vertex AI (ML platform evolved)

Predecessor (integrated with EE)

Cloud AI Platform

Current (working on integration)

Vertex AI

59 of 107

ML library

Vertex AI

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

Storage

Cloud product

Open Source

Geo product

Store and move data across products

Other Google product

60 of 107

🗄 Data storage file formats

Other frameworks

TFRecords

NumPy files

61 of 107

ML library

Vertex AI

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

Web service

Cloud product

Open Source

Geo product

Other Google product

Managed hosting service that auto-scales

(only uses the resources needed; not running 24 hrs)

62 of 107

ML library

Vertex AI

Dataflow

Cloud Storage

Earth Engine

Colab

notebook

Cloud Run

Earth Engine

Visualization

Geo product

Cloud product

Open Source

Visualize in EE or in a notebook using a map library (e.g. Folium)

Other Google product

63 of 107

Can prototype/practice in a 📓 Colab notebook

Markdown text

Click “Run” icons

⚡️Enable GPUs & TPUs

(free w/ limits)

🥨Share as a link

64 of 107

Designing for inputs & outputs (putting it all together)

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 images

Elevation

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

💻

🗺

1

2

3

4

5

6

+

Host model &

make predictions

Land cover labels

INPUTS

OUTPUTS

65 of 107

Step-by-step instructions here (enjoy the 🚴🏽‍♂️ walkthrough)

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 images

Elevation

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

💻

🗺

1

2

3

4

5

6

+

Host model &

make predictions

Land cover labels

INPUTS

OUTPUTS

Notebook

66 of 107

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

+

Collect [input, label] pairs & save as TFRecord files

Dataflow: extraction goes from hours/days → minutes

1

2

3

4

5

6

💻🗺 👉 https://bit.ly/3SvS2hl

67 of 107

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

+

Save data as TFRecord files

1

2

3

4

5

6

💻🗺 👉 https://bit.ly/3SvS2hl

68 of 107

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

+

1

2

3

4

5

6

💻🗺 👉 https://bit.ly/3SvS2hl

69 of 107

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

+

1

2

3

4

5

6

There’s no hard rule

…can be 80/20 etc

💻🗺 👉 https://bit.ly/3SvS2hl

70 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow

(community)

PyTorch

scikit-learn

TensorFlow

Pro

Can change ratio of both datasets without having to restart another data extraction pipeline.

(single source of truth 👌🏽)

1

2

3

4

5

6

71 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

1

2

3

4

5

6

FYI: Can also be done 💻 locally but this can take a ⏰long time.

72 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

Save data as TF file called SavedModel

1

2

3

4

5

6

73 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Run predictions

as a service

Cloud Storage

Cloud Run

Host model

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

make predictions on incoming data

Pro

  • Easy to use�
  • Generous free tier

  • Typical cost $X-XX

Con

  • Predictions live outside of EE

🧭Hosting model on Cloud Run

2

3

4

5

6

1
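A minimal sketch of what such a Cloud Run prediction service could look like (the Flask app, model path, and request format below are assumptions, not the actual sample code):

import os
import numpy as np
import tensorflow as tf
from flask import Flask, request, jsonify

app = Flask(__name__)
# Cloud Run scales to zero, so the model is loaded once per container start.
model = tf.keras.models.load_model(os.environ.get('MODEL_PATH', 'gs://your-bucket/your-folder'))

@app.route('/predict', methods=['POST'])
def predict():
    # Expect a JSON body like {"patch": [[[...]]]} with shape [height, width, bands].
    patch = np.array(request.get_json()['patch'], dtype=np.float32)
    probabilities = model.predict(patch[np.newaxis, ...])[0]
    return jsonify({'probabilities': probabilities.tolist()})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))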

74 of 107

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

Batch update model

Dataflow

Pro

  • Scalable for historical data

Con

  • Predictions live outside of EE

🧭Hosting model on Dataflow

Update classifications on past predictions when model changes

2

3

4

5

6

1

75 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

Run model

locally

Colab or laptop

2

3

4

5

6

1

76 of 107

It cost us (at the time) $1-$5 to train & host a model in Cloud Run (can cost $0-$XX)

goo.gle/PeopleAndPlanetAI

(10 min episodes)

🗺 Land cover

🏭 Carbon emissions

🐟 Illegal fishing

77 of 107

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

Host model

($100-$200/month)

AI Platform connector in EE ($$$)

24hrs VM hosting

+

Con

  • Cost $XXX 24hrs

  • Additional cost when enabling GPUs $XXX 24hrs

🧭Hosting model on AI Platform

Pro

  • Convenient design for enterprise-level use

  • Connected to Earth Engine

2

3

4

5

6

1

78 of 107

Frameworks supported in AI Platform (older Google platform)

(only w/ a custom container)

79 of 107

Zooming out 🗺 too much sends lots of requests

+

Model

building

Vertex AI runs Keras

Cloud Storage

Model

stored

Cloud Storage

Sentinel 2 bands, Elevation, Earth classification

Datasets extracted

Data

stored

Earth Engine

Data from catalog (labeled)

Earth Engine & Dataflow

TensorFlow Keras

defines model

(layers, inputs, outputs)

Host model

($100-$200/month)

AI Platform connector in EE ($$)

+

🧭FYI

2

3

4

5

6

1

💻🗺 👉 https://bit.ly/3SvS2hl

80 of 107

Earth Engine

(or browser)

Cloud Run

You can have Cloud Run translate NumPy Arrays to Cloud GeoTIFFs

and store in Cloud Storage

Example coming soon!

Translate model’s output to Cloud Optimized GeoTIFFs and store in Cloud Storage.

6

7

81 of 107

+

Model

stored

Convert to

Cloud Optimized GeoTIFFs (COGs)

Cloud Storage

Container

image w/ GDAL

(geospatial library)

GeoTIFF

NumPy Array

Cloud Storage

Earth Engine

(or browser)

Cloud Run

New format stored

Example coming soon!

Translate model’s output to Cloud Optimized GeoTIFFs and store in Cloud Storage.

This works by using GDAL (open source geospatial library) in a custom container in Cloud Run

6

7
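Until that example lands, a hedged sketch of the GDAL step (array contents, bounds, CRS, and bucket names are hypothetical; the COG driver needs GDAL >= 3.1):

import numpy as np
from osgeo import gdal, osr
from google.cloud import storage

def array_to_cog(array: np.ndarray, bounds, epsg: int, local_path: str) -> None:
    """Writes a [height, width] array as a Cloud Optimized GeoTIFF."""
    height, width = array.shape
    west, south, east, north = bounds
    mem = gdal.GetDriverByName('MEM').Create('', width, height, 1, gdal.GDT_Float32)
    mem.SetGeoTransform((west, (east - west) / width, 0, north, 0, -(north - south) / height))
    srs = osr.SpatialReference()
    srs.ImportFromEPSG(epsg)
    mem.SetProjection(srs.ExportToWkt())
    mem.GetRasterBand(1).WriteArray(array)
    gdal.GetDriverByName('COG').CreateCopy(local_path, mem)

def upload_to_gcs(local_path: str, bucket: str, blob_name: str) -> None:
    storage.Client().bucket(bucket).blob(blob_name).upload_from_filename(local_path)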

82 of 107

Earth Engine ML decision cheatsheet

Do you want to build it with an ML library, or have lots of data?

  No → Start with Earth Engine classifiers

  Yes → Is extracting datasets from Earth Engine taking many hours/days?

    No → Export directly from EE

    Yes → Use Dataflow + .getDownloadURL + .stratifiedSample EE functions

Are you ok with spending $100-$XXX for model hosting/predictions? (More w/ GPUs)

  Yes → AI Platform + Vertex (for now, as of 2022)

  No → Convert NumPy arrays using GDAL to Cloud GeoTIFFs in Cloud Run

83 of 107

4. Demo: Land cover classification models

(similar to Dynamic World)

20mins


84 of 107

2

Data extraction

Getting training data from Earth Engine

Export tables of: image.reduceRegions(...) or image.neighborhoodToArray(...).reduceRegions(...)

Get patches (scale it with Dataflow): getDownloadURL()

REST API: computePixels

Client-based: computePixels() (coming soon! Even lighter weight than getDownloadURL().)
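For the table-export path, a minimal sketch (the labeled image, sample points, and constants KERNEL_SIZE, SCALE, and BUCKET are assumptions):

import ee
ee.Initialize()

# A square kernel so neighborhoodToArray() attaches a patch of neighbors to each point.
kernel_list = ee.List.repeat(ee.List.repeat(1, KERNEL_SIZE), KERNEL_SIZE)
kernel = ee.Kernel.fixed(KERNEL_SIZE, KERNEL_SIZE, kernel_list)

patches = (
    labeled_image
    .neighborhoodToArray(kernel)
    .reduceRegions(collection=sample_points, reducer=ee.Reducer.first(), scale=SCALE)
)

# Export the patches as a TFRecord table to Cloud Storage.
task = ee.batch.Export.table.toCloudStorage(
    collection=patches,
    description='export_training_patches',
    bucket=BUCKET,
    fileNamePrefix='patches/training',
    fileFormat='TFRecord',
)
task.start()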

85 of 107

Demonstration: Fully-convolutional model for image segmentation

86 of 107

Sample regions

87 of 107

2

Data extraction

Creating a balanced dataset via random samples

def sample_random_points(region: ee.Geometry, points_per_class: int) -> Iterable[List]:
    """Get a generator of random points in the region."""
    image = landcover_image().select(OUTPUT_BANDS).int()
    points = image.stratifiedSample(
        points_per_class,
        region=region,
        scale=SCALE,
        geometries=True,
    )
    for point in points.toList(points.size()).getInfo():
        yield point['geometry']['coordinates']

88 of 107

2

Data extraction

Get patches centered at points as NumPy arrays

@retry.Retry()
def get_patch(image: ee.Image, point: ee.Geometry, bands: List[str], patch_size: int) -> np.ndarray:
    """Get the patch of pixels in the geometry as a NumPy array."""
    url = image.getDownloadURL({
        'region': point,
        'dimensions': [patch_size, patch_size],
        'format': 'NPY',
        'bands': bands,
    })
    response = requests.get(url)
    if response.status_code == 429:
        raise exceptions.TooManyRequests(response.text)
    response.raise_for_status()
    return np.load(io.BytesIO(response.content), allow_pickle=True)

89 of 107

A training example

💧 Water

🌳 Trees

🌾 Grass

🌿 Flooded vegetation

🚜 Crops

🪴 Shrub and scrub

🏗️ Built-up areas

🪨 Bare ground

❄️ Snow and ice

Shape:

[128, 128, 9]

Bands:

['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B11', 'B12']

Inputs

Labels

2

Data extraction

90 of 107

Serialize patches into tf.train.Example protos

def serialize(patch: np.ndarray) -> bytes:
    features = {
        name: tf.train.Feature(
            float_list=tf.train.FloatList(value=patch[name].flatten())
        )
        for name in patch.dtype.names
    }
    example = tf.train.Example(features=tf.train.Features(feature=features))
    return example.SerializeToString()

2

Data extraction

91 of 107

Apache Beam pipeline to generate data

2

Data extraction

with beam.Pipeline(options=beam_options) as pipeline:
    training_data, validation_data = (
        pipeline
        | "Create regions" >> beam.Create(REGIONS)
        | "Sample random points" >> beam.FlatMap(sample_random_points, POINTS_PER_REGION)
        | "Get patch" >> beam.Map(get_training_patch, BANDS, PATCH_SIZE)
        | "Serialize" >> beam.Map(serialize)
        | "Split dataset" >> beam.Partition(split_dataset, 2)
    )

    training_data | "Write training data" >> beam.io.WriteToTFRecord(
        "datasets/training", file_name_suffix=".tfrecord.gz"
    )
    validation_data | "Write validation data" >> beam.io.WriteToTFRecord(
        "datasets/validation", file_name_suffix=".tfrecord.gz"
    )
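The split_dataset partition function used above isn't shown on the slide; a minimal sketch (the 90/10 ratio is an assumption and can be whatever you like, e.g. 80/20):

import random

def split_dataset(element: bytes, num_partitions: int) -> int:
    """Returns 0 (training) ~90% of the time and 1 (validation) ~10% of the time."""
    return random.choices(range(num_partitions), weights=[0.9, 0.1])[0]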

92 of 107

Read datasets using tf.data.TFRecordDataset

def get_dataset(pattern, batch_size):
    dataset = tf.data.Dataset.list_files(pattern).interleave(
        lambda filename: tf.data.TFRecordDataset(filename, compression_type='GZIP'))
    dataset = dataset.map(parse_tfrecord, num_parallel_calls=tf.data.AUTOTUNE)
    dataset = dataset.map(to_tuple, num_parallel_calls=tf.data.AUTOTUNE)
    dataset = dataset.cache()
    dataset = dataset.shuffle(512)
    dataset = dataset.batch(batch_size)
    dataset = dataset.prefetch(buffer_size=tf.data.AUTOTUNE)
    return dataset
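The parse_tfrecord and to_tuple helpers referenced above aren't shown either; a minimal sketch (PATCH_SIZE, INPUT_BANDS, and OUTPUT_BANDS are assumed constants matching how the patches were serialized):

def parse_tfrecord(example_proto):
    """Parses one serialized tf.train.Example into a dict of [PATCH_SIZE, PATCH_SIZE] tensors."""
    feature_spec = {
        name: tf.io.FixedLenFeature([PATCH_SIZE, PATCH_SIZE], tf.float32)
        for name in INPUT_BANDS + OUTPUT_BANDS
    }
    return tf.io.parse_single_example(example_proto, feature_spec)

def to_tuple(features):
    """Stacks the parsed bands into an (inputs, labels) tuple for Keras."""
    inputs = tf.stack([features[name] for name in INPUT_BANDS], axis=-1)
    labels = tf.stack([features[name] for name in OUTPUT_BANDS], axis=-1)
    return inputs, labels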

Preprocessing

3

93 of 107

Make the model

def get_model(input_shape, num_classes):
    inputs = keras.Input(shape=[None, None, len(INPUT_BANDS)])
    x = inputs
    # Your fancy model stuff here…
    outputs = layers.Conv2D(num_classes, 3, activation="softmax", padding="same")(x)
    model = keras.Model(inputs, outputs)
    return model

Model building

4

94 of 107

Train the model (model.fit())

Train the model wherever you want!

    • Colab notebook
    • Vertex AI (example)
    • Deep Learning VMs (reference)
    • Your fancy, big cluster
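A hedged compile-and-fit sketch (the optimizer, loss, batch size, epoch count, and NUM_CLASSES constant are illustrative, not the demo's exact settings):

model = get_model(None, NUM_CLASSES)
model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',   # labels are integer class IDs per pixel
    metrics=['sparse_categorical_accuracy'],
)
model.fit(
    get_dataset('datasets/training*', batch_size=16),
    validation_data=get_dataset('datasets/validation*', batch_size=16),
    epochs=10,
)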

Model building

4

95 of 107

Wrap the model in de/serialization layers (eeification)

class DeserlializeInput(tf.keras.layers.Layer):
    def call(self, tensor):
        return_dict = {}
        for (k, v) in tensor.items():
            decoded = tf.io.decode_base64(v)
            return_dict[k] = tf.map_fn(
                lambda x: tf.io.parse_tensor(x, tf.float32), decoded, dtype=tf.float32)
        return return_dict


class ReserlializeOutput(tf.keras.layers.Layer):
    def call(self, tensor_input):
        return tf.map_fn(
            lambda x: tf.io.encode_base64(tf.io.serialize_tensor(x)),
            tensor_input, dtype=tf.string)

(also, earthengine model prepare)

5

Prepare model for hosting
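Applying those wrapper layers around the trained model might look like this (the pattern follows the public Earth Engine TensorFlow examples; the band names and the stacking step are assumptions about how your patches are serialized):

input_deserializer = DeserlializeInput()
output_serializer = ReserlializeOutput()

# One serialized (base64 string) input per band, matching what Earth Engine sends.
serialized_inputs = {
    b: tf.keras.Input(shape=[], dtype='string', name=b) for b in INPUT_BANDS
}
deserialized = input_deserializer(serialized_inputs)

# Stack the per-band [batch, H, W] tensors into the [batch, H, W, bands] shape
# the trained model expects.
stacked = tf.keras.layers.Lambda(
    lambda t: tf.stack(t, axis=-1))([deserialized[b] for b in INPUT_BANDS])

wrapped_model = tf.keras.Model(serialized_inputs, output_serializer(model(stacked)))
wrapped_model.save(MODEL_DIR)   # this SavedModel is what gets hosted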

96 of 107

Save the model to Cloud Storage

model.save('gs://your-bucket/your-folder')

5

Prepare model for hosting

97 of 107

Host the model

on AI Platform

!gcloud ai-platform versions create {VERSION_NAME} \

--project {PROJECT} \

--region {REGION} \

--model {MODEL_NAME} \

--origin {MODEL_DIR} \

--framework "TENSORFLOW" \

--runtime-version=2.8 \

--python-version=3.7

6

Host model &

make predictions

98 of 107

Connect to the model from Earth Engine

model = ee.Model.fromAiPlatformPredictor(
    projectName=PROJECT,
    modelName=MODEL_NAME,
    version=VERSION_NAME,
    proj=ee.Projection('EPSG:4326').atScale(10),
    fixInputProj=True,
    inputTileSize=[64, 64],
    inputOverlapSize=[32, 32],
    outputBands={
        'array': {
            'type': ee.PixelType.float(),
            'dimensions': 1,
        }
    },
)

6

Host model &

make predictions

+

99 of 107

Connect to the model from Earth Engine

predictions = model.predictImage(composite.select(INPUT_BANDS))

labels = predictions.arrayArgmax().arrayGet(0).byte().rename('label')

6

Host model &

make predictions

+

100 of 107

Demonstration: Convolutional model for image classification

120x120 at 10 meters

1x1 at

60 meters

101 of 107

Load a pre-trained model

+

🛰

5

Prepare model for hosting

102 of 107

Wrap the loaded model in a custom layer

class BigearthnetModel(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        model_url = 'https://tfhub.dev/google/remote_sensing/bigearthnet-resnet50/1'
        model_path = hub.resolve(model_url)
        self.model = tf.saved_model.load(model_path, tags=[])

    @tf.function
    def call(self, tensor):
        logits = self.model.signatures['default'](tensor)['logits']
        probs = tf.nn.softmax(logits)
        return tf.expand_dims(tf.expand_dims(probs, 1), axis=1, name='output')

    def build(self, input_shape):
        self.built = True

    def get_config(self):
        config = super().get_config()
        return config

+

5

Prepare model for hosting

103 of 107

Connect to the model

6

model = ee.Model.fromAiPlatformPredictor(
    projectName=PROJECT,
    modelName=MODEL_NAME,
    version=VERSION_NAME,
    proj=ee.Projection('EPSG:4326').atScale(10),
    inputTileSize=[60, 60],
    inputOverlapSize=[30, 30],
    inputShapes={'array': [3]},
    outputTileSize=[1, 1],
    fixInputProj=True,
    outputBands={
        'reserlialize_output_8': {
            'type': ee.PixelType.float(),
            'dimensions': 1,
        }
    },
)

Host model &

make predictions

+

6

104 of 107

Model output: a length-43 vector for each 120x120 patch

predict_image = get_s2_composite().select(['B4', 'B3', 'B2']).toArray()

predictions = model.predictImage(predict_image)

labels = predictions.arrayArgmax().arrayGet(0).byte()

6

make predictions

6

105 of 107

5. Conclusion & resources

5mins


106 of 107

💻🗺 Code samples

Stay connected

Links from demo section of this talk

End-to-end sample explaining how to build a custom model & host it on AI Platform for use with Earth Engine

👉 https://bit.ly/3SvS2hl

@GoogleEE

  • 👁Fully Convolutional Networks
  • ⚡️Weather
  • 🔥Wildfire
  • 🌳Restoration

Video series:

Earth Engine

AGV

107 of 107

Participate in an Earth Engine Machine Learning User Study

Do you apply Machine Learning to Remote Sensing / Geospatial Analysis?

We want to hear from you!

Sign up to participate in an upcoming user study.

Know someone who would be interested? Please pass it on!

https://forms.gle/EA2uBxNkcyZi5D3z7