1 of 110

How to get started with equivariant deep learning

Evangelos Chatzipantazis, Stefanos Pertigkiozoglou

2 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers


4 of 110

Equivariant Deep Learning in Computer Perception

5 of 110

Classical Equivision.

SIFT Features (D.Lowe 2004)


Steerable Filters (W. Freeman 1991)

Early Deep Learning

LeNet (Y. LeCun 1998)

Neocognitron (K. Fukushima 1980)

6 of 110

Symmetry Groups

  • Symmetries: Set G of transformations Tg that leave an object invariant.

  • Finite (or infinite discrete) Groups. “You can only choose among 17 wallpapers!”
  • Continuous (aka topological) Groups e.g. Lie Groups.

Examples: frieze, wallpaper, and crystallographic groups (specified by their generators and elements).

7 of 110

Transformation Groups

  • Abstract Algebra / Group Theory: Studies the properties of the groups themselves. E.g. is a group composed of smaller groups and in what sense? (e.g. generators, semi-direct products)
  • (linear) Representation Theory: Studies different actions of a group on different (vector) spaces. Sometimes, that also helps to understand the structure of the group itself.
  • Harmonic Analysis: Generalizes Fourier theory (studies bases) for functions on groups and homogeneous spaces.

A group is a set G with an operation (a, b) ↦ a·b that satisfies:

  • Associativity: (a·b)·c = a·(b·c)
  • Identity element: there is e ∈ G with e·a = a·e = a
  • Inverse element: every a ∈ G has a⁻¹ with a·a⁻¹ = a⁻¹·a = e

A transformation group is a group of transformations

where the operation is composition.
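The group axioms above can be checked numerically for planar rotations; a minimal numpy sketch (the helper rot and the chosen angles are illustrative):

```python
import numpy as np

def rot(theta):
    """2x2 rotation matrix: an element of the transformation group SO(2)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

a, b, c = rot(0.3), rot(1.1), rot(-0.7)

# Associativity: (a b) c == a (b c); the operation is composition.
assert np.allclose((a @ b) @ c, a @ (b @ c))
# Identity element: rot(0) acts as the identity transformation.
assert np.allclose(rot(0.0), np.eye(2))
# Inverse element: rot(-theta) undoes rot(theta).
assert np.allclose(a @ rot(-0.3), np.eye(2))
# Closure: composing two rotations gives another rotation.
assert np.allclose(rot(0.3) @ rot(1.1), rot(1.4))
```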

The Math behind:

8 of 110

Group Actions

The same group can act on different objects.

Same group, “same” object, different actions. Redundant representations give rise to more symmetries.

A (left) group action of G on a set X is a function G × X → X, (g, x) ↦ g·x, that satisfies:

  • Identity: e·x = x
  • Compatibility: g·(h·x) = (g·h)·x
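To make "same group, different actions" concrete, here is a small numpy sketch of SO(2) acting both on vectors and, via the (left) regular representation, on scalar functions (the names act_vec and act_fun are illustrative, not from the slides):

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Action 1: SO(2) acting on vectors in R^2 by matrix multiplication.
def act_vec(R, v):
    return R @ v

# Action 2: the SAME group acting on scalar functions f: R^2 -> R via the
# (left) regular representation (R . f)(x) = f(R^{-1} x).
def act_fun(R, f):
    return lambda x: f(R.T @ x)   # for rotations, R^{-1} = R^T

R1, R2 = rot(0.4), rot(1.3)
v = np.array([1.0, 2.0])
f = lambda x: x[0] ** 2 - x[1]
x = np.array([0.5, -0.2])

# Identity axiom: acting with e = rot(0) changes nothing.
assert np.allclose(act_vec(rot(0.0), v), v)
# Compatibility axiom: g . (h . obj) == (g h) . obj, for both actions.
assert np.allclose(act_vec(R1, act_vec(R2, v)), act_vec(R1 @ R2, v))
assert np.isclose(act_fun(R1, act_fun(R2, f))(x), act_fun(R1 @ R2, f)(x))
```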

9 of 110

(linear) Representations

A linear representation of a group G on a vector space V is a map ρ: G → GL(V) with the property ρ(gh) = ρ(g)ρ(h).

It is a group action on a vector space that also preserves the vector-space structure (a G-space). If V is n-dimensional, each ρ(g) is an n-by-n invertible matrix. “Geometric vectors!”

Why linear? Even if the base space is not linear (e.g., the sphere), functions on it induce linear actions: the (left) regular representation (ρ(g)f)(x) = f(g⁻¹x).

Equivalence: two representations are equivalent if a single change of basis P relates them, ρ₂(g) = P ρ₁(g) P⁻¹ for all g.

Theorem (Maschke, Peter-Weyl): Any finite-dimensional unitary representation of a compact group over a field of characteristic 0 is completely reducible into a direct sum of irreducible representations.
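As a small illustration of complete reducibility (not a proof), the permutation representation of S₃ on R³ splits into the trivial irrep on span{(1,1,1)} and the 2-dimensional "standard" irrep on its orthogonal complement; a numpy sketch:

```python
import numpy as np
from itertools import permutations

# The permutation representation of S_3 on R^3: rho(g) = 3x3 permutation matrix.
reps = [np.eye(3)[list(p)] for p in permutations(range(3))]

# Two invariant subspaces: span{(1,1,1)} (trivial irrep) and its orthogonal
# complement (the 2-dimensional "standard" irrep).
P_triv = np.full((3, 3), 1.0 / 3.0)   # orthogonal projector onto (1,1,1)
P_std = np.eye(3) - P_triv            # projector onto the complement

for R in reps:
    # Complete reducibility in action: every group element commutes with the
    # projectors, so both subspaces are preserved by the representation.
    assert np.allclose(R @ P_triv, P_triv @ R)
    assert np.allclose(R @ P_std, P_std @ R)
```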

10 of 110

SO(3) and its Irreducible Representations

SO(3) can rotate:

A vector

An inertia matrix

A scalar function (color)

A vector function:

SO(3) is compact. All these representations can be decomposed into irreps.

Irreps are like primes for natural numbers. Building blocks for larger representations!

Symmetry group? Yes: rotations preserve norms, handedness, and the vector-space structure.

11 of 110

SO(3) and its Irreducible Representations

Spherical

Harmonics

The irreducibles of SO(3) are called Wigner-D matrices, of size (2ℓ+1) × (2ℓ+1).

Example: rotation of an inertia matrix I: I ↦ R I Rᵀ
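These transformation rules can be verified numerically; a minimal numpy sketch (the helper Rz and the sample tensors are illustrative):

```python
import numpy as np

def Rz(t):
    """Rotation by angle t about the z-axis: an element of SO(3)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

R = Rz(0.7)
v = np.array([1.0, 0.0, 2.0])
I = np.diag([1.0, 2.0, 3.0])   # a (diagonal) inertia matrix

v_rot = R @ v                  # vectors transform as v -> R v
I_rot = R @ I @ R.T            # rank-2 tensors transform as I -> R I R^T

# Invariants survive the transformation: the vector's norm and the
# inertia matrix's eigenvalues (principal moments).
assert np.isclose(np.linalg.norm(v_rot), np.linalg.norm(v))
assert np.allclose(np.sort(np.linalg.eigvalsh(I_rot)), [1.0, 2.0, 3.0])
```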

12 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers

13 of 110

Invariance and Equivariance

Input space:

Action:

Output space:

Action:

Equivariance

car

Invariance
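The two notions can be checked numerically for the permutation group; a toy sketch where f_inv and f_eqv are stand-ins for a global classifier head and a per-point feature map:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))          # 5 inputs (e.g. points), 3 features each
P = np.eye(5)[rng.permutation(5)]    # a group element: permutation matrix

f_inv = lambda z: z.mean(axis=0)     # global readout (e.g. "car" classifier)
f_eqv = lambda z: np.tanh(z)         # per-point features (e.g. segmentation)

# Invariance: the output ignores the transformation, f(P x) == f(x).
assert np.allclose(f_inv(P @ x), f_inv(x))
# Equivariance: the output transforms along with the input, f(P x) == P f(x).
assert np.allclose(f_eqv(P @ x), P @ f_eqv(x))
```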

14 of 110

Equivariance

  • Composability: We can impose equivariance on linear layers and design ad-hoc (usually non-parametric) non-linearities.
  • Inductive bias: Equivariance is a form of constraint on the possible functions.
    • We can change actions in intermediate layers! Extra inductive bias.
  • Expressivity: We use rep. theory to decompose the constraints into the independent ones and maximize the remaining degrees of freedom.
    • For larger groups the per-layer constraint might hurt expressivity.

  • We do not want premature invariance! E.g., losing relative pose too early leaves just a bag of features.

15 of 110

Group / Space / Action : The story so far

Does my task have symmetries?

Specify group, input space, input action, output space, output action.

Next step: build parametric layers that process the input (with its action) and produce the output (with its action), a.k.a. equivariant layers.

Point Clouds

Occupancy Function

16 of 110

Benefits of Equivariance

  • Statistical Benefits. (smaller hypothesis space, smaller sample complexity)
  • Fewer Parameters (but same expressivity): due to weight sharing, smaller networks.
  • Exact constraint satisfaction leads to Robustness (safety) and Feasibility (physics)
  • Generalization Benefits: Constraint satisfied outside the training distribution.
  • Feature stability, predictable output.
  • The information (of e.g. the pose) is preserved not destroyed in the network.
  • Inductive bias in intermediate layers: interpretability.

17 of 110

Equivariance vs Canonicalization

Canonicalize instead! (i.e. choose an invariant input representation)

Failure modes of canonicalization (e.g., PCA frames): the intrinsic/“Picasso” problem, non-smoothness, and lack of robustness to noise.

Even selecting a robust and general canonical object involves learning, which in turn utilizes equivariant deep networks.

18 of 110

Equivariance vs Data Augmentation

  • Use data augmentations instead:

  • Jointly with equivariance: Locality (kernels) , Hierarchy (deep features). Synergy not well studied yet (blueprint of “Geometric Deep Learning” M.Bronstein et al. 2021).
  • General Transformations (non-groups): affine, crops, jitter, small noise.

  • Input/Output, not per-layer.

  • No problem-specific architectures.
  • Costly Training, especially for larger groups.

  • Not exactly imposed (physics rules, safety constraints).

  • Not imposed outside the training set.

  • No per-layer inductive bias.

19 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers

20 of 110

Building equivariant layers: Group / Space / Action

  • Group: Discrete, Continuous, (Locally) Compact, Matrix Lie Group etc.
  • Space: Vector space, finite/infinite dimensional, homogeneous etc.
  • Action: linear representation, irreducible, regular, induced etc.

21 of 110

Equivalently: (EMLP by Finzi et al. 2021)

  • Use Representation theory to decompose into blocks of independent degrees of freedom.
  • Use generators to reduce the number of equations into the linearly independent ones.
    • Lie Group Infinitesimal generators:
    • Discrete generators:

Input Representation: Output Representation:

Equivariant Linear Layer:

Schur’s Lemma: If the input and output representations are irreducible (over ℂ), then any equivariant linear map W is zero when they are inequivalent; when they are equivalent, W is a scalar multiple of the identity (in a suitable basis).

Matrix group
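The generator-based recipe above can be sketched in numpy for a toy case: C4 acting on R², with equal input and output representations. This is an illustrative null-space computation, not the EMLP implementation:

```python
import numpy as np

g = np.array([[0., -1.], [1., 0.]])      # generator of C4 (rotation by 90 deg)
rho_in, rho_out = g, g                   # input and output representations

# Equivariance constraint rho_out @ W == W @ rho_in, written as M @ vec(W) = 0
# using row-major vectorization: vec(A W) = kron(A, I) vec(W) and
# vec(W B) = kron(I, B.T) vec(W).
M = np.kron(rho_out, np.eye(2)) - np.kron(np.eye(2), rho_in.T)

# The equivariant linear layers form the null space of M (found via SVD);
# imposing the constraint only for the generator suffices for the whole group.
_, S, Vt = np.linalg.svd(M)
null = Vt[S < 1e-9]

# Two degrees of freedom survive: W = a*I + b*[[0,-1],[1,0]].
assert null.shape[0] == 2
for w in null:
    W = w.reshape(2, 2)
    assert np.allclose(rho_out @ W, W @ rho_in)
```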

22 of 110

Example: Irreducible Representations.

(VNN, C. Deng et al. 2021)

(2D-VNN)

Example 2: Lie Group infinitesimal generators

Example 3: Discrete, generators

23 of 110

  • Memory overhead
  • Not exact for continuous (and large discrete) groups.
  • Standard non-linearities.
  • Single recipe for many groups.
  • After indexing can be implemented via std. conv
  • Equivariant (continuous) Linear Operators for functions on groups.
  • It’s all (regular) group convolutions (group cross-correlations).
  • Kernel on the group! Template matching in the DOFs of the group.
  • Discrete groups: Sum over all elements.

Equivariance:

24 of 110

Composing a Lifting Convolution, Regular Group Convolutions and a Projection operator (like Group Max Pooling) constitutes the Regular G-CNN (T. Cohen et al. 2016)

  • Lifting Convolution/Cross-correlation:

from function on (homogeneous) space

to function on group.

Equivariance:

25 of 110

Input/Output Action

(Tensor field)

  • Ad-hoc Nonlinearities.
  • Need to solve the constraints per-group (non-compact?).
  • Boilerplate change-of- basis code.
  • Exact for (some) continuous groups.
  • Additional inductive bias per-layer (types)

Steerable Equivariant Layers (Weiler et al. 2018)

All (continuous) equivariant linear operators

Harmonic Networks (D. Worrall et al. 2016)

Steerability Constraint

1. Lifting Convolution: expanded domain

2.Steerable Convolution:

expanded co-domain

Connection via Fourier Transform!

26 of 110

  • Equivariant linear maps between functions on homogeneous space of the group.

27 of 110

Equivariant Layers on Lie Groups

28 of 110

Equivariant Layers on Lie Groups

29 of 110

Riemannian Manifolds

Clifford Algebras

Ray Space

More general spaces

30 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers


32 of 110

Case Study: Equivariant Point Encoder

  • Identifying the symmetries of the problem


33 of 110

Case Study: Equivariant Point Encoder

  • Identifying the symmetries of the problem

Translation Equivariance

34 of 110

Case Study: Equivariant Point Encoder

  • Identifying the symmetries of the problem

Permutation+Translation Equivariance

35 of 110

Case Study: Equivariant Point Encoder

  • Translation and permutation symmetry can easily be incorporated by using a message-passing module

Message Passing
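The message-passing idea can be sketched as follows; message_passing is a hypothetical toy layer (note it is not yet rotation-equivariant, which is exactly what the next slides add):

```python
import numpy as np

def message_passing(x, h):
    """One toy round of message passing over a fully-connected point graph.
    x: (N, 3) point coordinates, h: (N, C) point features.
    Messages depend only on relative positions x_j - x_i, which gives
    translation invariance; summing over neighbors j with the same update at
    every node i gives permutation equivariance."""
    n = x.shape[0]
    out = np.zeros_like(h)
    for i in range(n):
        for j in range(n):
            if i != j:
                rel = x[j] - x[i]                    # translation-invariant
                out[i] += np.tanh(rel @ rel) * h[j]  # toy message function
    return out

rng = np.random.default_rng(0)
x, h = rng.normal(size=(4, 3)), rng.normal(size=(4, 2))
y = message_passing(x, h)

# Translation invariance: shifting every point leaves the features unchanged.
assert np.allclose(message_passing(x + 5.0, h), y)
# Permutation equivariance: relabeling the points relabels the outputs.
p = rng.permutation(4)
assert np.allclose(message_passing(x[p], h[p]), y[p])
```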


40 of 110

Case Study: Equivariant Point Encoder

  • Additional Symmetry: Rotational Equivariance

Rotational Equivariance


44 of 110

Case Study: Equivariant Point Encoder


Message Passing


47 of 110

Case Study: Equivariant Point Encoder

Equivariant Message Passing



49 of 110

Case Study: Equivariant Point Encoder

Equivariant Message Passing

Steerable Convolutional Kernel


51 of 110

Examples of Equivariant Linear Layers

Equivariant Linear Layer Example

We show an example of using Vector Neurons (C. Deng et al. 2021):

import torch.nn as nn

class VNLinear(nn.Module):
    def __init__(self, in_channels, out_channels):
        super(VNLinear, self).__init__()
        self.map_to_feat = nn.Linear(in_channels, out_channels, bias=False)

    def forward(self, x):
        '''
        x: point features of shape [B, N_feat, 3, N_samples, ...]
        '''
        # A bias-free linear map over the channel axis mixes whole 3D vectors,
        # so it commutes with any rotation applied to the vector axis.
        x_out = self.map_to_feat(x.transpose(1, -1)).transpose(1, -1)
        return x_out

52 of 110

Examples of Equivariant Linear Layers

SO(3) Steerable Convolutional Kernel

We show an example of using Tensor Field Network [citations], utilizing the e3nn library:

import torch
from e3nn import o3

class CGTensorProduct(torch.nn.Module):
    def __init__(self, irreps_input="5x0e+5x1o", irreps_output="5x0e+1x1o",
                 max_harmonics=2):
        super().__init__()
        self.irreps_input = o3.Irreps(irreps_input)
        self.irreps_output = o3.Irreps(irreps_output)
        # Spherical harmonics of the edge direction, up to degree max_harmonics.
        self.edge_irreps = o3.Irreps.spherical_harmonics(max_harmonics)
        self.tp = o3.FullyConnectedTensorProduct(self.irreps_input,
                                                 self.edge_irreps,
                                                 self.irreps_output,
                                                 internal_weights=False)
        # Small pointwise network mapping the (invariant) edge length to the
        # tensor-product weights; stands in for the MLP of the original slide.
        self.phi = torch.nn.Sequential(
            torch.nn.Linear(1, 16), torch.nn.ReLU(),
            torch.nn.Linear(16, self.tp.weight_numel))

    def forward(self, f, edge_diff):
        harm = o3.spherical_harmonics(self.edge_irreps, edge_diff,
                                      normalize=True)
        weights = self.phi(edge_diff.norm(dim=-1, keepdim=True))
        return self.tp(f, harm, weights)


60 of 110

Equivariant Non-Linearities

Equivariant Constraint

General Recipe for Equivariant Pointwise NonLinearity

Non-Linearity

Invariant Features

Equivariant Features

Input Features

61 of 110

Equivariant Non-Linearities

Equivariant Constraint

Tensor Field Network Non-Linearity

Non-Linearity

Invariant Features

Equivariant Features

Input Features

62 of 110

Equivariant Non-Linearities

Equivariant Constraint

Vector Neurons Non-Linearity

Dot product

ReLU
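The Vector Neurons nonlinearity can be sketched in numpy; here the direction features K are passed in explicitly, whereas the actual layer (C. Deng et al. 2021) predicts them from the input with a second linear map:

```python
import numpy as np

def vn_relu(V, K):
    """Vector-Neuron nonlinearity sketch. V: (C, 3) vector features;
    K: (C, 3) direction features (assumed given here). If <v, k> >= 0
    keep v; otherwise clip off the component of v along k."""
    out = np.empty_like(V)
    for c in range(V.shape[0]):
        k = K[c] / np.linalg.norm(K[c])
        dot = V[c] @ k
        out[c] = V[c] if dot >= 0 else V[c] - dot * k
    return out

def Rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

rng = np.random.default_rng(2)
V, K = rng.normal(size=(5, 3)), rng.normal(size=(5, 3))
R = Rz(0.7)

# Equivariance: dot products are rotation-invariant, so rotating the input
# vectors (rows transform as v -> R v, i.e. V -> V R^T) rotates the outputs.
assert np.allclose(vn_relu(V @ R.T, K @ R.T), vn_relu(V, K) @ R.T)
```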

63 of 110

Incorporating Attention Layers

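Why attention fits the equivariant pipeline: if the attention logits are invariant and the values equivariant, the softmax-weighted output is equivariant. A deliberately minimal numpy sketch (one 3D vector feature per point; the names are illustrative):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def equiv_attention(V):
    """Toy rotation-equivariant self-attention over N points, each carrying a
    single 3D vector feature. Logits are dot products, hence invariant under
    V -> V R^T; the values are the vectors themselves, hence equivariant; so
    the attention output is equivariant."""
    logits = V @ V.T            # (N, N) Gram matrix: rotation-invariant
    A = softmax(logits)         # invariant attention weights
    return A @ V                # equivariant convex combinations of vectors

def Rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

V = np.random.default_rng(4).normal(size=(4, 3))
R = Rz(0.9)
assert np.allclose(equiv_attention(V @ R.T), equiv_attention(V) @ R.T)
```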


71 of 110

Group Convolution

Lifting to Group

Group of discrete rotations C4
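The C4 lifting correlation can be sketched in numpy: correlate the image with the four rotated copies of the filter, producing a function on the group. A toy sketch assuming a square image and filter and 'valid' correlation (corr2d and lift are illustrative helpers):

```python
import numpy as np

def corr2d(x, k):
    """Plain 'valid' 2D cross-correlation (no padding)."""
    H, W = x.shape
    h, w = k.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + w] * k)
    return out

def lift(x, psi):
    """Lifting correlation for C4: one feature map per rotation r,
    obtained by correlating with the rotated filter rot90(psi, r)."""
    return np.stack([corr2d(x, np.rot90(psi, r)) for r in range(4)])

rng = np.random.default_rng(1)
x = rng.normal(size=(6, 6))       # square image (needed for exact rot90)
psi = rng.normal(size=(3, 3))     # square filter

y = lift(x, psi)
y_rot = lift(np.rot90(x), psi)
# Equivariance of the lifting: rotating the input cyclically shifts the
# group axis and spatially rotates each feature map.
for r in range(4):
    assert np.allclose(y_rot[r], np.rot90(y[(r - 1) % 4]))
```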


76 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers

77 of 110

Equivariant Point Encoder Applications

Point Cloud Segmentation/Classification

Point Cloud Reconstruction

Robotic Manipulation

Molecular Property Prediction

“Equivariant Descriptor Fields: SE(3)-Equivariant Energy-Based Models for End-to-End Visual Robotic Manipulation Learning”, Ryu et al. (ICLR 2023)

Figure from “SE(3)-Equivariant Attention Networks for Shape Reconstruction in Function Space”, Chatzipantazis et al. (ICLR 2023)

2D Keypoint Extraction

The pre-specified 2D grid of images allows for precomputation of the steerable kernel

The ESCNN library allows for easy implementation: “A Program to Build E(N)-Equivariant Steerable CNNs”, Cesa et al. (ICLR 2022)

“S-Trek: Sequential Translation and Rotation Equivariant Keypoints for local feature extraction”, Santellani et al. 2023

78 of 110

From Equivariance to Bi-Equivariance

Assume we are interested in the task of finding the transform that connects two puzzle pieces

T


86 of 110

Bi-Equivariance Example: Relative rotation estimation

Equivariant Encoder

Equivariant Encoder

Bi-Equivariant Feature

SVD

Procrustes
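The SVD/Procrustes step can be sketched in numpy (the Kabsch algorithm); the rotations and point sets are illustrative:

```python
import numpy as np

def Rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

def Rx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])

def procrustes_rotation(A, B):
    """Least-squares rotation R with B ≈ A @ R.T (i.e. b_i ≈ R a_i),
    via SVD (orthogonal Procrustes / Kabsch)."""
    U, _, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))   # enforce det(R) = +1 (no reflection)
    return U @ np.diag([1.0, 1.0, d]) @ Vt

rng = np.random.default_rng(3)
A = rng.normal(size=(10, 3))
R = Rz(1.2)                              # a hypothetical relative rotation
B = A @ R.T
assert np.allclose(procrustes_rotation(A, B), R)

# Bi-equivariance: rotating the two inputs independently by R1 and R2
# transforms the estimate as R -> R2 @ R @ R1.T.
R1, R2 = Rx(0.5), Rz(0.3)
assert np.allclose(procrustes_rotation(A @ R1.T, B @ R2.T), R2 @ R @ R1.T)
```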


90 of 110

Bi-Equivariant Applications

Point Cloud Registration

Pick and Place

"Diffusion-EDFs: Bi-equivariant Denoising Generative Modeling on SE(3) for Visual Robotic Manipulation", Ryu et al. (CVPR 2024)

Protein Docking

Protein Z-dependent inhibitor

Protein Z

“Independent SE(3)-Equivariant Models for End-to-End Rigid Protein Docking”, Ganea et al. 2021


93 of 110

Permutation Equivariance

Row and Column Permutation Equivariance

“Deep Models of Interactions Across Sets”, Hartford et al. 2018


95 of 110

Permutation Equivariant Applications

Permutation Equivariant SfM

“Deep Permutation Equivariant Structure from Motion”, Moran et al. 2021

Symmetries in Deep Weight Space

“Equivariant Architectures for Learning in Deep Weight Spaces”, Navon et al. 2023

96 of 110

Tutorial Overview

Case Study: Building an Equivariant Point Encoder

Equivariance in Different Applications

Where to go next?

Symmetry Groups

Invariance/Equivariance

Equivariant Layers

97 of 110

Limitations and Open Questions

Despite the benefits, there are still various open problems and limitations in applying equivariant representations:

  • Increased computational and memory requirements (especially for group convolutional layers)
  • Solving the equivariance constraint can be non-trivial for general symmetry groups (especially for steerable convolutions)
  • Misspecified symmetries or symmetry-breaking effects
  • In the era of big models and data, how can we scale these architectures?

110 of 110

Useful Resources

Libraries

Books/Courses

  • “Equivariant and coordinate independent convolutional networks”, M. Weiler
  • “Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges”, M. Bronstein, J. Bruna, T. Cohen, P. Veličković
  • “Aspects of Representation Theory and Noncommutative Harmonic Analysis”, J. Gallier, J. Quaintance
  • Course: “An Introduction to Group Equivariant Deep Learning”, E. Bekkers [https://uvagedl.github.io/]