Lecture: Nadav Dym

Data Science Seminar

Nadav Dym (Technion-Israel Institute of Technology)

Registration is required to access the Zoom webinar.

Title:
Efficient Invariant Embeddings for Universal Equivariant Learning
 
Abstract:
In many machine learning tasks, the goal is to learn an unknown function which has some known group symmetries. Equivariant machine learning algorithms exploit this by devising architectures (that is, function spaces) which have these symmetries by construction. Examples include convolutional neural networks, which respect the translation symmetry of images, and neural networks for graphs or sets, which respect their permutation symmetries. More examples will be discussed…
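
As background (and not code from the talk), the following is a minimal NumPy sketch of the "symmetry by construction" idea for sets, in the style of sum-pooling architectures. The per-element map phi and readout rho are hypothetical placeholders: any composition of the form rho(sum_i phi(x_i)) is permutation invariant, no matter how phi and rho are chosen.

import numpy as np

def phi(x):
    # Hypothetical per-element feature map; any elementwise function
    # keeps the overall construction permutation invariant.
    return np.stack([x, x**2, np.tanh(x)], axis=-1)

def rho(z):
    # Hypothetical readout applied to the pooled features.
    return np.tanh(z).sum()

def invariant_model(points):
    # Summing phi over the elements discards their order, so
    # rho(sum_i phi(x_i)) is invariant to any reordering of the input.
    return rho(phi(points).sum(axis=0))

x = np.array([0.5, -1.2, 3.0])
perm = np.random.permutation(len(x))
assert np.isclose(invariant_model(x), invariant_model(x[perm]))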

A common theoretical requirement of an equivariant architecture is that it be universal, meaning that it can approximate any continuous equivariant function. This question typically boils down to another theoretical question: given a group G acting on a set V, can we find a mapping f: V → R^m such that f is G-invariant and, at the same time, separates any two points in V which are not related by a G-symmetry? Such a mapping is essentially an injective embedding of the quotient space V/G into R^m, which can then be used to prove universality. We will review results showing that under very general assumptions such a mapping f exists, and that the embedding dimension m can be taken to be 2·dim(V)+1. We will show that in some cases (e.g., graphs) computing such an f can be very expensive, and will discuss our methodology for efficient computation of such an f in other cases (e.g., sets). This methodology is a generalization of the algebraic-geometry argument used in the well-known proof of phase retrieval injectivity.
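
A classical concrete instance of such a separating invariant map, offered here as an illustration rather than as the construction from the talk, is the permutation group S_n acting on V = R^n: the first n power sums give a permutation-invariant map f: R^n → R^n which, by Newton's identities, separates orbits, i.e., embeds the quotient R^n/S_n injectively.

import numpy as np

def power_sum_embedding(x):
    # f(x) = (p_1, ..., p_n) with p_k = sum_i x_i^k.
    # Each p_k is invariant to permutations of x, and by Newton's
    # identities the n power sums determine the elementary symmetric
    # polynomials, hence the multiset {x_1, ..., x_n} itself, so f
    # separates orbits of the permutation group acting on R^n.
    n = len(x)
    return np.array([np.sum(x**k) for k in range(1, n + 1)])

x = np.array([2.0, -1.0, 0.5])
y = x[np.random.permutation(3)]   # same multiset, different order
z = np.array([2.0, -1.0, 0.7])    # a genuinely different multiset

assert np.allclose(power_sum_embedding(x), power_sum_embedding(y))
assert not np.allclose(power_sum_embedding(x), power_sum_embedding(z))

Separating invariants of this kind are cheap to compute for sets; as the abstract notes, the analogous computation for graphs can be very expensive.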

Based on joint work with Steven J. Gortler.
 
 
Start date
Tuesday, Jan. 31, 2023, 1:25 p.m.
End date
Tuesday, Jan. 31, 2023, 2:25 p.m.
Location

Zoom
