CS&E Colloquium: Less Is More: Learning with Minimum Supervision for Embodied Agents

The computer science colloquium takes place on Mondays and Fridays from 11:15 a.m. to 12:15 p.m.

This week's speaker, Yanchao Yang (Stanford University), will be giving a talk titled "Less Is More: Learning with Minimum Supervision for Embodied Agents".

Abstract

We have recently seen many exciting robotic applications powered by neural networks trained on annotated datasets. However, when tested in the wild, these networks often output invalid predictions, resulting in unexpected failures. To survive in the physical world, autonomous agents have to manage the complexity and dynamics of real environments across tasks ranging from perception to decision-making, for which human supervision alone will never be enough, not to mention the scarcity of data in some situations.

My research aims to develop learning algorithms that minimally rely on human supervision for robotic sensing and visual representation, so that neural networks can exploit and generalize to out-of-domain data streams. In this talk, I will first present an information-theoretic principle for detecting and segmenting objects in real scenes. By exploiting the inductive bias in the data, the method operates without human supervision and can seamlessly incorporate multi-modal signals. It also enables continual learning of object representations from interaction for compositional scene understanding. I will then present techniques that make maximal use of existing datasets by transferring annotations to unlabeled domains, each of which tackles a distinct piece of the generalization problem.

Biography

Yanchao Yang is a Postdoctoral Research Fellow at Stanford University, working with Professor Leonidas J. Guibas in the Geometric Computation Group. He received his Ph.D. in Computer Science from the University of California, Los Angeles (UCLA), where he worked with Professor Stefano Soatto. His research lies at the intersection of computer vision, machine learning, and robotics, with a long-term interest in developmental robotics for embodied agents. He currently focuses on self-supervised and semi-supervised techniques that allow autonomous agents to learn perception and representations for physical interaction in open environments. He is a recipient of the Dean's Award for Academic Excellence, and his work has won the AWS Nominated Paper Award. More information can be found at https://yanchaoyang.github.io/.
 

Start date
Monday, Feb. 14, 2022, 11:15 a.m.
End date
Monday, Feb. 14, 2022, 12:15 p.m.
Location
Mechanical Engineering 108 or online via Zoom
