Past events

UMN Machine Learning Seminar: Secure Model Aggregation in Federated Learning

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. Talks are held every Thursday from 12 p.m. to 1 p.m. during the Fall 2021 semester.

This week's speaker, Salman Avestimehr (University of Southern California), will be giving a talk titled "Secure Model Aggregation in Federated Learning."

Abstract

Federated learning (FL) has emerged as a promising approach for distributed machine learning over edge devices, in order to strengthen data privacy, reduce data migration costs, and break regulatory restrictions. A key component of FL is "secure model aggregation", which aims at protecting the privacy of each user's individual model while allowing their global aggregation. This problem can be viewed as privacy-preserving multi-party computation, but with two interesting twists: (1) some users may drop out during the protocol (due to poor connectivity, low battery, unavailability, etc.); (2) there is potential for multi-round privacy leakage, even if each round is perfectly secure. In this talk, I will first provide a brief overview of FL, then discuss several recent results on secure model aggregation, and finally end the talk by highlighting a few open problems in the area.
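As background for the aggregation idea the abstract refers to, the core trick in masking-based secure aggregation can be sketched in a few lines of Python. This is an illustrative toy, not the speaker's protocol: every function and variable name here is made up, and real protocols additionally use secret sharing so that the masks of dropped-out users can be recovered. Each pair of users agrees on a random mask that one adds and the other subtracts, so the server sees only masked updates, yet the masks cancel in the sum.

```python
import random

def pairwise_masks(user_ids, dim, seed_base=0):
    """Generate cancelling pairwise masks: for each pair (i, j) with i < j,
    user i adds the shared mask and user j subtracts it."""
    masks = {u: [0.0] * dim for u in user_ids}
    for i in user_ids:
        for j in user_ids:
            if i < j:
                # Deterministic integer seed standing in for a shared pairwise key.
                rng = random.Random(seed_base * 1000003 + i * 1000 + j)
                m = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
                for k in range(dim):
                    masks[i][k] += m[k]  # user i adds the mask
                    masks[j][k] -= m[k]  # user j subtracts it
    return masks

def masked_update(update, mask):
    """What a user actually sends to the server."""
    return [u + m for u, m in zip(update, mask)]

users = [1, 2, 3]
dim = 4
updates = {u: [float(u)] * dim for u in users}  # toy per-user model updates
masks = pairwise_masks(users, dim)

# The server only ever sees masked updates; individual updates stay hidden.
masked = [masked_update(updates[u], masks[u]) for u in users]
aggregate = [sum(col) for col in zip(*masked)]
# Pairwise masks cancel, so (up to float rounding) the aggregate equals the
# true sum of updates.
```

The twists highlighted in the abstract are exactly where this sketch falls short: if a user drops out mid-protocol, its masks no longer cancel, and repeating the scheme across rounds can leak information even when each round is secure in isolation.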

Biography

Salman Avestimehr is a Dean's Professor, the inaugural director of the USC-Amazon Center on Secure and Trusted Machine Learning (Trusted AI), and the director of the Information Theory and Machine Learning (vITAL) research lab in the Electrical and Computer Engineering Department of the University of Southern California. He is also an Amazon Scholar at Alexa AI. He received his Ph.D. in 2008 and M.S. degree in 2005 in Electrical Engineering and Computer Science, both from the University of California, Berkeley. Prior to that, he obtained his B.S. in Electrical Engineering from Sharif University of Technology in 2003. His research interests include information theory, large-scale distributed computing and machine learning, secure and private computing/learning, and federated learning.

Dr. Avestimehr has received a number of awards for his research, including the James L. Massey Research & Teaching Award from the IEEE Information Theory Society, an IEEE Information Theory Society and Communications Society Joint Paper Award, a Presidential Early Career Award for Scientists and Engineers (PECASE) from the White House (President Obama), a Young Investigator Program (YIP) award from the U.S. Air Force Office of Scientific Research, a National Science Foundation CAREER award, a USC Mentoring Award, the David J. Sakrison Memorial Prize, and several best paper awards at conferences. He has been an Associate Editor for the IEEE Transactions on Information Theory and a General Co-Chair of the 2020 International Symposium on Information Theory (ISIT). He is a fellow of IEEE.

MSSE Online Information Session

Have all your questions about the Master of Science in Software Engineering (MSSE) program answered by attending this online information session.

RSVP now to reserve your spot.

Attendees will be sent a link prior to the event.
 

Last day to receive a 50% tuition refund for canceling full semester classes

The last day to receive a 50% tuition refund for canceling full semester classes is Monday, September 27.

View the full academic schedule on One Stop.
 

CS&E Colloquium: At the deep end: addressing the underwater human-robot collaboration problem

The computer science colloquium takes place on Mondays from 11:15 a.m. to 12:15 p.m.

This week's speaker, Junaed Sattar (University of Minnesota), will be giving a talk titled "At the deep end: addressing the underwater human-robot collaboration problem."

Abstract

Autonomous underwater vehicles (AUVs) have traditionally been used for standalone missions, with limited or no direct human involvement, in applications where it is infeasible for humans to closely collaborate with the robots (e.g., long-term oceanographic surveys, search-and-rescue, infrastructure inspection). However, in recent decades, the advent of smaller AUVs suitable for working closely with humans (termed co-AUVs) has enabled robots and humans to collaborate on many subsea tasks. The underwater domain, nonetheless, is unique in many ways and stands out with its numerous challenges -- in sensing, control, and human-robot interaction -- that can justifiably be considered extreme. Our research at the Interactive Robotics and Vision Lab at the University of Minnesota looks into numerous issues in robust underwater human-robot collaboration. Specifically, we investigate underwater bidirectional human-robot communication, underwater imagery enhancement, localization/mapping of underwater objects of interest using multimodal sensing, and biological and non-biological object tracking. We primarily investigate computational solutions to these problems, and use methods from robotics, machine vision, stochastic reasoning, and (deep) machine learning. This talk will present a brief overview of our research and an in-depth discussion of some recent work in underwater human-robot interaction and imagery enhancement.

Biography

Junaed is an assistant professor in the Department of Computer Science and Engineering at the University of Minnesota, a MnDRIVE (Minnesota's Discovery, Research, and InnoVation Economy) faculty member, and a member of the Minnesota Robotics Institute. He is the founding director of the Interactive Robotics and Vision Lab, where he and his students investigate problems in field robotics, robot vision, human-robot communication, assisted driving, and applied (deep) machine learning, and develop rugged robotic systems. His graduate degrees are from McGill University in Canada, and he holds a B.S. in engineering from the Bangladesh University of Engineering and Technology. Before coming to the University of Minnesota, he worked as a post-doctoral fellow at the University of British Columbia, where his research focused on human-robot dialog and assistive wheelchair robots, and at Clarkson University in New York as an Assistant Professor. Find him at junaedsattar.info, and the IRV Lab at irvlab.cs.umn.edu, @irvlab on Twitter, and their YouTube page at https://www.youtube.com/channel/UCbzteddfNPrARE7i1C82NdQ.

UMN Machine Learning Seminar: The Polyak-Lojasiewicz condition as a framework for over-parameterized optimization and its application to deep learning

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. Talks are held every Thursday from 12 p.m. to 1 p.m. during the Fall 2021 semester.

This week's speaker, Mikhail Belkin (University of California San Diego), will be giving a talk titled "The Polyak-Lojasiewicz condition as a framework for over-parameterized optimization and its application to deep learning."

Abstract

The success of deep learning is due, to a large extent, to the remarkable effectiveness of gradient-based optimization methods applied to large neural networks. In this talk I will discuss some general mathematical principles allowing for efficient optimization in over-parameterized non-linear systems, a setting that includes deep neural networks. I will show that optimization problems corresponding to these systems are not convex, even locally, but instead satisfy the Polyak-Lojasiewicz (PL) condition on most of the parameter space, allowing for efficient optimization by gradient descent or SGD. I will connect the PL condition of these systems to the condition number associated with the tangent kernel and show how a non-linear theory for those systems parallels classical analyses of over-parameterized linear equations. As a separate related development, I will discuss a perspective on the remarkable recently discovered phenomenon of transition to linearity (constancy of NTK) in certain classes of large neural networks. I will show how this transition to linearity results from the scaling of the Hessian with the size of the network, controlled by certain functional norms. Combining these ideas, I will show how the transition to linearity can be used to demonstrate the PL condition and convergence for a general class of wide neural networks. Finally, I will comment on systems which are "almost" over-parameterized, which appears to be common in practice.
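For readers unfamiliar with the condition in the title, the PL condition is a standard inequality, and its textbook consequence for gradient descent can be stated as follows (notation ours, not the speaker's):

```latex
% Polyak-Lojasiewicz (PL) condition for a loss L with infimum L^*:
% the gradient norm controls the suboptimality gap everywhere.
\[
\tfrac{1}{2}\,\bigl\|\nabla \mathcal{L}(\mathbf{w})\bigr\|^{2}
\;\ge\; \mu\,\bigl(\mathcal{L}(\mathbf{w}) - \mathcal{L}^{*}\bigr)
\qquad \text{for all } \mathbf{w},\ \text{some } \mu > 0.
\]
% Classical consequence: if L is also beta-smooth, gradient descent with
% step size 1/beta converges linearly despite non-convexity:
\[
\mathcal{L}(\mathbf{w}_{t}) - \mathcal{L}^{*}
\;\le\; \Bigl(1 - \tfrac{\mu}{\beta}\Bigr)^{t}
\bigl(\mathcal{L}(\mathbf{w}_{0}) - \mathcal{L}^{*}\bigr).
\]
```

Unlike convexity, the PL condition permits multiple global minima (typical of over-parameterized networks) while still guaranteeing linear convergence, which is why it serves as the framework in this talk.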

Biography

Mikhail Belkin received his Ph.D. in 2003 from the Department of Mathematics at the University of Chicago. His research interests are in theory and applications of machine learning and data analysis. Some of his well-known work includes the widely used Laplacian Eigenmaps, Graph Regularization, and Manifold Regularization algorithms, which brought ideas from classical differential geometry and spectral analysis to data science. His recent work has been concerned with understanding remarkable mathematical and statistical phenomena observed in deep learning. This empirical evidence necessitated revisiting some of the basic concepts in statistics and optimization. One of his key recent findings is the "double descent" risk curve that extends the textbook U-shaped bias-variance trade-off curve beyond the point of interpolation. Mikhail Belkin is a recipient of an NSF CAREER Award and a number of best paper and other awards. He has served on the editorial boards of the Journal of Machine Learning Research, IEEE Transactions on Pattern Analysis and Machine Intelligence, and the SIAM Journal on Mathematics of Data Science.

Fall 2021 College of Science and Engineering Virtual Career Fair

Tuesday, September 21 and Wednesday, September 22, 2021
Noon - 6 p.m. each day
The fair will be held via Handshake

View the day one and day two lists of companies recruiting now, and begin signing up for time slots to speak individually with companies starting September 14, 2021 at 8:00 a.m.

Visit the Career Information for Students webpage for more information!

For questions, contact the CSE Career Center at csecareer@umn.edu or by calling 612-624-4090.
 

Last day to apply for fall undergraduate graduation

The last day to apply for fall undergraduate graduation is Tuesday, September 21.

View the full academic schedule on One Stop.
 


Last day to cancel full semester classes and not receive a "W"

The last day to cancel full semester classes and not receive a "W" is Monday, September 20. This is also the last day to receive a 75% tuition refund for canceling full semester classes.

In addition, this is the last day to add classes without college approval and to change grade basis (A-F or S/N) for full semester classes.

View the full academic schedule on One Stop.
 

Canceled: CS&E Colloquium

The computer science colloquium for Monday, September 20 has been canceled.

The colloquia series will resume Monday, September 27 at 11:15 a.m.