Past events

Industrial Problems Seminar: Research and Opportunities in the Mathematical Sciences at Oak Ridge National Laboratory

In collaboration with the Minnesota Center for Industrial Mathematics, the Industrial Problems Seminars are a forum for industrial researchers to offer a first-hand glimpse into industrial research. The seminars take place Fridays from 1:25 p.m. - 2:25 p.m.

This week's speaker, Juan Restrepo (Oak Ridge National Laboratory), will be giving a talk titled "Research and Opportunities in the Mathematical Sciences at Oak Ridge National Laboratory."

Registration is required to access the Zoom webinar.

Abstract

I will present a general overview of Oak Ridge National Laboratory research in mathematics and computing. A brief description of my own initiatives and research will be covered as well. I will also describe opportunities for students, postdocs, and professional mathematicians.

Biography

Dr. Juan M. Restrepo is a Distinguished Member of the R&D Staff at Oak Ridge National Laboratory. Restrepo is a fellow of SIAM and APS. He holds professorships at the University of Tennessee and Oregon State University. Prior to joining ORNL, he was a professor of mathematics at Oregon State University and at the University of Arizona. He has been a frequent IMA visitor.

His research focuses on data-driven methods for dynamics, statistical mechanics, ocean transport, and uncertainty quantification in climate science.

UMN Machine Learning Seminar: Machine Learning and Scientific Computing

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Fall 2021 semester.

This week's speaker, Eric Vanden-Eijnden (New York University), will be giving a talk titled "Machine Learning and Scientific Computing."

Abstract

The recent success of machine learning suggests that neural networks may be capable of approximating high-dimensional functions with controllably small errors. As a result, they could outperform standard function interpolation methods that have been the workhorses of current numerical methods. This feat offers exciting prospects for scientific computing, as it may allow us to solve problems in high dimension once thought intractable. At the same time, looking at the tools of machine learning through the lens of applied mathematics and numerical analysis can give new insights as to why and when neural networks can beat the curse of dimensionality. I will briefly discuss these issues, and present some applications related to solving PDEs in high dimension and sampling high-dimensional probability distributions.

Biography

Eric Vanden-Eijnden is a Professor of Mathematics at the Courant Institute of Mathematical Sciences, New York University. His research focuses on the mathematical and computational aspects of statistical mechanics, with applications to complex dynamical systems arising in molecular dynamics, materials science, atmosphere-ocean science, fluid dynamics, and neural networks. He is also interested in the mathematical foundations of machine learning (ML) and the applications of ML in scientific computing. He is known for the development and analysis of multiscale numerical methods for systems whose dynamics span a wide range of spatio-temporal scales. He is the winner of the Germund Dahlquist Prize and the J.D. Crawford Prize, and a recipient of the Vannevar Bush Faculty Fellowship.

IMA Data Science Seminar: Scalable and Sample-Efficient Active Learning for Graph-Based Classification

The Institute for Mathematics and Its Applications (IMA) Data Science Seminars are a forum for data scientists of IMA academic and industrial partners to discuss and learn about recent developments in the broad area of data science. The seminars take place on Tuesdays from 1:25 p.m. - 2:25 p.m.

This week's speaker, Kevin Miller (University of California, Los Angeles), will be giving a talk titled "Scalable and Sample-Efficient Active Learning for Graph-Based Classification."

You may attend the talk either in person in Walter 402 or remotely by registering via Zoom.

Abstract

Active learning in semi-supervised classification involves introducing additional labels for unlabelled data to improve the accuracy of the underlying classifier. A challenge is to identify which points to label to best improve performance while limiting the number of new labels; this is often reflected in a tradeoff between exploration and exploitation, similar to the reinforcement learning paradigm. I will talk about my recent work designing scalable and sample-efficient active learning methods for graph-based semi-supervised classifiers that naturally balance this exploration versus exploitation tradeoff. While most work in this field today focuses on active learning for fine-tuning neural networks, I will focus on the low-label rate case where deep learning methods are generally insufficient for producing meaningful classifiers.
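To make the exploration-exploitation tradeoff concrete, here is a minimal, hypothetical sketch of one standard ingredient of graph-based semi-supervised learning: propagating two known labels across a toy six-node graph via the harmonic (graph-Laplacian) solution, then querying the unlabeled node whose score lies closest to the decision boundary, which is a purely exploitative criterion. The graph, labels, and query rule are illustrative assumptions, not the speaker's method:

```python
import numpy as np

# Toy graph: two triangle clusters {0,1,2} and {3,4,5} joined by the edge 2-3
W = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

labeled = {0: 0.0, 5: 1.0}  # node -> class score in [0, 1]
unlabeled = [i for i in range(6) if i not in labeled]

# Harmonic solution: solve the graph Laplace equation with labels as boundary data
D = np.diag(W.sum(axis=1))
L = D - W
Luu = L[np.ix_(unlabeled, unlabeled)]
Lul = L[np.ix_(unlabeled, list(labeled))]
y_l = np.array([labeled[i] for i in labeled])
f_u = np.linalg.solve(Luu, -Lul @ y_l)  # propagated scores for unlabeled nodes

# Exploitation-style query: the unlabeled node nearest the 0.5 decision boundary
query = unlabeled[int(np.argmin(np.abs(f_u - 0.5)))]
```

Sample-efficient methods of the kind discussed in the talk refine this query rule so that it also explores regions of the graph far from any labeled node, rather than repeatedly sampling near the current decision boundary.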

Biography

Kevin Miller is a rising 5th year Ph.D. candidate in Applied Mathematics at the University of California, Los Angeles (UCLA), studying graph-based machine learning methods with Dr. Andrea Bertozzi. He is currently supported by the DOD’s National Defense Science and Engineering Graduate (NDSEG) Fellowship and was previously supported by the National Science Foundation's NRT MENTOR Fellowship. His undergraduate degree was in Applied and Computational Mathematics from Brigham Young University, Provo. His research focuses on active learning and uncertainty quantification in graph-based semi-supervised classification.

Last day to receive a 25% tuition refund for canceling full semester classes

The last day to receive a 25% tuition refund for canceling full semester classes is Monday, October 4.

View the full academic schedule on One Stop.

Application deadline for Post-Baccalaureate Certificate

The application deadline for spring admission to the data science post-baccalaureate certificate is October 1.

Applications must be submitted online. Before applying, students should review the application procedures.

Industrial Problems Seminar: Long-term Time Series Forecasting and Data Generated by Complex Systems

In collaboration with the Minnesota Center for Industrial Mathematics, the Industrial Problems Seminars are a forum for industrial researchers to offer a first-hand glimpse into industrial research. The seminars take place Fridays from 1:25 p.m. - 2:25 p.m.

This week's speaker, Kaisa Taipale (CH Robinson), will be giving a talk titled "Long-term Time Series Forecasting and Data Generated by Complex Systems."

You may attend the talk either in person in Walter 402 or remotely by registering via Zoom.

Abstract

Data science, machine learning, and artificial intelligence are all practices implemented by humans in the context of a complex and ever-changing world. This talk will focus on the challenges of long-term, seasonal, multicyclic time series forecasting in logistics. I will discuss algorithms and implementations including STL, TBATS, and Prophet, with additional attention to the data-generating processes in trucking and the US economy, and to the importance of understanding these processes when selecting an algorithm. Subject matter expertise must always inform mathematical exploration in industry, and indeed it leads to asking much more interesting mathematical questions.
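As a rough illustration of what seasonal-trend decomposition does (STL refines this idea with loess smoothing and robustness weights), the classical additive decomposition below recovers a yearly cycle from a synthetic monthly series. The data and the trend/seasonal split are purely hypothetical, not from the talk:

```python
import math

period = 12  # monthly data, yearly cycle
n = 60
# Synthetic series: linear trend plus a seasonal sinusoid
series = [0.5 * t + 10 * math.sin(2 * math.pi * t / period) for t in range(n)]

def trend_at(t):
    """Centered 2x12 moving average; undefined near the series edges."""
    if t < period // 2 or t >= n - period // 2:
        return None
    window = series[t - period // 2 : t + period // 2 + 1]
    # half-weight the endpoints so the even-length average stays centered
    return (window[0] / 2 + sum(window[1:-1]) + window[-1] / 2) / period

detrended = [(t, series[t] - trend_at(t)) for t in range(n)
             if trend_at(t) is not None]

# Seasonal component: average detrended value at each position in the cycle
seasonal = [0.0] * period
counts = [0] * period
for t, d in detrended:
    seasonal[t % period] += d
    counts[t % period] += 1
seasonal = [s / c for s, c in zip(seasonal, counts)]
```

Averaging detrended values by cycle position assumes a fixed seasonal shape; STL and TBATS exist precisely because real logistics series have evolving, multicyclic seasonality that this simple recipe cannot track.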

Data Science major application open

On October 1, applications open for the data science major. The application deadline is December 30.

Undergraduate students typically apply to a major while enrolled in fall semester courses during their sophomore year (third semester).

Submit your application online.

All applicants will be notified of their admission decision via email within three weeks of the application deadline.

UMN Machine Learning Seminar: Secure Model Aggregation in Federated Learning

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Fall 2021 semester.

This week's speaker, Salman Avestimehr (University of Southern California), will be giving a talk titled "Secure Model Aggregation in Federated Learning."

Abstract

Federated learning (FL) has emerged as a promising approach for distributed machine learning over edge devices, in order to strengthen data privacy, reduce data migration costs, and break regulatory restrictions. A key component of FL is "secure model aggregation", which aims at protecting the privacy of each user's individual model while allowing their global aggregation. This problem can be viewed as privacy-preserving multi-party computation, but with two interesting twists: (1) some users may drop out during the protocol (due to poor connectivity, low battery, unavailability, etc.); (2) there is potential for multi-round privacy leakage, even if each round is perfectly secure. In this talk, I will first provide a brief overview of FL, then discuss several recent results on secure model aggregation, and finally end the talk by highlighting a few open problems in the area.
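The core idea behind secure model aggregation can be illustrated with pairwise additive masking: each pair of users agrees on a random mask that one adds and the other subtracts, so the server learns only the sum of the updates, never any individual one. The toy sketch below (scalar updates, no dropout handling, which is exactly where the hard problems discussed in the talk begin) is an assumption-laden illustration, not the speaker's protocol:

```python
import random

# Three users, each holding a private model update (a scalar here for clarity)
updates = {"u1": 0.2, "u2": -0.5, "u3": 0.9}
users = sorted(updates)

# Pairwise random masks: for each pair (i, j), user i adds r_ij and user j
# subtracts it, so the masks cancel in the aggregate but hide each update
rng = random.Random(0)
masks = {(i, j): rng.uniform(-100, 100)
         for i in users for j in users if i < j}

def masked_update(u):
    y = updates[u]
    for (i, j), r in masks.items():
        if u == i:
            y += r
        elif u == j:
            y -= r
    return y

# The server sees only masked values; their sum equals the true aggregate
server_sum = sum(masked_update(u) for u in users)
```

Note the fragility this sketch exposes: if one user drops out after sending its masked value, its pairwise masks no longer cancel, and the recovery mechanisms needed to handle that are a central topic of secure aggregation research.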

Biography

Salman Avestimehr is a Dean's Professor, the inaugural director of the USC-Amazon Center on Secure and Trusted Machine Learning (Trusted AI), and the director of the Information Theory and Machine Learning (vITAL) research lab at the Electrical and Computer Engineering Department of the University of Southern California. He is also an Amazon Scholar at Alexa AI. He received his Ph.D. in 2008 and his M.S. degree in 2005 in Electrical Engineering and Computer Science, both from the University of California, Berkeley. Prior to that, he obtained his B.S. in Electrical Engineering from Sharif University of Technology in 2003. His research interests include information theory, large-scale distributed computing and machine learning, secure and private computing and learning, and federated learning.

Dr. Avestimehr has received a number of awards for his research, including the James L. Massey Research & Teaching Award from the IEEE Information Theory Society, an Information Theory Society and Communication Society Joint Paper Award, a Presidential Early Career Award for Scientists and Engineers (PECASE) from the White House (President Obama), a Young Investigator Program (YIP) award from the U.S. Air Force Office of Scientific Research, a National Science Foundation CAREER award, a USC Mentoring Award, the David J. Sakrison Memorial Prize, and several best paper awards at conferences. He has been an Associate Editor for the IEEE Transactions on Information Theory and a general Co-Chair of the 2020 International Symposium on Information Theory (ISIT). He is a fellow of IEEE.

IMA Data Science Seminar

The Institute for Mathematics and Its Applications (IMA) Data Science Seminars are a forum for data scientists of IMA academic and industrial partners to discuss and learn about recent developments in the broad area of data science. The seminars take place on Tuesdays from 1:25 p.m. - 2:25 p.m.

This week's speaker is Boris Landa (Yale University).

Last day to receive a 50% tuition refund for canceling full semester classes

The last day to receive a 50% tuition refund for canceling full semester classes is Monday, September 27.

View the full academic schedule on One Stop.