Past events

MSSE Online Information Session

Have all your questions about the Master of Science in Software Engineering (MSSE) program answered by attending this online information session.

RSVP now to reserve your spot.

Attendees will be sent a link prior to the event.

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Summer 2021 semester.

This week's speaker, Yu Xiang (University of Utah), will be giving a talk.

Abstract

Diffusion source identification on networks is a problem of fundamental importance in a broad class of applications, including rumor control and virus identification. Though this problem has received significant recent attention, most studies have focused only on very restrictive settings and lack theoretical guarantees for more realistic networks. We introduce a statistical framework for the study of diffusion source identification and develop a confidence set inference approach inspired by hypothesis testing. Our method efficiently produces a small subset of nodes, which provably covers the source node with any pre-specified confidence level without restrictive assumptions on network structures. Moreover, we propose multiple Monte Carlo strategies for the inference procedure, based on network topology and probabilistic properties, that significantly improve scalability. To our knowledge, this is the first diffusion source identification method with a practically useful theoretical guarantee on general networks. We demonstrate our approach via extensive synthetic experiments on well-known random network models and on a mobility network between cities concerning the spread of COVID-19. This is joint work with Quilan Dawkins and Haifeng Xu at UVA.

Biography

Yu Xiang has been an Assistant Professor of Electrical and Computer Engineering at the University of Utah since July 2018. Prior to this, he was a postdoctoral fellow at the John A. Paulson School of Engineering and Applied Sciences at Harvard University. He obtained his Ph.D. in Electrical and Computer Engineering from the University of California, San Diego in 2015, and received his B.E. with the highest distinction from the School of Telecommunications Engineering at Xidian University, Xi'an, China, in 2008. His current research interests include statistical signal processing, information theory, machine learning, and their applications to neuroscience and computational biology.

2nd annual (virtual) workshop on Knowledge Guided Machine Learning

We are excited to announce the 2nd annual workshop on Knowledge Guided Machine Learning (KGML2021).

This virtual workshop will be held August 9-11, 2021, with presentations via Zoom and YouTube (links will be provided just prior to the workshop start date). KGML2021 is part of a project funded by an award from the National Science Foundation's Harnessing the Data Revolution (HDR) Big Idea, and is free and open to anyone to attend.

The workshop will include invited talks by leading experts and contributed poster sessions. It will bring together data scientists (researchers in data mining, machine learning, and statistics) and researchers from hydrology, atmospheric science, aquatic sciences, and translational biology to discuss challenges, opportunities, and early progress in designing a new generation of machine learning methods that are guided by scientific knowledge.

Register here. Space permitting, registration will remain open until August 8.

The previous workshop (held August 18-20, 2020) attracted over 1,000 attendees from over 30 countries.

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations.

This week's speaker, Tianxi Li (University of Virginia), will be giving a talk titled "Diffusion Source Identification on Networks with Statistical Confidence." Please note that this week's seminar will be held from 12:30 p.m. - 1:30 p.m.

Abstract

Diffusion source identification on networks is a problem of fundamental importance in a broad class of applications, including rumor control and virus identification. Though this problem has received significant recent attention, most studies have focused only on very restrictive settings and lack theoretical guarantees for more realistic networks. We introduce a statistical framework for the study of diffusion source identification and develop a confidence set inference approach inspired by hypothesis testing. Our method efficiently produces a small subset of nodes, which provably covers the source node with any pre-specified confidence level without restrictive assumptions on network structures. Moreover, we propose multiple Monte Carlo strategies for the inference procedure, based on network topology and probabilistic properties, that significantly improve scalability. To our knowledge, this is the first diffusion source identification method with a practically useful theoretical guarantee on general networks. We demonstrate our approach via extensive synthetic experiments on well-known random network models and on a mobility network between cities concerning the spread of COVID-19. This is joint work with Quilan Dawkins and Haifeng Xu at UVA.
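
As a rough illustration of the confidence-set idea only (not the authors' algorithm), the sketch below simulates a simple SI diffusion from each candidate source and keeps the candidates whose simulations are sufficiently consistent with the observed infected set. The SI dynamics, the overlap test statistic, and the 50% threshold are all ad-hoc stand-ins chosen for brevity.

```python
import random

def simulate_si(graph, source, steps, p=0.3, rng=None):
    """Simulate a toy SI diffusion from `source` for `steps` rounds;
    each infected node infects each susceptible neighbor with prob. p."""
    rng = rng or random.Random()
    infected = {source}
    for _ in range(steps):
        new = set()
        for u in infected:
            for v in graph[u]:
                if v not in infected and rng.random() < p:
                    new.add(v)
        infected |= new
    return infected

def confidence_set(graph, observed, steps, alpha=0.1, n_sim=50, seed=0):
    """Monte Carlo sketch: for each candidate source (necessarily an
    infected node), estimate how often a simulated diffusion reproduces
    a large share of the observed snapshot; keep candidates whose
    estimated p-value exceeds alpha."""
    rng = random.Random(seed)
    keep = []
    for s in observed:
        sims = [len(simulate_si(graph, s, steps, rng=rng) & observed)
                for _ in range(n_sim)]
        # ad-hoc statistic: fraction of runs covering half the snapshot
        pval = sum(st >= 0.5 * len(observed) for st in sims) / n_sim
        if pval > alpha:
            keep.append(s)
    return keep
```

In the paper's framework the test statistic and threshold are chosen so that the returned set provably covers the true source with the pre-specified confidence level; this sketch only conveys the simulate-and-screen structure.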

Biography

Tianxi Li is an assistant professor in the Department of Statistics at the University of Virginia. He obtained his Ph.D. from the University of Michigan in 2018. His research focuses on statistical machine learning and statistical network analysis.

MSSE Online Information Session

Have all your questions about the Master of Science in Software Engineering (MSSE) program answered by attending this online information session.

RSVP now to reserve your spot.

Attendees will be sent a link prior to the event.

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Summer 2021 semester.

This week's speaker, Chiyuan Zhang (Google Brain), will be giving a talk titled "Characterizing Structural Regularities of Labeled Data in Overparameterized Models."

Abstract

Humans are accustomed to environments that contain both regularities and exceptions. For example, at most gas stations, one pays prior to pumping, but the occasional rural station does not accept payment in advance. Likewise, deep neural networks can generalize across instances that share common patterns or structures, yet have the capacity to memorize rare or irregular forms. We analyze how individual instances are treated by a model via a consistency score. The score characterizes the expected accuracy for a held-out instance given training sets of varying size sampled from the data distribution. We obtain empirical estimates of this score for individual instances in multiple data sets, and we show that the score identifies out-of-distribution and mislabeled examples at one end of the continuum and strongly regular examples at the other end. We identify computationally inexpensive proxies to the consistency score using statistics collected during training. We show examples of potential applications to the analysis of deep-learning systems.
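
The held-out-accuracy idea behind the consistency score can be mimicked in a toy setting. The sketch below is a stand-in, not the speaker's pipeline: it uses a nearest-centroid classifier on one-dimensional data and, for each instance, averages whether models trained on random fixed-size subsets that exclude it predict its label correctly. Mislabeled or irregular instances come out with low scores.

```python
import random
import statistics

def nearest_centroid_predict(train, x):
    """Predict the label whose class centroid is closest to x (1-D toy model)."""
    by_label = {}
    for xi, yi in train:
        by_label.setdefault(yi, []).append(xi)
    centroids = {y: statistics.mean(v) for y, v in by_label.items()}
    return min(centroids, key=lambda y: abs(centroids[y] - x))

def consistency_scores(data, subset_size, n_rounds=200, seed=0):
    """Estimate, per instance, the probability it is classified correctly
    by a model trained on a random subset that happens to exclude it."""
    rng = random.Random(seed)
    hits = [0] * len(data)
    counts = [0] * len(data)
    for _ in range(n_rounds):
        idx = set(rng.sample(range(len(data)), subset_size))
        train = [data[i] for i in idx]
        for i, (x, y) in enumerate(data):
            if i in idx:
                continue                 # score only held-out instances
            hits[i] += nearest_centroid_predict(train, x) == y
            counts[i] += 1
    return [h / c if c else float("nan") for h, c in zip(hits, counts)]
```

With two well-separated clusters and one mislabeled point, the mislabeled point is essentially never predicted correctly when held out, landing at the low end of the continuum the abstract describes.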

Biography

Chiyuan Zhang is a research scientist at Google Research, Brain Team. He is interested in analyzing and understanding the foundations behind the effectiveness of deep learning, as well as its connection to the cognition and learning mechanisms of the human brain. He holds a Ph.D. from MIT (2017, advised by Tomaso Poggio), and Bachelor's (2009) and Master's (2012) degrees in computer science from Zhejiang University, China. His work was recognized with the INTERSPEECH best student paper award in 2014 and the ICLR best paper award in 2017.

Graduate Programs Information Session

Prospective students can RSVP for an information session to learn about the following graduate programs:

  • Computer Science M.S.
  • Computer Science MCS
  • Computer Science Ph.D.
  • Data Science M.S.
  • Data Science Post-Baccalaureate Certificate

During the information session, we will go over the following:

  • Requirements (general)
  • Applying
  • Prerequisite requirements
  • What makes a strong applicant
  • Funding
  • Resources
  • Common questions
  • Questions from attendees

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Summer 2021 semester.

This week's speaker, Xiwei Tang (University of Virginia), will be giving a talk titled "Multivariate Temporal Point Process Regression with Applications in Calcium Imaging Analysis."

Abstract

Point process modeling is gaining increasing attention, as point process type data are emerging in a large variety of scientific applications. In this article, motivated by a neuronal spike trains study, we propose a novel point process regression model, where both the response and the predictor can be a high-dimensional point process. We model the predictor effects through the conditional intensities using a set of basis transferring functions in a convolutional fashion. We organize the corresponding transferring coefficients in the form of a three-way tensor, then impose the low-rank, sparsity, and subgroup structures on this coefficient tensor. These structures help reduce the dimensionality, integrate information across different individual processes, and facilitate the interpretation. We develop a highly scalable optimization algorithm for parameter estimation. We derive the large sample error bound for the recovered coefficient tensor, and establish the subgroup identification consistency, while allowing the dimension of the multivariate point process to diverge. We demonstrate the efficacy of our method through both simulations and a cross-area neuronal spike trains analysis in a sensory cortex study.
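
The core modeling device, conditional intensities built by convolving past spikes with basis transfer functions whose coefficients form a three-way tensor, can be sketched in a discretized toy form. The time binning, basis choice, and rectification below are my assumptions, not the paper's estimator; the sketch only shows how the (output process, input process, basis) tensor enters the intensity.

```python
import numpy as np

def conditional_intensity(spikes, coef, bases, baseline):
    """spikes:   (n_in, T) 0/1 spike indicators per input process and time bin
       coef:     (n_out, n_in, K) transfer coefficients (the three-way tensor)
       bases:    (K, L) basis transfer functions over a lag window of L bins
       baseline: (n_out,) background rates
       Returns (n_out, T) nonnegative intensities via causal convolution."""
    n_in, T = spikes.shape
    L = bases.shape[1]
    # Collapse the basis dimension: per-pair lag filters (n_out, n_in, L)
    filt = np.einsum("ijk,kl->ijl", coef, bases)
    intensity = np.tile(baseline[:, None], (1, T)).astype(float)
    for t in range(T):
        for lag in range(1, min(L, t) + 1):      # strictly past bins only
            intensity[:, t] += filt[:, :, lag - 1] @ spikes[:, t - lag]
    return np.maximum(intensity, 0.0)            # keep rates nonnegative
```

In the paper, low-rank, sparsity, and subgroup structure are imposed on `coef` to reduce dimensionality and share information across processes; this sketch leaves the tensor unconstrained.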

Biography

Coming soon

MSSE Online Information Session

Have all your questions about the Master of Science in Software Engineering (MSSE) program answered by attending this online information session.

RSVP now to reserve your spot.

Attendees will be sent a link prior to the event.

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning, to pose problems, exchange ideas, and foster collaborations. The talks are every Thursday from 12 p.m. - 1 p.m. during the Summer 2021 semester.

This week's speaker, Zhaoran Wang (Northwestern University), will be giving a talk titled "Demystifying (Deep) Reinforcement Learning with Optimism and Pessimism."

Abstract

Coupled with powerful function approximators such as deep neural networks, reinforcement learning (RL) achieves tremendous empirical successes. However, its theoretical understanding lags behind. In particular, it remains unclear how to provably attain the optimal policy with a finite regret or sample complexity. In this talk, we will present two sides of the same coin, demonstrating an intriguing duality between optimism and pessimism.

– In the online setting, we aim to learn the optimal policy by actively interacting with the environment. To strike a balance between exploration and exploitation, we propose an optimistic least-squares value iteration algorithm, which achieves √T regret in the presence of linear, kernel, and neural function approximators.

– In the offline setting, we aim to learn the optimal policy based on a dataset collected a priori. Due to a lack of active interactions with the environment, we suffer from the insufficient coverage of the dataset. To maximally exploit the dataset, we propose a pessimistic least-squares value iteration algorithm, which achieves a minimax-optimal sample complexity.
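
The duality can be sketched in a one-step, bandit-like simplification of least-squares value iteration. The ridge fit and elliptical bonus below follow the standard LinUCB-style recipe rather than the speaker's exact algorithms; `beta` and the feature setup are illustrative assumptions. Optimism adds the bonus (explore where data is scarce); pessimism subtracts it (avoid poorly covered actions offline).

```python
import numpy as np

def lsvi_values(X, y, X_query, beta, lam=1.0, pessimistic=False):
    """One-step least-squares value estimates with an elliptical bonus.
    X, y:    observed features (n, d) and rewards (n,)
    X_query: candidate actions to evaluate (m, d)
    beta:    bonus scale; optimism adds the bonus, pessimism subtracts it
    Returns (m,) value estimates."""
    d = X.shape[1]
    Lam = X.T @ X + lam * np.eye(d)           # regularized Gram matrix
    theta = np.linalg.solve(Lam, X.T @ y)     # ridge least-squares fit
    Lam_inv = np.linalg.inv(Lam)
    # bonus_i = beta * sqrt(x_i^T Lam^{-1} x_i): large where data is scarce
    bonus = beta * np.sqrt(np.einsum("ij,jk,ik->i", X_query, Lam_inv, X_query))
    sign = -1.0 if pessimistic else 1.0       # optimism online, pessimism offline
    return X_query @ theta + sign * bonus
```

With data concentrated along one feature direction, the uncovered direction receives a large bonus (encouraging exploration online) and a large penalty (discouraging selection offline), which is the coverage intuition behind the minimax-optimal offline result.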

Biography

Zhaoran Wang is an assistant professor at Northwestern University, working at the interface of machine learning, statistics, and optimization. He is the recipient of an AISTATS (Artificial Intelligence and Statistics Conference) notable paper award, a Microsoft Ph.D. Fellowship, a Simons-Berkeley/J.P. Morgan AI Research Fellowship, an Amazon Machine Learning Research Award, and an NSF CAREER Award.