Past events

UMN Machine Learning Seminar

The UMN Machine Learning Seminar Series brings together faculty, students, and local industrial partners who are interested in the theoretical, computational, and applied aspects of machine learning to pose problems, exchange ideas, and foster collaborations. Talks are held every Thursday from 12 p.m. - 1 p.m. during the summer 2021 semester.

This week's speaker, Weijie Su (Wharton Statistics Department, University of Pennsylvania), will give a talk titled "Local Elasticity: A Phenomenological Approach Toward Understanding Deep Learning."

Biography

Weijie Su is an Assistant Professor in the Wharton Statistics Department and the Department of Computer and Information Science at the University of Pennsylvania. He is a co-director of Penn Research in Machine Learning. Prior to joining Penn, he received his Ph.D. from Stanford University in 2016 and his bachelor’s degree from Peking University in 2011. His research interests span machine learning, optimization, privacy-preserving data analysis, and high-dimensional statistics. He is a recipient of the Stanford Theodore Anderson Dissertation Award in 2016, an NSF CAREER Award in 2019, and a Sloan Research Fellowship in 2020.

Abstract

Motivated by the iterative nature of training neural networks, we ask: If the weights of a neural network are updated using the induced gradient on an image of a tiger, how does this update impact the prediction of the neural network at another image (say, an image of another tiger, a cat, or a plane)? To address this question, I will introduce a phenomenon termed local elasticity. Roughly speaking, our experiments show that modern deep neural networks are locally elastic in the sense that the change in prediction is likely to be most significant at another tiger and least significant at a plane, at late stages of the training process. I will illustrate some implications of local elasticity by relating it to the neural tangent kernel and improving on the generalization bound for uniform stability. Moreover, I will introduce a phenomenological model for simulating neural networks, which suggests that local elasticity may result from feature sharing between semantically related images and the hierarchical representations of high-level features. Finally, I will offer a local-elasticity-focused agenda for future research toward a theoretical foundation for deep learning.
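Not part of the talk itself, but as a minimal sketch of the measurement the abstract describes, the snippet below applies one SGD update computed on a single example and then compares how much the network's predictions change at a few probe inputs. The toy two-layer network and the random tensors standing in for the tiger, cat, and plane images are placeholder assumptions purely for illustration.

```python
# Minimal illustration (assumed setup, not code from the talk): one gradient update
# on a single input, then measure the change in predictions at other inputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU(), nn.Linear(128, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.05)

x_update = torch.randn(1, 3, 32, 32)   # stand-in for the "tiger" image used for the update
y_update = torch.tensor([0])           # its label
x_probe = torch.randn(3, 3, 32, 32)    # stand-ins for another tiger, a cat, and a plane

with torch.no_grad():
    before = F.softmax(model(x_probe), dim=1)

# One SGD step using only the single update example.
opt.zero_grad()
loss = F.cross_entropy(model(x_update), y_update)
loss.backward()
opt.step()

with torch.no_grad():
    after = F.softmax(model(x_probe), dim=1)

# Local elasticity would show up as larger changes at semantically similar probes.
change = (after - before).norm(dim=1)
for i, c in enumerate(change):
    print(f"probe {i}: prediction change = {c.item():.4f}")
```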

University closed

The University of Minnesota will be closed in observance of Memorial Day.

View the full schedule of University holidays.

Graduate Programs Information Session

Prospective students can RSVP for an information session to learn about the following graduate programs:

  • Computer Science M.S.
  • Computer Science MCS
  • Computer Science Ph.D.
  • Data Science M.S.
  • Data Science Post-Baccalaureate Certificate

During the information session, we will go over the following:

  • Requirements (general)
  • Applying
  • Prerequisite requirements
  • What makes a strong applicant
  • Funding
  • Resources
  • Common questions
  • Questions from attendees

End of spring semester

The last day of the spring 2021 semester is Wednesday, May 12.

View the full academic schedule on One Stop.

Final exams begin

Final exams for spring 2021 will be held between Thursday, May 6 and Wednesday, May 12.

View the full academic schedule on One Stop.

Cray Colloquium: Machine Learning and Inverse Problems in Imaging

The computer science colloquium takes place on Mondays from 11:15 a.m. - 12:15 p.m.

This week's talk is part of the Cray Distinguished Speaker Series. The series was established in 1981 by an endowment from Cray Research and brings distinguished visitors to the Department of Computer Science & Engineering every year.

Our speaker is Rebecca Willett from the University of Chicago.

Abstract

Many challenging image processing tasks can be described by an ill-posed linear inverse problem: deblurring, deconvolution, inpainting, compressed sensing, and superresolution all lie in this framework. Recent advances in machine learning and image processing have illustrated that it is often possible to learn inverse problem solvers from training data that can outperform more traditional approaches by large margins. These promising initial results lead to a myriad of mathematical and computational challenges and opportunities at the intersection of optimization theory, signal processing, and inverse problem theory.

In this talk, we will explore several of these challenges and the foundational tradeoffs that underlie them. First, we will examine how knowledge of the forward model can be incorporated into learned solvers and its impact on the amount of training data necessary for accurate solutions. Second, we will see how the convergence properties of many common approaches can be improved, leading to substantial empirical improvements in reconstruction accuracy. Finally, we will consider mechanisms that leverage learned solvers for one inverse problem to develop improved solvers for related inverse problems.

This is joint work with Davis Gilton and Greg Ongie.
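As a rough illustration of the linear inverse problem framing in this abstract (not the speaker's method), the sketch below sets up a small underdetermined problem y = Ax + noise and reconstructs x with a proximal-gradient loop. The random measurement matrix, step size, and the soft-thresholding step standing in for a learned regularizer are all illustrative assumptions; in a learned solver, that step would typically be replaced by a trained network.

```python
# Toy ill-posed linear inverse problem and a proximal-gradient reconstruction
# (assumed setup for illustration only).
import numpy as np

rng = np.random.default_rng(0)

n = 64
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0, 1, 5)   # sparse ground-truth signal

# Forward model: fewer measurements than unknowns, so the problem is underdetermined.
A = rng.normal(0, 1.0 / np.sqrt(n), (n // 2, n))
y = A @ x_true + 0.01 * rng.normal(size=n // 2)

x = np.zeros(n)
step, lam = 0.3, 0.05
for _ in range(300):
    grad = A.T @ (A @ x - y)          # gradient of the data-fidelity term ||Ax - y||^2 / 2
    z = x - step * grad               # forward-model-aware gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # regularization ("denoising") step

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```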

Biography

Rebecca Willett is a Professor of Statistics and Computer Science at the University of Chicago. Her research is focused on machine learning, signal processing, and large-scale data science. Willett received the National Science Foundation CAREER Award in 2007, was a member of the DARPA Computer Science Study Group, received an Air Force Office of Scientific Research Young Investigator Program award in 2010, and was named a Fellow of the Society for Industrial and Applied Mathematics in 2021. She is a co-principal investigator and member of the Executive Committee for the Institute for the Foundations of Data Science, helps direct the Air Force Research Lab University Center of Excellence on Machine Learning, and currently leads the University of Chicago’s AI+Science Initiative. She serves on advisory committees for the National Science Foundation’s Institute for Mathematical and Statistical Innovation, the AI for Science Committee for the US Department of Energy’s Advanced Scientific Computing Research program, the Sandia National Laboratories Computing and Information Sciences Program, and the University of Tokyo Institute for AI and Beyond. She completed her Ph.D. in Electrical and Computer Engineering at Rice University in 2005 and was an Assistant Professor and then a tenured Associate Professor of Electrical and Computer Engineering at Duke University from 2005 to 2013. She was an Associate Professor of Electrical and Computer Engineering, Harvey D. Spangler Faculty Scholar, and Fellow of the Wisconsin Institutes for Discovery at the University of Wisconsin-Madison from 2013 to 2018.

Last day of instruction

The last day of instruction for the spring 2021 semester is Monday, May 3.

View the full academic schedule on One Stop.

IMA Data Science Seminar

Data science seminars hosted by the Institute for Mathematics and Its Applications (IMA) take place on Tuesdays from 1:25 p.m. - 2:25 p.m.

This week, Diego Cifuentes (Massachusetts Institute of Technology) will give the lecture.

View the full list of IMA data science seminars.

Data Science Poster Fair

We invite you to attend the annual Data Science Poster Fair! This year's event will be held virtually via Zoom on Friday, April 23 from 11:30 a.m. - 1:00 p.m.

Every year, data science M.S. students present their capstone projects at this event. Research preview videos have been posted below so attendees can watch them in advance and plan their participation; click on the title of each project in the schedule below to find the abstract and a link to the video.

The poster fair is open to the public, and all interested undergraduate and graduate students, alumni, staff, faculty, and industry professionals are encouraged to attend. The event will be offered as a single Zoom session with four parallel sessions organized into five quarter-hour time slots; each parallel session will take place in a separate breakout room within the main Zoom session. Attendees with Zoom version 5.3 or later will be able to move between breakout rooms at will. During the live event, the Zoom host will serve as a moderator who can help with logistical problems; the moderator can be reached via the chat function or by returning to the main Zoom room.

Schedule

Breakout Session 1 (11:30 a.m. - 11:45 a.m.)

  • Project A: “A Causal Analysis of Bipolar Disorder” by Hunter Chavis-Blakely (Advisor: Erich Kummerfeld)
  • Project B: “Detecting mental state using Machine Learning” by Mingqian Duan (Advisor: Ju Sun)
  • Project C: “Modeling COVID-19 Case Counts with Long Short-Term Memory Networks” by Brandon Voigt (Advisor: Tracy Flood)
  • Project D: “Exploring User Engagement in An Online Health Community” by Ruyuan Wan (Advisors: Lana Yarosh, Maria Gini)

Breakout Session 2 (11:45 a.m. - 12:00 p.m.)

  • Project A: “Federated Learning approach to crop identification from satellite data” by Anubha Agrawal (Advisor: Kevin Silverstein)
  • Project B: “Defining and Monitoring Patient Clusters Based on Therapy Adherence in Sleep Apnea Management” by Mourya Karan Reddy Baddam (Advisor: Jaideep Srivastava)
  • Project C: “Inferring level of social connectedness from observed online/offline behavior” by Ayushi Rastogi (Advisor: Lana Yarosh)
  • Project D: “Deep Learning Approaches for Breast Cancer Related Concepts Extraction from Electronic Health Records” by Sicheng Zhou (Advisor: Rui Zhang)

Breakout Session 3 (12:00 p.m. - 12:15 p.m.)

  • Project A: “Using electronic health records to understand the effects of dietary supplements among patients with Mild cognitive impairment” by Jiyang Chen (Advisor: Rui Zhang)
  • Project B: “Machine Learning in Stock Trading” by Yuanyuan Qiu (Advisor: Paul Schrater)
  • Project C: “Deep Learning for Morphology Detection of Self-Assembly in Atomistic Simulation” by Zhengyuan Shen (Advisor: Ilja Siepmann)
  • Project D: “Prediction Model for Mortality in Patients with Rib Fractures Based on ICU Timeline” by Qixian Zhao (Advisor: Christopher Tignanelli)

Breakout Session 4 (12:15 p.m. - 12:30 p.m.)

  • Project A: “Implementing GAN-based method for real-valued medical time series data generation” by Anushree Choudhary (Advisor: Jaideep Srivastava)
  • Project B: “Fraud Detection Using Machine Learning Methods” by Sheng Huang (Advisor: Daniel Boley)
  • Project C: “Anomaly detection in Ship Trajectories” by Divya Shrinivasa Nairy (Advisor: Shashi Shekhar)
  • Project D: “Deep Neural Network Diagnosis of Autism Spectrum Disorder Through Visual Image Eye Movements” by Connor Theisen (Advisor: Catherine Qi Zhao)

Breakout Session 5 (12:30 p.m. - 12:45 p.m.)

  • Project A: “Fake Chest Radiograph Generation” by Rutvij Umesh Bora (Advisor: Daniel Boley)
  • Project B: “Hazard Detection for the Visually Impaired” by Jason Moericke (Advisor: Paul Schrater)
  • Project C: “Leveraging Machine Learning to Predict Inherited Variants Associated with Chronic Lymphocytic Leukemia” by Raphael Mwangi (Advisor: Cavan Reilly)
  • Project D: “Chest X-Ray Images Generation Using GAN” by Iris Yan (Advisor: Daniel Boley)

Please contact Allison Small at csgradmn@umn.edu with any questions.

IMA Data Science Seminar

Data science seminars hosted by the Institute for Mathematics and Its Applications (IMA) take place on Tuesdays from 1:25 p.m. - 2:25 p.m.

This week, Lars Ruthotto (Emory University) will give the lecture.

View the full list of IMA data science seminars.