Math-to-Industry Boot Camp IX
Applications are due Friday, March 15th, 2024.
Overview
The Math-to-Industry Boot Camp is an intense six-week session designed to provide graduate students with training and experience that is valuable for employment outside of academia. The program is targeted at Ph.D. students in pure and applied mathematics. The boot camp consists of courses in the basics of programming, data analysis, and mathematical modeling. Students work in teams on projects and are provided with training in resume and interview preparation as well as teamwork.
There are two group projects during the session: a small-scale project designed to introduce the concept of solving open-ended problems and working in teams, and a "capstone project" that is posed by industrial scientists. Recent industrial sponsors included Cargill, Securian Financial, and CH Robinson. Weekly seminars by speakers from many industry sectors provide the students with opportunities to learn about a variety of possible future careers.
Organizers
- Thomas Hoft, University of St. Thomas
- Daniel Spirn, University of Minnesota, Twin Cities
Capstone Projects
Parsimonious Model of the Implied Volatility Surface
Chris Jones, US Bank
Description: Option-implied volatility of interest rates has been of critical importance to financial market modeling. Option-implied volatilities essentially provide the market's perception of the uncertainty of an underlying interest rate at a specified horizon. The volatilities of at-the-money interest rate options across various interest rate tenors and option expirations form a three-dimensional surface. The volatility of short-term interest rates is closely related to the uncertainty of monetary policy, such as the level of the federal funds rate. Longer-term rates and longer-term option expirations are important in understanding uncertainty around inflation and growth expectations. Therefore, understanding how this surface changes provides quantitative models with a tool for interest rate risk management and bank balance sheet management in general. In this project, we will introduce a simple, parsimonious model capable of representing the range of shapes associated with the volatility surface. Participants will gain experience working with market data and parameter fitting, along with a unique perspective on bank quantitative modeling of a subject of central importance in derivatives pricing, interest rate modeling, and scenario analysis.
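As a rough illustration of what a parsimonious fit can look like, the sketch below fits a three-parameter surface in option expiry and rate tenor to a handful of at-the-money quotes by least squares. The functional form, parameter names, and quotes are illustrative assumptions only, not the model or data used in the project.

```python
# Minimal sketch: fit a low-parameter surface vol(expiry, tenor) to quoted
# at-the-money implied volatilities. The functional form and synthetic quotes
# below are illustrative assumptions, not the project's model or data.
import numpy as np
from scipy.optimize import least_squares

def surface(params, expiry, tenor):
    # Parsimonious form: overall level + decay in option expiry + slope in tenor
    level, decay, slope = params
    return level + decay * np.exp(-expiry) + slope * np.log1p(tenor)

# Hypothetical quotes: (option expiry in years, rate tenor in years, vol in bp)
expiry = np.array([0.25, 0.25, 1.0, 1.0, 5.0, 5.0])
tenor = np.array([2.0, 10.0, 2.0, 10.0, 2.0, 10.0])
quoted = np.array([110.0, 95.0, 105.0, 92.0, 90.0, 85.0])

def residuals(params):
    return surface(params, expiry, tenor) - quoted

fit = least_squares(residuals, x0=[90.0, 20.0, -2.0])
print("fitted parameters:", fit.x)
print("fitted vols:      ", surface(fit.x, expiry, tenor))
```

With only three parameters the surface is summarized compactly; capturing richer shapes would require additional terms, and balancing that against simplicity is exactly the tradeoff a parsimonious model negotiates.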
CT Image Reconstruction Optimization
Kelsey Di Pietro, GE Healthcare
Abstract: Computed tomography (CT) is a commonly used diagnostic imaging procedure for a variety of medical specialties. CT images are created by taking several x-rays of a patient. These x-rays are then accumulated and processed through a series of computer algorithms to generate a three-dimensional image of the area of interest. Recently, there has been increased interest in developing robust photon-counting CT systems, which provide greater spectral resolution while lowering the x-ray dose to the patient. However, the data is much more sensitive to noise fluctuations and is orders of magnitude larger than that produced by most current CT systems. The challenge lies in reconstructing images of sufficient quality for a radiologist to read while forming the image relatively quickly (~5 minutes for a scan). The focus of this project is two-fold:
- We will use open-source data sets and Python to explore potential denoising algorithms in both the sinogram and image space for CT data (a minimal sketch follows this list).
- We will explore potential performance optimizations for handling the large data sets in CT imaging; in particular, how do we translate algorithms into a high-performance computing framework to get closer to the target image reconstruction times?
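As a starting point for the first item, here is a minimal sketch of sinogram-space denoising on synthetic data, assuming the scikit-image and SciPy packages are available; the phantom, noise level, and Gaussian filter are illustrative choices, not the project's data or algorithms.

```python
# Minimal sketch: denoise a synthetic sinogram before reconstruction.
# The Shepp-Logan phantom, noise level, and Gaussian filter are illustrative
# assumptions, not the open-source CT data or algorithms used in the project.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

rng = np.random.default_rng(0)

phantom = shepp_logan_phantom()                       # 400x400 test image
theta = np.linspace(0.0, 180.0, 180, endpoint=False)  # projection angles in degrees
sinogram = radon(phantom, theta=theta)                # forward projection

noisy = sinogram + rng.normal(0.0, 0.5, sinogram.shape)  # simulated measurement noise
denoised = gaussian_filter(noisy, sigma=1.0)             # sinogram-space smoothing

recon_noisy = iradon(noisy, theta=theta)        # filtered back-projection
recon_denoised = iradon(denoised, theta=theta)

for name, recon in [("noisy", recon_noisy), ("denoised", recon_denoised)]:
    mse = np.mean((recon - phantom) ** 2)
    print(f"{name:8s} reconstruction MSE: {mse:.5f}")
```

The same comparison could be repeated with image-space filters applied after reconstruction, which is the other half of the first item.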
This project is a great opportunity for anyone interested in scientific computing, medical imaging, or both.
Modeling Pre-provision Net Revenue
Kristina Martin, Minneapolis Federal Reserve Bank
Abstract: Pre-provision Net Revenue (PPNR) is an important measure of a bank's performance, and modeling PPNR is an active area of research. Projections of PPNR are a key part of the annual stress test exercise conducted by the Federal Reserve. PPNR has four subcomponents:
PPNR = Interest Income + Noninterest Income - Interest Expense - Noninterest Expense
The Fed uses several varieties of time series analysis to model different components of PPNR. In this project, we will use public regulatory and macroeconomic data to explore machine learning techniques for PPNR modeling. We will explore how components of PPNR relate to bank characteristics and macroeconomic indicators and use multiple measures of predictive accuracy to compare model performance. A key aspect of this project is to consider the tradeoff between explainability and performance in model design.
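As a sketch of that explainability-versus-performance comparison, the snippet below fits a plain linear regression and a gradient-boosted model to synthetic bank-quarter data and compares out-of-sample error; the features and data are illustrative assumptions, not the public regulatory or macroeconomic data the project will use.

```python
# Minimal sketch: compare an interpretable linear model with a gradient-boosted
# model for one hypothetical PPNR component. Features and data are synthetic
# stand-ins, not actual regulatory or macroeconomic series.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: log assets, loan share, GDP growth, fed funds rate
X = rng.normal(size=(n, 4))
# Hypothetical target (e.g., net interest income) with a mild interaction term
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] * X[:, 3] + 0.5 * X[:, 2] + rng.normal(0.0, 0.5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

linear = LinearRegression().fit(X_train, y_train)
boosted = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

print("linear MAE: ", mean_absolute_error(y_test, linear.predict(X_test)))
print("boosted MAE:", mean_absolute_error(y_test, boosted.predict(X_test)))
print("linear coefficients (directly interpretable):", linear.coef_)
```

The boosted model tends to win on accuracy when interactions matter, while the linear coefficients remain easy to explain to supervisors and examiners; weighing those two outcomes is the tradeoff at the heart of the project.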
Minimizing Noise, Vibrations and Harshness in Electric Vehicles
Parker Williams, Rivian
Abstract: The study of the tactile and aural properties of a vehicle is a branch of engineering called Noise, Vibration, and Harshness, or NVH. One method of understanding overall performance and quality for the driver is to analyze the paths vibration can take through the vehicle, whether structure-borne or airborne, and how those vibrations arrive at the driver’s ears. This approach is called transfer path analysis, and it is currently done using both simulation and experimentation. Each design choice, from material selection to the size of a mass-saving gap in a panel, affects NVH. Furthermore, the NVH group must work in tandem with design and manufacturing to understand how each change will affect the qualitative feel of the vehicle.
Primarily, our question reduces to a constrained optimization problem relative to downstream performance targets. We wish to explore optimization techniques that can take the existing transfer path analysis and identify optimal combinations of transfer paths, which in turn suggest optimal combinations of parts, materials, etc. Currently, exploration of this search space in the automotive industry is nascent: it is difficult not only from a combinatorial perspective but also because of nonlinear cost and mass considerations, and methods that address these do not exist in any NVH software.
The problem we want to address has two phases:
- Perform the linear, cost- and mass-unaware optimization to arrive at a known baseline of performance that is in line with our data (a minimal sketch of this phase follows the list).
- Incorporate nonlinear relationships into the optimization problem and develop methods to capture the qualitative nature of the final output: driver satisfaction.
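As a starting point for the first phase, here is a minimal sketch of a linear, cost- and mass-unaware optimization over a handful of hypothetical transfer paths; the path gains, bounds, and treatment budget are illustrative assumptions, not real transfer path data.

```python
# Minimal sketch of the phase-one linear optimization: choose per-path isolation
# to minimize the predicted response at the driver's ear, ignoring cost and mass.
# Path gains, bounds, and the treatment budget are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

path_gain = np.array([4.0, 2.5, 1.5, 3.0, 0.8])  # hypothetical per-path contributions
n_paths = len(path_gain)

# Decision variable x[i] in [0, 1]: fraction of path i left untreated.
# Predicted response at the driver's ear is sum(path_gain * x), which we minimize.
c = path_gain

# Feasibility constraint: only 2 "units" of treatment are available in total,
# i.e. sum(1 - x) <= 2, equivalently -sum(x) <= 2 - n_paths.
A_ub = [[-1.0] * n_paths]
b_ub = [2 - n_paths]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, 1.0)] * n_paths)
print("untreated fraction per path:", result.x)
print("baseline predicted response:", result.fun)
```

The solver spends the limited treatment on the highest-gain paths, which gives the known linear baseline against which the second phase's nonlinear and qualitative refinements can be compared.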