Past Events

Developing Enabling Robotic Systems for High-throughput Plant Phenotyping by Dr. Tang

Join us for a presentation with Dr. Lie Tang from Iowa State University.
Dr. Tang will be presenting: Developing Enabling Robotic Systems for High-throughput Plant Phenotyping
Date: October 6th
Time: 10:00 AM - 11:00 AM
Location: BAE 106 or via Zoom (Meeting ID: 967 0425 4507)
 
Bio: Dr. Tang's research has been concerned with agricultural automation, optimization, machine intelligence, and robotics. He has many years of international research experience in Europe and the US. At KU Leuven (Belgium), he developed an advanced real-time machine vision system for automated behavior monitoring of group-housed pigs. During his Ph.D. studies he developed a sensing and control system for variable-rate and selective weed control, and he later developed an automated sensing system for corn plant spacing measurement for Deere & Company. While on the faculty in both Denmark and the Netherlands, he worked on agricultural robotics and intelligent systems. Dr. Tang is currently continuing his research in developing advanced sensing, optimization, and robotic technologies for agricultural production systems in the 21st century.

Faculty Meet and Greet with the MSR Program Students

Please join us at the MnRI Fall semester Meet and Greet with the MSR students. 

Light refreshments will be served. 

Robotics Colloquium: Guest Speaker- Ce Yang

Title: Drone and Ground-Based Remote Sensing for Precision Agriculture and Phenotyping

Abstract: Minnesota is a leading producer of several staple crops, including corn, soybean, and wheat. AI and remote sensing applied to agricultural fields enhance variable-rate technology, which addresses spatial variation within fields and improves crop production by controlling diseases and managing stresses more efficiently. Deep learning disease/stress detection modeling using remote sensing data and AI can also provide agricultural researchers with high-throughput solutions in field studies, e.g., scoring for diseases, pests, and stresses. This talk focuses on drone and ground-based remote sensing and AI modeling in Ce Yang's group for precision agriculture and high-throughput phenotyping.

Bio: Ce Yang is an associate professor working on remote sensing for precision agriculture and high-throughput phenotyping. She leads the Agricultural Robotics Lab at the University of Minnesota to work on nutrient management, yield prediction, and disease detection of staple food crops. The Ag Robotics Lab’s mission is to apply advanced ideas of robotics, remote sensing, data mining, and information technology to precision agriculture. Their core techniques include multispectral/hyperspectral imaging, spectroscopy, machine learning, geographic information systems (GIS), digital mapping, biochemical sensing, etc. The tools available for carrying out her research are unmanned aerial vehicles, unmanned ground vehicles, video cameras, multispectral cameras, hyperspectral cameras, DGPS, and various electrical, optical, and chemical sensors. Ce Yang obtained her Ph.D. in Agricultural Engineering and MS in Computer Science and Engineering at the University of Florida.

Practice Makes Perfect? Coaching by Observation and Simulation for Robots in Austere Environments

Bio: Dr. Voyles, the Daniel C. Lewis Professor of the Polytechnic, received a B.S. in Electrical Engineering from Purdue University in 1983, an M.S. in Mechanical Engineering from Stanford University in 1989, and a Ph.D. in Robotics from the School of Computer Science at Carnegie Mellon University in 1997. He was an Assistant Professor and then a tenured Associate Professor at the University of Minnesota from 1997 - 2007, a tenured Associate Professor of Electrical and Computer Engineering at the University of Denver from 2006 - 2013, an NSF Program Director in CISE from 2010 - 2013, and Assistant Director of Robotics and Cyber-Physical Systems at the White House Office of Science and Technology Policy from 2014 - 2015; since 2013, he has been Professor of Engineering Technology at Purdue University. He runs the Collaborative Robotics Lab, is the Director of the Purdue Robotics Accelerator, and was the Site Director of the NSF Center for Robotics and Sensors for Human Well-Being (RoSe-HUB). He is an IEEE Fellow.

Dr. Voyles' research interests are in the areas of miniature, constrained robots, mobile manipulation, Form + Function 4D Printing, learning from observation, robot-to-robot skill transfer for medical robotics, precision animal agriculture, and haptic sensors and actuators.

MnRI Master In Robotics Fall 2023 Admitted Students Welcoming Event


An in-person welcoming event where you will tour the building, meet current students and faculty, and complete administrative items for the program.

How the program can help in professional placement.

Join us on June 9, 2023, from 2:30 PM to 3:30 PM for an in-person event where the MSR program director talks about job placements over cookies and coffee.

Dissertation Title: Bridging Visual Perception and Reasoning: A Visual Attention Perspective

Doctoral Candidate: Shi Chen

Faculty Advisor: Dr. Catherine Qi Zhao

Defense Date and Time: Wednesday, May 24th, 10 AM - 12 PM

Abstract:

One of the fundamental goals of Artificial Intelligence (AI) is to develop visual systems that can reason about the complexity of the world. Advances in machine learning have revolutionized many fields in computer vision, achieving human-level performance on several benchmark tasks and in industrial applications. While the performance gap between machines and humans seems to be closing, recent debates on the discrepancies between machine and human intelligence have also received considerable attention. These contradictory observations strike at the very heart of AI research and raise the question: How can AI systems understand the comprehensive range of visual concepts and reason with them to accomplish various real-life tasks, as we do on a daily basis?

Humans learn much from little. With just a few relevant experiences, we are able to adapt to different situations. We also take advantage of inductive biases that generalize easily, and avoid distraction from all kinds of statistical biases. This innate generalizability is a result not only of our profound understanding of the world but also of the ways we perceive and reason with visual information. For instance, unlike machines that develop holistic understanding by scanning through the whole visual scene, humans prioritize their attention with a sequence of eye fixations. Guided by visual stimuli and a structured reasoning process, we progressively locate the regions of interest and understand their semantic relationships as well as their connections to the overall task. Research on humans' visual behavior can provide abundant insights into the development of vision models and has the potential to contribute to AI systems that are practical for real-world scenarios.

With an overarching goal of building visual systems with human-like reasoning capability, we focus on understanding and enhancing the integration of visual perception and reasoning. We leverage visual attention as an interface for studying how humans and machines prioritize their focuses when performing visual reasoning tasks, and shed light on two important research questions: What roles does attention play in decision-making? How do we characterize attention in different scenarios?

We provide insights into these questions by making progress from three distinct perspectives: (1) From the visual perception perspective, we study how humans and machines allocate their attention when interacting with a variety of visual environments. We investigate the fine-grained characteristics of attention, which reveals the significance of different visual concepts and how they contribute to perception. (2) From the reasoning perspective, we pay attention to the connections between reasoning and visual perception, and develop vision models that make decisions in ways that agree with humans' reasoning procedures. (3) Humans not only capture and reason about important information with high accuracy, but can also justify their rationales with supporting evidence. We study the impacts of explainability in human-like intelligence and build generalizable and interpretable models. Our efforts provide an extensive collection of observations for demystifying the relationships between perception and reasoning, and offer insights into the development of trustworthy AI systems.

MnRI Master In Robotics Town Hall - Fall 2023 Admitted Students

Please join MnRI Director Nikos Papanikolopoulos at this event. Topics include course registration and preparation for starting Fall 2023.


MnRI, in conjunction with OVPR: Opportunity to present your research to the Office of the Undersecretary of Defense


In conjunction with OVPR (as a result of the visit of the VP for Research to MnRI), we will start holding short events that highlight your work with various funding agencies. The first event is on Monday, May 15, 2023, 9 am – 10 am (Central Time) with Dr. Kimberly Sablon of the Office of the Undersecretary of Defense. Her interests are in trustworthy AI and autonomy, and are described in detail at the following link:

https://www.defense.gov/About/Biographies/Biography/Article/3153444/the-principal-director-for-trusted-ai-and-autonomy/

This needs to be a focused and technically rich discussion. Start with a 10-minute overview and then have 2-3 faculty from your team provide 7-10 minute presentations of their research.

We will send the presentation material to the DOD in advance. The presentation will be audio-only, not video, as many rooms in the Pentagon do not have cameras. If you are interested, please send us your material by April 25 so the VP's office can select the material to send to the DOD.

I suggest preparing 5-7 slides that capture your current work, some promising results, and plans. Keep it short since we will try to fit 3-4 of you in the presentation. 

Does New York City's Community Preference Policy Violate the Fair Housing Act?

Abstract: Many constraints dictate the allocation of New York City's affordable housing. Each unit is reserved for households of a particular size and income level, and preferences are given to certain groups (people with disabilities, community residents, and municipal employees) for a fraction of the units in each building. We demonstrate that these policies combine to make it difficult for low-income applicants to win lotteries for buildings outside their own neighborhoods. This finding is relevant to an ongoing lawsuit challenging the city's policy of favoring community residents. 

Bio: Nick Arnosti is an Assistant Professor at the Department of Industrial and Systems Engineering at the University of Minnesota. His research focuses on giving away social goods such as affordable housing, public school seats, visas, and scarce medical supplies. He has also studied the allocation of hunting licenses, hiking permits, and discounted event tickets. Previously, he was an Assistant Professor at Columbia Business School. He received a Ph.D. in Operations Research from Stanford University in 2016.