
MnRI Newsletter — April 2021

From the Director

Nikos headshot

One of the most important application domains for robotics in Minnesota is agriculture. Our newsletter and website have previously covered the work of Professors David Mulla and Volkan Isler in this area, as well as the local start-up Rowbot. This column expands on some of the challenges and opportunities. Autonomous or semi-autonomous robotic systems (e.g., drones, autonomous tractors, robotic manipulators) are a good fit for agricultural tasks, which are often labor-intensive and repetitive. Tasks where one envisions automated systems include pruning, weed removal, harvesting and picking, thinning, spraying, and phenotyping.

These tasks also depend heavily on data (e.g., moisture, nutrient deficiencies, weather) that farmers need for effective decision-making. Together, these factors create an ideal ecosystem for the rapid deployment of robotic systems. Among the pressing challenges motivating these deployment efforts are: (i) a shortage of labor, (ii) the lack of simple and effective human-robot interfaces, (iii) the need to feed a growing population in financially sound ways, (iv) the shrinking supply of available farmland, and (v) the environmentally safe use of pesticides and fertilizers.

Minnesota is at the forefront of many of these dramatic changes in agriculture. The state has 40,469 square miles of farmland (2018 USDA Minnesota Ag News: Farms and Land in Farms report). Corn occupies more than 12,000 square miles, and soybeans account for a similar area. For comparison, Minnesota's water surface covers only 7,309 square miles. The state is also home to organizations such as the Corn and Soybean Growers Associations that have a strong interest in automation. These facts highlight the potential impact of robotics and automation not only on the state economy but also nationally and internationally. That impact will materialize only if robotic systems can work reliably in harsh outdoor conditions. Robotics and automation should rise to the occasion: sensing needs to be functional and accurate in the presence of occlusions and shadows, control needs to be robust, and manipulation should transition to systems that include force feedback and flexible grippers.

The University of Minnesota is well positioned to take up the challenge. Our faculty and students have been working in agricultural robotics and automation for many years, while the local industry and farmer organizations present unique opportunities for rapid deployment and testing of these exciting technologies. The digital transformation of agriculture is coming.

Stay safe and well,
Nikos Papanikolopoulos
Minnesota Robotics Institute Director


Return to Top

Cellular Robotics: Automated platforms for interfacing with single cells in tissue

Suhasa Kodandaramaiah headshot
Suhasa Kodandaramaiah,     
Assistant Professor,
Mechanical Engineering

For well over a century, biologists have used microscopic probes such as glass microneedles to perturb and interface with cells, the fundamental units of life, in order to understand their function, such as how they communicate with other cells within tissue. These highly specialized techniques, such as microinjection and patch clamping, have led to seminal discoveries in cell biology. Neher and Sakmann, who invented the patch clamping technique in the late 1970s and early 1980s, won the Nobel Prize in 1991 for their work. These techniques have been the gold standard in the field, but they are incredibly hard to perform and require considerable skill and practice.

My graduate work focused on applying engineering and robotics strategies to automate these delicate cellular interfacing techniques. We discovered that unbiased, non-image-guided, in vivo whole-cell patching ('blind' patch clamping) of neurons, in which micropipettes are lowered until a cell is detected and an opening in the cell membrane is then created for intracellular recording, can be reduced to a reliable algorithm, and we built a robot to perform this algorithm automatically (Kodandaramaiah et al., Nature Methods, 2012). We demonstrated that this robot could obtain recordings with quality and yield similar to or exceeding those of trained human practitioners. Building upon this initial work, we have since extended the autopatcher to perform whole-cell recordings in awake head-fixed animals (Kodandaramaiah et al., Nature Protocols, 2016) and programmed it to obtain whole-cell recordings from multiple neurons simultaneously in awake behaving animals (Kodandaramaiah et al., eLife, 2018).
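The cell-detection step of blind patching can be illustrated with a small sketch: as the pipette descends in fixed micro-steps, its electrical resistance is monitored, and a sustained resistance jump signals that a candidate neuron is blocking the tip. The function below is a hypothetical illustration of this idea; the names, thresholds, and step logic are assumptions for exposition, not the published parameters of the autopatcher.

```python
def find_cell(resistances, baseline, jump_kohm=200, confirm_steps=3):
    """Scan pipette-resistance readings (kOhm) taken as the pipette descends
    in fixed micro-steps. Return the step index where a sustained resistance
    jump above baseline (a candidate neuron at the tip) first begins, or
    None if no cell is detected.

    Hypothetical sketch: the jump threshold and the number of consecutive
    confirming reads are illustrative assumptions.
    """
    hits = 0
    for i, r in enumerate(resistances):
        if r - baseline >= jump_kohm:
            hits += 1
            if hits >= confirm_steps:  # require consecutive elevated reads
                return i - confirm_steps + 1
        else:
            hits = 0  # transient blips reset the counter
    return None
```

Requiring several consecutive elevated readings, rather than reacting to a single one, is one simple way to reject transient resistance blips from debris or tissue movement.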

PATCH CLAMP RENDERING
Figure 1: Cartoon schematic of robotically
guided patch clamp recording electrode
interfacing with a single neuron. (Credit:
Sputnik Animation, Commissioned by
McGovern Institute for Brain Research, MIT).

Our first robotic in vivo patch clamping demonstrations focused on automating 'blind' whole-cell patching, in which no visual information about the location of the pipette or the cell being patched is available. In many experiments, however, it is important to target specific types of neurons for patching, and for that visual guidance is necessary. For instance, the brain has heterogeneous populations of cells, with excitatory neurons far outnumbering inhibitory neurons. Inhibitory neurons are important for a number of neuronal computations occurring in the brain, and disruptions to their normal functioning are implicated in several brain disorders.

stem cells imaging
Figure 2: Hundreds of stem cells robotically injected
with fluorescent dye to track their progression
during brain development (adapted from Shull
et al., EMBO Reports, 2019).

Building upon our prior work, we combined microscopic imaging with computer vision algorithms to robotically guide nanoscale pipettes to single cells of particular interest within the tissue (Suk et al., Neuron, 2017). Microscopy is used to image the pipette and cells, and computer vision algorithms locate the coordinates of pipette tips and the centroids of cells of interest. Once the coordinates are determined, trajectories to guide pipettes to specific target cells in brain slices can be computed. We have now used these approaches to target stem cells in developing tissue for genetic manipulation (Shull et al., EMBO Reports, 2019; Shull et al., JoVE, 2021).
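The targeting step described above can be sketched in a few lines: find the centroid of a segmented cell in the image, then compute waypoints from the detected pipette tip to that target. This is a hypothetical, simplified illustration, assuming a binary 2D segmentation mask and a straight-line approach path; the published pipeline operates in 3D and is considerably more involved.

```python
import numpy as np

def cell_centroid(mask):
    """Centroid (row, col) of the nonzero pixels in a binary segmentation
    mask marking a cell of interest."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def approach_trajectory(tip_xy, target_xy, n_steps=5):
    """Evenly spaced waypoints from the detected pipette-tip position to the
    target centroid, as an (n_steps + 1, 2) array. A straight-line path is an
    illustrative assumption here."""
    t = np.linspace(0.0, 1.0, n_steps + 1)[:, None]
    return (1 - t) * np.asarray(tip_xy, dtype=float) + t * np.asarray(target_xy, dtype=float)
```

For example, `approach_trajectory(cell_tip, cell_centroid(mask))` yields the sequence of positions a motorized micromanipulator would be commanded through, one micro-step at a time.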

In conclusion, we have developed a number of enabling technologies for truly high-throughput single-cell recordings and genetic manipulations in a variety of contexts. We are currently working towards human-out-of-the-loop, fully autonomous robotic cellular interfaces, with applications in neuroscience, genomics, and precision medicine.


References
S. B. Kodandaramaiah, G. Talei Franzesi, B. Y. Chow, E. S. Boyden, C. R. Forest, Automated whole-cell patch clamp electrophysiology of neurons in vivo, Nature Methods, 2012.
S. B. Kodandaramaiah, I. R. Wickersham, G. B. Holst, A. C. Singer, G. Talei-Franzesi, C. R. Forest, E. S. Boyden, Assembly and operation of the autopatcher for automated intracellular neural recording in vivo, Nature Protocols, 2016.
S. B. Kodandaramaiah*, F. J. Flores*, G. L. Holst, A. C. Singer, X. Han, E. N. Brown, E. S. Boyden^, C. R. Forest^, Multi-neuron intracellular recording in vivo via interacting autopatching robots, eLife, 2018.
H. J. Suk, I. van Welie, S. B. Kodandaramaiah, B. D. Allen, C. R. Forest, E. S. Boyden, Closed-loop real-time imaging enables fully automated cell-targeted patch-clamp neural recording in vivo, Neuron, 2017.
G. Shull, C. Haffner, W. B. Huttner, E. Taverna, S. B. Kodandaramaiah, Robotic platform for microinjection into single cells in intact tissue, EMBO Reports, 2019.
G. Shull, C. Haffner, W. B. Huttner, E. Taverna, S. B. Kodandaramaiah, Manipulation of single neural stem cells and neurons in brain slices using robotic microinjection, Journal of Visualized Experiments, 2021.


Return to Top

Hyun Soo Park Leads UMN Participation in Toyota Research Institute Initiative

Hyun Soo Park headshot
Hyun Soo Park, Assistant Professor, Computer Sci. & Eng.

The behavioral state of drivers and passengers in cars (e.g., sleepiness and social interactions) plays a major role in safe driving. Further, future artificial intelligence (AI) aided vehicles (such as self-driving buses) are expected to operate while taking into account the behaviors of passengers. Assistant Professor Hyun Soo Park's lab has been selected by the Toyota Research Institute (TRI) to develop computer vision datasets and a computational model for understanding people's behavioral states. The University of Minnesota is one of 13 additional academic institutions that will participate in the next five-year phase of a collaborative research program. These universities join MIT, Stanford, and the University of Michigan, which have worked with TRI over the last five years to expand the body of research into AI with the goal of amplifying the human experience.

The overarching goal of Prof. Park's project is to enable the measurement, analysis, and prediction of the behaviors of drivers and passengers in cars. To achieve this goal, he proposes to create a data acquisition system for collecting large-scale in-cabin video datasets, using a multiview camera system that captures realistic in-situ human behavior at multiple resolutions, including 1) close-up facial expressions, head pose, and eye gaze, 2) hand and arm movements and gestures, and 3) upper-body movements and interactions. With this dataset, his team will develop robust and generalizable algorithms for the 4D spatiotemporal representation of behaviors.

TRI will be investing more than $75 million in the academic institutions over the next five years.

“Our first five-year program pushed the boundaries of exploratory research across multiple fields, generating 69 patent applications and nearly 650 papers,” said Eric Krotkov, TRI Chief Science Officer who leads the university research program. “Our next five years are about pushing even further and doing so with a broader, more diverse set of stakeholders. To get to the best ideas, collaboration is critical. Our aim is to build a pipeline of new ideas from different perspectives and underrepresented voices that share our vision of using AI for human amplification and societal good.”

The full list of universities that will participate in the next phase of TRI's collaborative research program includes:

  • Carnegie Mellon University
  • Columbia University
  • Florida A&M University-Florida State University College of Engineering
  • Georgia Institute of Technology (Georgia Tech)
  • Indiana University
  • Massachusetts Institute of Technology (MIT)
  • Princeton University
  • Smith College
  • Stanford University
  • Toyota Technological Institute at Chicago (TTIC)
  • University of California, Berkeley
  • University of Illinois
  • University of Michigan
  • University of Minnesota
  • University of Pennsylvania
  • UCLA

Through this program, TRI will lead 35 joint research projects focused on achieving breakthroughs around difficult technological challenges in TRI’s research areas: Automated Driving, Robotics and Machine Assisted Cognition (MAC).

For more information, read the Toyota Research Institute news release.


Return to Top

2021 MnRI Seed Grants Announced

MnRI is pleased to announce seven new seed grants. The objective of the seed grants program is to help lay the foundation for future large projects of transformative impact. Proposals from faculty across the University of Minnesota were received in November and were reviewed by a panel of faculty members who were themselves not competing for funds. The following projects were selected for funding:

  • Autonomous Robotic Detection and Manipulation for Marine Debris Removal, Changhyun Choi (Electrical Engineering) and Junaed Sattar (Computer Science)
  • Muscle-Powered Exoskeleton for Standing and Walking by People with Spinal Cord Injury, Will Durfee (Mechanical Engineering) and Andrew Hanson (Rehabilitation Medicine)
  • Formal Analysis of Robot Swarms Doing Object Gathering Tasks, Maria Gini and Daniel Boley (Computer Science)
  • A Cranial Exoskeleton Assisted Large-Scale Whole-Brain Neural Recording During Complex, Unrestrained Behaviors, Suhasa Kodandaramaiah (Mechanical Engineering) and Timothy Ebner (Neuroscience)
  • Using Computer Vision to Measure Social Distancing Behavior in Public Spaces, Julian Wolfson (Biostatistics), Ingrid Schneider (Forest Resources), and Catherine Zhao (Computer Science)
  • OpenMonkeyStudio++: Fine-Grained Movement Tracking and Analysis in Non-human Primates, Jan Zimmerman (Neuroscience), Benjamin Hayden (Neuroscience), and Hyun Soo Park (Computer Science)
  • Integrated Automated Design for E-Textile Circuits and Systems, Lucy Dunne (College of Design) and John Sartori (Electrical Engineering).

Return to Top

Awards and Major Grants

headshots of six MnRI faculty

Two MnRI faculty members have been selected for the prestigious 2021-2023 McKnight Land Grant Professorships. Julianna Abel (top left) from Mechanical Engineering is the recipient of a McKnight Land Grant Professorship for her research titled “From Filaments to Fabrics – Hierarchical Innovations in Multifunctional Textiles for Enhanced Human Health.” Suhasa Kodandaramaiah (top right) from Mechanical Engineering is the recipient of a McKnight Land Grant Professorship for his research titled “Bionic Skulls for Cortex Wide Neural Activity Mapping.”

Tariq Samad (middle left) from the Technological Leadership Institute was invited to present a plenary lecture at the 10th IEEE International Conference on Intelligent Systems, held in Varna, Bulgaria and online, in August 2020. The title of his lecture was “Control Systems — Concepts and Insights for Managerial Decision Making.”

Ju Sun (middle right) from Computer Science was selected as an invited speaker to the 2021 AAAI New Faculty Highlights Program by the Association for the Advancement of Artificial Intelligence.

Ryan Caverly (bottom left) from Aerospace Engineering and Mechanics has been appointed an Associate Editor for the IEEE Robotics and Automation Letters.

Liuqing Yang (bottom right) from Electrical and Computer Engineering is leading a $913,956 grant from the NSF Cyber-Physical Systems program together with Co-PIs from Colorado State University - Fort Collins titled "CPS: Medium: Collaborative Research: Collective Intelligence for Proactive Autonomous Driving (CI-PAD)."


Return to Top