Learning Models of Granular Materials for Robotic Manipulation
Despite remarkable progress in robotic manipulation over the past few decades, manipulating granular materials such as sand remains an open challenge to today's robots.
Granular materials consist of millions of microscopic particles that interact with each other, leading to a high-dimensional system with hard-to-model interaction behaviour at the particle level.
The heavy use of such materials in, e.g., construction and the food industry makes them an exciting subject for investigating modelling techniques that could be used in the context of robotic manipulation.
Researchers have proposed different paradigms for modelling granular materials.
The Discrete Element Method (DEM) [1] is based on first principles, which makes it very accurate.
However, its computational complexity [2] makes it unsuitable for robotic planning and control problems.
Visual predictive models are another direction [3], where the material's behaviour is learned directly in the space of sensory inputs (images, elevation maps, ...).
While these models are faster, have lower dimensionality and are more intuitive to work with, they fail to generalize to unseen situations as they have no notion of the underlying laws of physics.
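For concreteness, the sketch below fits a one-step linear predictor on flattened elevation maps, in the spirit of [3]. The map resolution, action dimension, random placeholder data, and ridge regularisation weight are illustrative assumptions, not details taken from that paper.

```python
import numpy as np

# Illustrative shapes (assumptions): N transitions of a 32x32 elevation map
# plus a low-dimensional action (e.g. push direction and length).
N, H, W, A = 1000, 32, 32, 3
rng = np.random.default_rng(0)
maps_t  = rng.random((N, H, W))   # elevation map before the action
actions = rng.random((N, A))      # applied manipulation action
maps_t1 = rng.random((N, H, W))   # elevation map after the action

# Stack the current map and action into one feature vector per transition.
X = np.hstack([maps_t.reshape(N, -1), actions])   # (N, H*W + A)
Y = maps_t1.reshape(N, -1)                        # (N, H*W)

# Ridge-regularised least squares: W* = argmin ||XW - Y||^2 + lam ||W||^2
lam = 1e-3
W_star = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# One-step visual foresight: predict the next elevation map for a new state/action.
x_new = np.hstack([maps_t[0].reshape(-1), actions[0]])
pred_map = (x_new @ W_star).reshape(H, W)
print(pred_map.shape)  # (32, 32)
```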
In this work, we want to investigate recent results in differentiable physics that try to model physical phenomena with differentiable architectures, such as neural networks.
Several works have proposed specific network architectures, such as DPI-Nets [4] or Graph Neural Networks [5], to represent the particle nature of granular materials.
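As a rough illustration of this particle-based modelling idea, the following PyTorch sketch performs a single message-passing step over particles within an interaction radius and regresses the next particle positions. The layer sizes, radius, and simple residual position update are assumptions for illustration; they are not the exact architectures of [4] or [5].

```python
import torch
import torch.nn as nn

class ParticleInteractionNet(nn.Module):
    """One message-passing step over nearby particles, then a position update."""

    def __init__(self, hidden=64, radius=0.1):
        super().__init__()
        self.radius = radius
        # Edge MLP: encodes the relative displacement between two particles.
        self.edge_mlp = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))
        # Node MLP: maps aggregated messages to a predicted position change.
        self.node_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                      nn.Linear(hidden, 3))

    def forward(self, pos):                        # pos: (N, 3) particle positions
        diff = pos[:, None, :] - pos[None, :, :]   # (N, N, 3) pairwise displacements
        dist = diff.norm(dim=-1)                   # (N, N) pairwise distances
        adj = ((dist < self.radius) & (dist > 0)).float()  # neighbourhood mask
        messages = self.edge_mlp(diff) * adj[..., None]    # zero out non-neighbours
        agg = messages.sum(dim=1)                  # (N, hidden) aggregated messages
        return pos + self.node_mlp(agg)            # predicted next positions

# Training regresses predicted positions against simulated ground truth (placeholders here).
model = ParticleInteractionNet()
pos_t = torch.rand(256, 3)    # particle positions at time t
pos_t1 = torch.rand(256, 3)   # positions at t+1 from the simulator
loss = nn.functional.mse_loss(model(pos_t), pos_t1)
loss.backward()
```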
This project aims to generate training data from a simulation environment, train a baseline network for pouring [1], [4], [5] and try to extend the approach to contact-rich manipulation actions such as pushing.
If time permits, the approach can be extended to real-world data where ground truth particle positions are unknown.
**Literature**
1. Inferring the material properties of granular media for robotic tasks, Matl et al., 2020
2. Chrono::GPU: An Open-Source Simulation Package for Granular Dynamics Using the Discrete Element Method, Fang et al., 2021
3. The surprising effectiveness of linear models for visual foresight in object pile manipulation, Suh et al., 2020
4. Learning particle dynamics for manipulating rigid bodies, deformable objects, and fluids, Li et al., 2018
5. Manipulation of granular materials by learning particle interactions, Tuomainen et al., 2022
1. Literature review on modelling and estimating/learning granular materials in a robotic context
2. Data generation for pouring in simulation (PBD or WARP-DEM); see the PBD sketch after this list
3. Train a baseline model as in [1], [2], [3] (or Nvidia WARP) and evaluate the accuracy of the prediction
4. Extension to contact-rich actions such as pushing and/or extension to real-world sensor feedback (point clouds, depth images, ...)
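As referenced in step 2, below is a minimal, self-contained sketch of a position-based dynamics (PBD) style update for spherical grains, as one way such pouring trajectories could be generated. The time step, particle radius, and single Jacobi-style constraint projection are illustrative assumptions; in practice the data would more likely come from Nvidia Warp or a DEM package such as Chrono::GPU [2].

```python
import numpy as np

def pbd_step(pos, vel, dt=1e-3, radius=0.01, iters=5):
    """One position-based dynamics step for spherical grains above a ground plane at z = 0."""
    g = np.array([0.0, 0.0, -9.81])
    # 1) Predict positions with symplectic Euler under gravity.
    vel = vel + dt * g
    pred = pos + dt * vel

    # 2) Project non-penetration constraints a few times (Jacobi-style).
    for _ in range(iters):
        diff = pred[:, None, :] - pred[None, :, :]     # pairwise offsets
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        overlap = np.maximum(2 * radius - dist, 0.0)   # particle-particle penetration depth
        np.fill_diagonal(overlap, 0.0)
        # Push each overlapping pair apart along its connecting line (half the overlap each).
        corr = (0.5 * overlap / dist)[..., None] * diff
        pred = pred + corr.sum(axis=1)
        pred[:, 2] = np.maximum(pred[:, 2], radius)    # keep grains above the ground plane

    # 3) Recover velocities from the corrected positions.
    vel = (pred - pos) / dt
    return pred, vel

# Example: drop a small block of grains and record a trajectory as training data.
rng = np.random.default_rng(0)
pos = rng.random((200, 3)) * 0.05 + np.array([0.0, 0.0, 0.1])
vel = np.zeros_like(pos)
trajectory = [pos.copy()]
for _ in range(100):
    pos, vel = pbd_step(pos, vel)
    trajectory.append(pos.copy())
```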
- Highly motivated and independently working student
- Programming experience in C++ and Python is required
- Knowledge of supervised learning and experience in using popular ML libraries (PyTorch, TensorFlow) is required
- Knowledge of system modelling and mechanics would be highly beneficial
- Knowledge of robot motion planning would be beneficial
- Nikhilesh Alatur: nikhilesh.alatur@mavt.ethz.ch
- Dr. Olov Andersson: olov.andersson@mavt.ethz.ch