Wang, Shengyin (ORCID: 0009-0000-8866-3789) (2024) Model Simplification for Deformable Object Manipulation. PhD thesis, University of Leeds.
Abstract
This thesis presents motion planning, control, and perception methods for deformable object manipulation, with a focus on physics-based approaches, in which a physics model or engine is used to predict the effects of robotic motions, such as how a deformable object changes shape when manipulated by a robot gripper. Deformable object manipulation poses three major challenges. First, the configuration space of a deformable object is extremely high-dimensional, which in turn makes the action space large as well. Second, the dynamics of a deformable object are highly complex; a small motion may cause the shape to change drastically, making simulation computationally expensive. Third, compared to rigid objects, deformable objects suffer not only environmental occlusion but also severe self-occlusion. These challenges make motion planning time-consuming and pose significant difficulties for both control and perception.

To alleviate the computational burden of motion planning, I first propose reducing the planner's action space based on geometric model simplification at the goal. Two implementations are developed: piece-wise line fitting for 1-D linear objects and mesh simplification for 2-D surface objects. Simulation experiments validate that, with the goal-conditioned action space, the planner finds more effective trajectories more efficiently.

Next, I propose simplifying the dynamics model to enable faster trajectory rollouts during motion planning, through both goal-informed model simplification and uninformed universal simplification. Experiments confirm a significant reduction in planning time, albeit with some sacrifice in optimality. Additionally, I introduce an iterative model simplification and motion planning framework, which progressively refines the simplified dynamics model during planning. Experiments verify the effectiveness of the framework, demonstrating improved trajectory quality compared to planning with a single simplified model, while remaining more efficient than planning with the original high-fidelity, time-consuming dynamics models.

To robustly execute the planned trajectory in the real world, I improve the Constrained Deformable Coherent Point Drift (CDCPD) method for deformable object tracking. The improvement explicitly accounts for occlusion, generates a thicker point cloud by combining the current and previous point clouds, and partially updates the positions of the tracking points. Experiments demonstrate that the improved tracking method successfully differentiates between different layers and achieves much more precise tracking than the original method.

Building on the methods above, a closed-loop experimental system is established in the real world, comprising a perception subsystem, a motion planning subsystem, and a robot control subsystem. A combination of position-based control and impedance control is implemented and shown to be effective for robotic manipulation of deformable objects in real-world cloth folding tasks.
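The following is a minimal, hypothetical sketch (not code from the thesis) of the kind of goal-side geometric simplification described above for a 1-D linear object: a split-and-fit routine in the style of Ramer-Douglas-Peucker reduces a densely sampled goal curve to a few line segments, whose endpoints could then serve as a small, goal-conditioned set of candidate manipulation targets. The function names and tolerance values are illustrative assumptions, not the thesis's implementation.

```python
# Illustrative sketch only: piece-wise line fitting of a 1-D goal curve,
# assumed as one possible realisation of goal-side model simplification.
import numpy as np

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    ab = b - a
    if np.allclose(ab, 0):
        return np.linalg.norm(p - a)
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def piecewise_line_fit(points, tol=0.01):
    """Recursive split-and-fit: keep only the vertices needed so that
    every goal point lies within `tol` of the fitted polyline."""
    points = np.asarray(points, dtype=float)
    dists = [point_segment_distance(p, points[0], points[-1]) for p in points[1:-1]]
    if not dists or max(dists) <= tol:
        return [points[0], points[-1]]
    k = int(np.argmax(dists)) + 1          # split at the worst-fit point
    left = piecewise_line_fit(points[:k + 1], tol)
    right = piecewise_line_fit(points[k:], tol)
    return left[:-1] + right               # avoid duplicating the split point

# Example: a rope goal shape sampled at 200 points collapses to a few
# keypoints, which define a much smaller action space for the planner.
s = np.linspace(0, 1, 200)
goal_curve = np.stack([s, np.abs(s - 0.5)], axis=1)   # a V-shaped goal
keypoints = piecewise_line_fit(goal_curve, tol=0.005)
print(len(keypoints), "keypoints instead of", len(goal_curve), "goal points")
```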
Metadata
Supervisors: Dogar, Mehmet and Leonetti, Matteo
Keywords: Deformable Object Manipulation; Motion Planning; Model Simplification
Awarding institution: University of Leeds
Academic Units: The University of Leeds > Faculty of Engineering (Leeds) > School of Computing (Leeds)
Depositing User: Mr Shengyin Wang
Date Deposited: 20 May 2025 14:45
Last Modified: 20 May 2025 14:45
Open Archives Initiative ID (OAI ID): oai:etheses.whiterose.ac.uk:36769
Download
Final eThesis - complete (pdf)
Filename: Shengyin_s_Thesis.pdf
Licence: This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.