He, Feixiang (2024) Advancing High-Fidelity Crowd Simulation: From Behavior to Environment Layout. PhD thesis, University of Leeds.
Abstract
Crowd simulation is a crucial research topic with diverse applications in fields such as computer animation, architectural design, urban planning and crowd management. However, simulating high-fidelity crowds has been a long-standing problem. The primary challenges stem from accurately modeling realistic crowd behaviors and creating convincing environment layouts. Therefore, this thesis presents innovative learnable approaches that span from behavior modeling to environment design, aiming to enhance the fidelity, efficiency, and interpretability of crowd simulations.
We first focus on automated simulation guidance with real-world data, i.e., trajectories/tracklets, which works with existing simulators including steering and global planning methods. To achieve this, we propose a comprehensive framework that builds a highly automated pipeline from raw data through crowd analysis to simulation guidance. Our framework begins with a novel unsupervised analysis approach that structures raw and noisy data, containing highly mixed multi-dimensional information such as space, time and dynamics, into a series of recurring activity patterns. These patterns are manifested as space flows with temporal and dynamics profiles, and subsequently serve as a foundation for automatically guiding crowd simulations. For each space flow, a linear dynamic system is used to capture its intrinsic motion randomness. After learning, we sample diversified trajectories that obey the flow's dynamics. Since each flow comes with a temporal and speed profile, we also sample the entry time and desired speed for each trajectory. The sampled trajectories can be directly used as guidance to configure crowd simulations, thereby effectively reducing the manual effort required for simulation configuration and producing more realistic crowds. Furthermore, based on the analysis, our framework offers a new visualization tool for complex crowd data and a set of new metrics for comparing simulated and real data. Extensive experiments and evaluations demonstrate the flexibility, versatility and intuitiveness of our framework.
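As a rough illustration of the sampling step, the sketch below rolls out one trajectory from a linear dynamic system attached to a single space flow and draws an entry time and desired speed from assumed temporal and speed profiles. The matrices, distributions and parameter values are placeholders for illustration, not those learned by the framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_flow_trajectory(A, b, Q, x0_mean, x0_cov, n_steps=100):
    """Roll out one trajectory from a linear dynamic system
    x_{t+1} = A x_t + b + w_t,  w_t ~ N(0, Q),
    starting from a Gaussian initial state."""
    x = rng.multivariate_normal(x0_mean, x0_cov)
    traj = [x]
    for _ in range(n_steps - 1):
        x = A @ x + b + rng.multivariate_normal(np.zeros(len(b)), Q)
        traj.append(x)
    return np.stack(traj)

# Placeholder 2-D flow: gentle drift to the right with small process noise.
A = np.eye(2)
b = np.array([0.1, 0.0])             # mean displacement per step (metres)
Q = 0.01 * np.eye(2)                 # captures the flow's motion randomness

trajectory    = sample_flow_trajectory(A, b, Q, np.zeros(2), 0.05 * np.eye(2))
entry_time    = rng.exponential(scale=2.0)   # stand-in for the flow's temporal profile
desired_speed = rng.normal(1.3, 0.2)         # stand-in for the flow's speed profile
```

Each sampled trajectory, paired with its entry time and desired speed, can then be handed to a simulator as one configured agent.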
We further explore crowd simulation in extremely high-density scenarios, where acquiring accurate trajectories is difficult or even impossible; in such scenarios, the simulation guidance method above is no longer applicable. Previous work on this problem can be coarsely categorized into empirical models and data-driven methods. The former provide good interpretability but can only simulate general behaviors and cannot be adapted to specific crowds. In contrast, the latter can learn the dynamics of specific crowds but lack interpretability and require labeling or accurate estimation of individual motions. In this work, we combine the strengths of both and propose a novel learnable simulator that is high-fidelity, interpretable, and less data-demanding. Our approach treats high-density crowds as a continuum and first develops a new material model, called "crowd material", to capture the specific characteristics observed in real crowds. Building on crowd materials, we propose Crowd MPM, a new differentiable material point method that can be used as a layer in a deep neural network. Finally, we combine Crowd MPM with deep neural networks to form a new approach for high-density crowd simulation. Through extensive experiments and evaluations, we show that our model provides robust performance in learning crowd dynamics and explaining crowd motion, outperforming other solutions.
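To make the idea of a differentiable continuum layer concrete, the following is a minimal sketch of a particles-to-grid-to-particles update in PyTorch with one learnable material-like parameter. It is a toy stand-in for, and greatly simplifies, the thesis's crowd material and Crowd MPM; in particular, a nearest-cell scatter replaces the smooth interpolation kernels a real MPM would use.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrowdContinuumStep(nn.Module):
    """One differentiable particles-to-grid-to-particles update, loosely
    MPM-flavoured. The learnable `stiffness` stands in for a crowd-material
    parameter; this is a simplified sketch, not the thesis's Crowd MPM."""

    def __init__(self, grid_size=32, cell=1.0, dt=0.1):
        super().__init__()
        self.grid_size, self.cell, self.dt = grid_size, cell, dt
        self.stiffness = nn.Parameter(torch.tensor(0.5))  # learnable material-like parameter

    def forward(self, pos, vel):
        # pos, vel: (N, 2) pedestrian positions and velocities.
        n = self.grid_size
        idx = (pos / self.cell).long().clamp(0, n - 1)     # nearest-cell P2G (a real MPM
        flat = idx[:, 0] * n + idx[:, 1]                   # would use smooth B-spline weights)

        ones = torch.ones(pos.shape[0], device=pos.device)
        density = torch.zeros(n * n, device=pos.device).index_add(0, flat, ones).view(n, n)

        # Central-difference density gradient on the padded grid.
        padded = F.pad(density[None, None], (1, 1, 1, 1))[0, 0]
        d0 = (padded[2:, 1:-1] - padded[:-2, 1:-1]) / (2 * self.cell)
        d1 = (padded[1:-1, 2:] - padded[1:-1, :-2]) / (2 * self.cell)

        # G2P: gather the gradient at each particle's cell; a pressure-like
        # force pushes people from dense towards sparse regions.
        grad = torch.stack([d0[idx[:, 0], idx[:, 1]], d1[idx[:, 0], idx[:, 1]]], dim=-1)
        force = -self.stiffness * grad

        vel = vel + self.dt * force
        pos = pos + self.dt * vel
        return pos, vel

# Gradients reach the material parameter, so the step can sit inside a larger
# network and be trained end-to-end from observed crowd footage.
step = CrowdContinuumStep()
pos, vel = step(torch.rand(200, 2) * 32, torch.zeros(200, 2))
vel.pow(2).mean().backward()
print(step.stiffness.grad)
```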
Lastly, our attention turns to the generation of convincing environment layouts when adapting a simulator to diverse scenarios. In this context, environment layouts play a crucial role in achieving realistic simulations. Previous approaches emphasized automatic layout generation to relieve designers from laborious, iterative design processes and enhance productivity, but they overlooked the significance of designer involvement in the design procedure. To address this limitation, we propose a new human-in-the-loop generative model, iPLAN, which not only generates layouts automatically but also interacts with designers throughout the whole procedure, enabling humans and AI to co-evolve a rough idea gradually into the final design. We evaluate iPLAN on diverse datasets and compare it with existing methods. The results demonstrate that iPLAN has high fidelity in producing layouts similar to those from human designers, great flexibility in accepting designer inputs and providing design suggestions accordingly, and strong generalizability when facing unseen design tasks and limited training data.
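As a schematic of the human-in-the-loop procedure only (not iPLAN's actual interface), the skeleton below alternates between model proposals and designer decisions so the partial layout is co-evolved step by step. The model, designer and layout representation are hypothetical toy stand-ins.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Decision:
    element: str
    accepted_final: bool = False

@dataclass
class PartialLayout:
    rooms: list = field(default_factory=list)   # layout elements placed so far

class ToyModel:
    """Stand-in for the generative model: proposes the next room to place."""
    def propose(self, boundary, layout):
        return random.choice(["bedroom", "kitchen", "bathroom", "living room"])

class ToyDesigner:
    """Stand-in for the human designer: may accept, override, or stop."""
    def review(self, proposal, layout):
        if len(layout.rooms) >= 4:
            return Decision(element=proposal, accepted_final=True)
        override = "balcony" if proposal in layout.rooms else proposal
        return Decision(element=override)

def co_design(model, designer, boundary, max_steps=20):
    """Alternate model proposals with designer decisions so the layout is
    co-evolved step by step (schematic only)."""
    layout = PartialLayout()
    for _ in range(max_steps):
        proposal = model.propose(boundary, layout)
        decision = designer.review(proposal, layout)
        if decision.accepted_final:
            break
        layout.rooms.append(decision.element)
    return layout

print(co_design(ToyModel(), ToyDesigner(), boundary="rectangular plot").rooms)
```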
Metadata
| Field | Value |
|---|---|
| Supervisors | Wang, He and Hogg, David |
| Keywords | Crowd analysis; Crowd simulation; Simulation evaluation; Layout generation |
| Awarding institution | University of Leeds |
| Academic Units | The University of Leeds > Faculty of Engineering (Leeds) > School of Computing (Leeds) |
| Depositing User | Mr Feixiang He |
| Date Deposited | 26 Feb 2024 14:56 |
| Last Modified | 26 Feb 2024 14:56 |
| Open Archives Initiative ID (OAI ID) | oai:etheses.whiterose.ac.uk:34306 |
Download
Final eThesis - complete (pdf)
Embargoed until: 1 March 2029
Filename: thesis_feixiang.pdf