Zotov, Evgeny ORCID: https://orcid.org/0000-0002-5135-1543 (2022) StyleGAN-based machining digital twin for smart manufacturing. PhD thesis, University of Sheffield.
Abstract
Manufacturing enterprises are challenged to remain competitive by the increasing demand for greater product variability and quality, the intensifying complexity of production processes, and a drive for sustainable manufacturing, alongside growing regulatory pressure that results in high labour and energy costs. In the discussion consolidated around Industry 4.0, efficient and effective solutions to these challenges lie outside the mainstream production methods. One driver of the transition towards this novel manufacturing paradigm is the technological modernisation of production processes, motivated by the increasing availability of computational capacity.
Manufacturing digitalisation is a critical part of the transition towards Industry 4.0. The digital twin plays a significant role as an instrument that enables digital access to precise real-time information about physical objects and supports the optimisation of related processes by converting the big data associated with them into actionable information. A number of frameworks and conceptual models have been proposed in the research literature that address the requirements and benefits of digital twins, yet their applications are explored to a lesser extent.
The work presented in this thesis aims to address the novel data-analysis challenges that arise in the presence of heterogeneous and dynamic cyber-physical systems in Industry 4.0. A time-domain machining vibration model based on a generative adversarial network (GAN) is proposed as a digital twin component. The developed conditional StyleGAN architecture enables (1) the extraction of knowledge from existing models and (2) a data-driven simulation applicable to production process optimisation. A novel solution to the challenges in GAN analysis is then developed, in which the comparison of maps of generative accuracy and sensitivity reveals patterns of similarity between these metrics.
The proposed simulation model is further extended to reuse the knowledge extracted from a source model and adapt it to a given target environment, enabling the elicitation of information from both physics-based and data-driven solutions. This approach is implemented as a novel domain adaptation algorithm based on the GAN model: CycleStyleGAN. The architecture is validated in an experimental scenario that aims to replicate a real-world manufacturing knowledge transfer problem. The experiment shows that the transferred information enables the reduction of the required target domain data by one order of magnitude.
The thesis thus builds a strong case for developing a StyleGAN-based digital twin to support the practical implementation of the technologies paving the road towards the target state of Industry 4.0.
Metadata
Supervisors: Kadirkamanathan, Visakan
Keywords: knowledge transfer, transfer learning, domain adaptation, incremental learning, artificial intelligence, deep learning, machine learning, artificial neural network, generative adversarial network, StyleGAN, industry 4.0, smart manufacturing, digital twin
Awarding institution: University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Automatic Control and Systems Engineering (Sheffield)
Identification Number/EthosID: uk.bl.ethos.855745
Depositing User: Evgeny Zotov
Date Deposited: 23 May 2022 08:44
Last Modified: 01 Jul 2022 09:54
Open Archives Initiative ID (OAI ID): oai:etheses.whiterose.ac.uk:30778
Download
Final eThesis - complete (pdf)
Filename: MyThesis.pdf
Licence:
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 2.5 License