Exploring Efficient Methods for Transformer-based Foundation Language Models

Xue, Huiyin ORCID: https://orcid.org/0000-0002-8705-6431 (2025) Exploring Efficient Methods for Transformer-based Foundation Language Models. PhD thesis, University of Sheffield.

Abstract

Metadata

Supervisors: Aletras, Nikolaos
Keywords: natural language processing, pretraining, language modeling, efficient methods, model design, micro design, embeddings, attention mechanism
Awarding institution: University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Computer Science (Sheffield)
Date Deposited: 27 Jan 2026 11:48
Last Modified: 27 Jan 2026 11:48

Download

Final eThesis - complete (pdf)
