Advanced Machine Learning with Bilevel Optimization. There is an urgent need to develop a new machine learning (ML) paradigm that can overcome data-privacy and model-size constraints in real-world applications. This project aims to develop an advanced paradigm of ML with bilevel optimisation, called bilevel ML. To achieve this aim in complex situations, the project will develop a theoretically guaranteed fast approximate solver and a new fuzzy bilevel learning framework, along with a methodology for transferring knowledge and an approach for fast-adapting bilevel optimisation solutions when the required computing resources change. The anticipated outcomes should significantly improve the reliability of ML, with benefits for safe learning and computing-resource optimisation in ML-based data analytics.
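A familiar instance of bilevel optimisation in ML is hyperparameter tuning: an inner problem fits model parameters for a given hyperparameter, and an outer problem chooses the hyperparameter to minimise validation loss. The sketch below is a hypothetical illustration of that structure (not the project's actual solver), using ridge regression so the inner problem has a closed-form solution:

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X_tr, X_val = rng.normal(size=(40, 5)), rng.normal(size=(20, 5))
w_true = rng.normal(size=5)
y_tr = X_tr @ w_true + 0.1 * rng.normal(size=40)
y_val = X_val @ w_true + 0.1 * rng.normal(size=20)

def inner_solution(lam):
    # Inner level: argmin_w ||X_tr w - y_tr||^2 + lam ||w||^2 (closed form).
    d = X_tr.shape[1]
    return np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), X_tr.T @ y_tr)

def outer_loss(lam):
    # Outer level: validation loss evaluated at the inner solution w*(lam).
    w = inner_solution(lam)
    return np.mean((X_val @ w - y_val) ** 2)

# Outer problem solved here by a simple grid search over lam; the project's
# "fast approximate solver" would replace this brute-force step.
lams = np.logspace(-4, 2, 25)
best_lam = min(lams, key=outer_loss)
print(f"best lam = {best_lam:.4g}, val loss = {outer_loss(best_lam):.4g}")
```

The nesting is the key point: every outer evaluation requires solving the inner problem to optimality, which is why fast approximate solvers matter at scale.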
Small Scalable Natural Language Models using Explicit Memory. Deep neural networks have had spectacular success in natural language processing, seeing widespread deployment as part of automatic assistant devices in homes and cars, and across many valuable industries including finance, medicine and law. Fueling this success is the use of ever larger models, with exponentially increasing training resources and accompanying hardware and energy demands. This project aims to develop more compact models, based on the incorporation of an explicit searchable memory, which will dramatically reduce model size, hardware requirements and energy usage. This will make modern natural language processing more accessible, while also providing greater flexibility, allowing for more adaptable and portable technologies.
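To make "explicit searchable memory" concrete, here is a minimal hypothetical sketch in the spirit of retrieval-based language models (not the project's actual design): knowledge is stored as (key vector, value) pairs in an external table and retrieved by nearest-neighbour search, rather than being encoded implicitly in ever-larger weight matrices:

```python
import numpy as np

class ExplicitMemory:
    """Toy key-value memory with cosine-similarity retrieval (illustrative)."""

    def __init__(self, dim):
        self.keys = np.empty((0, dim))
        self.values = []

    def write(self, key, value):
        # Append a (key vector, value) pair to the memory table.
        self.keys = np.vstack([self.keys, key])
        self.values.append(value)

    def read(self, query, k=1):
        # Retrieve the values whose keys are most similar to the query.
        keys = self.keys / np.linalg.norm(self.keys, axis=1, keepdims=True)
        q = query / np.linalg.norm(query)
        idx = np.argsort(keys @ q)[::-1][:k]
        return [self.values[i] for i in idx]

# Usage: a small model could consult this memory at prediction time.
mem = ExplicitMemory(dim=3)
mem.write(np.array([1.0, 0.0, 0.0]), "cat")
mem.write(np.array([0.0, 1.0, 0.0]), "dog")
print(mem.read(np.array([0.9, 0.1, 0.0])))  # nearest stored key -> ['cat']
```

Because the memory can grow or shrink independently of the network's parameters, the model itself can stay small, which is the source of the hardware and energy savings the abstract describes.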