Discovery Early Career Researcher Award - Grant ID: DE240100165
Funder
Australian Research Council
Funding Amount
$443,847.00
Summary
Evolving privacy and utility in data storage and publishing. This project aims to develop a distributed evolutionary computation-based framework to optimize data privacy and utility in distributed database systems. It intends to synchronously solve the conflicting challenges of privacy preservation and utility maintenance in multi-objective, dynamic, and multitasking scenarios. Expected outcomes include a new computation framework as a service and freely available distributed computation models, evolutionary algorithms, and knowledge-transfer strategies. Anticipated benefits include theoretical contributions to artificial intelligence, cyber security, distributed computation, and a service to eliminate data owners’ privacy concerns while guaranteeing the value of data in further utilization.
Small Scalable Natural Language Models using Explicit Memory. Deep neural networks have had spectacular success in natural language processing, seeing widespread deployment as part of automatic assistant devices in homes and cars, and across many valuable industries including finance, medicine and law. Fueling this success is the use of ever larger models, with exponentially increasing training resources and accompanying hardware and energy demands. This project aims to develop more compact models, based on the incorporation of an explicit searchable memory, which will dramatically reduce model size, hardware requirements and energy usage. This will make modern natural language processing more accessible, while also providing greater flexibility, allowing for more adaptable and portable technologies.