Discovery Early Career Researcher Award - Grant ID: DE240101190
Funder
Australian Research Council
Funding Amount
$451,000.00
Summary
Innovating and Validating Scalable Monte Carlo Methods. This project aims to develop innovative scalable Monte Carlo methods for statistical analysis in the presence of big data or complex mathematical models. Existing approaches to scalable Monte Carlo are only approximate, and their inaccuracies are difficult to quantify. This can have a detrimental impact on data-based decision making. The expected outcomes of this project are scalable Monte Carlo methods that are more accurate, fast and capable of quantifying inaccuracies. Scientists and decision-makers will benefit from the ability to obtain timely, reliable insights for challenging applications.
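To make the tension between speed and accuracy concrete, here is a minimal sketch of one common existing approach, subsampling Metropolis-Hastings, in which each acceptance test sees only a random minibatch of the data. The model, batch size, and all other settings below are illustrative assumptions, not the project's methods; the point is that the rescaled minibatch acceptance ratio makes the chain target the posterior only approximately.

```python
# A minimal sketch, not the project's method: subsampling Metropolis-Hastings
# for the mean of a Gaussian model (flat prior assumed), illustrating why
# naive scalable MCMC is only approximate. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=100_000)  # synthetic "big" data

def log_lik(theta, x):
    # Gaussian log-likelihood with known unit variance, up to a constant
    return -0.5 * np.sum((x - theta) ** 2)

def subsampled_mh(n_iter=2_000, batch=1_000, step=0.05):
    """Metropolis-Hastings where each acceptance test uses a random
    subsample of the data, rescaled to stand in for the full
    log-likelihood; the chain targets the posterior only approximately."""
    n = len(data)
    theta = 0.0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()     # random-walk proposal
        idx = rng.integers(0, n, size=batch)   # random minibatch
        scale = n / batch                      # rescale to full-data size
        log_alpha = scale * (log_lik(prop, data[idx]) - log_lik(theta, data[idx]))
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        samples[i] = theta
    return samples

chain = subsampled_mh()
print("approximate posterior mean:", chain[500:].mean())  # near 2.0, but biased
```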
Stochastic majorization-minimization algorithms for data science. The changing nature of data acquisition and storage has made the process of drawing inference infeasible with traditional statistical and machine learning methods. Modern data are often acquired in real time and incrementally, and are often available in too large a volume to process on conventional machinery. The project proposes to study the family of stochastic majorisation-minimisation algorithms for the computation of inferential quantities in an incremental manner. The proposed stochastic algorithms encompass and extend a wide variety of current algorithmic frameworks for fitting statistical and machine learning models, and can be used to produce feasible and practical algorithms for complex models, both current and future.
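As a concrete illustration of the majorisation-minimisation principle underlying these algorithms (a sketch under simplifying assumptions, not the project's stochastic schemes), the snippet below computes a median by repeatedly minimising a quadratic surrogate that majorises the absolute loss; a stochastic variant would refresh only a minibatch of the surrogate terms at each step.

```python
# A minimal sketch of the MM principle: the 1-D median via surrogate
# minimisation (an IRLS/Weiszfeld-style update). Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
x = 3.0 + rng.standard_cauchy(size=10_000)   # heavy-tailed data, median near 3

def mm_median(x, n_iter=50, eps=1e-8):
    """Each term |x_i - theta| is majorised at the current iterate theta_k by
    the quadratic (x_i - theta)**2 / (2|x_i - theta_k|) + |x_i - theta_k| / 2;
    minimising the summed surrogate gives a weighted-mean (IRLS) update."""
    theta = 0.0                              # arbitrary starting point
    for _ in range(n_iter):
        w = 1.0 / (np.abs(x - theta) + eps)  # surrogate curvatures
        theta = np.sum(w * x) / np.sum(w)    # exact minimiser of the surrogate
        # a stochastic MM scheme would refresh only a minibatch of w here
    return theta

print("MM estimate:", mm_median(x), " numpy median:", np.median(x))
```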
Technology-Driven and Scalable Regression Methodology, Computing and Theory. Regression is a mainstay of data analysis, statistics, machine learning and data science, but it is in continual need of enhancement in the face of technological change. Scalability and flexibility in handling non-linear signals are fundamental to the practical utility of new regression methodology. Several streams of research are proposed, aimed at confronting data from specific technologies as well as generic types of data. The project is to be networked with researchers in the United States of America and aims to have Australia-based researchers provide leadership in methodological, theoretical, computational and software development.
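As a small illustration of the kind of flexible, non-linear regression at issue (a sketch under assumed basis, knot, and penalty choices, not the project's methodology), the snippet below fits a penalised spline by ridge-penalised least squares; the linear-algebra core of such fits is also a natural target for scalability work.

```python
# A minimal penalised-spline sketch; the basis, knots and penalty value
# are illustrative assumptions, not the project's methodology.
import numpy as np

rng = np.random.default_rng(2)
n = 2_000
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=n)   # non-linear signal + noise

# Truncated-line spline basis: [1, x, (x - k)_+ for interior knots k]
knots = np.linspace(0.0, 1.0, 22)[1:-1]
Z = np.column_stack([np.ones(n), x] + [np.maximum(x - k, 0.0) for k in knots])

lam = 1.0                       # ridge penalty strength (assumed value)
P = np.eye(Z.shape[1])
P[:2, :2] = 0.0                 # leave intercept and slope unpenalised
beta = np.linalg.solve(Z.T @ Z + lam * P, Z.T @ y)

grid = np.linspace(0.0, 1.0, 5)
G = np.column_stack([np.ones(grid.size), grid]
                    + [np.maximum(grid - k, 0.0) for k in knots])
print(np.round(G @ beta, 2))    # should track sin(2*pi*grid): ~[0, 1, 0, -1, 0]
```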
Inference for Hawkes processes with challenging data. Hawkes processes are statistical models for the analysis of high-impact event sequences, such as bushfires, earthquakes, infectious diseases, and cyber attacks. When the times and/or marks are missing for some events, or when the data are otherwise incomplete, it is challenging to fit these models and to perform diagnostic checks on the fitted models. This project aims to develop novel statistical methods to fit these models in the presence of incomplete data and to check the goodness-of-fit of the fitted models. The expected outcomes include publications documenting these methods and software packages implementing them. The primary benefits include the advancement of statistical methodology and the training of junior research personnel.
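For orientation, the sketch below simulates a Hawkes process with an exponential excitation kernel via Ogata's thinning algorithm; the kernel form and parameter values are illustrative assumptions, and the project's concern, inference from incomplete data, goes well beyond such complete-data simulation.

```python
# A minimal Hawkes-process sketch: exponential kernel, simulated with
# Ogata's thinning. Parameter values are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(3)
mu, alpha, beta = 0.5, 0.8, 1.5   # baseline rate, branching ratio (<1), decay

def intensity(t, events):
    # conditional intensity: baseline plus exponentially decaying
    # excitation contributed by every past event
    past = events[events < t]
    return mu + alpha * beta * np.exp(-beta * (t - past)).sum()

def simulate_hawkes(horizon=200.0):
    """Ogata's thinning: propose the next candidate from an upper bound on
    the intensity (which only decays between events), then accept it with
    probability intensity / bound."""
    events = []
    t = 0.0
    while True:
        ev = np.asarray(events)
        lam_bar = mu + alpha * beta * np.exp(-beta * (t - ev)).sum()  # bound at t
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            return np.asarray(events)
        if rng.uniform() * lam_bar <= intensity(t, ev):
            events.append(t)      # accepted: the process excites itself

times = simulate_hawkes()
print(f"{times.size} events; long-run rate = mu / (1 - alpha) = {mu / (1 - alpha):.2f}")
```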