Scalable and Robust Bayesian Inference for Implicit Statistical Models. This project aims to develop the next generation of efficient methods for fitting complex simulation-based statistical models to data. Practitioners and scientists are interested in such implicit models to enable discoveries, produce accurate predictions and inform decisions under uncertainty. However, the associated computational cost has restricted researchers to implicit models that must have a small number of parameters and be well specified, impeding scientific progress. This project will develop new computational methods and algorithms for implicit models that scale to high dimensions and are robust to misspecification. Benefits will arise from the more routine use of implicit models in epidemiology, biology, ecology and other fields.
Advances in Sequential Monte Carlo Methods for Complex Bayesian Models. This project aims to develop efficient statistical algorithms for parameter estimation of complex stochastic models that currently cannot be handled. Parameter estimation is an essential component of mathematical modelling for answering scientific questions and revealing new insights. Current parameter estimation methods can be inefficient and require too much user intervention. This project will develop novel Bayesian algorithms that are optimally automated and efficient by exploiting ever-improving parallel computing devices. The new methods will allow practitioners to process realistic models, enabling new scientific discoveries in a wide range of disciplines such as biology, ecology, agriculture, hydrology and finance.
Statistical methods for quantifying variation in spatiotemporal areal data. This project aims to develop new statistical methods for extracting insights into spatial and temporal variation in areal data. These tools will extend the Australian Cancer Atlas, which provides small-area estimates for 20 cancers across Australia. The project is significant because it will allow government and other organisations to reap dividends from investment in collecting spatial information, and it will enable modelled small-area estimates to be released without compromising confidentiality. The expected outcomes include new statistical knowledge and new insights into cancer. The results will benefit the many disciplines, managers and policy makers that make decisions based on geographic data mapped over space and time.
Principled statistical methods for high-dimensional correlation networks. This project aims to develop a novel and principled approach for building correlation networks. Correlation networks aim to identify the most significant associations present in modern massive datasets, and have numerous applications, ranging from the biomedical and environmental sciences to the social sciences. Nodes of such networks represent features, and edges represent associations, or the lack thereof. Current methods are not readily scalable to modern ultra-high dimensional settings, and do not account for uncertainty in the estimated associations. This project will develop a principled, highly scalable methodology for building such networks, which incorporates uncertainty quantification. Emphasis is placed on modern ultra-high dimensional settings in which differentiating a true correlation from a spurious one is a notoriously difficult task.
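As an illustration only, and not the methodology proposed by this project, the kind of correlation network described above can be sketched in a few lines: nodes are features, and an edge joins any pair of features whose absolute sample correlation exceeds a threshold. The function name and threshold below are assumptions for the example; the project itself targets ultra-high-dimensional settings and uncertainty quantification, both of which this naive sketch deliberately omits.

```python
import numpy as np

def correlation_network(X, threshold=0.5):
    """Build a naive correlation network from a data matrix.

    X is an (n_samples, n_features) array. Nodes are the
    feature indices; an edge joins features i and j when
    |corr(X[:, i], X[:, j])| exceeds `threshold`.
    Returns the edge list as sorted (i, j) index pairs.
    """
    corr = np.corrcoef(X, rowvar=False)  # feature-by-feature correlation matrix
    p = corr.shape[0]
    return [(i, j) for i in range(p) for j in range(i + 1, p)
            if abs(corr[i, j]) > threshold]

# Toy example: three features, two of which are strongly related.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([x, x + 0.1 * rng.normal(size=200), rng.normal(size=200)])
print(correlation_network(X))
```

In high dimensions, this thresholding approach is exactly where spurious correlations become a problem: with many more features than samples, some feature pairs will exceed any fixed threshold by chance, which is the difficulty the project's principled methodology is designed to address.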
Discovery Early Career Researcher Award - Grant ID: DE180101252
Funder
Australian Research Council
Funding Amount
$343,450.00
Summary
Statistical theory and algorithms for joint inference of complex networks. This project aims to address the challenges in jointly modelling complex networks by applying an integrated approach encompassing statistical theory, computation, and applications. The project expects to contribute to core statistical methodology development for complex inference and generate new knowledge in the fields of genomics, neuroscience, and social science through in-depth analyses of large-scale multilayered network data. Expected outcomes include enhanced theoretical and computational frameworks for probabilistic network models to better utilise the power of multiple observations. This should foster international and interdisciplinary collaborations and add significant value to the rapidly progressing field of networks research.
Fast flexible feature selection for high dimensional challenging data. The project aims to provide new frameworks for fast flexible feature selection and appropriate modelling of heterogeneous data through structural varying-coefficient regression models. The outcomes will be a series of new statistical methods and concepts enabling more powerful modelling of complex bioscience data. The project will create the science for building reliable statistical models that take model uncertainty into account, impacting how results are interpreted, with accompanying software. This will be a significant improvement in the assessment of model confidence in the food and health research priority areas, including areas such as meat science, Huntington’s disease, and kidney transplantation.
Fast approximate inference methods: new algorithms, applications and theory. This project aims to develop new algorithms and theory for fast approximate inference and lay down infrastructure to aid future extensions. Fast approximate inference methods are a principled and extensible means of fitting large and complex statistical models to big data sets. They come into their own in applications where speed is paramount and traditional approaches are not feasible. The project aims to lead to practical outcomes ranging from better business decision-making for insurance data warehouses to improved medical imaging technology.
New methods for modelling real-world extremes. This project aims to develop new theory and methods for analysing and predicting extreme values observed in real-world processes. Many existing techniques are limited by convenient mathematical assumptions that commonly do not hold in practice: dependence at asymptotic levels, process stationarity, and that the observed data are direct measurements of the process of interest. As a result, using these techniques may produce undesirable results. Expected outcomes of this project include theoretically justified data analysis techniques that can accurately model extreme values seen in the real world. Project benefits include more realistic analyses of nationally important applications in climate, bushfire insurance risk, and anomaly detection.
Statistical Modelling in the Era of Data Science: Theory and Practice. This project aims to develop innovative statistical methodology that is interpretable, theoretically justified, and scalable to today's growing complex data. With the influx of data being collected in both the public and private sectors, making sense of this data is a fundamental task. Through a rigorous modelling framework, this project intends to facilitate the discovery of knowledge by developing powerful new tools to extract insight from these complex datasets. The outcomes of this project will benefit society by providing techniques to enable research advances and inform decision-making for a broad base of disciplines, including applications to network security, energy forecasting, environmental monitoring, and public health.
Generalised Degrees of Freedom and Probabilistic Regularisation. This project intends to develop novel statistical tools for more accurate prediction by taking account of model complexity and uncertainties associated with the fitting procedure. The project also plans to develop a novel shrinkage approach via new penalty functions to avoid over-fitting, and to establish its asymptotic properties. The key applications may include genetic studies where the number of predictors is large and biological experiments where multivariate and temporal data are often collected – for example, more economical breeding in animal and fish farming and more effective detection of genes of interest in genetic studies on humans, animals and plants.