Large Markov decision processes and combinatorial optimisation. Markov decision processes continue to gain in popularity for modelling a wide range of applications, from the analysis of supply chains and queueing networks to cognitive science and the control of autonomous vehicles. Nonetheless, they tend to become numerically intractable as the model grows in size. Recent works use machine learning techniques to overcome this crucial issue, but with no convergence guarantees. This project aims to provide theoretically sound frameworks for solving large Markov decision processes, and to exploit them to solve important combinatorial optimisation problems. This timely project can promote Australia's position in the development of such novel frameworks for many scientific and industrial applications.
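To illustrate the intractability the summary refers to: tabular value iteration, the textbook dynamic-programming solver for a finite Markov decision process, costs on the order of |A|·|S|² per sweep, so it breaks down as the state space grows. The sketch below is a generic illustration with made-up numbers, not the project's proposed method.

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Tabular value iteration for a finite MDP.

    P: transition tensor of shape (A, S, S); R: rewards of shape (A, S).
    Each Bellman backup costs O(|A| * |S|^2), which is why tabular
    methods become intractable as |S| grows.
    """
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
        Q = R + gamma * (P @ V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# Toy 2-state, 2-action MDP (illustrative numbers only).
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.0, 1.0]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
V, policy = value_iteration(P, R)
```

Machine-learning approaches sidestep the tabular sweep by approximating V with a parametric function, which is exactly where the convergence guarantees mentioned above are lost.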
Principled statistical methods for high-dimensional correlation networks. This project aims to develop a novel and principled approach for building correlation networks. Correlation networks aim to identify the most significant associations present in modern massive datasets, and have numerous applications, ranging from the biomedical and environmental sciences to the social sciences. Nodes of such networks represent features, and edges represent associations, or the lack thereof. Current methods are not readily scalable to modern ultra-high dimensional settings, and do not account for uncertainty in the estimated associations. This project will develop a principled, highly scalable methodology for building such networks, which incorporates uncertainty quantification. Emphasis is placed on modern ultra-high dimensional settings in which differentiating a true correlation from a spurious one is a notoriously difficult task.
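For concreteness, the naive baseline such a project improves on looks like the following: threshold the sample correlation matrix to decide which feature pairs get an edge. This sketch is purely illustrative (the threshold 0.8 is arbitrary) and carries no uncertainty quantification, which is precisely the gap the summary identifies.

```python
import numpy as np

def correlation_network(X, threshold=0.8):
    """Naive correlation network: nodes are the columns (features) of X;
    an edge connects two features whose absolute sample correlation
    exceeds `threshold`. No uncertainty quantification is attempted."""
    C = np.corrcoef(X, rowvar=False)   # p x p sample correlation matrix
    adj = np.abs(C) > threshold
    np.fill_diagonal(adj, False)       # no self-loops
    return adj

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)  # plant one strong association
A = correlation_network(X)
```

In ultra-high dimensions (p in the tens of thousands), the number of pairwise tests grows as p², and a fixed threshold inevitably admits spurious edges, motivating the scalable, uncertainty-aware methodology described above.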
Bayesian inversion and computation applied to atmospheric flux fields. This project aims to make use of unprecedented sources of measurements, from remote sensing and in situ data, to estimate the sources and sinks of greenhouse gases. An overabundance of greenhouse gases in Earth's atmosphere is arguably the most serious long-term threat to the planet's ecosystems. This project will combine measurement uncertainties, process uncertainties in the physical transport models, and any parameter uncertainties, to provide reliable uncertainty quantification for the estimates. This will be achieved with new Bayesian spatio-temporal inversions and big-data computational strategies. The resulting statistical inferences on greenhouse-gas flux fields will enable the development of critical mitigation strategies. These new statistical inferences will be a valuable resource to policy-makers worldwide, who are assessing progress towards global commitments. Further, the final product may assist in developing cost-effective mitigation strategies in the presence of uncertainty.
Discovery Early Career Researcher Award - Grant ID: DE180101252
Funder: Australian Research Council
Funding Amount: $343,450.00
Summary
Statistical theory and algorithms for joint inference of complex networks. This project aims to address the challenges in jointly modelling complex networks by applying an integrated approach encompassing statistical theory, computation, and applications. The project expects to contribute to core statistical methodology development for complex inference and generate new knowledge in the fields of genomics, neuroscience, and social science through in-depth analyses of large-scale multilayered network data. Expected outcomes include enhanced theoretical and computational frameworks for probabilistic network models to better utilise the power of multiple observations. This should foster international and interdisciplinary collaborations and add significant value to the rapidly progressing field of networks research.
Inference for Hawkes processes with challenging data. Hawkes processes are statistical models for the analysis of high-impact event sequences, such as bushfires, earthquakes, infectious diseases, and cyber attacks. When the times and/or marks are missing for some events, or when the data are otherwise incomplete, it is challenging to fit these models and to perform diagnostic checks on them. This project aims to develop novel statistical methods to fit these models in the presence of incomplete data and to check the goodness-of-fit of the fitted models. The expected outcomes include publications documenting these methods and software packages implementing them. The primary benefits include the advancement of statistical methodology and the training of junior research personnel.
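As background for readers unfamiliar with the model: a univariate Hawkes process is self-exciting, with conditional intensity λ(t) = μ + Σ_{tᵢ<t} α·exp(−β(t − tᵢ)), so each event temporarily raises the rate of future events. The standard way to simulate one (with complete data, before any of the missing-data complications this project addresses) is Ogata's thinning algorithm; the parameter values below are arbitrary.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=42):
    """Simulate a univariate Hawkes process by Ogata's thinning.

    Conditional intensity:
        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
    With an exponential kernel, the intensity is decreasing between
    events, so its value just after the last event is a valid upper
    bound for thinning. Stationarity requires alpha / beta < 1.
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < horizon:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)        # candidate next event time
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:  # accept with prob. lam_t / lam_bar
            events.append(t)
    return events

ts = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0)
```

When some event times or marks are unobserved, the likelihood built from this intensity no longer factorises cleanly over the observed events, which is the inferential difficulty the project targets.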
Fast flexible feature selection for high dimensional challenging data. The project aims to provide new frameworks for fast flexible feature selection and appropriate modelling of heterogeneous data through structural varying-coefficient regression models. The outcomes will be a series of new statistical methods and concepts enabling more powerful modelling of complex bioscience data. The project will create the science for building reliable statistical models taking model uncertainty into account, impacting how results will be interpreted, and with accompanying software. This will be a significant improvement in the assessment of model confidence in the food and health research priority areas including areas such as meat science, Huntington’s disease, and kidney transplantation.
Fast approximate inference methods: new algorithms, applications and theory. This project aims to develop new algorithms and theory for fast approximate inference and lay down infrastructure to aid future extensions. Fast approximate inference methods are a principled and extensible means of fitting large and complex statistical models to big data sets. They come into their own in applications where speed is paramount and traditional approaches are not feasible. The project aims to lead to practical outcomes from better business decision-making for insurance data warehouses, to improved medical imaging technology.
Perturbations in Complex Systems and Games. This project aims to: advance the perturbation theory of dynamic and stochastic games; further develop approximations of infinite dimensional linear programs by their finite dimensional counterparts, and by finding asymptotic limits of spaces of occupational measures, by solution of successive layers of fundamental equations; explain and quantify the "exceptionality" of instances of systems that are genuinely difficult to solve; and, capitalise on the outstanding performance of our Snakes-and-Ladders Heuristic (SLH) for the solution of the Hamiltonian cycle problem to identify its "fixed complexity orbits" and generalise this notion to other NP-complete problems.Read moreRead less
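For readers unfamiliar with the Hamiltonian cycle problem mentioned above: the task is to decide whether a graph contains a cycle visiting every vertex exactly once. The brute-force check below (unrelated to the SLH heuristic itself) enumerates all vertex orderings, which is why the problem is NP-complete and why effective heuristics are valuable.

```python
from itertools import permutations

def has_hamiltonian_cycle(adj):
    """Brute-force Hamiltonian cycle check for a small undirected graph
    given as an adjacency matrix. Fixing vertex 0 as the start, try
    every ordering of the remaining vertices: (n-1)! candidates, which
    illustrates the exponential cost heuristics like SLH avoid."""
    n = len(adj)
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm
        if all(adj[cycle[i]][cycle[(i + 1) % n]] for i in range(n)):
            return True
    return False

# 4-cycle 0-1-2-3-0: has a Hamiltonian cycle.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
```

Even for modest n, (n−1)! orderings are infeasible to enumerate, so practical solvers rely on heuristics and on understanding which instances are genuinely hard, as the project proposes.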
New methods for modelling real-world extremes. This project aims to develop new theory and methods for analysing and predicting extreme values observed in real-world processes. Many existing techniques are limited by convenient mathematical assumptions that commonly do not hold in practice: dependence at asymptotic levels, process stationarity, and that the observed data are direct measurements of the process of interest. As a result, using these techniques may produce undesirable results. Expected outcomes of this project include theoretically justified data analysis techniques that can accurately model extreme values seen in the real world. Project benefits include more realistic analyses of nationally important applications in climate, bushfire insurance risk, and anomaly detection.
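To make the "convenient mathematical assumptions" concrete: the classical peaks-over-threshold approach fits a generalised Pareto distribution (GPD) to exceedances over a high threshold, assuming independent, stationary, directly observed data. A minimal sketch using `scipy.stats.genpareto` on synthetic data (not the project's methodology, which targets the cases where these assumptions fail):

```python
import numpy as np
from scipy import stats

# Peaks-over-threshold: model excesses above a high threshold u with a
# generalised Pareto distribution. Synthetic exponential data stand in
# for observations; the classical assumptions (i.i.d., stationary,
# directly observed) hold here by construction.
rng = np.random.default_rng(1)
data = rng.standard_exponential(5000)
u = np.quantile(data, 0.95)        # high threshold
excesses = data[data > u] - u

# Fix the location at 0: the GPD models the excess itself.
# For exponential data the true shape is 0 and the true scale is 1.
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
```

Real-world extremes (clustered in time, non-stationary, or observed only indirectly) violate each of these assumptions in turn, which is the gap the project addresses.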
Computational methods for population-size-dependent branching processes. Branching processes are the primary mathematical tool used to model populations that evolve randomly in time. Most key results in the theory are derived under the simplifying assumption that individuals reproduce and die independently of each other. However, this assumption fails in most real-life situations, in particular when the environment has limited resources or when the habitat has a restricted capacity. This project aims to develop novel and effective algorithmic techniques and statistical methods for a class of branching processes with dependencies. We will use these results to study significant problems in the conservation of endangered island bird populations in Oceania, and to help inform their conservation management.
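A toy illustration of the dependence described above: in a population-size-dependent branching process, the offspring distribution changes with the current population size, for example shrinking as the population approaches a carrying capacity. The model form and parameter values below are hypothetical, chosen only to show how the independence assumption is relaxed.

```python
import numpy as np

def simulate_psdbp(z0, K, generations, seed=7):
    """Population-size-dependent branching process (illustrative).

    Each of the z individuals in a generation has a Poisson number of
    offspring with mean 2K / (K + z), so reproduction slows as the
    population z approaches the carrying capacity K. This breaks the
    classical assumption that individuals reproduce independently of
    the population size.
    """
    rng = np.random.default_rng(seed)
    sizes, z = [z0], z0
    for _ in range(generations):
        m = 2.0 * K / (K + z)                           # density-dependent mean
        z = int(rng.poisson(m, size=z).sum()) if z > 0 else 0
        sizes.append(z)
        if z == 0:                                      # extinction is absorbing
            break
    return sizes

path = simulate_psdbp(z0=10, K=100, generations=50)
```

Because the offspring law depends on z, the generating-function machinery behind most classical results no longer applies directly, motivating the algorithmic and statistical methods the project will develop, including for questions such as extinction risk in small island bird populations.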