Discovery Early Career Researcher Award - Grant ID: DE140100993
Funder
Australian Research Council
Funding Amount
$293,520.00
Summary
Mathematics of importance: the optimal importance sampling algorithm for estimating the probability of a black swan event. Rare-event simulation and modelling is critical to our understanding of high-cost, hard-to-predict events such as nuclear accidents, natural disasters, and financial crises. Quantitative analysis of such high-impact events demands accurate estimation of the probability that these rare events occur. In realistic models this probability is very difficult to estimate, because exact analytical formulas are not available and existing estimation methods fail spectacularly. There is an urgent need for new, efficient methodology. This project develops a new Monte Carlo method that can reliably and accurately estimate rare-event probabilities.
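To give a flavour of the idea behind this project (a minimal textbook sketch, not the project's actual method), here is importance sampling for a rare-event probability: the tail P(X > 4) for a standard normal X. The tilted proposal N(4, 1) and the corresponding likelihood-ratio weight are standard illustrative choices assumed here.

```python
import math
import random

random.seed(1)

def naive_mc(c, n):
    """Crude Monte Carlo estimate of P(X > c) for X ~ N(0, 1)."""
    return sum(random.gauss(0, 1) > c for _ in range(n)) / n

def importance_sampling(c, n):
    """Estimate P(X > c) by sampling from the tilted density N(c, 1)
    and reweighting by the likelihood ratio phi(x) / phi(x - c)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(c, 1)  # proposal centred on the rare region
        if x > c:
            total += math.exp(-c * x + c * c / 2)  # likelihood-ratio weight
    return total / n

c, n = 4.0, 100_000
exact = 0.5 * math.erfc(c / math.sqrt(2))  # P(N(0,1) > 4), roughly 3.2e-5
print(naive_mc(c, n))             # often exactly 0.0: the naive estimator rarely hits the event
print(importance_sampling(c, n))  # close to the exact tail probability
```

Centring the proposal on the rare region means most samples hit the event, and the weights correct the bias; with the same sampling budget the naive estimator typically returns zero.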
Discovery Early Career Researcher Award - Grant ID: DE160101147
Funder
Australian Research Council
Funding Amount
$294,336.00
Summary
Predicting extremes when events occur in bursts. This project seeks to advance knowledge in extreme value theory, which is essential for quantifying risks in complex systems, such as the risk of network failures. Current statistical models for the occurrence of extremes assume that events happen regularly. This assumption, however, is at odds with human actions and many biological and physical processes, which occur in bursts. There is a strong need to understand the effect of such 'bursty dynamics' on the frequency and magnitude of extreme events. This project aims to develop extreme value theory for bursty events and thus lay the mathematical groundwork for the estimation and prediction of extremes in a variety of scientific contexts.
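For contrast with the bursty setting the project targets, here is a minimal sketch (illustrative only) of the classical baseline for regularly occurring events: block maxima of i.i.d. Exp(1) variables, centred by log of the block size, follow a Gumbel law.

```python
import math
import random

random.seed(2)

def block_maxima(n_blocks, block_size):
    """Maxima of blocks of i.i.d. Exp(1) variables; classical extreme value
    theory says M_n - log(n) converges to a standard Gumbel distribution."""
    return [max(random.expovariate(1.0) for _ in range(block_size))
            for _ in range(n_blocks)]

block_size = 1000
maxima = block_maxima(500, block_size)
centred = [m - math.log(block_size) for m in maxima]
mean = sum(centred) / len(centred)
print(round(mean, 2))  # near the Gumbel mean, the Euler-Mascheroni constant ~0.577
```

When events arrive in bursts rather than regularly, this independence assumption breaks down, which is exactly the gap the project addresses.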
Statistical Methods for Flow Cytometric Data. The project will aid users of flow cytometry throughout Australia and will help foster collaborations between biological and mathematical scientists. Biological research is an important part of Australia's future and is becoming very quantitative. During the course of the project, two PhD students will receive strong training in statistics geared towards biological applications. The project is aligned with the 8th Human Leucocyte Differentiation Antigen workshop, culminating in Adelaide in December 2004, and will aid the fight against blood cell cancers. The project will also aid research on plankton, with potential commercial benefits for Australia's marine scallop industry.
Computational methods for population-size-dependent branching processes. Branching processes are the primary mathematical tool used to model populations that evolve randomly in time. Most key results in the theory are derived under the simplifying assumption that individuals reproduce and die independently of each other. However, this assumption fails in most real-life situations, in particular when the environment has limited resources or when the habitat has a restricted capacity. This project aims to develop novel and effective algorithmic techniques and statistical methods for a class of branching processes with dependence. We will use these results to study significant problems in the conservation of endangered island bird populations in Oceania, and to help inform their conservation management.
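A minimal sketch of what population-size dependence means (the offspring law and carrying capacity here are hypothetical choices for illustration, not the project's models): each individual has a Poisson number of offspring whose mean m(z) = 2K/(K + z) decays as the population z approaches a capacity K, so lineages no longer reproduce independently of the population size.

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's algorithm for Poisson sampling (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate(z0, capacity, generations):
    """One path of a population-size-dependent branching process: the
    offspring mean m(z) = 2*capacity / (capacity + z) falls as the
    population z grows, stabilising the path near the capacity."""
    path = [z0]
    z = z0
    for _ in range(generations):
        mean = 2 * capacity / (capacity + z)
        z = sum(poisson(mean) for _ in range(z))
        path.append(z)
    return path

path = simulate(10, 100, 200)
print(path[-1])  # fluctuates around the carrying capacity of 100
```

At the capacity the offspring mean equals 1, so the process self-regulates instead of exploding or dying out quickly, which is precisely the behaviour classical independence-based results fail to capture.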
ARC Centre of Excellence for Mathematical and Statistical Frontiers of Big Data, Big Models, New Insights. In today's world, massive amounts of data in a variety of forms are collected daily from a multitude of sources. Many of the resulting data sets have the potential to make vital contributions to society, business and government, as well as impact on international developments, but are so large or complex that they are difficult to process and analyse using traditional tools. The aim of this Centre is to create innovative mathematical and statistical models that can uncover the knowledge concealed within the size and complexity of these big data sets, with a focus on using the models to deliver insight into problems vital to the Centre's Collaborative Domains: Healthy People, Sustainable Environments and Prosperous Societies.
Search strategy optimisation by theory, functional analysis and simulation. This project aims to develop a novel computational platform, based on mathematical, statistical and physical theory, as well as advanced simulations, enabling the quantitative prediction of the optimal search strategy to be adopted by populations of agents searching for scarce targets in any given environment. This could lead to significant impacts on breakthrough developments in cancer immunotherapy, search and rescue robotics, ecological and environmental management, and developmental biology.
Innovations in Bayesian likelihood-free inference. Bayesian inference is a statistical method of choice in applied science. This project will develop innovative tools that permit Bayesian inference in problems considered intractable only a few years ago. These methods will expedite advances in multidisciplinary research across a range of applications. With these foundations, this project will accelerate national research efforts into improving frameworks for projecting trends in water availability and management, the impact of climate extremes, telecommunications engineering, HIV and infectious disease modelling, and biostatistics. With many sectors unable to recruit appropriately trained statisticians within Australia, this project will train four PhD students in Bayesian statistics.
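A minimal sketch of one standard likelihood-free technique, approximate Bayesian computation (ABC) by rejection; the binomial example, uniform prior, and tolerance are illustrative assumptions, not the project's methods.

```python
import random

random.seed(4)

def simulate_data(p, n_trials):
    """Forward-simulate the model: number of successes in n_trials."""
    return sum(random.random() < p for _ in range(n_trials))

# "Observed" data: 30 successes out of 100 trials.
n_trials, observed = 100, 30

def abc_rejection(n_draws, tolerance):
    """Likelihood-free ABC rejection sampler: draw p from the Uniform(0, 1)
    prior, simulate data, and keep p when the simulated summary is within
    `tolerance` of the observed one. No likelihood evaluation is needed."""
    accepted = []
    for _ in range(n_draws):
        p = random.random()
        if abs(simulate_data(p, n_trials) - observed) <= tolerance:
            accepted.append(p)
    return accepted

posterior = abc_rejection(20_000, 2)
print(round(sum(posterior) / len(posterior), 2))  # close to the analytic posterior mean 31/102 ~ 0.30
```

With a uniform prior the exact posterior is Beta(31, 71) with mean 31/102, so the ABC sample mean should land nearby; shrinking the tolerance trades acceptance rate for accuracy, which is the basic tension the more sophisticated likelihood-free methods address.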
Statistical methods for analysing multi-source microarray data and building gene regulatory networks. I will devise a statistical learning technique that does not force a gene to be assigned to exactly one category. This technique reflects the biological reality that a gene can belong to two or more functional categories. Therefore, the new technique will improve a model's ability to identify regulatory genes in different types of cancer; these regulatory genes can be targeted by new anti-cancer drugs, resulting in more effective treatment. I will model gene regulatory networks using microarray data from multiple sources. These networks will be used to identify regulatory cliques - groups of genes that are vital for a cellular function. This will improve our understanding of debilitating conditions such as asthma.
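As an illustration of the soft-assignment idea (a hypothetical sketch, not the technique the project proposes), mixture-model "responsibilities" let one observation belong partially to several clusters instead of being forced into exactly one:

```python
import math

def responsibilities(x, means, sd=1.0):
    """Soft cluster assignment: the posterior probability that point x
    belongs to each Gaussian component (equal weights, shared sd),
    rather than a single hard label."""
    dens = [math.exp(-((x - m) ** 2) / (2 * sd * sd)) for m in means]
    total = sum(dens)
    return [d / total for d in dens]

means = [0.0, 4.0]
print([round(r, 2) for r in responsibilities(0.0, means)])  # [1.0, 0.0]: clearly cluster 0
print([round(r, 2) for r in responsibilities(2.0, means)])  # [0.5, 0.5]: shared membership
```

A gene sitting "between" two functional categories would, in the same spirit, receive substantial membership in both rather than being misassigned to one.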
Asymptotic Expansions and Large Deviations in Probability and Statistics: Theory and Applications. Statistics is the major enabling science in a number of disciplines. This is fundamental research in probability and statistics, but it has wide applications in biology and the social sciences that will ultimately be of national benefit. The behaviour of self-normalized sums is an exciting new area of fundamental research with implications for the application of statistics in many areas. U-statistics for dependent data have direct application to understanding financial time series and to the analysis of sample survey data. Saddlepoint methods provide extremely accurate approximations in a number of important applications.
Empirical saddlepoint approximations and self-normalized limit theorems. Finite population sampling and resampling methods, such as the bootstrap and randomization methods, are central in a number of areas of application, and M-estimates are the main approach for obtaining robust methods under mild conditions; in both areas, the statistics used are Studentized or self-normalized. We will develop asymptotic approaches for such statistics. Saddlepoint and empirical saddlepoint methods will be used to derive methods with second-order relative accuracy in large deviation regions, and we will obtain limit results and Edgeworth approximations. Emphasis will be on obtaining results under the weak conditions necessary for applications.
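To give a flavour of why saddlepoint approximations are so accurate (a textbook illustration, not this project's empirical version): for the mean of n Exp(1) variables the saddlepoint density is exact up to a constant, so its relative error stays uniform even deep in the tail, unlike a normal approximation.

```python
import math

def saddlepoint_density(x, n):
    """Saddlepoint approximation to the density of the mean of n Exp(1)
    variables: solve K'(s) = x for the cumulant generating function
    K(t) = -log(1 - t), giving s = 1 - 1/x, then apply
    f_hat(x) = sqrt(n / (2*pi*K''(s))) * exp(n*(K(s) - s*x))."""
    s = 1 - 1 / x
    K = -math.log(1 - s)      # equals log x
    K2 = 1 / (1 - s) ** 2     # equals x**2
    return math.sqrt(n / (2 * math.pi * K2)) * math.exp(n * (K - s * x))

def exact_density(x, n):
    """Exact density: the mean of n Exp(1) variables is Gamma(n, rate n)."""
    return n ** n * x ** (n - 1) * math.exp(-n * x) / math.factorial(n - 1)

n = 10
for x in (0.5, 1.0, 3.0):  # x = 3 is far in the right tail
    print(x, round(saddlepoint_density(x, n) / exact_density(x, n), 4))  # ratio ~1.0084, constant in x
```

The ratio is the Stirling-approximation error in the normalizing constant and does not depend on x, so renormalizing the saddlepoint density makes it exact here; in general models it remains accurate to second order in large deviation regions, which is the property the project exploits.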