ORCID Profile
0000-0002-4551-203X
Current Organisation
Macquarie University
In Research Link Australia (RLA), "Research Topics" refer to ANZSRC FOR and SEO codes. These topics are either sourced from ANZSRC FOR and SEO codes listed in researchers' related grants or generated by a large language model (LLM) based on their publications.
Biostatistics | Clinical Sciences not elsewhere classified | Statistics
Application Software Packages (excl. Computer Games) | Expanding Knowledge in the Mathematical Sciences | Expanding Knowledge in the Medical and Health Sciences
Publisher: Informa UK Limited
Date: 29-06-2016
Publisher: World Scientific Pub Co Pte Ltd
Date: 04-2011
DOI: 10.1142/S1793536911000817
Abstract: Total variation (TV) minimization models are widely used in image processing, mainly due to their remarkable ability to preserve edges. There are many methods for solving the TV model. These methods, however, seldom consider the positivity constraint one should impose on image-processing problems. In this paper we develop and implement a new approach for TV image restoration. Our method is based on the multiplicative iterative algorithm originally developed for tomographic image reconstruction. The advantages of our algorithm are that it is very easy to derive and implement under different image noise models, and that it respects the positivity constraint. Our method can be applied to various noise models commonly used in image restoration, such as the Gaussian noise model, the Poisson noise model, and the impulsive noise model. In the numerical tests, we apply our algorithm to deblur images corrupted by Gaussian noise. The results show that our method gives better restored images than the forward–backward splitting algorithm.
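A generic form of the positivity-constrained TV restoration problem described in this abstract (notation ours; the paper's exact formulation, particularly for non-Gaussian noise models, may differ) is

\min_{x \ge 0} \; \tfrac{1}{2}\lVert Ax - b\rVert_2^2 + \lambda\,\mathrm{TV}(x),

where A is the blurring operator, b the observed image, and \lambda > 0 the regularisation weight. A multiplicative update of the form x \leftarrow x \odot c(x), with c(x) \ge 0 componentwise, keeps every iterate non-negative, which is how the positivity constraint is respected.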
Publisher: Informa UK Limited
Date: 31-10-2019
Publisher: Informa UK Limited
Date: 04-08-2020
Publisher: SAGE Publications
Date: 17-01-2022
DOI: 10.1177/09622802211070254
Abstract: Competing risks models are attractive tools to analyze time-to-event data where several causes of an event are competing. However, a complexity may arise when, for instance, some subjects experience the event of interest but the causes are not known. Assuming that unknown causes of events are missing at random, we developed a novel constrained maximum penalized likelihood method for fitting semi-parametric cause-specific Cox regression models. Here, penalty functions were used to smooth the baseline hazards. An appealing feature of this approach is that all the relevant estimands in competing risks models are estimated, including cause-specific hazard ratios, cause-specific baseline hazards, and cumulative incidence functions. Asymptotic results for these estimators were also developed, allowing for direct inferences. The proposed method was compared with some existing methods through a simulation study. A real data example was analyzed using the new method to evaluate the association of age at diagnosis with melanoma death and non-melanoma death in patients diagnosed with thin melanoma (tumour thickness ≤1.0 mm). An R function for our proposed method is currently available on GitHub and will be included in the R package "survivalMPL" on CRAN.
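For reference, the cause-specific Cox quantities estimated by this kind of model (notation ours) are

\lambda_k(t \mid x) = \lambda_{0k}(t)\exp(x^\top \beta_k), \qquad F_k(t \mid x) = \int_0^t S(u \mid x)\,\lambda_k(u \mid x)\,du, \qquad S(t \mid x) = \exp\Bigl(-\sum_{k} \int_0^t \lambda_k(u \mid x)\,du\Bigr),

where \lambda_{0k} is the cause-specific baseline hazard for cause k (smoothed here by the penalty) and F_k is the corresponding cumulative incidence function.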
Publisher: Elsevier BV
Date: 06-2010
Publisher: Informa UK Limited
Date: 08-12-2016
Publisher: Elsevier BV
Date: 06-2014
Publisher: Springer Science and Business Media LLC
Date: 30-07-2010
Publisher: IEEE
Date: 2010
Publisher: Wiley
Date: 30-12-2022
DOI: 10.1002/SIM.9645
Abstract: Time-varying covariates can be important predictors when model-based predictions are considered. A Cox model that includes time-varying covariates is usually referred to as an extended Cox model. When only right censoring is present in the observed survival times, the conventional partial likelihood method is still applicable to estimate the regression coefficients of an extended Cox model. However, if there are interval-censored survival times, then the partial likelihood method is not directly available unless an imputation, such as middle point imputation, is used to replace the left- and interval-censored data. Such imputation methods, however, are well known for causing biases. This paper considers fitting of extended Cox models using the maximum penalised likelihood method, allowing observed survival times to be partly interval censored, where a penalty function is used to regularise the baseline hazard estimate. We present simulation studies to demonstrate the performance of our proposed method, and illustrate our method with applications to two real datasets from medical research.
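A minimal statement of the extended Cox model and the penalised likelihood being maximised (notation ours; the specific penalty used in the paper may differ) is

\lambda(t \mid x(t)) = \lambda_0(t)\exp\bigl(x(t)^\top \beta\bigr), \qquad \ell_p(\beta, \lambda_0) = \ell(\beta, \lambda_0) - \gamma\, J(\lambda_0),

where x(t) collects the time-varying covariates, \ell is the log-likelihood built from the partly interval-censored observations, J is a roughness penalty on the baseline hazard \lambda_0, and \gamma \ge 0 is the smoothing parameter.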
Publisher: Wiley
Date: 03-02-2012
DOI: 10.1002/SIM.4455
Abstract: Quality of life (QOL) assessment is a key component of many clinical studies and frequently requires the use of single global summary measures that capture the overall balance of findings from a potentially wide-ranging assessment of QOL issues. We propose and evaluate an irregular multilevel latent variable model suitable for use as a global summary tool for health-related QOL assessments. The proposed model is a multiple indicator and multiple cause style of model with a two-level latent variable structure. We approach the modeling from a general multilevel modeling perspective, using a combination of random and nonrandom cluster types to accommodate the mixture of issues commonly evaluated in health-related QOL assessments: overall perceptions of QOL and health, along with specific psychological, physical, social, and functional issues. Using clinical trial data, we evaluate the merits and application of this approach in detail, both for mean global QOL and for change from baseline. We show that the proposed model generally performs well in comparing global patterns of treatment effect and provides more precise and reliable estimates than several common alternatives such as selecting from or averaging observed global item measures. A variety of computational methods could be used for estimation. We derived a closed-form expression for the marginal likelihood that can be used to obtain maximum likelihood parameter estimates when normality assumptions are reasonable. Our approach is useful for QOL evaluations aimed at pharmacoeconomic or individual clinical decision making and in obtaining summary QOL measures for use in quality-adjusted survival analyses.
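The paper's two-level structure is more elaborate, but a single-level multiple indicator, multiple cause (MIMIC) skeleton (notation ours, purely illustrative) conveys the idea:

y_{ij} = \lambda_j \eta_i + \varepsilon_{ij}, \qquad \eta_i = \gamma^\top x_i + \zeta_i,

where the observed QOL items y_{ij} act as indicators of the latent global QOL \eta_i for subject i, and the covariates x_i act as its "causes".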
Publisher: Walter de Gruyter GmbH
Date: 27-10-2021
Abstract: This paper considers the problem of semi-parametric proportional hazards model fitting where observed survival times contain event times and also interval, left and right censoring times. Although this is not a new topic, many existing methods suffer from poor computational performance. In this paper, we adopt a more versatile penalized likelihood method to estimate the baseline hazard and the regression coefficients simultaneously. The baseline hazard is approximated using basis functions such as M-splines. A penalty is introduced to regularize the baseline hazard estimate and also to ease dependence of the estimates on the knots of the basis functions. We propose a Newton–MI (multiplicative iterative) algorithm to fit this model. We also present novel asymptotic properties of our estimates, allowing for the possibility that some parameters of the approximate baseline hazard may lie on the parameter space boundary. Comparisons of our method against other similar approaches are made through an intensive simulation study. Results demonstrate that our method is very stable and encounters virtually no numerical issues. A real data application involving melanoma recurrence is presented and an R package ‘survivalMPL’ implementing the method is available on R CRAN.
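In the notation typically used for this approach (ours here; details may differ from the paper), the baseline hazard is approximated as

h_0(t) \approx \sum_{u=1}^{m} \theta_u \psi_u(t), \qquad \theta_u \ge 0,

with M-spline basis functions \psi_u, and the estimates maximise a penalised log-likelihood of the form \ell(\beta, \theta) - \lambda\, \theta^\top R \theta, where R is a roughness matrix (e.g. built from integrated squared second derivatives of the basis) and \lambda \ge 0 controls smoothness and reduces sensitivity to the choice of knots.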
Publisher: MDPI AG
Date: 12-03-2013
DOI: 10.3390/A6010136
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 07-2012
Publisher: Elsevier BV
Date: 2015
Publisher: Wiley
Date: 16-10-2023
DOI: 10.1002/SIM.9926
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 06-2008
Publisher: Informa UK Limited
Date: 1998
Publisher: IEEE
Date: 2007
Publisher: IEEE
Date: 03-2011
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 08-2008
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 02-2010
Publisher: Informa UK Limited
Date: 06-2006
Publisher: Oxford University Press (OUP)
Date: 26-10-2018
DOI: 10.1111/RSSA.12418
Abstract: Credit granting institutions are in the business of lending money to customers, some of whom subsequently fail to repay as promised. For these events, accurate loan balance estimates, termed exposure at default (EAD), provide quantification of potential losses and form a required input to minimum credit capital calculation under the Basel II Accord. Most available EAD research estimates the credit conversion factor (CCF), which is a transform of EAD, but, as we highlight, this has substantial deficiencies: an inherent singularity rendering the CCF undefined or numerically unstable, and a tendency to produce EAD estimates that fail economic intuition. We build a descriptive model for EAD, without relying on the CCF, using the Global Credit Data database, advancing the literature in three important ways. First we identify, like other studies on revolving facilities, that balance and limits drive EAD and we therefore develop our model to capture these joint dynamics flexibly. Second we find evidence in the data of risk-based line management where lenders tend to decrease limits for riskier obligors. Third we confirm results from other studies of mild EAD countercyclicality, whereby EAD is lower during a subdued economy.
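The singularity mentioned above follows directly from the usual definition of the credit conversion factor (notation ours):

\mathrm{CCF} = \frac{\mathrm{EAD} - B}{L - B},

where B is the drawn balance and L the limit at the reference date. When a facility is (nearly) fully drawn, L - B approaches zero and the CCF becomes undefined or numerically unstable, which is why the paper models EAD directly rather than through the CCF.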
Publisher: Wiley
Date: 03-2015
DOI: 10.1111/ANZS.12110
Publisher: SPIE
Date: 09-02-2006
DOI: 10.1117/12.645354
Publisher: Wiley
Date: 26-04-2022
DOI: 10.1002/SIM.9415
Abstract: Time-to-event data in medical studies may involve some patients who are cured and will never experience the event of interest. In practice, those cured patients are right censored. However, when data contain a cured fraction, standard survival methods such as Cox proportional hazards models can produce biased results and therefore misleading interpretations. In addition, for some outcomes the exact time of an event is not known; instead, an interval of time in which the event occurred is recorded. This article proposes a new computational approach that can deal with both the cured fraction issue and the interval censoring challenge. To do so, we extend the traditional mixture cure Cox model to accommodate data with partly interval censoring for the observed event times. The traditional method for estimation of the model parameters is based on the expectation-maximization (EM) algorithm, where the log-likelihood is maximized through an indirect complete data log-likelihood function. We propose in this article an alternative algorithm that directly optimizes the log-likelihood function. Extensive Monte Carlo simulations are conducted to demonstrate the performance of the new method over the EM algorithm. The main advantage of the new algorithm is the generation of asymptotic variance matrices for all the estimated parameters. The new method is applied to a thin melanoma dataset to predict melanoma recurrence. Various inferences, including survival and hazard function plots with point-wise confidence intervals, are presented. An R package is now available on GitHub and will be uploaded to CRAN.
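A standard mixture cure Cox formulation of the kind being extended here (notation ours) is

S_{\mathrm{pop}}(t \mid x, z) = 1 - \pi(z) + \pi(z)\, S_u(t \mid x), \qquad \pi(z) = \frac{\exp(\gamma^\top z)}{1 + \exp(\gamma^\top z)}, \qquad S_u(t \mid x) = \exp\bigl(-\Lambda_0(t)\, e^{x^\top \beta}\bigr),

where \pi(z) is the probability of being susceptible (uncured) and S_u is the survival function of the uncured. The article fits this under partly interval-censored event times by maximising the observed-data log-likelihood directly rather than via EM.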
Publisher: Elsevier BV
Date: 03-2011
Publisher: Springer Science and Business Media LLC
Date: 06-02-2007
DOI: 10.1007/S10709-007-9139-4
Abstract: Since it was first recognised that eukaryotic genes are fragmented into coding segments (exons) separated by non-coding segments (introns), the reason for this phenomenon has been debated. There are two dominant theories: that the piecewise arrangement of genes allows functional protein domains, represented by exons, to recombine by shuffling to form novel proteins with combinations of functions, or that introns represent parasitic DNA that can infest the eukaryotic genome because it does not interfere grossly with the fitness of its host. Differing distributions of exon lengths are predicted by these two theories. In this paper we examine distributions of exon lengths for six different organisms and find that they offer empirical evidence that both theories may in part be correct.
Publisher: Elsevier BV
Date: 09-2019
Publisher: Foundation for Open Access Statistics
Date: 2020
Publisher: Wiley
Date: 26-03-2018
DOI: 10.1002/SIM.7651
Abstract: This paper considers estimation of Cox proportional hazards models under informatively right-censored data using maximum penalized likelihood, where the dependence between censoring and event times is modelled by a copula function and a roughness penalty function is used to constrain the baseline hazard to be smooth. Since the baseline hazard is nonnegative, we propose a special algorithm in which each iteration updates the regression coefficients by the Newton algorithm and the baseline hazard by the multiplicative iterative algorithm. The asymptotic properties for both the regression coefficient and baseline hazard estimates are developed. The simulation study investigates the performance of our method and also compares it with an existing maximum likelihood method. We apply the proposed method to a dementia patients dataset.
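The dependence between the event time T and the censoring time C is typically expressed through a copula (notation ours; the specific copula family and how its parameter is handled follow the paper):

P(T > t, C > c) = K_\alpha\bigl(S_T(t), S_C(c)\bigr),

where S_T and S_C are the marginal survival functions and \alpha encodes the strength of dependence; choosing the independence copula for K recovers the usual non-informative censoring assumption.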
Publisher: IEEE
Date: 05-2008
Publisher: Elsevier BV
Date: 09-2016
Publisher: SAGE Publications
Date: 03-1994
DOI: 10.1177/096228029400300104
Abstract: Many algorithms for medical image reconstruction adopt versions of the expectation-maximization (EM) algorithm. In this approach, parameter estimates that maximize a complete-data likelihood or penalized likelihood are obtained at each iteration. Implicitly (and sometimes explicitly) penalized algorithms require smoothing of the current reconstruction in the image domain as part of their iteration scheme. In this paper, we discuss alternatives to EM which adapt Fisher's method of scoring (FS) and other methods for direct maximization of the incomplete-data likelihood. Jacobi and Gauss-Seidel methods for non-linear optimization provide efficient algorithms for applying FS in tomography. One approach uses smoothed projection data in its iterations. We investigate the convergence of Jacobi and Gauss-Seidel algorithms with clinical tomographic projection data.
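Fisher's method of scoring, the core of the alternatives discussed above, updates the image estimate \theta by

\theta^{(k+1)} = \theta^{(k)} + \mathcal{I}\bigl(\theta^{(k)}\bigr)^{-1} U\bigl(\theta^{(k)}\bigr),

where U is the score (gradient) of the incomplete-data log-likelihood and \mathcal{I} the Fisher information; roughly speaking, the Jacobi and Gauss-Seidel variants avoid solving this linear system exactly by sweeping over pixels simultaneously or sequentially.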
Start Date: 06-2022
End Date: 05-2025
Amount: $405,000.00
Funder: Australian Research Council