ORCID Profile
0000-0003-1653-5803
Current Organisation
University of Queensland
Publisher: SAGE Publications
Date: 07-2018
Abstract: Acinetobacter is a rare but important cause of peritonitis in peritoneal dialysis (PD) patients. As the complication has not been comprehensively evaluated previously, the present study examined the outcomes of Acinetobacter peritonitis in a large, national cohort of PD patients. The study included all episodes of peritonitis in Australia from January 2004 to December 2014 using Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) data. The primary outcome was peritonitis cure and secondary outcomes were catheter removal, hemodialysis transfer, recurrent/relapsing peritonitis, peritonitis-related hospitalization, and death. Outcomes were compared using multivariable logistic regression. Overall, 5,367 patients experienced 11,122 episodes of peritonitis across 51 centers in Australia. Of these, 228 (4.2%) patients experienced 253 (2.3%) episodes of Acinetobacter peritonitis (176 episodes were due to Acinetobacter alone and 77 involved co-infection with other organisms). Of the 176 solitary Acinetobacter episodes, 131 (74%) achieved cure with antibiotics alone. Compared with Acinetobacter, significantly lower odds of peritonitis cure were observed for Pseudomonas (adjusted odds ratio [AOR] 0.24, 95% confidence interval [CI] 0.16 – 0.36), other gram-negative organisms (AOR 0.54, 95% CI 0.37 – 0.77), fungi (AOR 0.02, 95% CI 0.01 – 0.03), and polymicrobial organisms (AOR 0.36, 95% CI 0.25 – 0.51), whilst similar odds of cure were observed for Staphylococcus (AOR 0.73, 95% CI 0.50 – 1.06), other gram-positive organisms (AOR 1.32, 95% CI 0.93 – 1.89), culture-negative (AOR 1.19, 95% CI 0.82 – 1.71), and other organisms (AOR 0.72, 95% CI 0.49 – 1.07). The odds of catheter removal and hemodialysis transfer were higher with Pseudomonas, other gram-negative, fungal, and polymicrobial peritonitis than with Acinetobacter peritonitis. The odds of death were also higher with Pseudomonas and fungal peritonitis than with Acinetobacter peritonitis.
Treatment of Acinetobacter peritonitis with gentamicin, ciprofloxacin, or ceftazidime achieved comparable outcomes. Outcomes of Acinetobacter peritonitis were favorable compared with most other forms of organism-specific peritonitis. Commonly used antibiotics covering gram-negative bacteria achieved comparable outcomes in Acinetobacter peritonitis.
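The adjusted odds ratios above come from multivariable logistic regression, but the underlying arithmetic of a simple (unadjusted) odds ratio and its Wald confidence interval is straightforward. A minimal sketch in Python, using hypothetical 2×2 counts rather than figures from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Wald 95% CI.

    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: cured vs not cured for two organism groups
or_, lo, hi = odds_ratio_ci(131, 45, 60, 40)
print(f"OR {or_:.2f} (95% CI {lo:.2f} - {hi:.2f})")
```

Adjusted odds ratios, as reported in the abstract, additionally condition on covariates in the regression model; this sketch shows only the crude two-group comparison.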
Publisher: Elsevier BV
Date: 03-2018
DOI: 10.1053/J.AJKD.2017.09.018
Abstract: Many randomized controlled trials have been performed with the goal of improving outcomes related to hemodialysis vascular access. If the reported outcomes are relevant and measured consistently to allow comparison of interventions across trials, such trials can inform decision making. This study aimed to assess the scope and consistency of vascular access outcomes reported in contemporary hemodialysis trials. Systematic review. Adults requiring maintenance hemodialysis. All randomized controlled trials and trial protocols reporting vascular access outcomes identified from ClinicalTrials.gov, Embase, MEDLINE, and the Cochrane Kidney and Transplant Specialized Register from January 2011 to June 2016. Any hemodialysis-related intervention. The frequency and characteristics of vascular access outcome measures were analyzed and classified. From 168 relevant trials, 1,426 access-related outcome measures were extracted and classified into 23 different outcomes. The 3 most common outcomes were function (136 [81%] trials), infection (63 [38%]), and maturation (31 [18%]). Function was measured in 489 different ways, but most frequently reported as "mean access blood flow (mL/min)" (37 [27%] trials) and "number of thromboses" (30 [22%]). Infection was assessed in 136 different ways, with "number of access-related infections" being the most common measure. Maturation was assessed in 44 different ways at 15 different time points and most commonly characterized by vein diameter and blood flow. Patient-reported outcomes, including pain (19 [11%]) and quality of life (5 [3%]), were reported infrequently. Only a minority of trials used previously standardized outcome definitions. Restricted sampling frame for feasibility and focus on contemporary trials. The reporting of access outcomes in hemodialysis trials is very heterogeneous, with limited patient-reported outcomes and infrequent use of standardized outcome measures.
Efforts to standardize outcome reporting for vascular access are critical to optimizing the comparability, reliability, and value of trial evidence to improve outcomes for patients requiring hemodialysis.
Publisher: Wiley
Date: 12-2008
DOI: 10.1111/J.1440-1754.2008.01413.X
Abstract: A 1993 study of blood lead levels (BLLs) in pre-schoolers living in Fremantle showed 25% had BLLs ≥10 µg/dL. This study compares the 1993 BLLs with a sample of contemporary Fremantle pre-schoolers. Pre-schoolers (0-5 years) living in the Fremantle area were recruited from hospital and community settings during 2005. As in the 1993 study, guardians completed a questionnaire concerning demographic, environmental and behavioural variables. BLLs were determined by the same method used in 1993. Statistical analysis compared the 1993 and 2005 samples according to demographic variables and dichotomised BLL. Multivariate linear regression was used to control for confounding variables, and linear regression was used to identify risk factors in the 2005 sample. Community (40) and hospital (60) participants provided blood and completed questionnaires; none had BLLs ≥10 µg/dL. Compared with the 1993 sample, 2005 participants were younger, fewer were Aboriginal, more had occupied their homes for over 6 months and more had a habit of putting soil in their mouths. After controlling for these variables, the geometric mean BLL in 2005 remained significantly lower than the 1993 value (1.83 and 6.82 µg/dL respectively). As in 1993, Aboriginality, presence of participants during home renovation, occupancy of home less than 6 months and living <200 m from a main road were associated with higher mean BLLs. The reassuring decline in the mean BLL between the 1993 and 2005 samples is likely associated with the phasing out of leaded petrol. Future research should concentrate on monitoring groups at higher risk.
Publisher: SAGE Publications
Date: 09-2017
Abstract: The HONEYPOT trial failed to establish the superiority of exit-site application of Medihoney compared with nasal mupirocin prophylaxis for the prevention of peritonitis in peritoneal dialysis (PD) patients. This study aimed to assess the representativeness of the patients in the HONEYPOT trial to the Australian and New Zealand PD population. This study compared baseline characteristics of the 371 PD patients in the HONEYPOT trial with those of 6,085 PD patients recorded on the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. Compared with the PD population, the HONEYPOT sample was older (standardized difference [d] = 0.19, p = 0.003), more likely to be treated with automated PD (d = 0.58, p < 0.001), had higher residual renal function (d = 0.26, p < 0.001), had a higher proportion of participants with end-stage kidney disease due to polycystic kidney disease (d = 0.17) and lower proportions due to diabetes (d = -0.17) and glomerulonephritis (d = -0.18) (p < 0.001), and had lower proportions of indigenous people (d = -0.17, p < 0.001), current smokers (d = -0.10, p < 0.001), and people with prior histories of hemodialysis (d = -0.16, p < 0.001), diabetes mellitus (d = -0.18, p < 0.001), and coronary artery disease (d = -0.15, p < 0.001). HONEYPOT trial participants tended to be healthier than the Australian and New Zealand PD patient population. Although the differences between the groups were generally modest, their cumulative effect may have had some impact on external generalizability, which is not uncommon in clinical trials.
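The standardized differences (d) reported above compare trial and registry baseline characteristics on a scale-free metric, independent of sample size. A minimal sketch of the usual calculation for a continuous variable, using hypothetical means and standard deviations rather than values from the study:

```python
import math

def standardized_difference(m1, s1, m2, s2):
    """Standardized difference between two group means, using the
    average-variance denominator common in baseline-characteristics
    tables (analogous to Cohen's d)."""
    return (m1 - m2) / math.sqrt((s1 ** 2 + s2 ** 2) / 2)

# Hypothetical mean age (SD) in a trial sample vs. a registry population
d = standardized_difference(62.0, 13.0, 59.5, 14.0)
print(f"standardized difference d = {d:.2f}")
```

A common rule of thumb treats |d| < 0.1 as a negligible imbalance, which is why the abstract characterizes most of the reported differences as modest.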
Publisher: Springer Science and Business Media LLC
Date: 30-05-2023
DOI: 10.1186/S13063-023-07363-4
Abstract: An increasing number of older people are living with chronic kidney disease (CKD). Many have complex healthcare needs and are at risk of deteriorating health and functional status, which can adversely affect their quality of life. Comprehensive geriatric assessment (CGA) is an effective intervention to improve survival and independence of older people, but its clinical utility and cost-effectiveness in frail older people living with CKD is unknown. The GOAL Trial is a pragmatic, multi-centre, open-label, superiority, cluster randomised controlled trial developed by consumers, clinicians, and researchers. It has a two-arm design, CGA compared with standard care, with 1:1 allocation of a total of 16 clusters. Within each cluster, study participants ≥ 65 years of age (or ≥ 55 years if Aboriginal or Torres Strait Islander (First Nations Australians)) with CKD stage 3–5/5D who are frail, measured by a Frailty Index (FI) of at least 0.25, are recruited. Participants in intervention clusters receive a CGA by a geriatrician to identify medical, social, and functional needs, optimise medication prescribing, and arrange multidisciplinary referral if required. Those in standard care clusters receive usual care. The primary outcome is attainment of self-identified goals assessed by standardised Goal Attainment Scaling (GAS) at 3 months. Secondary outcomes include GAS at 6 and 12 months, quality of life (EQ-5D-5L), frailty (Frailty Index – Short Form), transfer to residential aged care facilities, cost-effectiveness, and safety (cause-specific hospitalisations, mortality). A process evaluation will be conducted in parallel with the trial, examining whether the intervention was delivered as intended, any issues or local barriers to intervention delivery, and perceptions of the intervention by participants. The trial has 90% power to detect a clinically meaningful mean difference in GAS of 10 units. This trial addresses patient-prioritised outcomes.
It will be conducted, disseminated and implemented by clinicians and researchers in partnership with consumers. If CGA is found to have clinical and cost-effectiveness for frail older people with CKD, the intervention framework could be embedded into routine clinical practice. The implementation of the trial’s findings will be supported by presentations at conferences and forums with clinicians and consumers at specifically convened workshops, to enable rapid adoption into practice and policy for both nephrology and geriatric disciplines. It has potential to materially advance patient-centred care and improve clinical and patient-reported outcomes (including quality of life) for frail older people living with CKD. ClinicalTrials.gov NCT04538157. Registered on 3 September 2020.
Publisher: Oxford University Press (OUP)
Date: 05-2017
Publisher: Oxford University Press (OUP)
Date: 09-02-2014
DOI: 10.1093/NDT/GFU004
Abstract: Left ventricular (LV) systolic dysfunction is an important predictor of cardiovascular death. Global longitudinal strain (GLS) is a widely available echocardiographic technique proven to be more sensitive than conventional ejection fraction (EF) in detecting subtle changes in LV function. However, the prognostic value of GLS in patients with chronic kidney disease (CKD) is unknown. We studied 447 patients from a single center who were stratified according to estimated glomerular filtration rate (eGFR). GLS was calculated using two-dimensional speckle tracking and EF was measured using Simpson's biplane method. A Cox proportional hazards model was used to identify independent predictors of survival, and measures of discrimination and reclassification were used to assess the predictive value of GLS. Multivariable regression models were used to evaluate clinical and laboratory factors associated with GLS. The mean EF was 58 ± 11% and mean GLS was -16.6 ± 4.2%. eGFR correlated negatively with GLS (r = -0.14, P = 0.004). Factors independently associated with GLS included gender, previous myocardial infarction, eGFR and phosphate (R² = 0.16, P < 0.001). Sixty-four patients died over a follow-up of 5.2 ± 1.4 years. GLS remained a significant predictor of all-cause mortality [hazard ratio (HR) 1.08, 95% confidence interval (CI) 1.01-1.15] following adjustment for age, diabetes mellitus, hypertension, eGFR and left ventricular mass index (LVMI). The strength of association between demographic data, eGFR, LVMI and mortality increased following the addition of GLS [c-statistic 0.68 (95% CI 0.61-0.74) to 0.71 (95% CI 0.64-0.77), P = 0.04]. Addition of GLS also demonstrated a 21% net reclassification improvement in risk prediction for all-cause mortality over clinical factors. GLS is an important predictor of all-cause mortality in CKD patients. Traditional and non-traditional risk factors such as phosphate are important determinants of GLS.
Strain assessment in CKD patients may provide greater cardiovascular risk stratification.
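The c-statistic quoted above measures discrimination: the probability that a randomly chosen patient who died had a higher risk score than a randomly chosen survivor. A minimal sketch of that pairwise definition, using hypothetical risk scores and outcome indicators rather than data from the study:

```python
def c_statistic(scores, outcomes):
    """Concordance (c-statistic) for a binary outcome: fraction of
    case/non-case pairs where the case has the higher risk score.
    Tied scores count as half-concordant."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = len(pos) * len(neg)
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / pairs

# Hypothetical model risk scores and death indicators (1 = died)
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]
deaths = [1,   1,   0,   1,   0,   0,   0]
print(round(c_statistic(scores, deaths), 2))
```

A value of 0.5 is chance-level discrimination and 1.0 is perfect separation, so the abstract's move from 0.68 to 0.71 after adding GLS is a modest but real gain.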
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2021
Abstract: Commencing hemodialysis (HD) with an arteriovenous access is associated with superior patient outcomes compared with a catheter, but the majority of patients in Australia and New Zealand initiate HD with a central venous catheter. This study examined patient and center factors associated with arteriovenous fistula/graft access use at HD commencement. We included all adult patients starting chronic HD in Australia and New Zealand between 2004 and 2015. Access type at HD initiation was analyzed using logistic regression. Patient-level factors included sex, age, race, body mass index (BMI), smoking status, primary kidney disease, late nephrologist referral, comorbidities, and prior RRT. Center-level factors included size, transplant capability, home HD proportion, incident peritoneal dialysis (average number of patients commencing RRT with peritoneal dialysis per year), mean weekly HD hours, average blood flow, and achievement of phosphate, hemoglobin, and weekly Kt/V targets. The study included 27,123 patients from 61 centers. Arteriovenous access use at HD commencement varied four-fold from 15% to 62% (median 39%) across centers. Incident arteriovenous access use was more likely in patients aged 51–72 years, males, and patients with a BMI of kg/m² and polycystic kidney disease, but use was less likely in patients with a BMI of .5 kg/m², late nephrologist referral, diabetes mellitus, cardiovascular disease, chronic lung disease, and prior RRT. Starting HD with an arteriovenous access was less likely in centers with the highest proportion of home HD, and no center factor was associated with higher arteriovenous access use.
Adjustment for center-level characteristics resulted in a 25% reduction in observed intercenter variability of arteriovenous access use at HD initiation compared with the model adjusted for only patient-level characteristics. This study identified several patient and center factors associated with incident HD access use, yet these factors did not fully explain the substantial variability in arteriovenous access use across centers.
Publisher: Wiley
Date: 03-2010
DOI: 10.1111/J.1440-1754.2009.01646.X
Abstract: Clinical assessment of dehydration in children is often inaccurate. We aimed to determine if a scoring system based on standardised clinical signs would reduce the variability between doctors' assessments of dehydration. A clinical scoring system was developed using seven physiological variables based on previously published research. Estimated percentage dehydration and severity scores were recorded for 100 children presenting to a Paediatric Emergency Department with symptoms of gastroenteritis and dehydration by three doctors of different seniority (resident medical officer, registrar and consultant). Agreement was measured using the intra-class correlation coefficient (ICC) for percentage ratings and total clinical scores, and kappa for individual characteristics. Estimated percentage dehydration ranged from 0% to 9% (mean 2.96%) across the three groups; total clinical scores ranged from 0 to 10 (mean 2.20). There was moderate agreement amongst clinicians for the percentage dehydration (ICC 0.40). The level of agreement on the clinical scoring system was identical (ICC 0.40). Consultants gave statistically lower scores than the other two groups (Consultant (Con) vs. Resident P = 0.001; Con vs. Registrar P = 0.013). There was a marked difference in agreement across the characteristics comprising the scoring system, from kappa 0.02 for capillary refill time to 0.42 for neurological status. The clinical scoring system used did not reduce the variability of assessment of dehydration compared to doctors' conventional methods. To reduce variability, improving education may be more important than producing a scoring system, as experience appears to be a key determinant in the assessment of a potentially dehydrated child.
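Kappa, used above to assess agreement on individual clinical signs, corrects raw percentage agreement for the agreement expected by chance alone. A minimal sketch of Cohen's kappa for two raters, using hypothetical categorical ratings rather than data from the study:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(r1) == len(r2)
    n = len(r1)
    categories = set(r1) | set(r2)
    p_observed = sum(a == b for a, b in zip(r1, r2)) / n
    p_chance = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical severity grades from two raters (0 = none, 1 = mild, 2 = moderate)
rater_a = [0, 0, 1, 1, 2, 0, 1, 2, 0, 1]
rater_b = [0, 1, 1, 0, 2, 0, 1, 2, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))
```

This chance correction is why a sign such as capillary refill time can show near-zero kappa (0.02) despite raters often agreeing by coincidence.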
Publisher: SAGE Publications
Date: 05-2017
Abstract: Preservation of residual renal function (RRF) is associated with improved survival. The aim of the present study was to identify independent predictors of RRF and urine volume (UV) in incident peritoneal dialysis (PD) patients. The study included incident PD patients who were balANZ trial participants. The primary and secondary outcomes were RRF and UV, respectively. Both outcomes were analyzed using mixed effects linear regression with demographic data in the first model and PD-related parameters included in a second model. The study included 161 patients (mean age 57.9 ± 14.1 years, 44% female, 33% diabetic, mean follow-up 19.5 ± 6.6 months). Residual renal function declined from 7.5 ± 2.9 mL/min/1.73 m² at baseline to 3.3 ± 2.8 mL/min/1.73 m² at 24 months. Better preservation of RRF was independently predicted by male gender, higher baseline RRF, higher time-varying systolic blood pressure (SBP), biocompatible (neutral pH, low glucose degradation product) PD solution, lower peritoneal ultrafiltration (UF) and lower dialysate glucose exposure. In particular, biocompatible solution resulted in 27% better RRF preservation. Each 1 L/day increase in UF was associated with 8% worse RRF preservation (p = 0.007) and each 10 g/day increase in dialysate glucose exposure was associated with 4% worse RRF preservation (p < 0.001). Residual renal function was not independently predicted by body mass index, diabetes mellitus, renin-angiotensin system inhibitors, peritoneal solute transport rate, or PD modality. Similar results were observed for UV. Common modifiable factors consistently associated with preserved RRF and residual UV were the use of biocompatible PD solutions and the achievement of higher SBP, lower peritoneal UF, and lower dialysate glucose exposure over time.
Publisher: Springer Science and Business Media LLC
Date: 10-01-2014
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2020
Abstract: An autologous arteriovenous fistula (AVF) is the preferred hemodialysis vascular access, but successful creation is hampered by high rates of AVF failure. This study aimed to evaluate patient and surgical factors associated with AVF failure to improve vascular access selection and outcomes. This is a post hoc analysis of all participants of FAVOURED, a multicenter, double-blind, multinational, randomized, placebo-controlled trial evaluating the effect of fish oil and/or aspirin in preventing AVF failure in patients receiving hemodialysis. The primary outcome of AVF failure was a composite of fistula thrombosis and/or abandonment and/or cannulation failure at 12 months post-AVF creation, and secondary outcomes included individual outcome components. Patient data (demographics, comorbidities, medications, and laboratory data) and surgical factors (surgical expertise, anesthetic, intraoperative heparin use) were examined using multivariable logistic regression analyses to evaluate associations with AVF failure. Of 536 participants, 253 patients (47%) experienced AVF failure during the study period. The mean age was 55 ± 14.4 years, 64% were male, 45% were diabetic, and 4% had peripheral vascular disease. Factors associated with AVF failure included female sex (odds ratio [OR], 1.79; 95% confidence interval [CI], 1.20 to 2.68), lower diastolic BP (OR for higher DBP, 0.85; 95% CI, 0.74 to 0.99), presence of a central venous catheter (OR, 1.49; 95% CI, 1.02 to 2.20; P = 0.04), and aspirin requirement (OR, 1.60; 95% CI, 1.00 to 2.56). Female sex, requirement for aspirin therapy, requiring hemodialysis via a central venous catheter, and lower diastolic BP were associated with higher odds of AVF failure. These associations have potential implications for vascular access planning and warrant further studies.
Publisher: Public Library of Science (PLoS)
Date: 23-08-2019
Publisher: Elsevier BV
Date: 2018
DOI: 10.1053/J.AJKD.2017.08.018
Abstract: Advances in kidney transplantation have led to considerable improvements in short-term transplant and patient outcomes, but there are few data regarding long-term transplant outcomes in patients with vascular comorbid conditions. This study examined the association of vascular disease before transplantation with transplant and patient survival after transplantation and evaluated whether this association was modified by diabetes. All deceased donor kidney transplant recipients recorded in the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) from 1990 to 2012 were included. The exposure was vascular disease burden; the outcomes were all-cause mortality and overall transplant loss. Potential interactions between diabetes and vascular disease for mortality and transplant loss were assessed using 2-way interaction terms. Of 7,128 recipients with 58,120 patient-years of follow-up, 854 (12.0%) and 263 (3.7%) had vascular disease at 1 and 2 or more sites, respectively. Overall survival for recipients without vascular disease 15 years after transplantation was 65%, compared with 35% and 22% among recipients with vascular disease at 1 and 2 or more sites, respectively (P < 0.001). Compared with recipients without vascular disease, adjusted HRs for mortality and transplant loss were 1.75 (95% CI, 1.39-2.20; P < 0.001) and 1.61 (95% CI, 1.30-1.99; P < 0.001), respectively, for recipients with 2 or more vascular diseases. Among recipients without diabetes but with 2 or more vascular diseases, adjusted HRs for mortality and transplant loss were 2.10 (95% CI, 1.56-2.82; P < 0.001) and 1.84 (95% CI, 1.39-2.42; P < 0.001), respectively, compared with those without vascular disease. Similar associations were not observed for recipients with diabetes mellitus (P for interaction < 0.001). Limitations include selection bias and unmeasured residual confounders, such as the severity/extent of comorbid conditions.
The impact of vascular disease on long-term outcomes was modified by the presence of diabetes, whereby excess risks for death and transplant loss are more apparent in recipients without diabetes.
Publisher: BMJ
Date: 03-2006
Publisher: Oxford University Press (OUP)
Date: 05-04-2012
Abstract: It is uncertain whether particular clones causing invasive community-onset methicillin-resistant and methicillin-sensitive Staphylococcus aureus (cMRSA/cMSSA) infection differ in virulence. Invasive cMRSA and cMSSA cases were prospectively identified. Principal component analysis was used to derive an illness severity score (ISS) from clinical data, including 30-day mortality, requirement for intensive hospital support, the presence of bloodstream infection, and hospital length of stay. The mean ISS for each S. aureus clone (based on MLST) was compared with its DNA microarray-based genotype. Fifty-seven cMRSA and 50 cMSSA infections were analyzed. Ten clones caused 82 (77%) of these infections and had an ISS calculated. The enterotoxin gene cluster (egc) and the collagen adhesin (cna) gene were found in 4 of the 5 highest-ranked clones (ST47-MSSA, ST30-MRSA-IV[2B], ST45-MSSA, and ST22-MRSA-IV[2B]) compared with none and 1 of the lowest 5 ranked clones, respectively. cMSSA clones caused more severe infection than cMRSA clones. The lukF/lukS Panton-Valentine leukocidin (PVL) genes did not directly correlate with the ISS, being present in the second, fourth, and 10th most virulent clones. The clinical severity of invasive cMRSA and cMSSA infection is likely to be attributable to the isolates' entire genotype rather than a single putative virulence determinant such as PVL.
Publisher: Wiley
Date: 16-02-2006
DOI: 10.1111/J.1460-9592.2005.01827.X
Abstract: Optimal analgesia for children undergoing adenotonsillectomy for obstructive sleep apnea (OSA) is controversial. Tramadol may represent a superior choice over morphine in this group, with a potential to cause less postoperative sedation and respiratory depression. Optimal perioperative analgesia may allow expensive and time-consuming preoperative work-up and postoperative monitoring to be rationalized. Sixty-six children were randomized to receive either perioperative tramadol or morphine in this double blinded, prospective, controlled trial. Postoperative sedation, pain, respiratory events, and vomiting were then compared between groups. There was no significant difference between the two groups in sedation scores 1 h after arrival in recovery (P = 0.24) or at any other time up to 6 h postoperation. There was also no evidence of a difference between the groups in pain scores up to 6 h postoperation. There were fewer episodes of postoperative desaturation (<94%) in the tramadol group up to 3 h postoperation, with 26% fewer episodes in the tramadol group during the second hour postoperation (P = 0.02). Overall, there was a trend toward fewer desaturation episodes in the tramadol group. Tramadol may be a suitable drug for children undergoing adenotonsillectomy for OSA. Further work is required to investigate this.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 21-07-2016
DOI: 10.2215/CJN.00190116
Abstract: Emerging evidence from recently published observational studies and an individual patient data meta-analysis shows that mammalian target of rapamycin (mTOR) inhibitor use in kidney transplantation is associated with increased mortality. Therefore, all-cause mortality and allograft loss were compared between use and nonuse of mTOR inhibitors in patients from Australia and New Zealand, where mTOR inhibitor use has been greater because of heightened skin cancer risk. Our longitudinal cohort study included 9,353 adult patients who underwent 9,558 kidney transplants between January 1, 1996 and December 31, 2012 and had allograft survival ≥1 year. Risk factors for all-cause death and all-cause and death-censored allograft loss were analyzed by multivariable Cox regression using mTOR inhibitor use as a time-varying covariate. Additional analyses evaluated mTOR inhibitor use at fixed time points of baseline and 1 year. Patients using mTOR inhibitors were more likely to be white and to have a history of pretransplant cancer. Over a median follow-up of 7 years, 1,416 (15%) patients died, and 2,268 (24%) allografts were lost. There was a higher risk of all-cause mortality with time-varying mTOR inhibitor use (hazard ratio, 1.47; 95% confidence interval, 1.23 to 1.76) as well as in the fixed time model analyses comparing mTOR inhibitor use at baseline (hazard ratio, 1.54; 95% confidence interval, 1.22 to 1.93) and at 1 year (hazard ratio, 1.63; 95% confidence interval, 1.32 to 2.01). Time-varying mTOR inhibitor use was associated with a higher risk of death because of malignancy (hazard ratio, 1.37; 95% confidence interval, 1.09 to 1.71).
There were no statistically significant differences in the risk of all-cause (hazard ratio, 0.98; 95% confidence interval, 0.85 to 1.12) and death-censored (hazard ratio, 0.85; 95% confidence interval, 0.69 to 1.03) allograft loss between the mTOR inhibitor use and nonuse groups in either the time-varying model or the fixed time models. mTOR inhibitor use was associated with a higher risk of all-cause mortality but not allograft loss.
Publisher: Oxford University Press (OUP)
Date: 05-2015
Publisher: Springer Science and Business Media LLC
Date: 27-06-2015
Publisher: Elsevier BV
Date: 09-2014
DOI: 10.1016/J.NUMECD.2014.04.006
Abstract: There is a growing body of evidence supporting the nephrovascular toxicity of indoxyl sulphate (IS) and p-cresyl sulphate (PCS). Nonetheless, a comprehensive description of how these toxins accumulate over the course of chronic kidney disease (CKD) is lacking. This cross-sectional observational study included a convenience sample of 327 participants with kidney function categorised as normal, non-dialysis CKD and end-stage kidney disease (ESKD). Participants underwent measurement of serum total and free IS and PCS, assessment of cardiovascular history and structure (carotid intima-media thickness [cIMT]), and assessment of endothelial function (brachial artery reactivity [flow-mediated dilation (BAR-FMD) and glyceryl trinitrate-mediated dilation (BAR-GTN)]). Across the CKD spectrum there was a significant increase in both total and free IS and PCS and in their free fractions, with the highest levels observed in the ESKD population. Within each CKD stage, concentrations of PCS, total and free, were significantly greater than those of IS (all p < 0.01). Both IS and PCS, free and total, correlated with BAR-GTN (r ranging from -0.33 to -0.44) and cIMT (r = 0.19 to 0.21), even after adjusting for traditional risk factors (all p < 0.01). Further, all toxins were independently associated with the presence of cardiovascular disease (all p < 0.02). More advanced stages of CKD are associated with progressive increases in total and free serum IS and PCS, as well as increases in their free fractions. Total and free serum IS and PCS were independently associated with structural and functional markers of cardiovascular disease. Studies of therapeutic interventions targeting these uraemic toxins are warranted.
Publisher: Wiley
Date: 22-05-2018
DOI: 10.1002/JSO.25037
Abstract: New-onset chronic kidney disease (CKD) following surgical management of kidney tumors is common. This study evaluated risk factors for new-onset CKD after nephrectomy for T1a renal cell carcinoma (RCC) in an Australian population-based cohort. There were 551 RCC patients from the Australian states of Queensland and Victoria included in this study. The primary outcome was new-onset CKD (eGFR <60 mL/min/1.73 m²). Risk factors for new-onset CKD included tumour size greater than 20 mm, radical nephrectomy, lower hospital caseloads (<20 cases/year), and rural place of residence. The associations between rural place of residence and low center volume were a consequence of higher radical nephrectomy rates. Risk factors for CKD after nephrectomy generally relate to worse baseline health or the likelihood of undergoing radical nephrectomy. Surgeons in rural centres and hospitals with low caseloads may benefit from formalized integration with specialist centers for continued professional development and case-conferencing, to assist in management decisions.
Publisher: SAGE Publications
Date: 11-2015
Abstract: The HONEYPOT study recently reported that daily exit-site application of antibacterial honey was not superior to nasal mupirocin prophylaxis for preventing overall peritoneal dialysis (PD)-related infection. This paper reports a secondary outcome analysis of the HONEYPOT study with respect to exit-site infection (ESI) and peritonitis microbiology, infectious hospitalization and technique failure. A total of 371 PD patients were randomized to daily exit-site application of antibacterial honey plus usual exit-site care (N = 186) or intranasal mupirocin prophylaxis (in nasal Staphylococcus aureus carriers only) plus usual exit-site care (control, N = 185). Groups were compared on rates of organism-specific ESI and peritonitis, peritonitis- and infection-associated hospitalization, and technique failure (PD withdrawal). The mean peritonitis rates in the honey and control groups were 0.41 (95% confidence interval [CI] 0.32 – 0.50) and 0.41 (95% CI 0.33 – 0.49) episodes per patient-year, respectively (incidence rate ratio [IRR] 1.01, 95% CI 0.75 – 1.35). When specific causative organisms were examined, no differences were observed between the groups for gram-positive (IRR 0.99, 95% CI 0.66 – 1.49), gram-negative (IRR 0.71, 95% CI 0.39 – 1.29), culture-negative (IRR 2.01, 95% CI 0.91 – 4.42), or polymicrobial peritonitis (IRR 1.08, 95% CI 0.36 – 3.20). Exit-site infection rates were 0.37 (95% CI 0.28 – 0.45) and 0.33 (95% CI 0.26 – 0.40) episodes per patient-year for the honey and control groups, respectively (IRR 1.12, 95% CI 0.81 – 1.53). No significant differences were observed between the groups for gram-positive (IRR 1.10, 95% CI 0.70 – 1.72), gram-negative (IRR 0.85, 95% CI 0.46 – 1.58), culture-negative (IRR 1.88, 95% CI 0.67 – 5.29), or polymicrobial ESI (IRR 1.00, 95% CI 0.40 – 2.54). Times to first peritonitis-associated and first infection-associated hospitalization were similar in the honey and control groups.
The rates of technique failure (PD withdrawal) due to PD-related infection were not significantly different between the groups. Compared with standard nasal mupirocin prophylaxis, daily topical exit-site application of antibacterial honey resulted in comparable rates of organism-specific peritonitis and ESI, infection-associated hospitalization, and infection-associated technique failure in PD patients.
Publisher: Elsevier BV
Date: 06-2016
DOI: 10.1016/J.KINT.2016.02.014
Abstract: Patient outcomes in end-stage kidney disease (ESKD) secondary to lupus nephritis have not been well described. To help define this, we compared dialysis and transplant outcomes of patients with ESKD due to lupus nephritis with those of patients with all other causes. All patients diagnosed with ESKD who commenced renal replacement therapy in Australia and New Zealand (1963-2012) were included. Clinical outcomes were evaluated in both a contemporary cohort (1998-2012) and the entire 50-year cohort. Of 64,160 included patients, 744 had lupus nephritis as the primary renal disease. For the contemporary cohort of 425 patients with lupus nephritis, the 5-year dialysis patient survival rate was 69%. Of 176 contemporary patients with lupus nephritis who received their first renal allograft, the 5-year patient, overall renal allograft, and death-censored renal allograft survival rates were 95%, 88%, and 93%, respectively. Patients with lupus nephritis had worse dialysis patient survival (adjusted hazard ratio 1.33, 95% confidence interval 1.12-1.58) and renal transplant patient survival (adjusted hazard ratio 1.87, 95% confidence interval 1.18-2.98), but comparable overall renal allograft survival (adjusted hazard ratio 1.19, 95% confidence interval 0.84-1.68) and death-censored renal allograft survival (adjusted hazard ratio 1.05, 95% confidence interval 0.68-1.62) compared with ESKD controls. Similar results were found in the entire cohort and when using competing-risks analysis. Thus, ESKD due to lupus nephritis was associated with worse dialysis and transplant patient survival but comparable renal allograft survival compared with other causes of ESKD.
Publisher: Wiley
Date: 15-06-2017
DOI: 10.1111/NEP.12815
Abstract: Pentoxifylline has been shown to increase haemoglobin levels in patients with chronic kidney disease (CKD) and erythropoietin-stimulating agent (ESA)-hyporesponsive anaemia in the Handling Erythropoietin Resistance with Oxpentifylline multicentre, double-blind, randomized controlled trial. The present sub-study evaluated the effects of pentoxifylline on the iron-regulatory hormone hepcidin in patients with ESA-hyporesponsive CKD. This sub-study included 13 patients in the pentoxifylline arm (400 mg daily) and 13 in the matched placebo arm. Hepcidin-25 was measured by ultra-performance liquid chromatography/quadrupole time-of-flight mass spectrometry following isolation from patient serum. Serum hepcidin-25, serum iron biomarkers, haemoglobin and ESA dosage were compared within and between the two groups. Hepcidin-25 concentration at 4 months adjusted for baseline did not differ significantly between pentoxifylline- and placebo-treated patients (adjusted mean difference (MD) -7.9 nmol/l, P = 0.114), although the between-group difference translated into a >25% reduction of circulating hepcidin-25 due to pentoxifylline compared with the placebo baseline. In paired analysis, serum hepcidin-25 levels were significantly decreased at 4 months compared with baseline in the pentoxifylline group (-5.47 ± 2.27 nmol/l, P < 0.05) but not in the placebo group (2.82 ± 4.29 nmol/l, P = 0.24). Pentoxifylline did not significantly alter serum ferritin (MD 55.4 mcg/l), transferrin saturation (MD 4.04%), the dosage of ESA (MD -9.93 U/kg per week) or haemoglobin concentration (MD 5.75 g/l). The reduction of circulating hepcidin-25 due to pentoxifylline did not reach statistical significance; however, the magnitude of the difference suggests that pentoxifylline may be a clinically and biologically meaningful modulator of hepcidin-25 in dialysis patients with ESA-hyporesponsive anaemia.
Publisher: SAGE Publications
Date: 2015
DOI: 10.1186/S40697-015-0066-5
Abstract: Erythropoiesis stimulating agent (ESA)-resistant anemia is common in chronic kidney disease (CKD). The aim of this study was to evaluate the determinants of severity of ESA resistance in patients with CKD and primary ESA resistance. This was a secondary analysis of a randomized controlled trial (the Handling Erythropoietin Resistance with Oxpentifylline, HERO) including 53 adult patients with CKD stage 4 or 5 and primary ESA-resistant anemia (hemoglobin ≤120 g/L; ESA resistance index [ERI] ≥1.0 IU/kg/week/gHb for erythropoietin or ≥0.005 μg/kg/week/gHb for darbepoetin; no cause for ESA resistance identified). Measurements included iron studies, parathyroid hormone, albumin, liver enzymes, phosphate, and markers of oxidative stress and inflammation. Participants were divided into tertiles of ERI. Multinomial logistic regression was used to analyse the determinants of ERI tertiles. All patients, except one, were receiving dialysis for end-stage kidney disease. The mean ± SD ERI values in the low (n = 18), medium (n = 18) and high (n = 17) ERI tertiles were 1.4 ± 0.3, 2.3 ± 0.2 and 3.5 ± 0.8 IU/kg/week/gHb, respectively (P < 0.001). There were no significant differences observed in age, gender, ethnicity, cause of kidney disease, diabetes, iron studies, parathyroid hormone, albumin, liver enzymes, phosphate or markers of oxidative stress and inflammation between the ERI tertiles. The median [inter-quartile range] serum alkaline phosphatase concentrations in the low, medium and high ERI tertiles were 89 [64, 121], 99 [76, 134] and 148 [87, 175] U/L, respectively (P = 0.054). There was a weak but statistically significant association between ERI and serum alkaline phosphatase (R² = 0.06, P = 0.03). Using multinomial logistic regression, the risk of being in the high ERI tertile relative to the low ERI tertile increased with increasing serum alkaline phosphatase levels (P = 0.02). No other variables were significantly associated with ERI.
Limitations included the small sample size; bone-specific alkaline phosphatase, other markers of bone turnover and bone biopsies were not evaluated. Serum alkaline phosphatase was associated with severity of ESA resistance in ESA-resistant patients with CKD. Large prospective studies are required to confirm this association. (Trial registration: Australian New Zealand Clinical Trials Registry 12608000199314)
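The ERI thresholds and tertile values quoted above normalise the weekly ESA dose to body weight and haemoglobin. The abstract does not spell out the formula, but the reported units (IU/kg/week/gHb, with haemoglobin in g/L) imply the conventional calculation, sketched here in Python with a hypothetical patient:

```python
def eri(weekly_dose_iu, weight_kg, hb_g_per_l):
    """ESA resistance index: weekly erythropoietin dose normalised
    to body weight and haemoglobin (IU/kg/week per g/L Hb)."""
    return weekly_dose_iu / weight_kg / hb_g_per_l

# Hypothetical 70 kg dialysis patient on 10,000 IU/week with Hb 100 g/L:
# 10,000 / 70 / 100 ≈ 1.43, which would sit in the study's low ERI tertile
# (mean 1.4 ± 0.3) while still meeting the ≥1.0 entry criterion.
print(round(eri(10_000, 70, 100), 2))
```

The patient values are illustrative only; the thresholds (≥1.0 for erythropoietin, ≥0.005 μg/kg/week/gHb for darbepoetin) are those stated in the abstract.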
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 09-2009
DOI: 10.1097/ALN.0B013E3181B27C18
Abstract: Monitoring changes in electrical skin conductance has been described as a potentially useful tool for the detection of acute pain in adults. The aim of this study was to test the method in pediatric patients. A total of 180 postoperative pediatric patients aged 1-16 yr were included in this prospective, blinded observational study. After arrival in the recovery unit, pain was assessed by standard clinical pain assessment tools (1-3 yr: Face Legs Activity Cry Consolability Scale, 4-7 yr: Revised Faces Scale, 8-16 yr: Visual Analogue Scale) at various time points during their stay in the recovery room. The number of fluctuations in skin conductance per second (NFSC) was recorded simultaneously. Data from 165 children were used for statistical analysis, and 15 patients were excluded. The area under the Receiver Operating Characteristic curve for predicting moderate to severe pain from NFSC was 0.82 (95% confidence interval 0.79-0.85). Over all age groups, an NFSC cutoff value of 0.13 was found to distinguish between no or mild versus moderate or severe pain with a sensitivity of 90% and a specificity of 64% (positive predictive value 35%, negative predictive value 97%). NFSC accurately predicted the absence of moderate to severe pain in postoperative pediatric patients. The measurement of NFSC may therefore provide an additional tool for pain assessment in this group of patients. However, more research is needed to prospectively investigate the observations made in this study and to determine the clinical applicability of the method.
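The step in the abstract above from 90% sensitivity and 64% specificity to a 35% positive and 97% negative predictive value follows from Bayes' rule once a prevalence of moderate-to-severe pain is fixed. A minimal Python sketch; the ~18% prevalence is an assumption inferred from the reported values, not stated in the abstract:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Convert test characteristics plus prevalence into
    positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence              # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# An assumed prevalence of ~18% reproduces the abstract's figures:
ppv, npv = predictive_values(0.90, 0.64, 0.18)
print(round(ppv, 2), round(npv, 2))  # ≈ 0.35 and 0.97
```

This also illustrates why the NFSC cutoff is better at ruling pain out than ruling it in: with moderate prevalence, the modest specificity drags the PPV down while the high sensitivity keeps the NPV high.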
Publisher: AME Publishing Company
Date: 10-2017
Publisher: Wiley
Date: 28-01-2016
DOI: 10.1111/NEP.12573
Abstract: The Fish oils and Aspirin in Vascular access OUtcomes in REnal Disease (FAVOURED) trial investigated whether 3 months of omega-3 polyunsaturated fatty acids, either alone or in combination with aspirin, would effectively reduce primary access failure of de novo arteriovenous fistulae. This report presents the baseline characteristics of all study participants, examines whether study protocol amendments successfully increased recruitment of a broader and more representative haemodialysis cohort, including patients already receiving aspirin, and contrasts Malaysian participants with those from Australia, New Zealand and the United Kingdom (UK). This international, randomized, double-blind, placebo-controlled trial included patients older than 19 years with stage 4 or 5 chronic kidney disease currently receiving, or planned within 12 months to receive, haemodialysis. Participants (n = 568) were overweight (28.6 ± 7.3 kg/m²), relatively young (54.8 ± 14.3 years), and predominantly male (63%) with a high prevalence of diabetes mellitus (46%) but a low rate of ischaemic heart disease (8%). Sixty-one percent were planned for lower arm arteriovenous fistula creation. Malaysian participants (n = 156) were younger (51.8 ± 13.6 years vs 57.1 ± 14.2 years, P < 0.001) with a higher prevalence of diabetes mellitus (65% vs 43%, P < 0.001), but less ischaemic heart disease (5% vs 14%, P < 0.01) compared with the combined Australian, New Zealand and UK cohort (n = 228). Protocol modifications allowing for inclusion of patients receiving aspirin increased the prevalence of co-morbidities compared with the original cohort. The FAVOURED study participants, while mostly similar to patients in contemporary national registry reports and comparable recent clinical trials, were on average younger and had less ischaemic heart disease. These differences were reduced as a consequence of including patients already receiving aspirin.
Publisher: Frontiers Media SA
Date: 27-09-2017
Publisher: SAGE Publications
Date: 05-2018
Abstract: Obesity is increasingly prevalent worldwide, and a greater number of patients initiate renal replacement therapy with a high body mass index (BMI). This study aimed to evaluate the association between BMI and organism-specific peritonitis. All adult patients who initiated peritoneal dialysis (PD) in Australia between January 2004 and December 2013 were included. Data were accessed through the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. The co-primary outcomes of this study were time to first organism-specific peritonitis episode, specifically gram-positive, gram-negative, culture-negative, and fungal. Secondary outcomes were individual rates of organism-specific peritonitis for the same 4 microbiological categories. There were 7,381 peritonitis episodes among the 8,343 incident PD patients evaluated. After multivariable adjustment, obese patients (BMI 30 – 34.9 kg/m²) had an increased risk of fungal peritonitis (adjusted hazard ratio [HR] 1.69, 95% confidence interval [CI] 1.18 – 2.42), very obese patients (BMI ≥ 35 kg/m²) had a significantly higher risk of gram-positive peritonitis (HR 1.15, 95% CI 1.02 – 1.30), while both obese and very obese patients experienced significantly higher risks of gram-negative peritonitis (HR 1.29, 95% CI 1.11 – 1.50 and HR 1.30, 95% CI 1.08 – 1.57, respectively) compared with patients with normal BMI (20 – 24.9 kg/m²). Obesity and severe obesity were independently associated with increased incidence rate ratios of all forms of organism-specific peritonitis, with a non-significant trend for the association between severe obesity and gram-negative peritonitis. Among Australian patients, obesity and severe obesity are associated with significantly increased rates of gram-positive, gram-negative, fungal, and culture-negative peritonitis.
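The BMI bands used in the study above can be made concrete with a small helper. The 20 – 24.9 reference, 30 – 34.9 obese and ≥35 very obese bands are taken from the abstract; the labels for the remaining ranges are assumed from standard WHO-style categories and are not stated in the source:

```python
def bmi_category(weight_kg, height_m):
    """Classify body mass index (kg/m²) into the bands used in the study;
    the "below reference" and "overweight" labels are assumed fill-ins."""
    bmi = weight_kg / height_m ** 2
    if bmi < 20:
        return "below reference"
    if bmi < 25:
        return "normal (reference)"
    if bmi < 30:
        return "overweight"
    if bmi < 35:
        return "obese"
    return "very obese"

# A hypothetical 95 kg, 1.70 m patient: BMI ≈ 32.9 kg/m², i.e. "obese",
# the band the study associates with higher fungal and gram-negative risk.
print(bmi_category(95, 1.70))
```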
Publisher: Wiley
Date: 26-11-2018
DOI: 10.1111/SDI.12658
Abstract: In patients receiving hemodialysis, the provision of safe and effective vascular access using an arteriovenous fistula or graft is regarded as a critical priority by patients and health professionals. Vascular access failure is associated with morbidity and mortality, such that strategies to prevent these outcomes are essential. Inadequate vascular remodeling and neointimal hyperplasia resulting in stenosis and frequently thrombosis are critical to the pathogenesis of access failure. Systemic medical therapies with pleiotropic effects including antiplatelet agents, omega-3 polyunsaturated fatty acids (fish oils), statins, and inhibitors of the renin-angiotensin-aldosterone system (RAAS) may reduce vascular access failure by promoting vascular access maturation and reducing stenosis and thrombosis through antiproliferative, antiaggregatory, anti-inflammatory and vasodilatory effects. Despite such promise, the results of retrospective analyses and randomized controlled trials of these agents on arteriovenous fistula and graft outcomes have been mixed. This review describes the current understanding of the pathogenesis of arteriovenous fistula and graft failure; the biological effects of antiplatelet agents, fish oil supplementation, RAAS blockers and statins that may be beneficial in improving vascular access survival; and results from clinical trials that have investigated the effect of these agents on arteriovenous fistula and graft outcomes, and explores future therapeutic approaches combining these agents with novel treatment strategies.
Publisher: Wiley
Date: 2009
DOI: 10.1111/J.1440-1754.2008.01427.X
Abstract: Aim: Accurate assessment of nutritional status is a vital aspect of caring for individuals with anorexia nervosa (AN), and body mass index (BMI) is considered an appropriate and easy-to-use tool. Because of the intense fear of weight gain, some individuals may attempt to mislead the physician. Mid‐upper arm circumference (MUAC) is a simple, objective method of assessing nutritional status. The setting is an eating disorders clinic in a tertiary paediatric hospital in Western Australia. The aim of this study is to evaluate how well MUAC correlates with BMI in adolescents with AN. Methods: Prospective observational study to evaluate nutritional status in adolescents with AN. Results: Fifty‐five adolescents aged 12–17 years with AN were assessed between January 1, 2004 and January 1, 2006. MUAC was highly correlated with BMI (r = 0.79, P < 0.001) and individuals with MUAC ≥20 cm rarely required hospitalisation (negative predictive value 93%). Conclusions: MUAC reflects nutritional status as defined by BMI in adolescents with AN. Lack of consistency between longitudinal measurements of BMI and MUAC should be viewed suspiciously and should prompt a more detailed nutritional assessment.
Publisher: Wiley
Date: 12-10-2012
DOI: 10.1111/J.1460-9592.2011.03717.X
Abstract: Children treated with stimulant medications for the behavioral management of attention deficit hyperactivity disorder (ADHD) may present for elective surgery. Stimulant medication is often continued until the morning of surgery to optimize perioperative behavior. It is unknown whether such stimulant drug ingestion can affect cerebral arousal and alter depth of anesthesia. A clinically relevant alteration in measured depth of anesthesia could form the basis for an evidence-based recommendation that children taking stimulant medications require a change in the amount of anesthetic delivered or that they require routine monitoring of depth of anesthesia. Thirty-four ASA 1 and 2 children aged between 5 and 16 years, presenting for elective day case surgery, were recruited. Seventeen had a diagnosis of ADHD and had taken stimulant medication on the day of surgery, and 17 were controls. A standard inhalational induction of anesthesia using air, oxygen, and sevoflurane by facemask was performed and maintained for 10 min at 1 MAC end-tidal sevoflurane. During this time, no other stimulus was applied to the patient. Bispectral index (BIS) and other markers of depth of anesthesia were recorded after 10 min. Children in both groups were of similar ages and weights. There was a higher percentage of boys in the stimulants group. Baseline physiological parameters were similar in both groups. After induction and equilibration for 10 min of anesthesia at 1 MAC end-tidal sevoflurane, there was no significant difference in BIS or clinical markers of depth of anesthesia. Children taking stimulant medication for ADHD, and who ingest medication on the day of surgery, do not appear to have altered BIS or depth of anesthesia at 1 MAC of sevoflurane. These results do not support a recommendation for a change in anesthetic practice for children having ingested stimulants up to the day of surgery, either in terms of increasing the amount of anesthetic given or monitoring of depth.
Publisher: SAGE Publications
Date: 07-2017
Abstract: Few studies have examined the relationship between socio-economic position (SEP) and peritoneal dialysis (PD) outcomes, particularly at a country level. The aim of this study was to investigate the relationships between SEP, technique failure, and mortality in PD patients undertaking treatment in Australia. The study included all Australian non-indigenous incident PD patients between January 1, 1997, and December 31, 2014, using Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry data. The SEP was assessed by quartiles of postcode-based Australian Socio-Economic Indexes for Areas (SEIFA), including Index of Relative Socio-economic Advantage and Disadvantage (IRSAD – primary index), Index of Relative Socio-economic Disadvantage (IRSD), Index of Economic Resources (IER), and Index of Education and Occupation (IEO). Technique and patient survival were evaluated by multivariable Cox proportional hazards survival analyses. The study included 9,766 patients (mean age 60.6 ± 15 years, 57% male, 38% diabetic). Using multivariable Cox regression, no significant association was observed between quartiles of IRSAD and technique failure (30-day definition p = 0.65, 180-day definition p = 0.68). Similar results were obtained using competing risks regression. However, higher SEP, defined by quartiles of IRSAD, was associated with better patient survival (Quartile 1: reference; Quartile 2: adjusted hazard ratio [HR] 0.96, 95% confidence interval [CI] 0.86 – 1.06; Quartile 3: HR 0.87, 95% CI 0.77 – 0.99; Quartile 4: HR 0.86, 95% CI 0.76 – 0.97). Similar results were found when IRSD was analyzed, but results were no longer statistically significant for IER and IEO. In Australia, where there is universal free healthcare, SEP was not associated with PD technique failure in non-indigenous PD patients. Higher SEP was generally associated with improved patient survival.
Publisher: Wiley
Date: 17-10-2012
Publisher: Elsevier BV
Date: 11-2005
DOI: 10.1016/J.BURNS.2005.05.001
Abstract: The ideal analgesic agent for burns wound dressings in paediatric patients would be one that is easy to administer, well tolerated, and produces rapid onset of analgesia with a short duration of action and minimal side-effects to allow rapid resumption of activities and oral intake. We compared our current treatment of oral morphine to intranasal fentanyl in an attempt to find an agent closer to the ideal. A randomised double blind two-treatment crossover study comparing intranasal administration of fentanyl (INF) to orally administered morphine (OM). Children with burn injury aged up to 15 years and weighing 10-75 kg were included. Primary end-point was pain scores. Secondary end-points were time to resumption of age-appropriate activities, time to resumption of fluid intake, sedation and cooperation. Routine observations and vital signs were also recorded. Twenty-four patients were studied with a median age of 4.5 years (interquartile range 1.8-9.0 years) and a median weight of 18.4 kg (interquartile range 12.9-33.2 kg). Mean pain difference scores (OM-INF) ranged from -0.500 (95% CI = -1.653 to 0.653) at baseline to -0.625 (95% CI = -1.863 to 0.613) for a retrospective rating of worst pain experienced during the dressing procedure. All measurements were within a pre-defined range of equivalent efficacy. The median time to resumption of fluid intake was 108 min (range 44-175 min) with OM and 140 min (range 60-210 min) with INF. These differences were not statistically significant. Fewer patients experienced mild side-effects with INF compared to OM (n = 5 versus n = 10). No patients experienced depressed respirations or oxygen saturations. Intranasal fentanyl was shown to be equivalent to oral morphine in the provision of analgesia for burn wound dressing changes in this cohort of paediatric patients.
It was concluded that intranasal fentanyl is a suitable analgesic agent for use in paediatric burns dressing changes, either by itself or in combination with oral morphine as a titratable top-up agent.
Publisher: University of Buckingham Press
Date: 18-11-2013
Abstract: Aim: This paper is a report of the comparison of perceptions of family-centred care by hospital staff (nurses, doctors and allied health staff) and parents of hospitalised children in two Australian tertiary paediatric hospitals. Background: Family-centred care is an accepted approach to caring for children and their families in hospital. Previous publications have been inconsistent, ranging from promoting its benefits and integration into practice, to reporting operational difficulties, to proposing that family-centred care may not be working at all. An evaluation of the model of care is long overdue. Method: A quantitative comparative cross-sectional survey was used to collect data in 2010 from a convenience sample of 309 parents of hospitalised children and 519 staff. Participants rated 20 items grouped into three subscales of respect, collaboration and support. Findings: Both parents and staff responses were positive, and parents had significantly higher subscale scores for respect, collaboration and support (all p < .0001). Parents' responses for 19 of the 20 items were significantly higher than for staff. The item on which parents and staff did not differ was concerned with being able to question recommendations about the child's treatment. Conclusion: Both parents and staff had positive perceptions of their family-centred care experiences. Parents' perception of their experience was more positive than staff perceptions of their delivery of family-centred care in hospital. Whilst the positive experience by both consumers and healthcare providers is an important finding, reasons for differences, in particular in supporting parents, require further examination.
Publisher: Oxford University Press (OUP)
Date: 05-2015
Publisher: BMJ
Date: 07-2022
DOI: 10.1136/BMJOPEN-2022-063061
Abstract: (1) Identify the healthcare settings in which goal attainment scaling (GAS) has been used as an outcome measure in randomised controlled trials. (2) Describe how GAS has been implemented by researchers in those trials. Scoping review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews approach. PubMed, CENTRAL, EMBASE and PsycINFO were searched through 28 February 2022. English-language publications reporting on research where adults in healthcare settings were recruited to a randomised controlled trial where GAS was an outcome measure. Two independent reviewers completed data extraction. Data collected were summarised using descriptive statistics. Of 1,838 articles screened, 38 studies were included. These studies were most frequently conducted in rehabilitation (58%) and geriatric medicine (24%) disciplines/populations. Sample sizes ranged from 8 to 468, with a median of 51 participants (IQR: 30–96). A number of studies did not report on implementation aspects such as the personnel involved (26%), the training provided (79%) and the calibration and review mechanisms (87%). Not all trials used the same scale, with 24% varying from the traditional five-point scale. Outcome attainment was scored in various manners (self-report: 21%; observed: 26%; both self-report and observed: 8%; not reported: 45%), and the calculation of GAS scores differed between trials (raw score: 21%; T score: 47%; other: 21%; not reported: 66%). GAS has been used as an outcome measure across a wide range of disciplines and trial settings. However, there are inadequacies and inconsistencies in how it has been applied and implemented. Developing a cross-disciplinary practical guide to support a degree of standardisation in its implementation may be beneficial in increasing the reliability and comparability of trial results. CRD42021237541.
Publisher: Public Library of Science (PLoS)
Date: 12-06-2012
Publisher: Oxford University Press (OUP)
Date: 05-2015
Publisher: Elsevier BV
Date: 2011
Publisher: Springer Science and Business Media LLC
Date: 20-10-2022
Publisher: Wiley
Date: 02-12-2012
DOI: 10.1111/JPC.12011
Abstract: The aim of this study is to directly compare published prediction tools with triage nurse (TN) predictions within a defined paediatric population. A prospective observational study carried out over a week in May 2010 in the Emergency Department (ED) at Princess Margaret Hospital for Children in Perth, Western Australia. TNs predicted which patients would be admitted to hospital at the time of ED presentation. Data required for the other prediction tools (paediatric early warning score [PEWS], triage category, and the Pediatric Risk of Admission score [PRISA] and PRISA II) were obtained from the notes following the patient's ED attendance. A total of 1223 patients presented during the study week; 91 patients were excluded, and a total of 946 patients (83.6%) had TN predictions and were included in the analysis. TN predictions were compared against a PEWS ≥ 4, triage category 1, 2 and 3, PRISA ≥ 9 and PRISA II ≥ 2. TNs had the highest prediction accuracy (87.7%), followed by an elevated PEWS (82.9%) and a triage category of 1, 2 or 3 (82.9%). The PRISA and PRISA II scores had an accuracy of 80.1% and 79.7%, respectively. When compared with validated prediction tools, the TN is the most accurate predictor of need to admit. This study provides valuable information in planning efficient flow of patients through the ED.
Publisher: Elsevier BV
Date: 09-2012
DOI: 10.1053/J.AJKD.2012.04.026
Abstract: The association between blood pressure and cardiovascular outcomes in patients undergoing hemodialysis remains controversial. This may relate in part to the technique and device used and the timing of the blood pressure measurement in relation to the hemodialysis procedure. Emerging evidence indicates that standardized hemodialysis unit blood pressure measurements or measurements obtained at home, either by the patient or using an ambulatory blood pressure monitor, may offer advantages over routine hemodialysis unit blood pressure measurements for determining cardiovascular risk and treatment. This review discusses the available evidence and implications for clinicians and clinical trials.
Publisher: Springer Science and Business Media LLC
Date: 24-11-2011
Start Date: 2016
End Date: 2020
Funder: National Health and Medical Research Council