ORCID Profile
0000-0002-9851-002X
Current Organisations
Monash University, Monash Health
Publisher: AMPCo
Date: 08-2012
DOI: 10.5694/MJA11.11329
Abstract: The publication of the Australasian Creatinine Consensus Working Group's position statements in 2005 and 2007 resulted in automatic reporting of estimated glomerular filtration rate (eGFR) with requests for serum creatinine concentration in adults, facilitated the unification of units of measurement for creatinine and eGFR, and promoted the standardisation of assays. New advancements and continuing debate led the Australasian Creatinine Consensus Working Group to reconvene in 2010. The working group recommends that the method of calculating eGFR should be changed to the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) formula, and that all laboratories should report eGFR values as a precise figure up to at least 90 mL/min/1.73 m². Age-related decision points for eGFR in adults are not recommended, as although an eGFR < 60 mL/min/1.73 m² is very common in older people, it is nevertheless predictive of significantly increased risks of adverse clinical outcomes and should not be considered a normal part of ageing. If using eGFR for drug dosing, body size should be considered, in addition to referring to the approved product information. For drugs with a narrow therapeutic index, therapeutic drug monitoring or a valid marker of drug effect should be used to individualise dosing. The CKD-EPI formula has been validated as a tool to estimate GFR in some populations of non-European ancestry living in Western countries. Pending publication of validation studies, the working group also recommends that Australasian laboratories continue to automatically report eGFR in Aboriginal and Torres Strait Islander peoples. The working group concluded that routine calculation of eGFR is not recommended in children and youth, or in pregnant women. Serum creatinine concentration (preferably using an enzymatic assay for paediatric patients) should remain the standard test of kidney function in these populations.
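The CKD-EPI estimate the working group recommends can be sketched in a few lines. This is an illustrative implementation of the published 2009 CKD-EPI creatinine equation, not code from the working group; it assumes serum creatinine in mg/dL (Australasian laboratories report µmol/L, which must be divided by 88.4 first):

```python
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """2009 CKD-EPI creatinine equation: eGFR in mL/min/1.73 m^2.

    scr_mg_dl: serum creatinine in mg/dL (umol/L divided by 88.4).
    """
    kappa = 0.7 if female else 0.9          # sex-specific creatinine knot
    alpha = -0.329 if female else -0.411    # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# A 50-year-old man with creatinine 0.9 mg/dL (~80 umol/L) has eGFR ~ 99
print(round(ckd_epi_egfr(0.9, 50, female=False)))
```

Reporting precise values up to at least 90 mL/min/1.73 m², as recommended, simply means the laboratory prints this number rather than truncating it to ">60" or ">90".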
Publisher: Public Library of Science (PLoS)
Date: 26-03-2019
Publisher: Elsevier BV
Date: 03-2019
DOI: 10.1053/J.AJKD.2018.10.005
Abstract: The number of people with diabetes and end-stage kidney disease (ESKD) is increasing worldwide, but it is unknown whether this indicates an increasing risk for ESKD in people with diabetes. We examined temporal trends in the incidence of ESKD within the Australian population with diabetes from 2002 to 2013. Follow-up study using a national health care services registry. Registrants with type 1 or type 2 diabetes in Australia's National Diabetes Services Scheme (NDSS). Age, sex, indigenous status, diabetes type, and calendar year. Incidence of ESKD (dialysis or kidney transplantation) or death ascertained using the Australian and New Zealand Dialysis and Transplant Registry and the Australian national death index. NDSS registrants were followed up from 2002 or date of registration until onset of ESKD, death, or December 31, 2013. The incidence of ESKD in type 1 diabetes was calculated only in those younger than 55 years. Among 1,375,877 registrants between 2002 and 2013, a total of 9,977 experienced incident ESKD, representing an overall incidence of ESKD in people with diabetes of 10.0 (95% CI, 9.8-10.2) per 10,000 person-years. Among those with type 1 diabetes, the age-standardized annual incidence was stable during the study period. Among those with type 2 diabetes, the incidence increased in nonindigenous people (annual percentage change, 2.2%; 95% CI, 0.4%-4.1%), with the greatest increases in those younger than 50 years and those older than 80 years. No significant change over time was observed in indigenous people, although the adjusted incidence rate ratio for indigenous versus nonindigenous people was 4.03 (95% CI, 3.68-4.41). Limitations include the lack of covariates such as comorbid conditions, medication use, measures of quality of care, and baseline kidney function. The age-standardized annual incidence of ESKD increased in Australia from 2002 to 2013 for nonindigenous people with type 2 diabetes but was stable for people with type 1 diabetes.
Efforts to prevent the development of ESKD, especially among indigenous Australians and those with early-onset type 2 diabetes, are warranted.
Publisher: Wiley
Date: 24-11-2010
DOI: 10.1111/J.1542-4758.2010.00505.X
Abstract: Patients on extended hours (>15 h/week) hemodialysis may be at a higher risk of deficiency of water-soluble vitamins than conventional (≤15 h/week) hemodialysis patients due to their increased weekly hours of dialysis. We compared serum levels of the water-soluble vitamins in a group of extended and conventional hours hemodialysis patients. Predialysis serum levels of vitamin C, vitamin B12, thiamine, pyridoxine, and folate were measured in 52 patients: 26 in the extended group and 26 in the conventional group. Information on patients' intake of vitamin supplements and dialysis regimen was obtained. Data were log transformed due to the skewed distribution of the results. Median vitamin C levels were significantly lower in the extended group (0.30 vs. 1.14 mg/dL, P<0.001), with 7 patients having a level <0.18 mg/dL. Thiamine levels were also lower in the extended group (median 211 vs. 438.5 nmol/L, P=0.0005). However, extended patients had higher levels of pyridoxine (23.2 vs. 11.1 ng/mL, P=0.03). Vitamin B12 and folate levels were not significantly different between the groups. There was significant variability in vitamin supplement prescription in both groups, and dietary data were not obtained. This study showed a high incidence of vitamin C deficiency in extended hours hemodialysis patients, suggesting that supplementation is warranted. It also supports an ongoing role for multivitamin supplementation in conventional hemodialysis patients.
Publisher: Elsevier BV
Date: 05-2016
DOI: 10.1016/J.KINT.2019.01.017
Abstract: Globally, the number of patients undergoing maintenance dialysis is increasing, yet throughout the world there is significant variability in the practice of initiating dialysis. Factors such as availability of resources, reasons for starting dialysis, timing of dialysis initiation, patient education and preparedness, dialysis modality and access, as well as varied "country-specific" factors significantly affect patient experiences and outcomes. As the burden of end-stage kidney disease (ESKD) has increased globally, there has also been a growing recognition of the importance of patient involvement in determining the goals of care and decisions regarding treatment. In January 2018, KDIGO (Kidney Disease: Improving Global Outcomes) convened a Controversies Conference focused on dialysis initiation, including modality choice, access, and prescription. Here we present a summary of the conference discussions, including identified knowledge gaps, areas of controversy, and priorities for research. A major novel theme represented during the conference was the need to move away from a "one-size-fits-all" approach to dialysis and provide more individualized care that incorporates patient goals and preferences while still maintaining best practices for quality and safety. Identifying and including patient-centered goals that can be validated as quality indicators in the context of diverse health care systems to achieve equity of outcomes will require alignment of goals and incentives between patients, providers, regulators, and payers that will vary across health care jurisdictions.
Publisher: Springer Science and Business Media LLC
Date: 03-07-2012
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2015
DOI: 10.2215/CJN.00840115
Publisher: Elsevier BV
Date: 04-2004
DOI: 10.1053/J.AJKD.2003.11.023
Abstract: Native arteriovenous fistula (AVF) prevalence varies significantly among different populations and countries. Physician practice patterns may have a strong influence on access type. We assessed differences in vascular access practice patterns across all treating centers in New Zealand. Adult (age ≥ 18 years) patients on hemodialysis therapy in the year ending September 30, 2001, were studied from the Australian and New Zealand Dialysis and Transplant Association Registry. Multinomial logistic regression was used to assess factors associated with arteriovenous graft (AVG) and catheter use. Of 772 patients available for analysis, 461 patients (60%) underwent dialysis using an AVF; 122 patients (16%), an AVG; and 189 patients (24%), a catheter. On multivariable analysis, female sex (odds ratio, 5.92; P < 0.001), coronary artery disease (odds ratio, 1.89; P < 0.05), body mass index greater than 30 (odds ratio, 2.55; P < 0.05), and age (odds ratio, 1.03 per year increase; P < 0.001) were associated with an increased likelihood of AVG use. Maori and Pacific Island patients were less likely to use an AVG compared with Caucasians (odds ratio, 0.47; P < 0.05). Predictors of greater likelihood of catheter use were female sex (odds ratio, 3.9; P < 0.001), late referral (odds ratio, 1.60; P < 0.05), and age (odds ratio, 1.02 per year increase; P < 0.001). Proportions of access types varied significantly across the 7 treating centers (AVFs, 32% to 86%; AVGs, 2% to 32%; catheters, 9% to 33%; P < 0.001). After adjusting for confounding factors, significant differences in access types persisted between some centers and the national average. Certain patient characteristics, such as age and female sex, are associated strongly with increased AVG and catheter use. However, the significant variation in risk across centers suggests more attention needs to be given to physician practice patterns to increase AVF use rates.
Publisher: Wiley
Date: 28-01-2016
DOI: 10.1111/NEP.12573
Abstract: The Fish oils and Aspirin in Vascular access OUtcomes in REnal Disease (FAVOURED) trial investigated whether 3 months of omega-3 polyunsaturated fatty acids, either alone or in combination with aspirin, will effectively reduce primary access failure of de novo arteriovenous fistulae. This report presents the baseline characteristics of all study participants, examines whether study protocol amendments successfully increased recruitment of a broader and more representative haemodialysis cohort, including patients already receiving aspirin, and contrasts Malaysian participants with those from Australia, New Zealand and the United Kingdom (UK). This international, randomized, double-blind, placebo-controlled trial included patients older than 19 years with stage 4 or 5 chronic kidney disease currently receiving, or planned within 12 months to receive, haemodialysis. Participants (n = 568) were overweight (28.6 ± 7.3 kg/m²), relatively young (54.8 ± 14.3 years), and predominantly male (63%), with a high prevalence of diabetes mellitus (46%) but a low rate of ischaemic heart disease (8%). Sixty-one percent were planned for lower arm arteriovenous fistula creation. Malaysian participants (n = 156) were younger (51.8 ± 13.6 years vs 57.1 ± 14.2 years, P < 0.001) with a higher prevalence of diabetes mellitus (65% vs 43%, P < 0.001), but less ischaemic heart disease (5% vs 14%, P < 0.01), compared with the combined Australian, New Zealand and UK cohort (n = 228). Protocol modifications allowing for inclusion of patients receiving aspirin increased the prevalence of co-morbidities compared with the original cohort. The FAVOURED study participants, while mostly similar to patients in contemporary national registry reports and comparable recent clinical trials, were on average younger and had less ischaemic heart disease. These differences were reduced as a consequence of including patients already receiving aspirin.
Publisher: Wiley
Date: 22-04-2019
DOI: 10.1111/TID.13076
Abstract: The aim of this study was to determine whether a composite score of simple immune biomarkers and clinical characteristics could predict severe infections in kidney transplant recipients. We conducted a prospective study of 168 stable kidney transplant recipients who underwent measurement of lymphocyte subsets, immunoglobulins, and renal function at baseline and were followed up for 2 years for the development of any severe infections, defined as infection requiring hospitalization. A point score was developed to predict severe infection based on logistic regression analysis of factors in baseline testing. Fifty-nine (35%) patients developed severe infection, 36 (21%) had two or more severe infections, and 3 (2%) died of infection. A group of 19 (11%) patients had the highest predicted infectious risk (>60%), as predicted by the score. Predictive variables were mycophenolate use, graft function, CD4+ cell number, and natural killer cell number. The level of immunosuppression score had an area under the receiver operating characteristic curve of 0.75 (95% CI: 0.67-0.83). Our level of immunosuppression score for predicting the development of severe infection over 2 years has sufficient prognostic accuracy for identification of high-risk patients. These data can inform research that examines strategies to reduce the risks of infection.
Publisher: Wiley
Date: 22-10-2021
DOI: 10.1111/NEP.13782
Publisher: Elsevier BV
Date: 07-2008
DOI: 10.1053/J.AJKD.2008.02.296
Abstract: Uremic toxicity is a major concern in the dialysis population. There is keen interest in techniques that increase the removal of larger uremic molecules. We examined the short-term impact of a new, more porous, super-flux Helixone membrane (FX-E) versus the conventional high-flux Helixone membrane (FX-60) on beta(2)-microglobulin (beta2M) reduction and nutritional and inflammatory parameters. Randomized, double-blind, crossover, pilot trial. A single freestanding dialysis center. 30 stable hemodialysis patients. Patients were treated with FX-60 and FX-E membranes for a treatment period of 6 weeks each, with a 2-week washout period in between. Primary outcome was change in beta2M concentrations from baseline to end of treatment. Serum samples were obtained predialysis and postdialysis at 0, 2, and 6 weeks, and dialysate albumin samples were collected continuously throughout dialysis sessions. Mean postdialysis beta2M concentrations at the end of 6 weeks of treatment were 6.73 mg/L for FX-E versus 8.22 mg/L for FX-60, which was significantly lower overall by 0.69 mg/L (95% confidence interval [CI], -1.09 to -0.29; P = 0.001). beta2M reduction ratios were greater overall with FX-E by 4.83% (95% CI, 2.78 to 6.89; P < 0.001), with mean values of 57% for FX-60 versus 66% for FX-E at the end of treatment. Median dialysate albumin loss with FX-E was 1.23 g (range, 0.22 to 4.83 g) compared with 0.17 g (range, 0.0017 to 2.69 g) with FX-60, which was greater by 1.52 g (95% CI, 1.11 to 1.93; P < 0.001). Serum albumin concentrations were slightly lower with FX-E by 0.1 g/dL (0.55 g/L; 95% CI, -1.04 to -0.07; P = 0.03), but prealbumin concentrations were not significantly different at 8.53 mg/L (95% CI, -23.76 to 6.71; P = 0.3). There were no differences in inflammatory cytokine concentrations or small-solute removal. Short-term pilot study.
In this stable dialysis population, removal of beta2M was more efficient with the Helixone super-flux FX-E membrane, with only a small decrease in albumin concentrations despite increased albumin loss. Large trials with longer treatment periods are required to evaluate the impact of the FX-E membrane on clinical outcomes.
Publisher: Wiley
Date: 08-2004
Publisher: Wiley
Date: 31-10-2003
DOI: 10.1046/J.1445-5994.2003.00420.X
Abstract: Plasma homocysteine is elevated in patients with end-stage renal disease (ESRD) and is a risk factor for cardiovascular disease. Folic acid has been shown to partially reduce homocysteine levels in dialysis patients. It is not known whether vitamin B12 reduces homocysteine independent of folic acid in patients who are not vitamin B12 deficient. To determine whether 1 mg vitamin B12 lowers homocysteine in stable, chronic, haemodialysis patients independent of folic acid. Twenty-eight haemodialysis patients were randomized to receive three doses of 1 mg vitamin B12 or 1 mL saline placebo in a double-blind fashion at 1-month intervals. Fasting plasma total homocysteine, folic acid, red-cell folate, vitamin B12 and haemoglobin levels were determined prior to each dose and 4 weeks after the final injection. The study was powered to detect a 30% reduction in homocysteine over the 3 months. The two groups were well matched with respect to total homocysteine, folic acid, red-cell folate and vitamin B12 levels. Serum vitamin B12 levels were significantly higher in the treatment group compared to placebo (217.7 pmol/L; 95% confidence interval (CI), 103.0-332.5; P < 0.001) at the end of the trial, but homocysteine levels were not significantly different (3.08 micromol/L; 95% CI, -4.44 to 10.61; P = 0.406). The administration of intramuscular vitamin B12 over a 3-month period does not result in any reduction of plasma homocysteine levels in haemodialysis patients independent of folate status; however, reductions of less than 30% cannot be excluded by the present study. High-dose folic acid remains the treatment of choice in reducing homocysteine, but whether this results in a reduction in cardiovascular events remains to be determined.
Publisher: Elsevier BV
Date: 07-2011
DOI: 10.1053/J.AJKD.2011.01.024
Abstract: Cardiovascular mortality rates in the general population have decreased over time. We hypothesized that cardiovascular mortality rates in dialysis patients, which are higher than in the general population, have not decreased as much as those in the general population. Comparison of registry data with population data. Data for prevalent Australian patients for whom dialysis was the first renal replacement therapy were obtained from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry for 1992-2005. Data for a comparable Australian general population were obtained from the Australian Bureau of Statistics. Cardiovascular mortality rates per 100 person-years were calculated from ANZDATA Registry data, and age-specific relative risks were calculated relative to cardiovascular mortality rates in the general population. Included in this cohort were 34,741 dialysis patients with 93,112 person-years of follow-up and 7,267 cardiovascular deaths. Cardiovascular mortality rates decreased over time in the general population and in dialysis patients older than 55 years. In patients aged 55-64 years, cardiovascular mortality rates were 9.0 (95% CI, 7.8-10.3) per 100 person-years in 1992-1994 and 6.4 (95% CI, 5.5-7.3) in 2004-2005; corresponding relative risks were 32.4 (95% CI, 28.2-37.2) and 52.0 (95% CI, 45.2-59.9), respectively. The corresponding cardiovascular mortality rates for dialysis patients aged 65-74 years were 11.6 (95% CI, 10.4-13.0) and 8.3 (95% CI, 7.4-9.3); relative risks were 12.9 (95% CI, 11.6-14.5) and 20.8 (95% CI, 18.7-23.2). Using negative binomial regression, the relative risk associated with dialysis compared with the general population increased over time (P for interaction = 0.001). Causes of death used to define cardiovascular mortality were not coded using identical systems in the ANZDATA Registry and the Australian population.
Despite decreasing cardiovascular mortality rates in some dialysis patients, the excess cardiovascular risk compared with the general population is increasing.
Publisher: Elsevier BV
Date: 05-2020
Publisher: Elsevier BV
Date: 02-2014
DOI: 10.1053/J.AJKD.2013.08.025
Abstract: A predictive histologic classification recently was proposed to determine the prognostic value of kidney biopsy in patients with antineutrophil cytoplasmic antibody-associated renal vasculitis (AAV). A dual-purpose retrospective observational cohort study to assess the reproducibility of the new classification and clinical variables that predict outcomes. 169 consecutive patients with AAV were identified; 145 were included in the reproducibility study, and 120, in the outcomes study. Kidney biopsy specimens were classified according to the predominant glomerular lesion: focal, mixed, crescentic, and sclerotic. An assessment of tubular atrophy also was performed. The primary outcome was time to end-stage kidney disease or all-cause mortality, modeled using Cox regression analysis. Secondary outcomes were estimated glomerular filtration rate and requirement for renal replacement therapy. For the reproducibility study, the overall inter-rater reliability of the classification demonstrated variability among 3 histopathologists (intraclass correlation coefficient, 0.48; 95% CI, 0.38-0.57; κ statistic = 0.46). Although agreement was high in the sclerotic group (κ=0.70), it was less consistent in other groups (κ=0.51, κ=0.47, and κ=0.23 for crescentic, focal, and mixed, respectively). For the clinical outcomes study, patients with sclerotic patterns of glomerular injury displayed the worst outcomes. Patients with focal (HR, 0.26; 95% CI, 0.12-0.58; P=0.001), crescentic (HR, 0.33; 95% CI, 0.16-0.69; P=0.003), and mixed (HR, 0.39; 95% CI, 0.18-0.81; P=0.01) patterns of injury had lower risk of the primary outcome. Tubular atrophy correlated with outcome, and advanced injury was associated with worse outcomes (HR, 5.9; 95% CI, 2.25-15.47; P<0.001). Level of kidney function at presentation strongly predicted outcome (HR per 10-mL/min/1.73 m² increase in estimated glomerular filtration rate, 0.63; 95% CI, 0.46-0.81; P<0.001). A limitation was data availability, given the retrospective nature of the study.
Reproducibility of the classification was seen only in patients with sclerotic patterns of glomerular injury. Sclerotic pattern of glomerular injury, advanced chronic interstitial injury, and decreased kidney function all predicted poor outcomes.
Publisher: Elsevier BV
Date: 02-2021
Publisher: Wiley
Date: 07-12-2014
DOI: 10.1111/SDI.12322
Abstract: Since their inception in the 1960s, home-based dialysis therapies have been viable alternatives to conventional thrice-weekly in-center hemodialysis. In spite of this, uptake of these therapies has been steadily declining over past decades, with utilization varying globally depending on training support, funding models, and prevailing nephrologist beliefs. In the Australian context, home dialysis (predominantly peritoneal dialysis and extended hours nocturnal hemodialysis) is now again increasing in popularity, with enthusiasm driven not only by evidence of an array of physiological and psychological patient benefits but also by significant economic advantage: critical in the current climate, where dialysis therapies in Australia take approximately $1 billion per year from the healthcare budget. When assessing the significant advantages of home-based therapies, it is important to consider not only the increasing body of evidence around improved survival but also that for dramatically better health-related quality of life, decreased economic burden and the overall benefits of undertaking treatment in the home. With patient-centered care an increasingly important aspect of our decision-making paradigm, home-based dialysis should be considered as the default option in all patients transitioning to renal replacement therapy.
Publisher: American Public Health Association
Date: 07-2008
Abstract: Objectives. We sought to determine whether an elevated burden of chronic kidney disease is found among disadvantaged groups living in the United States, Australia, and Thailand. Methods. We used data on participants 35 years or older for whom a valid serum creatinine measurement was available from studies in the United States, Thailand, and Australia. We used logistic regression to analyze the association of income, education, and employment with the prevalence of chronic kidney disease (estimated glomerular filtration rate <60 mL/min/1.73 m²). Results. Age- and gender-adjusted odds of having chronic kidney disease were increased 86% for US Whites in the lowest income quartile versus the highest quartile (odds ratio [OR] = 1.86; 95% confidence interval [CI] = 1.27, 2.72). Odds were increased 2 times and 6 times, respectively, among unemployed (not retired) versus employed non-Hispanic Black and Mexican American participants (OR = 2.89; 95% CI = 1.53, 5.46 and OR = 6.62; 95% CI = 1.94, 22.64, respectively). Similar associations were not evident for the Australian or Thai populations. Conclusions. Higher kidney disease prevalence among financially disadvantaged groups in the United States should be considered when chronic kidney disease prevention and management strategies are created. This approach is less likely to be of benefit to the Australian and Thai populations.
Publisher: Oxford University Press (OUP)
Date: 26-08-2015
DOI: 10.1093/AJE/KWV090
Abstract: In the application of marginal structural models to compare time-varying treatments, it is rare that the hierarchical structure of a data set is accounted for or that the impact of unmeasured confounding on estimates is assessed. These issues often arise when analyzing data sets drawn from clinical registries, where patients may be clustered within health-care providers, and the amount of data collected from each patient may be limited by design (e.g., to reduce costs or encourage provider participation). We compared the survival of patients undergoing treatment with various dialysis types, where some patients switched dialysis modality during the course of their treatment, by estimating a marginal structural model using data from the Australia and New Zealand Dialysis and Transplant Registry, 2003-2011. The number of variables recorded by the registry is limited, and patients are clustered within the dialysis centers responsible for their treatment, so we assessed the impact of accounting for unmeasured confounding or clustering on estimated treatment effects. Accounting for clustering had limited impact, and only unreasonable levels of unmeasured confounding would have changed conclusions about treatment comparisons. Our analysis serves as a case study in assessing the impact of unmeasured confounding and clustering in the application of marginal structural models.
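The weighting at the heart of a marginal structural model can be illustrated with a toy calculation. Assuming each patient's per-interval treatment probabilities have already been estimated (typically from pooled logistic models; the function below is a hypothetical helper for exposition, not the authors' code), the stabilized weight is the running product of numerator over denominator probabilities:

```python
def stabilized_weights(num_probs: list[float], den_probs: list[float]) -> list[float]:
    """Cumulative stabilized inverse-probability-of-treatment weights
    for one patient across follow-up intervals.

    num_probs[k]: P(observed treatment at interval k | treatment history)
    den_probs[k]: P(observed treatment at interval k | treatment history
                  and time-varying covariates)
    Returns the stabilized weight at the end of each interval.
    """
    weights, w = [], 1.0
    for num, den in zip(num_probs, den_probs):
        w *= num / den  # each interval multiplies in one ratio
        weights.append(w)
    return weights

# A patient whose covariates made their observed treatment half as likely
# as history alone predicts is up-weighted to stand in for similar patients.
print(stabilized_weights([0.8, 0.9], [0.4, 0.9]))
```

Patients whose weights stay near 1 contribute roughly as themselves; extreme weights flag the near-positivity violations and unmeasured-confounding sensitivity the abstract discusses.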
Publisher: Wiley
Date: 12-09-2016
DOI: 10.1111/NEP.12688
Abstract: There remains debate on which dialysis modality offers better survival outcomes for patients. We compare the survival of patients undergoing home haemodialysis (HD) with a permanent vascular access, facility HD with a permanent vascular access, facility HD with a central venous catheter or peritoneal dialysis. We considered adult patients from the Australia and New Zealand Dialysis and Transplant Registry who commenced dialysis between 1 October 2003 and 31 December 2011. Patients were followed until death, transplant, loss to follow-up or 31 December 2011. Marginal structural models for mortality were used to account for time-varying treatment, comorbidities and baseline covariates. Unmeasured differences between treatment groups may remain even after adjustment for measured differences, so the potential effects of unmeasured confounding were explicitly modelled. There were 20,191 patients who underwent ≥90 days of dialysis (median 2.25 years, interquartile range 1-3.75 years). There were significant differences in age, gender, comorbidities and other variables between treatment groups at baseline. Thirty per cent of patients had at least one treatment change. Relative to facility HD with permanent access, the risk of death for home HD patients with a permanent access was lower in the first year (at 9 months: hazard ratio 0.41, 95% CI 0.25-0.67, adjusted for all baseline covariates). Findings were robust to unmeasured confounding within plausible ranges. Relative to facility HD with permanent vascular access, home HD conferred better survival prospects, while peritoneal dialysis was associated with a higher risk and facility HD with a catheter the highest risk, especially within the first year of dialysis.
Publisher: AMPCo
Date: 02-2008
DOI: 10.5694/J.1326-5377.2008.TB01585.X
Abstract: To explore awareness of the causes of kidney disease and recollection of kidney function testing in a cohort of Australian adults. An interviewer-administered cross-sectional survey, conducted from October to December 2004 as a nested study within the 5-year follow-up phase of the Australian Diabetes, Obesity and Lifestyle Study (AusDiab); 852 subjects who attended a testing site in New South Wales were interviewed. Responses to the questions "What sort of things do you think may lead to a person developing kidney disease?" and "Has a doctor or health care worker ever tested your kidney function, outside of the AusDiab study?" Respondents most commonly believed that kidney disease was caused by alcohol misuse or poor diet, with few identifying diabetes or high blood pressure. Awareness of risk factors was no greater in respondents identified as having chronic kidney disease (CKD). A third of respondents with CKD recalled having undergone a test of kidney function within the previous 2 years, while another third replied they had never had their kidney function tested. Of participants with previously diagnosed diabetes or treated hypertension, 54.1% and 32.0%, respectively, reported having their kidney function tested within the previous 2 years. Knowledge of risk factors for kidney disease and recall of kidney function testing were both limited, even among subgroups of the cohort who were at greatest risk of CKD. Prevention efforts may benefit from public and patient education to improve recognition of risk factors for CKD.
Publisher: Oxford University Press (OUP)
Date: 30-10-2008
DOI: 10.1093/NDT/GFM660
Abstract: Vascular calcification (VC) and arterial stiffness are major contributors to cardiovascular (CV) disease in chronic kidney disease (CKD). Both are independent predictors of CV mortality and are inversely correlated with bone mineral density (BMD). Few studies have addressed the extent of VC in the pre-dialysis CKD population, with associated measurements of BMD and arterial compliance. We report cross-sectional data on 48 patients with CKD (GFR 17-55 ml/min) assessing the prevalence of VC and its associations. All patients had computed tomography (CT) scans through the abdominal aorta and superficial femoral arteries (SFAs) to determine VC, pulse wave velocity (PWV) measurement using a SphygmoCor device (AtCor PWV Inc., Westmead, Australia) to assess arterial stiffness, and dual-energy X-ray absorptiometry (DEXA) scans to determine BMD, as well as serum markers of renal function and mineral metabolism. Patients (71% male, 54% diabetic) had a median age of 64.5 years. Mean estimated GFR was 35.1 +/- 10 ml/min. Mean PWV was 10.0 +/- 4.5 m/s and mean aortic VC score was 421.5 +/- 244 Hounsfield units, with 90% of subjects having some aortic VC present. In univariate linear regression analysis, aortic VC correlated positively with age (r = 0.50, P < 0.001), triglycerides (r = 0.47, P = 0.002) and PWV (r = 0.33, P = 0.03). There was also greater VC with declining renal function (r = -0.28, P = 0.05). There was no significant association between VC and serum markers of mineral metabolism; however, phosphate and Ca x P correlated positively with PWV (r = 0.35, P = 0.02 and r = 0.36, P = 0.02, respectively). There was also a positive association between PWV and triglycerides (P = 0.008), and a trend towards greater PWV with increasing age (P = 0.09). In multivariate regression analysis, only increasing age and triglyceride levels were significantly associated with aortic VC and PWV.
Mean spine and femoral T-scores on DEXA were 0.48 and -1.31, respectively, with 13% of subjects having a femoral T-score <-2.5 (osteoporotic range). SFA VC inversely correlated with femoral T-scores (r = -0.43, P = 0.004); however, there was a positive (likely false) association between spine T-scores and aortic VC (r = 0.37, P = 0.01), related to the limitation of vertebral DEXA in CKD. There is a high prevalence of VC in pre-dialysis CKD patients, worse with increasing age, triglycerides and reducing renal function. Correlation exists between VC and PWV, and determination of one or both may be useful for CKD patient CV risk assessment. Femoral BMD is inversely associated with SFA VC, but measurement of vertebral BMD by DEXA is unreliable in CKD patients with aortic VC.
Publisher: Wiley
Date: 17-03-2013
DOI: 10.1111/CTR.12105
Abstract: To determine factors associated with early pancreatic allograft thrombosis (EPAT). Thrombosis is the leading non-immunological cause of early pancreatic allograft failure. Multiple risk factors have been postulated. We hypothesized that recipient perioperative hypotension was a major risk factor and evaluated the correlation of this and other parameters with EPAT. We retrospectively reviewed the records of the 118 patients who received a pancreatic allograft at our center between October 1992 and January 2010. Multiple donor and recipient parameters were analyzed as associates of EPAT by univariate and multivariate analysis. There were 12 episodes of EPAT, an incidence of 10.2%. On univariate analysis, EPAT was associated with perioperative hypotension, vasopressor use, and neuropathy in the recipient (p ≤ 0.04 for all). On multivariate analysis corrected for age, sex, and peripheral vascular disease, only vasopressor use retained a significant association with EPAT, with a hazard ratio of 8.74 (CI 1.11-68.9; p = 0.04). Factors associated with vasopressor use included recipient ischemic heart disease, peripheral vascular disease, retinopathy or neuropathy, and any surgical complication. Significant hypotension, measured by the need for perioperative vasopressor use, was associated with EPAT, suggesting that maintenance of higher perfusion pressures may help avoid this complication.
Publisher: Elsevier BV
Date: 12-2017
DOI: 10.1016/J.KINT.2017.05.011
Abstract: The Fragility Index is a tool for testing the robustness of randomized controlled trial results for dichotomous outcomes. It describes the minimum number of individuals in whom changing an event status would render a statistically significant result nonsignificant. Here we identified all randomized controlled trials in five nephrology and five general journals from 2005-2014. A total of 127 randomized controlled trials reporting at least one dichotomous statistically significant outcome (p less than 0.05) were included and the Fragility Index was calculated. Twenty randomized controlled trials had a Fragility Index of zero and were excluded from further analysis. Linear regression was performed to assess factors associated with Fragility Indices, stratified by primary or secondary outcomes. The median sample size was 134 (range 22-11,506) with 36 (range 5-2743) total events. The median Fragility Index was three (range 1-166), indicating that in half the trials the addition of three events to the treatment arm with the lowest number of events rendered the result nonsignificant. For primary outcome studies, a doubling in total event number and sample size significantly increased the geometric mean Fragility Index by 52% and 42%, respectively. Compared to trials reporting a p value of 0.05 to 0.01, those reporting 0.01 to 0.001 or less than 0.001 had a significant 57% and 472% increase in the median Fragility Index, respectively. Forty-one percent had a Fragility Index less than the total loss to follow-up, indicating the potential to change a trial result had all individuals been accounted for. Thus, our study highlights the need for larger randomized controlled trials with accurate accounting for loss to follow-up to adequately guide evidence-based practice.
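The Fragility Index defined in this abstract is straightforward to compute for a 2x2 trial table. A minimal standard-library-only sketch follows; the function names are illustrative (not from the paper), and the authors' exact-test and tie-handling conventions may differ slightly:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]],
    summing all tables at least as extreme as the observed one."""
    r1, r2, k, n = a + b, c + d, a + c, a + b + c + d
    def pmf(x):
        # hypergeometric probability of x events in the first arm
        return comb(r1, x) * comb(r2, k - x) / comb(n, k)
    p_obs = pmf(a)
    lo, hi = max(0, k - r2), min(k, r1)
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs * (1 + 1e-9))

def fragility_index(events_a, total_a, events_b, total_b, alpha=0.05):
    """Minimum number of event-status changes (events added to the arm with
    the fewest events, per the definition above) needed to make a significant
    result nonsignificant. Returns None if not significant to begin with."""
    a, b = events_a, total_a - events_a
    c, d = events_b, total_b - events_b
    if fisher_exact_p(a, b, c, d) >= alpha:
        return None
    flips = 0
    while fisher_exact_p(a, b, c, d) < alpha:
        if a <= c:               # flip one non-event to an event in the
            a, b = a + 1, b - 1  # arm with the lower event count
        else:
            c, d = c + 1, d - 1
        flips += 1
    return flips
```

For example, a trial with 0/100 versus 20/100 events is highly significant, and the index counts how many added events in the first arm push Fisher's p above 0.05, whereas a balanced 10/100 versus 10/100 table is not significant and returns None.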
Publisher: Elsevier BV
Date: 12-2018
DOI: 10.1016/J.TRANSPROCEED.2018.07.017
Abstract: The aim of this study was to determine whether measurement of B-cell protective immunity was associated with susceptibility to sinopulmonary infection in kidney transplant recipients. A prospective cohort of 168 patients with stable graft function (median 4.1 years) underwent assessment of B-lymphocyte antigen CD19 (CD19+) cell counts. After 2 years of follow-up, 31 patients (18%) developed sinopulmonary infection. Monitoring B-cell numbers represents a simple, inexpensive means of stratifying transplant recipients' risk of sinopulmonary infection.
Publisher: Oxford University Press (OUP)
Date: 17-09-2010
DOI: 10.1093/NDT/GFP490
Abstract: Nutritional status predicts outcome in dialysis populations. Increased dialysis time and/or frequency reportedly improves nutritional status. We examined the impact of more intensive dialysis on body composition. A cross-sectional, matched study comparing home haemodialysis (HHD) patients (>15 h/week, n = 28) and conventional haemodialysis (CHD) patients (<15 h/week, n = 28), matched for age, sex, length of time on dialysis and diabetes, was performed. We measured total body protein (TBP) by in vivo neutron activation, total body fat (TBF) and skeletal muscle mass (SKMM) by dual-energy X-ray absorptiometry (DEXA), and biochemical and inflammatory parameters. Visceral (VFA) and subcutaneous fat areas (SFA) were determined from computed tomography. There was no significant difference in TBP (10.2 +/- 1.9 kg CHD versus 10.8 +/- 1.8 kg HHD, P = 0.18) or SKMM (25.6 +/- 5.6 kg CHD versus 26.2 +/- 4.2 kg HHD). TBF was not different (27.7 +/- 10.7 kg CHD versus 27.8 +/- 16.0 kg HHD); although the HHD group had greater VFA (182.0 +/- 105.6 cm(2) versus 173.8 +/- 90.1 cm(2)) and lower SFA (306.7 +/- 176.4 cm(2) versus 309.7 +/- 138.1 cm(2)), these differences were not statistically significant. Albumin concentrations were significantly higher in the HHD group (37.5 +/- 3.56 g/L versus 35.18 +/- 4.11 g/L, P = 0.03), whilst phosphate concentrations (1.57 +/- 0.41 mmol/L HHD versus 1.92 +/- 0.62 mmol/L CHD, P = 0.02) and inflammatory parameters were lower. There was a positive relationship between hours of dialysis and TBP (beta = 0.08, P = 0.03). Surrogate nutritional markers and inflammatory parameters improved with more intensive dialysis, but this was not reflected by improved body composition. Further prospective studies are required to confirm whether more intensive dialysis affects body composition, and whether this impacts on metabolic risk and clinical outcome.
Publisher: Wiley
Date: 24-07-2003
DOI: 10.1046/J.1440-1797.2003.00157.X
Abstract: Cardiovascular disease (CVD) is the major cause of mortality in dialysis patients. Aspirin, beta-blockers, statins, and angiotensin-converting enzyme (ACE) inhibitors reduce CVD mortality in the general population, as may angiotensin II receptor antagonists. The prevalence of cardiovascular risk factors and usage rates of cardioprotective agents in end-stage renal failure are unknown. A retrospective, cross-sectional study of dialysis patients was performed to compare: (i) prevalence of cardiovascular risk factors (age, hypertension, hyperlipidaemia, diabetes mellitus, and smoking), (ii) use of cardioprotective agents and (iii) prevalence of cardiovascular disease between two time-points: 1996 (n = 262) versus 2001 (n = 369). We found an increase in the risk factors of age (53.6 +/- 14.9 years in 1996 vs 58.4 +/- 14.3 in 2001, P < 0.001) and hyperlipidaemia (45% vs 51.8%, P < 0.001) between the two time-points, with a reduction in the prevalence of smoking (14.5% vs 8.1%, P = 0.016). There was no difference in the prevalence of cardiovascular disease (37.4% vs 40.7%, P = 0.44). Cardioprotective agents were underutilized, with improvement in prescribing practice between 1996 and 2001, especially in the usage of statins (21.4% vs 38.7%, P = 0.019). In conclusion, CVD is the primary cause of mortality in our dialysis patients. Although traditional cardiovascular risk factors affect the majority of the dialysis population, underutilization of cardioprotective agents is common. Proof of efficacy of these agents in this population at enormous risk is urgently required.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2014
DOI: 10.2215/CJN.03930413
Abstract: The relative merits of buttonhole (or blunt needle) versus rope ladder (or sharp needle) cannulation for hemodialysis vascular access are unclear. Clinical outcomes by cannulation method were reviewed in 90 consecutive home hemodialysis patients. Initially, patients were trained in rope ladder cannulation. From 2004 on, all incident patients were started on buttonhole cannulation, and prevalent patients were converted to this cannulation method. Coprimary outcomes were arteriovenous fistula-attributable systemic infections and a composite of arteriovenous fistula loss or requirement for surgical intervention. Secondary outcomes were total arteriovenous fistula-related infections and staff time requirements. Additionally, a systematic review evaluating infections by cannulation method was performed. Seventeen systemic arteriovenous fistula-attributable infections were documented in 90 patients who were followed for 3765 arteriovenous fistula-months. Compared with rope ladder, buttonhole was not associated with a significantly higher rate of systemic arteriovenous fistula-attributable infections (incidence rate ratio, 2.71; 95% confidence interval, 0.66 to 11.09; P = 0.17). However, use of buttonhole was associated with a significantly higher rate of total arteriovenous fistula infections (incidence rate ratio, 3.85; 95% confidence interval, 1.66 to 12.77; P = 0.03). Initial and ongoing staff time requirements were significantly higher with buttonhole cannulation. Arteriovenous fistula loss or requirement for surgical intervention was not different between cannulation methods. 
A systematic review found increased arteriovenous fistula-related infections with buttonhole compared with rope ladder in four randomized trials (relative risk, 3.34; 95% confidence interval, 0.91 to 12.20), seven observational studies comparing before with after changes (relative risk, 3.15; 95% confidence interval, 1.90 to 5.21), and three observational studies comparing units with different cannulation methods (relative risk, 3.27; 95% confidence interval, 1.44 to 7.43). Buttonhole cannulation was associated with higher rates of infectious events, increased staff support requirements, and no reduction in surgical arteriovenous fistula interventions compared with rope ladder in home hemodialysis patients. A systematic review of the published literature found that buttonhole is associated with a higher risk of arteriovenous fistula-related infections.
Publisher: Elsevier BV
Date: 02-2021
Publisher: Elsevier BV
Date: 2019
DOI: 10.1111/AJT.14900
Abstract: The aim of this study was to determine if natural killer cell number (CD3
Publisher: Wiley
Date: 25-04-2011
DOI: 10.1111/J.1440-1797.2010.01420.X
Abstract: Vascular calcification is prevalent in patients with chronic kidney disease. Abdominal aortic calcification (AAC) can be detected by X-ray, although its anatomical distribution and severity are less well documented than those of coronary calcification. Using simple radiological imaging, we aimed to assess AAC and determine its associations in prevalent Australian haemodialysis (HD) patients. Lateral lumbar X-ray of the abdominal aorta was used to determine AAC, scored according to the severity of calcific deposits at lumbar vertebral segments L1 to L4. Two radiologists determined AAC scores, by semi-quantitative measurement using a validated 24-point scale, on HD patients from seven satellite dialysis centres. Regression analysis was used to determine associations between AAC and patient characteristics. Lateral lumbar X-ray was obtained in 132 patients. Median age of patients was 69 years (range 29-90), 60% were male, 36% diabetic, and median duration of HD was 38 months (range 6-230). Calcification (AAC score ≥ 1) was present in 94.4%, with mean AAC score 11.0 ± 6.4 (median 12). Independent predictors for the presence and severity of calcification were age (P = 0.03), duration of dialysis (P = 0.04) and a history of cardiovascular disease (P = 0.009). There was no significant association between AAC and the presence of diabetes or time-averaged serum markers of mineral metabolism, lipid status and C-reactive protein. AAC detected by lateral lumbar X-ray is highly prevalent in our cohort of Australian HD patients and is associated with cardiovascular disease, increasing age and duration of HD. This semi-quantitative method of determining vascular calcification is widely available and inexpensive and may assist cardiovascular risk stratification.
Publisher: Elsevier BV
Date: 10-2017
DOI: 10.1053/J.AJKD.2016.11.029
Abstract: Survival and quality of life for patients on hemodialysis therapy remain poor despite substantial research efforts. Existing trials often report surrogate outcomes that may not be relevant to patients and clinicians. The aim of this project was to generate a consensus-based prioritized list of core outcomes for trials in hemodialysis. In a Delphi survey, participants rated the importance of outcomes using a 9-point Likert scale in round 1 and then re-rated outcomes in rounds 2 and 3 after reviewing other respondents' scores. For each outcome, the median, mean, and proportion rating as 7 to 9 (critically important) were calculated. 1,181 participants (202 [17%] patients/caregivers, 979 health professionals) from 73 countries completed round 1, with 838 (71%) completing round 3. Outcomes included in the potential core outcome set met the following criteria for both patients/caregivers and health professionals: median score ≥ 8, mean score ≥ 7.5, proportion rating the outcome as critically important ≥ 75%, and median score in the forced ranking question < 10. Patients/caregivers rated 4 outcomes higher than health professionals: ability to travel, dialysis-free time, dialysis adequacy, and washed out after dialysis (mean differences of 0.9, 0.5, 0.3, and 0.2, respectively). Health professionals gave a higher rating for mortality, hospitalization, decrease in blood pressure, vascular access complications, depression, cardiovascular disease, target weight, infection, and potassium (mean differences of 1.0, 1.0, 1.0, 0.9, 0.9, 0.8, 0.7, 0.4, and 0.4, respectively). The Delphi survey was conducted online in English and excludes participants without access to a computer and internet connection. Patients/caregivers gave higher priority to lifestyle-related outcomes than health professionals. The prioritized outcomes for both groups were vascular access problems, dialysis adequacy, fatigue, cardiovascular disease, and mortality. 
This process will inform a core outcome set that in turn will improve the relevance, efficiency, and comparability of trial evidence to facilitate treatment decisions.
Publisher: American Medical Association (AMA)
Date: 09-05-2012
Publisher: Wiley
Date: 30-06-2017
DOI: 10.1111/HDI.12452
Abstract: Higher calcium dialysate is recommended for quotidian nocturnal hemodialysis (NHD) (≥6 nights/week) to maintain bone health. It is unclear what the optimal calcium dialysate concentration should be for alternate-night NHD. We aimed to determine the effect of low calcium (LC) versus high calcium (HC) dialysate on cardiovascular and bone parameters in this population. A randomized controlled trial where participants were randomized to LC (1.3 mmol/L, n = 24) or HC dialysate (1.6 or 1.75 mmol/L, n = 26). Primary outcome was change in mineral metabolism markers. Secondary outcomes included change in vascular calcification (VC) scores [CT of the abdominal aorta (AA) and superficial femoral arteries (SFA)], pulse wave velocity (PWV), bone mineral density (BMD) and left ventricular mass index (LVMI) over 12 months. In the LC group, pre-dialysis ionised calcium decreased by 0.12 mmol/L (-0.18 to -0.06, P = 0.0001) and PTH increased by 16 pmol/L (3.5-28.5, P = 0.01) from baseline to 12 months, with no significant change in the HC group. In both groups, there was no progression of VC in the AA or SFA and no change in PWV, LVMI or BMD. At 12 months, calcimimetics were prescribed in a higher proportion of the LC than the HC group (45.5% vs. 10.5%), with a lower proportion of the HC group being prescribed calcitriol (31.5% vs. 72%). Although dialysate calcium prescription influenced biochemical parameters, it was not associated with a difference in progression of VC between the HC and LC groups. An important finding was the potential impact of alternate-night NHD in attenuating progression of VC and inducing stabilisation of LVMI and PWV.
Publisher: Wiley
Date: 09-2006
DOI: 10.5694/J.1326-5377.2006.TB00584.X
Abstract: To evaluate the outcomes of and barriers to implementing standard guidelines (Caring for Australasians with Renal Impairment [CARI]), using iron management in patients having dialysis as an example. On-site review of iron management processes at six Australian dialysis units varying in size and locality. Patients' iron indices and haemoglobin levels were obtained from the Australian and New Zealand Dialysis and Transplant Registry. Patients with chronic kidney disease who were dependent on dialysis. Processes for assessing indices of iron stores and iron supplementation were compared with target indices in the CARI guidelines. There was considerable variability among the units in achievement of haemoglobin and iron targets, with 25%-32% of patients achieving haemoglobin targets of 110-120 g/L, 30%-68% achieving ferritin targets of 300-800 microg/L, and 65%-73% achieving transferrin saturation targets of 20%-50%. Implementation barriers included lack of knowledge, lack of awareness of or trust in the CARI guideline, inability to implement the guideline, and inability to agree on a uniform unit protocol. Factors associated with achieving the CARI guideline targets included nurse-driven iron management protocols, use of an iron management decision aid, fewer nephrologists per dialysis unit, and a "proactive" (actively keeping iron levels within target range) rather than "reactive" (only reacting if iron levels are out of range) protocol. Variability in achievement of iron targets, despite the availability of a clinical practice guideline, may be explained by variability in processes of care for achieving and maintaining adequate iron parameters.
Publisher: Wiley
Date: 29-04-2019
DOI: 10.1111/NEP.13507
Abstract: Dialysis catheter-associated infections (CAI) are a serious and costly burden on patients and the health-care system. Many approaches to minimizing catheter use and infection prophylaxis are available and the practice patterns in Australia and New Zealand are not known. We aimed to describe dialysis catheter management practices in dialysis units in Australia and New Zealand. Online survey comprising 52 questions, completed by representatives from dialysis units from both countries. Of 64 contacted units, 48 (75%) responded (Australia 43, New Zealand 5), representing 79% of the dialysis population in both countries. Nephrologists (including trainees) inserted non-tunnelled catheters at 60% and tunnelled catheters at 31% of units. Prophylactic antibiotics were given with catheter insertion at 21% of units. Heparin was the most common locking solution for both non-tunnelled (77%) and tunnelled catheters (69%), with antimicrobial locks being predominant only in New Zealand (80%). Eight different combinations of exit site dressing were in use, with an antibiotic patch being most common (35%). All units in New Zealand and 84% of those in Australia undertook CAI surveillance. However, only 51% of those units were able to provide a figure for their most recent rate of catheter-associated bacteraemia per 1000 catheter days. There is wide variation in current dialysis catheter management practice and CAI surveillance is suboptimal. Increased attention to the scope and quality of CAI surveillance is warranted and further evidence to guide infection prevention is required.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-02-2023
DOI: 10.34067/KID.0000000000000076
Abstract: Health professionals resisted practice change in environments of low infection, where the perceived need to change was small. Standardizing care of central venous catheters for hemodialysis requires breaking down silos of practice to benefit all patients. Knowledge of and adherence to guidelines, formal change management, and ongoing facilitation are required to implement standardized care. Implementation of a care bundle standardizing insertion, management, and removal practices to reduce infection related to central venous catheters (CVCs) used for hemodialysis was evaluated in a stepped wedge, cluster randomized controlled trial conducted at 37 Australian hospitals providing kidney services, with no reduction in catheter-related bloodstream infection detected. This process evaluation explored the barriers, enablers, and unintended consequences of the implementation to explain the trial outcomes. Qualitative process evaluation using pre-post semistructured interviews with 38 (19 nursing and 19 medical) and 44 (25 nursing and 19 medical) Australian health professionals involved in hemodialysis CVC management. Analysis was guided by the process implementation domain of the Consolidated Framework for Implementation Research. Key influences on bundle uptake were that clinicians were open to change that was evidence-based and driven by guidelines and had a desire to improve practice and patient outcomes. However, resistance to change in environments of low infection, working in silos of practice, and a need for individualized delivery of patient education created barriers to uptake. Unintended effects of increased costs and lack of interoperability of systems for data collection were reported. Because the trial was in progress at the time of qualitative data collection, perceptions of the bundle may have been influenced by the fact that practices of participants were being observed as a part of the trial. 
This national process evaluation revealed that health professionals who reported experiencing a benefit viewed the bundle positively. Those who already provided most of the recommended care or perceived that their patient population was not included in the research evidence that underpinned the interventions, resisted the implementation of the bundle. Potentially, formal change management processes using facilitation may improve implementation of evidence-based practice. Australian New Zealand Clinical Trials Registry, ACTRN12616000830493.
Publisher: American Medical Association (AMA)
Date: 02-2017
DOI: 10.1001/JAMAINTERNMED.2016.8029
Abstract: Vascular access dysfunction is a leading cause of morbidity and mortality in patients requiring hemodialysis. Arteriovenous fistulae are preferred over synthetic grafts and central venous catheters due to superior long-term outcomes and lower health care costs, but increasing their use is limited by early thrombosis and maturation failure. ω-3 Polyunsaturated fatty acids (fish oils) have pleiotropic effects on vascular biology and inflammation, and aspirin impairs platelet aggregation, which may reduce access failure. To determine whether fish oil supplementation (primary objective) or aspirin use (secondary objective) is effective in reducing arteriovenous fistula failure. The Omega-3 Fatty Acids (Fish Oils) and Aspirin in Vascular Access Outcomes in Renal Disease (FAVOURED) study was a randomized, double-blind, controlled clinical trial that recruited participants with stage 4 or 5 chronic kidney disease from 2008 to 2014 at 35 dialysis centers in Australia, Malaysia, New Zealand, and the United Kingdom. Participants were observed for 12 months after arteriovenous fistula creation. Participants were randomly allocated to receive fish oil (4 g/d) or matching placebo. A subset (n = 406) was also randomized to receive aspirin (100 mg/d) or matching placebo. Treatment started 1 day prior to surgery and continued for 12 weeks. The primary outcome was fistula failure, a composite of fistula thrombosis and/or abandonment and/or cannulation failure, at 12 months. Secondary outcomes included the individual components of the primary outcome. Of 1415 eligible participants, 567 were randomized (359 [63%] male; 298 [53%] white; 264 [47%] with diabetes; mean [SD] age, 54.8 [14.3] years). The same proportion of fistula failures occurred in the fish oil and placebo arms (128 of 270 [47%] vs 125 of 266 [47%]; relative risk [RR] adjusted for aspirin use, 1.03; 95% CI, 0.86-1.23; P = .78). 
Fish oil did not reduce fistula thrombosis (60 [22%] vs 61 [23%]; RR, 0.98; 95% CI, 0.72-1.34; P = .90), abandonment (51 [19%] vs 58 [22%]; RR, 0.87; 95% CI, 0.62-1.22; P = .43), or cannulation failure (108 [40%] vs 104 [39%]; RR, 1.03; 95% CI, 0.83-1.26; P = .81). The risk of fistula failure was similar between the aspirin and placebo arms (87 of 194 [45%] vs 83 of 194 [43%]; RR, 1.05; 95% CI, 0.84-1.31; P = .68). Neither fish oil supplementation nor aspirin use reduced failure of new arteriovenous fistulae within 12 months of surgery. anzctr.org.au Identifier: ACTRN12607000569404.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2006
Publisher: International Scientific Information, Inc.
Date: 03-05-2021
DOI: 10.12659/AOT.930787
Publisher: Wiley
Date: 18-03-2011
DOI: 10.1111/J.1445-5994.2010.02226.X
Abstract: End-stage kidney disease registry data have reported increased mortality in patients with diabetes as compared with those without. Here we examine whether diabetes is independently associated with an increased risk of major cardiovascular events and death in patients with advanced chronic kidney disease (CKD). Data from 315 participants with CKD in the Atherosclerosis and Folic Acid Supplementation Trial (ASFAST) were assessed. Primary end-points were fatal or non-fatal cardiovascular events, including myocardial infarction, stroke, unstable angina, coronary revascularisation and peripheral vascular events, assessed both jointly and separately using Cox proportional hazards models. Twenty-three per cent reported diabetes. Median follow-up was 3.6 years. In those with diabetes, an increased risk for major cardiovascular events was observed, crude hazard ratio (HR) 2.87 (95% confidence interval (CI) 2.11-3.90). After adjustment for age, gender, smoking, systolic blood pressure, body mass index, past ischaemic heart disease and use of preventive therapies, diabetes was associated with an HR of 1.83 (1.28-2.61) for major cardiovascular events. The risk for peripheral vascular events was also increased, adjusted HR 6.31 (2.61-15.25). For all-cause death, major coronary and stroke events, the risk in those with diabetes was not significantly increased (all-cause death, adjusted HR 1.31 [95% CI 0.80-2.14]; major coronary events, adjusted HR 1.26 [95% CI 0.64-2.49]; and major stroke events, adjusted HR 1.28 [95% CI 0.55-2.99]). Diabetes significantly increases the risk of major cardiovascular events, especially peripheral vascular events, in patients with advanced CKD. Trials of multifactorial management of cardiovascular risk factors are required to determine if outcomes for this population may be improved.
Publisher: Elsevier BV
Date: 12-2013
DOI: 10.1016/J.JHIN.2013.09.005
Abstract: Vancomycin-resistant enterococci (VRE) colonization is a frequent occurrence in patients with renal failure. Understanding the impact of VRE colonization on this group of patients has considerable clinical applicability. To understand whether VRE colonization in renal patients has an impact on number of admissions to hospital, length of stay, and mortality. A retrospective case-control study of renal dialysis patients was performed between 2000 and 2010. Cases were 134 VRE-colonized patients requiring renal replacement therapy and matched controls were 137 non-colonized patients with the same baseline characteristics. Matched cases and controls were analysed for differences in number of admissions, length of stay, and mortality. There was no difference in mortality between colonized and non-colonized patients (hazard ratio: 1.14; 95% confidence interval: 0.78-1.69; P = 0.49). Length of stay for colonized patients was 7.29 days compared with 4.14 days (P < 0.001). The number of admissions for VRE-colonized patients was not significantly different compared with controls (9.34 vs 8.33, P = 0.78). VRE colonization did not increase mortality in renal patients but did contribute to increased length of stay.
Publisher: Elsevier BV
Date: 05-2018
DOI: 10.1053/J.AJKD.2017.12.003
Abstract: Vascular access outcomes in hemodialysis are critically important for patients and clinicians, but frequently are neither patient relevant nor measured consistently in randomized trials. A Standardized Outcomes in Nephrology-Hemodialysis (SONG-HD) consensus workshop was convened to discuss the development of a core outcome measure for vascular access. 13 patients/caregivers and 46 professionals (clinicians, policy makers, industry representatives, and researchers) attended. Participants advocated for vascular access function to be a core outcome based on the broad applicability of function regardless of access type, involvement of a multidisciplinary team in achieving a functioning access, and the impact of access function on quality of life, survival, and other access-related outcomes. A core outcome measure for vascular access required demonstrable feasibility for implementation across different clinical and trial settings. Participants advocated for a practical and flexible outcome measure with a simple actionable definition. Integrating patients' values and preferences was warranted to enhance the relevance of the measure. Proposed outcome measures for function included "uninterrupted use of the access without the need for interventions" and "ability to receive prescribed dialysis," but not "access blood flow," which was deemed too expensive and unreliable. These recommendations will inform the definition and implementation of a core outcome measure for vascular access function in hemodialysis trials.
Publisher: Springer Science and Business Media LLC
Date: 14-11-2018
Publisher: Oxford University Press (OUP)
Date: 08-2019
DOI: 10.1093/NDT/GFZ148
Abstract: Vascular access outcomes reported across haemodialysis (HD) trials are numerous, heterogeneous and not always relevant to patients and clinicians. This study aimed to identify critically important vascular access outcomes. Outcomes derived from a systematic review, multi-disciplinary expert panel and patient input were included in a multilanguage online survey. Participants rated the absolute importance of outcomes using a 9-point Likert scale (7–9 being critically important). The relative importance was determined by a best–worst scale using multinomial logistic regression. Open text responses were analysed thematically. The survey was completed by 873 participants [224 (26%) patients/caregivers and 649 (74%) health professionals] from 58 countries. Vascular access function was considered the most important outcome (mean score 7.8 for patients and caregivers/8.5 for health professionals, with 85%/95% rating it critically important, and top ranked on best–worst scale), followed by infection (mean 7.4/8.2, 79%/92% rating it critically important, second rank on best–worst scale). Health professionals rated all outcomes of equal or higher importance than patients/caregivers, except for aneurysms. We identified six themes: necessity for HD, applicability across vascular access types, frequency and severity of debilitation, minimizing the risk of hospitalization and death, optimizing technical competence and adherence to best practice and direct impact on appearance and lifestyle. Vascular access function was the most critically important outcome among patients/caregivers and health professionals. Consistent reporting of this outcome across trials in HD will strengthen their value in supporting vascular access practice and shared decision making in patients requiring HD.
Publisher: Elsevier BV
Date: 06-2019
Publisher: No publisher found
Date: 2003
Publisher: Wiley
Date: 29-09-2006
DOI: 10.1111/J.1440-1797.2006.00670.X
Abstract: The biochemical, haemodynamic, clinical and nutritional benefits of nocturnal home haemodialysis (NHHD) compared with 4 h, three times per week conventional haemodialysis are well known and accrue from increasing dialysis time and frequency, either 8 h on alternate nights (NHHD3.5) or 8 h six nights per week (NHHD6). However, there are few data comparing NHHD3.5 with NHHD6. Thirteen patients on NHHD6 were compared with 21 patients on NHHD3.5, all with similar demographic profiles. Pre- and post-dialysis phosphate (PO4) control was ideal in both groups. However, all NHHD6 patients needed PO4 supplementation compared with 4/21 (19%) of NHHD3.5 patients. In the present study, 8/21 (38%) of NHHD3.5 patients needed PO4 binders, whereas none were required with NHHD6. The pre-haemoglobin (Hb), 122.8 g/L (NHHD6) versus 124.9 g/L (NHHD3.5), and the pre-albumin, 38.31 g/L (NHHD6) versus 37.71 g/L (NHHD3.5), were not significantly different. NHHD6 had significantly lower pre-dialysis blood urea and creatinine (10.16 vs 19.54 mmol/L and 437.0 vs 812.3 micromol/L, respectively). Less interdialytic fluctuation in urea and creatinine was also noted in NHHD6. Of major significance were the significantly lower ultrafiltration rates and interdialytic weight gains (mean +/- SEM) of NHHD6 (249 +/- 76 mL/h and 2.0 +/- 0.65 kg) versus NHHD3.5 (425 +/- 168 mL/h and 2.9 +/- 1.2 kg). The authors conclude that NHHD6 offers the optimum biochemical, volume and clinical outcome, but NHHD3.5 has additional appeal to providers seeking home-based therapy cost advantages and consumable expenditure control. A flexible dialysis programme should offer all the time and frequency options of NHHD but, in particular, should support NHHD at a frequency sympathetic to the clinical rehabilitation and lifestyle aspirations of individual patients.
Publisher: No publisher found
Date: 2007
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2009
Publisher: Wiley
Date: 31-03-2007
DOI: 10.1111/J.1542-4758.2007.00172.X
Abstract: Benefits of dialysate with greater calcium (Ca) concentration are reported in nocturnal hemodialysis (NHD) to prevent Ca depletion and subsequent hyperparathyroidism. Studies with patients dialyzing against 1.25 mmol/L Ca baths demonstrate increases in alkaline phosphatase (ALP) and parathyroid hormone (PTH), and increasing dialysate Ca subsequently corrects this problem. However, whether 1.5 or 1.75 mmol/L dialysate Ca is most appropriate for NHD is yet to be determined, and differences in the effect on mineral metabolism of daily vs. alternate daily NHD have also not been well defined. We retrospectively analyzed mineral metabolism in 48 patients, from 2 institutions (30 at Monash and 18 at Geelong), undergoing home NHD (8 hr/night, 3.5-6 nights/week) for a minimum of 6 months. Thirty-seven patients were dialyzed against a 1.5 mmol/L Ca bath and 11 patients against 1.75 mmol/L. We divided patients into 4 groups, based on dialysate Ca and also on the hours per week of dialysis, <40 or ≥40 (n=4 and 7 in the ≥40 groups). We compared predialysis and postdialysis serum markers, time-averaged over a 6-month period, and the administration of calcitriol and Ca-based phosphate binders between the 1.5 and 1.75 mmol/L Ca dialysate groups. Baseline characteristics between all groups were similar, with a slightly longer, but nonsignificant, duration of NHD in both 1.75 mmol/L dialysate groups compared with 1.5 mmol/L. The mean predialysis Ca, phosphate, and Ca x P were similar between the 1.5 and 1.75 mmol/L groups, regardless of NHD hr/week. Postdialysis Ca was significantly greater with 1.75 vs. 1.5 mmol/L in those dialyzing <40 hr/week (2.64+/-0.19 vs. 2.50+/-0.12 mmol/L, p=0.046), but postdialysis Ca x P was similar (2.25+/-0.44 vs. 2.16+/-0.29 mmol(2)/L(2), p=0.60). Parathyroid hormone was also lower with 1.75 vs. 1.5 mmol/L baths in the ≥40 hr/week groups. Hemoglobin, ALP, and albumin were all similar between groups.
There was also no difference in vitamin D requirement when using 1.75 mmol/L compared with the 1.5 mmol/L dialysate. Multivariate analysis to determine independent predictors of postdialysis serum Ca showed a statistically significant positive association with predialysis Ca, dialysate Ca, and total NHD hr/week. An elevated dialysate Ca concentration is required in NHD to prevent osteopenia, but differences in serum markers of mineral metabolism between 1.5 and 1.75 mmol/L Ca dialysate in NHD in our study were few. This was similar for patients undertaking NHD ≥40 hr/week, although differences in the frequency of NHD may also be as important as dialysate Ca with regard to serum Ca levels. With concerns that prolonged higher Ca levels contribute to increased cardiovascular mortality, the optimal Ca dialysate bath is still unknown and further studies addressing bone metabolism with larger NHD numbers are required.
Publisher: Oxford University Press (OUP)
Date: 22-03-2009
DOI: 10.1093/NDT/GFP114
Abstract: Excessive alcohol consumption is a risk factor for hypertension and stroke; however, evidence for an association with chronic kidney disease is conflicting. A total of 6259 adults ≥25 years of age, without a history of alcohol dependence, participating in the baseline (1999-2000) and follow-up (2004-2005) phases of an Australian population-representative study (AusDiab) were the subject of this analysis. Alcohol consumption status and volume/frequency were collected by standardized interviewer-administered questionnaires and self-administered food frequency questionnaires. The outcomes were as follows: (i) a 5-year decline in estimated glomerular filtration rate (eGFR) of ≥10%, with baseline eGFR ≥60 and final eGFR <60 mL/min/1.73 m(2); and (ii) de novo albuminuria, defined as a urine albumin-creatinine ratio ≥2.5 (males)/≥3.5 (females) mg/mmol, in the absence of albuminuria at baseline. Self-identification as a moderate or heavy, versus light, drinker was associated with elevated risk of albuminuria in males and females <65 years of age (OR, 95% CI: males 1.87, 0.99-3.52; females 2.38, 1.37-4.14). Consumption of ≥30 g/day was associated with an increased risk of albuminuria after adjustment for age, sex and baseline kidney function (OR = 1.59, 95% CI 1.07-2.36), but a reduced risk of eGFR <60 mL/min/1.73 m(2) (OR = 0.59, 95% CI 0.37-0.95), compared with consumption of <10 g/day. Moderate-heavy alcohol consumption may be an important modifiable risk factor for albuminuria in the general population. The natural history of alcohol-induced kidney damage and how this relates to markers of kidney function in the general population warrant further research.
Publisher: Springer Science and Business Media LLC
Date: 19-05-2011
Publisher: Elsevier BV
Date: 07-2010
DOI: 10.1053/J.AJKD.2009.12.039
Abstract: Vascular calcification contributes to cardiovascular disease in patients with chronic kidney disease (CKD). Few studies have addressed interventions to decrease vascular calcification; however, experimental studies report benefits of bisphosphonates. Recent studies of hemodialysis patients also suggest benefits of bisphosphonates on vascular calcification; however, no study exists in nondialysis patients with CKD. We conducted a randomized controlled trial to determine the effect of bisphosphonates on vascular calcification in patients with CKD. 51 patients with CKD stages 3-4 were recruited from a hospital outpatient setting; 50 were treated with study medication. Patients were randomly assigned to either alendronate, 70 mg (n = 25), or matching placebo (n = 25), administered weekly. The primary outcome was change in aortic vascular calcification after 18 months. Secondary outcomes included superficial femoral artery vascular calcification, arterial compliance, bone mineral density (BMD), renal function, and serum markers of mineral metabolism. At baseline and 12 and 18 months, computed tomography, pulse wave velocity using SphygmoCor (AtCor Medical, PWV Inc, www.atcormedical.com), and dual-energy x-ray absorptiometry were performed to measure vascular calcification, arterial compliance, and BMD, respectively. Analysis was by intention to treat, with a random-effect linear regression model to assess differences. 46 patients completed the study (24 alendronate, 22 placebo); baseline mean age was 63.1 +/- 1.8 years, estimated glomerular filtration rate was 34.5 +/- 1.4 mL/min/1.73 m(2), 59% had diabetes, and 65% were men. 91% had aortic vascular calcification at the start and 78% showed progression. At 18 months, there was no difference in vascular calcification progression with alendronate compared with placebo (adjusted difference, -24.2 Hounsfield units [95% CI, -77.0 to 28.6]; P = 0.4).
There was an increase in lumbar spine BMD (T score difference, +0.3 [95% CI, 0.03-0.6]; P = 0.04) and a trend toward better pulse wave velocity (-1 m/s [95% CI, -2.1 to 0.1]; P = 0.07) with alendronate. Femoral BMD was similar between groups. There was a nonsignificant decrease in kidney function in patients on alendronate therapy compared with placebo (-1.2 mL/min/1.73 m(2) [95% CI, -4.0 to 1.7]). Small sample size and baseline differences, especially in aortic vascular calcification, may have diminished any potential difference between groups. Unlike previous studies of hemodialysis patients, alendronate did not decrease the progression of vascular calcification compared with placebo in patients with CKD during 18 months.
Publisher: Elsevier BV
Date: 10-2017
DOI: 10.1053/J.AJKD.2017.05.014
Abstract: Bleeding from dialysis vascular access (arteriovenous fistulas, arteriovenous grafts, and vascular catheters) is uncommon. Death from these bleeds is rare and likely to be under-reported, with incident rates of fewer than 1 episode for every 1,000 patient-years on dialysis, meaning that dialysis units may experience this catastrophic event only once a decade. There is an opportunity to learn from (and therefore prevent) these bleeding deaths. We reviewed all reported episodes of death due to vascular access bleeding in Australia and New Zealand over a 14-year period together with individual dialysis units' root cause analyses on each event. In this perspective, we provide a clinically useful summary of the evidence and knowledge gained from these rare events. Our conclusion is that death due to dialysis vascular access hemorrhage is an uncommon, catastrophic, but potentially preventable event if the right policies and procedures are put in place.
Publisher: Elsevier BV
Date: 2004
DOI: 10.1053/J.AJKD.2003.09.016
Abstract: Control of serum phosphate remains a difficult clinical issue in most hemodialysis (HD) patients. This study examines 2 nonpharmacological approaches to improving phosphate control in HD patients. First, 9 stable HD patients underwent dialysis in random fashion for either 4 hours 3 times weekly or 5 hours 3 times weekly. Adjustments were made to blood flow rates such that Kt/V was the same for all sessions, thus allowing independent assessment of the influence of time. The primary end point was weekly dialysate phosphate removal. In the second study, 12 different patients undertook an exercise program in which they pedaled a bicycle ergometer either immediately before or during dialysis. Again, weekly dialysate phosphate removal was measured. In the time study, urea reduction ratio (69% +/- 0.02% versus 68% +/- 0.07%, 4 versus 5 hours) and weekly urea removal were no different between the 2 groups. However, weekly phosphate removal (3,007 +/- 641 versus 3,400 +/- 647 mg; P < 0.02) significantly improved with longer dialysis duration. Serum phosphate levels improved, but did not reach statistical significance in this short-term study. In the exercise study, weekly phosphate removal improved with exercise but did not reach significance (2,741 +/- 715 [no exercise] versus 2,917 +/- 833 [exercise predialysis] versus 2,992 +/- 852 mg [exercise during dialysis]; P = 0.055), although comparing only exercise during dialysis with no exercise reached significance (P = 0.02). There was no significant difference in serum phosphate levels. Both increased dialysis time and exercise result in increased dialytic removal of phosphate and could be expected in the long term to improve phosphate control.
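The urea reduction ratio reported in the abstract above is simply the fractional fall in blood urea across a single session. A minimal sketch (the function name is an assumption, not from the paper):

```python
# Illustrative sketch: percentage reduction from pre- to post-dialysis
# concentration, the quantity behind the "urea reduction ratio" (URR).
def reduction_ratio(pre: float, post: float) -> float:
    """Return the percentage fall from pre to post (e.g. urea in mmol/L)."""
    return 100.0 * (pre - post) / pre
```

For example, a pre-dialysis urea of 20 mmol/L falling to 6.2 mmol/L post-dialysis corresponds to a URR of 69%, matching the order of magnitude quoted above.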
Publisher: American College of Physicians
Date: 15-09-2020
DOI: 10.7326/M20-0529
Publisher: Wiley
Date: 14-02-2016
DOI: 10.1111/SDI.12476
Publisher: Elsevier BV
Date: 04-2021
Publisher: Wiley
Date: 16-02-2017
DOI: 10.1111/NEP.12759
Abstract: Uncertainties remain about the role of cystatin C-based estimated glomerular filtration rate (eGFR) in the prediction of cardiovascular disease (CVD) beyond traditional CVD risk factors. We assessed the contributions of eGFR to CVD and mortality in the general population, using 14-year follow-up data on 9353 adults without a reported history of CVD from the Australian Diabetes, Obesity and Lifestyle study. After adjusting for age, sex, CVD risk factors and uACR, reduced cystatin C-based eGFR was associated with increased risk of CVD and mortality in our community-based cohort.
Publisher: Frontiers Media SA
Date: 14-07-2012
DOI: 10.1111/J.1432-2277.2012.01528.X
Abstract: The association between pretransplant dialysis modality and transplant outcomes remains inconsistent. The aim of this study was to address the association between alteration in dialysis modality and post-transplant outcomes. Using the Australia and New Zealand Dialysis and Transplant Registry, primary live- and deceased-donor renal transplant recipients (RTR) between 1997 and 2009 were examined. Pre-emptive and multiple-organ transplants were excluded. The associations between initial and pretransplant dialysis modality and transplant outcomes were examined. Of the 6701 RTR, 18.6% were initiated and maintained on peritoneal dialysis pretransplant (PD-PD), 9.2% were initiated on PD but maintained on haemodialysis (HD) pretransplant (PD-HD), 63.3% were HD-HD and 8.9% were HD-PD. PD-HD (odds ratio (OR) 1.44, 95% CI 1.21-1.72) and HD-HD (OR 1.25, 95% CI 1.12-1.41) were associated with a significantly greater risk of slow graft function compared with the overall mean of the groups, whereas a change in initial dialysis modality from HD to pretransplant PD was associated with a higher risk of overall graft failure (hazard ratio (HR) 1.19, 95% CI 1.04-1.36) and recipient death (HR 1.34, 95% CI 1.13-1.59). Our registry analysis suggests that dialysis modality pretransplant may affect transplant outcomes, and future studies evaluating patient selection, choice of modality and/or potential interventions in the pre- and post-transplant period may have a beneficial effect on post-transplant outcomes.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 13-04-2017
Abstract: The burden of premature death and health loss from ESRD is well described. Less is known regarding the burden of cardiovascular disease attributable to reduced GFR. We estimated the prevalence of reduced GFR categories 3, 4, and 5 (not on RRT) for 188 countries at six time points from 1990 to 2013. Relative risks of cardiovascular outcomes by three categories of reduced GFR were calculated by pooled random effects meta-analysis. Results are presented as deaths for outcomes of cardiovascular disease and ESRD and as disability-adjusted life years for outcomes of cardiovascular disease, GFR categories 3, 4, and 5, and ESRD. In 2013, reduced GFR was associated with 4% of deaths worldwide, or 2.2 million deaths (95% uncertainty interval [95% UI], 2.0 to 2.4 million). More than half of these attributable deaths were cardiovascular deaths (1.2 million; 95% UI, 1.1 to 1.4 million), whereas 0.96 million (95% UI, 0.81 to 1.0 million) were ESRD-related deaths. Compared with metabolic risk factors, reduced GFR ranked below high systolic BP, high body mass index, and high fasting plasma glucose, and similarly to high total cholesterol, as a risk factor for disability-adjusted life years in both developed and developing world regions. In conclusion, by 2013, cardiovascular deaths attributed to reduced GFR outnumbered ESRD deaths throughout the world. Studies are needed to evaluate the benefit of early detection of CKD and treatment to decrease these deaths.
Publisher: Elsevier BV
Date: 10-2007
DOI: 10.1053/J.AJKD.2007.07.016
Abstract: Australia historically has been recognized for its high fistula use. Observational study using registry data. Adult patients registered in the Australia and New Zealand Dialysis and Transplant Association Registry on hemodialysis in Australia. Cohort year. Hemodialysis access trends were examined from 2000 to 2005 for incident patients (within 60 days of hemodialysis therapy start), patients on hemodialysis therapy for 6 to 8 months, and prevalent hemodialysis patients. Multivariate analyses were performed to examine the relationship between access type and cohort year for each group, with adjustment for age, sex, race, body mass index, late referral, smoking status, cause of end-stage renal disease, comorbidities, and dialysis vintage. During 2000 to 2005, catheter use increased from 39% to 53% in incident patients, 10% to 22% in the 6- to 8-month groups, and 6% to 13% in prevalent patients. Fistula use decreased from 56% to 43% in incident patients and 78% to 67% in the 6- to 8-month group and remained at 73% to 75% in prevalent patients. Graft use decreased in all groups. Adjustment for factors associated with access type did not significantly change these results. The registry collects only the access in use at the end of the survey period; thus, it was not possible to determine whether another access had failed or was present but not in use. The small number of incident cases prevented separate analysis of arteriovenous fistulas and arteriovenous grafts. Incident use of fistulas and grafts decreased, with an unexpected increase in both incident and prevalent catheters between 2000 and 2005. Adjustment for factors associated with access type did not significantly alter the trends. Changes in unidentified practice patterns, attitudes, or preferences are contributing to these trends.
Ongoing evaluation of data and investigation into processes of care are required to increase functioning fistulas, together with reevaluation of the role of grafts in patients without a fistula.
Publisher: Wiley
Date: 27-10-2011
DOI: 10.1111/J.1440-1797.2011.01508.X
Abstract: Spot urine measurement of albumin is now the most commonly accepted approach to screening for proteinuria. Exertion prior to the collection may potentially influence the result of spot urine albumin estimation. We aimed to evaluate the effect of exercise on albuminuria in subjects at various stages of diabetic nephropathy in comparison with healthy control volunteers. Thirty-five people with diabetes (19 with normoalbuminuria (NA), nine with microalbuminuria (MA) and seven with overt proteinuria (OP)) and nine control subjects were assessed. A 1 km treadmill walk was performed. Four spot urine specimens were collected: first morning void, immediately prior to exercise, and 1 h and 2 h after exercise. A random effects linear regression mixed model was used to assess the effect of exercise on the albumin/creatinine ratio (uACR). Results are presented separately for male and female subjects with diabetes due to a significant exercise/gender interaction (P < 0.05). No significant effect of exercise on uACR was seen in control subjects. In NA males with diabetes no effect of exercise was seen, while in females uACR 1 h after exercise was significantly higher than in the early morning sample (3.55 mg/mmol; 95% confidence interval 0.27-6.83). Both female and male subjects with diabetes and MA had an increase in uACR 1 h after exercise (87.8, -24.3-199.4 and 6.7, 2.1-11.3, respectively). For both males and females with OP, uACR was significantly increased 1 h post-exercise (67.5, 22-113 and 21.6, 8.4-34.8, respectively). In all groups, uACR at 2 h after exercise was not significantly different from the early morning sample. Exercise increased uACR estimation in normoalbuminuric subjects with diabetes, with a larger effect in females. Whether exercise unmasks early diabetic nephropathy in NA subjects requires further study.
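The uACR discussed above is a plain ratio of spot urine albumin to creatinine, and the AusDiab abstracts on this page apply sex-specific microalbuminuria cut-offs of ≥2.5 mg/mmol (males) and ≥3.5 mg/mmol (females). A minimal sketch with illustrative function names (not from any of the papers):

```python
# Illustrative sketch: spot urine albumin-creatinine ratio (uACR) and the
# sex-specific microalbuminuria thresholds quoted in the AusDiab abstracts.
def uacr(albumin_mg_per_l: float, creatinine_mmol_per_l: float) -> float:
    """Return uACR in mg/mmol from spot urine albumin and creatinine."""
    return albumin_mg_per_l / creatinine_mmol_per_l

def microalbuminuric(ratio_mg_mmol: float, female: bool) -> bool:
    """Apply the >=2.5 (males) / >=3.5 (females) mg/mmol cut-offs."""
    return ratio_mg_mmol >= (3.5 if female else 2.5)
```

So a spot sample with albumin 30 mg/L and creatinine 10 mmol/L gives a uACR of 3.0 mg/mmol, which crosses the male threshold but not the female one.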
Publisher: Elsevier BV
Date: 10-2020
Publisher: Elsevier BV
Date: 2019
DOI: 10.1016/J.KINT.2018.08.036
Abstract: Reliable estimates of the long-term outcomes of acute kidney injury (AKI) are needed to inform clinical practice and guide allocation of health care resources. This systematic review and meta-analysis aimed to quantify the association between AKI and chronic kidney disease (CKD), end-stage kidney disease (ESKD), and death. Systematic searches were performed through EMBASE, MEDLINE, and grey literature sources to identify cohort studies in hospitalized adults that used standardized definitions for AKI, included a non-exposed comparator, and followed patients for at least 1 year. Risk of bias was assessed by the Newcastle-Ottawa Scale. Random effects meta-analyses were performed to pool risk estimates; subgroup, sensitivity, and meta-regression analyses were used to investigate heterogeneity. Of 4973 citations, 82 studies (comprising 2,017,437 participants) were eligible for inclusion. Common sources of bias included incomplete reporting of outcome data, missing biochemical values, and inadequate adjustment for confounders. Individuals with AKI were at increased risk of new or progressive CKD (HR 2.67, 95% CI 1.99-3.58; 17.76 versus 7.59 cases per 100 person-years), ESKD (HR 4.81, 95% CI 3.04-7.62; 0.47 versus 0.08 cases per 100 person-years), and death (HR 1.80, 95% CI 1.61-2.02; 13.19 versus 7.26 deaths per 100 person-years). A gradient of risk across increasing AKI stages was demonstrated for all outcomes. For mortality, the magnitude of risk was also modified by clinical setting, baseline kidney function, diabetes, and coronary heart disease. These findings establish the poor long-term outcomes of AKI while highlighting the importance of injury severity and clinical setting in the estimation of risk.
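The pooled hazard ratios above come from random-effects meta-analysis. A generic inverse-variance sketch of the DerSimonian-Laird estimator, the standard technique for this kind of pooling (the input numbers below are made up for illustration, not the review's data):

```python
import math

# Illustrative sketch: DerSimonian-Laird random-effects pooling of
# log hazard ratios, the generic method behind pooled HRs like those above.
def dersimonian_laird(log_hrs, variances):
    """Return (pooled HR, tau^2) from per-study log-HRs and variances."""
    w = [1.0 / v for v in variances]          # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, log_hrs)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_hrs))
    k = len(log_hrs)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_hrs)) / sum(w_star)
    return math.exp(pooled), tau2
```

With two equally precise hypothetical studies (log-HRs 0.2 and 1.0, each with variance 0.04), the pooled HR is exp(0.6) with a non-zero tau^2 reflecting the heterogeneity.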
Publisher: SAGE Publications
Date: 2016
DOI: 10.1186/S40697-016-0125-6
Abstract: Current guidelines favor fistulas over catheters as vascular access. Yet, the observational literature comparing fistulas to catheters has important limitations and biases that may be difficult to overcome in the absence of randomization. However, it is not clear if physicians would be willing to participate in a clinical trial comparing fistulas to catheters. We also sought to elicit participants' opinions on willingness to participate in a future trial regarding catheters and fistulas. We created a three-part survey consisting of 19 questions. We collected demographic information, respondents' knowledge of the vascular access literature, appropriateness of current guideline recommendations, and their willingness to participate in a future trial. Participants were recruited from Canada, Europe, Australia, and New Zealand. Participants include physicians and trainees who are involved in the care of end-stage renal disease patients requiring vascular access. Descriptive statistics were used to describe baseline characteristics of respondents according to geographic location. We used logistic regression to model willingness to participate in a future trial. We surveyed nephrologists from Canada, Europe, Australia, and New Zealand to assess their willingness to participate in a randomized trial comparing fistulas to catheters in incident hemodialysis patients. Our results show that in Canada, 86% of respondents were willing to participate in a trial (32% in all patients; 54% only in patients at high risk of primary failure). In Europe and Australia/New Zealand, the willingness to participate in a trial that included all patients was lower (28% in Europe; 25% in Australia/New Zealand), as was a trial that included patients at high risk of primary failure (38% in Europe; 39% in Australia/New Zealand).
Nephrologists who had been in practice for a few years, saw a larger volume of patients, or self-identified as experts in the vascular access literature were more likely to participate in a trial. Survey distribution was limited to vascular access experts in participating European countries and ultimately led to a discrepancy in the numbers of European to non-European respondents overall; Canadian views are likely over-represented in the overall outcomes. Our survey results suggest that nephrologists believe there is equipoise surrounding the optimal vascular access strategy and that a randomized controlled study should be undertaken, but restricted to those individuals with a high risk of primary fistula failure.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2020
Abstract: Patients with hemodialysis central venous catheters (HD CVCs) are susceptible to health care-associated infections, particularly hemodialysis catheter-related bloodstream infection (HD-CRBSI), which is associated with high mortality and health care costs. There have been few systematic attempts to reduce this burden and clinical practice remains highly variable. This manuscript will summarize the challenges in preventing HD-CRBSI and describe the methodology of the REDUcing the burden of dialysis Catheter ComplicaTIOns: a National approach (REDUCCTION) trial. The REDUCCTION trial is a stepped-wedge cluster randomized trial of a suite of clinical interventions aimed at reducing HD-CRBSI across Australia. It clusters the intervention at the renal-service level with implementation randomly timed across three tranches. The primary outcome is the effect of this intervention upon the rate of HD-CRBSI. Patients who receive an HD CVC at a participating renal service are eligible for inclusion. A customized data collection tool allows near-real-time reporting of the number of active catheters, total exposure to catheters over time, and rates of HD-CRBSI in each service. The interventions are centered around the insertion, maintenance, and removal of HD CVC, informed by the most current evidence at the time of design (mid-2018). A total of 37 renal services are participating in the trial. Data collection is ongoing with results expected in the last quarter of 2020. The baseline phase of the study has collected provisional data on 5385 catheters in 3615 participants, representing 603,506 days of HD CVC exposure. The REDUCCTION trial systematically measures the use of HD CVCs at a national level in Australia, accurately determines the rate of HD-CRBSI, and tests the effect of a multifaceted, evidence-based intervention upon the rate of HD-CRBSI. These results will have global relevance in nephrology and other specialties commonly using CVCs.
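A stepped-wedge cluster trial like REDUCCTION randomises *when* each cluster (here, a renal service) crosses from control to intervention, rather than which patients are treated. A hypothetical scheduling sketch, not the trial's actual allocation code:

```python
import random

# Illustrative sketch: assign clusters to randomly ordered tranches,
# as in a stepped-wedge design where tranche 1 switches to the
# intervention first, tranche 2 next, and so on.
def stepped_wedge_schedule(services, n_tranches=3, seed=0):
    """Return {service: tranche number}, tranches balanced in size."""
    rng = random.Random(seed)   # fixed seed so the example is reproducible
    order = services[:]
    rng.shuffle(order)          # random ordering decides crossover timing
    return {service: i % n_tranches + 1 for i, service in enumerate(order)}
```

With six hypothetical services and three tranches, each tranche receives exactly two services; which service lands in which tranche depends on the random shuffle.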
Publisher: Elsevier BV
Date: 06-2010
Publisher: Elsevier BV
Date: 07-2013
DOI: 10.1053/J.AJKD.2013.03.010
Abstract: Low serum 25-hydroxyvitamin D (25[OH]D) levels have been associated with chronic kidney disease in cross-sectional studies. However, this association has not been studied prospectively in a large general population-based cohort. Prospective cohort study. 6,180 adults 25 years or older participating in the baseline and 5-year follow-up phases of the Australian Diabetes, Obesity and Lifestyle (AusDiab) Study. Serum 25(OH)D levels <15 ng/mL were considered deficient. Incident chronic kidney disease was defined as being negative at baseline but positive after 5 years for (1) reduced estimated glomerular filtration rate (eGFR <60 mL/min/1.73 m²) or (2) albuminuria (spot urine albumin-creatinine ratio ≥2.5 mg/mmol [≥22.1 mg/g] for men and ≥3.5 mg/mmol [≥30.9 mg/g] for women). 623 (10.9%) participants were vitamin D deficient, 161 developed incident reduced eGFR, and 222 developed incident albuminuria. In participants with and without vitamin D deficiency, annual age-standardized incidences were 0.92% (95% CI, 0.56%-1.30%) and 0.59% (95% CI, 0.51%-0.68%), respectively, for eGFR <60 mL/min/1.73 m² and 1.50% (95% CI, 1.06%-1.95%) and 0.66% (95% CI, 0.56%-0.76%), respectively, for albuminuria. In multivariate regression models, vitamin D deficiency was associated significantly with the 5-year incidence of albuminuria (OR, 1.71; 95% CI, 1.12-2.61; P = 0.01), but not reduced eGFR (OR, 0.93; 95% CI, 0.53-1.66; P = 0.8). The observational nature of the study does not account for unmeasured confounders. Only baseline 25(OH)D level was measured and therefore may not accurately reflect lifetime levels. Differences in baseline characteristics of participants who were included compared with those excluded due to missing data or follow-up may limit the applicability of results to the original AusDiab cohort.
Our prospective cohort study shows that vitamin D deficiency is associated with a higher annual incidence of albuminuria and reduced eGFR and independently predicts the 5-year incidence of albuminuria. These associations warrant further exploration in long-term prospective clinical trials.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2009
DOI: 10.2215/CJN.03410708
Publisher: Elsevier BV
Date: 10-2020
Publisher: Wiley
Date: 22-07-2008
Publisher: Elsevier BV
Date: 09-2022
DOI: 10.1053/J.JRN.2021.10.009
Abstract: High dietary phosphate intake may lead to adverse outcomes including cardiovascular disease (CVD). Urinary phosphate excretion, a marker of intestinal phosphate absorption, may be a more reliable marker of phosphate homeostasis in steady state than serum phosphate. Studies report good agreement between urine phosphate-to-creatinine ratio (uPiCr) and 24-hour urinary phosphate; however, whether uPiCr is associated with increased risk of CVD or mortality remains uncertain. This study aimed to assess the relationship between uPiCr and all-cause and CVD mortality. This is an observational longitudinal cohort study using data from the population-based national Australian Diabetes, Obesity and Lifestyle study (n = 10,014 participants). Non-linear association between uPiCr and all-cause and CVD mortality was assessed using fractional polynomial transformations. Cox proportional hazards regression models were used to estimate adjusted hazard ratios for all-cause and CVD mortality. Median age [interquartile range] was 50 [41-62] years, and 46% were male. Median uPiCr was 1.38 [1.02-1.79] mmol/mmol. Median follow-up time was 16.9 years with 1,735 deaths. uPiCr was associated with all-cause and CVD mortality in univariate models and when adjusted for age and gender. However, associations were not significant in multivariate models. Sensitivity analyses excluding participants with chronic kidney disease (CKD) revealed a significant J-shaped association between uPiCr and all-cause mortality. Urine phosphate alone showed an association with increased all-cause mortality in a similar J-shaped relationship. Although no association between uPiCr and all-cause and CVD mortality was observed in multivariate analyses in the whole cohort, a significant relationship between uPiCr and mortality in those without CKD suggests that uPiCr may have predictive validity for future adverse outcomes in people with no CKD.
Publisher: Nepal Journals Online (JOL)
Date: 25-02-2017
Abstract: Background: BK virus associated nephropathy (BKVN) is an important cause of early graft dysfunction in renal transplant recipients. The present study was carried out to determine the burden of BKVN in a single renal transplant centre in Australia. Method: A retrospective analysis of de novo renal transplant recipients from 2010 to 2013 was performed to identify biopsy-proven BKVN. Estimated glomerular filtration rate (eGFR) was compared at baseline, at BKVN diagnosis, and at 3 and 12 months post-diagnosis. Result: Of the 317 de novo renal transplant recipients in the study period, 20 (6.3%) developed BKVN. The mean age was 54.8 ± 13.1 years and 13 (65%) were male. The mean time from transplant to BKVN was 8.7 ± 6.7 months, with 17 (85%) diagnosed within 12 months. Four recipients each were diagnosed with BKVN on 3- and 12-month surveillance biopsies. Six (30%) had normal eGFR at diagnosis. Mean eGFR at diagnosis was 38.8 ± 19.2 ml/min/1.73 m2, which was significantly lower (p < 0.01) than that at baseline (50.3 ± 16.4 ml/min/1.73 m2). eGFR improved numerically at 3 and 12 months post-diagnosis; however, the difference was not significant. One patient had graft failure, 19 months after diagnosis. Conclusion: BKVN generally occurs in the first post-transplant year and is an important cause of early graft dysfunction. Surveillance biopsy helps in detecting subclinical BKVN.
Publisher: Elsevier BV
Date: 04-2010
DOI: 10.1053/J.AJKD.2009.12.011
Abstract: The Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation is more accurate than the Modification of Diet in Renal Disease (MDRD) Study equation. We applied both equations in a cohort representative of the Australian adult population. Population-based cohort study. 11,247 randomly selected noninstitutionalized Australians aged ≥25 years who attended a physical examination during the baseline AusDiab (Australian Diabetes, Obesity and Lifestyle) Study survey. Glomerular filtration rate (GFR) was estimated using the MDRD Study and CKD-EPI equations. Kidney damage was defined as urine albumin-creatinine ratio ≥2.5 mg/mmol in men and ≥3.5 mg/mmol in women, or urine protein-creatinine ratio ≥0.20 mg/mg. Chronic kidney disease (CKD) was defined as estimated GFR (eGFR) <60 mL/min/1.73 m² or kidney damage. Participants were classified into 3 mutually exclusive subgroups: CKD according to both equations; CKD according to the MDRD Study equation but not the CKD-EPI equation; and no CKD according to either equation. All-cause mortality was examined in subgroups with and without CKD. Serum creatinine, and urinary albumin, protein, and creatinine measured on a random spot morning urine sample. 266 participants identified as having CKD according to the MDRD Study equation were reclassified to no CKD according to the CKD-EPI equation (estimated prevalence, 1.9%; 95% CI, 1.4-2.6). All had an eGFR ≥45 mL/min/1.73 m² using the MDRD Study equation. Reclassified individuals were predominantly women with a favorable cardiovascular risk profile. The proportion of reclassified individuals with a Framingham-predicted 10-year cardiovascular risk ≥30% was 7.2%, compared with 7.9% of the group with no CKD according to both equations and 45.3% of individuals retained in stage 3a using both equations.
There was no evidence of increased all-cause mortality in the reclassified group (age- and sex-adjusted hazard ratio vs no CKD, 1.01; 95% CI, 0.62-1.97). Using the MDRD Study equation, the prevalence of CKD in the Australian population aged ≥25 years was 13.4% (95% CI, 11.1-16.1); using the CKD-EPI equation, the prevalence was 11.5% (95% CI, 9.42-14.1). Limitations: single measurements of serum creatinine and urinary markers. The lower estimated prevalence of CKD using the CKD-EPI equation is caused by reclassification of low-risk individuals.
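Both estimating equations compared in this study are fully specified in the literature. A minimal sketch of the 2009 CKD-EPI and IDMS-traceable 4-variable MDRD formulas, assuming serum creatinine in mg/dL (Australian laboratories report µmol/L; divide by 88.4 to convert):

```python
def egfr_ckd_epi_2009(scr_mg_dl, age_years, female, black=False):
    """2009 CKD-EPI creatinine equation, eGFR in mL/min/1.73 m²."""
    kappa = 0.7 if female else 0.9       # sex-specific creatinine divisor
    alpha = -0.329 if female else -0.411  # exponent below the knee
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def egfr_mdrd(scr_mg_dl, age_years, female, black=False):
    """IDMS-traceable 4-variable MDRD Study equation."""
    egfr = 175 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr
```

For many older women with mildly elevated creatinine, CKD-EPI returns a higher eGFR than MDRD, which is the mechanism behind the reclassification described above.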
Publisher: Wiley
Date: 17-04-2019
DOI: 10.1111/NEP.13397
Abstract: Extended-hours haemodialysis has long been regarded as the optimal form of dialysis for solute clearance. With emerging benefits of haemodiafiltration, we wanted to compare these two head-to-head. In this randomized cross-over trial, we recruited existing nocturnal haemodialysis patients who had not been hospitalized in the prior 3 months. After a baseline 8 h haemodialysis session, subjects were randomized to either 2 weeks of 8 h haemodialysis or 4 h haemodiafiltration, with cross-over to the alternative treatment after a 2-week washout period. Subjects were additionally randomized to the Fresenius FX80 or Nipro Elisio dialyser in a parallel design. Blood and dialysate samples were collected at baseline and at the end of both study periods. Twelve patients completed the study. Mean (SD) age and body mass index were 55.1 ± 11.5 years and 36.4 ± 10.8 kg/m², respectively. Urea and creatinine reduction ratios were higher with extended-hours haemodialysis compared to haemodiafiltration (difference 14.0%; 95% CI 10.6-17.3; P < 0.001 and 9.1%; 95% CI 7.2-11.0; P < 0.001). Fibroblast growth factor 23 (FGF23) clearance was superior with haemodiafiltration (difference 20.1%; 95% CI 8.7-31.6; P = 0.001). No difference was seen in reduction ratios for phosphate, retinol binding protein, alpha-1-microglobulin, beta-2-microglobulin and fetuin with either modality. Compared to the Nipro Elisio, the Fresenius FX80 dialyser achieved higher beta-2-microglobulin clearance (Period 1: difference 7.8%; 95% CI 1.3-14.4; P = 0.02; Period 2: 7.5%; 95% CI 1.0-14.1; P = 0.02). Small solute clearance was superior with extended-hours haemodialysis, while haemodiafiltration enhanced FGF23 clearance. Beta-2-microglobulin clearance was improved with the Fresenius FX80 dialyser, but this difference is unlikely to be clinically significant.
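The reduction ratios reported in this trial follow the standard definition: the fall in solute concentration across a session, expressed as a percentage of the pre-dialysis value. A minimal sketch:

```python
def reduction_ratio(pre_concentration, post_concentration):
    """Percent reduction of a solute across one dialysis session:
    RR = (pre - post) / pre * 100."""
    return (1.0 - post_concentration / pre_concentration) * 100.0

# Example: pre-dialysis urea 25 mmol/L, post-dialysis 5 mmol/L
# gives a urea reduction ratio of 80%.
```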
Publisher: Elsevier BV
Date: 03-2018
DOI: 10.1053/J.AJKD.2017.09.018
Abstract: Many randomized controlled trials have been performed with the goal of improving outcomes related to hemodialysis vascular access. If the reported outcomes are relevant and measured consistently to allow comparison of interventions across trials, such trials can inform decision making. This study aimed to assess the scope and consistency of vascular access outcomes reported in contemporary hemodialysis trials. Systematic review. Adults requiring maintenance hemodialysis. All randomized controlled trials and trial protocols reporting vascular access outcomes identified from ClinicalTrials.gov, Embase, MEDLINE, and the Cochrane Kidney and Transplant Specialized Register from January 2011 to June 2016. Any hemodialysis-related intervention. The frequency and characteristics of vascular access outcome measures were analyzed and classified. From 168 relevant trials, 1,426 access-related outcome measures were extracted and classified into 23 different outcomes. The 3 most common outcomes were function (136 [81%] trials), infection (63 [38%]), and maturation (31 [18%]). Function was measured in 489 different ways, but most frequently reported as "mean access blood flow (mL/min)" (37 [27%] trials) and "number of thromboses" (30 [22%]). Infection was assessed in 136 different ways, with "number of access-related infections" being the most common measure. Maturation was assessed in 44 different ways at 15 different time points and most commonly characterized by vein diameter and blood flow. Patient-reported outcomes, including pain (19 [11%]) and quality of life (5 [3%]), were reported infrequently. Only a minority of trials used previously standardized outcome definitions. Restricted sampling frame for feasibility and focus on contemporary trials. The reporting of access outcomes in hemodialysis trials is very heterogeneous, with limited patient-reported outcomes and infrequent use of standardized outcome measures.
Efforts to standardize outcome reporting for vascular access are critical to optimizing the comparability, reliability, and value of trial evidence to improve outcomes for patients requiring hemodialysis.
Publisher: Wiley
Date: 08-2012
DOI: 10.5694/MJA11.11468
Abstract: Optimal detection and subsequent risk stratification of people with chronic kidney disease (CKD) requires simultaneous consideration of both kidney function (glomerular filtration rate [GFR]) and kidney damage (as indicated by albuminuria or proteinuria). Measurement of urinary albuminuria and proteinuria is hindered by a lack of standardisation regarding requesting, sample collection, reporting and interpretation of tests. A multidisciplinary working group was convened with the goal of developing and promoting recommendations that achieve consensus on these issues. The working group recommended that the preferred method for assessment of albuminuria in both diabetic and non-diabetic patients is urinary albumin-to-creatinine ratio (UACR) measurement in a first-void spot urine specimen. Where a first-void specimen is not possible or practical, a random spot urine specimen for UACR is acceptable. The working group recommended that adults with one or more risk factors for CKD should be assessed using UACR and estimated GFR every 1-2 years, depending on their risk-factor profile. Recommended testing algorithms and sex-specific cut-points for microalbuminuria and macroalbuminuria are provided. The working group recommended that all pathology laboratories in Australia should implement the relevant recommendations as a vital component of an integrated national approach to detection of CKD.
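The abstract refers to sex-specific UACR cut-points without listing them. As an illustration only: the microalbuminuria thresholds below (2.5 mg/mmol men, 3.5 mg/mmol women) match those used as kidney-damage definitions elsewhere on this page, while the macroalbuminuria thresholds (25/35 mg/mmol) are the commonly cited Australian values and are an assumption here, not taken from this abstract:

```python
def classify_uacr(uacr_mg_mmol, male):
    """Classify a urinary albumin-to-creatinine ratio (mg/mmol) using
    sex-specific cut-points. Micro thresholds (2.5 / 3.5) follow the
    kidney-damage definitions used on this page; macro thresholds
    (25 / 35) are assumed common Australian values."""
    micro, macro = (2.5, 25.0) if male else (3.5, 35.0)
    if uacr_mg_mmol < micro:
        return "normoalbuminuria"
    if uacr_mg_mmol < macro:
        return "microalbuminuria"
    return "macroalbuminuria"
```

Note that the same UACR (e.g. 3.0 mg/mmol) can fall into different categories for men and women, which is why the working group stresses sex-specific reporting.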
Publisher: Elsevier BV
Date: 07-2012
DOI: 10.1038/KI.2012.106
Abstract: In late 2009, transplant organizations recommended that kidney recipients be vaccinated for pandemic H1N1 influenza (pH1N1); however, the vaccine efficacy was unknown. We had offered a monovalent non-adjuvanted pH1N1 vaccine to transplant recipients. Here we compared the pre- and post-vaccination seroresponses of 151 transplant recipients to those of 71 hemodialysis patients and 30 healthy controls. Baseline seroprotection was similar between groups but was significantly different at 1 month (44%, 56%, and 87%, respectively). Seroconversion was significantly less common for transplant recipients (32%) than dialysis patients (45%) and healthy controls (77%). After adjusting for age and gender, dialysis patients were significantly more likely (2.7-fold) to achieve new seroprotection than transplant recipients. The likelihood of seroprotection in transplant recipients was significantly reduced by mycophenolate use (adjusted odds ratio 0.24), in a dose-dependent manner, and by reduced eGFR (adjusted odds ratio 0.16, worst vs best). Seroprotection and geometric mean antibody titers increased substantially in 49 transplant recipients who subsequently received the 2010 seasonal influenza vaccine. Thus, patients requiring renal replacement therapy had reduced seroresponses to vaccination with the monovalent vaccine compared with healthy controls. Transplant recipient responses were further reduced if they were receiving mycophenolate or had significantly lower graft function.
Publisher: Mary Ann Liebert Inc
Date: 02-2013
Publisher: Wiley
Date: 25-09-2012
DOI: 10.1111/SDI.12009
Abstract: Hypertension is highly prevalent yet poorly controlled in the majority of dialysis patients and represents a significant burden of disease, with rates of morbidity and mortality greater than those in the general population. In dialysis, blood volume plays a critical role in the pathogenesis of hypertension, with expansion of extracellular volume increasingly recognized as an independent risk factor for morbidity and mortality. Within the current paradigm of dialysis prescription, the majority of patients remain chronically volume expanded. However, management of blood pressure and volume state is difficult for clinicians, with a paucity of randomized evidence adding to the complexity of nonlinear morbidity and mortality associations. With dialysis itself a significant cardiac stressor, control of volume state is critical to minimize intradialytic hemodynamic instability, aid in preservation of cardiac anatomy and prevent progression to cardiovascular morbidity and mortality. This review explores the relationship of blood volume to blood pressure and potential targets for management in this at-risk population.
Publisher: Springer Science and Business Media LLC
Date: 09-11-2020
DOI: 10.1038/S41598-020-76379-6
Abstract: The evidence supporting an initial mycophenolate mofetil (MMF) dose of 2 g daily in tacrolimus-treated renal transplant recipients is limited. In a non-contemporaneous single-centre cohort study, we compared the incidence of leukopaenia, rejection and graft dysfunction in patients initiated on MMF 1.5 g and 2 g daily. Baseline characteristics and tacrolimus trough levels were similar by MMF group. MMF doses became equivalent between groups by 12 months post-transplant, driven by dose reductions in the 2 g group. Leukopaenia occurred in 42.4% of patients by 12 months post-transplant. MMF 2 g was associated with a 1.80-fold increased risk of leukopaenia compared to 1.5 g. Rejection occurred in 44.8% of patients by 12 months post-transplantation. MMF 2 g was associated with half the risk of rejection relative to MMF 1.5 g. Over the first 7 years post-transplantation there was no difference in renal function between groups. Additionally, the development of leukopaenia or rejection did not result in reduced renal function at 7 years post-transplant. Leukopaenia was not associated with an increased incidence of serious infections or rejection. This study demonstrates that the initial MMF dose has implications for the incidence of leukopaenia and rejection. Since neither dose produced superior long-term graft function, clinical equipoise remains regarding the optimal initial mycophenolate dose in tacrolimus-treated renal transplant recipients.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 04-2021
Abstract: Commencing hemodialysis (HD) with an arteriovenous access is associated with superior patient outcomes compared with a catheter, but the majority of patients in Australia and New Zealand initiate HD with a central venous catheter. This study examined patient and center factors associated with arteriovenous fistula/graft access use at HD commencement. We included all adult patients starting chronic HD in Australia and New Zealand between 2004 and 2015. Access type at HD initiation was analyzed using logistic regression. Patient-level factors included sex, age, race, body mass index (BMI), smoking status, primary kidney disease, late nephrologist referral, comorbidities, and prior RRT. Center-level factors included size; transplant capability; home HD proportion; incident peritoneal dialysis (average number of patients commencing RRT with peritoneal dialysis per year); mean weekly HD hours; average blood flow; and achievement of phosphate, hemoglobin, and weekly Kt/V targets. The study included 27,123 patients from 61 centers. Arteriovenous access use at HD commencement varied four-fold from 15% to 62% (median 39%) across centers. Incident arteriovenous access use was more likely in patients aged 51–72 years, males, and patients with a BMI of kg/m² and polycystic kidney disease, but use was less likely in patients with a BMI of .5 kg/m², late nephrologist referral, diabetes mellitus, cardiovascular disease, chronic lung disease, and prior RRT. Starting HD with an arteriovenous access was less likely in centers with the highest proportion of home HD, and no center factor was associated with higher arteriovenous access use.
Adjustment for center-level characteristics resulted in a 25% reduction in observed intercenter variability of arteriovenous access use at HD initiation compared with the model adjusted for only patient-level characteristics. This study identified several patient and center factors associated with incident HD access use, yet these factors did not fully explain the substantial variability in arteriovenous access use across centers.
Publisher: No publisher found
Date: 2007
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 11-2020
Abstract: An autologous arteriovenous fistula (AVF) is the preferred hemodialysis vascular access, but successful creation is hampered by high rates of AVF failure. This study aimed to evaluate patient and surgical factors associated with AVF failure to improve vascular access selection and outcomes. This is a post hoc analysis of all participants of FAVOURED, a multicenter, double-blind, multinational, randomized, placebo-controlled trial evaluating the effect of fish oil and/or aspirin in preventing AVF failure in patients receiving hemodialysis. The primary outcome of AVF failure was a composite of fistula thrombosis and/or abandonment and/or cannulation failure at 12 months post-AVF creation, and secondary outcomes included individual outcome components. Patient data (demographics, comorbidities, medications, and laboratory data) and surgical factors (surgical expertise, anesthetic, intraoperative heparin use) were examined using multivariable logistic regression analyses to evaluate associations with AVF failure. Of 536 participants, 253 patients (47%) experienced AVF failure during the study period. The mean age was 55 ± 14.4 years, 64% were male, 45% were diabetic, and 4% had peripheral vascular disease. Factors associated with AVF failure included female sex (odds ratio [OR], 1.79; 95% confidence interval [CI], 1.20 to 2.68), lower diastolic BP (OR for higher DBP, 0.85; 95% CI, 0.74 to 0.99), presence of a central venous catheter (OR, 1.49; 95% CI, 1.02 to 2.20; P = 0.04), and aspirin requirement (OR, 1.60; 95% CI, 1.00 to 2.56). Female sex, requirement for aspirin therapy, requiring hemodialysis via a central venous catheter, and lower diastolic BP were factors associated with higher odds of AVF failure. These associations have potential implications for vascular access planning and warrant further studies.
Publisher: Elsevier BV
Date: 10-2008
DOI: 10.1053/J.AJKD.2008.03.007
Abstract: Urine albumin assays by high-performance liquid chromatography (HPLC) yield greater values than immunoassays at lower albumin levels. We compared the predictive values of albumin-creatinine ratios (ACRs) by these 2 techniques for mortality in Aboriginal people. This was a longitudinal study of 741 adults in a remote Aboriginal community who participated in a baseline health survey between 1992 and 1998 at ages ranging from 18 to 84 years (mean, 34 years). All natural deaths were documented on follow-up until 2006. Urine albumin concentrations were measured simultaneously using both nephelometric and HPLC techniques on baseline urine samples retrieved from -70°C storage, creatinine concentrations were also measured, and ACRs were derived. Age- and sex-specific tertiles of ACR were compiled. Cox regression analyses were used to evaluate the predictive value of ACR for natural deaths, by ACR tertile and again by z score changes in ACRs as continuous variables. Participants were followed up for a median of 11 years, during which a total of 119 natural deaths were documented. ACRs on baseline urine samples were greater by HPLC than by immunoassay at lower ACR ranges, but were fairly concordant at levels greater than 100 mg/mmol. Levels of ACR by both techniques were strong predictors of death, but correlations of death with ACR tertiles and with ACR levels on a continuum were similar for the 2 techniques. The age- and sex-specific tertiles used might introduce some risk of bias in the assessment of predictive value. In addition, assays were performed on urine after more than 10 years of cold storage. Despite different absolute values, this study did not show that ACR level by either technique was superior in predicting deaths.
Publisher: Springer Science and Business Media LLC
Date: 09-09-2009
DOI: 10.1007/S00125-009-1525-2
Abstract: Patients with end-stage kidney disease (ESKD) and patients with diabetes mellitus experience higher mortality rates than the general population. Whether ESKD imparts the same excess in mortality risk for those with diabetes as it does for those without diabetes is unknown. Included in the study were all white patients aged ≥25 years with incident ESKD and type 2 diabetes (n = 4,141) or with incident ESKD but without diabetes (n = 13,289) in Australia from 1991 to 2005, and all individuals aged ≥25 years without ESKD and with type 2 diabetes (n = 909) or without ESKD and without diabetes (n = 10,302) enrolled in the AusDiab Study (a nationwide, representative Australian cohort) from 1999 to 2005. Excess mortality was analysed in patients with ESKD by diabetes status, using age-, sex- and diabetes-status-specific standardised mortality ratios (SMRs) in the first 8 years after first renal replacement therapy among ANZDATA patients relative to AusDiab participants. The SMRs in patients with ESKD were, in non-diabetic patients and in those with type 2 diabetes, respectively: 14.2 (95% CI 13.9-14.6) and 10.8 (95% CI 10.4-11.2) (p < 0.01) overall; in people aged <60 years, 28.7 (95% CI 27.2-30.4) and 18.6 (95% CI 17.1-20.4) (p < 0.01); in people aged ≥60 years, 12.5 (95% CI 12.1-12.9) vs 9.7 (95% CI 9.3-10.1) (p < 0.01); in men, 11.0 (95% CI 10.7-11.4) vs 8.9 (95% CI 8.4-9.3) (p < 0.01); and in women, 23.4 (95% CI 22.5-24.3) vs 16.2 (95% CI 15.2-17.3) (p < 0.01). ESKD was associated with a greater relative increase in mortality in the non-diabetic study populations than in the type 2 diabetes population. Excess mortality was greater among younger people and women.
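The SMRs above are a standard epidemiological quantity: observed deaths in the cohort divided by the deaths expected if reference-population mortality rates applied to the cohort's person-time. A minimal sketch (the strata and rates in the comment are invented for illustration, not taken from the study):

```python
def expected_deaths(person_years_by_stratum, ref_rates_by_stratum):
    """Expected deaths: sum over strata of person-years times the
    reference population's mortality rate for that stratum."""
    return sum(person_years * ref_rates_by_stratum[stratum]
               for stratum, person_years in person_years_by_stratum.items())

def smr(observed_deaths, expected):
    """Standardised mortality ratio = observed / expected deaths."""
    return observed_deaths / expected

# Hypothetical example: 100 person-years in an under-60 stratum
# (reference rate 0.01/yr) and 50 in a 60-plus stratum (0.04/yr)
# give 3.0 expected deaths; 30 observed deaths give an SMR of 10.
```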
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-02-2020
Publisher: Elsevier BV
Date: 10-2020
Publisher: Public Library of Science (PLoS)
Date: 05-01-2016
Publisher: Wiley
Date: 06-02-2007
DOI: 10.1111/J.1440-1797.2006.00756.X
Abstract: Peritonitis is a serious complication of peritoneal dialysis (PD) and a major cause of hospitalization, catheter loss, transfer to haemodialysis and death. Thus, it is important to identify risk factors for PD-related peritonitis in order to reduce the incidence and improve patient selection. This study is a prospective cohort review (1992-2003) with data comprising 12,844 patient-months, 506 PD patients and 623 episodes of peritonitis. Comorbidities and patient demographics were provided by the Australian and New Zealand Dialysis and Transplant Registry and these were merged with the hospital's combined clinical and microbiology laboratory peritonitis database. Variables identified to be associated with an increased likelihood of peritonitis were: age (per 10 years: OR, 1.26; 95% CI, 1.07-1.48), gender (female: OR, 1.91; 95% CI, 1.2-3.01), current smoker at entry to dialysis (OR, 1.71; 95% CI, 1.04-2.82) and the pre-twin-bag connection system (OR, 2.07; 95% CI, 1.22-3.52). Increasing age, female gender and smoking increased the risk of peritonitis. Identifying these risk factors will assist in the selection, training and monitoring of our PD population.
Publisher: Elsevier BV
Date: 03-2009
Publisher: Springer Science and Business Media LLC
Date: 23-02-2021
DOI: 10.1186/S12876-021-01655-2
Abstract: Risk indices such as the pancreas donor risk index (PDRI) and pre-procurement pancreas allocation suitability score (P-PASS) are utilised in solid pancreas transplantation; however, no review has compared all derived and validated indices in this field. We systematically reviewed all risk indices in solid pancreas transplantation to compare their predictive ability for transplant outcomes. Medline Plus, Embase and the Cochrane Library were searched for studies deriving and externally validating risk indices in solid pancreas transplantation for the outcomes of pancreas and patient survival and donor pancreas acceptance for transplantation. Results were analysed descriptively due to limited reporting of the discrimination and calibration metrics required to assess model performance. From 25 included studies, discrimination and calibration metrics were reported in only 88% and 38% of derivation studies (n = 8) and in 25% and 25% of external validation studies (n = 12), respectively. Twenty-one risk indices were derived, with mild to moderate ability to predict risk (C-statistics 0.52–0.78). Donor age, donor body mass index (BMI) and donor gender were the commonest covariates within derived risk indices. Only PDRI and P-PASS were subsequently externally validated, with variable association with post-transplant outcomes. P-PASS was not associated with pancreas graft survival. Most of the risk indices derived for use in solid pancreas transplantation were not externally validated (90%). PDRI and P-PASS are the only risk indices externally validated for solid pancreas transplantation, and when validated without reclassification measures, are associated with 1-year pancreas graft survival and donor pancreas acceptance respectively. Future risk indices incorporating recipient and other covariates alongside donor risk factors may have improved predictive ability for solid pancreas transplant outcomes.
Publisher: Elsevier BV
Date: 04-2014
DOI: 10.1016/J.DIABRES.2014.01.020
Abstract: To examine the relationship between average glucose (AG) and HbA1c in patients with and without chronic kidney disease (CKD) and type 2 diabetes. 43 patients with diabetes and CKD (stages 3-5), with stable glycaemic control and stable glucose-lowering and erythropoiesis-stimulating agent (ESA) doses, were prospectively studied for 3 months and compared to 104 age-matched controls with diabetes but without CKD from the ADAG study. Over 3 months, AG was calculated from 7-8-point self-monitored blood glucose measurements (SMBG) and from continuous glucose monitoring (CGMS), and mean HbA1c was calculated from 4 measurements. AG and HbA1c relationships were determined using multivariable linear regression analyses. The CKD and non-CKD groups were well matched for age and gender. Mean AG tended to be higher (p=0.08) but HbA1c levels were similar (p=0.68) in the CKD compared with non-CKD groups. A linear relationship between AG and HbA1c was observed irrespective of the presence and stage of CKD. The relationship was weaker in patients with stage 4-5 CKD (non-CKD R2=0.75, stage 3 CKD R2=0.79 and stage 4-5 CKD R2=0.34, all p<0.01). The inclusion of ESA use in the model rendered the effect of CKD stage insignificant (R2=0.67, p<0.01). In patients with type 2 diabetes and CKD there is a linear relationship between HbA1c and AG that is attenuated by ESA use, suggesting that ESA use results in a systematic underestimation of AG derived from HbA1c.
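The linear AG-HbA1c relationship referenced here is not given numerically in the abstract. For illustration, a sketch of the regression published by the ADAG study itself (eAG in mmol/L ≈ 1.59 × HbA1c − 2.59); the coefficients are from that publication, not from this abstract, and the study above suggests the mapping is less reliable in ESA-treated CKD patients:

```python
def estimated_average_glucose_mmol(hba1c_percent):
    """Estimated average glucose (mmol/L) from HbA1c (%), using the
    ADAG study's published linear regression. Note: ESA therapy in CKD
    attenuates this relationship, so the estimate may under-read AG."""
    return 1.59 * hba1c_percent - 2.59
```

For example, an HbA1c of 7.0% maps to an estimated average glucose of roughly 8.5 mmol/L in the general ADAG population.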
Publisher: Springer Science and Business Media LLC
Date: 18-05-2016
Publisher: Wiley
Date: 04-2008
DOI: 10.1111/J.1542-4758.2008.00262.X
Abstract: Cardiovascular (CV) disease is the most common cause of mortality in end-stage kidney disease (ESKD), and arterial stiffness, measured by pulse wave velocity (PWV), is an independent predictor of all-cause and CV mortality. B-type natriuretic peptide (BNP) levels are high in patients with CV disease and ESKD, and increases in BNP may also be a marker of CV risk. Regular exercise has many benefits on quality of life and physical strength and may also improve CV risk, but few studies have addressed the impact of exercise on CV risk in ESKD. We performed a prospective cross-over trial in 19 hemodialysis (HD) patients to assess the impact of regular exercise on surrogate markers of CV risk: arterial compliance and BNP levels. Exercise involved the use of a bicycle ergometer for a minimum of 30 min at each HD session for 3 months, with a 1-month washout period. Group A (n=9) exercised for the first 3 months only, while group B (n=10) performed no intradialytic exercise initially and exercised for 3 months at cross-over (month 4). Pulse wave velocity was measured using a SphygmoCor device, with concurrent measurements of BNP and other serum markers, at the commencement of the study, at 3 months, and on completion. The mean PWV (A: 10.4 ± 3.1 m/s, B: 9.8 ± 3.8 at baseline) showed a trend toward improvement with exercise (A: 8.7 ± 2.7, p=0.07), and no significant change without (B: 10.5 ± 3.6, p=0.31). After cross-over, there was an increase in PWV in group A with cessation of exercise (9.75 ± 2.4, p=0.01 vs. 3 months) and an improvement in group B with exercise (9.33 ± 2.3, p=0.11 vs. 3 months). When comparing PWV after 3 months of exercise vs. 3 months of no exercise (paired t test), there was a significant difference in favor of exercise (9.04 ± 0.59 vs. 10.16 ± 0.74, p=0.008). The mean BNP levels following 3 months of exercise were also lower than those after 3 months of no exercise (504.4 ± 101.2 vs. 809.4 ± 196.1 [N<100], p=0.047).
There was an overall improvement in PWV, and to a lesser extent BNP levels, with 3 months of intradialytic exercise compared with no exercise, suggesting that regular exercise in ESKD may be associated with improvements in arterial compliance and a reduction in CV risk.
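Carotid-femoral PWV, as measured by devices such as the SphygmoCor, reduces to arterial path length divided by pulse transit time; a minimal sketch:

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Pulse wave velocity (m/s): arterial path length between two
    measurement sites divided by the pulse transit time between them.
    Higher values indicate stiffer arteries."""
    return path_length_m / transit_time_s

# Hypothetical example: a 0.5 m carotid-femoral path travelled in
# 50 ms corresponds to a PWV of 10 m/s, in the range reported above.
```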
Publisher: Elsevier BV
Date: 02-2011
DOI: 10.1016/J.NUMECD.2009.08.010
Abstract: Physical inactivity is associated with cardiovascular risk; however, its relationship to chronic kidney disease is largely unknown. We examined the association between leisure-time physical activity and risk of chronic kidney disease in a prospective, population-based cohort of Australians aged ≥ 25 years (AusDiab). The baseline sample included 10,966 adults (4951 males and 6015 females). From this sample, 6318 participants with complete baseline and 5-year follow-up urinalysis and serum creatinine measurements formed the study population for longitudinal analysis. Self-reported leisure-time physical activity was measured using a validated, interviewer-administered questionnaire. Compared with sufficiently active individuals (≥ 150 min physical activity per week), those who were inactive (0 min/week) were more likely to have albuminuria at baseline (multivariate-adjusted OR=1.34, 95% CI 1.10-1.63). Inactivity (versus sufficient physical activity) was associated with increased age- and sex-adjusted odds of an estimated glomerular filtration rate <3rd percentile (OR=1.30, 95% CI 1.02-1.65), although this was not significant after multivariate adjustment (OR=1.17, 95% CI 0.91-1.50). Obese, inactive individuals were significantly more likely to have albuminuria at baseline (multivariate-adjusted OR=1.74, 95% CI 1.35-2.25), compared with sufficiently active, non-obese individuals. Baseline physical activity status was not significantly associated with longitudinal outcomes. Physical inactivity is cross-sectionally associated with albuminuria prevalence, particularly when combined with obesity. Future studies are needed to determine whether this association is causal and the importance of physical activity in CKD prevention.
Publisher: Elsevier BV
Date: 07-2018
DOI: 10.1053/J.AJKD.2017.11.017
Abstract: Arteriovenous access failure frequently occurs in people on hemodialysis and is associated with morbidity, mortality and large healthcare expenditures. Omega-3 polyunsaturated fatty acids (omega-3 PUFA) may improve access outcomes via pleiotropic effects on access maturation and function, but may cause bleeding complications. Systematic review with meta-analysis. Adults requiring hemodialysis via arteriovenous fistula or graft. Trials evaluating omega-3 PUFA for arteriovenous access outcomes identified by searches in CENTRAL, MEDLINE, and Embase to 24 January 2017. Omega-3 PUFA. Primary patency loss, dialysis suitability failure, access abandonment, interventions to maintain patency or assist maturation, bleeding, gastrointestinal side-effects, all-cause and cardiovascular mortality, hospitalization, and treatment adherence. Treatment effects were summarized as relative risks (RR) and 95% confidence intervals (CI). Evidence was assessed using GRADE. Five eligible trials (833 participants) with a median follow-up of 12 months compared peri-operative omega-3 PUFA supplementation with placebo. One trial (n=567) evaluated treatment for fistulae and four (n=266) for grafts. Omega-3 PUFA supplementation prevented primary patency loss with moderate certainty (761 participants, RR 0.81, CI 0.68-0.98). Low-quality evidence suggested that omega-3 PUFA may have had little or no effect on dialysis suitability failure (536 participants, RR 0.95, CI 0.73-1.23), access abandonment (732 participants, RR 0.78, CI 0.59-1.03), need for interventions (732 participants, RR 0.82, CI 0.64-1.04), or all-cause mortality (799 participants, RR 0.99, CI 0.51-1.92). Effects on bleeding risk (793 participants, RR 1.40, CI 0.78-2.49) and gastrointestinal side-effects (816 participants, RR 1.22, CI 0.64-2.34) were uncertain. There was no evidence of different treatment effects for grafts and fistulae. Limitations: small number and methodological limitations of included trials.
Omega-3 PUFA supplementation probably protects against primary loss of arteriovenous access patency, but may have little or no effect on dialysis suitability failure, access interventions or access abandonment. Potential treatment harms are uncertain.
Publisher: Elsevier BV
Date: 02-2019
Publisher: Wiley
Date: 22-07-2008
Publisher: Elsevier BV
Date: 12-2018
Publisher: Wiley
Date: 22-04-2016
DOI: 10.1111/NEP.12629
Abstract: Clinical outcomes of patients with end-stage kidney disease (ESKD) receiving renal replacement therapy (RRT) secondary to IgA nephropathy (IgAN) have not been well described. We aimed to investigate the characteristics, treatments and outcomes of ESKD due to kidney-limited IgAN and Henoch-Schönlein purpura nephritis (HSPN) in the Australian and New Zealand RRT populations. All ESKD patients who commenced RRT in Australia and New Zealand between 1971 and 2012 were included. Dialysis and transplant outcomes were evaluated in both a contemporary cohort (1998-2012) and the entire cohort (1971-2012). Of 63,297 ESKD patients, 3721 had kidney-limited IgAN, and 131 had HSPN. For the contemporary cohort of IgAN patients on dialysis (n = 2194), 10-year patient survival was 65%. Of 1368 contemporary IgAN patients who received their first renal allograft, 10-year patient, overall renal allograft and death-censored renal allograft survival were 93%, 82% and 88%, respectively. Using multivariable Cox regression analysis, patients with IgAN had favourable dialysis patient survival (adjusted hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.57-0.69), overall renal allograft survival (HR 0.67, 95% CI 0.57-0.79) and renal transplant patient survival (HR 0.58, 95% CI 0.45-0.74) compared with ESKD controls. Similar results were found in the entire cohort and when using competing-risks models. Compared with kidney-limited IgAN patients, those with HSPN had worse dialysis patient survival (HR 1.94, 95% CI 1.02-3.69), overall renal allograft survival (HR 3.40, 95% CI 1.00-11.55) and renal transplant patient survival (HR 3.50, 95% CI 1.03-11.92). IgAN ESKD was associated with better dialysis and renal transplant outcomes compared with other forms of ESKD. The survival outcomes of ESKD patients with HSPN were worse than those with kidney-limited IgAN.
Publisher: Oxford University Press (OUP)
Date: 19-07-2005
DOI: 10.1093/NDT/GFI007
Abstract: The degree to which transplant recipients are immunosuppressed influences their risks of rejection, infection and cancer. Current measures of immune suppression are crude (clinical events) or indirect (drug exposure). We assessed a direct measure of immune status, leukocyte phenotype and function (LPF, a composite measure of five aspects of peripheral blood leukocyte phenotype and function), as a predictor of infection. A double-blind, prospective, cohort study was conducted to determine the burden of infection in stable renal transplant recipients with moderate-severe (Group I, n = 34) or minimal (Group II, n = 36) impairment of LPF, a composite score of: (i) CD4 count; (ii) lymphocyte proliferation in response to phytohaemagglutinin A (PHA); (iii) serum Ig concentrations; (iv) neutrophil phagocytic function; and (v) reactive oxygen species generation. Subjects completed a 6-month diary and each recorded infection was scored 1-4: 1, minor undefined infection (e.g. URTI); 2, minor, microbiologically defined infection (e.g. UTI); 3, major defined infection (requiring hospitalization); 4, opportunistic infection (e.g. Herpes zoster). Final infection score was the sum of all infective episodes. Subjects were then followed up for 5 years for outcome measures. Groups were well matched for age, sex, diabetes, serum creatinine, rejection and trough cyclosporin concentrations. Group I (moderate to severe impairment of LPF) recorded a higher infection score, 2.4+/-2.8 vs 1.2+/-1.2 for Group II, P = 0.02, due to a higher incidence of moderate to severe infection. This relationship was confirmed by multivariate analysis (OR 1.83, 95% CI 1.08-3.11, P = 0.03 per unit increase in infection score). During the 5-year follow-up period, Group I had significantly more episodes of admission to hospital, and twice as many admissions due to infections, but no difference in malignancy, graft or patient outcome.
LPF testing prospectively identified a cohort who incurred a higher burden of infection. Further studies are required to determine the predictive value of LPF for acute rejection, infection and cancer, and to determine whether adjustments to therapy on the basis of LPF can lead to improved outcomes.
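The diary-based scoring described above (each episode graded 1-4, final score the sum of all episodes) is simple enough to sketch. This is an illustrative reconstruction of the scoring rule only, not the study's actual analysis code.

```python
# Severity grades from the study's 6-month diary:
# 1 = minor undefined (e.g. URTI), 2 = minor defined (e.g. UTI),
# 3 = major defined (requiring hospitalization), 4 = opportunistic (e.g. Herpes zoster).
def infection_score(episode_grades):
    """Final infection score: the sum of severity grades over all recorded episodes."""
    if any(g not in (1, 2, 3, 4) for g in episode_grades):
        raise ValueError("each episode must be graded 1-4")
    return sum(episode_grades)

# A patient recording one URTI, one UTI and one episode of Herpes zoster:
print(infection_score([1, 2, 4]))  # 7
```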
Publisher: Elsevier BV
Date: 11-2011
DOI: 10.1053/J.AJKD.2011.04.027
Abstract: There is a resurgence of interest in home hemodialysis (HD), especially frequent or extended forms involving unconventionally frequent (>3 times/wk) and/or long (>6 hours) treatments. This resurgence is driven by cost containment and experience suggesting lower mortality risk compared with facility HD and peritoneal dialysis (PD). We performed an observational cohort study using the Australia and New Zealand Dialysis and Transplant Registry, using marginal structural modeling to adjust for time-varying medical comorbidity as both a source of selection bias and an intermediary variable on the causal pathway to death. All adult patients starting renal replacement therapy in Australia and New Zealand since March 31, 1996, followed up to December 31, 2007. The main predictor was dialysis modality (conventional facility HD, conventional home HD, frequent/extended facility HD, frequent/extended home HD, and PD). We adjusted for the confounding effects of patient demographics and comorbid conditions. Patient mortality. We analyzed 26,016 patients with 856,007 patient-months of follow-up. Relative to conventional facility HD, adjusted mortality HRs were 0.51 (95% CI, 0.44-0.59) for conventional home HD, 1.16 (95% CI, 0.94-1.44) for frequent/extended facility HD, 0.53 (95% CI, 0.41-0.68) for frequent/extended home HD, and 1.10 (95% CI, 1.06-1.16) for PD. The apparent benefit of home HD on mortality risk was less for patients who were nonwhite, non-Asian, and older. Potential for residual confounding from the limited collection of comorbid conditions (no collection of cognitive or motor impairment, depression, left ventricular volume or structure, or blood pressure/fluid volume status) and lack of socioeconomic, medication, and biochemical data in analyses. Our study supports a survival advantage of home HD without a difference between conventional and frequent/extended modalities. 
Suitably designed clinical trials of frequent/extended HD are needed to determine the presence and extent of mortality benefit with this modality.
Publisher: Elsevier BV
Date: 07-2018
DOI: 10.1053/J.AJKD.2017.11.010
Abstract: Clinical trials are most informative for evidence-based decision making when they consistently measure and report outcomes of relevance to stakeholders. We aimed to assess the scope and consistency of outcomes reported in trials for hemodialysis. Systematic review. Adults requiring maintenance hemodialysis enrolled in clinical trials. All Cochrane systematic reviews of interventions published by August 29, 2016, and the trials published and registered in ClinicalTrials.gov since January 2011. Any hemodialysis-related interventions. Frequency and characteristics of the reported outcome domains and measures. From the 362 trials, we extracted and classified 10,713 outcome measures (a median of 21 [IQR, 10-39] per trial) into 81 different outcome domains, of which 42 (52%) were surrogate, 25 (31%) clinical, and 14 (17%) patient reported. The number of outcome measures reported significantly changed over time. The 5 most commonly reported domains were all surrogates: phosphate (125 [35%] trials), dialysis adequacy (120 [33%]), anemia (115 [32%]), inflammatory markers (114 [31%]), and calcium (109 [30%]). Mortality, cardiovascular diseases, and quality of life were reported very infrequently (73 [20%], 44 [12%], and 32 [9%], respectively). For feasibility, we used a sampling frame that included only trials identified in Cochrane systematic reviews or ClinicalTrials.gov. Outcomes reported in clinical trials involving adults receiving hemodialysis are focused on surrogate outcomes, rather than clinical and patient-centered outcomes. There is also extreme multiplicity and heterogeneity at every level: domain, measure, metric, and time point. Estimates of the comparative effectiveness of available interventions are unreliable and improvements over time have been inconsistent.
Publisher: Wiley
Date: 2011
DOI: 10.1111/J.1365-2559.2011.03742.X
Abstract: New onset of the clinical symptoms of immunoglobulin A (IgA) nephropathy (IgAN) manifests with proliferative glomerular lesions in children, whereas adults exhibit mesangial matrix expansion and interstitial fibrosis. Alternatively activated (M2) macrophages have been implicated in promoting tissue fibrosis in some settings. Therefore, the aim of this study was to investigate whether M2 macrophages are present in new-onset IgAN and if they are related to pathological differences between paediatric and adult disease. Biopsy specimens from paediatric (<12 years, n=15) and adult (n=27) IgAN showed a significant infiltrate of CD68(+) macrophages. M2 macrophages, identified by CD163 or CD204 expression, were detected in glomeruli and the interstitium, being more prominent in adults versus young children. CD163(+) and CD204(+) macrophages were present in areas of fibrosis containing myofibroblasts, and double staining showed that CD163(+) cells produced the profibrotic molecule, connective tissue growth factor. In young children, total CD68(+) macrophages, but not M2 macrophages, correlated with glomerular hypercellularity. In contrast, in adults and older children, mesangial matrix expansion correlated with M2 macrophages but not with the total CD68(+) macrophage infiltrate. Alternatively activated M2 macrophages are present in new-onset paediatric and adult IgAN, and this population may promote the development of fibrotic lesions.
Publisher: SAGE Publications
Date: 30-05-2019
Abstract: The creation and maintenance of dialysis vascular access is associated with significant morbidity. Structured management pathways can reduce this morbidity, yet practice patterns in Australia and New Zealand are not known. We aimed to describe the arteriovenous access practices in dialysis units in Australia and New Zealand. An online survey comprising 51 questions was completed by representatives from dialysis units from both countries. In addition to descriptive analysis, responses were compared between units inside and outside of major cities. Of 64 contacted units, 48 (75%) responded (Australia 43, New Zealand 5), representing 38% of dialysis units in Australia and New Zealand. While 94% of units provided pre-dialysis education, only 60% reported a structured pre-dialysis pathway and 69% had a dedicated vascular access nurse. Most units routinely monitored fistula/graft function using flow rate measurement (73%) or recirculation studies (63%). A minority used routine ultrasound (35%). Thrombectomy, fistuloplasty and peritoneal dialysis catheter insertion were rarely performed by nephrologists (4%, 4% and 17% of units, respectively). Units outside of a major city were less likely to have access to a local vascular access surgeon (6/13 (46%) vs 35/35 (100%), P < 0.001). There were no other significant differences between units on the basis of location. Much variation exists in unit management of arteriovenous access. Structured pre-dialysis pathways and dedicated vascular access nurses may be underutilised in Australia and New Zealand. The use of regular access blood flow measurement and ultrasound is common in both countries despite a lack of data supporting its effectiveness. There is room for both practice improvement and a need for further evidence to ensure optimal arteriovenous access care.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2019
Publisher: Springer Science and Business Media LLC
Date: 30-05-2023
DOI: 10.1186/S13063-023-07363-4
Abstract: An increasing number of older people are living with chronic kidney disease (CKD). Many have complex healthcare needs and are at risk of deteriorating health and functional status, which can adversely affect their quality of life. Comprehensive geriatric assessment (CGA) is an effective intervention to improve survival and independence of older people, but its clinical utility and cost-effectiveness in frail older people living with CKD is unknown. The GOAL Trial is a pragmatic, multi-centre, open-label, superiority, cluster randomised controlled trial developed by consumers, clinicians, and researchers. It has a two-arm design, CGA compared with standard care, with 1:1 allocation of a total of 16 clusters. Within each cluster, study participants ≥ 65 years of age (or ≥ 55 years if Aboriginal or Torres Strait Islander (First Nations Australians)) with CKD stage 3–5/5D who are frail, measured by a Frailty Index (FI) > 0.25, are recruited. Participants in intervention clusters receive a CGA by a geriatrician to identify medical, social, and functional needs, optimise medication prescribing, and arrange multidisciplinary referral if required. Those in standard care clusters receive usual care. The primary outcome is attainment of self-identified goals assessed by standardised Goal Attainment Scaling (GAS) at 3 months. Secondary outcomes include GAS at 6 and 12 months, quality of life (EQ-5D-5L), frailty (Frailty Index – Short Form), transfer to residential aged care facilities, cost-effectiveness, and safety (cause-specific hospitalisations, mortality). A process evaluation will be conducted in parallel with the trial including whether the intervention was delivered as intended, any issues or local barriers to intervention delivery, and perceptions of the intervention by participants. The trial has 90% power to detect a clinically meaningful mean difference in GAS of 10 units. This trial addresses patient-prioritised outcomes. 
It will be conducted, disseminated and implemented by clinicians and researchers in partnership with consumers. If CGA is found to have clinical and cost-effectiveness for frail older people with CKD, the intervention framework could be embedded into routine clinical practice. The implementation of the trial’s findings will be supported by presentations at conferences and forums with clinicians and consumers at specifically convened workshops, to enable rapid adoption into practice and policy for both nephrology and geriatric disciplines. It has potential to materially advance patient-centred care and improve clinical and patient-reported outcomes (including quality of life) for frail older people living with CKD. ClinicalTrials.gov NCT04538157. Registered on 3 September 2020.
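The frailty eligibility criterion above is based on a Frailty Index, which in the standard deficit-accumulation (Rockwood) formulation is the fraction of assessed health deficits present. A minimal sketch of that calculation follows; the trial's exact deficit list is not specified here, so the 40-item example is purely illustrative.

```python
def frailty_index(deficits):
    """Deficit-accumulation frailty index: the proportion of assessed
    health deficits present, each deficit scored 0 (absent) to 1 (present)."""
    if not deficits:
        raise ValueError("at least one deficit must be assessed")
    return sum(deficits) / len(deficits)

# Hypothetical participant: 12 of 40 assessed deficits present -> FI = 0.30,
# above the 0.25 frailty threshold used by the trial.
fi = frailty_index([1] * 12 + [0] * 28)
print(f"FI = {fi:.2f}, frail: {fi > 0.25}")  # FI = 0.30, frail: True
```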
Publisher: Wiley
Date: 21-05-2021
Abstract: Although antineutrophil cytoplasmic antibody‐associated vasculitis (AAV) most commonly affects older individuals, many patients develop the disease during their most productive working years. The aim of this study was to examine the effects of AAV on employment and work disability in a cohort of Australian patients of working age. Patients attending a vasculitis clinic located in Melbourne, Australia, completed an employment questionnaire in addition to the Work Productivity and Activity Impairment Questionnaire: Specific Health Problem. The average age of the 47 respondents was 47.8 ± 11.9 years (range 22‐63 years), with a median disease duration of 60 months (range 10.2‐318.5 months). Overall, 68.1% were currently employed, but 20.6% of respondents employed at the time of diagnosis were no longer working and 10.6% had experienced a significant reduction in work hours since their diagnosis. Dependence on the disability support pension was reported by 12.7%. The rate of work disability was 23.4%. Many participants considered themselves work impaired (41.9%), with 10.1% having missed work in the previous week. Furthermore, 44.7% of respondents reported that their financial stability had been negatively impacted by their vasculitis diagnosis. Fatigue was commonly reported. Work disabled patients were significantly more likely to be obese and less likely to have completed a tertiary education. Work disabled patients tended to be older, myeloperoxidase‐antineutrophil cytoplasmic antibody positive, and have renal involvement and lung involvement. A proportion of people living with AAV in Australia experience a decline in employment and an increase in work disability when living with this condition.
Publisher: Wiley
Date: 02-2009
DOI: 10.1111/J.1440-1797.2008.01056.X
Abstract: Cardiovascular disease in dialysis patients is associated with increased vascular calcification (VC) and arterial stiffness, both inversely correlated with bone mineral density (BMD). Few studies have correlated VC in the dialysis population with measurements of BMD and arterial compliance. We report cross-sectional data on 45 haemodialysis (HD) patients assessing the prevalence of VC and its associations. Patients had computed tomography scans through abdominal aorta and superficial femoral arteries (SFA) to determine VC, pulse wave velocity (PWV) using SphygmoCor device measuring arterial stiffness, and dual-energy X-ray absorptiometry (DXA) to determine BMD. Patients, 64% male, 38% diabetic, had median age 58 years. Mean PWV was 8.7 +/- 3.5 m/s and median aortic VC score 488.1 +/- 298 Hounsfield units, with 91% having aortic VC present. In univariate linear regression analysis, aortic VC correlated positively with length of HD (P = 0.03) and diabetes (P = 0.06). Increasing PWV was positively associated with age (P = 0.001), diabetes (P = 0.05) and VC (aortic P = 0.08, SFA P = 0.01). In multivariate regression analysis, length of HD and diabetes were significantly associated with aortic VC, whereas age and diabetes were associated with SFA VC and PWV. Mean lumbar spine and femoral neck T-scores on DXA were 0.14 and -1.66 respectively. Increased VC and reduced arterial compliance, both closely related, are common in Australian HD patients. Both are associated with diabetes and increasing age, and greater aortic VC is seen with longer duration of dialysis.
Publisher: Elsevier BV
Date: 07-2015
Publisher: Elsevier BV
Date: 12-2011
Publisher: Wiley
Date: 11-2012
DOI: 10.1111/SDI.12037
Publisher: Oxford University Press (OUP)
Date: 02-09-2007
DOI: 10.1093/NDT/GFL621
Abstract: Vascular calcification (VC), precipitated by calcium and phosphate imbalance, is a major contributor to cardiovascular disease (CVD) in chronic kidney disease (CKD). Electron-beam computed tomography (EBCT) quantitatively assesses coronary artery calcification (CAC), with VC scores predictive of atherosclerosis and cardiac events in the general and CKD population. EBCT is not readily available but spiral CT can also provide quantitative assessment of the extent of VC. CT fistulograms can be used as initial investigation for arterio-venous fistula (AVF) problems in haemodialysis (HD). The images obtained include thoracic aorta, brachio-cephalic, subclavian and common carotid arteries which allow assessment of the extent of VC in these vessels. No study to date has combined the CT fistulogram with concurrent determination of VC. We hypothesize that a single investigation for AVF management may also provide information on VC. We retrospectively analysed CT fistulograms on 28 HD patients determining VC scores (in Hounsfield units) in AVF, subclavian and carotid arteries and aorta. We correlated these scores with patient demographics, serum markers of mineral metabolism (time averaged for the period 6 months prior to CT) and calcium-based phosphate binders. Patients (60.7% male) had a median age of 59 years and 46.4% were diabetic. The mean duration of dialysis was 17.5 months. CT fistulograms showed predominantly aortic (75% of patients) and subclavian (75%) calcifications, with only 21.4% having carotid VC and minimal VC at the level of AVF. Median VC scores were 619.8 (0-1481.4) for aorta and 521.7 (0-1139.6) for subclavian (scores of >400 indicate severe atherosclerotic disease), but there was no significant correlation with serum markers or duration of HD. Increasing age correlated significantly with greater VC in aortic (R = 0.53, P = 0.003) and subclavian (R = 0.40, P = 0.03) vessels, as well as with the number of VC sites involved. 
CAC was present in most patients (89.3%) but CAC scores were not able to be determined because of cardiac movement. Concurrent determination of the degree of calcification in certain vessels may be possible from CT studies assessing AVF structure. VC scores provided by CT fistulograms could contribute to HD patient CVD risk assessment but studies with larger patient numbers are required to determine their relevance.
Publisher: Springer Science and Business Media LLC
Date: 12-2018
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 08-2020
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 31-03-2022
Abstract: Mild albuminuria was associated with worse baseline cognitive function, cognitive decline, and increased risk for incident dementia. Screening cognitive tests for older persons with a urine albumin-creatinine ratio ≥3 mg/mmol could identify those at elevated risk of cognitive decline and dementia. CKD is a risk factor for cognitive impairment (CI), but reports of individual associations of eGFR and albuminuria with CI and incident dementia in healthier, older, longitudinal populations are lacking. Our goal was to estimate these associations in a large cohort of older healthy persons. In a longitudinal cohort study of older persons without prior cardiovascular disease, we estimated the associations between baseline eGFR (in ml/min per 1.73 m²) and albuminuria, measured as urine albumin-creatinine ratio (UACR in mg/mmol), and cognitive test scores, declines in cognitive test scores, and incident dementia using adjusted linear and linear mixed models. Cox proportional hazards regression models assessed the association between baseline kidney function and incident CI, no dementia (CIND) or dementia at a median of 4.7 years. At baseline, among 18,131 participants, median age was 74 years, eGFR was 74 (IQR, 63–84) ml/min per 1.73 m², UACR was 0.8 (IQR, 0.5–1.5) mg/mmol (7.1 [4.4–13.3] mg/g), and 56% were female. Baseline eGFR was not associated with performance on any cognitive tests in cross-sectional analysis, nor with incident CIND or dementia over a median follow-up of 4.7 years. However, baseline UACR ≥3 mg/mmol (≥26.6 mg/g) was significantly associated with lower baseline scores and larger declines on the Modified Mini-Mental State Exam, verbal memory and processing speed tests, and with incident CIND (hazard ratio [HR], 1.19; 95% CI, 1.07 to 1.33) and dementia (HR, 1.32; 95% CI, 1.06 to 1.66). Mild albuminuria was associated with worse baseline cognitive function, cognitive decline, and increased risk for incident CIND and dementia. 
Screening global cognitive tests for older persons with UACR ≥ 3 mg/mmol could identify those at elevated risk of cognitive decline and dementia.
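The dual UACR units quoted above (mg/mmol and mg/g) are related by the molar mass of creatinine (about 113.12 g/mol), so 1 mg/mmol ≈ 8.84 mg/g. A sketch of the conversion follows; rounding in the abstract's figures may differ slightly from this constant.

```python
CREATININE_G_PER_MMOL = 113.12 / 1000  # creatinine molar mass ~113.12 g/mol

def uacr_mg_per_mmol_to_mg_per_g(uacr):
    """Convert a urine albumin-creatinine ratio from mg/mmol to mg/g."""
    return uacr / CREATININE_G_PER_MMOL

print(round(uacr_mg_per_mmol_to_mg_per_g(0.8), 1))  # 7.1 (the median UACR)
print(round(uacr_mg_per_mmol_to_mg_per_g(3.0), 1))  # ~26.5 (the >=3 mg/mmol cutoff)
```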
Publisher: Springer Science and Business Media LLC
Date: 23-10-2012
Abstract: Dialysis water can be contaminated by chemical and microbiological factors, all of which are potentially hazardous to patients on haemodialysis. The quality of dialysis water has seen incremental improvements over the years, with advances in water preparation, monitoring and disinfection methods, and high standards are now readily achievable in clinical practice. Advances in dialysis membrane technology have refocused attention on water quality and its potential role in the bioincompatibility of haemodialysis circuits and adverse patient outcomes. The role of ultrapure dialysate is increasingly being advocated, given its proposed clinical benefits and relative ease of production as a result of the widespread use of reverse osmosis and ultrafiltration. Many of the issues pertaining to water quality in hospital-based dialysis units are also pertinent to haemodialysis in the home. Furthermore, an increased awareness of the environmental and financial consequences of home haemodialysis has resulted in the development of automated and more efficient dialysis machines. These new machines have an increased emphasis on water conservation and recycling along with a decreased need for a complex infrastructure for water purification and maintenance.
Publisher: Elsevier BV
Date: 06-2019
Publisher: Wiley
Date: 07-2008
DOI: 10.1111/J.1542-4758.2008.00288.X
Abstract: Home hemodialysis (HD) in Australia represents 11% of the dialysis population. This percentage has declined over the last 20 years but the absolute number of home HD patients has increased since 2001. The major reason for this resurgence has been the institution of nocturnal HD at home. Predominantly, this has been as a strictly alternate day exercise, although 5-6 times per week dialysis is also practised. Short-daily HD is uncommon in Australia. Nocturnal HD now comprises 30% or more of all home HD. Most home HD in Australia is practiced without remote monitoring, using simple machines with separate reverse osmosis units. Patients tend to self-needle and not all have a "partner." The enthusiasm for nocturnal HD in particular has been fuelled by ANZDATA Registry data demonstrating a survival advantage for patients dialyzing alternate days compared with 3 times per week and for patients dialyzing for >18 hours per week compared with 12 or 15 hours per week.
Publisher: Public Library of Science (PLoS)
Date: 07-05-2014
Publisher: Elsevier BV
Date: 10-2011
Publisher: Oxford University Press (OUP)
Date: 14-03-2020
DOI: 10.1093/NDT/GFAA031
Abstract: Quality-of-life is an essential outcome for clinical care. Both chronic kidney disease (CKD) and diabetes have been associated with poorer quality-of-life. The combined impact of having both diseases is less well understood. As diabetes is the most common cause of CKD, it is imperative that we deepen our understanding of their joint impact. This was a prospective, longitudinal cohort study of community-based Australians aged ≥25 years who participated in the Australian Diabetes, Obesity and Lifestyle study. Quality-of-life was measured by physical component summary (PCS) and mental component summary sub-scores of the Short Form (36) Health Survey. Univariate and multivariate linear mixed effect regressions were performed. Of the 11 081 participants with quality-of-life measurements at baseline, 1112 had CKD, 1001 had diabetes and of these 271 had both. Of the 1112 with CKD, 421 had Stage 1, 314 had Stage 2, 346 had Stage 3 and 31 had Stages 4/5. Adjusted linear mixed effect models showed baseline PCS was lower for those with both CKD and diabetes compared with either disease alone (P < 0.001). Longitudinal analysis demonstrated a more rapid decline in PCS in those with both diseases. The combination of CKD and diabetes has a powerful adverse impact on quality-of-life, and participants with both diseases had significantly poorer quality-of-life than those with one condition.
Publisher: Elsevier BV
Date: 02-2007
DOI: 10.1053/J.AJKD.2006.11.034
Abstract: Anemia is prevalent among kidney transplant recipients and likely contributes to mortality and morbidity. Prevalence of anemia is associated strongly with degree of kidney graft dysfunction; however, it remains unclear whether additional transplant-associated factors also contribute. The aim of this study is to compare the prevalence of anemia between 2 cohorts, 1 of kidney transplant recipients (n = 851) and another from the general population (n = 732), sourced from subjects of the AusDiab study and selected by means of propensity score to provide a cohort matched for kidney function (Cockcroft-Gault creatinine clearance). Average hemoglobin level in kidney transplant recipients was 13.1 g/dL [131 g/L] (range, 9.0 to 18.0 g/dL), significantly less than in the general population (14.3 g/dL [143 g/L]; range, 9.7 to 20.0 g/dL). The prevalence of anemia (hemoglobin < 12.0 g/dL [<120 g/L] for females; <12.5 g/dL [<125 g/L] for males) was almost 10-fold greater in kidney transplant recipients (30.8%) versus the general population (3.4%). Average hemoglobin level was lower in the kidney-transplant-recipient cohort at all levels of creatinine clearance. Considering both cohorts pooled, multivariate analysis showed that transplant status had the strongest association with anemia, followed by sex, creatinine clearance, and age. Posttransplantation anemia cannot be attributed solely to impaired kidney function.
Publisher: Springer Science and Business Media LLC
Date: 19-08-2015
Publisher: Wiley
Date: 29-07-2018
DOI: 10.1111/NEP.13232
Abstract: Diabetes and chronic kidney disease (CKD) are two of the most prevalent co-morbid chronic diseases in Australia. The increasing complexity of multi-morbidity, and current gaps in health-care delivery for people with co-morbid diabetes and CKD, emphasize the need for better models of care for this population. Previously proposed published models of care for co-morbid diabetes and CKD have not been co-designed with stakeholders or formally evaluated. Particular components of health-care shown to be effective in this population are interventions that: are structured, intensive and multifaceted (treating diabetes and multiple cardiovascular risk factors); involve multiple medical disciplines; improve self-management by the patient; and upskill primary health-care. Here we present an integrated patient-centred model of health-care delivery incorporating these components and co-designed with key stakeholders including specialist health professionals, general practitioners and Diabetes and Kidney Health Australia. The development of the model of care was informed by focus groups of patients and health professionals and semi-structured interviews of care-givers and health professionals. Other distinctive features of this model of care are routine screening for psychological morbidity; patient support through a phone advice line; and focused primary health-care support in the management of diabetes and CKD. Additionally, the model of care integrates with the patient-centred health-care home currently being rolled out by the Australian Department of Health. This model of care will be evaluated after implementation across two tertiary health services and their primary care catchment areas.
Publisher: FapUNIFESP (SciELO)
Date: 03-2020
Publisher: Elsevier BV
Date: 07-2011
DOI: 10.1053/J.AJKD.2010.12.026
Abstract: Urine dipsticks, an inexpensive, accessible test for proteinuria, are widely advocated for mass screening; however, their diagnostic accuracy in the general community is largely unknown. Evaluation of diagnostic test accuracy in a cross-sectional cohort. AusDiab, a representative survey of Australian adults 25 years and older (conducted in 1999/2000). Stratified cluster random sampling from 11,247 individuals participating in the biomedical examination; complete urinalysis data were available for 10,944. Urine dipsticks (Bayer Multistix), with a positive result defined as ≥1+ or trace or higher protein. Albumin-creatinine ratio (ACR), measured on a random spot urine sample. Reference test positivity was defined as ACR ≥30 mg/g or ACR ≥300 mg/g. Numbers of participants with ACR <30, 30-300, and ≥300 mg/g were 10,219 (93.4%), 634 (5.8%), and 91 (0.8%), respectively. The area under the receiver operating characteristic curve (AUC) for dipstick detection of ACR ≥30 mg/g was 0.8451 ± 0.0129 (SE) in men and 0.7775 ± 0.0131 in women (P < 0.001). The AUC for dipstick detection of ACR ≥300 mg/g was 0.9904 ± 0.0030 in men and 0.9950 ± 0.0016 in women (P = 0.02). Dipstick result ≥1+ identified ACR ≥30 mg/g with 57.8% sensitivity (95% CI, 54.1%-61.4%) and 95.4% specificity (95% CI, 95.0%-95.8%) and identified ACR ≥300 mg/g with 98.9% sensitivity (99% CI, 92.1%-100%) and 92.6% specificity (99% CI, 92.0%-93.3%). A dipstick result of trace or higher identified ACR ≥30 mg/g with 69.4% sensitivity (95% CI, 65.9%-72.7%) and 86.8% specificity (95% CI, 86.1%-87.4%) and identified ACR ≥300 mg/g with 100% sensitivity (99% CI, 94.3%-100%) and 83.7% specificity (99% CI, 82.8%-84.6%). A negative dipstick result (less than trace) had a negative predictive value of 97.6% (95% CI, 97.2%-97.9%) for ACR ≥30 mg/g and a negative predictive value of 100% (99% CI, 99.9%-100%) for ACR ≥300 mg/g. 
The probability of an ACR ≥30 mg/g confirmed on laboratory investigation was 47.2% (95% CI, 43.9%-50.5%) based on a dipstick result ≥1+ and 27.1% (95% CI, 25.1%-29.2%) based on a trace or higher result. Isolated urine samples precluded assessment of test reproducibility. Urine specific gravity and pH were not recorded; therefore, the effect of urine concentration on test performance was not assessed. A dipstick test result <1+ or less than trace has a high negative predictive value in the general community setting, with minimal risk of a missed diagnosis of macroalbuminuria. High false-positive rates emphasize the need for laboratory confirmation of positive results.
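The reported 47.2% probability of laboratory-confirmed ACR ≥30 mg/g after a ≥1+ dipstick is the positive predictive value implied by the study's sensitivity, specificity and prevalence. A sketch via Bayes' rule, using figures taken from the abstract (the study itself reports a confidence interval this simple calculation does not reproduce):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test) via Bayes' rule."""
    true_pos = sensitivity * prevalence        # diseased and test-positive
    false_pos = (1 - specificity) * (1 - prevalence)  # healthy but test-positive
    return true_pos / (true_pos + false_pos)

# AusDiab: 634 + 91 of 10,944 participants had ACR >= 30 mg/g
prevalence = (634 + 91) / 10944
ppv = positive_predictive_value(0.578, 0.954, prevalence)
print(f"{ppv:.1%}")  # 47.1%, close to the reported 47.2%
```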
Publisher: Massachusetts Medical Society
Date: 05-09-2013
Publisher: Wiley
Date: 13-07-2017
DOI: 10.1111/CTR.13037
Abstract: Chronic antibody-mediated rejection (cAMR) is the major cause of premature renal allograft loss and is resistant to therapy, with 12-month graft failure of up to 50% reported. We examined the duration of graft survival and associates of graft failure in patients with donor-specific antibody-positive cAMR and treatment-resistant peritubular capillaritis between June 2007 and October 2010. Those with advanced interstitial fibrosis (n=5) were excluded. Included patients (n=24) received treatment with high-dose intravenous immunoglobulin and fixed-dose rituximab (500 mg). Compared with previous reports, the study group experienced prolonged graft survival (median 82.1 months). Graft loss was predicted by eGFR and degree of proteinuria at diagnosis but not by donor-specific HLA antibody class or intensity, nor individual or summed Banff scores. Allograft biopsies were further examined for infiltrating leukocyte subtypes and location; high numbers of glomerular leukocytes, particularly macrophages, were independently associated with an increased risk of graft failure. This study suggests that patients with cAMR and persistent microcirculatory inflammation, excluding those with advanced histological damage, can expect prolonged graft survival when treated with IVIg and rituximab. Trial level evidence is required to validate this observation. Further examination of the role of macrophages in cAMR is warranted.
Publisher: Elsevier BV
Date: 10-2020
Publisher: Wiley
Date: 05-12-2009
Publisher: Elsevier BV
Date: 06-2011
DOI: 10.1053/J.AJKD.2010.12.020
Abstract: Current clinical practice guidelines recommend a native arteriovenous fistula (AVF) as the vascular access of first choice. Despite this, most patients in western countries start hemodialysis therapy using a catheter. Little is known regarding specific physician and system characteristics that may be responsible for delays in permanent access creation. Multicenter cohort study using mixed-methods qualitative and quantitative analysis. 9 nephrology centers in Australia and New Zealand, including 319 adult incident hemodialysis patients. Identification of barriers and enablers to AVF placement. Type of vascular access used at the start of hemodialysis therapy. Prospective data collection included data concerning predialysis education, interviews of center staff, referral times, and estimated glomerular filtration rate (eGFR) at AVF creation and dialysis therapy start. 319 patients started hemodialysis therapy during the 6-month period, 39% with an AVF and 59% with a catheter. Perceived barriers to access creation included lack of formal policies for patient referral, long wait times for surgical review and access placement, and lack of a patient database for management purposes. eGFR thresholds at referral for and creation of vascular accesses were considerably lower than generally appreciated (in both cases, median eGFR of 7 mL/min/1.73 m(2)), with median wait times for access creation of only 3.7 weeks. First assessment by a nephrologist less than 12 months before dialysis therapy start was an independent predictor of catheter use (OR, 8.71; P < 0.001). Characteristics of the best performing centers included the presence of a formalized predialysis pathway with a centralized patient database and low nephrologist and surgeon to patient ratios. A limited number of patient-based barriers was assessed. Cross-sectional data only. 
A formalized predialysis pathway including patient education and eGFR thresholds for access placement is associated with improved permanent vascular access placement.
Publisher: Elsevier BV
Date: 2019
DOI: 10.1016/J.JDIACOMP.2018.09.020
Abstract: In patients with comorbid diabetes and chronic kidney disease, the extent to which patient-reported barriers to health care and patient-reported outcomes influence the quality of health care is not well established. This study explored the association between patient-reported barriers to health care, patient activation, quality of life and diabetes self-care, with attainment of glycaemic and blood pressure (BP) targets. This cross-sectional study recruited adults with diabetes and CKD (eGFR 20 to <60 mL/min/1.73 m²). 199 patients, mean age 68.7 (SD 9.6) years, 70.4% male and 90.0% with type 2 diabetes, were studied. Poor glycaemic control was associated with increased odds of patient-reported "poor family support" (OR 4.90, 95% CI 1.80 to 13.32, p < 0.002). Poor BP control was associated with increased odds of patient-reported "not having a good primary care physician" (OR 6.01, 95% CI 2.42 to 14.95, p < 0.05). Specific patient-reported barriers, namely lack of patient-perceived family and primary care physician support, are associated with increased odds of poor glycaemic and blood pressure control respectively. Interventions addressing these barriers may improve treatment target attainment.
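As a worked illustration of how reported odds ratios like these relate to the underlying logistic-regression coefficients, the standard epidemiological arithmetic (not code from the study) recovers the standard error of log(OR) from a published 95% CI and reproduces the interval:

```python
import math

# Sketch: relate a reported odds ratio and 95% CI to the log-odds scale.
# Uses the paper's figures (OR 4.90, 95% CI 1.80-13.32) as worked arithmetic.

def se_from_ci(lo, hi, z=1.96):
    """Approximate standard error of log(OR) from a reported 95% CI."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def ci_from_beta(beta, se, z=1.96):
    """95% CI for the odds ratio given the log-odds coefficient and its SE."""
    return math.exp(beta - z * se), math.exp(beta + z * se)

se = se_from_ci(1.80, 13.32)              # ~0.51 on the log-odds scale
lo, hi = ci_from_beta(math.log(4.90), se)
print(round(se, 2), round(lo, 2), round(hi, 2))  # 0.51 1.8 13.33
```

The recovered interval matches the published one to within rounding, reflecting that Wald confidence intervals for odds ratios are symmetric on the log scale.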
Publisher: Wiley
Date: 12-2008
DOI: 10.1111/J.1440-1797.2008.00982.X
Abstract: Cardiovascular diseases (CVD) are the major cause of morbidity and mortality in end-stage renal failure (ESRF). Establishing whether traditional risk factors are valid predictors of CVD in ESRF is important in order to devise preventive and interventional strategies for the ESRF populations. In this retrospective cohort study, a cohort of patients on dialysis were examined between September 2000 and February 2001. Only those without previous CVD events at baseline were included. For each individual, 5 year CVD risk was calculated using the New Zealand 5 year CVD risk prediction charts based on the Framingham Heart Study prognostic algorithm. The subsequent 5 year CVD outcome for each patient was determined and the observed rate of first CVD events was compared to the predicted risk. Relation of individual risk factors with the CVD outcome was also assessed. Of the patients, 274 were without previous CVD events at baseline and 27% experienced CVD events during the subsequent 5 years. Observed CVD risk was more than twofold that of predicted risk although there was a linear correlation between the two. Among individual risk factors, increasing age, diabetes and smoking were significantly related to the incidence of the CVD events but, unlike in the general population, systolic blood pressure, total cholesterol/high-density lipoprotein ratio and body mass index were not significantly related to CVD events. The very high incidence of CVD in ESRF patients suggests that non-traditional risk factors present in the uraemic state are independent risk factors for CVD in ESRF patients. Nevertheless, the application of traditional cardiovascular risk profiles does allow risk stratification of the ESRF population.
Publisher: Elsevier BV
Date: 02-2019
Publisher: BMJ
Date: 10-2017
DOI: 10.1136/BMJOPEN-2017-017695
Abstract: To evaluate the extent of patient activation and factors associated with activation in adults with comorbid diabetes and chronic kidney disease (CKD). A cross-sectional study. Renal/diabetes clinics of four tertiary hospitals across the two largest states of Australia. Adult patients (over 18 years) with comorbid diabetes and CKD (estimated glomerular filtration rate <60 mL/min/1.73 m²). Patients completed the Patient Activation Measure, the Kidney Disease Quality of Life and demographic and clinical data survey from January to December 2014. Factors associated with patient activation were examined using χ² or t-tests and linear regression. Three hundred and five patients with median age of 68 (IQR 14.8) years were studied. They were evenly distributed across socioeconomic groups, stage of kidney disease and duration of diabetes but not gender. Approximately 46% reported low activation. In patients with low activation, the symptom/problem list, burden of kidney disease subscale and mental composite subscale scores were all significantly lower (all p < .05). On multivariable analysis, factors associated with lower activation for all patients were older age, worse self-reported health in the burden of kidney disease subscale and lower self-care scores. Additionally, in men, worse self-reported health in the mental composite subscale was associated with lower activation and in women, worse self-reported health scores in the symptom/problem list and greater renal impairment were associated with lower activation. Findings from this study suggest that levels of activation are low in patients with diabetes and CKD. Older age and worse self-reported health were associated with lower activation. These data may serve as the basis for the development of interventions needed to enhance activation and outcomes for patients with diabetes and CKD.
Publisher: Elsevier BV
Date: 08-2015
DOI: 10.1053/J.AJKD.2015.02.341
Abstract: Research aims to improve health outcomes for patients. However, the setting of research priorities is usually performed by clinicians, academics, and funders, with little involvement of patients or caregivers and using processes that lack transparency. A national workshop was convened in Australia to generate and prioritize research questions in chronic kidney disease (CKD) among diverse stakeholder groups. Patients with CKD (n=23), nephrologists/surgeons (n=16), nurses (n=8), caregivers (n=7), and allied health professionals and researchers (n=4) generated and voted on intervention questions across 4 treatment categories: CKD stages 1 to 5 (non-dialysis dependent), peritoneal dialysis, hemodialysis, and kidney transplantation. The 5 highest ranking questions (in descending order) were as follows: How effective are lifestyle programs for preventing deteriorating kidney function in early CKD? What strategies will improve family consent for deceased donor kidney donation, taking different cultural groups into account? What interventions can improve long-term post-transplant outcomes? What are effective interventions for post hemodialysis fatigue? How can we improve and individualize drug therapy to control post-transplant side effects? Priority questions were focused on prevention, lifestyle, quality of life, and long-term impact. These prioritized research questions can inform funding agencies, patient/consumer organizations, policy makers, and researchers in developing a CKD research agenda that is relevant to key stakeholders.
Publisher: Dustri-Verlag Dr. Karl Feistle
Date: 08-2015
DOI: 10.5414/CN108519
Publisher: Elsevier BV
Date: 04-2007
DOI: 10.1111/J.1753-6405.2007.00038.X
Abstract: To estimate the magnitude of excess risk for proteinuria, high blood pressure and diabetes in Australian Aboriginal adults in three remote communities by comparing them with nationwide Australian data. Adult volunteers from three remote communities in the Northern Territory were screened for proteinuria, high blood pressure, and diabetes between 2000 and mid 2003. Rates for people age 25 to 74 years were compared with those from the AusDiab study conducted in 1999 and 2000. Compared with AusDiab, rates of these conditions were elevated in all Aboriginal communities, but differed among them. With adjustment for age and sex, rates of proteinuria were elevated 2.5- to 5.3-fold, rates of high blood pressure were elevated 3.1- to 8.1-fold and rates of diabetes were elevated 5.4- to 10-fold (p < 0.001 for all). The risk of having any condition ranged from 3.0- to 8.7-fold and the risk of having two or more conditions ranged from 5.8- to 14.2-fold. The data are compatible with the excess morbidity and mortality from cardiovascular disease, diabetes and renal disease in these Aboriginal groups. They reflect the multitude of risk factors operating in these environments. They dictate urgent and systematic intervention to modify outcomes of established disease and to prevent their development. However, the resources required for effective secondary intervention will differ among communities according to the disease burden.
Publisher: Wiley
Date: 11-2018
DOI: 10.1111/IMJ.14106
Abstract: C-reactive protein (CRP) levels increase in response to bacterial infection and have been used to guide the use of antibiotics. We assessed CRP levels in a cohort of patients with cystic fibrosis (CF) admitted to hospital with an exacerbation of their lung disease, requiring treatment with broad-spectrum antibiotics. In this group, most subjects had CRP levels of less than 20 mg/L, including patients who had pneumonia. The clinical utility of the CRP in guiding antibiotic use in exacerbations of CF is limited.
Publisher: Elsevier BV
Date: 08-2018
DOI: 10.1053/J.AJKD.2017.10.019
Abstract: Concern regarding technique failure is a major barrier to increased uptake of peritoneal dialysis (PD), and the first year of therapy is a particularly vulnerable time. A cohort study using competing-risk regression analyses to identify the key risk factors and risk periods for early transfer to hemodialysis therapy or death in incident PD patients. All adult patients who initiated PD therapy in Australia and New Zealand in 2000 through 2014. Patient demographics and comorbid conditions, duration of prior renal replacement therapy, timing of referral, PD modality, dialysis era, and center size. Technique failure within the first year, defined as transfer to hemodialysis therapy for more than 30 days or death. Of 16,748 patients included in the study, 4,389 developed early technique failure. Factors associated with increased risk included age older than 70 years, diabetes or vascular disease, prior renal replacement therapy, late referral to a nephrology service, or management in a smaller center. Asian or other race and use of continuous ambulatory PD were associated with reduced risk, as was initiation of PD therapy in 2010 through 2014. Although the risk for technique failure due to death or infection was constant during the first year, mechanical and other causes accounted for a greater number of cases within the initial 9 months of treatment. Potential for residual confounding due to limited data for residual kidney function, dialysis prescription, and socioeconomic factors. Several modifiable and nonmodifiable factors are associated with early technique failure in PD. Targeted interventions should be considered in high-risk patients to avoid the consequences of an unplanned transfer to hemodialysis therapy or death.
Publisher: Springer Science and Business Media LLC
Date: 04-2019
DOI: 10.1007/S11136-019-02173-1
Abstract: Quality-of-life is poor in end-stage kidney disease; however, the relationships between earlier stages of chronic kidney disease (CKD) and quality-of-life are poorly understood. This study explored longitudinal quality-of-life changes in a community-based CKD cohort and assessed associations between CKD and quality-of-life over time, and between baseline quality-of-life and CKD outcomes. We used the Australian diabetes, obesity and lifestyle study, a nationally representative, prospective cohort with data collected at baseline, year 5 and year 12, to examine the relationships between CKD stage, quality-of-life and outcomes. Linear mixed regression, Cox proportional hazards, Kaplan-Meier and competing risks analyses were used. Of 1112 participants with CKD and baseline quality-of-life data, the physical component summary (PCS) score was significantly lower than for the general population (p = 0.01, age- and sex-adjusted), while the mental component summary (MCS) score was no different (p = 0.9, age- and sex-adjusted). In our unadjusted mixed effects model, more advanced kidney disease was associated with lower PCS and higher MCS at baseline (p < 0.001 and p < 0.01, respectively); however, this effect was no longer significant after adjustment for demographic and clinical variables. The rate of decline in PCS over the period of follow-up was greatest for those with more advanced kidney disease (p < 0.001 in unadjusted model, p = 0.007 in adjusted model). There was no association between change in MCS over the period of follow-up and severity of kidney disease in either the unadjusted or adjusted model (p = 0.7 and p = 0.1, respectively). Lower PCS, but not MCS, was associated with increased cardiovascular and increased all-cause mortality even after adjustment for key demographic and clinical variables (p < 0.001). Physical, but not mental, quality-of-life is significantly impaired in CKD, and continues to decline with disease progression.
Publisher: Elsevier BV
Date: 12-2021
DOI: 10.1053/J.AJKD.2021.03.018
Abstract: Mortality is an important outcome for all dialysis stakeholders. We examined associations between dialysis modality and mortality in the modern era. Observational study comparing dialysis inception cohorts 1998-2002, 2003-2007, 2008-2012, and 2013-2017. Australia and New Zealand (ANZ) dialysis population. The primary exposure was dialysis modality: facility hemodialysis (HD), continuous ambulatory peritoneal dialysis (CAPD), automated PD (APD), or home HD. The main outcome was death. Cause-specific proportional hazards models with shared frailty and subdistribution proportional hazards (Fine and Gray) models were used, adjusting for available confounding covariates. In 52,097 patients, the overall death rate improved from ~15 deaths per 100 patient-years in 1998-2002 to ~11 in 2013-2017, with the largest cause-specific contribution from decreased infectious death. Relative to facility HD, mortality with CAPD and APD has improved over the years, with adjusted hazard ratios in 2013-2017 of 0.88 (95% CI, 0.78-0.99) and 0.91 (95% CI, 0.82-1.00), respectively. Increasingly, patients with lower clinical risk have been adopting APD, and to a lesser extent CAPD. Relative to facility HD, mortality with home HD was lower throughout the entire period of observation, despite increasing adoption by older patients and those with more comorbidities. All effects were generally insensitive to the modeling approach (initial vs time-varying modality, cause-specific vs subdistribution regression) and to different follow-up intervals (5 vs 7 vs 10 years). There was no effect modification by diabetes, comorbidity, or sex. Potential for residual confounding, limited generalizability. The survival of patients on PD in 2013-2017 appears greater than the survival for patients on facility HD in ANZ. 
Additional research is needed to assess whether changing clinical risk profiles over time, varied dialysis prescription, and morbidity from dialysis access contribute to these findings.
Publisher: SAGE Publications
Date: 05-2015
Abstract: Peritoneal dialysis (PD) patients are commonly required to transfer to hemodialysis (HD); however, the literature describing the outcomes of such transfers is limited. The aim of our study was to describe the predictors of these transfers and their outcomes according to vascular access at the time of transfer. A retrospective cohort study using registry data of all adult patients commencing PD as their initial renal replacement therapy in Australia or New Zealand between 2004 and 2010 was performed. Follow-up was until 31 December 2010. Logistic regression models were constructed to determine possible predictors of transfer within both 6 and 12 months of PD commencement. Cox analysis and competing risks regression were used to determine the predictors of survival and transplantation post-transfer. The analysis included 4,781 incident PD patients, of whom 1,699 transferred to HD during the study period. Logistic models did not identify any clinically useful predictors of transfer within 6 or 12 months (c-statistics 0.54 and 0.55, respectively). 67% of patients commenced HD with a central venous catheter (CVC). CVC use at transfer was associated with increased mortality (hazard ratio 1.37, 95% confidence interval (CI) 1.11-1.68, p = 0.003) and a borderline significant reduction in the incidence of transplantation (subhazard ratio 0.76, 95% CI 0.58-1.00, p = 0.05). It is difficult to predict the transfer to HD for incident PD patients. PD patients who commence HD with a CVC have a higher risk of mortality and a lower likelihood of undergoing renal transplantation.
Publisher: Springer Science and Business Media LLC
Date: 06-1983
DOI: 10.1007/BF00257342
Abstract: RNA can anneal to its DNA template to generate an RNA-DNA hybrid (RDH) duplex and a displaced DNA strand, termed R-loop. RDH duplex occupies up to 5% of the mammalian genome and plays important roles in many biological processes. The functions of RDH duplex are affected by its mechanical properties, including the elasticity and the conformation transitions. The mechanical properties of RDH duplex, however, are still unclear. In this work, we studied the mechanical properties of RDH duplex using magnetic tweezers in comparison with those of DNA and RNA duplexes with the same sequences. We report that the contour length of RDH duplex is ∼0.30 nm/bp, and the stretching modulus of RDH duplex is ∼660 pN, neither of which is sensitive to NaCl concentration. The persistence length of RDH duplex depends on NaCl concentration, decreasing from ∼63 nm at 1 mM NaCl to ∼49 nm at 500 mM NaCl. Under high tension of ∼60 pN, the end-opened RDH duplex undergoes two distinct overstretching transitions: at high salt, in which the basepairs are stable, it undergoes a nonhysteretic transition, leading to a basepaired elongated structure, whereas at low salt, it undergoes a hysteretic peeling transition, leaving the single-stranded DNA strand under force while the single-stranded RNA strand coils. The peeled RDH is difficult to reanneal back to the duplex conformation, which may be due to the secondary structures formed in the coiled single-stranded RNA strand. These results help us understand the full picture of the structures and mechanical properties of nucleic acid duplexes in solution and provide a baseline for studying the interaction of RDH with proteins at the single-molecule level.
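Persistence-length values like those quoted above are typically obtained by fitting magnetic-tweezers force-extension data to a worm-like-chain model. A minimal sketch, assuming the standard Marko-Siggia interpolation (it is an assumption that this is the specific fit the authors used) and a hypothetical 1000-bp construct:

```python
import math

# Sketch: Marko-Siggia worm-like-chain interpolation for the force (pN)
# on a polymer at extension x (nm). Lp = 49 nm and 0.30 nm/bp follow the
# abstract; the 1000-bp construct length is a hypothetical example.

KBT = 4.11  # thermal energy at room temperature, pN*nm

def wlc_force(x, L0, Lp):
    """Force on a worm-like chain of contour length L0, persistence length Lp."""
    r = x / L0  # fractional extension, must be < 1
    return (KBT / Lp) * (1 / (4 * (1 - r) ** 2) - 0.25 + r)

L0 = 0.30 * 1000  # contour length of a hypothetical 1000-bp RDH duplex, nm
print(round(wlc_force(0.5 * L0, L0, Lp=49), 3))  # 0.105
```

At high forces (tens of pN, as in the overstretching experiments) an extensible variant of this model with the quoted ~660 pN stretching modulus would normally be used instead.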
Publisher: Wiley
Date: 04-07-2017
DOI: 10.1111/HEX.12577
Publisher: Wiley
Date: 17-09-2017
DOI: 10.1111/NEP.12823
Publisher: SAGE Publications
Date: 09-2016
Abstract: The aim of the present study was to evaluate the predictors of transfer to home hemodialysis (HHD) after peritoneal dialysis (PD) completion. All Australian and New Zealand patients treated with PD on day 90 after initiation of renal replacement therapy between 2000 and 2012 were included. Completion of PD was defined by death, transplantation, or hemodialysis (HD) for 180 days or more. Patients were categorized as “transferred to HHD” if they initiated HHD fewer than 180 days after PD had ended. Multivariable logistic regression was used to evaluate predictors of transfer to HHD in a restricted cohort experiencing PD technique failure; a competing-risks analysis was used in the unrestricted cohort. Of 10 710 incident PD patients, 3752 died, 1549 underwent transplantation, and 2915 transferred to HD, among whom 156 (5.4%) started HHD. The positive predictors of transfer to HHD in the restricted cohort were male sex [odds ratio (OR): 2.81], obesity (OR: 2.20), and PD therapy duration (OR: 1.10 per year). Negative predictors included age (OR: 0.95 per year), infectious cause of technique failure (OR: 0.48), underweight (OR: 0.50), kidney disease resulting from hypertension (OR: 0.38) or diabetes (OR: 0.32), and race being Maori (OR: 0.65) or Aboriginal and Torres Strait Islander (OR: 0.30). Comparable results were obtained with a competing-risks model. Transfer to HHD after completion of PD is rare and predicted by patient characteristics at baseline and at the time of PD end. Transition to HHD should be considered more often in patients using PD, especially when they fulfill the identified characteristics.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 07-2011
DOI: 10.2215/CJN.06790810
Publisher: Public Library of Science (PLoS)
Date: 15-07-2019
Publisher: Wiley
Date: 02-05-2020
DOI: 10.1111/NEP.13574
Abstract: The use of haemodiafiltration (HDF) for the management of patients with end-stage kidney failure is increasing worldwide. Factors associated with HDF use have not been studied and may vary in different countries and jurisdictions. The aim of this study was to document the pattern of increase and variability in uptake of HDF in Australia and New Zealand, and to describe patient- and centre-related factors associated with its use. Using the Australian and New Zealand Dialysis and Transplant Registry, all incident patients commencing haemodialysis (HD) between 2000 and 2014 were included. The primary outcome was HDF commencement over time, which was evaluated using multivariable logistic regression stratified by country. Of 27 433 patients starting HD, 3339 (14.4%) of 23 194 patients in Australia and 810 (19.1%) of 4239 in New Zealand received HDF. HDF uptake increased over time in both countries but was more rapid in New Zealand than Australia. In Australia, HDF use was more likely in males (odds ratio (OR) 1.13, 95% confidence interval (CI) 1.03-1.24, P = 0.009) and less likely with older age (≥70 years vs reference: OR = 0.48, 95% CI 0.41-0.56) and with higher body mass index (reference BMI < 18.5 kg/m²). Haemodiafiltration uptake is increasing, variable and associated with both patient and centre characteristics. Centre characteristics not explicitly captured elsewhere explained 36% of variability in HDF uptake in Australia and 48% in New Zealand.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 12-2007
Publisher: Oxford University Press (OUP)
Date: 08-10-2011
DOI: 10.1093/NDT/GFQ602
Abstract: Although clinical guidelines exist for optimal levels of serum markers of chronic kidney disease mineral and bone disorder (CKD-MBD), target parameters are not achieved in many haemodialysis (HD) patients. The reason for this evidence-practice gap is unclear and more information from patients and healthcare professionals is required to improve knowledge transfer. We aimed to determine potential barriers by surveying HD patients and staff about awareness and management of CKD-MBD. A total of 136 prevalent HD patients, 25 nephrologists and 58 dialysis nurses/technicians were surveyed. Three separate questionnaires covered knowledge and awareness of CKD-MBD and factors limiting management (including compliance, medications and general understanding). Of patients surveyed, 84% had heard of phosphate, but 42% were unsure of high phosphate foods and 46% were unaware of the consequences of elevated phosphate. Twenty-seven percent of patients reported difficulty taking phosphate binders and thirty-five percent reported forgetting to take them. Seventy-four percent of patients wanted to know more about CKD-MBD (40% via written material). Of nephrologists surveyed, 76% thought non-compliance with phosphate binders was the main reason for poor control of phosphate (predominantly related to poor patient understanding); 84% thought patients wanted to know more but only 28% provided written material on CKD-MBD. Of dialysis staff surveyed, 63% thought non-compliance with binders explained poor control, the main reason being lack of patient understanding; 88% thought patients wanted to know more but only 17% provided written education. Implementation of an intensive educational programme, with a multi-faceted approach, for HD patients may promote better control of CKD-MBD and improve achievement of target levels.
Publisher: Oxford University Press (OUP)
Date: 13-08-2019
DOI: 10.1093/NDT/GFY209
Abstract: It is unclear if haemodiafiltration improves patient survival compared with standard haemodialysis. Observational studies have tended to show benefit with haemodiafiltration, while meta-analyses have not provided definitive proof of superiority. Using data from the Australia and New Zealand Dialysis and Transplant Registry, this binational inception cohort study compared all adult patients who commenced haemodialysis in Australia and New Zealand between 2000 and 2014. The primary outcome was all-cause mortality. Cardiovascular mortality was the secondary outcome. Outcomes were measured from the first haemodialysis treatment and were examined using multivariable Cox regression analyses. Patients were censored at permanent discontinuation of haemodialysis or at 31 December 2014. Analyses were stratified by country. The study included 26 961 patients (4110 haemodiafiltration, 22 851 standard haemodialysis; 22 774 Australia, 4187 New Zealand) with a median follow-up of 5.31 (interquartile range 2.87-8.36) years. Median age was 62 years, 61% were male, and 71% were Caucasian. Compared with standard haemodialysis, haemodiafiltration was associated with a significantly lower risk of all-cause mortality [adjusted hazard ratio (HR) for Australia 0.79, 95% confidence interval (95% CI) 0.72-0.87; adjusted HR for New Zealand 0.88, 95% CI 0.78-1.00]. In Australian patients, there was also an association between haemodiafiltration and reduced cardiovascular mortality (adjusted HR 0.78, 95% CI 0.64-0.95). Haemodiafiltration was associated with superior survival across patient subgroups of age, sex and comorbidity.
Publisher: Wiley
Date: 26-06-2006
DOI: 10.1111/J.1445-2197.2006.03775.X
Abstract: Vaccination, education and use of long-term antibiotics are recommended in expert guidelines for the prevention of infectious complications after splenectomy. However, studies outside Australia have shown poor adherence to the guidelines. The aim of this study was to determine overall adherence to the guidelines and to ascertain any independent risk factors for poor compliance with the guidelines. A retrospective review of hospital records between 1999 and 2004 was carried out. Indications for splenectomy of the 111 patients in this review included post-trauma (32), haematological (32), cancer surgery (24), iatrogenic (12) and others (11). On multivariable analysis, older age was associated with a 28% lower likelihood of receiving education (odds ratio (OR) 0.72, 95% confidence interval (CI) 0.56-0.92, P = 0.009) and a 36% lower likelihood of receiving long-term antibiotics (OR 0.64, 95% CI 0.52-0.80, P ≤ 0.001). Women were four times more likely to receive education (OR 4.03, 95% CI 1.16-14.0, P = 0.028) and patients who had undergone splenectomy in 2004 were 22 times more likely to have received education compared with those in 1999 (OR 22.53, 95% CI 3.12-162.34, P = 0.002). Education for prevention of sepsis after splenectomy is poorly documented and may be incomplete. Older age and male sex are risk factors for non-adherence to guidelines for prevention of postsplenectomy sepsis. Strategies such as alert cards and information brochures may improve adherence to guidelines, particularly in older patients.
Publisher: Elsevier BV
Date: 04-2006
DOI: 10.1053/J.AJKD.2005.12.034
Abstract: Microalbuminuria is an independent risk factor for cardiovascular morbidity and mortality in the general population. Standard immunochemical urinary albumin assays detect immunoreactive albumin, whereas high-performance liquid chromatography (HPLC) detects both immunoreactive and immunounreactive albumin. Using data from the Australian Diabetes, Obesity, and Lifestyle cohort study of randomly selected community-based Australian adults, spot urine samples were tested for albuminuria (spot urine albumin-creatinine ratio [ACR]: normal, <30 mg/g; microalbuminuria, 30 to 300 mg/g; macroalbuminuria, >300 mg/g) by using both immunonephelometry (IN) and HPLC (n = 10,010). Bland-Altman analysis showed significant bias, with a greater ACR by means of HPLC, particularly at lower levels of ACR. Mean ACR was 15.8 mg/g (95% confidence interval [CI], 12.3 to 19.2) by means of IN compared with 30.0 mg/g (95% CI, 27.0 to 35.0) by means of HPLC. The prevalence of microalbuminuria was 4 times greater by means of HPLC compared with IN (20% versus 5.5%). In all demographic and comorbid subgroups associated with microalbuminuria, the prevalence of microalbuminuria increased by 2 to 4 times. A total of 1,743 subjects (17.4%) classified as normoalbuminuric by means of IN were reclassified as microalbuminuric by means of HPLC. Using multivariate logistic regression, female sex, untreated and treated hypertension, and impaired glucose tolerance or diabetes were significantly associated with a change in category from normoalbuminuria to microalbuminuria by means of HPLC. HPLC measures significantly more urinary albumin within the normoalbuminuria and microalbuminuria range, resulting in a significant increase in prevalence of microalbuminuria. Longitudinal studies are needed to determine whether the extra individuals identified by means of HPLC are at increased risk for developing hard clinical outcomes (renal and cardiovascular).
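The albuminuria categories used in studies like this follow the standard spot-urine ACR cut-offs (normal <30 mg/g, microalbuminuria 30-300 mg/g, macroalbuminuria >300 mg/g); a minimal classifier sketch, not code from the study:

```python
# Sketch: classify a spot-urine albumin-creatinine ratio (mg/g) into the
# standard albuminuria categories. Boundary handling (30 and 300 inclusive
# in the micro range) is an assumption of this sketch.

def classify_acr(acr_mg_g: float) -> str:
    if acr_mg_g < 30:
        return "normoalbuminuria"
    if acr_mg_g <= 300:
        return "microalbuminuria"
    return "macroalbuminuria"

# The two mean ACRs reported in the abstract straddle the 30 mg/g cut-off,
# which is exactly why IN and HPLC disagree on prevalence:
print(classify_acr(15.8), classify_acr(30.0))
# normoalbuminuria microalbuminuria
```

The example makes the reclassification effect concrete: an assay that reads the same specimen higher can move it across the 30 mg/g boundary.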
Publisher: Wiley
Date: 23-07-2010
DOI: 10.1111/J.1440-1797.2010.01288.X
Abstract: Vascular calcification (VC) is a major contributor to increased cardiovascular (CV) disease in chronic kidney disease (CKD) and an independent predictor of mortality. VC is inversely correlated with bone mineral density (BMD). Screening for VC may be useful to determine those at greater CV risk, and dual-energy X-ray absorptiometry (DXA) may have a dual role in providing VC measurement as well as BMD. We report cross-sectional data on 44 patients with CKD stages 3-4 and aim to determine and validate measurement of VC using DXA. Patients had computed tomography (CT) of the abdominal aorta and DXA of the lateral lumbar spine, to determine both aortic VC and BMD. Semi-quantitative measurement of VC from DXA was determined (blinded) using previously validated 8- and 24-point scales, and compared with VC from CT. BMD determination from L2 to L4 vertebrae on CT was compared with DXA-reported BMD. Patients were 66% male and 57% diabetic, with a mean age of 63.4 years and mean estimated glomerular filtration rate of 31.4 +/- 12 mL/min. Aortic VC was present in 95% on CT, mean 564.9 +/- 304 Hounsfield units (HU). Aortic VC was seen in 68% on lateral DXA, mean scores 5.1 +/- 5.9 and 1.9 +/- 1.9 using 24- and 8-point scales, respectively. Strong correlation of VC measurement was present between CT and DXA (r = 0.52, P < 0.001). For the DXA VC 24-point score, intraclass correlations for intra-rater and inter-rater agreement were 0.91 and 0.64, respectively (8-point scale, intraclass correlations 0.90 and 0.69). Vertebral BMD measured by CT (mean 469.3 HU L2-4) also significantly correlated with lateral DXA-reported BMD (mean spine T-score -0.67 +/- 1.6) (r = 0.56, P < 0.001). Despite limitations in CKD, DXA may be useful as lateral DXA images provide concurrent assessment of aortic calcification as well as lumbar spine BMD, both correlating significantly with CT measurements. 
Lateral DXA may provide VC screening to determine patients at greater CV risk, although more studies are needed to evaluate its potential role.
Publisher: Oxford University Press (OUP)
Date: 10-01-2017
DOI: 10.1093/NDT/GFW383
Abstract: There is evidence that end-stage kidney disease patients who are older or have more comorbidity may face a poor trade-off between the benefits of dialysis and its potential harms. We aimed to develop a tool for predicting patient mortality in the early stages of receiving dialysis. In 23 658 patients aged 15+ years commencing dialysis between 2000 and 2009 in Australia and New Zealand, a point score tool was developed to predict 6-month mortality based on a logistic regression analysis of factors available at dialysis initiation. Temporal validation used 2009-11 data from Australia and New Zealand. External validation used the UK Renal Registry. Within 6 months of commencing dialysis, 6.1% of patients had died. A small group (4.7%) of patients had a high predicted mortality risk (>20%) according to the point score tool. Predictive variables were: older age, underweight, chronic lung disease, coronary artery disease, peripheral vascular disease, cerebrovascular disease (particularly for patients <60 years of age), late referral to nephrologist care and underlying cause of renal disease. The new point score tool outperformed existing models, with an area under the receiver operating characteristic curve of 0.755 on temporal validation with acceptable calibration and 0.713 on external validation with poor calibration. Our point score tool for predicting 6-month mortality in patients at dialysis commencement has sufficient prognostic accuracy for use in Australia and New Zealand for prognosis and identification of high-risk patients who may be given appropriate supportive care. Use in other countries requires further study.
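The area under the receiver operating characteristic curve used to validate the point score tool is equivalent to the probability that a randomly chosen patient who died had a higher score than a randomly chosen survivor (the Mann-Whitney statistic). A minimal sketch of that calculation; the scores and outcomes below are hypothetical, not taken from the study:

```python
def auc(scores, outcomes):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen positive case (death)
    outscores a randomly chosen negative case (ties count half)."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical point scores and 6-month outcomes (1 = died).
scores = [2, 7, 5, 3, 1]
deaths = [0, 1, 0, 1, 0]
score_auc = auc(scores, deaths)  # how well scores separate deaths from survivors
```

An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is the scale on which the reported values of 0.755 and 0.713 sit.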
Publisher: Elsevier BV
Date: 11-2003
DOI: 10.1046/J.1523-1755.2003.00277.X
Abstract: A number of demographic and comorbid factors have been demonstrated to be associated with the placement of arteriovenous grafts (AVG) and central venous catheters (CVC) as opposed to native arteriovenous fistulas (AVF). However, no data are available regarding these factors in a hemodialysis population where AVF utilization is high. All adult patients on hemodialysis in Australia on September 30, 2001 were included in the study. Vascular access was recorded as AVF, AVG, or CVC. Patients were separated into incident (<150 days on dialysis) and prevalent (≥150 days) cohorts. Multinomial logistic regression was used to assess factors associated with AVG and CVC use. Of the 4968 patients studied, 877 (17%) were classed as incident and the remainder prevalent. AVF were present in 61% versus 77%, AVG in 11% versus 19%, and CVC in 28% versus 4% in the incident and prevalent cohorts, respectively (all P values significant). For AVG, a body mass index ≥30 kg/m2 and peripheral vascular and cerebrovascular disease were significant in the prevalent group. For CVC, female gender, type I and II diabetes mellitus and late referral were associated with increased frequency in the incident cohort, while female gender, cigarette smoking, and peripheral vascular disease were predictive in the prevalent group. Significant variations in access type were also seen depending on geographic location. Certain patient characteristics such as age and female gender, but not type II diabetes mellitus, remain significantly associated with AVG and catheter use despite the high prevalence of AVF use in Australia. However, the significant variation in risk by geographic location suggests more attention needs to be paid to physician practice patterns to increase AVF utilization rates.
Publisher: Wiley
Date: 24-07-2003
DOI: 10.1046/J.1440-1797.2003.00158.X
Abstract: Simultaneous pancreas-kidney (SPK) transplant recipients are at high immunological risk of rejection. Antibody induction is beneficial, but lymphocyte-depleting therapy is associated with a high incidence of side-effects. We performed a historical controlled trial to compare OKT3 versus anti-CD25 antibody (basiliximab) induction therapy with regard to patient, kidney and pancreas survival, as well as to examine for any differences in acute rejection, graft function, and infective complications. Twenty-eight consecutive SPK transplants were performed at the Monash Medical Centre between December 1997 and November 2001. Anti-CD3 monoclonal antibody (OKT3) was used prior to March 2000 (n = 12) and basiliximab thereafter (n = 16), both in combination with cyclosporin, mycophenolate, and prednisolone. A retrospective comparison of outcomes was performed. At 6 months, patient (100 vs 100%), kidney (91.7 vs 91.7%) and pancreas (75 vs 83.3%) survival were similar in the OKT3 and basiliximab groups, respectively. A minority of subjects in each group remained free from rejection (42% on basiliximab vs 25% on OKT3, P = NS). Renal function was superior in the basiliximab group (mean calculated creatinine clearance 79.4 +/- 11.9 vs 54.5 +/- 15.9 mL/min for basiliximab vs OKT3, P < 0.001). The incidence of major opportunistic infection was lower in basiliximab-treated patients (9 vs 50% in the OKT3 group, P = 0.033). Basiliximab was associated with similar 6-month patient, kidney and pancreas survival, superior renal function and less opportunistic infection as compared with OKT3 induction therapy in SPK transplants. Basiliximab is at least as effective as, and safer than, OKT3 for induction therapy in SPK transplantation.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 14-09-2016
DOI: 10.2215/CJN.08930816
Publisher: BMJ
Date: 2020
DOI: 10.1136/BMJDRC-2019-000842
Abstract: To evaluate the impact of an integrated diabetes and kidney disease model of care on health-related quality of life (HRQOL) of patients with comorbid diabetes and chronic kidney disease (CKD). A longitudinal study of adult patients (aged over 18 years) with comorbid diabetes and CKD (stage 3a or worse) who attended a new diabetes kidney disease service was conducted at a tertiary hospital. A questionnaire consisting of demographics, clinical data, and the Kidney Disease Quality of Life (KDQOL-36) instrument was administered at baseline and after 12 months. Paired t-tests were used to compare baseline and 12-month scores. A subgroup analysis examined the effects by patient gender. Multiple regression analysis examined the factors associated with changes in scores. 179 patients, 36% of whom were female, with a baseline mean±SD age of 65.9±11.3 years, were studied. Across all subscales, HRQOL did not significantly change over time (p value for all mean differences >0.05). However, on subgroup analysis, symptom problem list and physical composite summary scores increased among women (MD=9.0, 95% CI 1.25 to 16.67, p=0.02 and MD=4.5, 95% CI 0.57 to 8.42, p=0.03, respectively) and physical composite scores decreased among men (MD=−3.35, 95% CI −6.26 to −0.44, p=0.03). The HRQOL of patients with comorbid diabetes and CKD attending a new codesigned, integrated diabetes and kidney disease model of care was maintained over 12 months. Given that HRQOL is known to deteriorate over time in this high-risk population, the impact of these findings on clinical outcomes warrants further investigation.
Publisher: Elsevier BV
Date: 06-2016
DOI: 10.1016/J.KINT.2016.02.014
Abstract: Patient outcomes in end-stage kidney disease (ESKD) secondary to lupus nephritis have not been well described. To help define this, we compared dialysis and transplant outcomes of patients with ESKD due to lupus nephritis to all other causes. All patients diagnosed with ESKD who commenced renal replacement therapy in Australia and New Zealand (1963-2012) were included. Clinical outcomes were evaluated in both a contemporary cohort (1998-2012) and the entire 50-year cohort. Of 64,160 included patients, 744 had lupus nephritis as the primary renal disease. For the contemporary cohort of 425 patients with lupus nephritis, the 5-year dialysis patient survival rate was 69%. Of 176 contemporary patients with lupus nephritis who received their first renal allograft, the 5-year patient, overall renal allograft, and death-censored renal allograft survival rates were 95%, 88%, and 93%, respectively. Patients with lupus nephritis had worse dialysis patient survival (adjusted hazard ratio 1.33, 95% confidence interval 1.12-1.58) and renal transplant patient survival (adjusted hazard ratio 1.87, 95% confidence interval 1.18-2.98), but comparable overall renal allograft survival (adjusted hazard ratio 1.19, 95% confidence interval 0.84-1.68) and death-censored renal allograft survival (adjusted hazard ratio 1.05, 95% confidence interval 0.68-1.62) compared with ESKD controls. Similar results were found in the entire cohort and when using competing-risks analysis. Thus, the ESKD of lupus nephritis was associated with worse dialysis and transplant patient survival but comparable renal allograft survival compared with other causes of ESKD.
Publisher: Wiley
Date: 06-2003
DOI: 10.1046/J.1492-7535.2003.00039.X
Abstract: Vascular access placement is a key management issue for hemodialysis patients. Despite being well regarded as the access of first choice, the native arteriovenous fistula (AVF) remains underutilized in the United States. The first part of this review examines recent epidemiology studies addressing patient factors associated with the use of the synthetic arteriovenous graft as opposed to the native fistula. Female gender and older age are consistently associated with a higher frequency of graft use. Diabetes, peripheral vascular disease, and body mass index were associated with graft use in some but not all of the studies. Recent evidence also suggests an independent survival advantage for patients dialyzing via native fistulae, especially for infection-related mortality. The second part reviews evidence surrounding the recommendations for blood flow surveillance of the native fistula. The hemodynamic features of the native fistula are examined and differences from synthetic grafts are highlighted. Clinical studies assessing the use of blood flow surveillance to prevent the sudden thrombosis of native fistulae are reviewed. Blood flow thresholds for further investigation are yet to be determined definitively for AVF, and randomized studies should be performed to assess the impact on AVF thrombosis rates.
Publisher: Wiley
Date: 23-10-2013
DOI: 10.1111/NEP.12132
Publisher: Elsevier BV
Date: 04-2016
DOI: 10.1053/J.AJKD.2015.09.025
Abstract: Intensive hemodialysis (HD) is characterized by increased frequency and/or session length compared to conventional HD. Previous analyses from Australia and New Zealand did not suggest benefit with intensive HD, although recent research suggests that relationships have changed. We present updated analyses. Observational cohort study using marginal structural modeling to adjust for changes in renal replacement modality and time-varying medical comorbid conditions. Adults initiating renal replacement therapy since March 31, 1996, followed up through December 31, 2012; this analysis included 40,842 patients over 2,187,689 patient-months. Time-varying renal replacement modality: conventional facility HD (≤3 times per week, ≤6 hours per session), quasi-intensive facility HD (between conventional and intensive), intensive facility HD (≥5 times per week, any hours per session), conventional home HD, quasi-intensive home HD, intensive home HD, peritoneal dialysis, deceased donor kidney transplantation, and living donor kidney transplantation. Patient mortality, with a 3-month lag in primary analyses and 6- and 12-month lags in sensitivity analyses. Conventional facility HD was the reference group. Conventional home HD had a similar mortality risk. For quasi-intensive home HD, mortality risk was lower (HR, 0.56; 95% CI, 0.44-0.73). For intensive home HD, mortality risk was nonsignificantly lower in primary analyses and significantly lower using a 6-month lag (HR, 0.41; 95% CI, 0.20-0.85), but not using a 12-month lag. For quasi-intensive facility HD, mortality risk was nonsignificantly lower in primary analyses, although significantly lower using 6- (HR, 0.41; 95% CI, 0.20-0.85) and 12-month lags (HR, 0.59; 95% CI, 0.44-0.80). Mortality risk was similar between intensive and conventional facility HD. For peritoneal dialysis, mortality risk was greater than for conventional facility HD (HR, 1.07; 95% CI, 1.03-1.12). Kidney transplantation had the lowest mortality risk. 
Potential residual confounding from limited collection of comorbid condition, socioeconomic, and medication data. There is an emerging HD dose-effect in Australia and New Zealand, with lower mortality risks associated with some of the more intensive HD regimens in these countries.
Publisher: BMJ
Date: 02-2019
DOI: 10.1136/BMJOPEN-2018-024382
Abstract: Patients with chronic kidney disease (CKD) are at heightened cardiovascular risk, which has been associated with abnormalities of bone and mineral metabolism. A deeper understanding of these abnormalities should facilitate improved treatment strategies and patient-level outcomes, but at present there are few large, randomised controlled clinical trials to guide management. Positive associations between serum phosphate and fibroblast growth factor 23 (FGF-23) and cardiovascular morbidity and mortality in both the general and CKD populations have resulted in clinical guidelines suggesting that serum phosphate be targeted towards the normal range, although few randomised and placebo-controlled studies have addressed clinical outcomes using interventions to improve phosphate control. Early preventive measures to reduce the development and progression of vascular calcification, left ventricular hypertrophy and arterial stiffness are crucial in patients with CKD. We outline the rationale and protocol for an international, multicentre, randomised parallel-group trial assessing the impact of the non-calcium-based phosphate binder, lanthanum carbonate, compared with placebo on surrogate markers of cardiovascular disease in a predialysis CKD population: the IMpact of Phosphate Reduction On Vascular End-points (IMPROVE)-CKD study. The primary objective of the IMPROVE-CKD study is to determine if the use of lanthanum carbonate reduces the burden of cardiovascular disease in patients with CKD stages 3b and 4 when compared with placebo. The primary end-point of the study is change in arterial compliance measured by pulse wave velocity over a 96-week period. Secondary outcomes include change in aortic calcification and biochemical parameters of serum phosphate, parathyroid hormone and FGF-23 levels. 
Ethical approval for the IMPROVE-CKD trial was obtained by each local Institutional Ethics Committee for all 17 participating sites in Australia, New Zealand and Malaysia prior to study commencement. Results of this clinical trial will be published in peer-reviewed journals and presented at conferences. ACTRN12610000650099.
Publisher: Elsevier BV
Date: 09-2015
DOI: 10.1053/J.AJKD.2015.03.014
Abstract: In most studies, home dialysis associates with greater survival than facility hemodialysis (HD). However, the relationship between mortality risk and modality can vary by era. We describe and compare changes in survival with facility HD, peritoneal dialysis, and home HD over a 15-year period using data from the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA). An observational inception cohort study, using Cox proportional hazards and competing-risks regression. All adult patients initiating renal replacement therapy in Australia and New Zealand since March 31, 1998, followed up to December 31, 2012. Era at dialysis inception (1998-2002, 2003-2007, and 2008-2012). We adjusted for time-varying dialysis modality and comorbid conditions, demographics, initial state/country of treatment, late referral for nephrology care, primary kidney disease, and kidney function at dialysis inception. Patient mortality. Survival on dialysis therapy has improved despite increasing patient comorbid conditions. Compared to 1998 to 2002, there has been a 21% reduction in mortality for those on facility HD therapy, a 27% reduction for those on peritoneal dialysis therapy, and a 49% reduction for those on home HD therapy. Potential for residual confounding from limited collection of comorbid conditions; analyses lack data for blood pressure, fluid volume status, socioeconomics, medication, and biochemical parameters. Our study indicates that outcomes on dialysis therapy are improving with time and that this improvement is most marked with home dialysis modalities, especially home HD. This might be the result of better dialysis care (eg, improving predialysis care and more appropriate selection of patients for home dialysis). Other contributing factors are possible, such as improvements in general care of patient comorbid conditions and improvements in dialysis technology, although further research is needed to clarify these issues.
Publisher: Elsevier BV
Date: 2017
Publisher: Wiley
Date: 22-02-2011
DOI: 10.1111/J.1440-1797.2010.01412.X
Abstract: Vascular calcification (VC) contributes to cardiovascular disease in haemodialysis (HD) patients. Few controlled studies have addressed interventions to reduce VC, but non-calcium-based phosphate binders may be beneficial. No published randomized study to date has assessed the effect of lanthanum carbonate (LC) on VC progression. We conducted a pilot randomized controlled trial to determine the effect of LC on VC. Forty-five HD patients were randomized to either LC or calcium carbonate (CC). The primary outcome was change in aortic VC after 18 months. Secondary outcomes included superficial femoral artery (SFA) VC, bone mineral density (BMD) of the lumbar spine and serum markers of mineral metabolism. At baseline, 6 and 18 months, computed tomography was performed to measure VC and BMD. A random effect linear regression model was used to assess differences. Thirty patients completed the study (17 LC, 13 CC); baseline median age was 58 years, 38% were diabetic and 64% were male. Ninety-three per cent had aortic VC at commencement and 87% showed progression. At 18 months, there was significantly less aortic VC progression with LC than CC (adjusted difference -98.1 (-149.4, -46.8) Hounsfield units (HU), P < 0.001). There was also a non-significant reduction with LC in left SFA VC (-25.8 (-67.7, 16.1) HU, P = 0.2) and right SFA VC (-35.9 (-77.8, 5.9) HU, P = 0.09). There was no difference in lumbar spine BMD or in serum phosphate, calcium and parathyroid hormone levels between groups. Limitations of the study include the small sample size and loss to follow-up. Lanthanum carbonate was associated with reduced progression of aortic calcification compared with CC in HD patients over 18 months.
Publisher: Elsevier BV
Date: 2018
Publisher: Elsevier BV
Date: 04-1996
DOI: 10.1016/S0260-6917(96)80063-3
Abstract: The provision of further education opportunities is generally known to be limited for enrolled nurses. With the phasing out of the enrolled nurse qualification, it appears that many of them have to compete for limited places on conversion courses. This situation, which has been imposed on them, appears to create problems in their lives. This study was undertaken with a randomized sample of 30 enrolled nurses to find out what the psychosocial problems may be for those who have not yet been able to convert to registered nurse status. The findings reveal a sense of betrayal, frustration, anger and helplessness at being coerced into getting onto conversion courses. This is compounded by the fact that places on such courses are extremely limited. Superimposed on these, fears were also expressed for their jobs, particularly in the case of nurses from the area of learning disabilities. Although there are suggestions that they need to get onto conversion courses, there is a sense of disillusionment as places on the courses are extremely limited. Whilst the intention to convert may be present, many feel prevented from doing so because of their family commitments. In many instances, those commitments were non-existent when they first embarked on their nurse education/training. At the very least, the findings suggest a moral responsibility on the part of the relevant authorities to undertake a coordinated effort to help this group of nurses. After all, it may be suggested that their predicament is not of their own making.
Publisher: BMJ
Date: 03-2017
Publisher: Wiley
Date: 04-2018
DOI: 10.1111/TID.12866
Abstract: Conjugate pneumococcal vaccine is recommended for kidney transplant recipients; however, its immunogenicity and potential to trigger allograft rejection through generation of de novo anti-human leukocyte antigen (HLA) antibodies have not been well studied. Clinically stable kidney transplant recipients participated in a prospective cohort study and received a single dose of 13-valent conjugate pneumococcal vaccine. Anti-pneumococcal IgG was measured for the 13 vaccine serotypes pre and post vaccination, and functional anti-pneumococcal IgG for 4 serotypes post vaccination. Anti-HLA antibodies were measured before and after vaccination. Kidney transplant recipients were followed clinically for 12 months for episodes of allograft rejection or invasive pneumococcal disease. Forty-five kidney transplant recipients participated. The median number of days between pre and post vaccination serology was 27 (range 21-59). Post vaccination, there was a median 1.1- to 1.7-fold increase in anti-pneumococcal IgG antibody concentrations across the 13 serotypes. Kidney transplant recipients displayed a functional antibody titer ≥1:8 for a median of 3 of the 4 serotypes. Post vaccination, there were no de novo anti-HLA antibodies, no episodes of biopsy-proven rejection, and no invasive pneumococcal disease. A single dose of 13-valent conjugate pneumococcal vaccine elicits increased titers and breadth of functional anti-pneumococcal antibodies in kidney transplant recipients without stimulating rejection or donor-specific antibodies.
Publisher: Elsevier BV
Date: 11-2012
Publisher: Oxford University Press (OUP)
Date: 11-12-2021
DOI: 10.1093/NDT/GFAA358
Abstract: Home haemodialysis (HHD) is utilized significantly less often than facility HD globally, with few exceptions, despite being associated with improved survival and better quality of life. Previously, HHD was offered almost exclusively to younger patients with few comorbidities. However, with the increasing burden of end-stage kidney disease (ESKD) alongside an ageing population, increasing numbers of older patients are being treated with HHD. This study aims to re-evaluate survival and related outcomes in the context of this epidemiological shift. A matched cohort design was used to compare all-cause mortality, transplantation, average biochemical values and graft survival 6 months post-transplant between HHD and facility HD patients. A total of 181 HHD patients from a major hospital network were included, with 413 facility HD patients from the Australia and New Zealand Dialysis and Transplant Registry matched by age, gender and cause of ESKD. Survival analysis and competing risks analysis (for transplantation) were performed. After adjusting for body mass index, smoking status, racial group and comorbidities, HHD was associated with a significantly reduced risk of death compared with facility HD [hazard ratio 0.47 (95% confidence interval 0.30–0.74)]. Transplantation rates were comparable, with high rates of graft survival at 6 months in both groups. Haemoglobin, calcium and parathyroid hormone levels did not vary significantly; however, HHD patients had significantly lower phosphate levels. In this study, improved survival outcomes were observed in patients on home compared with facility dialysis, with comparable rates of transplantation, graft survival and biochemical control.
Publisher: Elsevier BV
Date: 09-2016
DOI: 10.1053/J.AJKD.2016.02.037
Abstract: In the context of clinical research, investigators have historically selected the outcomes that they consider to be important, but these are often discordant with patients' priorities. Efforts to define and report patient-centered outcomes are gaining momentum, though little work has been done in nephrology. We aimed to identify patient and caregiver priorities for outcomes in hemodialysis. Nominal group technique. Patients on hemodialysis therapy and their caregivers were purposively sampled from 4 dialysis units in Australia (Sydney and Melbourne) and 7 dialysis units in Canada (Calgary). Identification and ranking of outcomes. Mean rank score (of 10) for top 10 outcomes and thematic analysis. 82 participants (58 patients, 24 caregivers) aged 24 to 87 (mean, 58.4) years in 12 nominal groups identified 68 outcomes. The 10 top-ranked outcomes were fatigue/energy (mean rank score, 4.5), survival (defined by patients as resilience and coping; 3.7), ability to travel (3.6), dialysis-free time (3.3), impact on family (3.2), ability to work (2.5), sleep (2.3), anxiety/stress (2.1), decrease in blood pressure (2.0), and lack of appetite/taste (1.9). Mortality ranked only 14th and was not regarded as the complement of survival. Caregivers ranked mortality, anxiety, and depression higher than patients, whereas patients ranked ability to work higher. Four themes underpinned their rankings: living well, ability to control outcomes, tangible and experiential relevance, and severity and intrusiveness. Only English-speaking participants were eligible. Although trials in hemodialysis have typically focused on outcomes such as death, adverse events, and biological markers, patients tend to prioritize outcomes that are more relevant to their daily living and well-being. 
Researchers need to consider interventions that are likely to improve these outcomes and measure and report patient-relevant outcomes in trials, and clinicians may become more patient-orientated by using these outcomes in their clinical encounters.
Publisher: Oxford University Press (OUP)
Date: 05-04-2010
DOI: 10.1093/NDT/GFQ147
Abstract: Ischaemia/reperfusion (I/R) is an important factor in delayed graft function in renal transplantation and is a determinant of long-term graft outcome. This study examined the role of c-Jun N-terminal kinase (JNK) signalling in human and experimental renal I/R injury. Biopsies obtained 15-20 min after reperfusion of human renal allografts were examined for JNK signalling by immunostaining for phospho-c-Jun. To examine the pathologic role of JNK signalling, a selective JNK inhibitor (CC-401) was administered to rats before or after the induction of a 30-min period of bilateral renal ischaemia followed by reperfusion. Renal function and tubular damage were analysed. Substantial JNK activation was evident in tubular epithelial cells in kidneys from deceased donors (n = 30), which was less prominent in kidneys from live donors (n = 7) (44.6 +/- 24.8% vs 29.1 +/- 20% p-c-Jun+, respectively; P < 0.05), whereas biopsies of thin basement membrane disease exhibited little or no p-c-Jun staining. The degree of p-c-Jun staining correlated with ischaemic time in deceased donor allografts, but not with graft function. Administration of CC-401 to rats prior to bilateral renal I/R prevented acute renal failure and largely prevented tubular damage, leucocyte infiltration and upregulation of pro-inflammatory molecules. However, delaying CC-401 treatment until 1 h after reperfusion (after the peak of JNK activation) had no protective effect. We have identified acute activation of the JNK signalling pathway following I/R in human kidney allografts. Experimental studies indicate that blockade of JNK signalling, commenced prior to this activation, can prevent acute tubular necrosis and renal dysfunction secondary to I/R injury.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 02-2004
DOI: 10.1097/01.ASN.0000109668.05157.05
Abstract: The native arteriovenous fistula (AVF) is the preferred vascular access because of its longevity and its lower rates of infection and intervention. Recent studies suggest that the AVF may offer a survival advantage. Because these data were derived from observational studies, they are prone to potential bias. The use of propensity scores offers an additional method to reduce bias resulting from nonrandomized treatment assignment. Adult (aged 18 years or more) patients who commenced hemodialysis in Australia and New Zealand between April 1, 1999, and March 31, 2002, were studied using the Australian and New Zealand Dialysis and Transplant Association (ANZDATA) Registry. Cox regression was used to determine the effect of access type on total mortality. Propensity scores were calculated and used both as a controlling variable in the multivariable model and to construct matched cohorts. The catheter analysis was stratified by dialysis duration at entry to ANZDATA to satisfy the proportional-hazards assumption. There were 612 deaths in 3749 patients (median follow-up, 1.07 yr). After adjustment for confounding factors and propensity scores, catheter use was predictive of mortality. Patients with arteriovenous grafts (AVG) also had a significantly increased risk of death. Effect estimates were also consistent in the smaller propensity score-matched cohorts. Both AVG and catheter use in incident hemodialysis patients are associated with a significant excess of total mortality. Reducing catheter use and increasing the proportion of patients commencing hemodialysis with a mature AVF remain important clinical objectives.
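Construction of propensity score-matched cohorts of the kind this study used can be sketched as nearest-neighbour matching: each treated patient is paired with the unused control whose propensity score is closest. A minimal greedy sketch; the patient identifiers and scores below are hypothetical, purely for illustration:

```python
def match_controls(treated, controls):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    `treated` and `controls` are lists of (patient_id, propensity_score);
    each control is used at most once."""
    available = dict(controls)  # id -> propensity score
    pairs = []
    for tid, score in treated:
        if not available:
            break
        # Pick the unused control whose score is closest to this patient's.
        cid = min(available, key=lambda c: abs(available[c] - score))
        pairs.append((tid, cid))
        del available[cid]
    return pairs

# Hypothetical patients: catheter starts (treated) vs fistula starts (controls).
treated = [("T1", 0.62), ("T2", 0.35)]
controls = [("C1", 0.30), ("C2", 0.60), ("C3", 0.90)]
pairs = match_controls(treated, controls)
```

Production implementations typically add a caliper (a maximum allowed score distance) and match within strata; this sketch shows only the core pairing step.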
Publisher: Wiley
Date: 28-01-2016
DOI: 10.1111/NEP.12585
Abstract: Nocardia infections are an uncommon but important cause of morbidity and mortality in renal transplant recipients. The present study was carried out to determine the spectrum of Nocardia infections in a renal transplant centre in Australia. A retrospective chart analysis of all renal transplants performed from 2008 to 2014 was conducted to identify cases of culture-proven Nocardia infection. The clinical course for each patient with nocardiosis was examined. Four of the 543 renal transplant patients developed Nocardia infection within 2 to 13 months post-transplant. All patients were judged at high immunological risk of rejection pre-transplant and had received multiple sessions of plasmapheresis and intravenous immunoglobulin before the onset of the infection. Two patients presented with pulmonary nocardiosis and two with cerebral abscesses. One case of pulmonary nocardiosis was complicated by pulmonary aspergillosis and the other by cytomegalovirus pneumonia. All four patients improved with combination antibiotic therapy guided by drug susceptibility testing. At the time of Nocardia infection, all four patients were receiving primary prophylaxis with trimethoprim/sulphamethoxazole (TMP-SMX) 160/800 mg twice weekly. Plasmapheresis may be a risk factor for Nocardia infection and needs further study. Nocardia infection may coexist with other opportunistic infections. Identification of the Nocardia species and drug susceptibility testing are essential in guiding the effective management of patients with Nocardia. Intermittent TMP-SMX (one double-strength tablet, twice a week) appears insufficient to prevent Nocardia infection in renal transplant recipients.
Publisher: Elsevier BV
Date: 2009
DOI: 10.1053/J.AJKD.2008.06.026
Abstract: Starting hemodialysis therapy with an arteriovenous fistula (AVF) is associated with improved patient survival. Clinical audit showed that less than 50% of our patients started hemodialysis therapy with an AVF. Quality improvement report; prospective before-and-after study. Tertiary referral hospital with 184 patients starting hemodialysis therapy in 2005 and 2006. Situational analysis showed poor overall coordination of surgical waiting lists. The multifaceted intervention included a vascular access nurse coordinator and an algorithm to prioritize surgery. Vascular access used at first hemodialysis treatment in patients with pre-end-stage renal disease in the 12 months before and after the intervention. Proportions of patients starting hemodialysis therapy with an AVF. Overall, 65% of patients started hemodialysis therapy with an AVF; 2%, with an arteriovenous graft; and 33%, with a catheter. The proportion of patients starting hemodialysis therapy with an AVF increased from 56% preimplementation to 75% postimplementation (P = 0.007). After adjustment for age, sex, late referral, cause of renal failure, and presentation type, patients starting dialysis therapy in the implementation phase were twice as likely to start treatment with an AVF (odds ratio, 2.85; P = 0.008). The total number of catheter-days in the implementation phase was half that of the preimplementation phase (2,833 v 4,685 days). Nonrandomized study. Implementation of a multifaceted intervention including a vascular access nurse and an algorithm to prioritize surgery significantly increased the proportion of patients starting dialysis therapy with an AVF by improving the overall coordination of the surgical waiting list.
Publisher: Elsevier BV
Date: 2013
DOI: 10.1053/J.AJKD.2012.07.008
Abstract: There has been little study to date of daily variation in cardiac death in dialysis patients and whether such variation differs according to dialysis modality and session frequency. Observational cohort study using ANZDATA (Australia and New Zealand Dialysis and Transplant) Registry data. All adult patients with end-stage kidney failure treated by dialysis in Australia and New Zealand who died between 1999 and 2008. Timing of death (day of week), dialysis modality, hemodialysis (HD) session frequency, and demographic, clinical, and facility variables. Cardiac and noncardiac mortality. 14,636 adult dialysis patients died during the study period (HD, n = 10,338; peritoneal dialysis [PD], n = 4,298). Cardiac death accounted for 40% of deaths and was significantly more likely to occur on Mondays in in-center HD patients receiving 3 or fewer dialysis sessions per week (n = 9,503; adjusted OR, 1.26; 95% CI, 1.14-1.40; P < 0.001 compared with the mean odds of cardiac death for all days of the week). This daily variation in cardiac death was not seen in PD patients, in-center HD patients receiving more than 3 sessions per week (n = 251), or home HD patients (n = 573). Subgroup analyses showed that deaths related to hyperkalemia and myocardial infarction also were associated with daily variation in risk in HD patients. This pattern was not seen for vascular, infective, malignant, dialysis therapy withdrawal, or other deaths. Limited covariate adjustment; residual confounding and coding bias could not be excluded. Possible type 2 statistical error due to limited sample size of home HD and enhanced-frequency HD cohorts. Daily variation in the pattern of cardiac deaths was observed in HD patients receiving 3 or fewer dialysis sessions per week, but not in PD, home HD, and HD patients receiving more than 3 sessions per week.
Publisher: Elsevier BV
Date: 10-2020
Publisher: Public Library of Science (PLoS)
Date: 16-12-2014
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 03-09-2020
Publisher: Wiley
Date: 2008
DOI: 10.1002/AJHB.20729
Abstract: To compare body size measurements in Australian Aboriginals living in three remote communities in the Northern Territory of Australia with those of the general Australian population. Height, weight, waist circumference (WC) and hip circumference, and the derivative values of body mass index (BMI), waist-hip ratio (WHR), waist-height ratio (WHT), and waist-weight ratio (WWT) of adult Aboriginal volunteers (n = 814), aged 25 to 74 years, were compared with those of participants in the nationally representative 'AusDiab' survey (n = 10,434). The Aboriginal body habitus profiles differed considerably from the Australian profile. When compared with Australian females, Aboriginal females were taller and had lower hip circumference but had higher WC, WHR, WHT, and WWT (P < 0.01 for all). When compared with their Australian counterparts, Aboriginal males were shorter and had lower body weight, WC, hip circumference, BMI, and WHT, but had higher WHR and WWT (P < 0.001 for all). Significantly more Aboriginal females were classified as overweight and/or obese using cutoffs defined by WC and by WHR than by BMI. Aboriginal males were less often overweight and/or obese by BMI than their counterparts, but were significantly more often overweight or obese by WHR. There were significant variations in body size profiles between Aboriginal communities. However, the theme of excess waist measurements relative to weight was uniform. Aboriginal people had preferential central fat deposition in relation to their overall weight. BMI significantly underestimated overweight and obesity as assessed by waist measurements among Aboriginals. This relationship of preferential central fat deposition to the current epidemic of chronic diseases needs to be explored further.
Publisher: Wiley
Date: 05-2013
DOI: 10.1111/SDI.12096
Publisher: Wiley
Date: 23-12-2010
DOI: 10.1111/J.1440-1797.2010.01362.X
Abstract: Intra-dialytic hypotension (IDH) is a common problem affecting haemodialysis patients. Its aetiology is complex and influenced by multiple patient and dialysis factors. IDH occurs when the normal cardiovascular response cannot compensate for the volume loss associated with ultrafiltration, and is exacerbated by a myriad of factors including intra-dialytic fluid gains, cardiovascular disease, antihypertensive medications and the physiological demands placed on patients by conventional haemodialysis. The use of blood volume monitoring and blood temperature monitoring technologies is advocated as a tool to predict, and therefore prevent, episodes of IDH. We review the clinical utility of these technologies and summarize the current evidence of their effect on reducing the incidence of IDH in the haemodialysis population.
Publisher: Springer Science and Business Media LLC
Date: 27-06-2015
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 19-11-2020
DOI: 10.2215/CJN.16351020
Publisher: Springer Science and Business Media LLC
Date: 16-03-2017
Publisher: Springer Science and Business Media LLC
Date: 19-11-2018
Publisher: Wiley
Date: 04-2012
DOI: 10.1111/J.1754-9485.2011.02323.X
Abstract: Contrast-induced nephropathy (CIN), a common iatrogenic cause of acute renal failure, is preventable. Identification of impaired renal function prior to intravenous contrast is important. Questionnaire screening has been useful to negate the need for cumbersome and costly renal function testing in all patients prior to contrast-enhanced CT (CECT). The Royal Australian and New Zealand College of Radiologists guidelines include age older than 60 years as a risk marker requiring renal function testing. The aim of this retrospective study was to assess the efficacy of the pre-CT questionnaire in identifying patients with pre-existing renal impairment even in this older-than-60 age group. All outpatients were given questionnaires containing 11 CIN risk markers prior to CECT. Radiographers documented age, gender, serum creatinine and/or estimated glomerular filtration rate (eGFR, mL/min/1.73 m(2)) within 3 months of CT. Questionnaires of all patients older than 60 years were collated. The data were tabulated and analyzed. Incomplete questionnaires were excluded. 134/171 (78.4%) patients had eGFR ≥ 60 and 37/171 (21.6%) had eGFR < 60, with 31/171 (18.1%) having eGFR between 30 and 60 and 3/171 (1.8%) having eGFR < 30. 47/171 (27.5%) circled 'no' to all risk markers. Sensitivity was 81.1% (95% confidence interval (CI), 64.8-92%), specificity 29.9% (95% CI, 22.3-38.4%), positive predictive value 24.2% (95% CI, 17-32.7%) and negative predictive value 85.1% (95% CI, 71.7-93.8%). Kidney disease, anaemia, myeloma and vasculitis appear to be statistically significant risk factors (P < 0.05). All three true-positive patients with eGFR < 30 indicated known kidney disease. Seven false-negative patients had eGFR 30-60, with 4/7 (57.1%) having CIN risk markers in their medical records. Questionnaire screening for CIN risk has a high negative predictive value (85.1%) even in patients older than 60 years.
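The diagnostic-accuracy figures above can be reproduced from a 2×2 table. The counts below are a hypothetical reconstruction inferred from the abstract's totals (37 patients with eGFR < 60, 47 all-'no' questionnaires, and the seven reported false negatives), not the paper's raw data:

```python
# Reconstructed 2x2 table (assumption, inferred from the abstract's figures):
# of 171 patients, 37 were disease-positive (eGFR < 60) and 47 were
# test-negative (answered 'no' to every risk marker). Sensitivity of 81.1%
# implies 30 true positives and 7 false negatives; NPV of 85.1% implies
# 40 true negatives, leaving 94 false positives.
tp, fn = 30, 7    # eGFR < 60: questionnaire positive / negative
tn, fp = 40, 94   # eGFR >= 60: questionnaire negative / positive

sensitivity = tp / (tp + fn)   # 30/37  ~ 81.1%
specificity = tn / (tn + fp)   # 40/134 ~ 29.9%
ppv = tp / (tp + fp)           # 30/124 ~ 24.2%
npv = tn / (tn + fn)           # 40/47  ~ 85.1%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```

The reconstructed counts reproduce all four reported point estimates, which is a useful internal-consistency check on the abstract's arithmetic.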
Publisher: Elsevier BV
Date: 10-2014
DOI: 10.1038/KI.2013.553
Publisher: S. Karger AG
Date: 2021
DOI: 10.1159/000515231
Abstract: Introduction: Acute kidney diseases and disorders (AKD) encompass acute kidney injury (AKI) and subacute or persistent alterations in kidney function that occur after an initiating event. Unlike AKI, accurate estimates of the incidence and prognosis of AKD are not available and its clinical significance is uncertain. Methods: We studied the epidemiology and long-term outcome of AKD (as defined by the KDIGO criteria), with or without AKI, in a retrospective cohort of adults hospitalized at a single centre for >… h between 2012 and 2016 who had a baseline eGFR ≥ 60 mL/min/1.73 m² and were alive at 30 days. In patients for whom follow-up data were available, the risks of major adverse kidney events (MAKEs), CKD, kidney failure, and death were examined by Cox and competing risk regression analyses. Results: Among 62,977 patients, 906 (1%) had AKD with AKI and 485 (1%) had AKD without AKI. Follow-up data were available for 36,118 patients. In this cohort, compared to no kidney disease, AKD with AKI was associated with a higher risk of MAKEs (40.25 per 100 person-years; hazard ratio [HR] 2.51, 95% confidence interval [CI] 2.16–2.91), CKD (27.84 per 100 person-years; subhazard ratio [SHR] 3.18, 95% CI 2.60–3.89), kidney failure (0.56 per 100 person-years; SHR 24.84, 95% CI 5.93–104.03), and death (14.86 per 100 person-years; HR 1.52, 95% CI 1.20–1.92). Patients who had AKD without AKI also had a higher risk of MAKEs (36.21 per 100 person-years; HR 2.26, 95% CI 1.89–2.70), CKD (22.94 per 100 person-years; SHR 2.69, 95% CI 2.11–3.43), kidney failure (0.28 per 100 person-years; SHR 12.63, 95% CI 1.48–107.64), and death (14.86 per 100 person-years; HR 1.57, 95% CI 1.19–2.07). MAKEs after AKD were driven by CKD, especially in the first 3 months. Conclusions: These findings establish the burden and poor prognosis of AKD and support prioritisation of clinical initiatives and research strategies to mitigate such risk.
Publisher: Elsevier BV
Date: 03-2014
Publisher: Oxford University Press (OUP)
Date: 20-07-2006
DOI: 10.1093/NDT/GFL242
Abstract: Clinical practice guidelines recommend that the preferred method of surveillance for arteriovenous fistula (AVF) is the measurement of AVF blood flow (Qa). As these recommendations are based on observational studies, we conducted a randomized, prospective, double-blind, controlled trial to assess whether Qa surveillance results in an increased detection of AVF stenosis. A total of 137 patients were randomly assigned to receive either continuing AVF surveillance using current clinical criteria (control, usual treatment) or usual treatment plus AVF blood-flow surveillance by ultrasound dilution (Qa surveillance group). The primary outcome measure was the detection of a significant (>50%) AVF stenosis. There were 67 and 68 patients assigned to the control and Qa surveillance groups, respectively. Patients in the Qa surveillance group were twice as likely to have a stenosis detected compared with the control group (hazard ratio (HR), 2.27; 95% confidence interval (CI), 0.85-5.98; P = 0.09), with a trend for a significant stenosis to be detected earlier in the Qa surveillance group (P = 0.09, log rank test). However, using the Qa results alone prior to angiography, the area under the receiver operating characteristic curve demonstrated, at best, a moderate prediction of (>50%) AVF stenosis (0.78, 95% CI 0.63-0.94, P = 0.006). This study demonstrates that the addition of AVF Qa monitoring to clinical screening for AVF stenosis resulted in a non-significant doubling in the detection of angiographically significant AVF stenosis. Further, large multi-centre randomized trials are feasible and will be necessary to confirm whether Qa surveillance and the correction of detected AVF stenosis will lead to a reduction in AVF thrombosis and increased AVF survival.
Publisher: Elsevier BV
Date: 02-2016
Publisher: Wiley
Date: 07-05-2019
DOI: 10.1111/NEP.13565
Publisher: Wiley
Date: 26-11-2018
DOI: 10.1111/SDI.12658
Abstract: In patients receiving hemodialysis, the provision of safe and effective vascular access using an arteriovenous fistula or graft is regarded as a critical priority by patients and health professionals. Vascular access failure is associated with morbidity and mortality, such that strategies to prevent these outcomes are essential. Inadequate vascular remodeling and neointimal hyperplasia resulting in stenosis and frequently thrombosis are critical to the pathogenesis of access failure. Systemic medical therapies with pleiotropic effects including antiplatelet agents, omega-3 polyunsaturated fatty acids (fish oils), statins, and inhibitors of the renin-angiotensin-aldosterone system (RAAS) may reduce vascular access failure by promoting vascular access maturation and reducing stenosis and thrombosis through antiproliferative, antiaggregatory, anti-inflammatory and vasodilatory effects. Despite such promise, the results of retrospective analyses and randomized controlled trials of these agents on arteriovenous fistula and graft outcomes have been mixed. This review describes the current understanding of the pathogenesis of arteriovenous fistula and graft failure; the biological effects of antiplatelet agents, fish oil supplementation, RAAS blockers and statins that may be beneficial in improving vascular access survival; and results from clinical trials that have investigated the effect of these agents on arteriovenous fistula and graft outcomes. It also explores future therapeutic approaches combining these agents with novel treatment strategies.
Publisher: Wiley
Date: 22-02-2011
DOI: 10.1111/J.1440-1797.2011.01443.X
Abstract: Fibroblast growth factor 23 (FGF-23) is a recently discovered regulator of phosphate and mineral metabolism. Its main physiological function is the enhancement of renal phosphate excretion. FGF-23 levels are inversely related to renal function and in patients with chronic kidney disease (CKD) elevation in FGF-23 precedes the rise of serum phosphate. Studies have demonstrated an important role for FGF-23 in the development of secondary hyperparathyroidism through an effect on parathyroid hormone and calcitriol. In cross-sectional studies FGF-23 has been associated with surrogate markers of cardiovascular disease such as endothelial dysfunction and arterial stiffness. FGF-23 has also been associated with both progression of CKD and mortality in dialysis patients. The discovery of FGF-23 has provided a profound new insight into bone and mineral metabolism, and it may become an important biomarker and therapeutic target in CKD.
Publisher: Elsevier BV
Date: 10-2011
Publisher: Wiley
Date: 10-04-2018
DOI: 10.1111/TID.12888
Abstract: Microsporidia are intracellular organisms most commonly known to cause opportunistic infection in patients with human immunodeficiency virus (HIV). There have been several case reports of infection in solid organ and bone marrow transplant recipients. Here, we report a case of a non-HIV-infected renal transplant patient with microsporidiosis of the renal tract associated with acute graft dysfunction. We also review the literature of 12 previously reported cases of microsporidiosis in patients with renal transplants who had described graft involvement. We review the pattern of illness as well as the common renal biopsy features when microsporidial infection is associated with renal graft infection.
Publisher: Springer Science and Business Media LLC
Date: 03-12-2013
Abstract: The population of elderly patients with end-stage renal disease (ESRD) is growing rapidly worldwide. The high prevalence of comorbidities, limited life expectancy and complex quality of life issues associated with this population pose substantial challenges for clinicians in terms of clinical decision-making and providing optimal care. The first dilemma encountered in the management of an elderly patient with ESRD is deciding whether to initiate renal replacement therapy and, if so, selecting the most-suitable dialysis modality. Planning vascular access for haemodialysis is associated with additional challenges. Several clinical practice guidelines recommend an arteriovenous fistula, rather than a central venous catheter or arteriovenous graft, as the preferred access for maintenance haemodialysis therapy. However, whether this recommendation is applicable to elderly patients with ESRD and a limited life expectancy is unclear. Selection and planning of the most appropriate vascular access for an elderly patient with ESRD requires careful consideration of several factors and ultimately should lead to an improvement in the patient's quality of life.
Publisher: Springer Science and Business Media LLC
Date: 21-11-2023
DOI: 10.1007/S00125-022-05832-0
Abstract: Whether sodium–glucose co-transporter 2 inhibitors (SGLT2is) or glucagon-like peptide-1 receptor agonists (GLP-1 RAs) are cost-effective based solely on their cardiovascular and kidney benefits is unknown. We projected the health and economic outcomes due to myocardial infarction (MI), stroke, heart failure (HF) and end-stage kidney disease (ESKD) among people with type 2 diabetes, with and without CVD, under scenarios of widespread use of these drugs. We designed a microsimulation model using real-world data that captured CVD and ESKD morbidity and mortality from 2020 to 2040. The populations and transition probabilities were derived by linking the Australian Diabetes Registry (1.1 million people with type 2 diabetes) to hospital admissions databases, the National Death Index and the ESKD Registry using data from 2010 to 2019. We modelled four interventions: increase in use of SGLT2is or GLP-1 RAs to 75% of the total population with type 2 diabetes, and increase in use of SGLT2is or GLP-1 RAs to 75% of the secondary prevention population (i.e. people with type 2 diabetes and prior CVD). All interventions were compared with current use of SGLT2is (20% of the total population) and GLP-1 RAs (5% of the total population). Outcomes of interest included quality-adjusted life years (QALYs), total costs (from the Australian public healthcare perspective) and the incremental cost-effectiveness ratio (ICER). We applied 5% annual discounting for health economic outcomes. The willingness-to-pay threshold was set at AU$28,000 per QALY gained. The numbers of QALYs gained from 2020 to 2040 with increased SGLT2i and GLP-1 RA use in the total population (n = 1.1 million in 2020; n = 1.5 million in 2040) were 176,446 and 200,932, respectively, compared with current use. Net cost differences were AU$4.2 billion for SGLT2is and AU$20.2 billion for GLP-1 RAs, and the ICERs were AU$23,717 and AU$100,705 per QALY gained, respectively.
In the secondary prevention population, the ICERs were AU$8878 for SGLT2is and AU$79,742 for GLP-1 RAs. At current prices, use of SGLT2is, but not GLP-1 RAs, would be cost-effective when considering only their cardiovascular and kidney disease benefits for people with type 2 diabetes.
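The cost-effectiveness verdict above rests on simple arithmetic: the ICER is the net cost divided by the QALYs gained, compared against the willingness-to-pay threshold. A minimal sketch using only figures quoted in the abstract (the helper name `icer` is illustrative; small discrepancies with the published ICERs arise from rounding of the billion-dollar cost figures):

```python
def icer(net_cost_aud: float, qalys_gained: float) -> float:
    """Incremental cost-effectiveness ratio, in AU$ per QALY gained."""
    return net_cost_aud / qalys_gained

WTP_THRESHOLD = 28_000  # willingness-to-pay threshold, AU$ per QALY (from the abstract)

# Total-population scenarios, 2020-2040:
icer_sglt2i = icer(4.2e9, 176_446)    # ~AU$23,800/QALY -> below threshold: cost-effective
icer_glp1ra = icer(20.2e9, 200_932)   # ~AU$100,500/QALY -> above threshold: not cost-effective

print(icer_sglt2i < WTP_THRESHOLD, icer_glp1ra < WTP_THRESHOLD)
```

The recomputed values (≈AU$23,800 and ≈AU$100,500) sit close to the published AU$23,717 and AU$100,705 per QALY, consistent with the costs being quoted to one decimal place in billions.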
No related grants have been discovered for Kevan Polkinghorne.