ORCID Profile
0000-0003-0499-4527
Current Organisations
Te Whatu Ora Health New Zealand Hauora a Toi Bay of Plenty, The University of Auckland
Publisher: Oxford University Press (OUP)
Date: 18-07-2012
DOI: 10.1093/NDT/GFS305
Abstract: The success of peritoneal dialysis (PD) is dependent on timely and adequate PD catheter access. In many centres, including our own, the technique of PD catheter insertion has evolved to laparoscopic surgery. An alternative method of catheter insertion is performed by radiologists using a percutaneous modified Seldinger technique under fluoroscopic guidance. However, there are no clinical trials comparing these two methods of catheter insertion. From 1 April 1999 to 30 August 2004, we randomly assigned 113 pre-dialysis patients to receive PD catheter insertion using fluoroscopic guidance under local anaesthesia by radiologists or insertion using laparoscopy under general anaesthesia by a surgeon. The primary endpoint was the occurrence of dialysis catheter complications (complication-free catheter survival) by Day 365, a composite endpoint that included complications secondary to mechanical and infectious causes. Secondary endpoints were the occurrence of catheter removal (overall catheter survival) and death from any cause (patient survival) by Day 365, procedure pain, procedure time, procedure room utilization time, length of inpatient admission and direct hospital costs. Results were analysed by univariate and multivariate methods and by Kaplan-Meier survival curves. Complication-free catheter survival was significantly higher at 42.5% [95% confidence interval (CI) 29.3-55] in the radiological group compared with 18.1% (95% CI 8.9-29.8) in the laparoscopic group (P-value = 0.03). Excess complications in the laparoscopic group included peritonitis, peritoneal dialysate leaks and umbilical herniae. One-year overall catheter survival and 1-year subject survival were not different between the groups. Hospital costs were significantly higher in the laparoscopic group by almost a factor of two. Radiological insertion of first PD catheters using fluoroscopy is a clinically non-inferior and cost-effective alternative to surgical laparoscopic insertion.
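The trial's survival analyses rest on the Kaplan-Meier estimator. As a rough illustration of how a complication-free catheter survival curve is built (a generic sketch, not the study's code; the function and its inputs are hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : observation time for each subject
    events : 1 if the event (e.g. a catheter complication) occurred, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        n_leaving = sum(1 for tt, _ in data if tt == t)  # events + censorings at t
        if d > 0:
            surv *= 1 - d / n_at_risk
            steps.append((t, surv))
        n_at_risk -= n_leaving
        i += n_leaving
    return steps
```

Subjects censored at a given time reduce the risk set but do not step the curve down, which is what lets complication-free survival be compared at Day 365 despite unequal follow-up.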
Publisher: Wiley
Date: 19-11-2014
DOI: 10.1002/JORC.12043
Abstract: Peritoneal dialysis (PD) has been shown to offer a high quality of life and independence to patients. New Zealand (NZ) is a world leader in home dialysis, yet over the last decade, rates of PD have been steadily decreasing for unknown reasons. This paper reports on the findings of a national survey which explored the clinicians' perspectives on key factors that influence the rate of PD. Ten multi-answer questions were asked of several groups of dialysis health professionals to assess factors that are barriers and enablers to PD, including patient choice of dialysis modality, information about PD and pre-dialysis education delivery. All NZ nephrologists, pre-dialysis and PD nurses were invited to complete an anonymous online survey. Responses were analysed to identify perceived barriers and enablers influencing the rate of PD uptake amongst incident dialysis patients. Completed surveys were received from 52% of nephrologists, 100% of pre-dialysis nurses and 50% of PD nurses in NZ. In NZ, patients are offered a choice of dialysis modality with pre-dialysis nurses delivering the majority of education. The most frequently identified barriers to uptake of PD were lack of information about PD, established misconceptions about PD and late referrals to dialysis. Important enablers were early and frequent pre-dialysis education. The only two factors which were reported as very important contraindications to PD were dexterity and decreased cognitive function. Early and frequent pre-dialysis education encourages patients to choose PD and enables early identification and resolution of barriers to the uptake of PD.
Publisher: Elsevier BV
Date: 10-2006
Publisher: Oxford University Press (OUP)
Date: 02-04-2008
DOI: 10.1093/NDT/GFN252
Publisher: Elsevier BV
Date: 06-1998
DOI: 10.1053/AJKD.1998.V31.PM9631847
Abstract: Agreement and reproducibility of Daugirdas blood-based and Biostat 1000 dialysate-based Kt/V estimation were explored. Fifty-two dialysis treatments in 19 patients were studied. All patients were dialyzed by arteriovenous (AV) access. Good agreement was found in the comparison between laboratory predialysis blood urea nitrogen (BUN) and Biostat 1000 BUN. Each treatment was assessed for Kt/V simultaneously by Biostat 1000 and by Daugirdas methods based on predialysis and postdialysis BUN. Four Daugirdas blood-based Kt/V estimations per session were obtained: two were single pool Kt/V, the first using an "arterial" postdialysis BUN and the second a "mixed venous" postdialysis BUN, whereas the other two were double pool (or equilibrated) eKt/V obtained by factoring the respective single pool "arterial" and "mixed venous" Kt/V for the relative rate of solute removal. The four blood-based and Biostat 1000 Kt/V were examined for pooled-within-patient variability in 15 of the patients in whom three dialysis sessions on the same dialysis prescription were available, and these were not significantly different between the blood-based and Biostat 1000 Kt/V. The four blood-based Kt/V were then compared with the Biostat 1000 Kt/V using the concordance correlation coefficient (CC, 1 indicating pairs of observations fall on a line of identity, 0 indicating no relationship), and bias and range of agreement as defined by the Bland and Altman technique of analysis. The "mixed venous" eKt/V had the closest agreement with the Biostat 1000 Kt/V (CC = 0.77), but the range of agreement as defined by Bland and Altman was 0.62, implying that for a single session, there was a 95% chance that the "mixed venous" eKt/V would lie within +/- 0.31 of the Biostat 1000 Kt/V.
It is concluded that Biostat 1000 Kt/V results are comparable in large groups to certain Daugirdas blood-based Kt/V, although for a given dialysis session, clinically important differences in resulting Kt/V parameters may be seen between these two methods of estimating Kt/V.
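The blood-based estimates referenced above follow well-known published formulas. The sketch below shows the standard second-generation Daugirdas single-pool formula, the rate-equation equilibrated adjustment, and Bland-Altman limits of agreement; it is an illustrative approximation using commonly published coefficients, not code or coefficients taken from this paper:

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_litres, post_weight_kg):
    # Daugirdas second-generation single-pool Kt/V
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

def e_ktv(spktv, hours, access="arterial"):
    # Rate-equation equilibrated (double-pool) Kt/V; the coefficients
    # differ for "arterial" vs "mixed venous" postdialysis sampling
    if access == "arterial":
        return spktv - 0.6 * (spktv / hours) + 0.03
    return spktv - 0.47 * (spktv / hours) + 0.02

def bland_altman(method_a, method_b):
    # Mean bias and 95% limits of agreement between paired measurements
    diffs = [x - y for x, y in zip(method_a, method_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean, (mean - 1.96 * sd, mean + 1.96 * sd)
```

The "range of agreement was 0.62" statement above corresponds to the width of the Bland-Altman limits: half the width (about 0.31) bounds where a single session's paired difference is expected to fall with 95% probability.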
Publisher: Elsevier BV
Date: 12-2017
DOI: 10.1053/J.AJKD.2017.06.023
Abstract: Involving patients in dialysis decision making is crucial, yet little is known about patient-reported experiences and patient-reported outcomes of dialysis. A prospective longitudinal cohort study of older patients receiving long-term dialysis. Predictors of worse health status were assessed using modified Poisson regression analysis. 150 New Zealanders 65 years or older with end-stage kidney disease dialyzing at 1 of 3 nephrology centers. Patient-reported social and health characteristics based on the 36-Item Short Form Health Survey, EQ-5D, and Kidney Symptom Score questionnaires and clinical information from health records. Health status after 12 months of follow-up. 35% of study participants had reported worse health or had died at 12 months. Baseline variables independently associated with reduced risk for worse health status were Pacific ethnicity (relative risk [RR], 0.63; 95% CI, 0.53-0.72), greater bother on the Kidney Symptom Score (RR, 0.78; 95% CI, 0.62-0.97), and dialyzing at home with either home hemodialysis (RR, 0.55; 95% CI, 0.36-0.83) or peritoneal dialysis (RR, 0.86; 95% CI, 0.79-0.93). Baseline variables independently associated with increased risk were greater social dissatisfaction (RR, 1.66; 95% CI, 1.27-2.17), lower sense of community (RR, 1.70; 95% CI, 1.09-2.64), comorbid conditions (RR, 1.70; 95% CI, 1.09-2.64), EQ-5D anxiety/depression (RR, 1.61; 95% CI, 1.07-2.42), poor/fair overall general health (RR, 1.60; 95% CI, 1.37-1.85), and longer time on dialysis therapy (RR, 1.03; 95% CI, 1.00-1.05). Small sample size restricted study power. Most older dialyzing patients studied reported same/better health 12 months later. Home-based dialysis, regardless of whether hemodialysis or peritoneal dialysis, was associated with reduced risk for worse health, and older Pacific People reported better outcomes on dialysis therapy.
Social and/or clinical interventions aimed at improving social satisfaction, sense of community, and reducing anxiety/depression may favorably affect the experiences of older patients receiving long-term dialysis.
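Modified Poisson regression yields adjusted relative risks like those quoted above. For intuition only, a much simpler unadjusted analogue is the relative risk from a 2x2 table with a log-method 95% CI (the study's estimates were covariate-adjusted; this sketch and its names are illustrative):

```python
import math

def relative_risk(a, b, c, d):
    """Unadjusted relative risk from a 2x2 table, with log-method 95% CI.

    a: exposed with outcome      b: exposed without outcome
    c: unexposed with outcome    d: unexposed without outcome
    """
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    # Standard error of log(RR)
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)
```

An RR below 1 with a CI excluding 1 (as for home hemodialysis above) indicates a protective association; the modified Poisson approach extends this to multiple covariates with robust standard errors.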
Publisher: Elsevier BV
Date: 06-2020
Publisher: Elsevier BV
Date: 06-2006
Abstract: Hemodiafiltration (HDF) is used sporadically for renal replacement therapy in Europe but not in the US. Characteristics and outcomes were compared for patients receiving HDF versus hemodialysis (HD) in five European countries in the Dialysis Outcomes and Practice Patterns Study. The study followed 2165 patients from 1998 to 2001, stratified into four groups: low- and high-flux HD, and low- and high-efficiency HDF. Patient characteristics including age, sex, 14 comorbid conditions, and time on dialysis were compared between each group using multivariate logistic regression. Cox proportional hazards regression assessed adjusted differences in mortality risk. Prevalence of HDF ranged from 1.8% in Spain to 20.1% in Italy. Compared to low-flux HD, patients receiving low-efficiency HDF had significantly longer average duration of end-stage renal disease (7.0 versus 4.7 years), more history of cancer (15.4 versus 8.7%), and lower phosphorus (5.3 versus 5.6 mg/dl); patients receiving high-efficiency HDF had significantly more lung disease (15.5 versus 10.2%) and received a higher single-pool Kt/V (1.44 versus 1.35). High-efficiency HDF patients had lower crude mortality rates than low-flux HD patients. After adjustment, high-efficiency HDF patients had a significant 35% lower mortality risk than those receiving low-flux HD (relative risk=0.65, P=0.01). These observational results suggest that HDF may improve patient survival independently of its higher dialysis dose. Owing to possible selection bias, the potential benefits of HDF must be tested by controlled clinical trials before recommendations can be made for clinical practice.
Publisher: Wiley
Date: 04-2015
DOI: 10.1111/HDI.12314
Abstract: Encouraging clinical outcomes from observational and randomized controlled studies of frequent hemodialysis (HD) have renewed interest in home HD. However, despite its benefits, home HD remains relatively underutilized throughout the world. The Global Forum for Home Hemodialysis, an independent panel comprised of internationally recognized nephrologists, home HD nurses, administrators, patient advocates, and a long-time home HD patient, has convened with the intention of creating an open-source, comprehensive, practical manual that provides useful information to clinicians who are interested in implementing home HD.
Publisher: SAGE Publications
Date: 12-2018
Abstract: The aim of this study was to determine if there were centers in China with unusually high levels of risk-adjusted mortality in continuous ambulatory peritoneal dialysis (CAPD) patients. We analyzed an inception cohort commencing CAPD between 1 January 2005 and 13 August 2015, followed until death, dropout defined as discontinuation of Baxter products, loss to follow-up, or 13 November 2015, whichever occurred first. We calculated standardized mortality ratios (SMRs) from Cox proportional hazards models, adjusting for age, gender, employment status, insurance status, primary renal disease, size of peritoneal dialysis (PD) program, and year of dialysis inception. We calculated 2 SMRs, 1 from models including a fixed effect for center of treatment, and 1 from stratified models. In this study, there was a 9.9% annual mortality rate in China, with decreasing mortality risk over time. There was significant variation of outcomes between Chinese centers, with up to 20% of facilities having SMRs indicating a higher risk-adjusted mortality rate than average. In particular, larger centers had better-than-expected mortality compared with smaller ones. There was significant misclassification of SMRs calculated using stratification versus fixed-effects models, although both showed directionally similar results. Despite overall satisfactory and improving outcomes, our study showed a significant proportion of PD centers with higher than expected mortality. This is a signal for further assessment of these centers in China, after which there might be a range of actions taken depending on the results of the assessment and context, bearing in mind that the variation seen may be driven by factors unrelated to quality of care or beyond the control of the hospital.
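A standardized mortality ratio is, at its core, observed deaths divided by the deaths expected under the risk-adjustment model. A minimal sketch with Byar's approximation for the 95% CI (the approximation is an assumption of this sketch; the paper derives expected counts from Cox models rather than this simple form):

```python
import math

def smr(observed, expected):
    """Standardized mortality ratio with approximate 95% CI.

    observed : observed deaths in the center
    expected : deaths expected from the risk-adjustment model
    Uses Byar's approximation to the Poisson confidence interval.
    """
    ratio = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - 1.96 / (3 * math.sqrt(observed))) ** 3 / expected
    hi = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                           + 1.96 / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return ratio, (lo, hi)
```

A center is flagged as an outlier when the entire CI sits above 1, i.e. its lower limit exceeds the average risk-adjusted rate.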
Publisher: Oxford University Press (OUP)
Date: 26-12-2020
DOI: 10.1093/NDT/GFAA287
Abstract: Dialysate sodium (DNa) prescription policy differs between haemodialysis (HD) units, and the optimal DNa remains uncertain. We sought to summarize the evidence on the agreement between prescribed and delivered DNa, and whether the relationship varied according to prescribed DNa. We searched MEDLINE and PubMed from inception to 26 February 2020 for studies reporting measured and prescribed DNa. We analysed results reported in aggregate with random-effects meta-analysis. We analysed results reported by individual sample using mixed-effects Bland–Altman analysis and linear regression. Pre-specified subgroup analyses included method of sodium measurement, dialysis machine manufacturer and proportioning method. Seven studies, representing 908 dialysate samples from 10 HD facilities (range 16–133 samples), were identified. All but one were single-centre studies. Studies were of low to moderate quality. Overall, there was no statistically significant difference between measured and prescribed DNa {mean difference = 0.73 mmol/L [95% confidence interval (CI) −1.12 to 2.58; P = 0.44]} but variability across studies was substantial (I2 = 99.3%). Among individually reported samples (n = 295), measured DNa was higher than prescribed DNa by 1.96 mmol/L (95% CI 0.23–3.69) and the 95% limits of agreement ranged from −3.97 to 7.88 mmol/L. Regression analysis confirmed a strong relationship between prescribed and measured DNa, with a slope close to 1:1 (β = 1.16, 95% CI 1.06–1.27; P < 0.0001). A limited number of studies suggest that, on average, prescribed and measured DNa are similar. However, between- and within-study differences were large. Further consideration of the precision of delivered DNa is required to inform rational prescribing.
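The aggregate analysis above is a random-effects meta-analysis. A compact sketch of the standard DerSimonian-Laird estimator, including the I² heterogeneity statistic reported in the abstract (a generic textbook method, not the authors' code; names are hypothetical):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with 95% CI and I^2.

    effects   : per-study effect estimates (e.g. mean DNa difference, mmol/L)
    variances : per-study sampling variances
    """
    w = [1 / v for v in variances]            # fixed-effect (inverse-variance) weights
    k = len(effects)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

An I² near 99%, as reported above, means nearly all observed variation reflects real between-study differences rather than sampling error, which is why the pooled mean difference alone understates the prescribing problem.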
Publisher: Elsevier BV
Date: 04-2004
DOI: 10.1053/J.AJKD.2003.11.023
Abstract: Native arteriovenous fistula (AVF) prevalence varies significantly among different populations and countries. Physician practice patterns may have a strong influence on access type. We assessed differences in vascular access practice patterns across all treating centers in New Zealand. Adult (age > or = 18 years) patients on hemodialysis therapy in the year ending September 30, 2001, were studied from the Australian and New Zealand Dialysis and Transplant Association Registry. Multinomial logistic regression was used to assess factors associated with arteriovenous graft (AVG) and catheter use. Of 772 patients available for analysis, 461 patients (60%) underwent dialysis using an AVF; 122 patients (16%), an AVG; and 189 patients (24%), a catheter. On multivariable analysis, female sex (odds ratio, 5.92; P < 0.001), coronary artery disease (odds ratio, 1.89; P < 0.05), body mass index greater than 30 (odds ratio, 2.55; P < 0.05), and age (odds ratio, 1.03 per year increase; P < 0.001) were associated with an increased likelihood of AVG use. Maori and Pacific Island patients were less likely to use an AVG compared with Caucasians (odds ratio, 0.47; P < 0.05). Predictors of greater likelihood of catheter use were female sex (odds ratio, 3.9; P < 0.001), late referral (odds ratio, 1.60; P < 0.05), and age (odds ratio, 1.02 per year increase; P < 0.001). Proportions of access types varied significantly across the 7 treating centers (AVFs, 32% to 86%; AVGs, 2% to 32%; catheters, 9% to 33%; P < 0.001). After adjusting for confounding factors, significant differences in access type persisted between some centers and the national average. Certain patient characteristics, such as age and female sex, are associated strongly with increased AVG and catheter use. However, the significant variation in risk across centers suggests more attention needs to be given to physician practice patterns to increase AVF use rates.
Publisher: Wiley
Date: 04-2015
DOI: 10.1111/HDI.12273
Abstract: An effective home hemodialysis program critically depends on adequate hub facilities and support functions and on transparent and accountable organizational processes. The likelihood of optimal service delivery and patient care will be enhanced by fit-for-purpose facilities and implementation of a well-considered governance structure. In this article, we describe the required accommodation and infrastructure for a home hemodialysis program and a generic organizational structure that will support both patient-facing clinical activities and business processes.
Publisher: SAGE Publications
Date: 12-2018
Abstract: Peritonitis is the most common complication in patients undergoing peritoneal dialysis (PD). Peritoneal dialysis-related peritonitis caused by Brucella species has been reported in only 7 patients before. Here, we report a further case of Brucella peritonitis. This patient was successfully treated with both intraperitoneal and prolonged oral antibiotics, without removal of the PD catheter. We review relevant literature and make recommendations for the diagnosis and treatment of Brucella PD-related peritonitis from the cumulative published clinical experience.
Publisher: Wiley
Date: 03-2011
Publisher: Elsevier BV
Date: 08-2014
Publisher: Springer Science and Business Media LLC
Date: 12-2018
Publisher: Elsevier BV
Date: 03-2006
DOI: 10.1016/J.ARTMED.2005.07.007
Abstract: In many medical areas, there exist different regression formulas to predict/evaluate a medical outcome on the same problem, each of them being efficient only in a particular sub-space of the problem space. The paper aims at the development of a generic, incremental learning model that includes all available regression formulas for a particular prediction problem to define local areas of the problem space with their best performing formula along with useful explanation rules. Another objective of the paper is to develop a specific model for renal function evaluation using nine existing formulas. We have used a connectionist neuro-fuzzy approach and have developed a knowledge-based neural network model (KBNN) which incorporates and adapts incrementally several existing regression formulas and kernel functions. The model incorporates different non-linear regression functions as neurons in its hidden layer and adapts these functions through incremental learning from data in particular local areas of the space. More specifically, each hidden neural node has a pair of functions associated with it: one regression formula, which represents existing knowledge, and one Gaussian kernel function, which defines the sub-space of the whole problem space, in which the formula is locally adapted to new data. All these functions are aggregated and changed through incremental learning. The proposed KBNN model is illustrated using a medical dataset of observed patient glomerular filtration rate (GFR) measurements for renal function evaluation. In this case study, the regression function for each cluster is selected by the model from nine formulas commonly used by medical practitioners to predict GFR. 441 GFR data vectors from 141 patients taken from 12 sites in Australia and New Zealand have been used as a case study experimental data set.
The proposed GFR prediction model, based on the proposed generic KBNN model, outperforms any of the individual regression formulas or a standard neural network model by at least 10% in accuracy. Furthermore, we have derived locally adapted regression formulas that perform best on local clusters of data, along with useful explanatory rules. The proposed KBNN model manifests better accuracy than existing regression formulas or neural network models for renal function evaluation and extracts modified formulas that perform well in local areas of the problem space.
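The KBNN's hidden layer pairs each regression formula with a Gaussian kernel that limits where that formula is trusted. A toy sketch of this aggregation idea (hypothetical names; the actual model also adapts formulas through incremental learning, which is omitted here):

```python
import math

def kernel_weighted_prediction(x, experts):
    """Combine several regression formulas, each trusted most near its own
    centre of the input space: a simplified analogue of the KBNN hidden
    layer, where each node holds one formula plus one Gaussian kernel.

    x       : input value (e.g. a normalised patient covariate)
    experts : list of (formula, centre, width) tuples
    """
    num = 0.0
    den = 0.0
    for formula, centre, width in experts:
        # Gaussian kernel: near 1 close to the centre, near 0 far away
        k = math.exp(-((x - centre) ** 2) / (2 * width ** 2))
        num += k * formula(x)
        den += k
    return num / den
```

Far from all centres, the prediction degrades gracefully toward the nearest expert; near a centre, that formula dominates, which mirrors the paper's claim that each formula is "efficient only in a particular sub-space".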
Publisher: Springer Science and Business Media LLC
Date: 19-02-2019
Publisher: SAGE Publications
Date: 12-2018
Abstract: We report outcomes on ≥ 4 compared with < 4 exchanges/day in a Chinese cohort on continuous ambulatory peritoneal dialysis (CAPD). Data were sourced from the Baxter (China) Investment Co. Ltd Patient Support Program database, comprising an inception cohort commencing CAPD between 1 January 2005 and 13 August 2015. We used cause-specific Cox proportional hazards and Fine-Gray competing risks (kidney transplantation, change to hemodialysis) models to estimate mortality risk on ≥ 4 compared with < 4 exchanges/day. We matched or adjusted for age, gender, employment, insurance, primary renal disease, size of CAPD program, year of dialysis inception, and treatment center. We modeled 100,022 subjects from 1,177 centers over 239,876 patient-years. Of these subjects, 43,185 received < 4 exchanges/day and 56,837 ≥ 4 exchanges/day. The proportion of patients on < 4 exchanges/day varied widely between centers. Those on < 4 exchanges/day were significantly older, more often female, of unknown employment, and from rural China. In the various models, ≥ 4 exchanges/day was associated with a significantly lower risk of death by 30%–35% compared with < 4 exchanges/day. This beneficial effect was greatest in younger and rural patients. In this Chinese CAPD cohort, ≥ 4 exchanges/day was associated with significantly lower mortality risk than < 4 exchanges/day. Analyses are limited by residual confounding from unavailability of important prognostic covariates (e.g., comorbidity, socioeconomic factors) and data on residual renal function, peritoneal clearance, and transport status with which to judge the clinical appropriateness of CAPD prescription. Nonetheless, our study indicates this area as a high priority for further detailed study.
Publisher: Oxford University Press (OUP)
Date: 27-09-2013
DOI: 10.1093/NDT/GFT062
Abstract: To examine patterns of intravenous (IV) iron use across 12 countries from 1999 to 2011. Trends in iron use are described among 32 192 hemodialysis (HD) patients in the Dialysis Outcomes and Practice Patterns Study. Adjusted associations of IV iron dose with serum ferritin and transferrin saturation (TSAT) values were also studied. IV iron was administered to 50% of patients over 4 months in 1999, increasing to 71% during 2009-11, with increasing use in most countries. Among patients receiving IV iron, the mean monthly dose increased from 232 ± 167 to 281 ± 211 mg. Most countries used 3 to 4 doses/month, but Canada used about 2 doses/month, Italy increased from 3 to almost 6 doses/month and Germany used 5 to 6 doses/month. The USA and most European countries predominantly used iron sucrose and sodium ferric gluconate. A significant use of iron dextran was limited to Canada and France; iron polymaltose was used in Australia and New Zealand; and Japan used ferric oxide saccharate, chondroitin polysulfate iron complex and cideferron. Ferritin values rose in most countries: 22% of patients had ≥ 800 ng/mL in the recent years of study. TSAT levels increased to a lesser degree over time. Japan had much lower IV iron dosing and ferritin levels, but similar TSAT levels. In adjusted analyses, serum ferritin and TSAT levels increased significantly by 14 ng/mL and 0.16%, respectively, for every 100 mg/month higher mean monthly iron dose. IV iron prescription patterns varied between countries and changed over time from 1999 to 2011. IV iron use and dose increased in most countries, with notable increases in ferritin but not TSAT levels. With rising cumulative IV iron doses, studies of the effects of changing IV iron dosing and other anemia management practices on clinical outcomes should be a high priority.
Publisher: Elsevier BV
Date: 08-2001
DOI: 10.1046/J.1523-1755.2001.060002777.X
Abstract: The replacement of renal function for critically ill patients is procedurally complex and expensive, and none of the available techniques have proven superiority in terms of benefit to patient mortality. In hemodynamically unstable or severely catabolic patients, however, the continuous therapies have practical and theoretical advantages when compared with conventional intermittent hemodialysis (IHD). We present a single center experience accumulated over 18 months since July 1998 with a hybrid technique named sustained low-efficiency dialysis (SLED), in which standard IHD equipment was used with reduced dialysate and blood flow rates. Twelve-hour treatments were performed nocturnally, allowing unrestricted access to the patient for daytime procedures and tests. One hundred forty-five SLED treatments were performed in 37 critically ill patients in whom IHD had failed or been withheld. The overall mean SLED treatment duration was 10.4 hours because 51 SLED treatments were prematurely discontinued. Of these discontinuations, 11 were for intractable hypotension, and the majority of the remainder was for extracorporeal blood circuit clotting. Hemodynamic stability was maintained during most SLED treatments, allowing the achievement of prescribed ultrafiltration goals in most cases with an overall mean shortfall of only 240 mL per treatment. Direct dialysis quantification in nine patients showed a mean delivered double-pool Kt/V of 1.36 per (completed) treatment. Mean phosphate removal was 1.5 g per treatment. Mild hypophosphatemia and/or hypokalemia requiring supplementation were observed in 25 treatments. Observed hospital mortality was 62.2%, which was not significantly different from the expected mortality as determined from the APACHE II illness severity scoring system. 
SLED is a viable alternative to traditional continuous renal replacement therapies for critically ill patients in whom IHD has failed or been withheld, although prospective studies directly comparing the two modalities are required to define the exact role for SLED in this setting.
Publisher: Public Library of Science (PLoS)
Date: 07-05-2014
Publisher: Wiley
Date: 10-2012
DOI: 10.1111/J.1542-4758.2012.00740.X
Abstract: Antimicrobial locks (AMLs) are effective in preventing catheter-associated bloodstream infections (CABSI) in hemodialysis (HD) patients, but may increase antibiotic resistance. In our center, gentamicin-heparin locks have been used for all HD central venous catheters since July 1, 2004. We previously reported a significant reduction in CABSI rates, but a short-term trend to increased gentamicin resistance among coagulase-negative staphylococci (CNS). We present a further 3-year follow-up study of bacterial resistance in our dialysis center. We examined the susceptibility of bacterial isolates from CABSI from July 1, 2006 to July 31, 2009, restricting analyses to CNS, gram-negative bacilli, and Staphylococcus aureus. We compared the frequency of gentamicin resistance in these isolates between four groups: CABSI in HD patients, non-CABSI in HD patients, peritonitis in peritoneal dialysis (PD) patients, and bloodstream infection in the non-end-stage kidney failure general population. For CNS isolates, the frequency of gentamicin resistance was similar between the CABSI and PD peritonitis groups, but higher in both groups than the general population. The pattern was similar for S. aureus although the differences were of borderline statistical significance. The frequency of gentamicin resistance among gram-negative bacilli isolates did not differ between groups. Gentamicin resistance was more common than expected in CNS and possibly S. aureus isolates from CABSI, although this resistance may be part of a generally higher frequency of antibiotic resistance in the dialysis population, rather than a direct result of AML use. AMLs remain a valuable clinical tool although surveillance is needed to ensure that benefits continue to outweigh risks.
Publisher: Oxford University Press (OUP)
Date: 09-08-2020
DOI: 10.1093/NDT/GFZ160
Abstract: Withdrawal from dialysis is an increasingly common cause of death in patients with end-stage kidney disease (ESKD). As most published reports of dialysis withdrawal have been outside the Oceania region, the aims of this study were to determine the frequency, temporal pattern and predictors of dialysis withdrawal in Australian and New Zealand patients receiving chronic haemodialysis. This study included all people with ESKD in Australia and New Zealand who commenced chronic haemodialysis between 1 January 1997 and 31 December 2016, using data from the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry. Competing risk regression models were used to identify predictors of dialysis withdrawal mortality, using non-withdrawal cause of death as the competing risk event. Among 40 447 people receiving chronic haemodialysis (median age 62 years, 61% male, 9% Indigenous), dialysis withdrawal mortality rates increased from 1.02 per 100 patient-years (11% of all deaths) during the period 1997–2000 to 2.20 per 100 patient-years (32% of all deaths) during 2013–16 (P < 0.001). Variables that were significantly associated with a higher likelihood of haemodialysis withdrawal were older age {≥70 years: subdistribution hazard ratio [SHR] 1.77 [95% confidence interval (CI) 1.66–1.89]; reference 60–70 years}, female sex [SHR 1.14 (95% CI 1.09–1.21)], white race [Asian SHR 0.56 (95% CI 0.49–0.65); Aboriginal and Torres Strait Islander SHR 0.83 (95% CI 0.74–0.93); Pacific Islander SHR 0.47 (95% CI 0.39–0.68); reference white race], coronary artery disease [SHR 1.18 (95% CI 1.11–1.25)], cerebrovascular disease [SHR 1.15 (95% CI 1.08–1.23)], chronic lung disease [SHR 1.13 (95% CI 1.06–1.21)] and more recent era [2013–16 SHR 3.96 (95% CI 3.56–4.48); reference 1997–2000]. Death due to haemodialysis withdrawal has become increasingly common in Australia and New Zealand over time.
Predictors of haemodialysis withdrawal include older age, female sex, white race and haemodialysis commencement in a more recent era.
Publisher: Wiley
Date: 17-04-2012
Publisher: Oxford University Press (OUP)
Date: 26-08-2015
DOI: 10.1093/AJE/KWV090
Abstract: In the application of marginal structural models to compare time-varying treatments, it is rare that the hierarchical structure of a data set is accounted for or that the impact of unmeasured confounding on estimates is assessed. These issues often arise when analyzing data sets drawn from clinical registries, where patients may be clustered within health-care providers, and the amount of data collected from each patient may be limited by design (e.g., to reduce costs or encourage provider participation). We compared the survival of patients undergoing treatment with various dialysis types, where some patients switched dialysis modality during the course of their treatment, by estimating a marginal structural model using data from the Australia and New Zealand Dialysis and Transplant Registry, 2003-2011. The number of variables recorded by the registry is limited, and patients are clustered within the dialysis centers responsible for their treatment, so we assessed the impact of accounting for unmeasured confounding or clustering on estimated treatment effects. Accounting for clustering had limited impact, and only unreasonable levels of unmeasured confounding would have changed conclusions about treatment comparisons. Our analysis serves as a case study in assessing the impact of unmeasured confounding and clustering in the application of marginal structural models.
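Marginal structural models are typically estimated with inverse-probability-of-treatment weights. A minimal sketch of stabilized weights for a point treatment (illustrative only; the registry analysis handled time-varying treatments with more elaborate cumulative weighting, and all names here are hypothetical):

```python
def stabilized_iptw(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights.

    treated    : 1 if the subject received the treatment of interest, else 0
    propensity : model-estimated probability of treatment for each subject
    The marginal treatment prevalence in the numerator stabilizes the
    weights, keeping their mean near 1 and reducing variance.
    """
    p_treat = sum(treated) / len(treated)
    weights = []
    for t, p in zip(treated, propensity):
        if t == 1:
            weights.append(p_treat / p)        # treated: up-weight if treatment was unlikely
        else:
            weights.append((1 - p_treat) / (1 - p))  # untreated: symmetric logic
    return weights
```

Weighting creates a pseudo-population in which treatment is independent of the measured confounders, which is what lets the weighted outcome model be read causally, subject to the no-unmeasured-confounding assumption the abstract's sensitivity analysis probes.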
Publisher: S. Karger AG
Date: 2017
DOI: 10.1159/000476052
Abstract: Background/Aims: Polyethylenimine-coated polyacrylonitrile (AN69ST) membrane is expected to improve the outcomes of critically ill patients treated by continuous renal replacement therapy (CRRT). Methods: Using a Japanese health insurance claim database, we identified adult patients receiving CRRT in intensive care units (ICUs) from April 2014 to October 2015. We used a multivariable logistic regression model to assess in-hospital mortality and Fine and Gray's proportional subhazards model to assess the ICU length of stay (ICU-LOS) accounting for the competing risks. Results: Of 2,469 ICU patients, 156 were treated by AN69ST membrane. Crude in-hospital mortality was 50.0% in the AN69ST group and 54.0% in the non-AN69ST group. Adjusted odds ratio (OR) of AN69ST membrane use for in-hospital mortality was 0.65 (95% CI 0.45-0.93). The use of AN69ST membrane was also independently associated with shorter ICU-LOS. Conclusion: This retrospective observational study suggested that CRRT with AN69ST membrane might be associated with better in-hospital outcomes.
Publisher: Wiley
Date: 12-09-2016
DOI: 10.1111/NEP.12688
Abstract: There remains debate on which dialysis modality offers better survival outcomes for patients. We compare the survival of patients undergoing home haemodialysis (HD) with a permanent vascular access, facility HD with a permanent vascular access, facility HD with a central venous catheter or peritoneal dialysis. We considered adult patients from the Australia and New Zealand Dialysis and Transplant Registry who commenced dialysis between 1 October 2003 and 31 December 2011. Patients were followed until death, transplant, loss to follow-up or 31 December 2011. Marginal structural models for mortality were used to account for time-varying treatment, comorbidities and baseline covariates. Unmeasured differences between treatment groups may remain even after adjustment for measured differences, so the potential effects of unmeasured confounding were explicitly modelled. There were 20,191 patients who underwent ≥90 days of dialysis (median 2.25 years, interquartile range 1-3.75 years). There were significant differences in age, gender, comorbidities and other variables between treatment groups at baseline. Thirty per cent of patients had at least one treatment change. Relative to facility HD with permanent access, the risk of death for home HD patients with a permanent access was lower in the first year (at 9 months: hazard ratio 0.41, 95% CI 0.25-0.67, adjusted for all baseline covariates). Findings were robust to unmeasured confounding within plausible ranges. Relative to facility HD with permanent vascular access, home HD conferred better survival prospects, while peritoneal dialysis was associated with a higher risk and facility HD with a catheter the highest risk, especially within the first year of dialysis.
Publisher: Springer Science and Business Media LLC
Date: 20-06-2018
DOI: 10.1007/S10157-018-1602-2
Abstract: The aim of this study was to investigate in vitro biocompatibility of Reguneal™, a new bicarbonate containing peritoneal dialysis fluid (PDF) for Japan, and compare it with other PDFs available in that country. We assessed basal cytotoxicity using in vitro proliferation of cultured fibroblasts, L-929, determining the quantity of living cells by the uptake of Neutral Red. Levels of ten glucose degradation products (GDPs) were measured by a validated ultrahigh-performance liquid chromatography method in combination with an ultraviolet detector. We compared inhibition of fibroblast cell growth between brands of PDF, adjusting for dextrose and GDP concentrations using random-effects mixed models. The results demonstrate that cytotoxicity of Reguneal™ is comparable to a sterile-filtered control and is less cytotoxic than most of the other PDFs, most of which significantly inhibited cell growth. As a "class effect", increasing dextrose and GDP concentrations were non-significantly but positively associated with cytotoxicity. As a "brand effect", these relationships varied widely between brands, and some PDFs had significant residual effects on basal cytotoxicity through mechanisms that were unassociated with either dextrose or GDP concentration. Our study suggests that Reguneal™ is a biocompatible PDF. The results of our study also highlight that dextrose and GDPs are important for biocompatibility, but alone are not a complete surrogate. The results of our study need to be confirmed in other tissue culture models, and should lead to further research on determinants of biocompatibility and the effect of such PDFs on clinical outcomes.
Publisher: SAGE Publications
Date: 12-2018
Abstract: Acute kidney injury (AKI) is common in critically ill neonates, and peritoneal dialysis (PD) can be a lifesaving option. In China, however, much of the equipment for PD in neonates is not available. We describe results with a novel system for PD, which has been developed locally to improve access to therapy and care for critically ill neonates requiring PD in China. The system comprises a 14-gauge single-lumen central venous catheter serving as a PD catheter, inserted by Seldinger technique, with an adapted twin bag PD system. Ten neonates with AKI were treated using the novel PD system. The 10 patients ranged in age from 1 day to 22 days, with bodyweights between 700 g and 3,300 g. Average time to renal function recovery was between 14 and 96 hours. Complications related to the novel PD system included leak ( n = 1), catheter displacement ( n = 1), and catheter obstruction ( n = 1). There were no complications related to insertion, no cases of peritonitis or exit-site infection, and no subsequent hernias. A comparison of costs indicated that the novel PD system is less expensive than conventional systems involving open insertion of Tenckhoff catheters. Peritoneal dialysis using the novel PD system is simple, safe, and effective for suitable neonates with AKI in China.
Publisher: Elsevier BV
Date: 11-2017
Publisher: Elsevier BV
Date: 04-2006
Abstract: The optimal combination of hemodialysis (HD) dose and session length remains uncertain, and previous studies have not conclusively shown session length to be an important independent determinant of patient mortality. The objective of this study was to examine associations between HD dose and session length with mortality risk using data from the Australian and New Zealand Dialysis and Transplant Registry. Analyses were performed using a prospective inception cohort comprising all incident adult patients treated by thrice-weekly maintenance HD, who commenced renal replacement therapy with HD between 1 April 1997 and 31 March 2004. In all, 6593 patients were identified, of whom 4193 had sufficient data for multivariate analyses. HD dose (single pool fractional clearance of urea, Kt/V) and session length were included in analyses as those recorded 12 months after HD inception to reduce confounding by residual renal function. The outcome examined was patient mortality. Survival analyses included Kaplan-Meier calculations of survival and Cox regression for multivariate analyses. Covariates in Cox models included patient demographics, co-morbid medical conditions at HD inception, and HD operating parameters. After adjustment for covariates and each other, Kt/V of 1.30-1.39 and session length of 4.5-4.9 h were associated with the lowest mortality risk. There was no interaction between HD dose and session length. Thus, the optimal combination for mortality appears to be Kt/V ≥1.3 and session length ≥4.5 h. These data suggest a randomized controlled trial to test these hypotheses, and support the inclusion of criteria relating to session length in definitions of adequate HD practice.
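The single-pool Kt/V examined here is commonly estimated at the bedside with the second-generation Daugirdas formula; the sketch below is an editor's illustration of that standard formula (variable names are ours, not from the paper).

```python
import math

def sp_ktv(pre_urea, post_urea, hours, uf_litres, post_weight_kg):
    """Second-generation Daugirdas estimate of single-pool Kt/V:
        spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W
    where R is the post/pre dialysis urea ratio, t the session length in
    hours, UF the ultrafiltration volume (L) and W post-dialysis weight (kg).
    The -0.008*t term corrects for intradialytic urea generation and the
    UF/W term for convective removal."""
    r = post_urea / pre_urea
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg
```

For example, a post/pre urea ratio of 0.30 over a 4.5 h session with 2 L of ultrafiltration in a 70 kg patient gives spKt/V of about 1.42, within the low-mortality band identified by the study.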
Publisher: Elsevier BV
Date: 10-2006
Publisher: Elsevier BV
Date: 2015
DOI: 10.1038/KI.2014.275
Abstract: Intravenous (IV) iron is required for optimal management of anemia in the majority of hemodialysis (HD) patients. While IV iron prescription has increased over time, the best dosing strategy is unknown and any effect of IV iron on survival is unclear. Here we used adjusted Cox regression to analyze associations between IV iron dose and clinical outcomes in 32,435 HD patients in 12 countries from 2002 to 2011 in the Dialysis Outcomes and Practice Patterns Study. The primary exposure was total prescribed IV iron dose over the first 4 months in the study, expressed as an average dose/month. Compared with 100-199 mg/month (the most common dose range), case-mix-adjusted mortality was similar for the 0, 1-99, and 200-299 mg/month categories but significantly higher for the 300-399 mg/month (HR of 1.13, 95% CI of 1.00-1.27) and 400 mg/month or more (HR of 1.18, 95% CI of 1.07-1.30) groups. Convergent validity was proved by an instrumental variable analysis, using HD facility as the instrument, and by an analysis expressing IV iron dose/kg body weight. Associations with cause-specific mortality (cardiovascular, infectious, and other) were generally similar to those for all-cause mortality. The hospitalization risk was elevated among patients receiving 300 mg/month or more compared with 100-199 mg/month (HR of 1.12, 95% CI of 1.07-1.18). In light of these associations, a well-powered clinical trial to evaluate the safety of different IV iron-dosing strategies in HD patients is urgently needed.
Publisher: Elsevier BV
Date: 06-2014
DOI: 10.1016/J.JVIR.2014.01.023
Abstract: A previous clinical trial showed that radiologic insertion of first peritoneal dialysis (PD) catheters by modified Seldinger technique is noninferior to laparoscopic surgery in patients at low risk in a clinical trial setting. The present cohort study was performed to confirm clinical effectiveness of radiologic insertion in everyday practice, including insertion in patients with expanded eligibility criteria and by fellows in training. Between 2004 and 2009, 286 PD catheters were inserted in 249 patients, 133 with fluoroscopic guidance in the radiology department and 153 by laparoscopic surgery. Survival analyses were performed with the primary outcome of complication-free catheter survival and secondary outcomes of overall catheter survival and patient survival. Outcomes were assessed at last follow-up, as long as 365 days after PD catheter insertion. In the radiologic group, unadjusted 365-day complication-free catheter, overall catheter, and patient survival rates were 22.6%, 81.2%, and 82.7%, respectively, compared with 22.9% (P = .52), 76.5% (P = .4), and 92.8% (P = .01), respectively, in the laparoscopic group. Frequencies of individual complications were similar between groups. Adjusting for patient age, comorbidity, and previous PD catheter, the hazard ratio (HR) for catheter complications by radiologic versus laparoscopic insertion is 0.90 (95% confidence interval [CI], 0.62-1.31); the HR for overall catheter survival is 1.25 (95% CI, 0.59-2.65); and that for death is 2.47 (95% CI, 0.84-7.3). Radiologic PD catheter insertion is a clinically effective alternative to laparoscopic surgery, although there was poorer long-term survival with radiologic catheter placement, possibly because of preferential selection of radiologic insertion for more frail patients.
Publisher: Wiley
Date: 25-07-2014
DOI: 10.1111/NEP.12269
Abstract: The financial burden of the increasing dialysis population challenges healthcare resources internationally. Home haemodialysis offers many benefits over conventional facility dialysis including superior clinical, patient-centred outcomes and reduced cost. This review updates a previous review, conducted a decade prior, incorporating contemporary home dialysis techniques of frequent and nocturnal dialysis. We sought comparative cost-effectiveness studies of home versus facility haemodialysis (HD) for people with end-stage kidney failure (ESKF). We conducted a systematic review of literature from January 2000-March 2014. Studies were included if they provided comparative information on the costs, health outcomes and cost-effectiveness ratios of home HD and facility HD. We searched medical and health economic databases using MeSH headings and text words for economic evaluation and haemodialysis. Six studies of economic evaluations that compared home to facility HD were identified. Two studies compared home nocturnal HD, one home nocturnal and daily home HD, and three compared contemporary home HD to facility HD. Overall these studies suggest that contemporary home HD modalities are less costly and more effective than facility HD. Home HD start-up costs tend to be higher in the short term, but these are offset by cost savings over the longer term. Contemporaneous dialysis modalities including nocturnal and daily home haemodialysis are cost-effective or cost-saving compared with facility-based haemodialysis. This result is largely driven by lower staff costs, and better health outcomes for survival and quality of life. Expanding the proportion of haemodialysis patients managed at home is likely to produce cost savings.
Publisher: Wiley
Date: 30-11-2014
DOI: 10.1111/SDI.12327
Abstract: Remote real-time treatment monitoring for home hemodialysis (HHD) was driven by concerns over patient safety in the early era of HHD. However, decades of clinical experience supported by objective data suggest that HHD is very safe and that remote monitoring is unlikely to avert serious adverse events. As a result, such remote monitoring is not routinely offered in the current era and is generally considered an unnecessary expense. However, a one-size-fits-all approach to abandon remote monitoring may overlook potential opportunities: to improve the clinical care of patients dialyzing at home and to give patients the confidence to perform HHD in an unsupervised setting.
Publisher: Wiley
Date: 25-03-2015
DOI: 10.1111/NEP.12388
Abstract: Currently available calcium- and aluminium-based phosphate binders are dose limited because of potential toxicity, and newer proprietary phosphate binders are expensive. We examined phosphate-binding effects of the bile acid sequestrant colestipol, a non-proprietary drug that is in the same class as sevelamer. The trial was an 8 week prospective feasibility study in stable hemodialysis patients using colestipol as the only phosphate binder, preceded and followed by a washout phase of all other phosphate binders. The primary study endpoint was weekly measurements of serum phosphate. Secondary endpoints were serum calcium, lipids and coagulation status. Analyses used random effects mixed models. Thirty patients were screened for participation of which 26 met criteria for treatment. At a mean dose of 8.8 g/24 h of colestipol by study end, serum phosphate dropped from 2.24 to 1.96 mmol/L (P < 0.001). Three patients required calcium supplementation. LDL cholesterol dropped from 1.75 to 1.2 mmol/L (P < 0.001). Three patients dropped out because of side effects or intolerance of the required dose. The results support the feasibility of a larger trial to determine the efficacy of colestipol as a phosphate binder and that other non-proprietary anion-exchange resins may also warrant investigation.
Publisher: SAGE Publications
Date: 26-02-2023
DOI: 10.1177/08968608231154156
Abstract: Recently, we validated a simple method for estimating peritoneal dialysis (PD) peritonitis rate. Despite good agreement between estimates and gold-standard measurements in two large dialysis registries, the International Society of Peritoneal Dialysis (ISPD) was hesitant to recommend adoption of the estimating equation. Their perception is that inaccuracies, as small as they are, might still be detrimental to clinical decision-making. In this study, we apply new analyses to the original validation data sets. We quantify agreement using standards from the International Organization for Standardization (ISO). We also identify a subset of centres with poorest performance of the estimating equation and qualitatively assess the potential for compromised clinical decision-making associated with its use. Inter-assay % coefficient of variation between estimates and measurements was 4.2% in the Australia and New Zealand Dialysis and Transplant Registry and 4.6% in Le Registre de Dialyse Péritonéale de Langue Française, easily meeting ISO requirements. Mandel’s h values and Grubb’s tests confirmed more outlying estimates compared to the measurements, while Mandel’s k values and Cochran’s C tests showed identical precision by the two methods. Misclassification of centres as being above versus below the ISPD standard of 0.4 episodes/patient-year occurred only with rates close to the threshold, affecting approximately 3% of patient-years. In the 26 (out of 268) centres with poorest performance of the estimating equation, examination of the time series of their annual PD peritonitis rate estimates/measurements showed that using estimates would not be detrimental to clinical decision-making. In conclusion, the estimating equation is sufficiently accurate for routine clinical use.
Publisher: Registre de Dialyse Peritoneale de Langue Francaise (RDPLF)
Date: 15-12-2021
Abstract: Peritonitis is the most important therapy-related complication of peritoneal dialysis (PD). Unfortunately, many PD centers around the world do not accurately record peritonitis rate, mainly because they cannot ascertain PD patient time-at-risk from “patient flow” data - that is, calculating PD patient-days from dates when patients start and finish PD. We propose a simplified method of calculating PD peritonitis rate using PD patient time-at-risk from “patient stock” data - that is, calculating PD patient-days from the number of prevalent PD patients at the center at the start of the year and the corresponding number at the end. We compared gold-standard measurements of annual PD peritonitis rates with simplified ones in the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) / New Zealand (NZ) PD Registry, and Le Registre de Dialyse Péritonéale de Langue Française et hémodialyse à domicile (the RDPLF). A total of 268 centers from 9 countries with 4311 center-years and 110,185 patient-years of follow-up were modelled. Overall agreement was excellent with a concordance correlation coefficient of 0.978 (95% confidence interval [CI] 0.975-0.980) in ANZDATA / NZ PD Registry, and 0.978 (0.977-0.980) in the RDPLF. There was statistically significant lower agreement for smaller centers in the registries at 0.972 (0.966-0.976) and 0.973 (0.970-0.976) respectively, although the performance of the simplified formula remains clinically sound in even these centers. The simplified method of calculating PD peritonitis rate is accurate, and will allow more centers around the world to measure, report, and work on reducing PD peritonitis rates.
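The abstract does not spell out the estimating equation; a natural reading of "patient stock" data is to approximate patient-years at risk as the average of the start- and end-of-year prevalent counts, which is the assumption behind this editor's sketch.

```python
def peritonitis_rate(episodes, prevalent_start, prevalent_end):
    """Simplified PD peritonitis rate from "patient stock" data
    (assumption: time at risk is approximated by the average of the
    prevalent patient counts at the start and end of the year).
    Returns episodes per patient-year."""
    patient_years = (prevalent_start + prevalent_end) / 2
    return episodes / patient_years
```

For instance, a center with 40 prevalent patients on 1 January, 50 on 31 December and 18 episodes would report 18 / 45 = 0.4 episodes per patient-year, exactly at the ISPD benchmark.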
Publisher: Oxford University Press (OUP)
Date: 13-07-2018
DOI: 10.1093/NDT/GFY223
Abstract: There are advantages to home dialysis for patients, and kidney care programs, but use remains low in most countries. Health-care policy-makers have many levers to increase use of home dialysis, one of them being economic incentives. These include how health-care funding is provided to kidney care programs and dialysis facilities; how physicians are remunerated for care of home dialysis patients; and financial incentives - or removal of disincentives - for home dialysis patients. This report is based on a comprehensive literature review summarizing the impact of economic incentives for home dialysis and a workshop that brought together an international group of policy-makers, health economists and home dialysis experts to discuss how economic incentives (or removal of economic disincentives) might be used to increase the use of home dialysis. The results of the literature review and the consensus of workshop participants were that financial incentives to dialysis facilities for home dialysis (for instance, through activity-based funding), particularly in for-profit systems, could lead to a small increase in use of home dialysis. The evidence was less clear on the impact of economic incentives for nephrologists, and participants felt this was less important than a nephrologist workforce in support of home dialysis. Workshop participants felt that patient-borne costs experienced by home dialysis patients were unjust and inequitable, though participants noted that there was no evidence that decreasing patient-borne costs would increase use of home dialysis, even among low-income patients. The use of financial incentives for home dialysis - whether directed at dialysis facilities, nephrologists or patients - is only one part of a high-performing system that seeks to increase use of home dialysis.
Publisher: Elsevier BV
Date: 03-2015
DOI: 10.1053/J.AJKD.2014.10.020
Abstract: Although home hemodialysis (HD) is associated with better survival compared with hospital HD, the burden of treatment may be intensified for patients and their caregivers and deter patients from this treatment choice. We describe patient and caregiver perspectives and experiences of home HD to inform home HD programs that align with patient preferences. Systematic review of qualitative studies. Adults with chronic kidney disease and caregivers of both home and hospital dialysis patients who expressed opinions about home HD. MEDLINE, EMBASE, PsycINFO, CINAHL, and reference lists were searched to October 2013. Thematic synthesis. 24 studies involving 221 patients (home HD [n=109], hospital HD [n=97], and predialysis [n=15]) and 121 caregivers were eligible. We identified 5 themes: vulnerability of dialyzing independently (fear of self-needling, feeling unqualified, and anticipating catastrophic complications), fear of being alone (social isolation and medical disconnection), concern of family burden (emotional demands on caregivers, imposing responsibility, family involvement, and medicalizing the home), opportunity to thrive (re-establishing a healthy self-identity, gaining control and freedom, strengthening relationships, experiencing improved health, and ownership of decision), and appreciating medical responsiveness (attentive monitoring and communication, depending on learning and support, developing readiness, and clinician validation). Non-English articles were excluded. Patients and caregivers perceive that home HD offers the opportunity to thrive; improves freedom, flexibility, and well-being; and strengthens relationships. However, some voice anxiety and fear about starting home HD due to the confronting nature of the treatment and isolation from medical and social support. Acknowledging and addressing these apprehensions can improve the delivery of predialysis and home HD programs to better support patients and caregivers considering home HD.
Publisher: SAGE Publications
Date: 17-01-2020
Abstract: We describe peritoneal dialysis (PD) prescription variations among Peritoneal Dialysis Outcomes and Practice Patterns Study (PDOPPS) participants on continuous ambulatory PD (CAPD) and automated PD (APD; n = 4657) from Australia/New Zealand (A/NZ), Canada, Japan, Thailand, United Kingdom (UK), and United States (US). CAPD was more commonly used in Thailand and Japan, while APD predominated over CAPD in A/NZ, Canada, the US, and the UK. Total prescribed PD volume normalized to the surface area was the highest in Thailand and the lowest in Japan (for both APD and CAPD) and the UK (for CAPD). PD patients from Thailand had the lowest residual urine volume and residual renal urea clearance, yet achieved the highest dialysis urea clearance. Japanese patients had the lowest dialysis urea clearances for both APD and CAPD. Despite having similar urine volumes to patients in A/NZ, Canada, Japan, and the UK, US CAPD and APD patients used 2.5% and 3.86% glucose PD solutions more frequently, whereas fewer than 25% of these patients used icodextrin. Over half of the patients in A/NZ, Canada, the UK, and Japan used icodextrin, whereas it was hardly used in Thailand. Japan and Thailand were more likely to use 1.5% glucose solutions for their PD prescription. There are considerable international variations in PD modality use and prescription patterns that translate into important differences in achieved dialysis clearances. Ongoing recruitment of additional PDOPPS participants and accrual of follow-up time will allow us to test the associations between specific PD prescription regimens and clinical and patient-reported outcomes.
Publisher: Springer Science and Business Media LLC
Date: 19-07-2007
DOI: 10.1007/S10754-007-9023-X
Abstract: In New Zealand, patients receive treatment for end-stage renal disease (ESRD) within the tax-funded health system. All hospital and specialist outpatient services are free, while general practitioner consultations and pharmaceuticals prescribed outside of hospitals incur copayments. Total ESRD prevalence is 0.07%, half the U.S. rate, and the prevalence of home-based and self-care dialysis is the highest in the world. Medical staff are not subject to direct financial incentives that could affect treatment choice. Estimated total expenditure per ESRD patient is relatively low. Funding constraints encourage physicians and patients to consider the probable benefit of dialysis for a patient before treatment is prescribed.
Publisher: Springer Science and Business Media LLC
Date: 30-01-2019
Publisher: Wiley
Date: 28-10-2015
DOI: 10.1111/NEP.12538
Abstract: Due to the paucity of studies focusing on primary glomerulonephritis, the second commonest cause of end-stage-kidney-disease in most of the developed world, we sought to review outcomes of these renal pathologies. We reviewed renal outcomes and mortality for primary glomerulonephritis patients enrolled in the New Zealand Glomerulonephritis Study between 1972 and 1983. There were 765 patients with median follow-up of 30 years (range 0.1-42 years). They were predominantly New Zealand European, male and hypertensive. Poor renal outcomes and increased mortality were associated with hypertension, heavy proteinuria, impaired renal function and older age at diagnosis. Ethnicity was not significantly associated with progression to end-stage-kidney-disease although NZ Maori patients were at significantly increased risk of death. Patients with rapidly progressive glomerulonephritis had the highest risk of reaching end-stage-kidney-disease while the cumulative incidence of end-stage-kidney-disease was 20% and 30% for those with immunoglobulin-A nephropathy and membranous nephropathy respectively. Mortality risk was high for patients with rapidly progressive glomerulonephritis and anti-glomerular basement membrane disease. The era of diagnosis did not have much effect on outcomes except for patients with focal segmental glomerulosclerosis or immunoglobulin A nephropathy, but this could be a type II error. We report one of the longest follow-up studies on biopsy-proven glomerulonephritides. Age, hypertension, and severity of chronic kidney disease at diagnosis were strong predictors of the development of end-stage-kidney-disease and death. The specific renal pathology had a profound impact upon prognosis and therefore should continue to drive efforts to find targeted therapeutic options for these glomerulonephritides.
Publisher: Elsevier BV
Date: 02-2019
DOI: 10.1016/J.JSAMS.2019.08.019
Abstract: Increasing physical activity is a priority worldwide, including for older adults who may have difficulty performing traditional forms of exercise, and for whom retention of muscle mass is an important consideration. Water-based exercise may provide an alternative if benefits are comparable. We compared the impact on body composition of 24-week water- versus land-walking interventions in healthy but inactive older adults. Randomised, controlled trial. 72 participants (62.5±6.8yr) were randomised to a land-walking (LW), water-walking (WW) or control (C) group in a supervised centre-based program. The exercise groups trained 3 times/week at matched intensity (%HRR), increasing from 40-45% to 55-65% heart rate reserve (HRR). Height, weight, body mass index (BMI), waist and hip girths were recorded; dual X-ray absorptiometry (DXA) provided fat and lean tissue masses. Participants were re-assessed 24 weeks after completion of the intervention. There were no significant changes in body mass or BMI following either exercise protocol; however, central adiposity was reduced in both exercise groups, and the WW group increased lower limb lean mass. These benefits did not persist over the follow-up period. Exercise can confer beneficial effects on body composition which are not evident when examining weight or BMI. Both WW and LW improved body composition. Water walking can be recommended as an exercise strategy for this age group due to its beneficial effects on body composition which are similar to, or exceed, those associated with land-walking. For benefits to persist, it appears that exercise needs to be maintained.
Publisher: Wiley
Date: 30-06-2004
DOI: 10.1111/J.1492-7535.2004.01103.X
Abstract: The issues surrounding anemia management in patients receiving dialysis therapy are complex and widely debated. Although numerous trials have been published, clinical practice patterns may differ, particularly in the presence of uncertainty about the optimal management of anemia in this setting. We examined data from the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) regarding use of erythropoietic agents (EA), hemoglobin, and ferritin concentrations and transferrin saturation in 8476 prevalent dialysis patients in Australia and New Zealand during the 6 months preceding March 31, 2001. From this cross-sectional survey, we examined the distribution of reported hemoglobin concentration, transferrin saturation, and ferritin concentration. Among hemodialysis patients, other predictors of hemoglobin examined included urea reduction ratio (URR), age, sex, and the presence of comorbidities. In Australia, 87% of dialysis patients received an EA; in contrast, only 42% of New Zealand patients received an EA. Hemoglobin concentrations were significantly higher in Australia, where 16% of reported values were <100 g/L, compared to New Zealand where 37% of reported values were <100 g/L. Transferrin saturation and serum ferritin concentrations were significantly correlated, but less strongly among those receiving EA than those not receiving these agents. Both transferrin saturation and serum ferritin were significantly and independently associated with hemoglobin concentration, as were age and sex. The association with ferritin was inverse: higher serum ferritin concentrations were associated with lower hemoglobin concentrations. There was poor agreement (kappa = 0.15) between categories of low transferrin saturation (<20%) and low ferritin concentrations. Hemoglobin concentration did not differ among patients with a reported URR ≥65%, whereas the group with a reported URR <65% had a significantly lower hemoglobin concentration. There was a wide variation in reported hemoglobin concentrations in this population.
Potential contributing factors include variable patient responsiveness to EA and iron, differing regulations in Australia and New Zealand regarding government subsidy of EA, and the lack of consensus among physicians regarding hemoglobin target values. Although a cross-sectional study cannot directly address the predictive value of iron indices for iron deficiency, it appears likely that transferrin and ferritin have different relationships with hemoglobin, and measurement of both may have greater clinical utility than either parameter alone.
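The kappa = 0.15 agreement statistic quoted above is Cohen's kappa, i.e. agreement corrected for chance; a minimal implementation (an editor's illustration, not the study's code) makes the computation concrete.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement).
    1 = perfect agreement, 0 = no better than chance."""
    n = len(a)
    labels = set(a) | set(b)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # chance agreement from the marginal frequencies of each rater
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)
```

A kappa near 0.15, as reported for low transferrin saturation versus low ferritin, means the two iron indices classify patients only slightly better than chance, supporting the authors' point that the two measures carry different information.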
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 18-03-2020
Abstract: Because fluid overload in patients undergoing hemodialysis contributes to cardiovascular morbidity and mortality, there is a global trend to use low-sodium dialysate in hemodialysis with the goal of reducing fluid overload. To investigate whether lower dialysate sodium during hemodialysis improves left ventricular mass, the authors conducted a randomized clinical trial of 99 adults that compared use of low-sodium dialysate (135 mM) with conventional dialysate (140 mM) for 12 months. Although participants with lower dialysate sodium showed significant improvement in fluid status, the intervention had no effect on left ventricular mass index. The intervention also increased intradialytic hypotension. Given these findings, the current trend to lower dialysate sodium should be reassessed, pending the results of large trials with hard clinical end points. Fluid overload in patients undergoing hemodialysis contributes to cardiovascular morbidity and mortality. There is a global trend to lower dialysate sodium with the goal of reducing fluid overload. To investigate whether lower dialysate sodium during hemodialysis reduces left ventricular mass, we conducted a randomized trial in which patients received either low-sodium dialysate (135 mM) or conventional dialysate (140 mM) for 12 months. We included adult participants who had a predialysis serum sodium ≥135 mM and were receiving hemodialysis at home or a self-care satellite facility. Exclusion criteria included hemodialysis frequency .5 times per week and use of sodium profiling or hemodiafiltration. The main outcome was left ventricular mass index by cardiac magnetic resonance imaging. The 99 participants had a median age of 51 years; 67 were men, 31 had diabetes mellitus, and 59 had left ventricular hypertrophy.
Over 12 months of follow-up, relative to control, a dialysate sodium concentration of 135 mmol/L did not change the left ventricular mass index, despite significant reductions at 6 and 12 months in interdialytic weight gain, in extracellular fluid volume, and in plasma B-type natriuretic peptide concentration (ratio of intervention to control). The intervention increased intradialytic hypotension (odds ratio [OR], 7.5; 95% confidence interval [95% CI], 1.1 to 49.8 at 6 months; and OR, 3.6; 95% CI, 0.5 to 28.8 at 12 months). Five participants in the intervention arm could not complete the trial because of hypotension. We found no effect on health-related quality of life measures, perceived thirst or xerostomia, or dietary sodium intake. Dialysate sodium of 135 mmol/L did not reduce left ventricular mass relative to control, despite improving fluid status. The Australian New Zealand Clinical Trials Registry, ACTRN12611000975998.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-07-2020
Publisher: Springer Science and Business Media LLC
Date: 21-07-2014
Publisher: Elsevier BV
Date: 12-2021
DOI: 10.1053/J.AJKD.2021.03.018
Abstract: Mortality is an important outcome for all dialysis stakeholders. We examined associations between dialysis modality and mortality in the modern era. Observational study comparing dialysis inception cohorts 1998-2002, 2003-2007, 2008-2012, and 2013-2017. Australia and New Zealand (ANZ) dialysis population. The primary exposure was dialysis modality: facility hemodialysis (HD), continuous ambulatory peritoneal dialysis (CAPD), automated PD (APD), or home HD. The main outcome was death. Cause-specific proportional hazards models with shared frailty and subdistribution proportional hazards (Fine and Gray) models, adjusting for available confounding covariates, were used. In 52,097 patients, the overall death rate improved from ~15 deaths per 100 patient-years in 1998-2002 to ~11 in 2013-2017, with the largest cause-specific contribution from decreased infectious death. Relative to facility HD, mortality with CAPD and APD has improved over the years, with adjusted hazard ratios in 2013-2017 of 0.88 (95% CI, 0.78-0.99) and 0.91 (95% CI, 0.82-1.00), respectively. Increasingly, patients with lower clinical risk have been adopting APD, and to a lesser extent CAPD. Relative to facility HD, mortality with home HD was lower throughout the entire period of observation, despite increasing adoption by older patients and those with more comorbidities. All effects were generally insensitive to the modeling approach (initial vs time-varying modality, cause-specific vs subdistribution regression) and to different follow-up time intervals (5-year vs 7-year vs 10-year). There was no effect modification by diabetes, comorbidity, or sex. Potential for residual confounding, limited generalizability. The survival of patients on PD in 2013-2017 appears greater than the survival for patients on facility HD in ANZ.
Additional research is needed to assess whether changing clinical risk profiles over time, varied dialysis prescription, and morbidity from dialysis access contribute to these findings.
Publisher: Wiley
Date: 29-09-2006
DOI: 10.1111/J.1440-1797.2006.00651.X
Abstract: Although proton pump inhibitors (PPI) are usually safe and effective therapeutic agents, serious adverse effects can occur. The aim of the present study was to report and analyse the clinical features of 15 patients with acute interstitial nephritis (AIN) and acute renal failure from PPI that were referred to renal services in Auckland over a period of 3 years. The clinical presentation, therapeutic drugs, demographic details and renal outcome of the patients were considered. The population at risk and total PPI exposure were able to be defined. The diagnosis of AIN was made by renal biopsy in 12 cases. In all patients, the time-course of drug exposure and improvement of renal function on withdrawal suggested PPI were causal. The median patient age was 78 years. The mean baseline serum creatinine level was 83 micromol/L, peak level 392 micromol/L, and recovery level 139 micromol/L. The erythrocyte sedimentation rate (ESR) and C-reactive protein were elevated at the time of diagnosis in the 11 and 12 patients, respectively, in whom this information was collected (ESR mean 85 mm/h, and C-reactive protein mean 81 mg/L). AIN occurred at 8 per 100 000 patient-years (95% confidence interval 2.6-18.7 per 100 000 patient-years). Although four patients presented with an acute systemic allergic reaction, 11 were asymptomatic with an insidious development of renal failure. PPI are now the most commonly identified cause of AIN in the Auckland area. Recovery occurs after withdrawal of the drug but is often incomplete. Early diagnosis may be facilitated by clinician awareness of the insidious onset of renal failure, and an elevated erythrocyte sedimentation rate and C-reactive protein.
Publisher: Oxford University Press (OUP)
Date: 12-11-2010
DOI: 10.1093/NDT/GFQ694
Abstract: Prolonged intermittent renal replacement therapy (PIRRT) is a dialysis modality for critically ill patients that in theory combines the superior detoxification and haemodynamic stability of continuous renal replacement therapy (CRRT) with the operational convenience, reduced haemorrhagic risk and low cost of conventional intermittent haemodialysis. However, the extent to which PIRRT should replace these other modalities is uncertain because comparative studies of mortality are lacking. We retrospectively examined the mortality data from three general intensive care units (ICUs) in different countries that have switched their predominant therapeutic approach from CRRT to PIRRT. We assessed whether this practice change was associated with a change in mortality rate. Data were analysed from ICUs in New Zealand, Australia and Italy. The study population comprised all patients requiring renal replacement therapy from 1 January 1995 to 31 December 2005 (n = 1347), the period of time spanning the change from CRRT to PIRRT in each unit. Poisson regression models were used to estimate the incidence rate ratio (IRR) for death, comparing the periods before and after the change to PIRRT in each unit. Estimates were adjusted for patient illness severity (APACHE II score) and for the underlying trend in mortality rate over time. The change from CRRT to PIRRT was not associated with any increase in mortality rate, with an adjusted IRR of 1.02 (0.61-1.71). The IRR was virtually identical in the three ICUs (P-value = 0.63 for the difference in the IRR between ICUs). Switching from CRRT to PIRRT was not associated with a change in mortality rate. Pending the results of a randomized trial, our study provides evidence that PIRRT might be equivalent to CRRT in the general ICU patient.
Publisher: Wiley
Date: 25-02-2013
DOI: 10.1111/NEP.12030
Abstract: Recent data have suggested that glomerular filtration rate (GFR) is better predicted in New Zealand (NZ) Māori and Pacific People using the equations for Black people that predict higher GFR for any given serum creatinine. We hypothesized that this might be due to a higher rate of creatinine generation in NZ Māori and Pacific People. To compare creatinine kinetics between different ethnic groups in a cohort of NZ peritoneal dialysis patients. In this retrospective single-centre observational study, creatinine kinetics in 181 patients were determined from timed serum samples, peritoneal dialysate and urine collections between 1 October 2004 and 31 July 2011. Ethnicity was classified as Asian, NZ European, NZ Māori and Pacific People. A total of 799 samples from 181 patients were analysed: 194 in Asians, 127 in NZ Europeans, 268 in NZ Māori, 207 in Pacific People. Pacific People had the highest serum creatinine and lean body mass, and the highest creatinine generation rate at 1349 mg/day, compared with 1049 for Asians, 1186 for NZ Europeans and 1094 for NZ Māori (P = 0.0001). After adjustment for confounding factors, Pacific People had a greater creatinine generation by 140 mg/day compared with NZ Europeans (P = 0.047). Pacific People on peritoneal dialysis in NZ have higher serum creatinine, lean body mass and creatinine generation than other ethnic groups. This is consistent with previous observations that equations for predicting GFR in Black people may have increased accuracy in some Australasian non-White non-Asian populations.
Publisher: International Scientific Information, Inc.
Date: 2015
DOI: 10.12659/MST.895556
Publisher: BMJ
Date: 08-2017
Publisher: Springer Science and Business Media LLC
Date: 08-2015
Publisher: Springer Science and Business Media LLC
Date: 14-08-2013
Publisher: Springer Science and Business Media LLC
Date: 15-05-2012
Abstract: Home hemodialysis is common in New Zealand and associated with lower cost, improved survival and better patient experience. We present the case of a fully trained home hemodialysis patient who exsanguinated at home as a result of an incorrect wash back procedure. The case involves a 67 year old male with a history of well controlled hypertension and impaired glucose tolerance. He commenced on peritoneal dialysis in 2006 following the development of end stage kidney failure secondary to focal segmental glomerulosclerosis. He transferred to hemodialysis due to peritoneal membrane failure in 2010, and successfully trained for home hemodialysis over a 20 week period. Following one month of uncomplicated dialysis at home, he was found deceased on his machine at home in the midst of dialysis. His death occurred during the wash back procedure performed using the “open circuit” method, and resulted from misconnection of the saline bag to the venous end of the extracorporeal blood circuit instead of the arterial end. This led to approximately 2.3L of his blood being pumped into the saline bag resulting in hypovolaemic shock and death from exsanguination. Despite successful training, critical procedural errors can still be made by patients on home hemodialysis. In this case, the error involved misconnection of the saline bag for wash back. This case should prompt providers of home hemodialysis to review their training protocols and manuals. Manufacturers of dialysis machinery should be encouraged to design machines specifically for home hemodialysis, and consider distinguishing the arterial and venous ends of the extracorporeal blood circuit with colour coding or incompatible connectivity, to prevent occurrences such as these in the future.
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 2009
Publisher: SAGE Publications
Date: 10-2007
Abstract: The aim of this study was to identify risk factors for lupus nephritis including clinical, laboratory, and ethnic factors in a cohort of lupus patients in New Zealand. A retrospective study of patients from two teaching hospitals in Auckland, New Zealand. Patients were selected if they had attended as either an inpatient or a rheumatology outpatient between 2000 and 2005. 170 patients had SLE according to ACR classification. Lupus nephritis (LN) was diagnosed according to ACR criteria. Clinical, laboratory, and ethnic data were gathered from the patient notes. Twenty-four patients had LN at diagnosis and 32 patients developed LN after diagnosis. LN was associated with serositis (P = 0.008), cutaneous vasculitis (P = 0.026), anaemia (P = 0.005), CRP elevation over months (P < 0.001), and hypocomplementaemia over months (P < 0.0001). Patients with elevated double-stranded DNA (dsDNA) (× normal) were more likely to develop type IV LN (P = 0.0096). Forty-one percent of patients were Caucasian, 12% Māori, 23% Pacific People, 16% Asian, 6% Indian. Māori patients with SLE (odds ratio (OR) = 8.47, 95% confidence interval (CI) = 2.11-33.96, P = 0.002) and Pacific People (OR = 3.11, 95% CI = 1.29-11.48, P = 0.014) had increased risk for developing LN. Anaemia at presentation (hazard ratio (HR) = 3.2, 95% CI = 1.4-7.1, P = 0.004) and low complement over months (HR = 3.4, 95% CI = 1.4-8.7, P = 0.008) were independent risk factors for developing LN after SLE diagnosis. In New Zealand, Pacific People and Māori patients with SLE have a higher incidence of LN, and patients with anaemia and hypocomplementaemia are more likely to develop LN after diagnosis. Patients with high dsDNA levels are more likely to develop type IV lupus nephritis. Lupus (2007) 16, 830-837.
Publisher: Wiley
Date: 30-05-2006
DOI: 10.1111/J.1440-1797.2006.00572.X
Abstract: The dosing and quantification of acute renal replacement therapy has emerged as one of the most pressing issues in the management of critically ill patients with acute kidney injury. Although there is ongoing debate as to the best marker of uraemic injury in this setting, several landmark studies have identified clearance-related expressions of acute renal replacement therapy dose as important determinants of survival. Part 1 of this review examines the factors affecting delivery of the prescribed acute renal replacement therapy dose. The review continues in Part 2, which examines the implications of recent advances in this area for clinical practice.
Publisher: Oxford University Press
Date: 24-05-2018
DOI: 10.1093/MED/9780199592548.003.0233_UPDATE_001
Abstract: This chapter summarizes current best practice with respect to intermittent haemodialysis and sustained low-efficiency dialysis (SLED) for those with acute kidney injury. These modalities can be delivered using a variety of technology platforms. These platforms for the most part use online dialysate, and water quality needs to be monitored and maintained to current standards. Intermittent haemodialysis and SLED provide reasonable outcomes in experienced hands, and ameliorate morbidity and mortality resulting from the ‘acute uraemic syndrome’: that is, intractable infection, non-resolving shock, and haemorrhage. Careful consideration needs to be given to appropriate modality selection for patients. Lower-efficiency modalities such as continuous therapies or SLED are more appropriate for patients at risk from dialysis disequilibrium syndrome, those with abdominal compartment syndrome, and those who are haemodynamically unstable (including cardiogenic shock). Care should be taken to avoid complications related to rapid fluid and solute removal, anticoagulation, and vascular access. Intradialytic hypotension is detrimental for both general and renal recovery of critically ill patients, and can be mitigated by sodium and ultrafiltration profiling, and frequent treatments and prolonged treatment time to minimize ultrafiltration goals and rates. Irrespective of the modality applied, an adequate dialysis dose must be achieved. This is facilitated through the use of optimally placed and technically superior central venous catheters, and well-considered prescription of haemodialysis and SLED operating parameters. Dose should be monitored regularly through urea kinetic modelling, either using Kt/V for thrice-weekly schedules or the corrected equivalent renal urea clearance (EKRc) for more frequent ones.
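The urea kinetic modelling mentioned above for thrice-weekly schedules is commonly operationalized with the second-generation Daugirdas estimate of single-pool Kt/V. As an illustrative sketch (the function name and the example session values below are hypothetical, not from the chapter):

```python
import math

def single_pool_ktv(pre_bun, post_bun, hours, uf_litres, post_weight_kg):
    """Second-generation Daugirdas estimate of single-pool Kt/V.

    R is the post-/pre-dialysis urea (BUN) ratio; the 0.008*t term
    corrects for intradialytic urea generation, and the final term adds
    the convective contribution of ultrafiltration (UF, in litres)
    scaled by post-dialysis weight W (in kg).
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

# Hypothetical session: pre-BUN 70 and post-BUN 25 mg/dL, 4 h treatment,
# 2.0 L ultrafiltered, 70 kg post-dialysis weight.
ktv = single_pool_ktv(70, 25, 4.0, 2.0, 70.0)
print(f"spKt/V = {ktv:.2f}")  # a value around 1.2 for this session
```

A result of roughly 1.2 or above for this hypothetical session would meet commonly cited thrice-weekly adequacy targets; actual targets and the EKRc calculation for more frequent schedules are beyond this sketch.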
Publisher: Springer Science and Business Media LLC
Date: 17-10-2019
DOI: 10.1186/S12882-019-1561-1
Abstract: Heavy metal poisoning can cause debilitating illness if left untreated, and its management in anuric patients poses challenges. Literature with which to guide clinical practice in this area is rather scattered. We present a case of symptomatic lead and arsenic poisoning from use of Ayurvedic medicine in a 28-year-old man with end-stage kidney disease on chronic hemodialysis. We describe his treatment course with chelating agents and extracorporeal blood purification, and review the relevant literature to provide general guidance. Cumulative clinical experience assists in identifying preferred chelators and modalities of extracorporeal blood purification when managing such patients. However, a larger body of real-world or clinical trial evidence is necessary to inform evidence-based guidelines for the management of heavy metal poisoning in anuric patients.
Publisher: Wiley
Date: 10-2014
DOI: 10.1111/HDI.12235
Publisher: Wiley
Date: 23-02-2012
DOI: 10.1111/J.1440-1797.2011.01558.X
Abstract: Accurate estimation of glomerular filtration rate (GFR) allows early detection of renal disease and maximizes opportunity for intervention. To assess the accuracy of estimated GFR (eGFR) in an Australian and New Zealand cohort with chronic kidney disease using the 4-variable Modification of Diet in Renal Disease equation (MDRD(4V)), the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations, and the Cockcroft and Gault equation with actual and ideal body weight. Retrospective review of patients who had measured GFR (mGFR) by 51Cr-EDTA clearance and simultaneous measurements of serum biochemistry and anthropometrics. eGFR was compared with mGFR using the concordance correlation coefficient (CCC) and Bland-Altman measures of agreement. 178 patients had 441 radioisotope measurements of GFR. Mean mGFR was 22.6 mL/min per 1.73 m². The MDRD(4V) equation using the 'black' correction factor was most accurate, with a mean eGFR of 19.74 (CCC 0.733, bias -2.86). The CKD-EPI equations, also using the 'black' correction factors, were almost as good at 19.11 (CCC 0.719, bias -3.49). The Cockcroft-Gault creatinine clearance values had the poorest agreement with mGFR. In the 18 nonwhite non-Asian patients, the MDRD(4V) and CKD-EPI equations were generally less accurate, although the use of the 'black' correction factor resulted in greater accuracy for both equations. The MDRD(4V) equation was the most accurate. However, its accuracy might be less for nonwhite non-Asian patients if the 'black' correction factor is omitted. Further study of the estimation of GFR in Australian and New Zealand ethnic subgroups would be helpful.
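The 4-variable MDRD equation and its 'black' correction factor discussed in this abstract can be sketched as follows. This uses the IDMS-traceable coefficient of 175 (the original, non-IDMS form uses 186; the abstract does not state which version the study applied), and the patient values are hypothetical:

```python
def mdrd4v_egfr(scr_mg_dl, age_years, female=False, black=False):
    """4-variable MDRD eGFR in mL/min per 1.73 m², IDMS-traceable form.

    The 1.212 multiplier is the 'black' correction factor whose
    applicability to Māori and Pacific patients the study examines.
    """
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # female correction factor
    if black:
        egfr *= 1.212  # 'black' correction factor
    return egfr

# Hypothetical patient: serum creatinine 1.0 mg/dL, age 50, male.
without_factor = mdrd4v_egfr(1.0, 50)
with_factor = mdrd4v_egfr(1.0, 50, black=True)
```

Comparing the two outputs shows the roughly 21% higher eGFR predicted for the same serum creatinine when the correction factor is applied, which is the effect the study evaluated in Australasian ethnic subgroups.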
Publisher: Wiley
Date: 21-07-2015
DOI: 10.1111/NEP.12482
Abstract: Patient socialization and preservation of socioeconomic status are important patient-centred outcomes for those who start dialysis, and retention of employment is a key enabler. This study examined the influence of dialysis inception and modality upon these outcomes in a contemporary Japanese cohort. We conducted a survey of prevalent chronic dialysis patients from 5 dialysis centres in Japan. All patients who had been on peritoneal dialysis (PD) since dialysis inception were recruited, and matched with a sample of those on in-centre haemodialysis (ICHD). We assessed patients' current social functioning (Short Form 36 Health Survey), and evaluated changes to patient employment status, annual income, and general health condition from the pre-dialysis period to the current time. A total of 179 patients were studied (102 PD and 77 ICHD). There were no differences in social functioning by modality. Among them, 113 were employed in the pre-dialysis period, with no difference by modality. Of these, 22% became unemployed after dialysis inception, with a corresponding decline in average working hours and annual income. The odds of unemployment after dialysis inception were 5.02-fold higher in those on ICHD compared to those on PD, after adjustment for covariates. There were no changes for those who were already unemployed in the pre-dialysis period. Employment status is significantly hampered by dialysis inception, although PD was associated with superior retention of employment and greater income compared to ICHD. This supports a positive role for PD in preservation of socioeconomic status and potentially other patient-centred outcomes.
Publisher: SAGE Publications
Date: 12-2018
Abstract: There is an emerging practice pattern of automated peritoneal dialysis (APD) in China. We report on outcomes compared to continuous ambulatory peritoneal dialysis (CAPD) in a Chinese cohort. Data were sourced from the Baxter Healthcare (China) Investment Co. Ltd Patient Support Program database, comprising an inception cohort commencing PD between 1 January 2005 and 13 August 2015. We used time-dependent cause-specific Cox proportional hazards and Fine-Gray competing risks (kidney transplantation, change to hemodialysis) models to estimate relative mortality risk between APD and CAPD. We adjusted or matched for age, gender, employment, insurance, primary renal disease, size of PD program, and year of dialysis inception. We used cluster robust regression to account for center effect. We modeled 100,351 subjects from 1,178 centers over 240,803 patient-years. Of these, 368 received APD at some time. Compared with patients on CAPD, those on APD were significantly younger, more likely to be male, employed, self-paying, and from larger programs. Overall, APD was associated with a hazard ratio (HR) for death of 0.79 (95% confidence interval [CI] 0.64 – 0.97) compared with CAPD in Cox proportional hazards models, and 0.76 (0.62 – 0.95) in Fine-Gray competing risks regression models. There was prominent effect modification by follow-up time: benefit was observed only up to 4 years follow-up, after which risk of death was similar. Automated peritoneal dialysis is associated with an overall lower adjusted risk of death compared with CAPD in China. Analyses are limited by the likelihood of important selection bias arising from group imbalance, and residual confounding from unavailability of important clinical covariates such as comorbidity and Kt/V.
Publisher: Elsevier BV
Date: 03-2009
DOI: 10.1053/J.AJKD.2008.09.019
Abstract: Catheter-restricted antimicrobial lock (AML) use reduces catheter-associated bloodstream infection (CA-BSI) in clinical trial settings, but may not be as effective in clinical settings and may increase bacterial resistance. Quality improvement report analyzed using a cross-sectional time series (unbalanced panel) design. The study cohort comprised all prevalent adults treated with hemodialysis through a tunneled catheter for any, but not necessarily all, of the time from January 1, 2003, to June 30, 2006, in Manukau City, New Zealand (135,346 catheter-days, 404 tunneled catheters, 320 patients). Catheter-restricted AMLs (heparin plus gentamicin) for all tunneled catheters from July 1, 2004. Repeated observations of CA-BSI, hospitalization, tunneled catheter removal, and death from CA-BSI analyzed by using generalized estimating equations with a single level of clustering for each tunneled catheter and patterns of bacterial resistance analyzed by using simple descriptive statistics. AML use was associated with reductions in rates of CA-BSI and hospitalization for CA-BSI by 52% and 69% for patients with tunneled catheters locked continuously with AMLs since their insertion compared with those with tunneled catheters that were not, respectively. AML exposure also was associated with a trend to increased gentamicin resistance amongst coagulase-negative staphylococci isolates, a pattern similar to that observed for BSIs in our general hemodialysis population in which tunneled catheters were not the source of BSI, but different from that in the general non-end-stage renal disease population in the region. This is an uncontrolled observational study and cannot prove causality. The follow-up period of 18 months is longer than for other studies, but still too short to definitely answer whether AML use drives bacterial resistance. 
A change to use of AMLs may improve clinical outcomes; however, additional study of associated bacterial resistance is needed before AML use becomes standard care.
Publisher: SAGE Publications
Date: 08-04-2022
Abstract: Peritoneal dialysis (PD)-related peritonitis is one of the top priorities for care and research among PD stakeholders. This study summarizes PD peritonitis rates from available population-based national or regional registries around the world, examining trends over time. This is a systematic review of PD peritonitis rates in patients treated with PD for kidney failure, from census-based national or provincial/statewide provider registries or databases. MEDLINE (via PubMed) was searched from inception to August 2020, and inquiries were made to national registry personnel using the International Comparisons section of the 2018 United States Renal Data System Annual Data Report as a contact list. Quantitative synthesis was done using weighted random-effects Poisson regression. Of 81 countries that reported utilization of PD, 19 did not have a traditional dialysis registry (governed by either professional societies or government entities), and only 33 monitored PD peritonitis rates correctly and accessibly. There is wide variation in PD peritonitis rates between countries, although the global average has been decreasing over time, from 0.600 episodes per patient-year in 1992 to 0.303 in 2019. Other sources of variability include the continent in which the country is nested and the size of its PD population. PD peritonitis, despite its importance for PD stakeholders, is under-monitored. While the global rate is decreasing over time, the presence and extent of this improvement varies from country to country. There is an opportunity for better monitoring, research into underachieving and overachieving nations, and development of international clinical support networks.
Publisher: Springer Science and Business Media LLC
Date: 23-04-2019
Publisher: Wiley
Date: 22-07-2020
DOI: 10.1111/SDI.12906
Publisher: SAGE Publications
Date: 27-07-2021
Publisher: Registre de Dialyse Peritoneale de Langue Francaise (RDPLF)
Date: 06-09-2022
Abstract: Peritonitis is the most important therapy-related complication of peritoneal dialysis (PD). Monthly or quarterly PD peritonitis rate statistics are used to identify special cause variation within or between individual PD centres, to highlight any need for quality improvement. Unfortunately, many PD centres do not accurately track “patient flow” (i.e., when patients start and finish on PD), and therefore cannot measure PD peritonitis rate. In this study, we validate an estimating formula for month-on-month annualised PD peritonitis rate, which calculates time-at-risk from “patient stock” (i.e., the number of prevalent patients on PD at the beginning and end of the month). We compared centers’ estimated peritonitis rates with gold-standard measurements in the Australia and New Zealand Dialysis and Transplant Registry / New Zealand PD Registry, and Le Registre de Dialyse Péritonéale de Langue Française et hémodialyse à domicile. A total of 268 centers from 9 countries with 1,020,260 patient-months of follow-up and 19,669 episodes of peritonitis were modeled. Overall agreement was excellent between estimates and gold-standard measurements, with a concordance correlation coefficient (CCC) of 0.998 (95% confidence interval [CI] 0.998-0.998) in both registries. There was statistically significant lower agreement for smaller centers, although the CCC was still greater than 0.995. There were no instances of clinically significant misclassification of centers as being compliant or non-compliant with PD peritonitis standards with the use of the estimating formula. The simplified method of calculating the PD peritonitis rate is accurate and will allow more centers around the world to measure, report, and work on reducing PD peritonitis rates.
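The kind of stock-based estimate described above can be sketched in a few lines. The abstract does not reproduce the published formula, so the specific step of averaging beginning- and end-of-month patient stock to approximate time-at-risk is an illustrative assumption, as are the function name and example counts:

```python
def annualised_peritonitis_rate(episodes, stock_begin, stock_end):
    """Estimate one month's annualised peritonitis rate from 'patient stock'.

    Time at risk for the month is approximated as the average of the
    prevalent patient counts at the start and end of the month
    (patient-months), then converted to patient-years. This averaging
    step is an illustrative assumption, not the registry's published
    formula.
    """
    patient_months = (stock_begin + stock_end) / 2.0
    patient_years = patient_months / 12.0
    return episodes / patient_years  # episodes per patient-year

# Hypothetical centre: 40 prevalent patients at the start of the month,
# 42 at the end, and 1 peritonitis episode during the month.
rate = annualised_peritonitis_rate(1, 40, 42)
print(round(rate, 2))
```

The appeal of such an approach, as the study notes, is that prevalent patient counts ("stock") are far easier for centres to obtain than individual start and finish dates ("flow").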
Publisher: Oxford University Press (OUP)
Date: 07-09-2016
DOI: 10.1093/NDT/GFV330
Abstract: Home dialysis can offer improved quality of life and economic benefits compared with facility dialysis. Yet the uptake of home dialysis remains low around the world, which may be partly due to patients' lack of knowledge and barriers to shared and informed decision-making. We aimed to describe patient and caregiver values, beliefs and experiences when considering home dialysis, to inform strategies to align policy and practice with patients' needs. Semi-structured interviews with adult patients with chronic kidney disease Stage 4-5D (on dialysis <1 year) and their caregivers, recruited from three nephrology centres in New Zealand. Transcripts were analysed thematically. In total, 43 patients [pre-dialysis (n = 18), peritoneal dialysis (n = 13), home haemodialysis (n = 4) and facility haemodialysis (n = 9)] and 9 caregivers participated. We identified five themes related to home dialysis: lacking decisional power (complexity of information, limited exposure to home dialysis, feeling disempowered, deprived of choice, pressure to choose), sustaining relationships (maintaining cultural involvement, family influence, trusting clinicians, minimizing social isolation), reducing lifestyle disruption (sustaining employment, avoiding relocation, considering additional expenses, seeking flexible schedules, creating free time), gaining confidence in choice (guarantee of safety, depending on professional certainty, reassurance from peers, overcoming fears) and maximizing survival. To engage and empower patients and caregivers to consider home dialysis, a stronger emphasis on the development of patient-focused educational programmes and resources is suggested. Pre-dialysis and home dialysis programmes that address health literacy and focus on cultural and social values may reduce fears and build confidence around decisions to undertake home dialysis. 
Financial burdens may be minimized through provision of reimbursement programmes, employment support and additional assistance for patients, particularly those residing in remote areas.
Publisher: SAGE Publications
Date: 18-02-2020
Abstract: There are a number of misconceptions around the identified early survival benefit of peritoneal dialysis (PD) relative to hemodialysis (HD), including that such benefits “even out in the end” since the relative risk of death over time eventually encompasses 1.0 (or even an estimate that is unfavorable to PD); that the early benefit is, in fact, most likely due to unmeasured confounding; and that such benefits are only due to the influence of central venous catheters and “crash starters” in the HD group. In fact, the early survival benefit results in a substantial gain of patient life years in PD cohorts relative to HD ones even if the benefit appears to “even out in the end,” is relatively insensitive to unmeasured confounding, and persists even when the effects of central venous catheters are accounted for. In this review, the calculations and arguments are made to support these tenets. Survival on dialysis is still one of the most important considerations for all stakeholders in the end-stage kidney disease community, including patients, who rank it among their top priorities. Shared decision-making is a fundamental patient right and requires both balanced information and an iterative mechanism for a consensual decision based on shared understanding and purpose. A cornerstone of this process should be an explicit discussion of the early survival benefit of PD relative to HD.
Publisher: Elsevier BV
Date: 10-2001
Publisher: Elsevier BV
Date: 10-2017
DOI: 10.1053/J.AJKD.2017.05.014
Abstract: Bleeding from dialysis vascular access (arteriovenous fistulas, arteriovenous grafts, and vascular catheters) is uncommon. Death from these bleeds is rare and likely to be under-reported, with incidence rates of fewer than 1 episode for every 1,000 patient-years on dialysis, meaning that dialysis units may experience this catastrophic event only once a decade. There is an opportunity to learn from (and therefore prevent) these bleeding deaths. We reviewed all reported episodes of death due to vascular access bleeding in Australia and New Zealand over a 14-year period, together with individual dialysis units' root cause analyses of each event. In this perspective, we provide a clinically useful summary of the evidence and knowledge gained from these rare events. Our conclusion is that death due to dialysis vascular access hemorrhage is an uncommon, catastrophic, but potentially preventable event if the right policies and procedures are put in place.
Publisher: Elsevier BV
Date: 03-2016
DOI: 10.1016/J.HLC.2015.08.012
Abstract: Atrial fibrillation (AF) is the commonest cardiac arrhythmia, including in end-stage renal failure patients, but controversy remains whether these patients benefit from anticoagulation. We reviewed the characteristics, management and outcomes of end-stage renal failure patients on dialysis with AF. All patients started on dialysis at Middlemore Hospital between January 2000 and December 2008 who had AF were studied. Data regarding demographics, co-morbidities, renal disease, AF and embolic, bleeding and/or mortality events were recorded. There were 141 out of 774 (18.2%) dialysis patients with AF, followed up for 4.4 ± 2.5 years, and 41.8% (59) were on warfarin. Incidences of all embolic events, ischaemic stroke, all bleeding and intracranial bleed were 4.1, 3.1, 9.6 and 0.82 per 100 person-years, respectively. Warfarin anticoagulation was associated with an increased risk of intracranial bleed (hazard ratio = 11.1, P = 0.038), but not with total embolic events, bleeding events or mortality during follow-up (P = 0.317-0.980). All three scores (CHADS2, CHA2DS2-VASc and HAS-BLED) could detect all embolic events (c = 0.808-0.838), but not bleeding events (c = 0.459-0.498). Anticoagulation with warfarin did not significantly reduce embolism or mortality in dialysis patients with AF, but increased the risk of intracranial bleeds. Conventional risk scores predict embolic but not bleeding events in these patients.
Publisher: Springer Science and Business Media LLC
Date: 02-2019
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 10-2011
DOI: 10.2215/CJN.00670111
Publisher: SAGE Publications
Date: 18-05-2022
DOI: 10.1177/08968608211016127
Abstract: Survival of peritoneal dialysis (PD) patients in Japan is high, but few reports exist on cause-specific mortality, transfer to haemodialysis (HD) or hybrid dialysis, and hospitalisation risks. We aimed to identify reasons for transfer to HD, hybrid dialysis and hospitalisation in the Japan Peritoneal Dialysis and Outcomes Practice Patterns Study. This observational study included 808 adult PD patients across 31 facilities in Japan in 2014–2017. Information on all-cause and cause-specific mortality and hospitalisation and on permanent transfer to HD and PD/HD hybrid therapy was prospectively collected and rates calculated. Median follow-up time was 1.66 years, during which 162 patients transferred to HD, 79 transferred to hybrid dialysis and 74 patients died. All-cause and cardiovascular disease (CVD)-related mortality rates were 5.1 and 1.7 deaths/100 patient-years, respectively. Rates of transfer to HD and hybrid therapy were 11.2 and 5.5 transfers/100 patient-years, respectively. Among HD transfers, 40% were due to infection (including peritonitis), while 20% were due to inadequate solute/water clearance. Eighty-one percent of hybrid dialysis transfers were due to inadequate solute/water clearance. All-cause, peritonitis-related and CVD-related hospitalisation rates were 120.4, 21.1 and 15.6/100 patient-years, respectively. Median hospital length of stay was 19 days. Mortality, hospitalisation and transfer to HD/hybrid dialysis rates are relatively low in Japan compared with many other countries, with hybrid transfers accounting for one-third of dialysis transfers from PD. Further study is needed to explain the high inter-facility variation in hospitalisation rates and how to further reduce hospitalisation rates for Japanese PD patients.
Publisher: Oxford University Press
Date: 10-2015
DOI: 10.1093/MED/9780199592548.001.0001
Abstract: With expert input from additional section editors William G. Bennett, Jeremy R. Chapman, Adrian Covic, Marc E. De Broe, Vivekanand Jha, Neil Sheerin, Robert Unwin, and Adrian Woolf, the Oxford Textbook of Clinical Nephrology is a three-volume international textbook of nephrology with an unrivalled clinical approach backed up by science. It has been completely rewritten in 365 chapters for its fourth edition to bring it right up to date, make it easier to obtain rapid answers to questions, and to suit delivery in electronic formats as well as in print. This edition offers increased focus on the medical aspects of transplantation, HIV-associated renal disease, and infection and renal disease, alongside entirely new sections on genetic topics and clinical and physiological aspects of fluid/electrolyte and tubular disorders. The emphasis throughout is on marrying advances in scientific research with clinical management. The target audience is primarily the nephrologist in clinical practice and training as well as other healthcare professionals with an interest in renal disease.
Publisher: Microbiology Society
Date: 02-2011
Abstract: We present a case of soft tissue infection caused by the basidiomycete Phellinus undulatus. To our knowledge, this is the first reported case of human infection caused by this fungus. Definitive identification was only possible through molecular analysis, as the isolate failed to produce any distinct morphological features in vitro.
Publisher: SAGE Publications
Date: 07-04-2021
Publisher: Elsevier BV
Date: 04-2016
DOI: 10.1053/J.AJKD.2015.09.025
Abstract: Intensive hemodialysis (HD) is characterized by increased frequency and/or session length compared with conventional HD. Previous analyses from Australia and New Zealand did not suggest benefit with intensive HD, although recent research suggests that relationships have changed. We present updated analyses. Observational cohort study using marginal structural modeling to adjust for changes in renal replacement modality and time-varying medical comorbid conditions. Adults initiating renal replacement therapy since March 31, 1996, followed up through December 31, 2012; this analysis included 40,842 patients over 2,187,689 patient-months. Time-varying renal replacement modality: conventional facility HD (≤3 times per week, ≤6 hours per session), quasi-intensive facility HD (between conventional and intensive), intensive facility HD (≥5 times per week, any hours per session), conventional home HD, quasi-intensive home HD, intensive home HD, peritoneal dialysis, deceased donor kidney transplantation, and living donor kidney transplantation. Patient mortality, with a 3-month lag in primary analyses and 6- and 12-month lags in sensitivity analyses. Conventional facility HD was the reference group. Conventional home HD had a similar mortality risk. For quasi-intensive home HD, mortality risk was lower (HR, 0.56; 95% CI, 0.44-0.73). For intensive home HD, mortality risk was nonsignificantly lower in primary analyses and significantly lower using a 6-month lag (HR, 0.41; 95% CI, 0.20-0.85), but not using a 12-month lag. For quasi-intensive facility HD, mortality risk was nonsignificantly lower in primary analyses, although significantly lower using 6- (HR, 0.41; 95% CI, 0.20-0.85) and 12-month lags (HR, 0.59; 95% CI, 0.44-0.80). Mortality risk was similar between intensive and conventional facility HD. For peritoneal dialysis, mortality risk was greater than for conventional facility HD (HR, 1.07; 95% CI, 1.03-1.12). Kidney transplantation had the lowest mortality risk.
Potential residual confounding from limited collection of comorbid condition, socioeconomic, and medication data. There is an emerging HD dose-effect in Australia and New Zealand, with lower mortality risks associated with some of the more intensive HD regimens in these countries.
Publisher: S. Karger AG
Date: 2019
DOI: 10.1159/000499357
Abstract: Background: oXiris is a blood purification product that has been launched recently in China. In addition to renal function support and fluid management capabilities, it can also adsorb cytokines and endotoxins. This may complement standard treatment for septic acute kidney injury (AKI) patients to control the amplitude of the systemic inflammatory response responsible for acute tissue and organ damage. The objectives of our study are to elucidate the characteristics of septic AKI patients who respond to treatment with oXiris and to describe the performance of oXiris through patient cases, in the absence of large randomized trials on clinical use of oXiris for septic AKI patients in China. Summary: Here, we present 4 cases managed in intensive care units of major hospitals in China. Key practical aspects from an expert meeting discussing these cases have been included as guidance for the use of oXiris in septic AKI patients. Key Messages: Based on the experience gathered from these 4 cases, oXiris should be used early in the treatment of septic AKI patients as an adjuvant therapy with good infection source control. It should not be used to delay or replace infection source control. These cases also demonstrated that patients at high risk of bleeding can use oXiris without additional anticoagulation for up to 36 h without implications for serum protein levels and platelet count. Short of definitive biomarkers to gauge the ideal blood purification initiation and discontinuation time for septic AKI patients, clinical judgment is key to determining optimal use of oXiris in septic AKI patients.
Publisher: Wiley
Date: 04-2015
DOI: 10.1111/HDI.12248
Abstract: Interest in home hemodialysis (HD) is high because of the reported benefits and its excellent safety record. However, the potential for serious adverse events (AEs) exists when patients perform HD in their homes without supervision. We review the epidemiology of dialysis-related emergencies during home HD, and present a conceptual and practical framework for the prevention and management of serious AEs for those patients performing home HD. In addition, we describe a formal monitored and iterative quality assurance program, and make suggestions for the future development of safety strategies to mitigate the risk of AEs in home HD.
Publisher: Elsevier BV
Date: 02-2016
Publisher: Elsevier BV
Date: 11-2016
DOI: 10.1016/J.KINT.2016.06.040
Abstract: Incremental hemodialysis may offer substantial clinical, patient-centered, and cost benefits. In the largest and most rigorous study to date, Mathew et al. show that incremental hemodialysis is associated with satisfactory survival in those with adequate residual renal function and reasonable general health. While encouraging, these findings are a call to action for a suitably powered randomized clinical trial, to generate definitive evidence of benefit with this approach and define optimal practice for "real-world" implementation.
Publisher: MDPI AG
Date: 18-04-2018
DOI: 10.3390/NU10040502
Publisher: SAGE Publications
Date: 03-2014
Publisher: Elsevier BV
Date: 2017
Publisher: Elsevier BV
Date: 09-2015
DOI: 10.1053/J.AJKD.2015.03.014
Abstract: In most studies, home dialysis associates with greater survival than facility hemodialysis (HD). However, the relationship between mortality risk and modality can vary by era. We describe and compare changes in survival with facility HD, peritoneal dialysis, and home HD over a 15-year period using data from the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA). An observational inception cohort study, using Cox proportional hazards and competing-risks regression. All adult patients initiating renal replacement therapy in Australia and New Zealand since March 31, 1998, followed up to December 31, 2012. Era at dialysis inception (1998-2002, 2003-2007, and 2008-2012). We adjusted for time-varying dialysis modality and comorbid conditions, demographics, initial state/country of treatment, late referral for nephrology care, primary kidney disease, and kidney function at dialysis inception. Patient mortality. Survival on dialysis therapy has improved despite increasing patient comorbid conditions. Compared with 1998 to 2002, there has been a 21% reduction in mortality for those on facility HD therapy, a 27% reduction for those on peritoneal dialysis therapy, and a 49% reduction for those on home HD therapy. Potential for residual confounding from limited collection of comorbid conditions; analyses lack data for blood pressure, fluid volume status, socioeconomics, medication, and biochemical parameters. Our study indicates that outcomes on dialysis therapy are improving with time and that this improvement is most marked with home dialysis modalities, especially home HD. This might be the result of better dialysis care (eg, improving predialysis care and more appropriate selection of patients for home dialysis). Other contributing factors are possible, such as improvements in general care of patient comorbid conditions and improvements in dialysis technology, although further research is needed to clarify these issues.
Publisher: Elsevier BV
Date: 2018
DOI: 10.1016/J.KINT.2017.09.014
Abstract: In this issue of Kidney International, Tennankore et al. present a robust observational study that evaluates different submodalities of intensive hemodialysis. Their paper addresses a critical question, but more importantly illustrates the limits of statistical adjustment when there is major crossover between groups and important center effect. Intensive hemodialysis remains a strong contender to meet the needs of modern patients, and-no matter how challenging-well-deserving of further definitive study in clinical trials.
Publisher: Wiley
Date: 16-11-2017
DOI: 10.1111/NEP.12920
Abstract: There is little research exploring the association between clinicians' behaviours and home dialysis uptake. This paper aims to better understand the influence of clinicians on home dialysis modality recommendations and uptake. We conducted an online survey of all NZ renal units to determine the influence of individuals within pre-dialysis teams. We used the self-declaration scale of influence to rate each identified member's perceived influence on decision-making. We used this measure of 'decisional power' to compare the perceived influence of pre-dialysis nurses with that of nephrologists, using both parametric and non-parametric methods. We developed a generalized linear model to investigate the relationship between the influence of nephrologists and pre-dialysis nurses and home dialysis uptake by individual centre, using additional data from the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA). Finally, respondents rated the importance of a list of patient- and service-level factors in recommendations for home dialysis. Data suggest that nephrologists are the most influential members of the pre-dialysis team. This contrasts with the perceptions of survey respondents, who view pre-dialysis nurses as most influential. Nephrologists' recommendations are likely to be a successful way of increasing home dialysis: a single-point increase in nephrologist decisional power is associated with a 6.1% increase in the prevalence of home dialysis. The decisional power around home dialysis in NZ sits with nephrologists. It is therefore critical that nephrologists exercise their decisional power in advocating home dialysis and address reasons why they may not recommend home dialysis to well-suited and appropriate patients.
Publisher: Wiley
Date: 04-2015
DOI: 10.1111/HDI.12243
Abstract: Planning and funding a home hemodialysis (HD) program requires a well-organized effort and close collaboration between clinicians and administrators. This resource provides guidance on the processes that are involved, including: a thorough situational analysis of the dialysis landscape, emphasizing the opportunity for a home HD program; careful consideration of the clinical and operational characteristics of a proposed home HD program at your institution; the development of a compelling business case, highlighting the clinical and organizational benefits of a home HD program; and careful construction and evaluation of a request for proposal.
Publisher: Oxford University Press (OUP)
Date: 19-03-2004
DOI: 10.1093/NDT/GFH218
Publisher: Wiley
Date: 16-01-2019
Publisher: Wiley
Date: 30-05-2006
DOI: 10.1111/J.1440-1797.2006.00581.X
Abstract: The dosing and quantification of acute renal replacement therapy has emerged as one of the most pressing issues in the management of critically-ill patients with acute kidney injury. Although there is ongoing debate as to the best marker of uraemic injury in this setting, several landmark studies have identified clearance-related expressions of acute renal replacement therapy dose as important determinants of survival. Part 1 of this review examined the factors affecting the delivery of prescribed acute renal replacement therapy dose. Part 2 summarises and contextualises findings from recent dose-outcome studies, and reviews clinical tools to assist in the prescription and quantification of acute renal replacement therapy dose.
Publisher: Oxford University Press (OUP)
Date: 05-06-2007
DOI: 10.1093/NDT/GFM220
Abstract: Dialysate [Na+] is often overlooked as a contributor to hypertension in patients on haemodialysis (HD). We report observational experience with a facility-level decrease in dialysate [Na+] from 141 mmol/l to 138 mmol/l, in the absence of concurrent change with respect to dietary sodium regulation. The sample comprised all patients (n=52) dialysing at a single HD facility over an 8-month period flanking the change in dialysate [Na+]. Outcomes included repeated observations of blood pressure (BP), interdialytic weight gain (IDWG), pre-dialysis plasma [Na+] and adverse events. Predictors other than dialysate [Na+] included patient demographics, clinical characteristics and number of antihypertensive medications. The study used a longitudinal unbalanced panel design, and hierarchical linear and Poisson mixed models. In multivariate analyses, the change in dialysate [Na+] was associated with a statistically significant small to medium-sized decrease in pre- and post-dialysis systolic and diastolic BP and pre-dialysis plasma [Na+], but not IDWG. Change was greatest in the patient tertile with the highest initial BP. There was no change in the frequency of adverse events. Modelling dialysate [Na+] exposure as the diffusion gradient from dialysate to blood water did not improve the strength of associations. A facility-level decrease in dialysate [Na+] from 141 mmol/l to 138 mmol/l appears to be safe and well tolerated, and a useful means of improving BP control. The lack of change in IDWG probably reflects lack of dietary salt restriction, but does raise the issue of volume-independent effects of sodium exposure on BP.
Publisher: Wiley
Date: 05-2019
DOI: 10.1111/NEP.13556
Abstract: Haemodialysis is usually started at a frequency of three times a week, with occasional patients starting twice weekly ('incremental dialysis'). Incremental haemodialysis (HD) may preserve residual kidney function and has been associated with reduced mortality. In the present study, we report the prevalence and outcomes of incremental dialysis in Australia and New Zealand. The cohort was all adults starting renal replacement therapy with HD in Australia and New Zealand in 2004-2015. We used Cox proportional hazards modelling with a primary exposure of dialysis frequency at first survey date (≥ or <3 times per week). The primary outcome was all-cause mortality; secondary outcomes were cardiovascular and non-cardiovascular mortality. Eight hundred fifty of 27,513 subjects were started on twice-weekly HD (prevalence 3%). Compared with conventional patients, incremental dialysis patients were older (67 vs 62 years, P < 0.001) and had a lower body mass index (26.1 vs 27.7 kg/m²). Incremental dialysis was used infrequently, and there was evidence of patient-level differences. All-cause mortality was similar, but there were differences in cause-specific mortality. Incremental dialysis needs to be tested in prospective trials to define the safety and efficacy of this approach.
Publisher: Springer Science and Business Media LLC
Date: 12-2015
Publisher: Wiley
Date: 12-2003
DOI: 10.1111/J.1440-1797.2003.00207.X
Abstract: Renal replacement therapy is frequently required for critically ill patients with a high risk of bleeding. Conventional heparinization strategies to prevent extracorporeal blood circuit clotting can cause significant haemorrhage in such patients because of systemic anticoagulation. Regional citrate anticoagulation (RCA) is a well-established technique that minimizes this complication by the decalcification of blood in the extracorporeal circuit such that it is incapable of clotting. To date, there are no reports on the use of RCA for sustained low-efficiency dialysis/diafiltration (SLED), a hybrid therapy that involves the use of conventional haemodialysis machinery to deliver lower solute clearances over prolonged periods of time. In preparation for clinical study, an in vitro simulation of SLED was devised (blood substitute flow 250 mL/min, dialysate flow 200 mL/min, predilution haemofiltration 100 mL/min). Blood substitute was decalcified by an infusion of 4% trisodium citrate (TSC) proximally into the extracorporeal blood circuit, with partial restoration of calcium homeostasis from dialysate containing ionized [Ca2+] at 0.9 mmol/L. This simulation was used to establish first the 4% TSC requirement for therapeutic decalcification, and second the associated changes in ionized [Ca2+] and [Mg2+] within the blood substitute from chelation with citrate and subsequent removal of the resulting divalent cation-citrate complex. Serial measurements of blood substitute [Ca2+] from strategic points along the extracorporeal circuit showed that therapeutic decalcification was not achieved with 4% TSC infusion rates up to 400 mL/h, and extrapolation of experimental results suggests that 450 mL/h will be required. Under these conditions, ionized [Ca2+] and [Mg2+] in the blood substitute venous return would be 0.42 and 0.2 mmol/L, respectively, with 0.35 mmol of citrate being returned per minute via the blood substitute venous return. 
These results were modelled for various changes in SLED operating parameters, and discussed in detail. An appropriate regimen for 4% TSC infusion and divalent cation replacement is proposed for clinical study in the future.
Publisher: Elsevier BV
Date: 07-2010
DOI: 10.1053/J.JRN.2009.10.002
Abstract: To consider the Kidney Disease Outcomes Quality Initiative recommendation of using multiple nutritional measurements for patients on maintenance dialysis, we explored data for independent and joint associations of nutritional indicators with mortality risk among maintenance hemodialysis patients treated in 12 countries. Dialysis units in seven European countries, the United States, Canada, Australia, New Zealand, and Japan. Mortality risk. We conducted a prospective cohort study of 40,950 patients from phases I to III of the Dialysis Outcomes and Practice Patterns Study (1996-2008). Independent and joint effects (interactions) of nutritional indicators (serum creatinine, serum albumin, normalized protein catabolic rate, body mass index [BMI]) on mortality risk were assessed by Cox regression with adjustments for demographics, years on dialysis, and comorbidities. Important variations in nutritional indicators were seen by country and patient characteristics. Poorer nutritional status assessed by each indicator was independently associated with higher mortality risk across regions. Significant multiplicative interactions (each p ≤ 0.01) between indicators were also observed. For example, using patients with serum creatinine 7.5-10.5 mg/dL and BMI 21-25 kg/m² as referent, higher BMI was associated with lower mortality risk among patients with serum creatinine >10.5 mg/dL (relative risk = 0.68) but with higher mortality risk among those with creatinine <7.5 mg/dL (relative risk = 1.38). The association of lower albumin concentration with higher mortality risk was stronger for patients with lower BMI or lower creatinine. The joint effects of nutritional indicators on mortality indicate the need to use multiple measurements when assessing the nutritional status of hemodialysis patients.
Publisher: Elsevier BV
Date: 04-2013
DOI: 10.1053/J.AJKD.2012.10.020
Abstract: There is revived interest in home hemodialysis (HD), which is spurred by cost containment and experience indicating lower mortality risk compared with facility HD and peritoneal dialysis (PD). Social barriers to home HD include disruptions to the home environment, interference with family life, overburdening of support networks, and fear of social isolation. A submodality of home HD, in which patients from urban settings undertake independent HD in unstaffed nonmedical community-based home-like settings, is described in this study. The survival of patients treated in this manner is compared with that of those using conventional home HD. An observational cohort study using the Australia and New Zealand Dialysis and Transplant Registry. All adult patients starting renal replacement therapy in New Zealand since March 31, 2000, followed up through December 31, 2010. The main predictor was time-varying dialysis modality (home HD, facility HD, PD, and community house HD), adjusting for the confounding effects of patient demographics and time-varying comorbid conditions. Patient mortality. 4,709 patients with 12,883 patient-years of follow-up (5,591, PD; 1,532, home HD; 5,647, facility HD; and 113, community house HD) were analyzed. Community house HD patients were younger, healthier, and more likely to be Pacific people than those using other modalities, including home HD. Relative to home HD, adjusted mortality HRs were 2.18 (95% CI, 1.78-2.67) for facility HD, 2.17 (95% CI, 1.77-2.66) for PD, and 1.48 (95% CI, 0.64-3.40) for community house HD. Small number of patients receiving community house HD, possible residual confounding from the limited collection of comorbid conditions (eg, no collection of cognitive or motor impairment), and absence of socioeconomic, medication, and biochemical data in analyses. Within limits, this study shows community house HD to be both safe and effective. Community house HD provides an option to improve the uptake of home HD.
Publisher: Springer Science and Business Media LLC
Date: 15-07-2013
Abstract: The current literature recognises that left ventricular hypertrophy makes a key contribution to the high rate of premature cardiovascular mortality in dialysis patients. Determining how we might intervene to ameliorate left ventricular hypertrophy in dialysis populations has become a research priority. Reducing sodium exposure through lower dialysate sodium may be a promising intervention in this regard. However, there is clinical equipoise around this intervention because the benefit has not yet been demonstrated in a robust prospective clinical trial, and several observational studies have suggested sodium lowering interventions may be deleterious in some dialysis patients. The Sodium Lowering in Dialysate (SoLID) study is funded by the Health Research Council of New Zealand. It is a multi-centre, prospective, randomised, single-blind (outcomes assessor), controlled parallel assignment 3-year clinical trial. The SoLID study is designed to study what impact low dialysate sodium has upon cardiovascular risk in dialysis patients. The study intends to enrol 118 home hemodialysis patients from 6 sites in New Zealand over 24 months and follow up each participant over 12 months. Key exclusion criteria are: patients who dialyse more frequently than 3.5 times per week, pre-dialysis serum sodium of mM, and maintenance hemodiafiltration. In addition, some medical conditions, treatments or participation in other dialysis trials, which contraindicate the SoLID study intervention or confound its effects, will be exclusion criteria. The intervention and control groups will be dialysed using dialysate sodium of 135 mM and 140 mM, respectively, for 12 months. The primary outcome measure is left ventricular mass index, as measured by cardiac magnetic resonance imaging, after 12 months of intervention. 
Eleven or more secondary outcomes will be studied in an attempt to better understand the physiologic and clinical mechanisms by which lower dialysate sodium alters the primary end point. The SoLID study is designed to clarify the effect of low dialysate sodium upon the cardiovascular outcomes of dialysis patients. The study results will provide much needed information about the efficacy of a cost effective, economically sustainable solution to a condition which is curtailing the lives of so many dialysis patients. Australian and New Zealand Clinical Trials Registry number: ACTRN12611000975998
Publisher: Wiley
Date: 09-04-2012
Publisher: Wiley
Date: 04-2015
DOI: 10.1111/HDI.12254
Abstract: Patient selection and training is arguably the most important step toward building a successful home hemodialysis (HD) program. We present a step-by-step account of home HD training to guide providers who are developing home HD programs. Although home HD training is an important step in allowing patients to undergo dialysis in the home, there is a surprising lack of systematic research in this field. Innovations and research in this area will be pivotal in further promoting a higher acceptance rate of home HD as the renal replacement therapy of choice.
Publisher: Springer Science and Business Media LLC
Date: 25-10-2016
Publisher: BMJ
Date: 03-2017
Publisher: MDPI AG
Date: 24-01-2020
DOI: 10.3390/RS12030375
Abstract: We employed a global high-resolution inverse model to optimize the CH4 emission using Greenhouse gas Observing Satellite (GOSAT) and surface observation data for a period from 2011–2017 for the two main source categories of anthropogenic and natural emissions. We used the Emission Database for Global Atmospheric Research (EDGAR v4.3.2) for anthropogenic methane emissions and scaled them by country to match the national inventories reported to the United Nations Framework Convention on Climate Change (UNFCCC). Wetland and soil sink prior fluxes were simulated using the Vegetation Integrative Simulator of Trace gases (VISIT) model. Biomass burning prior fluxes were provided by the Global Fire Assimilation System (GFAS). We estimated global total anthropogenic and natural methane emissions of 340.9 Tg CH4 yr−1 and 232.5 Tg CH4 yr−1, respectively. Country-scale analysis of the estimated anthropogenic emissions showed that, for all the top-emitting countries, differences from the respective inventories were within the uncertainty range of those inventories, confirming that the posterior anthropogenic emissions did not deviate from nationally reported values. Large countries, such as China, Russia, and the United States, had mean estimated emissions of 45.7 ± 8.6, 31.9 ± 7.8, and 29.8 ± 7.8 Tg CH4 yr−1, respectively. For natural wetland emissions, we estimated large emissions for Brazil (39.8 ± 12.4 Tg CH4 yr−1), the United States (25.9 ± 8.3 Tg CH4 yr−1), Russia (13.2 ± 9.3 Tg CH4 yr−1), India (12.3 ± 6.4 Tg CH4 yr−1), and Canada (12.2 ± 5.1 Tg CH4 yr−1). In both emission categories, the model corrections to emissions for the major emitting countries were all within the uncertainty range of the inventories. 
The advantages of the approach used in this study were: (1) use of high-resolution transport, useful for simulations near emission hotspots; (2) prior anthropogenic emissions adjusted to the UNFCCC reports; and (3) combining surface and satellite observations, which improves the estimation of both natural and anthropogenic methane emissions at the spatial scale of countries.
Publisher: Elsevier BV
Date: 10-2006
Publisher: Wiley
Date: 15-07-2014
Publisher: Elsevier BV
Date: 05-2005
DOI: 10.1111/J.1523-1755.2005.00293.X
Abstract: In clinical trials, equation 7 from the Modification of Diet in Renal Disease (MDRD) Study is the most accurate formula for the prediction of glomerular filtration rate (GFR) from serum creatinine. An alternative approach has been developed using evolving connectionist systems (ECOS), which are novel computing structures that can be trained to generate accurate output from a given set of input variables. This study aims to compare the prediction errors associated with each method, using data that reproduce routine clinical practice as opposed to the artificial setting of clinical trials. The methods were compared using 441 radioisotope measurements of GFR in 178 chronic kidney disease patients from 12 centers in Australia and New Zealand. All clinical and laboratory measurements were obtained from the patients' center rather than central laboratories, as would be the case in routine clinical practice. Both the MDRD formula and ECOS used the same predictive variables, and both were optimized to the study cohort by stepwise regression and training, respectively. Mean measured GFR in the cohort was 22.6 mL/min/1.73 m(2). The bias and precision of the MDRD formula were -3.5 mL/min/1.73 m(2) and 34.5%, respectively, improving to -1.2 mL/min/1.73 m(2) and 31.1% after maximal optimization of the formula to study data. The bias and precision of the ECOS were 0.7 mL/min/1.73 m(2) and 32.6%, respectively, improving to -0.1 mL/min/1.73 m(2) and 16.6% after maximal optimization of the system to study data. The prediction of GFR using ECOS was improved by accounting for the center from where clinical and laboratory measurements originated within the connectionist model. Algebraic formulas will be associated with greater prediction error in routine clinical practice than in the original trials, and machine intelligence is more likely to predict GFR accurately in this setting.
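For reference, MDRD Study equation 7 is a six-variable power-law formula in serum creatinine, age, urea nitrogen, and albumin. A minimal sketch using the commonly published coefficients (not the cohort-optimized version developed in this study):

```python
def mdrd_eq7_gfr(scr_mg_dl, age_years, sun_mg_dl, alb_g_dl,
                 female=False, black=False):
    """Six-variable MDRD Study equation 7 (mL/min/1.73 m^2), as commonly
    published:
      GFR = 170 * Scr^-0.999 * age^-0.176 * SUN^-0.170 * Alb^0.318
            * 0.762 (if female) * 1.180 (if Black)
    Scr = serum creatinine (mg/dL), SUN = serum urea nitrogen (mg/dL),
    Alb = serum albumin (g/dL)."""
    gfr = (170.0 * scr_mg_dl ** -0.999 * age_years ** -0.176
           * sun_mg_dl ** -0.170 * alb_g_dl ** 0.318)
    if female:
        gfr *= 0.762
    if black:
        gfr *= 1.180
    return gfr
```

The near-unity exponent on creatinine is why, in chronic kidney disease cohorts such as this one, predicted GFR is roughly inversely proportional to serum creatinine.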
Publisher: CRC Press
Date: 03-11-2005
Publisher: Wiley
Date: 15-05-2016
DOI: 10.1111/HDI.12424
Publisher: BMJ
Date: 15-04-2015
Publisher: Springer Science and Business Media LLC
Date: 20-09-2014
Publisher: Elsevier BV
Date: 11-2004
DOI: 10.1053/J.AJKD.2004.08.011
Abstract: Analyses based on the National Cooperative Dialysis Study (NCDS) provided the impetus for routine quantification of delivered dialysis dose in hemodialysis practice throughout the world, by suggesting minimum targets for small solute (urea) clearance. Morbidity and mortality in dialysis populations remain high despite many technological advances in dialysis delivery. A number of observational studies reported association between higher dose of dialysis as measured by Kt/V urea or urea reduction ratio with lower mortality risk. During the 1990s, a steady increase in dialysis dose and a modest reduction in mortality on dialysis were observed. However, observational studies only reveal associations and are limited by selection bias and confounding. The Kidney Disease Outcomes Quality Initiative guidelines on dialysis adequacy are based on results of observational studies and expert opinion. Since the NCDS, the HEMO Study was the first major randomized clinical trial designed to study the effect of dose of dialysis and dialyzer flux on patient outcomes. Despite adequate separation of dose and flux, however, results of the trial did not prove a beneficial effect of higher dose. The Dialysis Outcomes and Practice Patterns Study (DOPPS), in a major international effort designed to examine the effect of practice patterns on outcomes, has made significant contributions to the topic of dialysis dose. The following review critically examines data from observational studies, including the DOPPS, and from the HEMO Study, emphasizing important lessons from both, and discusses future paradigms for achieving dialysis adequacy to improve patient outcomes.
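The two dose measures named in this review, the urea reduction ratio and single-pool Kt/V urea, can be sketched in a few lines; a minimal illustration assuming the second-generation Daugirdas estimate (the formula is standard, not taken from this review):

```python
import math

def urea_reduction_ratio(pre_bun, post_bun):
    # Fractional fall in blood urea nitrogen over one treatment, as a percentage.
    return 100.0 * (pre_bun - post_bun) / pre_bun

def daugirdas_sp_ktv(pre_bun, post_bun, t_hours, uf_litres, post_weight_kg):
    # Second-generation Daugirdas estimate of single-pool Kt/V urea:
    # -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W, with R = post/pre BUN.
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * t_hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg
```

With a pre/post BUN of 100/30 mg/dL, a 4-hour session, 3 L of ultrafiltration, and a 70 kg post-dialysis weight, this gives a URR of 70% and a spKt/V of roughly 1.4.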
Publisher: Elsevier BV
Date: 06-2017
DOI: 10.1016/J.KINT.2017.02.024
Abstract: There is a strong biological plausibility for benefit from removal of larger uremic toxins and increasing positive clinical experience with hemodiafiltration. However, evidence supporting hemodiafiltration is not definitive with studies that are often limited by serious methodological shortcomings. Morena et al. show that hemodiafiltration may prevent intradialytic hypotension, albeit in a study that also has some shortcomings. Ongoing research for hemodiafiltration is still needed through high-quality clinical trials that adhere to standards for clinical trial conduct and reporting.
Publisher: Wiley
Date: 25-09-2013
DOI: 10.1111/SDI.12141
Abstract: Hemodialysis (HD) catheter-related infection (CRI) and septicemia contribute to adverse outcomes. The impact of seasonality and of prophylactic dialysis practices during high-risk periods remains unexplored. This multicenter study analyzed DOPPS data from 12,122 HD patients (from 442 facilities) to determine the association between seasonally related climatic variables and CRI and septicemia. Climatic variables were determined by linkage to data from the National Climatic Data Center of the National Oceanic and Atmospheric Administration. Catheter care protocols were examined to determine whether they could mitigate infection risk during high-risk seasons. Survival models were used to estimate the adjusted hazard ratio (AHR) of septicemia by season and by facility catheter dressing protocol. The overall catheter-related septicemia rate was 0.47 per 1000 catheter-days. It varied by season, with an AHR for summer of 1.46 (95% CI: 1.19-1.80) compared with winter. Septicemia was associated with temperature (AHR = 1.07; 95% CI: 1.02-1.13; p < 0.001). Dressing protocols using chlorhexidine (AHR of septicemia = 0.55; 95% CI: 0.39-0.78) were associated with the fewest episodes of CRI or septicemia. Higher catheter-related septicemia in summer may be due to seasonal conditions (e.g., heat, perspiration) that facilitate bacterial growth and compromise protective measures. Extra vigilance and use of chlorhexidine-based dressing protocols may provide prophylaxis against CRI and septicemia.
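The headline rate above is an incidence density; a trivial sketch of the per-1000-catheter-days calculation (event and exposure counts below are hypothetical):

```python
def rate_per_1000_catheter_days(events, catheter_days):
    # Incidence density: events per 1000 catheter-days of exposure.
    return 1000.0 * events / catheter_days
```

For instance, 47 septicemia episodes over 100,000 catheter-days gives 0.47 per 1000 catheter-days, matching the scale of the rate reported above.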
Publisher: BMJ
Date: 08-1995
Publisher: Springer Science and Business Media LLC
Date: 11-07-2015
Publisher: Ovid Technologies (Wolters Kluwer Health)
Date: 19-10-2018
DOI: 10.2215/CJN.06830617
Abstract: Improved knowledge about factors that influence patient choices when considering dialysis modality could facilitate health care interventions to increase rates of home dialysis. We aimed to quantify the attributes of dialysis care and the tradeoffs that patients consider when making decisions about dialysis modalities. We conducted a prospective, discrete choice experiment survey with random parameter logit analysis to quantify preferences and tradeoffs for attributes of dialysis treatment in 143 adult patients with CKD expected to require RRT within 12 months (predialysis). The attributes included schedule flexibility, patient out-of-pocket costs, subsidized transport services, level of nursing support, life expectancy, dialysis training time, wellbeing on dialysis, and dialysis schedule (frequency and duration). We reported outcomes using β-coefficients with corresponding odds ratios and 95% confidence intervals for choosing home-based dialysis (peritoneal dialysis or hemodialysis) compared with facility hemodialysis. Home-based therapies were significantly preferred with the following attributes: longer survival (odds ratio per year, 1.63; 95% confidence interval, 1.25 to 2.12), increased treatment flexibility (odds ratio, 9.22; 95% confidence interval, 2.71 to 31.3), improved wellbeing (odds ratio, 210; 95% confidence interval, 15 to 2489), and more nursing support (odds ratio, 87.3; 95% confidence interval, 3.8 to 2014). Respondents were willing to accept additional out-of-pocket costs of approximately New Zealand $400 (United States $271) per month (95% confidence interval, New Zealand $333 to $465) to receive increased nursing support. Patients were willing to accept out-of-pocket costs of New Zealand $223 (United States $151) per month (95% confidence interval, New Zealand $195 to $251) for more treatment flexibility.
Patients preferred home dialysis over facility-based care when increased nursing support was available and when longer survival, wellbeing, and flexibility were expected. Sociodemographics, such as age, ethnicity, and income, influenced patient choice.
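The β-coefficients, odds ratios, and willingness-to-pay figures reported above are related by standard logit algebra; a minimal sketch with hypothetical coefficients (none taken from the study):

```python
import math

def odds_ratio(beta):
    # In a (random parameter) logit, exp(beta) is the odds ratio
    # for a one-unit change in the attribute.
    return math.exp(beta)

def willingness_to_pay(beta_attribute, beta_cost):
    # Marginal WTP: the cost change that exactly offsets a one-unit
    # attribute change; beta_cost is expected to be negative.
    return -beta_attribute / beta_cost
```

With a hypothetical flexibility coefficient of 0.5 and a cost coefficient of -0.002 per dollar, the implied WTP is $250 per month; the NZ$223 and NZ$400 figures above come from the same ratio construction applied to the estimated coefficients.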
Publisher: Elsevier
Date: 2010
Publisher: Elsevier BV
Date: 03-2002
Abstract: Continuous renal replacement therapies have practical and theoretical advantages compared with conventional intermittent hemodialysis in hemodynamically unstable or severely catabolic patients with acute renal failure (ARF). Sustained low-efficiency dialysis (SLED) is a hybrid modality introduced July 1998 at the University of Arkansas for Medical Sciences that involves the application of a conventional hemodialysis machine with reduced dialysate and blood flow rates for 12-hour nocturnal treatments. Nine critically ill patients with ARF were studied during a single SLED treatment to determine delivered dialysis dose and the most appropriate model for the description of urea kinetics during treatment. Five patients were men, mean Acute Physiology and Chronic Health Evaluation (APACHE) II score was 28.9 and mean weight was 92.5 kg. Kt/V was determined by the reference method of direct dialysate quantification (DDQ) combined with an equilibrated postdialysis plasma water urea nitrogen (PUN) concentration and four other methods that were either blood or dialysate based, single or double pool, or model independent (whole-body kinetic method). Solute removal indices (SRIs) were determined from net urea removal and urea distribution volume supplied from DDQ (reference method) and by mass balance using variables supplied from blood-based formal variable-volume single-pool (VVSP) urea kinetic modeling. Equivalent renal urea clearances (EKRs) were calculated from urea generation rates and time-averaged concentrations for PUN based on weekly mass balance with kinetic variables supplied by either DDQ (reference method) or formal blood-based VVSP modeling. Mean Kt/V determined by the reference method was 1.40 and not significantly different when determined by formal VVSP modeling, DDQ using an immediate postdialysis PUN, or the whole-body kinetic method. Correction of single-pool Kt/V by a Daugirdas rate equation did not yield plausible results. 
Mean SRI and EKR by the reference methods were 0.61 and 24.8 mL/min, respectively, and not significantly different when determined by blood-based methods. A single-pool urea kinetic model adequately described intradialytic PUN profiles, indicating that SLED was associated with minimal urea disequilibrium. This was supported by the parity between hemodialyzer and whole-body urea clearances, and the mean postdialytic urea rebound of 4.1% (P = 0.13 versus zero). Additional prospective studies are needed in this setting to define the optimal method for dialysis quantification, targets for azotemic control, and optimal modality of renal replacement therapy. In conclusion, SLED delivers a high dose of dialysis with minimal associated urea disequilibrium and can be quantified by Kt/V, SRI, and EKR from blood-based methods using single-pool urea kinetic models.
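The Daugirdas rate equation mentioned above (which the authors found did not yield plausible results for 12-hour SLED treatments) has a standard published form for converting single-pool to equilibrated Kt/V; a sketch, assuming the arterial-access coefficients:

```python
def equilibrated_ktv(sp_ktv, t_hours):
    # Daugirdas rate equation (arterial-access coefficients): corrects
    # single-pool Kt/V downward for post-dialysis urea rebound.
    return sp_ktv - 0.6 * sp_ktv / t_hours + 0.03
```

Note that the correction term shrinks as session length grows, consistent with the minimal urea disequilibrium the study observed over 12-hour treatments.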
Publisher: Oxford University Press (OUP)
Date: 18-03-2004
DOI: 10.1093/NDT/GFG625
Publisher: Elsevier BV
Date: 11-2011
DOI: 10.1053/J.AJKD.2011.04.027
Abstract: There is a resurgence of interest in home hemodialysis (HD), especially frequent or extended forms involving unconventionally frequent (>3 times/wk) and/or long (>6 hours) treatments. This resurgence is driven by cost containment and experience suggesting lower mortality risk compared with facility HD and peritoneal dialysis (PD). We performed an observational cohort study using the Australia and New Zealand Dialysis and Transplant Registry, using marginal structural modeling to adjust for time-varying medical comorbidity as both a source of selection bias and an intermediary variable on the causal pathway to death. The study included all adult patients starting renal replacement therapy in Australia and New Zealand since March 31, 1996, followed up to December 31, 2007. The main predictor was dialysis modality (conventional facility HD, conventional home HD, frequent/extended facility HD, frequent/extended home HD, and PD); we adjusted for the confounding effects of patient demographics and comorbid conditions. The outcome was patient mortality. We analyzed 26,016 patients with 856,007 patient-months of follow-up. Relative to conventional facility HD, adjusted mortality HRs were 0.51 (95% CI, 0.44-0.59) for conventional home HD, 1.16 (95% CI, 0.94-1.44) for frequent/extended facility HD, 0.53 (95% CI, 0.41-0.68) for frequent/extended home HD, and 1.10 (95% CI, 1.06-1.16) for PD. The apparent benefit of home HD on mortality risk was less for patients who were nonwhite, non-Asian, and older. Limitations include potential residual confounding from the limited collection of comorbid conditions (no collection of cognitive or motor impairment, depression, left ventricular volume or structure, or blood pressure/fluid volume status) and a lack of socioeconomic, medication, and biochemical data. Our study supports a survival advantage of home HD without a difference between conventional and frequent/extended modalities.
Suitably designed clinical trials of frequent/extended HD are needed to determine the presence and extent of mortality benefit with this modality.
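Marginal structural models of the kind used above are typically fitted with inverse-probability-of-treatment weights; a minimal sketch of the stabilized weight for one subject-period (illustrative of the general technique, not the registry analysis itself):

```python
def stabilized_iptw(treated, propensity, marginal_probability):
    """Stabilized inverse-probability-of-treatment weight.

    propensity: P(treated | covariate history) for this subject-period.
    marginal_probability: P(treated) overall, used to stabilize the weight.
    """
    if treated:
        return marginal_probability / propensity
    return (1.0 - marginal_probability) / (1.0 - propensity)
```

When a subject's covariate-conditional propensity equals the marginal treatment probability, the weight is 1; subjects whose treatment was unlikely given their history are up-weighted, which is how time-varying comorbidity is handled as both confounder and causal intermediary.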
Location: United States of America
Location: New Zealand
Location: United States of America
Start Date: 2013
End Date: 2014
Funder: Auckland Medical Research Foundation
Start Date: 2004
End Date: 2006
Funder: Auckland Medical Research Foundation
Start Date: 2003
End Date: 2003
Funder: Kidney Health New Zealand
Start Date: 2004
End Date: 2005
Funder: Kidney Health New Zealand
Start Date: 2004
End Date: 2005
Funder: Maurice and Phyllis Paykel Trust
Start Date: 2014
End Date: 2018
Funder: Ministry of Health, New Zealand
Start Date: 2010
End Date: 2012
Funder: Maurice and Phyllis Paykel Trust
Start Date: 2013
End Date: 2015
Funder: Royal Australasian College of Physicians
Start Date: 2010
End Date: 2011
Funder: Health Research Council of New Zealand
Start Date: 2010
End Date: 2013
Funder: Health Research Council of New Zealand
Start Date: 2017
End Date: 2021
Funder: National Health and Medical Research Council